NASA Technical Reports Server (NTRS)
Newsom, J. R.; Mukhopadhyay, V.
1983-01-01
A method for designing robust feedback controllers for multiloop systems is presented. Robustness is characterized in terms of the minimum singular value of the system return difference matrix at the plant input. Analytical gradients of the singular values with respect to design variables in the controller are derived. A cumulative measure of the singular values and their gradients with respect to the design variables is used with a numerical optimization technique to increase the system's robustness. Both unconstrained and constrained optimization techniques are evaluated. Numerical results are presented for a two-input/two-output drone flight control system.
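A minimal numerical sketch of the robustness measure this abstract uses, evaluating the minimum singular value of the return difference I + KG(jω) at the plant input over a frequency grid; an optimizer would then push the worst case upward. The 2x2 state-space plant and static gain controller below are hypothetical placeholders, not the paper's drone model.

```python
import numpy as np

# Hypothetical 2x2 plant G(s) = C (sI - A)^-1 B and static gain controller K;
# placeholder values, not the drone flight control system from the paper.
A = np.array([[-1.0, 2.0], [0.0, -3.0]])
B = np.eye(2)
C = np.array([[1.0, 0.5], [0.0, 1.0]])
K = np.array([[0.8, 0.0], [0.1, 0.6]])

def min_sv_return_difference(w):
    """sigma_min of I + K G(jw), the return difference at the plant input."""
    G = C @ np.linalg.inv(1j * w * np.eye(2) - A) @ B
    return np.linalg.svd(np.eye(2) + K @ G, compute_uv=False).min()

freqs = np.logspace(-2, 2, 200)
sigmas = np.array([min_sv_return_difference(w) for w in freqs])
i = sigmas.argmin()
print(f"worst-case sigma_min = {sigmas[i]:.3f} at {freqs[i]:.3f} rad/s")
```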
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for the majority of relatively lipophilic organic molecules, coupled with an LC separation tool and simultaneous UV detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, two partially opposing needs must be balanced: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and the highest possible throughput capacity of the in vitro set-up and the analytical tool. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, while attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Approximate analytical relationships for linear optimal aeroelastic flight control laws
NASA Astrophysics Data System (ADS)
Kassem, Ayman Hamdy
1998-09-01
This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
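For context, the linear-quadratic state-feedback design referred to above minimizes the standard quadratic cost, with the gain following from the algebraic Riccati equation; the notation below is the generic textbook form, not the dissertation's specific aeroelastic model.

```latex
J = \int_0^{\infty}\!\left( x^{\top} Q\,x + u^{\top} R\,u \right)dt,
\qquad u = -Kx, \qquad K = R^{-1}B^{\top}P,

A^{\top}P + P A - P B R^{-1} B^{\top} P + Q = 0 .
```

The closed-loop eigenvalues are those of A - BK; the dissertation's contribution is approximating these eigenvalues and the zeros analytically in terms of the weights in Q and R, the stability and control derivatives, and the structural mode parameters.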
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Schmidt, D. K.; Anderson, M. R.
1985-01-01
Optimal-control-theoretic modeling and frequency-domain analysis is the methodology proposed to evaluate analytically the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability is then evaluated using frequency-domain techniques. When these techniques were applied to the flight-test data for thirty-two highly-augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.
Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong
2013-01-01
Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence-detected analytical ultracentrifugation is one approach that can offer significant insight into aggregate formation and kinetics. While this technique has traditionally been used with purified proteins, substantial information can now be collected from studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and for setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit the minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin
2015-01-01
In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We present a summary of illustrative studies and our own experience in the field, together with established methods from the German Federal Criminal Police Office, rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we aim to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to document almost exclusively well-validated methods that stand out for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.
Advances in Adaptive Control Methods
NASA Technical Reports Server (NTRS)
Nguyen, Nhan
2009-01-01
This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.
Impact of the macroeconomic factors on university budgeting in the US and Russia
NASA Astrophysics Data System (ADS)
Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly
2017-10-01
This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in the areas of data science and machine learning have made it possible to utilise automated techniques to address several problems of humankind, ranging from genetic engineering and particle physics to sociology and economics. This paper is the first step toward creating a robust toolkit that will help universities withstand macroeconomic challenges using modern predictive analytics techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
Developments in Cylindrical Shell Stability Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Starnes, James H., Jr.
1998-01-01
Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.
1990-08-01
This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline-corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
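A minimal sketch of the ratio-chromatogram idea on synthetic Gaussian peaks. The weighting shown (weights proportional to local signal-to-noise squared) is a generic stand-in for the paper's variance-weighted average, and all retention times, concentrations, and noise levels are made up.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 1000)               # retention time axis, min
gauss = lambda c, mu, s: c * np.exp(-0.5 * ((t - mu) / s) ** 2)

# Two sequential, baseline-corrected chromatograms of the same analyte;
# the second injection is 1.5x more concentrated (illustrative values).
rng = np.random.default_rng(1)
noise = 0.002
c1 = gauss(1.0, 5.0, 0.3) + rng.normal(0.0, noise, t.size)
c2 = gauss(1.5, 5.0, 0.3) + rng.normal(0.0, noise, t.size)

ratio = c2 / np.clip(c1, 1e-6, None)           # point-by-point ratio chromatogram

# Pure-elution region of the analyte; weight each point by its local
# signal-to-noise squared so noisy low-signal points contribute little.
region = (t > 4.2) & (t < 5.8)
w = (c1[region] / noise) ** 2
ratio_value = np.sum(w * ratio[region]) / np.sum(w)
print(f"estimated concentration ratio: {ratio_value:.3f}")  # ~1.5
```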
Control algorithms for aerobraking in the Martian atmosphere
NASA Technical Reports Server (NTRS)
Ward, Donald T.; Shipley, Buford W., Jr.
1991-01-01
The Analytic Predictor Corrector (APC) and Energy Controller (EC) atmospheric guidance concepts were adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. Changes are made to the APC to improve its robustness to density variations. These changes include adaptation of a new exit phase algorithm, an adaptive transition velocity to initiate the exit phase, refinement of the reference dynamic pressure calculation and two improved density estimation techniques. The modified controller with the hybrid density estimation technique is called the Mars Hybrid Predictor Corrector (MHPC), while the modified controller with a polynomial density estimator is called the Mars Predictor Corrector (MPC). A Lyapunov Steepest Descent Controller (LSDC) is adapted to control the vehicle. The LSDC lacked robustness, so a Lyapunov tracking exit phase algorithm is developed to guide the vehicle along a reference trajectory. This algorithm, when using the hybrid density estimation technique to define the reference path, is called the Lyapunov Hybrid Tracking Controller (LHTC). With the polynomial density estimator used to define the reference trajectory, the algorithm is called the Lyapunov Tracking Controller (LTC). These four new controllers are tested using a six degree of freedom computer simulation to evaluate their robustness. The MHPC, MPC, LHTC, and LTC show dramatic improvements in robustness over the APC and EC.
Protein assay structured on paper by using lithography
NASA Astrophysics Data System (ADS)
Wilhelm, E.; Nargang, T. M.; Al Bitar, W.; Waterkotte, B.; Rapp, B. E.
2015-03-01
There are two main challenges in producing a robust, paper-based analytical device. The first is to create a hydrophobic barrier which, unlike the commonly used wax barriers, does not break if the paper is bent. The second is the creation of the (bio-)specific sensing layer, for which proteins have to be immobilized without diminishing their activity. We solve both problems using light-based fabrication methods that enable fast, efficient manufacturing of paper-based analytical devices. The first technique relies on silanization, by which we create a flexible hydrophobic barrier made of dimethoxydimethylsilane. The second technique demonstrated within this paper uses photobleaching to immobilize proteins by means of maskless projection lithography. Both techniques have been tested on a classical lithography setup using printed toner masks and on a system for maskless lithography. Using these setups we demonstrated that the proposed manufacturing techniques can be carried out at low cost. The resolution of the paper-based analytical devices obtained with static masks was lower due to the lower mask resolution; better results were obtained using advanced lithography equipment. We thereby demonstrated that our technique enables fabrication of effective hydrophobic boundary layers with a thickness of only 342 μm. Furthermore, we showed that fluorescein-5-biotin can be immobilized on the non-structured paper and be employed for the detection of streptavidin-alkaline phosphatase. By carrying out this assay on a paper-based analytical device that had been structured using the silanization technique, we proved the biological compatibility of the suggested patterning technique.
Addressing Climate Change in Long-Term Water Planning Using Robust Decisionmaking
NASA Astrophysics Data System (ADS)
Groves, D. G.; Lempert, R.
2008-12-01
Addressing climate change in long-term natural resource planning is difficult because future management conditions are deeply uncertain and the range of possible adaptation options is so extensive. These conditions pose challenges to standard optimization decision-support techniques. This talk will describe a methodology called Robust Decisionmaking (RDM) that can complement more traditional analytic approaches by utilizing screening-level water management models to evaluate large numbers of strategies against a wide range of plausible future scenarios. The presentation will describe a recent application of the methodology to evaluate climate adaptation strategies for the Inland Empire Utilities Agency in Southern California. This project found that RDM can provide a useful way to address climate change uncertainty and identify robust adaptation strategies.
A Corona Discharge Initiated Electrochemical Electrospray Ionization Technique
Lloyd, John R.; Hess, Sonja
2009-01-01
We report here the development of a corona discharge (CD) initiated electrochemical (EC) electrospray ionization (ESI) technique using a standard electrospray ion source. This is a new ionization technique distinct from ESI, the electrochemistry inherent to ESI, APCI, and techniques using hydroxyl radicals produced under atmospheric pressure conditions. By maximizing the observable CD at the tip of a stainless steel ESI capillary, efficient electrochemical oxidation of electrochemically active compounds is observed. For electrochemical oxidation to be observed, the ionization potential of the analyte must be lower than that of Fe. Ferrocene-labeled compounds were chosen as the electrochemically active moiety. The electrochemical cell in the ESI source was robust and generated ions with selectivity according to the ionization potential of the analytes and up to zeptomolar sensitivity. Our results indicate that CD initiated electrochemical ionization has the potential to become a powerful technique for increasing the dynamic range, sensitivity and selectivity of ESI experiments. Synopsis: Using a standard ESI source, a corona discharge initiated electrochemical ionization technique was established, resulting from the electrochemistry occurring at the CD electrode surface. PMID:19747843
ERIC Educational Resources Information Center
Lix, Lisa M.; And Others
1996-01-01
Meta-analytic techniques were used to summarize the statistical robustness literature on Type I error properties of alternatives to the one-way analysis of variance "F" test. The James (1951) and Welch (1951) tests performed best under violations of the variance homogeneity assumption, although their use is not always appropriate. (SLD)
Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples.
Artigues, Margalida; Abellà, Jordi; Colominas, Sergi
2017-11-14
Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (Chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO₂NTAs) has been evaluated. The GOx-Chitosan/TiO₂NTAs biosensor showed a sensitivity of 5.46 μA·mM⁻¹ with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95-105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were observed. In addition, the storage stability was examined: after 30 days, the GOx-Chitosan/TiO₂NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst case, a deviation smaller than 10% was obtained among the 20 samples evaluated.
Guidance to Achieve Accurate Aggregate Quantitation in Biopharmaceuticals by SV-AUC.
Arthur, Kelly K; Kendrick, Brent S; Gabrielson, John P
2015-01-01
The levels and types of aggregates present in protein biopharmaceuticals must be assessed during all stages of product development, manufacturing, and storage of the finished product. Routine monitoring of aggregate levels in biopharmaceuticals is typically achieved by size exclusion chromatography (SEC) due to its high precision, speed, robustness, and simplicity to operate. However, SEC is error prone and requires careful method development to ensure accuracy of reported aggregate levels. Sedimentation velocity analytical ultracentrifugation (SV-AUC) is an orthogonal technique that can be used to measure protein aggregation without many of the potential inaccuracies of SEC. In this chapter, we discuss applications of SV-AUC during biopharmaceutical development and how characteristics of the technique make it better suited for some applications than others. We then discuss the elements of a comprehensive analytical control strategy for SV-AUC. Successful implementation of these analytical control elements ensures that SV-AUC provides continued value over the long time frames necessary to bring biopharmaceuticals to market. © 2015 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2015-01-01
This report documents a case study on the application of Reliability Engineering techniques to achieve an optimal balance between performance and robustness by tuning the functional parameters of a complex non-linear control system. For complex systems with intricate and non-linear patterns of interaction between system components, analytical derivation of a mathematical model of system performance and robustness in terms of functional parameters may not be feasible or cost-effective. The demonstrated approach is simple, structured, effective, repeatable, and cost and time efficient. This general approach is suitable for a wide range of systems.
Expansion of transient operating data
NASA Astrophysics Data System (ADS)
Chipman, Christopher; Avitabile, Peter
2012-08-01
Real-time operating data are very important for understanding actual system response. Unfortunately, the number of physical data points typically collected is very small, and interpretation of the data is often difficult. Expansion techniques have been developed that use traditional experimental modal data to augment this limited set of data. This expansion process allows for a much improved description of the real-time operating response. This paper presents results from several different structures to show the robustness of the technique. Comparisons are made to a more complete set of measured data to validate the approach. Both analytical simulations and actual experimental data are used to illustrate the usefulness of the technique.
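A sketch of the expansion idea in the abstract: responses measured at a handful of physical points are least-squares-fit to experimentally derived mode shapes and mapped back to the full point set (a SEREP-style pseudo-inverse expansion). The mode shapes and measurement locations below are random placeholders, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_full, n_modes = 50, 3
measured = [0, 9, 19, 29, 39, 49]          # the few instrumented points

# Mode-shape matrix from a modal test (random placeholder values here).
Phi = rng.normal(size=(n_full, n_modes))
Phi_m = Phi[measured, :]

# "True" operating response, dominated by the retained modes.
q_true = np.array([1.0, -0.4, 0.2])        # modal coordinates
x_true = Phi @ q_true
x_meas = x_true[measured] + rng.normal(0.0, 0.01, len(measured))

# Expansion: least-squares modal coordinates from the measured points,
# then mapping back to the full set of points.
q_hat = np.linalg.pinv(Phi_m) @ x_meas
x_expanded = Phi @ q_hat
print(f"max expansion error: {np.abs(x_expanded - x_true).max():.4f}")
```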
Supercritical fluid chromatography: a promising alternative to current bioanalytical techniques.
Dispas, Amandine; Jambo, Hugues; André, Sébastien; Tyteca, Eva; Hubert, Philippe
2018-01-01
In recent years, chemistry has been part of the worldwide effort to address environmental problems, leading to the birth of green chemistry. In this context, green analytical tools such as modern supercritical fluid chromatography were developed in the field of separative techniques. This chromatographic technique has seen a resurgence thanks to the high efficiency, speed, and robustness of new-generation equipment. These advantages, and its easy hyphenation to MS, fulfill the requirements of bioanalysis regarding separation capacity and high throughput. In the present paper, the technical aspects focused on bioanalysis specifications are detailed, followed by a critical review of bioanalytical supercritical fluid chromatography methods published in the literature.
Piccinonna, Sara; Ragone, Rosa; Stocchero, Matteo; Del Coco, Laura; De Pascali, Sandra Angelica; Schena, Francesco Paolo; Fanizzi, Francesco Paolo
2016-05-15
Nuclear Magnetic Resonance (NMR) spectroscopy is emerging as a powerful technique in olive oil fingerprinting, but its analytical robustness has to be proved. Here, we report a comparative study between two laboratories on olive oil ¹H NMR fingerprinting, aiming to demonstrate the robustness of NMR-based metabolomics in generating comparable data sets for cultivar classification. Sample preparation and data acquisition were performed independently in two laboratories, equipped with different-resolution spectrometers (400 and 500 MHz), using two identical sets of mono-varietal olive oils. Partial Least Squares (PLS)-based techniques were applied to compare the data sets produced by the two laboratories. Despite differences in spectrum baseline, and in intensity and shape of peaks, the amount of shared information was significant (almost 70%) and related to cultivar (the same metabolites discriminated between cultivars). In conclusion, regardless of the variability due to operator and machine, the data sets from the two participating units were comparable for the purpose of classification. Copyright © 2015 Elsevier Ltd. All rights reserved.
The Superior Lambert Algorithm
NASA Astrophysics Data System (ADS)
der, G.
2011-09-01
Lambert algorithms are used extensively for initial orbit determination, mission planning, space debris correlation, and missile targeting, just to name a few applications. Due to the significance of the Lambert problem in Astrodynamics, Gauss, Battin, Godal, Lancaster, Gooding, Sun and many others (References 1 to 15) have provided numerous formulations leading to various analytic solutions and iterative methods. Most Lambert algorithms and their computer programs can only work within one revolution, break down or converge slowly when the transfer angle is near zero or 180 degrees, and their multi-revolution limitations are either ignored or barely addressed. Despite claims of robustness, many Lambert algorithms fail without notice, and the users seldom have a clue why. The DerAstrodynamics lambert2 algorithm, which is based on the analytic solution formulated by Sun, works for any number of revolutions and converges rapidly at any transfer angle. It provides significant capability enhancements over every other Lambert algorithm in use today. These include improved speed, accuracy, robustness, and multirevolution capabilities as well as implementation simplicity. Additionally, the lambert2 algorithm provides a powerful tool for solving the angles-only problem without artificial singularities (pointed out by Gooding in Reference 16), which involves 3 lines of sight captured by optical sensors, or systems such as the Air Force Space Surveillance System (AFSSS). The analytic solution is derived from the extended Godal’s time equation by Sun, while the iterative method of solution is that of Laguerre, modified for robustness. The Keplerian solution of a Lambert algorithm can be extended to include the non-Keplerian terms of the Vinti algorithm via a simple targeting technique (References 17 to 19). Accurate analytic non-Keplerian trajectories can be predicted for satellites and ballistic missiles, while performing at least 100 times faster than most numerical integration methods.
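For orientation, the classical Lagrange form of Lambert's time-of-flight equation for an elliptic transfer, in standard notation (this is the textbook relation, not the extended Godal time equation that lambert2 actually iterates on):

```latex
\sqrt{\mu}\,\Delta t = a^{3/2}\left[(\alpha - \sin\alpha) - (\beta - \sin\beta)\right],
\qquad
\sin^2\!\frac{\alpha}{2} = \frac{s}{2a}, \qquad
\sin^2\!\frac{\beta}{2} = \frac{s-c}{2a},
```

where c is the chord between the two position vectors, s = (r₁ + r₂ + c)/2 is the semiperimeter of the transfer triangle, and N full revolutions add 2πN to the bracketed term. The iteration difficulties the abstract describes arise because Δt is a multivalued, singular function of a near 0 and 180 degree transfer angles and in the multi-revolution case.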
NASA Technical Reports Server (NTRS)
Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William
2015-01-01
Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits-of-detection (ng g⁻¹); and the destruction of minimal quantities of sample (μg) compared to traditional solution and/or pyrolysis analyses (mg).
Analytical study of robustness of a negative feedback oscillator by multiparameter sensitivity
2014-01-01
Background: One of the distinctive features of biological oscillators such as circadian clocks and cell cycles is robustness, the ability to resume reliable operation in the face of different types of perturbations. In a previous study, we proposed multiparameter sensitivity (MPS) as an intelligible measure of robustness to fluctuations in kinetic parameters. Analytical solutions directly connect the mechanisms and kinetic parameters to dynamic properties such as period, amplitude and their associated MPSs. Although negative feedback loops are known as common structures in biological oscillators, analytical solutions have not been presented for a general model of negative feedback oscillators. Results: We present the analytical expressions for the period, amplitude and their associated MPSs for a general model of negative feedback oscillators. The analytical solutions are validated by comparing them with numerical solutions. The analytical solutions explicitly show how the dynamic properties depend on the kinetic parameters. The ratio of a threshold to the amplitude has a strong impact on the period MPS: as the ratio approaches one, the MPS increases, indicating that the period becomes more sensitive to changes in kinetic parameters. We present the first mathematical proof that the distributed time-delay mechanism contributes to making the oscillation period robust to parameter fluctuations. The MPS decreases with an increase in the feedback loop length (i.e., the number of molecular species constituting the feedback loop). Conclusions: Since a general model of negative feedback oscillators was employed, the results shown in this paper are expected to hold for many biological oscillators. This study strongly supports the hypothesis that phosphorylation of clock proteins contributes to the robustness of circadian rhythms. The analytical solutions give synthetic biologists clues for designing gene oscillators with robust and desired periods. PMID:25605374
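In the authors' usage, the multiparameter sensitivity of a dynamic property Q (period or amplitude) aggregates the squared logarithmic sensitivities to all kinetic parameters k_i; sketched here in generic notation, noting that the exact normalization may differ from the paper's:

```latex
\mathrm{MPS}(Q) = \sum_{i=1}^{n}\left(\frac{\partial \ln Q}{\partial \ln k_i}\right)^{2}
= \sum_{i=1}^{n}\left(\frac{k_i}{Q}\,\frac{\partial Q}{\partial k_i}\right)^{2},
```

so a small MPS means the period or amplitude tolerates proportional fluctuations in many parameters simultaneously.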
Fair, Justin D.; Bailey, William F.; Felty, Robert A.; Gifford, Amy E.; Shultes, Benjamin; Volles, Leslie H.
2010-01-01
Development of a robust, reliable technique that permits rapid quantitation of volatile organic chemicals is an important first step toward remediation associated with vapor intrusion. This paper describes the development of an analytical method that allows for the rapid and precise identification and quantitation of halogenated and nonhalogenated contaminants commonly found at the ppbv level at sites where vapor intrusion is a concern. PMID:20885969
Optical Drug Monitoring: Photoacoustic Imaging of Nanosensors to Monitor Therapeutic Lithium In Vivo
Cash, Kevin J.; Li, Chiye; Xia, Jun; Wang, Lihong V.; Clark, Heather A.
2015-01-01
Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes. PMID:25588028
Speciated arsenic in air: measurement methodology and risk assessment considerations.
Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L
2012-01-01
Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air on potential exposure and risks. The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate data more useful for arsenic inhalation risk assessment, and a more robust documentation of quality assurance/quality control (QA/QC) protocols is necessary to ensure accuracy, precision, representativeness, and comparability.
Homeostatic enhancement of active mechanotransduction
NASA Astrophysics Data System (ADS)
Milewski, Andrew; O'Maoiléidigh, Dáibhid; Hudspeth, A. J.
2018-05-01
Our sense of hearing boasts exquisite sensitivity to periodic signals. Experiments and modeling imply, however, that the auditory system achieves this performance for only a narrow range of parameter values. As a result, small changes in these values could compromise the ability of the mechanosensory hair cells to detect stimuli. We propose that, rather than exerting tight control over parameters, the auditory system employs a homeostatic mechanism that ensures the robustness of its operation to variation in parameter values. Through analytical techniques and computer simulations we investigate whether a homeostatic mechanism renders the hair bundle's signal-detection ability more robust to alterations in experimentally accessible parameters. When homeostasis is enforced, the range of values for which the bundle's sensitivity exceeds a threshold can increase by more than an order of magnitude. The robustness of cochlear function based on somatic motility or hair bundle motility may be achieved by employing the approach we describe here.
Chen, Yi-Ting; Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting
2017-05-01
Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Analytical applications of MIPs in diagnostic assays: future perspectives.
Bedwell, Thomas S; Whitcombe, Michael J
2016-03-01
Many efforts have been made to produce artificial materials with biomimetic properties for applications in binding assays. Among these efforts, the technique of molecular imprinting has received much attention because of the high selectivity obtainable for molecules of interest, the robustness of the produced polymers, their simple and short synthesis, and excellent cost efficiency. In this review, progress in the field of molecularly imprinted sorbent assays is discussed, with a focus on work conducted from 2005 to date.
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness; also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.
2017-06-01
Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and these are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
Nariya, Maulik K; Kim, Jae Hyun; Xiong, Jian; Kleindl, Peter A; Hewarathna, Asha; Fisher, Adam C; Joshi, Sangeeta B; Schöneich, Christian; Forrest, M Laird; Middaugh, C Russell; Volkin, David B; Deeds, Eric J
2017-11-01
There is growing interest in generating physicochemical and biological analytical data sets to compare complex mixture drugs, for example, products from different manufacturers. In this work, we compare various crofelemer samples prepared from a single lot by filtration with varying molecular weight cutoffs combined with incubation for different times at different temperatures. The 2 preceding articles describe experimental data sets generated from analytical characterization of fractionated and degraded crofelemer samples. In this work, we use data mining techniques such as principal component analysis and mutual information scores to help visualize the data and determine discriminatory regions within these large data sets. The mutual information score identifies chemical signatures that differentiate crofelemer samples. These signatures, in many cases, would likely be missed by traditional data analysis tools. We also found that supervised learning classifiers robustly discriminate samples with around 99% classification accuracy, indicating that mathematical models of these physicochemical data sets are capable of identifying even subtle differences in crofelemer samples. Data mining and machine learning techniques can thus identify fingerprint-type attributes of complex mixture drugs that may be used for comparative characterization of products. Copyright © 2017 American Pharmacists Association®. All rights reserved.
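A schematic of the data-mining pipeline the abstract describes: PCA for visualization, mutual information to locate discriminatory regions, and a supervised classifier benchmarked by cross-validation. The feature matrix below is a made-up stand-in for the crofelemer data sets, so the printed numbers are illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical data: 60 samples x 500 features (e.g. binned spectral or
# chromatographic intensities), two sample classes differing subtly.
X = rng.normal(size=(60, 500))
y = np.repeat([0, 1], 30)
X[y == 1, 100:110] += 0.8                      # subtle class difference

pca = PCA(n_components=2)
scores = pca.fit_transform(X)                  # coordinates for visualization
print("variance explained by PC1/PC2:", pca.explained_variance_ratio_.round(3))

mi = mutual_info_classif(X, y, random_state=0) # rank discriminatory regions
print("top discriminatory features:", np.argsort(mi)[-5:])

# Supervised classification accuracy (cf. the ~99% reported in the study).
acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print(f"cross-validated accuracy: {acc.mean():.2f}")
```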
Numerical realization of the variational method for generating self-trapped beams
NASA Astrophysics Data System (ADS)
Duque, Erick I.; Lopez-Aguayo, Servando; Malomed, Boris A.
2018-03-01
We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation in performing analytical Lagrangian integration and differentiation. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating robustness of beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.
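The governing model is a generalized nonlinear Schrödinger equation for the beam envelope ψ(x, y, z); the Rayleigh-Ritz step extremizes the Lagrangian over a parameterized trial beam, with the integrals evaluated numerically instead of in closed form. In generic notation (f denotes the nonlinearity; the symbols are not the paper's):

```latex
i\,\partial_z \psi + \tfrac{1}{2}\nabla_{\perp}^{2}\psi + f\!\left(|\psi|^{2}\right)\psi = 0,
\qquad
\frac{\partial}{\partial p_j}\iint L\!\left[\psi_{\mathrm{trial}}(x,y;\,p_1,\dots,p_N)\right]dx\,dy = 0,
```

where the p_j are trial parameters such as amplitude, width, and topological charge; solving the resulting algebraic system numerically yields approximate profiles of the fundamental, vortex, multipole, and azimuthon type.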
Robustness and Vulnerability of Networks with Dynamical Dependency Groups.
Bai, Ya-Nan; Huang, Ning; Wang, Lei; Wu, Zhi-Xi
2016-11-28
The dependency property and self-recovery of failure nodes both have great effects on the robustness of networks during the cascading process. Existing investigations focused mainly on the failure mechanism of static dependency groups without considering the time-dependency of interdependent nodes and the recovery mechanism in reality. In this study, we present an evolving network model consisting of failure mechanisms and a recovery mechanism to explore network robustness, where the dependency relations among nodes vary over time. Based on generating function techniques, we provide an analytical framework for random networks with arbitrary degree distribution. In particular, we theoretically find that an abrupt percolation transition exists corresponding to the dynamical dependency groups for a wide range of topologies after initial random removal. Moreover, when the abrupt transition point is above the failure threshold of dependency groups, the evolving network with the larger dependency groups is more vulnerable; when below it, the larger dependency groups make the network more robust. Numerical simulations employing the Erdős-Rényi network and Barabási-Albert scale free network are performed to validate our theoretical results.
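For reference, the generating-function machinery invoked above starts from the degree distribution p_k. In the simplest site-percolation setting (without the paper's dynamical dependency groups and recovery), the giant-component fraction S after retaining a fraction φ of nodes solves the textbook self-consistency system:

```latex
G_0(x) = \sum_{k} p_k x^{k}, \qquad G_1(x) = \frac{G_0'(x)}{G_0'(1)},

u = 1 - \phi + \phi\, G_1(u), \qquad S = \phi\left[1 - G_0(u)\right],
```

where u is the probability that an edge followed at random fails to reach the giant component. The paper extends this framework with time-varying dependency groups and a recovery mechanism, which is where the abrupt transition arises.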
Analytical redundancy and the design of robust failure detection systems
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The Failure Detection and Identification (FDI) process is viewed as consisting of two stages: residual generation and decision making. It is argued that a robust FDI system can be achieved by designing a robust residual generation process. Analytical redundancy, the basis for residual generation, is characterized in terms of a parity space. Using the concept of parity relations, residuals can be generated in a number of ways, and the design of a robust residual generation process can be formulated as a minimax optimization problem. An example is included to illustrate this design methodology. Previously announced in STAR as N83-20653.
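In the parity-space formulation the abstract describes, residuals over a sliding data window of length s come from projecting stacked outputs and inputs onto the left null space of the extended observability matrix. In standard discrete-time notation (generic, not the paper's exact symbols):

```latex
Y_s(k) = \mathcal{O}_s\, x(k-s) + \mathcal{H}_s\, U_s(k),
\qquad
r(k) = v^{\top}\!\left[ Y_s(k) - \mathcal{H}_s U_s(k) \right], \quad v^{\top}\mathcal{O}_s = 0,
```

where Y_s and U_s stack the last s+1 outputs and inputs and H_s is the Toeplitz matrix of Markov parameters. The residual r(k) vanishes for the fault-free, noise-free system regardless of the unknown state, so nonzero residual directions signal failures; robustness enters through the choice of the parity vector v.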
Fabricating microfluidic valve master molds in SU-8 photoresist
NASA Astrophysics Data System (ADS)
Dy, Aaron J.; Cosmanescu, Alin; Sluka, James; Glazier, James A.; Stupack, Dwayne; Amarie, Dragos
2014-05-01
Multilayer soft lithography has become a powerful tool in analytical chemistry, biochemistry, material and life sciences, and medical research. Complex fluidic micro-circuits require reliable components that integrate easily into microchips. We introduce two novel approaches to master mold fabrication for constructing in-line micro-valves using SU-8. Our fabrication techniques enable robust and versatile integration of many lab-on-a-chip functions including filters, mixers, pumps, stream focusing and cell-culture chambers, with in-line valves. SU-8 created more robust valve master molds than the conventional positive photoresists used in multilayer soft lithography, but maintained the advantages of biocompatibility and rapid prototyping. As an example, we used valve master molds made of SU-8 to fabricate PDMS chips capable of precisely controlling beads or cells in solution.
Robust design of a 2-DOF GMV controller: a direct self-tuning and fuzzy scheduling approach.
Silveira, Antonio S; Rodríguez, Jaime E N; Coelho, Antonio A R
2012-01-01
This paper presents a study of self-tuning control strategies with generalized minimum variance control in a fixed two-degree-of-freedom structure, or simply GMV2DOF, within two adaptive perspectives. One, from the process model point of view, uses a recursive least squares estimator algorithm for direct self-tuning design; the other uses a Mamdani fuzzy GMV2DOF parameter-scheduling technique based on analytical and physical interpretations from a robustness analysis of the system. Both strategies are assessed in simulation and in real-plant experimentation environments composed of a damped pendulum and a wind tunnel under development at the Department of Automation and Systems of the Federal University of Santa Catarina. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
Robust volcano plot: identification of differential metabolites in the presence of outliers.
Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro
2018-04-11
The identification of differential metabolites in metabolomics is still a big challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel weight based outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano .
NASA Astrophysics Data System (ADS)
Setty, Srinivas J.; Cefola, Paul J.; Montenbruck, Oliver; Fiedler, Hauke
2016-05-01
Catalog maintenance for Space Situational Awareness (SSA) demands an accurate and computationally lean orbit propagation and orbit determination technique to cope with the ever increasing number of observed space objects. As an alternative to established numerical and analytical methods, we investigate the accuracy and computational load of the Draper Semi-analytical Satellite Theory (DSST). The standalone version of the DSST was enhanced with additional perturbation models to improve its recovery of short periodic motion. The accuracy of DSST is, for the first time, compared to a numerical propagator with fidelity force models for a comprehensive grid of low, medium, and high altitude orbits with varying eccentricity and different inclinations. Furthermore, the run-time of both propagators is compared as a function of propagation arc, output step size and gravity field order to assess its performance for a full range of relevant use cases. For use in orbit determination, a robust performance of DSST is demonstrated even in the case of sparse observations, which is most sensitive to mismodeled short periodic perturbations. Overall, DSST is shown to exhibit adequate accuracy at favorable computational speed for the full set of orbits that need to be considered in space surveillance. Along with the inherent benefits of a semi-analytical orbit representation, DSST provides an attractive alternative to the more common numerical orbit propagation techniques.
NASA Astrophysics Data System (ADS)
Grose, C. J.
2008-05-01
Numerical geodynamics models of heat transfer are typically thought of as specialized topics of research requiring knowledge of specialized modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques with Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems for use by students, teachers, and practicing scientists without specialty in geodynamics modelling techniques and applications. While implementation of equations for use in Excel spreadsheets is occasionally cumbersome, once case boundary structure and node equations are developed, spreadsheet manipulation becomes routine. Model experimentation by modifying parameter values, geometry, and grid resolution makes Excel a useful tool whether in the classroom at the undergraduate or graduate level or for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it ideal for first and occasionally higher order geodynamics simulations to better understand and constrain the results of professional field research in a setting that does not require the constraints of state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium to better understand the confusing notations of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
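The first case problem, cooling of oceanic lithosphere, is easy to reproduce outside a spreadsheet. A minimal Python sketch follows (parameter values are illustrative assumptions); it evaluates the classical half-space analytical solution and cross-checks it against the same explicit finite-difference update a spreadsheet would implement cell by cell.

```python
import numpy as np
from scipy.special import erf

# Half-space cooling of oceanic lithosphere: T(z, t) obeys the 1D heat
# equation with surface temperature Ts and initial mantle temperature Tm.
Ts, Tm = 0.0, 1350.0            # degC, illustrative values
kappa = 1e-6                    # thermal diffusivity, m^2/s
t = 50e6 * 3.15e7               # 50 Myr in seconds

z = np.linspace(0.0, 300e3, 601)                   # depth grid, 0-300 km
T_analytic = Ts + (Tm - Ts) * erf(z / (2.0 * np.sqrt(kappa * t)))

# Explicit finite-difference check (FTCS), the node update a spreadsheet
# encodes cell-by-cell. Stability requires dt <= dz^2 / (2 kappa).
dz = z[1] - z[0]
dt = 0.4 * dz**2 / kappa
T = np.full_like(z, Tm)
T[0] = Ts                                          # cold surface, hot interior
for _ in range(int(t / dt)):
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0], T[-1] = Ts, Tm                           # boundary conditions

print("max |FD - analytic| =", np.abs(T - T_analytic).max(), "degC")
```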
Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization
NASA Astrophysics Data System (ADS)
Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija
2017-07-01
Recently, spectral imaging techniques such as Multispectral (MSI) and Hyperspectral Imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This will provide a better understanding of a technique that can provide a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models, which were used as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm using a mirror scanning setup and halogen illumination. Data were acquired under different measurement conditions and acquisition parameters. Preliminary results demonstrated the potential of the evaluation methodology, indicating that measurement parameters such as the use of different lenses and different scanning backgrounds may not have a great influence on the quantitative results. Moreover, the evaluation methodology allowed for the selection of the best pre-treatment method to be applied to the data.
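As an illustration of the multivariate regression step (the abstract does not name its exact algorithm; partial least squares is a common choice for NIR data and is an assumption here), a minimal Python sketch with synthetic spectra standing in for the 105-sample calibration target:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_bands = 105, 240           # 1000-2500 nm at ~6.3 nm steps
dp = rng.uniform(300, 2000, n_samples)  # cellulose degree of polymerization

# Synthetic NIR spectra: a DP-correlated absorption feature plus noise
# (real spectra would come from the pushbroom HSI scanner).
wavelengths = np.linspace(1000, 2500, n_bands)
band = np.exp(-0.5 * ((wavelengths - 1920) / 40.0) ** 2)
X = np.outer(dp / 2000.0, band) + rng.normal(0, 0.01, (n_samples, n_bands))

# PLS regression with cross-validation, the usual chemometric workflow.
pls = PLSRegression(n_components=5)
dp_pred = cross_val_predict(pls, X, dp, cv=7).ravel()
rmsecv = np.sqrt(np.mean((dp_pred - dp) ** 2))
print(f"RMSECV for DP: {rmsecv:.1f}")
```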
Sheldon, E M; Downar, J B
2000-08-15
Novel approaches to the development of analytical procedures for monitoring incoming starting material in support of chemical/pharmaceutical processes are described. High-technology solutions were utilized for timely process development and preparation of high-quality clinical supplies. A single robust HPLC method was developed and characterized for the analysis of the key starting material from three suppliers. Each supplier used a different process for the preparation of this material and, therefore, each supplier's material exhibited a unique impurity profile. The HPLC method utilized standard techniques acceptable for release testing in a QC/manufacturing environment. An automated experimental-design protocol was used to characterize the robustness of the HPLC method. The method was evaluated for linearity, limit of quantitation, solution stability, and precision of replicate injections. An LC-MS method that emulated the release HPLC method was developed and the identities of impurities were mapped between the two methods.
Analysis of gene network robustness based on saturated fixed point attractors
2014-01-01
The analysis of gene network robustness to noise and mutation is important for fundamental and practical reasons. Robustness refers to the stability of the equilibrium expression state of a gene network to variations of the initial expression state and network topology. Numerical simulation of these variations is commonly used for the assessment of robustness. Since there exists a great number of possible gene network topologies and initial states, even millions of simulations may still be too few to give reliable results. When the initial and equilibrium expression states are restricted to being saturated (i.e., their elements can only take values 1 or −1, corresponding to maximum activation and maximum repression of genes), an analytical gene network robustness assessment is possible. We present this analytical treatment based on determination of the saturated fixed point attractors for sigmoidal function models. The analysis can determine (a) for a given network, which and how many saturated equilibrium states exist and which and how many saturated initial states converge to each of these saturated equilibrium states and (b) for a given saturated equilibrium state or a given pair of saturated equilibrium and initial states, which and how many gene networks, referred to as viable, share this saturated equilibrium state or the pair of saturated equilibrium and initial states. We also show that the viable networks sharing a given saturated equilibrium state must follow certain patterns. These capabilities of the analytical treatment make it possible to properly define and accurately determine robustness to noise and mutation for gene networks. Previous network research conclusions drawn from performing millions of simulations follow directly from the results of our analytical treatment. Furthermore, the analytical results provide criteria for the identification of model validity and suggest modified models of gene network dynamics. The yeast cell-cycle network is used as an illustration of the practical application of this analytical treatment. PMID:24650364
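A brute-force version of points (a) and (b) can be sketched directly. The minimal Python example below assumes the steep-sigmoid limit, in which the update reduces to x ← sign(Wx), and uses an illustrative three-gene topology; the paper's analytical treatment replaces exactly this kind of enumeration, which becomes infeasible for large networks.

```python
import numpy as np
from itertools import product

# Saturated gene-network dynamics in the steep-sigmoid limit: expression
# states have entries +1 (max activation) or -1 (max repression) and are
# updated by x <- sign(W x). W is an illustrative 3-gene topology.
W = np.array([[ 0.0,  1.0, -1.0],
              [ 1.0,  0.0,  0.0],
              [-1.0,  1.0,  0.0]])

def step(x):
    s = W @ x
    return np.where(s >= 0, 1, -1)     # tie-break: zero input -> +1

states = [np.array(s) for s in product([-1, 1], repeat=3)]

# (a) saturated equilibria: fixed points of the update map
equilibria = [x for x in states if np.array_equal(step(x), x)]
print("saturated equilibria:", [tuple(x) for x in equilibria])

# (b) basin sizes: how many saturated initial states reach each equilibrium
for eq in equilibria:
    basin = 0
    for x0 in states:
        x = x0.copy()
        for _ in range(20):            # ample iterations for a 3-gene net
            x = step(x)
        basin += np.array_equal(x, eq)
    print(tuple(eq), "attracts", basin, "of", len(states), "initial states")
```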
NASA Astrophysics Data System (ADS)
Gupta, Lokesh Kumar
2012-11-01
Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high-performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness, and the stability of analytical solutions, demonstrating the power of the newly developed HPLC method.
Exploration of robust operating conditions in inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Tromp, John W.; Pomares, Mario; Alvarez-Prieto, Manuel; Cole, Amanda; Ying, Hai; Salin, Eric D.
2003-11-01
'Robust' conditions, as defined by Mermet and co-workers for inductively coupled plasma (ICP)-atomic emission spectrometry, minimize matrix effects on analyte signals, and are obtained by increasing power and reducing nebulizer gas flow. In ICP-mass spectrometry (MS), it is known that reduced nebulizer gas flow usually leads to more robust conditions such that matrix effects are reduced. In this work, robust conditions for ICP-MS have been determined by optimizing for accuracy in the determination of analytes in a multi-element solution with various interferents (Al, Ba, Cs, K, Na), by varying power, nebulizer gas flow, sample introduction rate and ion lens voltage. The goal of the work was to determine which operating parameters were the most important in reducing matrix effects, and whether different interferents yielded the same robust conditions. Reduction in nebulizer gas flow and in sample input rate led to a significantly decreased interference, while an increase in power seemed to have a lesser effect. Once the other parameters had been adjusted to their robust values, there was no additional improvement in accuracy attainable by adjusting the ion lens voltage. The robust conditions were universal, since, for all the interferents and analytes studied, the optimum was found at the same operating conditions. One drawback to the use of robust conditions was the slightly reduced sensitivity; however, in the context of 'intelligent' instruments, the concept of 'robust conditions' is useful in many cases.
Built-in active sensing diagnostic system for civil infrastructure systems
NASA Astrophysics Data System (ADS)
Wu, Fan; Chang, Fu-Kuo
2001-07-01
A reliable, robust monitoring system can improve the maintenance of and provide safety protection for civil structures and therefore prolong their service lives. A built-in, active sensing diagnostic technique for civil structures has been under investigation. In this technique, piezoelectric materials are used as sensors/actuators to receive and generate signals. The transducers are embedded in reinforced concrete (RC) beams and are designed to detect damage, particularly debonding damage between the reinforcing bars and concrete. This paper presents preliminary results from a feasibility study of the technology. Laboratory experiments performed on RC beams, with piezoelectric sensors and actuators mounted on reinforcing steel bars, have clearly demonstrated that the proposed technique can detect debonding damage. Analytical work, using the special-purpose finite-element software PZFlex, was also conducted to interpret the relationship between the measured data and actual debonding damage. The effectiveness of the proposed technique for detecting debonding damage in civil structures has been demonstrated.
Luo, Wei; Chen, Sheng; Chen, Lei; Li, Hualong; Miao, Pengcheng; Gao, Huiyi; Hu, Zelin; Li, Miao
2017-05-29
We describe a theoretical model to analyze temperature effects on the Kretschmann surface plasmon resonance (SPR) sensor, and a new double-incident-angle technique to simultaneously measure changes in refractive index (RI) and temperature. The method uses the observation that the output signals obtained from two different incident angles each have a linear dependence on RI and temperature, and are mutually independent. A proof-of-concept experiment using solutions of different NaCl concentrations as analytes demonstrates the ability of the technique. The optical design is as simple and robust as conventional SPR detection, but provides a way to discriminate between RI-induced and temperature-induced SPR changes. This technique enables traditional SPR sensors to detect RI in different temperature environments, and may lead to better design and fabrication of SPR sensors against temperature variation.
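The decoupling amounts to inverting a 2x2 sensitivity matrix. A minimal sketch follows; the calibration coefficients are hypothetical stand-ins for values the instrument calibration would supply.

```python
import numpy as np

# Each incident angle yields a signal that is linear in refractive-index
# change (dn) and temperature change (dT):
#   s1 = a11*dn + a12*dT
#   s2 = a21*dn + a22*dT
# With two independent angles, the 2x2 system is invertible.
A = np.array([[1.2e4, -2.0e-2],     # hypothetical sensitivities, angle 1
              [7.5e3, -3.5e-2]])    # hypothetical sensitivities, angle 2

def decouple(s1, s2):
    """Recover (dn, dT) from the two SPR output signals."""
    dn, dT = np.linalg.solve(A, np.array([s1, s2]))
    return dn, dT

# Forward-simulate a measurement and invert it back.
dn_true, dT_true = 1.5e-4, 0.8      # RI change and temperature change (K)
s = A @ np.array([dn_true, dT_true])
print(decouple(*s))                  # -> (1.5e-04, 0.8)
```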
One Controller at a Time (1-CAT): A mimo design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.
Numerical realization of the variational method for generating self-trapped beams.
Duque, Erick I; Lopez-Aguayo, Servando; Malomed, Boris A
2018-03-19
We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation, namely the need to perform the Lagrangian integration and differentiation analytically. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating the robustness of beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.
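The essence of the method, replacing analytical Lagrangian integration with numerical quadrature over a parameterized ansatz, can be sketched in one dimension, where the exact cubic-NLS soliton is known for comparison. The paper treats two-dimensional beams; this reduction is illustrative only.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Rayleigh-Ritz with numerical quadrature for the 1D cubic nonlinear
# Schroedinger equation: minimize H[u] = Int(|u'|^2/2 - u^4/2) dx at
# fixed power P = Int u^2 dx, over a Gaussian trial profile.
x = np.linspace(-20, 20, 4001)
P0 = 2.0                                  # exact soliton u = sech(x) has P = 2

def energy(w):
    """Energy of a power-normalized Gaussian ansatz of width w."""
    A = np.sqrt(P0 / (w * np.sqrt(np.pi)))        # enforces Int u^2 = P0
    u = A * np.exp(-x**2 / (2 * w**2))
    du = np.gradient(u, x)                        # numerical derivative
    return np.trapz(0.5 * du**2 - 0.5 * u**4, x)  # numerical Lagrangian

res = minimize_scalar(energy, bounds=(0.3, 10.0), method="bounded")
print(f"optimal width w = {res.x:.3f}, H = {res.fun:.4f} (exact soliton: H = -1/3)")
```

No integral was evaluated analytically; the same recipe extends to 2D grids and to vortex or multipole ansatz shapes, which is the point of the paper's approach.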
NASA Astrophysics Data System (ADS)
Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.
2017-12-01
Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).
NASA Astrophysics Data System (ADS)
Kaus, B.; Popov, A.
2015-12-01
The analytical expression for the Jacobian is a key component for achieving fast and robust convergence of the nonlinear Newton-Raphson iterative solver. Accomplishing this task in practice often requires a significant algebraic effort, so it is quite common to use a cheap alternative instead, for example approximating the Jacobian with a finite difference estimation. Despite its simplicity, this is a relatively fragile and unreliable technique that is sensitive to the scaling of the residual and unknowns, as well as to the perturbation parameter selection. Unfortunately, no universal rule can be applied to provide both a robust scaling and a robust perturbation. The approach we use here is to derive the analytical Jacobian for the coupled set of momentum, mass, and energy conservation equations together with the elasto-visco-plastic rheology and a marker-in-cell/staggered finite difference method. The software project LaMEM (Lithosphere and Mantle Evolution Model) is primarily developed for thermo-mechanically coupled modeling of 3D lithospheric deformation. The code is based on a staggered-grid finite difference discretization in space, and uses customized scalable solvers from the PETSc library to run efficiently on massively parallel machines (such as IBM Blue Gene/Q). Currently LaMEM relies on the Jacobian-Free Newton-Krylov (JFNK) nonlinear solver, which approximates the Jacobian-vector product using a simple finite difference formula. This approach never requires an assembled Jacobian matrix and uses only the residual computation routine. We use an approximate Jacobian (Picard) matrix to precondition the Krylov solver with the Galerkin geometric multigrid. Because of the inherent problems of finite difference Jacobian estimation, this approach does not always result in stable convergence. In this work we present and discuss a matrix-free technique in which the Jacobian-vector product is replaced by analytically-derived expressions, and we compare results with those obtained with a finite difference approximation of the Jacobian. This project is funded by ERC Starting Grant 258830 and computer facilities were provided by the Jülich supercomputer center (Germany).
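The fragility of the finite-difference Jacobian-vector product, and what an analytical product buys, can be demonstrated on a toy residual. A minimal Python sketch follows; the residual function and parameter values are illustrative, not LaMEM's equations.

```python
import numpy as np

# Matrix-free Jacobian-vector products for Newton-Krylov solvers.
# JFNK approximates J(x) v by a one-sided finite difference, which is
# sensitive to the perturbation size eps; an analytical product is not.
def residual(x):
    """Toy nonlinear residual standing in for discretized conservation laws."""
    return np.array([x[0]**2 + x[1] - 3.0,
                     np.exp(x[0]) - 4.0 * x[1]])

def jv_analytic(x, v):
    J = np.array([[2.0 * x[0], 1.0],
                  [np.exp(x[0]), -4.0]])     # hand-derived Jacobian
    return J @ v

def jv_fd(x, v, eps):
    return (residual(x + eps * v) - residual(x)) / eps

x = np.array([1.0, 0.5])
v = np.array([1.0, -1.0])
exact = jv_analytic(x, v)
for eps in (1e-1, 1e-7, 1e-14):              # too big, good, too small
    err = np.linalg.norm(jv_fd(x, v, eps) - exact)
    print(f"eps = {eps:.0e}:  |J_fd v - J v| = {err:.2e}")
```

The large-eps error is truncation; the tiny-eps error is round-off amplification, and no single eps is safe across poorly scaled residuals, which is the motivation for the analytically-derived product.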
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality-by-design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).
A novel modification of the Turing test for artificial intelligence and robotics in healthcare.
Ashrafian, Hutan; Darzi, Ara; Athanasiou, Thanos
2015-03-01
The increasing demands of delivering higher quality global healthcare have resulted in a corresponding expansion in the development of computer-based and robotic healthcare tools that rely on artificially intelligent technologies. The Turing test was designed to assess artificial intelligence (AI) in computer technology, and it remains an important qualitative tool for testing the next generation of medical diagnostics and medical robotics. We develop quantifiable diagnostic-accuracy meta-analytical evaluation techniques for the Turing test paradigm, and modify the Turing test to offer quantifiable diagnostic precision and statistical effect-size robustness in the assessment of AI for computer-based and robotic healthcare technologies. Such a modification of the Turing test, offering robust diagnostic scores for AI, can contribute to enhancing and refining the next generation of digital diagnostic technologies and healthcare robotics. Copyright © 2014 John Wiley & Sons, Ltd.
Aeroservoelastic Uncertainty Model Identification from Flight Data
NASA Technical Reports Server (NTRS)
Brenner, Martin J.
2001-01-01
Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.
Use of pressure manifestations following the water plasma expansion for phytomass disintegration.
Maroušek, Josef; Kwan, Jason Tai Hong
2013-01-01
A prototype capable of generating underwater high-voltage discharges (3.5 kV) coupled with water plasma expansion was constructed. The level of phytomass disintegration caused by transmission of the pressure shockwaves (50-60 MPa) following this expansion was analyzed using gas adsorption techniques. The dynamics of the external surface area and the micropore volume over multiple pretreatment stages of maize silage and sunflower seeds were approximated with robust analytical techniques. The multiplied reaction surface was manifest in up to a 15% increase in cumulative methane production, which was itself reflected in an overall acceleration of the anaerobic fermentation process. Disintegration of the sunflower seeds allowed up to 45% higher oil yields at the same operating pressure.
Nonlinear truncation error analysis of finite difference schemes for the Euler equations
NASA Technical Reports Server (NTRS)
Klopfer, G. H.; Mcrae, D. S.
1983-01-01
It is pointed out that, in general, dissipative finite difference integration schemes have been found to be quite robust when applied to the Euler equations of gas dynamics. The present investigation considers a modified equation analysis of both implicit and explicit finite difference techniques as applied to the Euler equations. The analysis is used to identify those error terms which contribute most to the observed solution errors. A technique for analytically removing the dominant error terms is demonstrated, resulting in a greatly improved solution for the explicit Lax-Wendroff schemes. It is shown that the nonlinear truncation errors are quite large and distributed quite differently for each of the three conservation equations as applied to a one-dimensional shock tube problem.
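A minimal sketch of the kind of scheme analyzed, Lax-Wendroff for linear advection, whose modified equation carries a leading dispersive error term proportional to u_xxx; the grid, Courant number, and pulse below are illustrative assumptions.

```python
import numpy as np

# Lax-Wendroff for the linear advection equation u_t + a u_x = 0.
# Modified-equation analysis shows the leading truncation error is the
# dispersive term (a dx^2 / 6)(nu^2 - 1) u_xxx, with nu = a dt / dx,
# which produces the familiar trailing oscillations behind sharp features.
a, L, N = 1.0, 1.0, 400
dx = L / N
nu = 0.8                                   # Courant number a dt / dx
dt = nu * dx / a
x = np.arange(N) * dx
u = np.exp(-200 * (x - 0.25) ** 2)         # initial Gaussian pulse

steps = int(0.5 / dt)                      # advect half the domain
for _ in range(steps):
    up, um = np.roll(u, -1), np.roll(u, 1)                 # periodic neighbors
    u = u - 0.5 * nu * (up - um) + 0.5 * nu**2 * (up - 2 * u + um)

# Exact solution on the periodic domain, using the shortest wrapped distance.
d = (x - 0.25 - a * steps * dt + L / 2) % L - L / 2
exact = np.exp(-200 * d ** 2)
print(f"L2 error after {steps} steps: {np.linalg.norm(u - exact) * np.sqrt(dx):.2e}")
```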
Principles and Applications of Liquid Chromatography-Mass Spectrometry in Clinical Biochemistry
Pitt, James J
2009-01-01
Liquid chromatography-mass spectrometry (LC-MS) is now a routine technique with the development of electrospray ionisation (ESI) providing a simple and robust interface. It can be applied to a wide range of biological molecules and the use of tandem MS and stable isotope internal standards allows highly sensitive and accurate assays to be developed although some method optimisation is required to minimise ion suppression effects. Fast scanning speeds allow a high degree of multiplexing and many compounds can be measured in a single analytical run. With the development of more affordable and reliable instruments, LC-MS is starting to play an important role in several areas of clinical biochemistry and compete with conventional liquid chromatography and other techniques such as immunoassay. PMID:19224008
Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods
Punshon, Tracy
2015-01-01
Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and the use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well established, but the push for ever lower detection limits requires that methods be robust against potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation, and a range of methods now exist for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal and plant-based foods, in particular for arsenic, cadmium and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012
Dillon, C R; Borasi, G; Payne, A
2016-01-01
For thermal modeling to play a significant role in treatment planning, monitoring, and control of magnetic resonance-guided focused ultrasound (MRgFUS) thermal therapies, accurate knowledge of ultrasound and thermal properties is essential. This study develops a new analytical solution for the temperature change observed in MRgFUS which can be used with experimental MR temperature data to provide estimates of the ultrasound initial heating rate, Gaussian beam variance, tissue thermal diffusivity, and Pennes perfusion parameter. Simulations demonstrate that this technique provides accurate and robust property estimates that are independent of the beam size, thermal diffusivity, and perfusion levels in the presence of realistic MR noise. The technique is also demonstrated in vivo using MRgFUS heating data in rabbit back muscle. Errors in property estimates are kept less than 5% by applying a third order Taylor series approximation of the perfusion term and ensuring the ratio of the fitting time (the duration of experimental data utilized for optimization) to the perfusion time constant remains less than one. PMID:26741344
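The property-estimation idea, fitting an analytical temperature model to MR thermometry data, can be sketched with a simplified stand-in model. The peak-decay form below is an assumption for illustration, not the authors' exact solution, and the parameter values are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed stand-in model: after a short focused-ultrasound pulse deposits a
# Gaussian temperature profile, the peak temperature decays by diffusion
# (diffusion time tau_d = sigma^2 / (2 D)) and Pennes perfusion (rate w):
#   dT_peak(t) = dT0 / (1 + t / tau_d) * exp(-w t)
def peak_decay(t, dT0, tau_d, w):
    return dT0 / (1.0 + t / tau_d) * np.exp(-w * t)

# Synthetic "MR thermometry" data with realistic noise (~0.3 degC).
rng = np.random.default_rng(2)
t = np.linspace(0.5, 30.0, 60)              # seconds after the pulse
truth = (10.0, 14.3, 0.01)                  # degC, s, 1/s
data = peak_decay(t, *truth) + rng.normal(0, 0.3, t.size)

p0 = (8.0, 10.0, 0.02)                      # rough initial guess
popt, _ = curve_fit(peak_decay, t, data, p0=p0)
print("estimated (dT0, tau_d, w):", popt)
# With the beam variance sigma^2 estimated from the spatial temperature
# profile, the thermal diffusivity follows as D = sigma^2 / (2 * tau_d).
```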
Analytical methods for determination of mycotoxins: An update (2009-2014).
Turner, Nicholas W; Bramhmbhatt, Heli; Szabo-Vezse, Monika; Poma, Alessandro; Coker, Raymond; Piletsky, Sergey A
2015-12-11
Mycotoxins are a problematic and toxic group of small organic molecules that are produced as secondary metabolites by several fungal species that colonise crops. They lead to contamination at both the field and postharvest stages of food production with a considerable range of foodstuffs affected, from coffee and cereals, to dried fruit and spices. With wide ranging structural diversity of mycotoxins, severe toxic effects caused by these molecules and their high chemical stability the requirement for robust and effective detection methods is clear. This paper builds on our previous review and summarises the most recent advances in this field, in the years 2009-2014 inclusive. This review summarises traditional methods such as chromatographic and immunochemical techniques, as well as newer approaches such as biosensors, and optical techniques which are becoming more prevalent. A section on sampling and sample treatment has been prepared to highlight the importance of this step in the analytical methods. We close with a look at emerging technologies that will bring effective and rapid analysis out of the laboratory and into the field. Copyright © 2015 Elsevier B.V. All rights reserved.
Zhang, Zhiyong; Zhao, Dishun; Xu, Baoyun
2013-01-01
A simple and rapid method is described for the analysis of glyoxal and related substances by high-performance liquid chromatography with a refractive index detector. The following chromatographic conditions were adopted: Aminex HPX-87H column, mobile phase consisting of 0.01N H2SO4, flow rate of 0.8 mL/min and temperature of 65°C. The application of the analytical technique developed in this study demonstrated that the aqueous reaction mixture produced by the oxidation of acetaldehyde with HNO3 was composed of glyoxal, acetaldehyde, acetic acid, formic acid, glyoxylic acid, oxalic acid, butanedione and glycolic acid. The method was validated by evaluating analytical parameters such as linearity, limits of detection and quantification, precision, recovery and robustness. The proposed methodology was successfully applied to the production of glyoxal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barran, Perdita; Baker, Erin
The great complexity of biological systems and their environment poses similarly vast challenges for accurate analytical evaluations of their identity, structure and quantity. Post-genomic science has predicted much regarding the static populations of biological systems, but a further challenge for analysis is to test the accuracy of these predictions, as well as provide a better representation of the transient nature of the molecules of life. Accurate measurements of biological systems have wide applications for biological, forensic, biotechnological and healthcare fields. Therefore, the holy grail is to find a technique which can identify and quantify biological molecules with high throughput, sensitivity and robustness, as well as evaluate molecular structure(s) in order to understand how the specific molecules interact and function. While wrapping all of these characteristics into one platform may sound difficult, ion mobility spectrometry (IMS) is addressing all of these challenges. Over the last decade, the number of analytical studies utilizing IMS for the evaluation of complex biological and environmental samples has greatly increased. In most cases IMS is coupled with mass spectrometry (IM-MS), but even alone IMS provides the unique capability of rapidly assessing a molecule's structure, which can be extremely difficult with other techniques. The robustness of the IMS measurement is borne out by its widespread use in security, environmental and military applications. Multidimensional IM-MS measurements, however, have proven even more powerful when applied to complex mixtures, as they enable the evaluation of both the structure and mass of every molecular component in a sample during a single measurement, without the need for continual reference calibration.
Jaswal, Brij Bir S; Kumar, Vinay; Sharma, Jitendra; Rai, Pradeep K; Gondal, Mohammed A; Gondal, Bilal; Singh, Vivek K
2016-04-01
Laser-induced breakdown spectroscopy (LIBS) is an emerging analytical technique with numerous advantages such as rapidity, multi-elemental analysis, no specific sample preparation requirements, non-destructiveness, and versatility. It has proven to be a robust elemental analysis tool and attracts interest because it can be applied to a wide range of materials, including biomaterials. In this paper, we have performed spectroscopic studies on gallstones, which are heterogeneous in nature, using LIBS and wavelength dispersive X-ray fluorescence (WD-XRF) techniques. It has been observed that the presence and relative concentrations of trace elements in different kinds of gallstones (cholesterol and pigment gallstones) can easily be determined using the LIBS technique. From the experiments carried out on gallstones for trace elemental mapping and detection, it was found that LIBS is a robust tool for such biomedical applications. The stone samples studied in the present paper were classified using Fourier transform infrared (FTIR) spectroscopy. WD-XRF spectroscopy was applied for the qualitative and quantitative analysis of major and trace elements present in the gallstones, and the results were compared with the LIBS data. The results obtained in the present paper show interesting prospects for LIBS and WD-XRF to study cholelithiasis better.
2D-Visualization of metabolic activity with planar optical chemical sensors (optodes)
NASA Astrophysics Data System (ADS)
Meier, R. J.; Liebsch, G.
2015-12-01
Microbial life plays an outstandingly important role in many hydrologic compartments, for example the benthic community in sediments and biologically active microorganisms in the capillary fringe, groundwater, and soil. Oxygen, pH, and CO2 are key factors and indicators for microbial activity. They can be measured using optical chemical sensors. These sensors record changing fluorescence properties of specific indicator dyes. The signals can be measured in a non-contact mode, even through transparent walls, which is important for many lab experiments. They can measure in closed (transparent) systems, without sampling or intruding into the sample. They do not consume the analytes while measuring, are fully reversible, and are able to measure in non-stirred solutions. These sensors can be applied as high-precision fiberoptic sensors (for profiling), robust sensor spots, or planar sensors for 2D visualization (imaging). Imaging enables the detection of thousands of measurement spots at the same time and generates 2D analyte maps over a region of interest. It allows for comparing different regions within one recorded image, visualizing spatial analyte gradients, or, more importantly, identifying hot spots of metabolic activity. We present ready-to-use portable imaging systems for the analytes oxygen, pH, and CO2. They consist of a detector unit, planar sensor foils, and software for easy data recording and evaluation. Sensor foils for various analytes and measurement ranges enable visualizing metabolic activity or analyte changes in the desired range. Dynamics of metabolic activity can be detected in one shot or over long time periods. We demonstrate the potential of this analytical technique by presenting experiments on benthic disturbance-recovery dynamics in sediments and microbial degradation of organic material in the capillary fringe. We think this technique is a new tool to further understand how microbial and geochemical processes are linked in (not solely) hydrologic systems.
Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Drouhard, Margaret MEG G; Beaver, Justin M
Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.
Considerations in detecting CDC select agents under field conditions
NASA Astrophysics Data System (ADS)
Spinelli, Charles; Soelberg, Scott; Swanson, Nathaneal; Furlong, Clement; Baker, Paul
2008-04-01
Surface Plasmon Resonance (SPR) has become a widely accepted technique for real-time detection of interactions between receptor molecules and ligands. Antibody may serve as receptor and can be attached to the gold surface of the SPR device, while candidate analyte fluids contact the detecting antibody. Minute, but detectable, changes in refractive indices (RI) indicate that analyte has bound to the antibody. A decade ago, an inexpensive, robust, miniature and fully integrated SPR chip, called SPREETA, was developed. University of Washington (UW) researchers subsequently developed a portable, temperature-regulated instrument, called SPIRIT, to simultaneously use eight of these three-channel SPREETA chips. A SPIRIT prototype instrument was tested in the field, coupled to a remote reporting system on a surrogate unmanned aerial vehicle (UAV). Two target protein analytes were released sequentially as aerosols at low analyte concentration during each of three flights and were successfully detected and verified. Laboratory experimentation with a more advanced SPIRIT instrument demonstrated detection of very low levels of several select biological agents that might be employed by bioterrorists. Agent detection under field-like conditions is more challenging, especially as analyte concentrations are reduced and complex matrices are introduced. Two different sample preconditioning protocols have been developed for select agents in complex matrices. Use of these preconditioning techniques has allowed laboratory detection in spiked heavy mud of Francisella tularensis at 10³ CFU/ml, Bacillus anthracis spores at 10³ CFU/ml, Staphylococcal enterotoxin B (SEB) at 1 ng/ml, and Vaccinia virus (a smallpox simulant) at 10⁵ PFU/ml. Ongoing experiments are aimed at simultaneous detection of multiple agents in spiked heavy mud, using a multiplex preconditioning protocol.
NASA Astrophysics Data System (ADS)
Theis, L. S.; Motzoi, F.; Wilhelm, F. K.
2016-01-01
We present a few-parameter ansatz for pulses to implement a broad set of simultaneous single-qubit rotations in frequency-crowded multilevel systems. Specifically, we consider a system of two qutrits whose working and leakage transitions suffer from spectral crowding (detuned by δ). In order to achieve precise controllability, we make use of two driving fields (each having two quadratures) at two different tones to simultaneously apply arbitrary combinations of rotations about axes in the X-Y plane to both qubits. Expanding the waveforms in terms of Hanning windows, we show how analytic pulses containing smooth and composite-pulse features can easily achieve gate errors less than 10⁻⁴ and considerably outperform known adiabatic techniques. Moreover, we find a generalization of the WAHWAH (Weak AnHarmonicity With Average Hamiltonian) method by Schutjens et al. [R. Schutjens, F. A. Dagga, D. J. Egger, and F. K. Wilhelm, Phys. Rev. A 88, 052330 (2013)], 10.1103/PhysRevA.88.052330 that allows precise separate single-qubit rotations for all gate times beyond a quantum speed limit. We find in all cases a quantum speed limit slightly below 2π/δ for the gate time and show that our pulses are robust against variations in system parameters and filtering due to transfer functions, making them suitable for experimental implementations.
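The few-parameter ansatz expands each drive quadrature in Hanning-window harmonics, which vanish smoothly at the pulse edges. A minimal sketch follows; the coefficients and gate time are illustrative placeholders, not optimized gate parameters.

```python
import numpy as np

# Control-pulse envelope built from a few Hanning-window harmonics:
#   pulse(t) = sum_n c_n * (1 - cos(2 pi n t / t_gate)) / 2
# Each term is zero (with zero slope) at t = 0 and t = t_gate.
def hanning_pulse(t, t_gate, coeffs):
    envelope = np.zeros_like(t)
    for n, c in enumerate(coeffs, start=1):
        envelope += c * 0.5 * (1.0 - np.cos(2.0 * np.pi * n * t / t_gate))
    return envelope

t_gate = 20.0                          # ns, illustrative gate time
t = np.linspace(0.0, t_gate, 501)
x_quad = hanning_pulse(t, t_gate, [0.8, -0.15, 0.05])    # in-phase drive
y_quad = hanning_pulse(t, t_gate, [0.0,  0.10, -0.02])   # quadrature drive
print("endpoints:", x_quad[0], x_quad[-1])   # both zero by construction
```

Because only a handful of coefficients per quadrature enter, a numerical optimizer can tune them against a gate-error cost, which is the sense in which the ansatz is "few-parameter".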
NASA Astrophysics Data System (ADS)
Hou, Liqiang; Cai, Yuanli; Liu, Jin; Hou, Chongyuan
2016-04-01
A variable-fidelity robust optimization method for pulsed-laser orbital debris removal (LODR) under uncertainty is proposed. Dempster-Shafer theory of evidence (DST), which merges interval-based and probabilistic uncertainty modeling, is used in the robust optimization. The method optimizes the performance while at the same time maximizing its belief value. A population-based multi-objective optimization (MOO) algorithm based on a steepest-descent-like strategy with proper orthogonal decomposition (POD) is used to search for robust Pareto solutions. Analytical and numerical lifetime predictors are used to evaluate the debris lifetime after the laser pulses. Trust-region-based fidelity management is designed to reduce the computational cost caused by the expensive model: when solutions fall into the trust region, the analytical model is used. The proposed robust optimization method is first tested on a set of standard problems and then applied to the removal of Iridium 33 with pulsed lasers. It is shown that the proposed approach can identify the most robust solutions with minimum lifetime under uncertainty.
Dissolution of aerosol particles collected from nuclear facility plutonium production process
Xu, Ning; Martinez, Alexander; Schappert, Michael Francis; ...
2015-08-14
Here, a simple, robust analytical chemistry method has been developed to dissolve plutonium containing particles in a complex matrix. The aerosol particles collected on Marple cascade impactor substrates were shown to be dissolved completely with an acid mixture of 12 M HNO 3 and 0.1 M HF. A pressurized closed vessel acid digestion technique was utilized to heat the samples at 130 °C for 16 h to facilitate the digestion. The dissolution efficiency for plutonium particles was 99 %. The resulting particle digestate solution was suitable for trace elemental analysis and isotope composition determination, as well as radiochemistry measurements.
A direct method for nonlinear ill-posed problems
NASA Astrophysics Data System (ADS)
Lakhal, A.
2018-02-01
We propose a direct method for solving nonlinear ill-posed problems in Banach-spaces. The method is based on a stable inversion formula we explicitly compute by applying techniques for analytic functions. Furthermore, we investigate the convergence and stability of the method and prove that the derived noniterative algorithm is a regularization. The inversion formula provides a systematic sensitivity analysis. The approach is applicable to a wide range of nonlinear ill-posed problems. We test the algorithm on a nonlinear problem of travel-time inversion in seismic tomography. Numerical results illustrate the robustness and efficiency of the algorithm.
Simplex-stochastic collocation method with improved scalability
NASA Astrophysics Data System (ADS)
Edeling, W. N.; Dwight, R. P.; Cinnella, P.
2016-04-01
The Simplex-Stochastic Collocation (SSC) method is a robust tool used to propagate uncertain input distributions through a computer code. However, it becomes prohibitively expensive for problems with dimensions higher than 5. The main purpose of this paper is to identify bottlenecks, and to improve upon this bad scalability. In order to do so, we propose an alternative interpolation stencil technique based upon the Set-Covering problem, and we integrate the SSC method in the High-Dimensional Model-Reduction framework. In addition, we address the issue of ill-conditioned sample matrices, and we present an analytical map to facilitate uniformly-distributed simplex sampling.
NASA Technical Reports Server (NTRS)
Mukhopadhyay, V.
1988-01-01
A generic procedure for the parameter optimization of a digital control law for a large-order flexible flight vehicle or large space structure modeled as a sampled-data system is presented. A linear quadratic Gaussian type cost function was minimized, while satisfying a set of constraints on the steady-state rms values of selected design responses, using a constrained optimization technique to meet multiple design requirements. Analytical expressions for the gradients of the cost function and the design constraints on mean square responses with respect to the control law design variables are presented.
The Feasibility of Using a Galvanic Cell Array for Corrosion Detection and Solution Monitoring
NASA Technical Reports Server (NTRS)
Kolody, Mark; Calle, Luz-Marina; Zeitlin, Nancy P. (Technical Monitor)
2003-01-01
An initial investigation into the response of the individual galvanic couples was conducted using potentiodynamic polarization measurements of solutions under conditions of varying corrosivity. It is hypothesized that the differing electrodes may provide a means to further investigate the corrosive nature of the analyte through genetic algorithms and pattern recognition techniques. The robust design of the electrochemical sensor makes its utilization in space exploration particularly attractive. Since the electrodes are fired on a ceramic substrate at 900 °C, they may be among the most rugged sensors available for the anticipated usage.
Biologically inspired technologies using artificial muscles
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2005-01-01
One of the newest fields of biomimetics is that of electroactive polymers (EAP), also known as artificial muscles. To take advantage of these materials, efforts are being made worldwide to establish a strong infrastructure addressing the need for comprehensive analytical modeling of their response mechanisms and to develop effective processing and characterization techniques. The field is still in its emerging state and robust materials are not yet readily available; however, in recent years significant progress has been made and commercial products have already started to appear. This paper covers the current state-of-the-art and the challenges to making artificial muscles, as well as their potential biomimetic applications.
Impact of pesticide use by smallholder farmers on water quality in the Wakiso District, Uganda
NASA Astrophysics Data System (ADS)
Oltramare, Christelle; Weiss, Frederik T.; Atuhaire, Aggrey; Staudacher, Philipp; Niwagaba, Charles; Stamm, Christian
2017-04-01
As in many tropical countries, farmers of the Wakiso District rely on heavy use of pesticides to protect crops and animals. This may impair human and environmental health due to poor application techniques, misuse of pesticide bins, or diffuse pesticide losses from the treated fields during intense tropical rainstorms. The extent of pollution in the different environmental compartments, however, is generally only poorly documented. The same holds true for quantitative data on the relevance of different transport pathways of pesticides into the environment. Part of the limited knowledge is caused by the demanding sampling and analytical techniques that are necessary to obtain robust data on the actual pollution status. Especially in surface waters, pesticide concentrations may vary rapidly in time, such that grab samples may yield a very incomplete picture. This incompleteness has often been compounded by limited analytical windows that covered only a small fraction of the pesticides actually used. In this presentation, we describe an approach to overcome these limitations to a large extent by using three different passive sampling devices and two broad analytical techniques (GC-MS/MS, LC-HR-MS) that allow the quantification of about 260 different pesticides. We will present how these approaches are implemented in the catchment area of the Wakiso District in Uganda. This area is intensively used by smallholder farmers who grow a large set of different crops. Diffuse losses are expected to occur mainly during the two rainy seasons (March to May and September to November). Accordingly, the study will focus on this situation.
Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.
2016-11-01
Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
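The transition-selection step can be sketched as a binary genetic algorithm. The following minimal Python example uses synthetic data and a squared-correlation fitness, an illustrative stand-in for the paper's agreement measure between targeted and comprehensive quantification.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: targeted (reference) quantities for one peptide across samples,
# plus candidate precursor-to-fragment transition intensities extracted from
# comprehensive (DIA/SWATH-style) runs. Some transitions are clean
# (proportional to the targeted value); others carry interference.
n_samples, n_trans = 40, 12
targeted = rng.uniform(1, 100, n_samples)
clean = rng.random(n_trans) < 0.5
dia = np.outer(targeted, rng.uniform(0.5, 1.5, n_trans))
dia[:, ~clean] += rng.uniform(20, 80, (n_samples, int((~clean).sum())))

def fitness(mask):
    """Squared Pearson correlation between targeted values and the summed
    selected transitions; empty or degenerate selections score zero."""
    if not mask.any():
        return 0.0
    r = np.corrcoef(targeted, dia[:, mask].sum(axis=1))[0, 1]
    return 0.0 if np.isnan(r) else r ** 2

pop = rng.random((60, n_trans)) < 0.5            # random initial population
best, best_r2 = pop[0].copy(), -1.0
for gen in range(80):
    scores = np.array([fitness(m) for m in pop])
    if scores.max() > best_r2:                   # keep the best-ever selection
        best_r2, best = scores.max(), pop[scores.argmax()].copy()
    parents = pop[np.argsort(scores)[-30:]]      # truncation selection
    pairs = rng.integers(0, 30, (60, 2))
    cuts = rng.integers(1, n_trans, 60)
    pop = np.array([np.r_[parents[i][:c], parents[j][c:]]  # 1-point crossover
                    for (i, j), c in zip(pairs, cuts)])
    pop ^= rng.random(pop.shape) < 0.02          # bit-flip mutation

print("selected transitions:", np.flatnonzero(best), " R^2 =", round(best_r2, 4))
```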
High throughput-screening of animal urine samples: It is fast but is it also reliable?
Kaufmann, Anton
2016-05-01
Advanced analytical technologies like ultra-high-performance liquid chromatography coupled to high-resolution mass spectrometry can be used for veterinary drug screening of animal urine. The technique is sufficiently robust and reliable to detect veterinary drugs in urine samples of animals where the maximum residue limit of these compounds in organs like muscle, kidney, or liver has been exceeded. The limitations and possibilities of the technique are discussed. The most critical point is the variability of the drug concentration ratio between the tissue and urine. Ways to manage false positives and false negatives are discussed. The capability to confirm findings and the possibility of semi-targeted analysis are also addressed. Copyright © 2016 John Wiley & Sons, Ltd.
Chang, Ching-Min; Lo, Yu-Lung; Tran, Nghia-Khanh; Chang, Yu-Jen
2018-03-20
A method is proposed for characterizing the optical properties of articular cartilage sliced from a pig's thighbone using a Stokes-Mueller polarimetry technique. The principal axis angle, phase retardance, optical rotation angle, circular diattenuation, diattenuation axis angle, linear diattenuation, and depolarization index properties of the cartilage sample are all decoupled in the proposed analytical model. Consequently, the accuracy and robustness of the extracted results are improved. The glucose concentration, collagen distribution, and scattering properties of samples from various depths of the articular cartilage are systematically explored via an inspection of the related parameters. The results show that the glucose concentration and scattering effect are both enhanced in the superficial region of the cartilage. By contrast, the collagen density increases with an increasing sample depth.
Human motion planning based on recursive dynamics and optimal control techniques
NASA Technical Reports Server (NTRS)
Lo, Janzen; Huang, Gang; Metaxas, Dimitris
2002-01-01
This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (super-linear convergence) is implemented to solve minimum torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural looking and physically correct human motions for a variety of human motion tasks involving open and closed loop kinematic chains.
Counterfeit drugs: analytical techniques for their identification.
Martino, R; Malet-Martino, M; Gilard, V; Balayssac, S
2010-09-01
In recent years, the number of counterfeit drugs has increased dramatically, including not only "lifestyle" products but also vital medicines. Besides the threat to public health, the financial and reputational damage to pharmaceutical companies is substantial. The lack of robust information on the prevalence of fake drugs is an obstacle in the fight against drug counterfeiting. It is generally accepted that approximately 10% of drugs worldwide could be counterfeit, but it is also well known that this number covers very different situations depending on the country, the places where the drugs are purchased, and the definition of what constitutes a counterfeit drug. The chemical analysis of drugs suspected to be fake is a crucial step as counterfeiters are becoming increasingly sophisticated, rendering visual inspection insufficient to distinguish the genuine products from the counterfeit ones. This article critically reviews the recent analytical methods employed to control the quality of drug formulations, using as an example artemisinin derivatives, medicines particularly targeted by counterfeiters. Indeed, a broad panel of techniques has been reported for their analysis, ranging from simple and cheap in-field ones (colorimetry and thin-layer chromatography) to more advanced laboratory methods (mass spectrometry, nuclear magnetic resonance, and vibrational spectroscopies) through chromatographic methods, which remain the most widely used. The conclusion section of the article highlights the questions to be posed before selecting the most appropriate analytical approach.
Dave, Vivek S; Shahin, Hend I; Youngren-Ortiz, Susanne R; Chougule, Mahavir B; Haware, Rahul V
2017-10-30
The density, porosity, breaking force, viscoelastic properties, and the presence or absence of any structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms like tablets. The irregularities associated with these attributes may influence the drug product functionality. Thus, an accurate and efficient characterization of these properties is critical for successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods. Such methods are associated with several challenges such as lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for non-invasive characterization of physical-mechanical properties of tablets. These techniques include near infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of development and manufacturing of tablets at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost, compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools to support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development in the instrumentation and software, and studies on the applications are necessary for their adoption in routine analysis and monitoring of tablet physical-mechanical properties. Copyright © 2017 Elsevier B.V. All rights reserved.
Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini
2013-01-01
Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data, the challenges raised by big data, and rapidly increasing service demands in today's healthcare contexts require a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes.
Advances in Molecular Rotational Spectroscopy for Applied Science
NASA Astrophysics Data System (ADS)
Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.
2017-06-01
Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy as a research tool into applied science. Molecular rotational spectroscopy is most familiar in the context of air analysis. Those techniques are included in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases with parts-per-trillion detection limits in solid materials, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.
NASA Astrophysics Data System (ADS)
van der Bilt, Willem; Bakke, Jostein; Werner, Johannes; Paasche, Øyvind; Rosqvist, Gunhild
2016-04-01
The collapse of ice shelves, rapidly retreating glaciers and a dramatic recent temperature increase show that Southern Ocean climate is rapidly shifting. Also, instrumental and modelling data demonstrate transient interactions between oceanic and atmospheric forcings as well as climatic teleconnections with lower-latitude regions. Yet beyond the instrumental period, a lack of proxy climate time series impedes our understanding of Southern Ocean climate. Moreover, available records often lack the resolution and chronological control required to resolve rapid climate shifts like those observed at present. Alpine glaciers are found on most Southern Ocean islands and quickly respond to shifts in climate through changes in mass balance. Attendant changes in glacier size drive variations in the production of rock flour, the suspended product of glacial erosion. This climate response may be captured by downstream distal glacier-fed lakes, continuously recording glacier history. Sediment records from such lakes are considered prime sources for paleoclimate reconstructions. Here, we present the first reconstruction of Late Holocene glacier variability from the island of South Georgia. Using a toolbox of advanced physical, geochemical (XRF) and magnetic proxies, in combination with state-of-the-art numerical techniques, we fingerprinted a glacier signal from glacier-fed lake sediments. This lacustrine sediment signal was subsequently calibrated against mapped glacier extent with the help of geomorphological moraine evidence and remote sensing techniques. The outlined approach enabled us to robustly resolve variations of a complex glacier at sub-centennial timescales, while constraining the sedimentological imprint of other geomorphic catchment processes. From a paleoclimate perspective, our reconstruction reveals a dynamic Late Holocene climate, modulated by long-term shifts in regional circulation patterns. We also find evidence for rapid medieval glacier retreat as well as a synchronous bi-polar Little Ice Age (LIA). In conclusion, our work shows the potential of novel analytical and numerical tools to improve the robustness and resolution of lake sediment-based paleoclimate reconstructions beyond the current state-of-the-art.
Model-based ultrasound temperature visualization during and following HIFU exposure.
Ye, Guoliang; Smith, Penny Probert; Noble, J Alison
2010-02-01
This paper describes the application of signal processing techniques to improve the robustness of ultrasound feedback for displaying changes in temperature distribution in treatment using high-intensity focused ultrasound (HIFU), especially at the low signal-to-noise ratios that might be expected in in vivo abdominal treatment. Temperature estimation is based on the local displacements in ultrasound images taken during HIFU treatment, and a method to improve robustness to outliers is introduced. The main contribution of the paper is in the application of a Kalman filter, a statistical signal processing technique, which uses a simple analytical temperature model of heat dispersion to improve the temperature estimation from the ultrasound measurements during and after HIFU exposure. To reduce the sensitivity of the method to previous assumptions on the material homogeneity and signal-to-noise ratio, an adaptive form is introduced. The method is illustrated using data from HIFU exposure of ex vivo bovine liver. A particular advantage of the stability it introduces is that the temperature can be visualized not only in the intervals between HIFU exposure but also, for some configurations, during the exposure itself. Copyright © 2010 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
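A bare-bones version of the filtering idea, not the paper's adaptive formulation, is sketched below: a scalar Kalman filter whose process model is a simple exponential relaxation toward ambient temperature (a crude stand-in for heat dispersion), fused with noisy ultrasound-derived temperature estimates. All numerical values are illustrative.

```python
import numpy as np

# Scalar Kalman filter: temperature rise x relaxes toward ambient
# (x -> a*x with a < 1, a simple heat-dispersion model) and is observed
# through noisy ultrasound-derived estimates z.
rng = np.random.default_rng(0)
a, q, r = 0.95, 0.01, 4.0                 # decay, process var, measurement var
true = 30.0 * a ** np.arange(100)         # cooling curve after HIFU switches off
z = true + rng.normal(0.0, np.sqrt(r), size=true.size)

x, p, est = 25.0, 10.0, []
for zk in z:
    x, p = a * x, a * a * p + q           # predict with the heat model
    k = p / (p + r)                       # Kalman gain
    x, p = x + k * (zk - x), (1 - k) * p  # update with the measurement
    est.append(x)
print(np.round(est[:5], 2))
```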
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko; Ozawa, Takeaki
2018-05-31
Body fluid (BF) identification is a critical part of a criminal investigation because of its ability to suggest how the crime was committed and to provide reliable origins of DNA. In contrast to current methods using serological and biochemical techniques, vibrational spectroscopic approaches provide alternative advantages for forensic BF identification, such as non-destructivity and versatility for various BF types and analytical interests. However, unexplored issues remain for its practical application to forensics; for example, a specific BF needs to be discriminated from all other suspicious materials as well as other BFs, and the method should be applicable even to aged BF samples. Herein, we describe an innovative modeling method for discriminating the ATR FT-IR spectra of various BFs, including peripheral blood, saliva, semen, urine and sweat, to meet the practical demands described above. Spectra from unexpected non-BF samples were efficiently excluded as outliers by adopting the Q-statistics technique. The robustness of the models against aged BFs was significantly improved by using the discrimination scheme of a dichotomous classification tree with hierarchical clustering. The present study advances the use of vibrational spectroscopy and a chemometric strategy for forensic BF identification.
Measuring salivary analytes from free-ranging monkeys
Higham, James P.; Vitale, Alison; Rivera, Adaris Mas; Ayala, James E.; Maestripieri, Dario
2014-01-01
Studies of large free-ranging mammals have been revolutionized by non-invasive methods for assessing physiology, which usually involve the measurement of fecal or urinary biomarkers. However, such techniques are limited by numerous factors. To expand the range of physiological variables measurable non-invasively from free-ranging primates, we developed techniques for sampling monkey saliva by offering monkeys ropes with oral swabs sewn on the ends. We evaluated different attractants for encouraging individuals to offer samples, and proportions of individuals in different age/sex categories willing to give samples. We tested the saliva samples we obtained in three commercially available assays: cortisol, salivary alpha-amylase (SAA), and secretory immunoglobulin A (SIgA). We show that habituated free-ranging rhesus macaques will give saliva samples voluntarily without training, with 100% of infants, and over 50% of adults willing to chew on collection devices. Our field methods are robust even for analytes that show poor recovery from cotton, and/or that have concentrations dependent on salivary flow rate. We validated the cortisol and SAA assays for use in rhesus macaques by demonstrating aspects of analytical validation, for example that samples dilute linearly and in parallel with assay standards. We also found that values measured correlated with biologically meaningful characteristics of sampled individuals (age and dominance rank). The SIgA assay tested did not react to samples. Given the wide range of analytes measurable in saliva but not in feces or urine, our methods considerably improve our ability to study physiological aspects of the behavior and ecology of free-ranging primates, and are also potentially adaptable to other mammalian taxa. PMID:20837036
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with the focus on various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and types of analytical probes employed. Types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. Fields of applications covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical and biological systems, such as synthetic polymers, composites and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever-expanding trend in depth, breadth and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.
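For readers unfamiliar with the underlying computation, the sketch below implements the standard synchronous and asynchronous 2D correlation spectra from a perturbation-ordered series of spectra, using the Hilbert-Noda transformation matrix; the input array is a random placeholder, not data from any reviewed study.

```python
import numpy as np

def two_d_correlation(Y):
    """Synchronous and asynchronous 2D correlation spectra (Noda's method).

    Y: (m, n) array of m spectra (perturbation sequence) x n spectral points.
    """
    m = Y.shape[0]
    D = Y - Y.mean(axis=0)                 # dynamic (mean-centered) spectra
    sync = D.T @ D / (m - 1)
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))  # Hilbert-Noda matrix
    async_ = D.T @ (N @ D) / (m - 1)
    return sync, async_

spectra = np.random.default_rng(6).normal(size=(15, 200))  # placeholder data
sync, async_ = two_d_correlation(spectra)
print(sync.shape, async_.shape)
```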
NASA Astrophysics Data System (ADS)
Jabbari, Ali
2018-01-01
Surface inset permanent magnet DC machines can be used as an alternative in automation systems due to their high efficiency and robustness. Magnet segmentation is a common technique for mitigating pulsating torque components in permanent magnet machines. An accurate computation of the air-gap magnetic field distribution is necessary in order to calculate machine performance. An exact analytical method for magnetic vector potential calculation in surface inset permanent magnet machines considering magnet segmentation is proposed in this paper. The analytical method is based on the resolution of the Laplace and Poisson equations, as well as Maxwell's equations, in polar coordinates by using the sub-domain method. One of the main contributions of the paper is to derive an expression for the magnetic vector potential in the segmented PM region by using hyperbolic functions. The developed method is applied to the performance computation of two prototype surface inset segmented-magnet motors under open-circuit and on-load conditions. The results of these models are validated against the finite element method (FEM).
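The governing equations behind such sub-domain solutions take the following generic textbook form, assumed here for illustration rather than reproducing the paper's exact boundary-value problem:

```latex
% Air-gap and magnet sub-domains for the axial vector potential A_z
\begin{align}
  \nabla^2 A_z &= \frac{\partial^2 A_z}{\partial r^2}
    + \frac{1}{r}\frac{\partial A_z}{\partial r}
    + \frac{1}{r^2}\frac{\partial^2 A_z}{\partial \theta^2} = 0
    && \text{(air gap: Laplace)} \\
  \nabla^2 A_z &= -\mu_0 \left(\nabla \times \mathbf{M}\right)_z
    && \text{(magnet region: Poisson)}
\end{align}
% Separated solutions carry radial terms a_n r^n + b_n r^{-n}, which can be
% rewritten as A_n cosh(n ln(r/R)) + B_n sinh(n ln(r/R)): a hyperbolic form
% of the kind referred to for the segmented PM region.
```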
Martinez-Lozano Sinues, Pablo; Landoni, Elena; Miceli, Rosalba; Dibari, Vincenza F; Dugo, Matteo; Agresti, Roberto; Tagliabue, Elda; Cristoni, Simone; Orlandi, Rosaria
2015-09-21
Breath analysis represents a new frontier in medical diagnosis and a powerful tool for cancer biomarker discovery due to the recent development of analytical platforms for the detection and identification of human exhaled volatile compounds. Statistical and bioinformatic tools may represent an effective complement to the technical and instrumental enhancements needed to fully exploit clinical applications of breath analysis. Our exploratory study in a cohort of 14 breast cancer patients and 11 healthy volunteers used secondary electrospray ionization-mass spectrometry (SESI-MS) to detect a cancer-related volatile profile. SESI-MS full-scan spectra were acquired in a range of 40-350 mass-to-charge ratio (m/z), converted to matrix data and analyzed using a procedure integrating data pre-processing for quality control, and a two-step class prediction based on machine-learning techniques, including a robust feature selection, and a classifier development with internal validation. MS spectra from exhaled breath showed an individual-specific breath profile and high reciprocal homogeneity among samples, with strong agreement among technical replicates, suggesting a robust responsiveness of SESI-MS. Supervised analysis of breath data identified a support vector machine (SVM) model including 8 features corresponding to m/z 106, 126, 147, 78, 148, 52, 128, 315 and able to discriminate exhaled breath from breast cancer patients from that of healthy individuals, with sensitivity and specificity above 0.9. Our data highlight the significance of SESI-MS as an analytical technique for clinical studies of breath analysis and provide evidence that our noninvasive strategy detects volatile signatures that may support existing technologies to diagnose breast cancer.
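A minimal stand-in for this two-step class prediction is sketched below with scikit-learn: univariate feature selection and a linear SVM inside one pipeline, so that selection is refit within each cross-validation fold. The feature matrix is a random placeholder, and the selector is a generic substitute for the authors' robust feature-selection step.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(25, 300))          # placeholder SESI-MS feature matrix
y = np.array([1] * 14 + [0] * 11)       # 14 patients, 11 healthy volunteers

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=8),        # generic stand-in for feature selection
    SVC(kernel="linear"),
)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
print(cross_val_score(clf, X, y, cv=cv).mean())
```

Placing the selector inside the pipeline matters: selecting features on the full dataset before cross-validation would inflate the apparent sensitivity and specificity.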
Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data
NASA Technical Reports Server (NTRS)
Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)
2001-01-01
A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.
van Elk, Michiel; Matzke, Dora; Gronau, Quentin F.; Guan, Maime; Vandekerckhove, Joachim; Wagenmakers, Eric-Jan
2015-01-01
According to a recent meta-analysis, religious priming has a positive effect on prosocial behavior (Shariff et al., 2015). We first argue that this meta-analysis suffers from a number of methodological shortcomings that limit the conclusions that can be drawn about the potential benefits of religious priming. Next we present a re-analysis of the religious priming data using two different meta-analytic techniques. A Precision-Effect Testing–Precision-Effect-Estimate with Standard Error (PET-PEESE) meta-analysis suggests that the effect of religious priming is driven solely by publication bias. In contrast, an analysis using Bayesian bias correction suggests the presence of a religious priming effect, even after controlling for publication bias. These contradictory statistical results demonstrate that meta-analytic techniques alone may not be sufficiently robust to firmly establish the presence or absence of an effect. We argue that a conclusive resolution of the debate about the effect of religious priming on prosocial behavior – and about theoretically disputed effects more generally – requires a large-scale, preregistered replication project, which we consider to be the sole remedy for the adverse effects of experimenter bias and publication bias. PMID:26441741
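The PET step is essentially a weighted regression of effect sizes on their standard errors; a compact sketch on toy data (simulating pure publication bias, so the corrected estimate should be near zero) might look like the following. The conditional switch to PEESE follows the commonly described decision rule and is an assumption of this illustration, not the authors' exact procedure.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
se = rng.uniform(0.05, 0.4, 40)            # toy standard errors
d = 1.2 * se + rng.normal(0.0, se)         # small-study effect only; true d = 0

def pet_peese(d, se):
    w = 1.0 / se**2
    pet = sm.WLS(d, sm.add_constant(se), weights=w).fit()
    if pet.pvalues[0] < 0.05:              # intercept significant: use PEESE
        return sm.WLS(d, sm.add_constant(se**2), weights=w).fit().params[0]
    return pet.params[0]                   # bias-corrected effect estimate

print(pet_peese(d, se))                    # close to zero for these toy data
```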
Sher, Mazhar; Zhuang, Rachel; Demirci, Utkan; Asghar, Waseem
2017-04-01
There is a significant interest in developing inexpensive portable biosensing platforms for various applications including disease diagnostics, environmental monitoring, food safety, and water testing at the point-of-care (POC) settings. Current diagnostic assays available in the developed world require sophisticated laboratory infrastructure and expensive reagents. Hence, they are not suitable for resource-constrained settings with limited financial resources, basic health infrastructure, and few trained technicians. Cellulose and flexible transparency paper-based analytical devices have demonstrated enormous potential for developing robust, inexpensive and portable devices for disease diagnostics. These devices offer promising solutions to disease management in resource-constrained settings where the vast majority of the population cannot afford expensive and highly sophisticated treatment options. Areas covered: In this review, the authors describe currently developed cellulose and flexible transparency paper-based microfluidic devices, device fabrication techniques, and sensing technologies that are integrated with these devices. The authors also discuss the limitations and challenges associated with these devices and their potential in clinical settings. Expert commentary: In recent years, cellulose and flexible transparency paper-based microfluidic devices have demonstrated the potential to become future healthcare options despite a few limitations such as low sensitivity and reproducibility.
NASA Astrophysics Data System (ADS)
Braun, Jean; van der Beek, Peter; Batt, Geoffrey
2006-05-01
Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach with a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. The numerous case studies help geologists to interpret age data and relate them to Earth processes; the book supplies essential background material for understanding and using thermochronological data, provides a thorough treatise on numerical modeling of heat transport in the Earth's crust, and is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.
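A representative one-dimensional form of the heat-transport equation treated in such models, written here as an assumed illustration with vertical advection v (rock uplift or erosion), thermal diffusivity kappa, and radiogenic heat production H, is:

```latex
\begin{equation}
  \frac{\partial T}{\partial t} + v\,\frac{\partial T}{\partial z}
  = \kappa\,\frac{\partial^2 T}{\partial z^2} + \frac{H}{\rho c_p}
\end{equation}
```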
A study of the temporal robustness of the growing global container-shipping network
Wang, Nuo; Wu, Nuan; Dong, Ling-ling; Yan, Hua-kun; Wu, Di
2016-01-01
Whether a constantly expanding network continues to thrive as it grows is a question that must be answered for all such networks. However, few studies have focused on this important network feature or on the development of quantitative analytical methods. Given the formation and growth of the global container-shipping network, we propose the concept of network temporal robustness together with a quantitative method for measuring it. As an example, we collected container liner companies' data at two time points (2004 and 2014) and built a shipping network with ports as nodes and routes as links. We thus obtained a quantitative value of the temporal robustness. The temporal robustness is a significant network property because, for the first time, we can clearly recognize that the shipping network has become more vulnerable to damage over the last decade: when the node failure scale reached 50% of the entire network, the temporal robustness was approximately −0.51% for random errors and −12.63% for intentional attacks. The proposed concept and analytical method described in this paper are significant for other network studies. PMID:27713549
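The random-error versus intentional-attack comparison in such studies is commonly simulated as node removal. The sketch below is a generic illustration on a synthetic scale-free graph (not the authors' port data); comparing the same damage measure on a smaller and a larger snapshot mirrors the temporal-robustness idea of contrasting the network at two points in time.

```python
import networkx as nx
import numpy as np

def giant_fraction_after_removal(G, frac=0.5, targeted=False, seed=0):
    """Fraction of nodes left in the giant component after removing
    `frac` of the nodes, randomly (errors) or by degree (attacks)."""
    G = G.copy()
    n = G.number_of_nodes()
    if targeted:
        nodes = [v for v, _ in sorted(G.degree, key=lambda d: -d[1])]
    else:
        nodes = list(G.nodes)
        np.random.default_rng(seed).shuffle(nodes)
    G.remove_nodes_from(nodes[: int(frac * n)])
    giant = max(nx.connected_components(G), key=len, default=set())
    return len(giant) / n

# Scale-free stand-ins for the port network at two time points
G_2004 = nx.barabasi_albert_graph(300, 2, seed=1)
G_2014 = nx.barabasi_albert_graph(900, 2, seed=1)
for G in (G_2004, G_2014):
    print(giant_fraction_after_removal(G),                  # random errors
          giant_fraction_after_removal(G, targeted=True))   # intentional attacks
```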
Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C
2017-11-21
Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L(-1)), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that DGT-accumulated chemical masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over up to 18 days of deployment, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
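The TWA concentration reported by DGT follows from the standard relations of the technique (Zhang and Davison), assumed here in their usual single-analyte form rather than quoted from this paper:

```latex
\begin{equation}
  C_{\mathrm{DGT}} = \frac{M\,\Delta g}{D\,A\,t},
  \qquad
  M = \frac{C_e\,(V_e + V_g)}{f_e}
\end{equation}
% M: analyte mass accumulated on the binding gel, recovered from the eluate
% concentration C_e, eluent and gel volumes V_e and V_g, and elution
% efficiency f_e; Delta g: diffusion-layer thickness; D: diffusion
% coefficient; A: exposure area; t: deployment time.
```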
A robust interrupted time series model for analyzing complex health care intervention data.
Cruz, Maricela; Bender, Miriam; Ombao, Hernando
2017-12-20
Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be "interrupted" by a change in a particular method of health care delivery. Interrupted time series (ITS) is a robust quasi-experimental design with the ability to infer the effectiveness of an intervention while accounting for data dependency. Current standardized methods for analyzing ITS data do not model changes in variation and correlation following the intervention. This is a key limitation since it is plausible for data variability and dependency to change because of the intervention. Moreover, present methodology either assumes a prespecified interruption time point with an instantaneous effect or removes data for which the effect of intervention is not fully realized. In this paper, we describe and develop a novel robust interrupted time series (robust-ITS) model that overcomes these omissions and limitations. The robust-ITS model formally performs inference on (1) the change point; (2) differences in preintervention and postintervention correlation; (3) differences in the outcome variance preintervention and postintervention; and (4) differences in the mean preintervention and postintervention. We illustrate the proposed method by analyzing patient satisfaction data from a hospital that implemented and evaluated a new nursing care delivery model as the intervention of interest. The robust-ITS model is implemented in an R Shiny toolbox, which is freely available to the community. Copyright © 2017 John Wiley & Sons, Ltd.
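A stripped-down version of the ITS idea, estimating only the change point and the level/trend shifts (not the variance and correlation changes that distinguish the robust-ITS model), can be sketched as a grid search over candidate change points with segmented least squares; the data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(100)
y = 50 + 0.05 * t + rng.normal(0.0, 1.0, 100)
y[60:] += 4.0                       # intervention effect at t = 60 (unknown)

def fit_its(y, t, k):
    """Segmented fit: level and trend before and after change point k."""
    post = (t >= k).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, post * (t - k)])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    return res[0], beta             # sum of squared errors, coefficients

k_hat, (sse, beta) = min(((k, fit_its(y, t, k)) for k in range(10, 90)),
                         key=lambda kv: kv[1][0])
print(k_hat, np.round(beta, 2))     # estimated change point and segment effects
```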
A two-dimensional composite grid numerical model based on the reduced system for oceanography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Y.F.; Browning, G.L.; Chesshire, G.
The proper mathematical limit of a hyperbolic system with multiple time scales, the reduced system, is a system that contains no high-frequency motions and is well posed if suitable boundary conditions are chosen for the initial-boundary value problem. The composite grid method, a robust and efficient grid-generation technique that smoothly and accurately treats general irregular boundaries, is used to approximate the two-dimensional version of the reduced system for oceanography on irregular ocean basins. A change-of-variable technique that substantially increases the accuracy of the model and a method for efficiently solving the elliptic equation for the geopotential are discussed. Numerical results are presented for circular and kidney-shaped basins by using a set of analytic solutions constructed in this paper.
Disentangling Random Motion and Flow in a Complex Medium
Koslover, Elena F.; Chan, Caleb K.; Theriot, Julie A.
2016-01-01
We describe a technique for deconvolving the stochastic motion of particles from large-scale fluid flow in a dynamic environment such as that found in living cells. The method leverages the separation of timescales to subtract out the persistent component of motion from single-particle trajectories. The mean-squared displacement of the resulting trajectories is rescaled so as to enable robust extraction of the diffusion coefficient and subdiffusive scaling exponent of the stochastic motion. We demonstrate the applicability of the method for characterizing both diffusive and fractional Brownian motion overlaid by flow and analytically calculate the accuracy of the method in different parameter regimes. This technique is employed to analyze the motion of lysosomes in motile neutrophil-like cells, showing that the cytoplasm of these cells behaves as a viscous fluid at the timescales examined. PMID:26840734
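A crude stand-in for this approach, assuming a single constant flow rather than the paper's timescale-separation subtraction, is sketched below: subtract the net drift, compute the time-averaged MSD, and read the scaling exponent off a log-log fit. All trajectory parameters are invented.

```python
import numpy as np

def msd(traj, max_lag):
    """Time-averaged mean-squared displacement of a 2D trajectory."""
    return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag])**2, axis=1))
                     for lag in range(1, max_lag + 1)])

# Toy trajectory: Brownian steps plus a constant drift (flow) component
rng = np.random.default_rng(4)
steps = rng.normal(0.0, 0.1, size=(2000, 2)) + np.array([0.02, 0.0])
traj = np.cumsum(steps, axis=0)

drift = (traj[-1] - traj[0]) / (len(traj) - 1)    # persistent component
detrended = traj - drift * np.arange(len(traj))[:, None]

lags = np.arange(1, 51)
alpha, log_k = np.polyfit(np.log(lags), np.log(msd(detrended, 50)), 1)
print(round(alpha, 2))   # ~1 for pure diffusion; < 1 indicates subdiffusion
```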
Svagera, Zdeněk; Hanzlíková, Dagmar; Simek, Petr; Hušek, Petr
2012-03-01
Four disulfide-reducing agents, dithiothreitol (DTT), 2,3-dimercaptopropanesulfonate (DMPS), and the newly tested 2-mercaptoethanesulfonate (MESNA) and Tris(hydroxypropyl)phosphine (THP), were investigated in detail for release of sulfur amino acids in human plasma. After protein precipitation with trichloroacetic acid (TCA), the plasma supernatant was treated with methyl, ethyl, or propyl chloroformate via the well-proven derivatization-extraction technique and the products were subjected to gas chromatographic-mass spectrometric (GC-MS) analysis. All the tested agents proved to be rapid and effective reducing agents for the assay of plasma thiols. When compared with DTT, the novel reducing agents DMPS, MESNA, and THP provided much cleaner extracts and improved analytical performance. Quantification of homocysteine, cysteine, and methionine was performed using their deuterated analogues, whereas other analytes were quantified by means of 4-chlorophenylalanine. Precise and reliable assay of all examined analytes was achieved, irrespective of the chloroformate reagent used. Average relative standard deviations at each analyte level were ≤6%, quantification limits were 0.1-0.2 μmol L(-1), recoveries were 94-121%, and linearity was over three orders of magnitude (r(2) equal to 0.997-0.998). Validation performed with the THP agent and propyl chloroformate derivatization demonstrated the robustness and reliability of this simple sample-preparation methodology.
Identification of novel peptides for horse meat speciation in highly processed foodstuffs.
Claydon, Amy J; Grundy, Helen H; Charlton, Adrian J; Romero, M Rosario
2015-01-01
There is a need for robust analytical methods to support enforcement of food labelling legislation. Proteomics is emerging as a complementary methodology to existing tools such as DNA and antibody-based techniques. Here we describe the development of a proteomics strategy for the determination of meat species in highly processed foods. A database of specific peptides for nine relevant animal species was used to enable semi-targeted species determination. This principle was tested for horse meat speciation, and a range of horse-specific peptides were identified as heat-stable marker peptides for the detection of low levels of horse meat in mixtures with other species.
The Influence of Judgment Calls on Meta-Analytic Findings.
Tarrahi, Farid; Eisend, Martin
2016-01-01
Previous research has suggested that judgment calls (i.e., methodological choices made in the process of conducting a meta-analysis) have a strong influence on meta-analytic findings, calling their robustness into question. However, prior research applies case study comparison or reanalysis of a few meta-analyses with a focus on a few selected judgment calls. These studies neglect the fact that different judgment calls are related to each other and simultaneously influence the outcomes of a meta-analysis, and that meta-analytic findings can vary due to non-judgment call differences between meta-analyses (e.g., variations of effects over time). The current study analyzes the influence of 13 judgment calls in 176 meta-analyses in marketing research by applying a multivariate, multilevel meta-meta-analysis. The analysis considers simultaneous influences from different judgment calls on meta-analytic effect sizes and controls for alternative explanations based on non-judgment call differences between meta-analyses. The findings suggest that judgment calls have only a minor influence on meta-analytic findings, whereas non-judgment call differences between meta-analyses are more likely to explain differences in meta-analytic findings. The findings support the robustness of meta-analytic results and conclusions.
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on the modern instrumental platforms. Today, the development and optimization of integrated analytical approaches based on different techniques to study at molecular level the chemical composition of a food may allow the definition of a 'food fingerprint', valuable for assessing the nutritional value, safety and quality, authenticity and security of foods. This comprehensive strategy, defined foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we will explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies and advance our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review will be to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Pearl, D L; Louie, M; Chui, L; Doré, K; Grimsrud, K M; Martin, S W; Michel, P; Svenson, L W; McEwen, S A
2008-04-01
Using multivariable models, we compared whether there were significant differences between reported outbreak and sporadic cases in terms of their sex, age, and mode and site of disease transmission. We also determined the potential role of administrative, temporal, and spatial factors within these models. We compared a variety of approaches to account for clustering of cases in outbreaks including weighted logistic regression, random effects models, generalized estimating equations, robust variance estimates, and the random selection of one case from each outbreak. Age and mode of transmission were the only epidemiologically and statistically significant covariates in our final models using the above approaches. Weighting observations in a logistic regression model by the inverse of their outbreak size appeared to be a relatively robust and valid means of modelling these data.
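Mechanically, the weighting scheme is straightforward; the sketch below shows it with scikit-learn on toy case data, where each outbreak-associated case receives weight 1/(outbreak size) so that every outbreak contributes roughly one effective observation. Variable names and values are invented for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n_sporadic, n_outbreak = 60, 90
age = rng.uniform(1, 80, n_sporadic + n_outbreak)
y = np.r_[np.ones(n_sporadic), np.zeros(n_outbreak)]       # 1 = sporadic case
outbreak_size = np.r_[np.ones(n_sporadic), np.repeat([5, 10, 15], 30)]

# Down-weight clustered cases: each outbreak counts once, in aggregate
w = 1.0 / outbreak_size
model = LogisticRegression().fit(age.reshape(-1, 1), y, sample_weight=w)
print(model.coef_, model.intercept_)
```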
Cracking the chocolate egg problem: polymeric films coated on curved substrates
NASA Astrophysics Data System (ADS)
Brun, Pierre-Thomas; Lee, Anna; Marthelot, Joel; Balestra, Gioele; Gallaire, François; Reis, Pedro
2015-11-01
Inspired by the traditional chocolate egg recipe, we show that pouring a polymeric solution onto spherical molds yields a simple and robust route for fabricating thin elastic curved shells. The drainage dynamics naturally leads to uniform coatings frozen in time as the polymer cures, which are subsequently peeled off their mold. We show how the polymer curing affects the drainage dynamics and eventually selects the shell thickness and sets its uniformity. To this end, we perform coating experiments using silicone-based elastomers, vinylpolysiloxane (VPS) and polydimethylsiloxane (PDMS). These results are rationalized by combining numerical simulations of the lubrication flow field with a theoretical model of the dynamics, yielding an analytical prediction of the formed shell characteristics. In particular, the robustness of the coating technique and its flexibility, two critical features for providing a generic framework for future studies, are shown to be an inherent consequence of the flow field (memory loss). The shell structure is both independent of initial conditions and tailorable by changing a single experimental parameter.
Capillarity Guided Patterning of Microliquids.
Kang, Myeongwoo; Park, Woohyun; Na, Sangcheol; Paik, Sang-Min; Lee, Hyunjae; Park, Jae Woo; Kim, Ho-Young; Jeon, Noo Li
2015-06-01
Soft lithography and other techniques have been developed to investigate biological and chemical phenomena as an alternative to photolithography-based patterning methods that have compatibility problems. Here, a simple approach for nonlithographic patterning of liquids and gels inside microchannels is described. Using a design that incorporates strategically placed microstructures inside the channel, microliquids or gels can be spontaneously trapped and patterned when the channel is drained. The ability to form microscale patterns inside microfluidic channels using simple fluid drain motion offers many advantages. This method is geometrically analyzed based on hydrodynamics and verified with simulation and experiments. Various materials (i.e., water, hydrogels, and other liquids) are successfully patterned with complex shapes that are isolated from each other. Multiple cell types are patterned within the gels. Capillarity guided patterning (CGP) is fast, simple, and robust. It is not limited by pattern shape, size, cell type, or material. In a simple three-step process, a 3D cancer model that mimics cell-cell and cell-extracellular matrix interactions is engineered. The simplicity and robustness of the CGP will be attractive for developing novel in vitro models of organ-on-a-chip and other biological experimental platforms amenable to long-term observation of dynamic events using advanced imaging and analytical techniques. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Russell, Shane R; Claridge, Shelley A
2016-04-01
Because noncovalent interface functionalization is frequently required in graphene-based devices, biomolecular self-assembly has begun to emerge as a route for controlling substrate electronic structure or binding specificity for soluble analytes. The remarkable diversity of structures that arise in biological self-assembly hints at the possibility of equally diverse and well-controlled surface chemistry at graphene interfaces. However, predicting and analyzing adsorbed monolayer structures at such interfaces raises substantial experimental and theoretical challenges. In contrast with the relatively well-developed monolayer chemistry and characterization methods applied at coinage metal surfaces, monolayers on graphene are both less robust and more structurally complex, levying more stringent requirements on characterization techniques. Theory presents opportunities to understand early binding events that lay the groundwork for full monolayer structure. However, predicting interactions between complex biomolecules, solvent, and substrate is necessitating a suite of new force fields and algorithms to assess likely binding configurations, solvent effects, and modulations to substrate electronic properties. This article briefly discusses emerging analytical and theoretical methods used to develop a rigorous chemical understanding of the self-assembly of peptide-graphene interfaces and prospects for future advances in the field.
NASA Astrophysics Data System (ADS)
Frew, Russell; Cannavan, Andrew; Zandric, Zora; Maestroni, Britt; Abrahim, Aiman
2013-04-01
Traceability systems play a key role in assuring a safe and reliable food supply. Analytical techniques harnessing the spatial patterns in distribution of stable isotope and trace element ratios can be used for the determination of the provenance of food. Such techniques offer the potential to enhance global trade by providing an independent means of verifying "paper" traceability systems and can also help to prove authenticity, to combat fraudulent practices, and to control adulteration, which are important issues for economic, religious or cultural reasons. To address some of the challenges that developing countries face in attempting to implement effective food traceability systems, the IAEA, through its Joint FAO/IAEA Division on Nuclear Techniques in Food and Agriculture, has initiated a 5-year coordinated research project involving institutes in 15 developing and developed countries (Austria, Botswana, Chile, China, France, India, Lebanon, Morocco, Portugal, Singapore, Sweden, Thailand, Uganda, UK, USA). The objective is to help member state laboratories establish robust analytical techniques and databases, validated to international standards, to determine the provenance of food. Nuclear techniques such as stable isotope and multi-element analysis, along with complementary methods, will be applied for the verification of food traceability systems and claims related to food origin, production, and authenticity. This integrated and multidisciplinary approach to strengthening capacity in food traceability will contribute to the effective implementation of holistic systems for food safety and control. The project focuses mainly on the development of techniques to confirm product authenticity, with several research partners also considering food safety issues. Research topics encompass determination of the geographical origin of a variety of commodities, including seed oils, rice, wine, olive oil, wheat, orange juice, fish, groundnuts, tea, pork, honey and coffee, the adulteration of milk with soy protein, chemical contamination of food products, and inhomogeneity in isotopic ratios in poultry and eggs as a means to determine production history. Analytical techniques include stable isotope ratio measurements (2H/1H, 13C/12C, 15N/14N, 18O/16O, 34S/32S, 87Sr/86Sr, 208Pb/207Pb/206Pb), elemental analysis, DNA fingerprinting, fatty acid and other biomolecule profiling, chromatography-mass spectrometry and near infra-red spectroscopy.
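The stable isotope ratios listed above are conventionally reported in delta notation relative to an international standard; the standard definition, assumed here as background rather than quoted from the project, is:

```latex
\begin{equation}
  \delta = \left(\frac{R_{\text{sample}}}{R_{\text{standard}}} - 1\right)
           \times 1000 \;\; (\text{per mil}),
  \qquad
  R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C},\;
      {}^{18}\mathrm{O}/{}^{16}\mathrm{O},\ \ldots
\end{equation}
```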
Analysis of short-chain fatty acids in human feces: A scoping review.
Primec, Maša; Mičetić-Turk, Dušanka; Langerholc, Tomaž
2017-06-01
Short-chain fatty acids (SCFAs) play a crucial role in maintaining homeostasis in humans; the importance of good and reliable analytical detection of SCFAs has therefore grown considerably in the past few years. The aim of this scoping review is to show the trends in the development of different methods of SCFAs analysis in feces, based on the literature published in the last eleven years in all major indexing databases. The search criteria included analytical quantification techniques of SCFAs in different human clinical and in vivo studies. SCFAs analysis is still predominantly performed using gas chromatography (GC), followed by high performance liquid chromatography (HPLC), nuclear magnetic resonance (NMR) and capillary electrophoresis (CE). Performances, drawbacks and advantages of these methods are discussed, especially in the light of choosing a proper pretreatment, as feces is a complex biological material. Further optimization to develop a simple, cost effective and robust method for routine use is needed. Copyright © 2017 Elsevier Inc. All rights reserved.
Zuin, Vânia G; Ramin, Luize Z
2018-01-17
New generations of biorefinery combine innovative biomass waste resources from different origins, chemical extraction and/or synthesis of biomaterials, biofuels, and bioenergy via green and sustainable processes. From the very beginning, identifying and evaluating all potentially high value-added chemicals that could be removed from available renewable feedstocks requires robust, efficient, selective, reproducible, and benign analytical approaches. With this in mind, green and sustainable separation of natural products from agro-industrial waste is clearly attractive considering both socio-environmental and economic aspects. In this paper, the concepts of green and sustainable separation of natural products will be discussed, highlighting the main studies conducted on this topic over the last 10 years. The principal analytical techniques (such as solvent, microwave, ultrasound, and supercritical treatments), by-products (e.g., citrus, coffee, corn, and sugarcane waste) and target compounds (polyphenols, proteins, essential oils, etc.) will be presented, including the emerging green and sustainable separation approaches towards bioeconomy and circular economy contexts.
Van Durme, Jim; Ingels, Isabel; De Winne, Ann
2016-08-15
Today, the cocoa industry is in great need of faster and robust analytical techniques to objectively assess incoming cocoa quality. In this work, inline roasting hyphenated with a cooled injection system coupled to a gas chromatograph-mass spectrometer (ILR-CIS-GC-MS) has been explored for the first time to assess fermentation quality and/or overall aroma formation potential of cocoa. This innovative approach resulted in the in-situ formation of relevant cocoa aroma compounds. After comparison with data obtained by headspace solid phase micro extraction (HS-SPME-GC-MS) on conventionally roasted cocoa beans, ILR-CIS-GC-MS data on unroasted cocoa beans showed similar formation trends of important cocoa aroma markers as a function of fermentation quality. The latter approach only requires small aliquots of unroasted cocoa beans, can be automated, requires no sample preparation, needs relatively short analytical times (<1 h) and is highly reproducible. Copyright © 2016 Elsevier Ltd. All rights reserved.
Folding-paper-based preconcentrator for low dispersion of preconcentration plug
NASA Astrophysics Data System (ADS)
Lee, Kyungjae; Yoo, Yong Kyoung; Han, Sung Il; Lee, Junwoo; Lee, Dohwan; Kim, Cheonjung; Lee, Jeong Hoon
2017-12-01
Ion concentration polarization (ICP) has been widely studied for collecting target analytes as it is a powerful preconcentrator method employed for charged molecules. Although the method is quite robust, simple, cheap, and yields a high preconcentration factor, a major hurdle to be addressed is extracting the preconcentrated samples without dispersing the plug. This study investigates a 3D folding-paper-based ICP preconcentrator for preconcentrated plug extraction without the dispersion effect. The ICP preconcentrator is printed on a cellulose paper with pre-patterned hydrophobic wax. To extract and isolate the preconcentration plug with minimal dispersion, a 3D pop-up structure is fabricated via water drain, and a preconcentration factor of 300-fold for 10 min is achieved. By optimizing factors such as the electric field, water drain, and sample volume, the technique was enhanced by facilitating sample preconcentration and isolation, thereby providing the possibility for extensive applications in analytical devices such as lateral flow assays and FTA cards.
Melvin, Elizabeth M; Moore, Brandon R; Gilchrist, Kristin H; Grego, Sonia; Velev, Orlin D
2011-09-01
The recent development of microfluidic "lab on a chip" devices requiring sample sizes <100 μL has given rise to the need to concentrate dilute samples and trap analytes, especially for surface-based detection techniques. We demonstrate a particle collection device capable of concentrating micron-sized particles in a predetermined area by combining AC electroosmosis (ACEO) and dielectrophoresis (DEP). The planar asymmetric electrode pattern uses ACEO pumping to induce equal, quadrilateral flow directed towards a stagnant region in the center of the device. A number of system parameters affecting particle collection efficiency were investigated including electrode and gap width, chamber height, applied potential and frequency, and number of repeating electrode pairs and electrode geometry. The robustness of the on-chip collection design was evaluated against varying electrolyte concentrations, particle types, and particle sizes. These devices are amenable to integration with a variety of detection techniques such as optical evanescent waveguide sensing.
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analysis in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
The use of analytical sedimentation velocity to extract thermodynamic linkage.
Cole, James L; Correia, John J; Stafford, Walter F
2011-11-01
For 25 years, the Gibbs Conference on Biothermodynamics has focused on the use of thermodynamics to extract information about the mechanism and regulation of biological processes. This includes the determination of equilibrium constants for macromolecular interactions by high precision physical measurements. These approaches further reveal thermodynamic linkages to ligand binding events. Analytical ultracentrifugation has been a fundamental technique in the determination of macromolecular reaction stoichiometry and energetics for 85 years. This approach is highly amenable to the extraction of thermodynamic couplings to small molecule binding in the overall reaction pathway. In the 1980s this approach was extended to the use of sedimentation velocity techniques, primarily by the analysis of tubulin-drug interactions by Na and Timasheff. This transport method necessarily incorporates the complexity of both hydrodynamic and thermodynamic nonideality. The advent of modern computational methods in the last 20 years has subsequently made the analysis of sedimentation velocity data for interacting systems more robust and rigorous. Here we review three examples where sedimentation velocity has been useful at extracting thermodynamic information about reaction stoichiometry and energetics. Approaches to extract linkage to small molecule binding and the influence of hydrodynamic nonideality are emphasized. These methods are shown to also apply to the collection of fluorescence data with the new Aviv FDS. Copyright © 2011 Elsevier B.V. All rights reserved.
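The transport process underlying all such sedimentation-velocity analyses is governed by the Lamm equation, quoted here in its standard single-species form (constant sedimentation coefficient s and diffusion coefficient D) as context rather than as the authors' working model:

```latex
\begin{equation}
  \frac{\partial c}{\partial t}
  = \frac{1}{r}\,\frac{\partial}{\partial r}
    \left[\, r\,D\,\frac{\partial c}{\partial r}
      - s\,\omega^{2} r^{2}\, c \,\right]
\end{equation}
```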
Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.; ...
2017-08-25
Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.
2016-01-01
Rifaximin is an oral nonabsorbable antibiotic that acts locally in the gastrointestinal tract with minimal systemic adverse effects. No ecofriendly spectrophotometric method in the ultraviolet region has been described for it in official compendia or the literature. The analytical techniques for the determination of rifaximin reported in the literature require a large amount of time to release results and are significantly onerous. Furthermore, they use reagents that are toxic to both the operator and the environment and, therefore, cannot be considered environmentally friendly analytical techniques. The objective of this study was to develop and validate an ecofriendly spectrophotometric method in the ultraviolet region to quantify rifaximin in tablets. The method was validated, showing linearity, selectivity, precision, accuracy, and robustness. It was linear over the concentration range of 10–30 mg L−1 with correlation coefficients greater than 0.9999 and limits of detection and quantification of 1.39 and 4.22 mg L−1, respectively. The validated method is useful for the routine quality control of rifaximin: it is simple, runs under inexpensive conditions, releases results quickly, makes efficient use of analysts and equipment, and uses environmentally friendly solvents, and so can be considered a green method that harms neither the operator nor the environment. PMID:27429835
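The linearity, LOD, and LOQ figures quoted in validation abstracts like the one above follow from an ordinary least-squares calibration line. A minimal sketch, assuming hypothetical absorbance readings and the common ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S (σ = residual standard deviation, S = slope):

```python
import numpy as np

# Hypothetical UV calibration: absorbance vs. concentration (mg/L)
conc = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
absorbance = np.array([0.201, 0.302, 0.399, 0.503, 0.601])

# Ordinary least-squares fit: A = S * c + b
slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = residuals.std(ddof=2)             # residual standard deviation

r = np.corrcoef(conc, absorbance)[0, 1]   # correlation coefficient
lod = 3.3 * sigma / slope                 # limit of detection
loq = 10.0 * sigma / slope                # limit of quantification
print(f"r = {r:.5f}, LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")
```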
Crossing Fibers Detection with an Analytical High Order Tensor Decomposition
Megherbi, T.; Kachouane, M.; Oulebsir-Boumghar, F.; Deriche, R.
2014-01-01
Diffusion magnetic resonance imaging (dMRI) is the only technique to probe in vivo and noninvasively the fiber structure of human brain white matter. Detecting the crossing of neuronal fibers remains an exciting challenge with an important impact in tractography. In this work, we tackle this challenging problem and propose an original and efficient technique to extract all crossing fibers from diffusion signals. To this end, we start by estimating, from the dMRI signal, the so-called Cartesian tensor fiber orientation distribution (CT-FOD) function, whose maxima correspond exactly to the orientations of the fibers. The fourth order symmetric positive definite tensor that represents the CT-FOD is then analytically decomposed via the application of a new theoretical approach, and this decomposition is used to accurately extract all the fiber orientations. Our proposed high order tensor decomposition based approach is minimal and allows recovery of all the crossing fibers without any a priori information on the total number of fibers. Various experiments performed on noisy synthetic data, on phantom diffusion data, and on human brain data validate our approach and clearly demonstrate that it is efficient, robust to noise and performs favorably in terms of angular resolution and accuracy when compared to some classical and state-of-the-art approaches. PMID:25246940
NASA Astrophysics Data System (ADS)
Falugi, P.; Olaru, S.; Dumur, D.
2010-08-01
This article proposes an explicit robust predictive control solution based on linear matrix inequalities (LMIs). The considered predictive control strategy uses different local descriptions of the system dynamics and uncertainties and thus allows the handling of less conservative input constraints. The computed control law guarantees constraint satisfaction and asymptotic stability. The technique is effective for a class of nonlinear systems embedded into polytopic models. A detailed discussion of the procedures which adapt the partition of the state space is presented. For practical implementation, the construction of suitable (explicit) descriptions of the control law is described through concrete algorithms.
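The LMI feasibility problems underlying such predictive controllers can be prototyped with an off-the-shelf semidefinite solver. A minimal sketch, not the authors' algorithm: it certifies asymptotic stability of one (hypothetical) vertex of a polytopic model via a discrete-time Lyapunov LMI, assuming cvxpy with the SCS solver is available:

```python
import cvxpy as cp
import numpy as np

# One vertex of a polytopic model: x+ = A x (hypothetical data)
A = np.array([[0.9, 0.2],
              [-0.1, 0.8]])

# Find P > 0 with A' P A - P < 0, a Lyapunov certificate of stability
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),
               A.T @ P @ A - P << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)
prob.solve(solver=cp.SCS)
print("status:", prob.status)
print("P =\n", P.value)
```

A full design in the spirit of the article would stack one such constraint per vertex and per local description, together with input-constraint LMIs.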
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordström, Jan, E-mail: jan.nordstrom@liu.se; Wahlsten, Markus, E-mail: markus.wahlsten@liu.se
We consider a hyperbolic system with uncertainty in the boundary and initial data. Our aim is to show that different boundary conditions give different convergence rates of the variance of the solution. This means that we can with the same knowledge of data get a more or less accurate description of the uncertainty in the solution. A variety of boundary conditions are compared and both analytical and numerical estimates of the variance of the solution are presented. As an application, we study the effect of this technique on Maxwell's equations as well as on a subsonic outflow boundary for the Euler equations.
In-vivo analysis of ankle joint movement for patient-specific kinematic characterization.
Ferraresi, Carlo; De Benedictis, Carlo; Franco, Walter; Maffiodo, Daniela; Leardini, Alberto
2017-09-01
In this article, a method for the experimental in-vivo characterization of the ankle kinematics is proposed. The method is meant to improve personalization of various ankle joint treatments, such as surgical decision-making or design and application of an orthosis, possibly to increase their effectiveness. This characterization in fact would make the treatments more compatible with the specific patient's joint physiological conditions. This article describes the experimental procedure and the analytical method adopted, based on the instantaneous and mean helical axis theories. The results obtained in this experimental analysis reveal that more accurate techniques are necessary for a robust in-vivo assessment of the tibio-talar axis of rotation.
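The mean helical axis referred to above can be computed from a relative rigid-body pose by a screw decomposition. A minimal sketch with hypothetical rotation and translation values (not the authors' data), using numpy and scipy:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical relative tibio-talar motion between two sampled poses
axis = np.array([0.93, 0.0, 0.36])
axis /= np.linalg.norm(axis)
R = Rotation.from_rotvec(np.deg2rad(12.0) * axis)
t = np.array([1.5, 0.3, -0.2])            # translation, mm

rotvec = R.as_rotvec()
theta = np.linalg.norm(rotvec)            # helical rotation angle (rad)
n = rotvec / theta                        # unit direction of the helical axis
d = n @ t                                 # translation along the axis

# A point on the axis, from the classic screw decomposition
t_perp = t - d * n
p = 0.5 * (t_perp + np.cross(n, t_perp) / np.tan(theta / 2.0))
print(f"angle = {np.rad2deg(theta):.1f} deg, slide = {d:.2f} mm")
print("axis direction:", n, " axis point:", p)
```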
Atomic cobalt on nitrogen-doped graphene for hydrogen generation
Fei, Huilong; Dong, Juncai; Arellano-Jiménez, M. Josefina; Ye, Gonglan; Dong Kim, Nam; Samuel, Errol L.G.; Peng, Zhiwei; Zhu, Zhuan; Qin, Fan; Bao, Jiming; Yacaman, Miguel Jose; Ajayan, Pulickel M.; Chen, Dongliang; Tour, James M.
2015-01-01
Reduction of water to hydrogen through electrocatalysis holds great promise for clean energy, but its large-scale application relies on the development of inexpensive and efficient catalysts to replace precious platinum catalysts. Here we report an electrocatalyst for hydrogen generation based on very small amounts of cobalt dispersed as individual atoms on nitrogen-doped graphene. This catalyst is robust and highly active in aqueous media with very low overpotentials (30 mV). A variety of analytical techniques and electrochemical measurements suggest that the catalytically active sites are associated with the metal centres coordinated to nitrogen. This unusual atomic constitution of supported metals is suggestive of a new approach to preparing extremely efficient single-atom catalysts. PMID:26487368
Making the Dzyaloshinskii-Moriya interaction visible
NASA Astrophysics Data System (ADS)
Hrabec, A.; Belmeguenai, M.; Stashkevich, A.; Chérif, S. M.; Rohart, S.; Roussigné, Y.; Thiaville, A.
2017-06-01
Brillouin light spectroscopy is a powerful and robust technique for measuring the interfacial Dzyaloshinskii-Moriya interaction in thin films with broken inversion symmetry. Here, we show that the magnon visibility, i.e., the intensity of the inelastically scattered light, strongly depends on the thickness of the dielectric seed material, SiO2. By using both analytical thin-film optics and numerical calculations, we reproduce the experimental data. We therefore provide a guideline for the maximization of the signal by adapting the substrate properties to the geometry of the measurement. Such a boost of the signal eases the magnon visualization in ultrathin magnetic films, speeds up the measurement and increases the reliability of the data.
ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.
Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa
2016-05-01
The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. The automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the pattern of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies of fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparisons with some of the most representative state-of-the-art works. Unlike most of the other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
SPECTROSCOPIC ONLINE MONITORING FOR PROCESS CONTROL AND SAFEGUARDING OF RADIOCHEMICAL STREAMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Samuel A.; Levitskaia, Tatiana G.
2013-09-29
There is a renewed interest worldwide to promote the use of nuclear power and close the nuclear fuel cycle. The long-term successful use of nuclear power is critically dependent upon adequate and safe processing and disposition of the used nuclear fuel. Liquid-liquid extraction is a separation technique commonly employed for the processing of the dissolved used nuclear fuel. The instrumentation used to monitor these processes must be robust, require little or no maintenance, and be able to withstand harsh environments such as high radiation fields and aggressive chemical matrices. This paper summarizes the application of absorption and vibrational spectroscopic techniques, supplemented by physicochemical measurements, for radiochemical process monitoring. In this context, our team experimentally assessed the potential of Raman and spectrophotometric techniques for online real-time monitoring of the U(VI)/nitrate ion/nitric acid and Pu(IV)/Np(V)/Nd(III), respectively, in solutions relevant to spent fuel reprocessing. These techniques demonstrate robust performance in the repetitive batch measurements of each analyte in a wide concentration range using simulant and commercial dissolved spent fuel solutions. Spectroscopic measurements served as training sets for the multivariate data analysis to obtain partial least squares predictive models, which were validated using on-line centrifugal contactor extraction tests. Satisfactory prediction of the analyte concentrations in these preliminary experiments warrants further development of the spectroscopy-based methods for radiochemical process control and safeguarding. Additionally, the ability to identify material intentionally diverted from a liquid-liquid extraction contactor system was successfully tested using on-line process monitoring as a means to detect the amount of material diverted. A chemical diversion and its detection in a liquid-liquid extraction scheme were demonstrated using a centrifugal contactor system operating with the simulant PUREX extraction system of Nd(NO3)3/nitric acid aqueous phase and TBP/n-dodecane organic phase. During a continuous extraction experiment, a portion of the feed from a counter-current extraction system was diverted while the spectroscopic on-line process monitoring system was simultaneously measuring the feed, raffinate and organic product streams. The amount observed to be diverted by on-line spectroscopic process monitoring was in excellent agreement with values based on the known mass of sample directly taken (diverted) from the system feed solution.
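The partial least squares step described above is easy to prototype. A minimal sketch, assuming synthetic single-band "spectra" standing in for Raman training data and scikit-learn's PLSRegression (not the authors' models):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical training set: 60 spectra (200 channels) whose band
# intensity scales linearly with analyte concentration, plus noise.
conc = rng.uniform(0.1, 2.0, 60)                     # mol/L
channels = np.linspace(0.0, 1.0, 200)
band = np.exp(-((channels - 0.5) ** 2) / 0.002)      # one synthetic band
spectra = conc[:, None] * band + 0.01 * rng.standard_normal((60, 200))

# Cross-validated concentration predictions from the spectra
pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, conc, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - conc) ** 2))
print(f"RMSECV = {rmsecv:.4f} mol/L")
```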
NASA Astrophysics Data System (ADS)
Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd
2009-05-01
Infrastructure management (and its associated processes) is complex to understand and perform, and it is thus hard to make efficient, effective and informed decisions. The management involves a multi-faceted operation that requires the most robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs from the following aspects: 1) Better understanding and enforcement of complex inspection processes that can bridge the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos and ground-mounted LIDAR etc.) with domain application knowledge to support a machine-understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through the flexible Service-oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.
Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta
2017-02-01
The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic Design of Experiments approach was applied to optimize ESI source parameters and to evaluate method robustness, yielding a rapid, stable and cost-effective assay. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml; R² > 0.98). The accuracies and the intra- and interday precisions were within 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as reduced usage of the chemicals necessary for regular method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
Hybrid passive/active damping for robust multivariable acoustic control in composite plates
NASA Astrophysics Data System (ADS)
Veeramani, Sudha; Wereley, Norman M.
1996-05-01
Noise transmission through a flexible Kevlar-epoxy composite trim panel into an acoustic cavity or box is studied with the intent of controlling the interior sound fields. A hybrid noise attenuation technique is proposed which uses viscoelastic damping layers in the composite plate for passive attenuation of high frequency noise transmission, and uses piezoelectric patch actuators for active control in the low frequency range. An adaptive feedforward noise control strategy is applied. The passive structural damping augmentation incorporated in the composite plates is also intended to increase the stability robustness of the active noise control strategy. A condenser microphone in the interior of the enclosure functions as the error sensor. Three composite plates were experimentally evaluated: one with no damping layer, the second with a 10 mil damping layer, and the third with a 15 mil damping layer. The damping layer was cocured in the Kevlar-epoxy trim panels. Damping in the plates was increased from 1.6% for the plate with no damping layer to 5.9% for the plate with a 15 mil damping layer. In experimental studies, the improved stability robustness of the controller was demonstrated by improved adaptive feedforward control algorithm convergence. A preliminary analytical model is presented that describes the dynamic behavior of a composite panel actuated by piezoelectric actuators bonded to its surface.
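Adaptive feedforward control of this kind is commonly implemented as filtered-x LMS (FxLMS); the abstract does not name the exact algorithm, so the following is a generic sketch with hypothetical primary and secondary paths and an assumed-perfect secondary-path model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
x = rng.standard_normal(n)                    # reference (noise) signal

primary = np.array([0.0, 0.4, 0.25, 0.1])     # hypothetical primary path
secondary = np.array([0.0, 0.8, 0.3])         # hypothetical secondary path
s_hat = secondary.copy()                      # assumed-perfect plant model

d = np.convolve(x, primary)[:n]               # disturbance at error sensor
xf = np.convolve(x, s_hat)[:n]                # filtered reference

w = np.zeros(16)                              # adaptive FIR controller
xbuf = np.zeros_like(w)
fxbuf = np.zeros_like(w)
ybuf = np.zeros(len(secondary))
mu = 0.005
err = np.zeros(n)

for k in range(n):
    xbuf = np.roll(xbuf, 1)
    xbuf[0] = x[k]
    y = w @ xbuf                              # control signal
    ybuf = np.roll(ybuf, 1)
    ybuf[0] = y
    e = d[k] - secondary @ ybuf               # residual at error microphone
    fxbuf = np.roll(fxbuf, 1)
    fxbuf[0] = xf[k]
    w += mu * e * fxbuf                       # FxLMS gradient update
    err[k] = e

print(f"MSE first 1000 samples: {np.mean(err[:1000]**2):.4f}")
print(f"MSE last 1000 samples : {np.mean(err[-1000:]**2):.4f}")
```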
Missing Value Monitoring Enhances the Robustness in Proteomics Quantitation.
Matafora, Vittoria; Corno, Andrea; Ciliberto, Andrea; Bachi, Angela
2017-04-07
In global proteomic analysis, it is estimated that protein abundances span from millions of copies to fewer than 100 copies per cell. The challenge of protein quantitation by classic shotgun proteomic techniques stems from the presence of missing values in peptides belonging to low-abundance proteins, which lowers intra-run reproducibility and affects downstream statistical analysis. Here, we present a new analytical workflow, MvM (missing value monitoring), able to recover quantitation of missing values generated by shotgun analysis. In particular, we used confident data-dependent acquisition (DDA) quantitation only for proteins measured in all the runs, while we filled the missing values with data-independent acquisition analysis using the library previously generated in DDA. We analyzed cell cycle regulated proteins, as they are low abundance proteins with highly dynamic expression levels. Indeed, we found that cell cycle related proteins are the major components of the missing-values-rich proteome. Using the MvM workflow, we doubled the number of robustly quantified cell cycle related proteins, and we reduced the number of missing values, achieving robust quantitation for proteins over ∼50 molecules per cell. MvM allows lower quantification variance among replicates for low abundance proteins with respect to DDA analysis, which demonstrates the potential of this novel workflow to measure low abundance, dynamically regulated proteins.
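The core MvM bookkeeping, keeping DDA values where complete and filling gaps from a DIA re-extraction, can be expressed in a few lines of table logic. A minimal sketch with toy peptide tables (hypothetical intensities), using pandas:

```python
import numpy as np
import pandas as pd

# Hypothetical peptide intensities from three replicate runs:
# DDA values (NaN = missing) and DIA values extracted with the DDA library.
dda = pd.DataFrame({"run1": [1.2e6, np.nan, 3.1e4],
                    "run2": [1.1e6, 8.0e5, np.nan],
                    "run3": [1.3e6, 7.6e5, 2.8e4]},
                   index=["PEP_A", "PEP_B", "PEP_C"])
dia = pd.DataFrame({"run1": [1.2e6, 7.7e5, 3.0e4],
                    "run2": [1.1e6, 7.9e5, 2.9e4],
                    "run3": [1.2e6, 7.5e5, 2.7e4]},
                   index=dda.index)

complete = dda.dropna()                                 # measured in all runs
patched = dda.loc[dda.isna().any(axis=1)].fillna(dia)   # fill gaps from DIA
result = pd.concat([complete, patched])
print(result)
```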
Robust Feedback Control of Flow Induced Structural Radiation of Sound
NASA Technical Reports Server (NTRS)
Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.
1997-01-01
A significant component of the interior noise of aircraft and automobiles is a result of turbulent boundary layer excitation of the vehicular structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow induced structural sound radiation problem. The problem is shown to be broadband in nature with large system uncertainties associated with the various operating conditions. Furthermore, the delay associated with sound propagation is shown to restrict the use of microphone feedback. The state-of-the-art control methodologies, μ synthesis and adaptive feedback control, are evaluated and shown to have limited success for solving this problem. A robust frequency domain controller design methodology is developed for the problem of sound radiated from turbulent flow driven plates. The control design methodology uses frequency domain sequential loop shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum phase zeros such that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems where there are high bandwidth control objectives requiring a low controller DC gain and a low controller order.
An adaptive robust controller for time delay maglev transportation systems
NASA Astrophysics Data System (ADS)
Milani, Reza Hamidi; Zarabadipour, Hassan; Shahnazi, Reza
2012-12-01
For engineering systems, uncertainties and time delays are two important issues that must be considered in control design. Uncertainties are often encountered in various dynamical systems due to modeling errors, measurement noise, linearization and approximations. Time delays have always been among the most difficult problems encountered in process control. In practical applications of feedback control, time delay arises frequently and can severely degrade closed-loop system performance and, in some cases, drive the system to instability. Therefore, stability analysis and controller synthesis for uncertain nonlinear time-delay systems are important both in theory and in practice, and many analytical techniques have been developed using delay-dependent Lyapunov functions. In the past decade, the magnetic levitation (maglev) transportation system, as a new system with high functionality, has been the focus of numerous studies. However, maglev transportation systems are highly nonlinear, and designing controllers for them is thus challenging. The main topic of this paper is to design an adaptive robust controller for maglev transportation systems with time delay, parametric uncertainties and external disturbances. In this paper, an adaptive robust control (ARC) scheme is designed for this purpose. It should be noted that the adaptive gain is derived from the Lyapunov-Krasovskii synthesis method, therefore asymptotic stability is guaranteed.
Neural network uncertainty assessment using Bayesian statistics: a remote sensing application
NASA Technical Reports Server (NTRS)
Aires, F.; Prigent, C.; Rossow, W. B.
2004-01-01
Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component analysis is proposed to suppress the multicollinearities in order to make these Jacobians robust and physically meaningful.
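The Monte Carlo check on the Jacobians reads directly as: perturb the weights, recompute the input-output Jacobian, and summarize its spread. A minimal sketch, with a tiny tanh network with random weights standing in for a trained model and iid Gaussian weight noise standing in for the Bayesian posterior:

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny network: 3 inputs -> 5 hidden (tanh) -> 1 output
W1 = 0.5 * rng.standard_normal((5, 3))
b1 = np.zeros(5)
W2 = 0.5 * rng.standard_normal((1, 5))

def jacobian(x, W1, b1, W2):
    """d(output)/d(inputs) of a one-hidden-layer tanh network (chain rule)."""
    h = np.tanh(W1 @ x + b1)
    return (W2 * (1.0 - h**2)) @ W1       # shape (1, 3)

x0 = np.array([0.2, -0.1, 0.5])
sigma_w = 0.05                            # assumed posterior width

# Monte Carlo over weight uncertainty
jacs = np.array([jacobian(x0,
                          W1 + sigma_w * rng.standard_normal(W1.shape),
                          b1,
                          W2 + sigma_w * rng.standard_normal(W2.shape))
                 for _ in range(2000)])
print("Jacobian mean:", jacs.mean(axis=0).ravel())
print("Jacobian std :", jacs.std(axis=0).ravel())
```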
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and how they can assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
NASA Astrophysics Data System (ADS)
Santos, Léonard; Thirel, Guillaume; Perrin, Charles
2018-04-01
In many conceptual rainfall-runoff models, the water balance differential equations are not explicitly formulated. These differential equations are solved sequentially by splitting the equations into terms that can be solved analytically with a technique called operator splitting. As a result, only the solutions of the split equations are used to present the different models. This article provides a methodology to make the governing water balance equations of a bucket-type rainfall-runoff model explicit and to solve them continuously. This is done by setting up a comprehensive state-space representation of the model. By representing it in this way, the operator splitting, which makes the structural analysis of the model more complex, can be removed. In this state-space representation, the lag functions (unit hydrographs), which are frequent in rainfall-runoff models and make the resolution of the representation difficult, are first replaced by a so-called Nash cascade and then solved with a robust numerical integration technique. To illustrate this methodology, the GR4J model is taken as an example. The substitution of the unit hydrographs with a Nash cascade, even if it modifies the model behaviour when solved using operator splitting, does not modify it when the state-space representation is solved using an implicit integration technique. Indeed, the flow time series simulated by the new representation of the model are very similar to those simulated by the classic model. The use of a robust numerical technique that approximates a continuous-time model also improves the lag parameter consistency across time steps and provides a more time-consistent model with time-independent parameters.
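A Nash cascade is itself a small linear ODE system, so the substitution described above can be reproduced with a stiff (implicit) integrator. A minimal sketch, assuming 3 reservoirs, a hypothetical time constant, and a one-day pulse of effective rainfall, using scipy's BDF method:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Nash cascade of n linear reservoirs replacing a unit hydrograph:
# dS_i/dt = q_{i-1} - S_i / k, with outflow q_i = S_i / k.
n, k = 3, 2.0                                # 3 reservoirs, k = 2 days

def rhs(t, S, inflow):
    q_in = np.concatenate(([inflow(t)], S[:-1] / k))
    return q_in - S / k

inflow = lambda t: 1.0 if t < 1.0 else 0.0   # 1-day rain pulse (mm/day)

sol = solve_ivp(rhs, (0.0, 20.0), np.zeros(n), args=(inflow,),
                method="BDF", dense_output=True)   # implicit integrator
t = np.linspace(0.0, 20.0, 9)
print("cascade outflow:", np.round(sol.sol(t)[-1] / k, 4))
```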
The Analytic Hierarchy Process and Participatory Decisionmaking
Daniel L. Schmoldt; Daniel L. Peterson; Robert L. Smith
1995-01-01
Managing natural resource lands requires social, as well as biophysical, considerations. Unfortunately, it is extremely difficult to accurately assess and quantify changing social preferences, and to aggregate conflicting opinions held by diverse social groups. The Analytic Hierarchy Process (AHP) provides a systematic, explicit, rigorous, and robust mechanism for...
Robust approximate optimal guidance strategies for aeroassisted orbital transfer missions
NASA Astrophysics Data System (ADS)
Ilgen, Marc R.
This thesis presents the application of game theoretic and regular perturbation methods to the problem of determining robust approximate optimal guidance laws for aeroassisted orbital transfer missions with atmospheric density and navigated state uncertainties. The optimal guidance problem is reformulated as a differential game problem with the guidance law designer and Nature as opposing players. The resulting equations comprise the necessary conditions for the optimal closed loop guidance strategy in the presence of worst case parameter variations. While these equations are nonlinear and cannot be solved analytically, the presence of a small parameter in the equations of motion allows the method of regular perturbations to be used to solve the equations approximately. This thesis is divided into five parts. The first part introduces the class of problems to be considered and presents results of previous research. The second part then presents explicit semianalytical guidance law techniques for the aerodynamically dominated region of flight. These guidance techniques are applied to unconstrained and control constrained aeroassisted plane change missions and Mars aerocapture missions, all subject to significant atmospheric density variations. The third part presents a guidance technique for aeroassisted orbital transfer problems in the gravitationally dominated region of flight. Regular perturbations are used to design an implicit guidance technique similar to the second variation technique but that removes the need for numerically computing an optimal trajectory prior to flight. This methodology is then applied to a set of aeroassisted inclination change missions. In the fourth part, the explicit regular perturbation solution technique is extended to include the class of guidance laws with partial state information. This methodology is then applied to an aeroassisted plane change mission using inertial measurements and subject to uncertainties in the initial value of the flight path angle. A summary of performance results for all these guidance laws is presented in the fifth part of this thesis along with recommendations for further research.
The study of nanomaterials in environmental systems requires robust and specific analytical methods. Analytical methods which discriminate based on particle size and molecular composition are not widely available. Asymmetric Flow Field-Flow Fractionation (AF4) is a separation...
Validation of the replica trick for simple models
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2018-04-01
We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
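For reference, the identity at the heart of the replica trick, whose continuation in the replica number n from the integers to real n near zero is what the paper examines on simple models:

```latex
% Replica trick: the quenched average of ln Z is recovered from the
% integer moments of the partition function Z by analytic continuation
% in the replica number n:
\[
  \mathbb{E}[\ln Z]
  = \lim_{n \to 0} \frac{\mathbb{E}[Z^n] - 1}{n}
  = \lim_{n \to 0} \frac{\partial}{\partial n} \ln \mathbb{E}[Z^n].
\]
% E[Z^n] is evaluated for integer n (n coupled replicas of the system)
% and then continued to real n near 0.
```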
Biomedical applications of laser-induced breakdown spectroscopy (LIBS)
NASA Astrophysics Data System (ADS)
Unnikrishnan, V. K.; Nayak, Rajesh; Bhat, Sujatha; Mathew, Stanley; Kartha, V. B.; Santhosh, C.
2015-03-01
LIBS has been proven to be a robust elemental analysis tool attracting interest because of its wide range of applications. LIBS can be used for the analysis of any type of sample, i.e. environmental or physiological, regardless of its state of matter. Conventional spectroscopy techniques are good in analytical performance, but their sample preparation methods are mostly destructive and time consuming. Also, almost all of these methods are incapable of analysing multiple elements simultaneously. On the other hand, LIBS has many potential advantages such as simplicity of the experimental setup, minimal sample preparation, less destructive analysis of the sample, etc. In this paper, we report some of the biomedical applications of LIBS. From the experiments carried out on clinical samples (calcified tissues or teeth and gall stones) for trace elemental mapping and detection, it was found that LIBS is a robust tool for such applications. It is seen that the presence and relative concentrations of major elements (calcium, phosphorus and magnesium) in human calcified tissue (tooth) can be easily determined using the LIBS technique. The importance of this study lies in anthropology, where tooth and bone are the main samples from which reliable data can be easily retrieved. Similarly, the elemental compositions of bile juice and a gall stone collected from the same subject using LIBS were found to be similar. The results show interesting prospects for LIBS to better study cholelithiasis (the presence of stones in the gall bladder, a common disease of the gastrointestinal tract).
NASA Astrophysics Data System (ADS)
Ma, Xu; Li, Yanqiu; Guo, Xuejia; Dong, Lisong
2012-03-01
Optical proximity correction (OPC) and phase shifting mask (PSM) are the most widely used resolution enhancement techniques (RET) in the semiconductor industry. Recently, a set of OPC and PSM optimization algorithms have been developed to solve the inverse lithography problem, but they are designed only for the nominal imaging parameters, without giving sufficient attention to the process variations due to aberrations, defocus and dose variation. However, the effects of process variations existing in practical optical lithography systems become more pronounced as the critical dimension (CD) continuously shrinks. On the other hand, lithography systems with larger NA (NA>0.6) are now extensively used, rendering the scalar imaging models inadequate to describe the vector nature of the electromagnetic field in current optical lithography systems. In order to tackle the above problems, this paper focuses on developing gradient-based OPC and PSM optimization algorithms that are robust to process variations under a vector imaging model. To achieve this goal, an integrative and analytic vector imaging model is applied to formulate the optimization problem, where the effects of process variations are explicitly incorporated in the optimization framework. The steepest descent algorithm is used to optimize the mask iteratively. In order to improve the efficiency of the proposed algorithms, a set of algorithm acceleration techniques (AAT) are exploited during the optimization procedure.
Cirujeda, Pol; Muller, Henning; Rubin, Daniel; Aguilera, Todd A; Loo, Billy W; Diehn, Maximilian; Binefa, Xavier; Depeursinge, Adrien
2015-01-01
In this paper we present a novel technique for characterizing and classifying 3D textured volumes belonging to different lung tissue types in 3D CT images. We build a volume-based 3D descriptor, robust to changes of size, rigid spatial transformations and texture variability, thanks to the integration of Riesz-wavelet features within a Covariance-based descriptor formulation. 3D Riesz features characterize the morphology of tissue density due to their response to changes in intensity in CT images. These features are encoded in a Covariance-based descriptor formulation: this provides a compact and flexible representation thanks to the use of feature variations rather than dense features themselves, and adds robustness to spatial changes. Furthermore, the symmetric positive definite matrix form of these descriptors causes them to lie on a Riemannian manifold. Thus, descriptors can be compared with analytical measures, and accurate techniques from machine learning and clustering can be adapted to their spatial domain. Additionally, we present a classification model following a "Bag of Covariance Descriptors" paradigm in order to distinguish three different nodule tissue types in CT: solid, ground-glass opacity, and healthy lung. The method is evaluated on an acquired dataset of 95 patients with ground truth manually delineated in 3D by radiation oncology specialists, and quantitative sensitivity and specificity values are presented.
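Because covariance descriptors are symmetric positive definite, they are compared with Riemannian rather than Euclidean measures. A minimal sketch, with random feature maps standing in for the 3D Riesz features and the affine-invariant metric as one common choice of analytical measure on the SPD manifold:

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def covariance_descriptor(features):
    """Covariance of per-voxel feature vectors (rows = voxels)."""
    return np.cov(features, rowvar=False)

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices."""
    A_inv_half = fractional_matrix_power(A, -0.5)
    return np.linalg.norm(logm(A_inv_half @ B @ A_inv_half), "fro")

rng = np.random.default_rng(3)
# Hypothetical feature maps for two tissue patches (1000 voxels x 6 features)
patch1 = rng.standard_normal((1000, 6))
patch2 = 1.5 * rng.standard_normal((1000, 6)) + 0.2

d = airm_distance(covariance_descriptor(patch1), covariance_descriptor(patch2))
print(f"AIRM distance = {d:.3f}")
```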
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
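The generalized parity space idea can be illustrated on a toy observable model: parity vectors annihilate the stacked observability map, so the residual stays near zero on fault-free data and reacts to a sensor fault. A minimal sketch (hypothetical two-state model, no input or noise for simplicity):

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical discrete-time model with one sensor
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
C = np.array([[1.0, 0.0]])

# Stacked observability map over a window of s+1 samples
s = 2
O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(s + 1)])

# Parity vectors v satisfy v @ O = 0, so v applied to the stacked
# outputs [y_k, y_{k+1}, y_{k+2}] vanishes on fault-free trajectories.
V = null_space(O.T).T
print("parity matrix V:", V)

x = np.array([1.0, -0.5])
window = []
for _ in range(s + 1):
    window.append((C @ x)[0])
    x = A @ x

print("residual (fault-free):", V @ np.array(window))   # ~ 0
print("residual (sensor bias):",
      V @ (np.array(window) + np.array([0.0, 0.3, 0.0])))
```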
Robustness and fragility in coupled oscillator networks under targeted attacks.
Yuan, Tianyu; Aihara, Kazuyuki; Tanaka, Gouhei
2017-01-01
The dynamical tolerance of coupled oscillator networks against local failures is studied. As the fraction of failed oscillator nodes gradually increases, the mean oscillation amplitude in the entire network decreases and then suddenly vanishes at a critical fraction as a phase transition. This critical fraction, widely used as a measure of the network robustness, was analytically derived for random failures but not for targeted attacks so far. Here we derive the general formula for the critical fraction, which can be applied to both random failures and targeted attacks. We consider the effects of targeting oscillator nodes based on their degrees. First we deal with coupled identical oscillators with homogeneous edge weights. Then our theory is applied to networks with heterogeneous edge weights and to those with nonidentical oscillators. The analytical results are validated by numerical experiments. Our results reveal the key factors governing the robustness and fragility of oscillator networks.
Analytical techniques for steroid estrogens in water samples - A review.
Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza
2016-12-01
In recent years, environmental concerns over ultra-trace levels of steroid estrogen concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were instrumental and non-instrumental techniques, namely gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit and simplicity, its rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments
NASA Astrophysics Data System (ADS)
Lane, Peter C. R.; Gobet, Fernand
2013-03-01
Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the 'speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
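The non-dominated sorting at the core of such genetic algorithms is compact to state. A minimal sketch of plain non-dominated sorting only, without the speciation extension proposed in the article, on hypothetical two-objective fit errors:

```python
import numpy as np

def dominates(a, b):
    """a dominates b: no worse in every objective, better in at least one."""
    return np.all(a <= b) and np.any(a < b)

def non_dominated_sort(costs):
    """Assign each point the index of the Pareto front it belongs to."""
    n = len(costs)
    fronts = np.zeros(n, dtype=int)
    remaining = set(range(n))
    front = 0
    while remaining:
        current = {i for i in remaining
                   if not any(dominates(costs[j], costs[i])
                              for j in remaining if j != i)}
        for i in current:
            fronts[i] = front
        remaining -= current
        front += 1
    return fronts

# Hypothetical candidates: (error on dataset 1, error on dataset 2)
costs = np.array([[0.10, 0.30], [0.12, 0.25], [0.20, 0.20],
                  [0.15, 0.40], [0.30, 0.10], [0.25, 0.35]])
print(non_dominated_sort(costs))   # [0 0 0 1 0 1]
```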
Aerocapture Guidance Algorithm Comparison Campaign
NASA Technical Reports Server (NTRS)
Rousseau, Stephane; Perot, Etienne; Graves, Claude; Masciarelli, James P.; Queen, Eric
2002-01-01
Aerocapture is a promising technique for future human interplanetary missions. The Mars Sample Return was initially based on an insertion by aerocapture. A CNES orbiter, Mars Premier, was developed to demonstrate this concept. Mainly due to budget constraints, the aerocapture was cancelled for the French orbiter. Many studies were carried out during the last three years to develop and test different guidance algorithms (APC, EC, TPC, NPC). This work was shared between CNES and NASA, with a fruitful joint working group. To conclude this study, an evaluation campaign was performed to test the different algorithms. The objective was to assess the robustness, accuracy, capability to limit the load, and the complexity of each algorithm. A simulation campaign was specified and performed by CNES, with a similar activity on the NASA side to confirm the CNES results. This evaluation demonstrated that the numerical guidance principle is not competitive compared to the analytical concepts. All the other algorithms are well adapted to guarantee the success of the aerocapture. The TPC appears to be the most robust, the APC the most accurate, and the EC appears to be a good compromise.
Digital robust active control law synthesis for large order systems using constrained optimization
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
1987-01-01
This paper presents a direct digital control law synthesis procedure for a large order, sampled data, linear feedback system using constrained optimization techniques to meet multiple design requirements. A linear quadratic Gaussian type cost function is minimized while satisfying a set of constraints on the design loads and responses. General expressions for gradients of the cost function and constraints with respect to the digital control law design variables are derived analytically and computed by solving a set of discrete Liapunov equations. The designer can choose the structure of the control law and the design variables, hence a stable classical control law as well as an estimator-based full or reduced order control law can be used as an initial starting point. Selected design responses can be treated as constraints instead of lumping them into the cost function. This feature can be used to modify a control law to meet individual root mean square response limitations as well as minimum singular value restrictions. Low order, robust digital control laws were synthesized for gust load alleviation of a flexible remotely piloted drone aircraft.
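The cost evaluations behind such a synthesis reduce to discrete Lyapunov solves. A minimal sketch (hypothetical sampled-data model and gain, not the drone design of the paper) evaluating an LQG-type cost and checking one gradient entry by finite differences, the kind of computation the analytical gradient expressions replace:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical sampled closed loop: x+ = (A - B K) x + w,  w ~ N(0, W)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
W = 0.01 * np.eye(2)              # process-noise covariance
Q = np.diag([1.0, 0.1])           # state weighting
R = np.array([[0.5]])             # control weighting

def lq_cost(K):
    """Steady-state E[x'Qx + u'Ru] via a discrete Lyapunov equation."""
    Acl = A - B @ K
    X = solve_discrete_lyapunov(Acl, W)   # Acl X Acl' - X + W = 0
    return np.trace((Q + K.T @ R @ K) @ X)

K = np.array([[1.0, 1.5]])
print(f"J(K) = {lq_cost(K):.4f}")

dK = np.zeros_like(K)
dK[0, 0] = 1e-6
print("dJ/dK[0,0] ~", (lq_cost(K + dK) - lq_cost(K)) / 1e-6)
```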
Digital robust control law synthesis using constrained optimization
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivekananda
1989-01-01
Development of digital robust control laws for active control of high performance flexible aircraft and large space structures is a research area of significant practical importance. The flexible system is typically modeled by a large order state space system of equations in order to accurately represent the dynamics. The active control law must satisfy multiple conflicting design requirements and maintain certain stability margins, yet should be simple enough to be implementable on an onboard digital computer. Described here is an application of a generic digital control law synthesis procedure for such a system, using optimal control theory and constrained optimization techniques. A linear quadratic Gaussian type cost function is minimized by updating the free parameters of the digital control law, while trying to satisfy a set of constraints on the design loads, responses and stability margins. Analytical expressions for the gradients of the cost function and the constraints with respect to the control law design variables are used to facilitate rapid numerical convergence. These gradients can be used for sensitivity studies and may be integrated into a simultaneous structure and control optimization scheme.
An improved 3D MoF method based on analytical partial derivatives
NASA Astrophysics Data System (ADS)
Chen, Xiang; Zhang, Xiong
2016-12-01
The MoF (Moment of Fluid) method is one of the most accurate approaches among various surface reconstruction algorithms. Like other second order methods, the MoF method needs to solve an implicit optimization problem to obtain the optimal approximate surface. Therefore, the partial derivatives of the objective function have to be involved during the iteration for efficiency and accuracy. However, to the best of our knowledge, the derivatives are currently estimated numerically by finite difference approximation because it is very difficult to obtain the analytical derivatives of the objective function for an implicit optimization problem. Employing numerical derivatives in an iteration not only increases the computational cost, but also deteriorates the convergence rate and robustness of the iteration due to their numerical error. In this paper, the analytical first order partial derivatives of the objective function are deduced for 3D problems. The analytical derivatives can be calculated accurately, so they are incorporated into the MoF method to improve its accuracy, efficiency and robustness. Numerical studies show that by using the analytical derivatives the iterations converge in all mixed cells with an efficiency improvement of 3 to 4 times.
Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M
2013-06-15
Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can be used to measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) were achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.
Beam position monitor engineering
NASA Astrophysics Data System (ADS)
Smith, Stephen R.
1997-01-01
The design of beam position monitors often involves challenging system design choices. Position transducers must be robust and accurate, and must generate an adequate position signal without unduly disturbing the beam. Electronics must be reliable and affordable, usually while meeting tough requirements on precision, accuracy, and dynamic range. These requirements may be difficult to achieve simultaneously, leading the designer into interesting opportunities for optimization or compromise. Some useful techniques and tools are shown. Both finite element analysis and analytic techniques will be used to investigate quasi-static aspects of electromagnetic fields, such as the impedance of striplines and buttons and the coupling of the beam to them. Finite-element tools will be used to understand dynamic aspects of the electromagnetic fields of beams, such as wake fields and transmission-line and cavity effects in vacuum-to-air feedthroughs. Mathematical modeling of electrical signals through a processing chain will be demonstrated, in particular to illuminate areas where neither a pure time-domain nor a pure frequency-domain analysis is obviously advantageous. Emphasis will be on calculational techniques, in particular on using both time domain and frequency domain approaches for the applicable parts of interesting problems.
Improved determination of vector lithospheric magnetic anomalies from MAGSAT data
NASA Technical Reports Server (NTRS)
Ravat, Dhananjay
1993-01-01
Scientific contributions made in developing new methods to isolate and map vector magnetic anomalies from measurements made by Magsat are described. In addition to the objective of the proposal (the isolation and mapping of equatorial vector lithospheric Magsat anomalies), the isolation of polar ionospheric fields during the period was also studied. Significant progress was also made in the isolation of polar delta(Z) component and scalar anomalies, as well as in the integration and synthesis of various techniques for removing equatorial and polar ionospheric effects. The significant contributions of this research are: (1) development of empirical/analytical techniques for modeling ionospheric fields in Magsat data and removing them from uncorrected anomalies to obtain better estimates of lithospheric anomalies (this task was accomplished for equatorial delta(X), delta(Z), and delta(B) component and polar delta(Z) and delta(B) component measurements); (2) integration of important processing techniques developed during the last decade with the newly developed technologies of ionospheric field modeling into an optimum processing scheme; and (3) implementation of the above processing scheme to map the most robust magnetic anomalies of the lithosphere (components as well as scalar).
Robustness of high-fidelity Rydberg gates with single-site addressability
NASA Astrophysics Data System (ADS)
Goerz, Michael H.; Halperin, Eli J.; Aytac, Jon M.; Koch, Christiane P.; Whaley, K. Birgitta
2014-09-01
Controlled-phase (cphase) gates can be realized with trapped neutral atoms by making use of the Rydberg blockade. Achieving the ultrahigh fidelities required for quantum computation with such Rydberg gates, however, is compromised by experimental inaccuracies in pulse amplitudes and timings, as well as by stray fields that cause fluctuations of the Rydberg levels. We report here a comparative study of analytic and numerical pulse sequences for the Rydberg cphase gate that specifically examines the robustness of the gate fidelity with respect to such experimental perturbations. Analytical pulse sequences, based on both simultaneous excitation and stimulated Raman adiabatic passage (STIRAP), are found to be at best moderately robust under these perturbations. In contrast, optimal control theory is seen to allow generation of numerical pulses that are inherently robust within a predefined tolerance window. The resulting numerical pulse shapes display simple modulation patterns and can be rationalized in terms of an interference between distinct two-photon Rydberg excitation pathways. Pulses of such low complexity should be experimentally feasible, allowing gate fidelities of order 99.90-99.99% to be achievable under realistic experimental conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.
Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment. As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 µmol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GCxGC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.
New robust bilinear least squares method for the analysis of spectral-pH matrix data.
Goicoechea, Héctor C; Olivieri, Alejandro C
2005-07-01
A new second-order multivariate method has been developed for the analysis of spectral-pH matrix data, based on a bilinear least-squares (BLLS) model that achieves the second-order advantage and handles multiple calibration standards. A simulated Monte Carlo study of synthetic absorbance-pH data allowed comparison of the newly proposed BLLS methodology with constrained parallel factor analysis (PARAFAC) and with the combined multivariate curve resolution-alternating least-squares (MCR-ALS) technique under different conditions of sample-to-sample pH mismatch and analyte-background ratio. The results indicate an improved prediction ability for the new method. Experimental data generated by measuring absorption spectra of several calibration standards of ascorbic acid and samples of orange juice were subjected to second-order calibration analysis with PARAFAC, MCR-ALS, and the new BLLS method. The results indicate that the latter method provides the best analytical results in regard to analyte recovery in samples of complex composition requiring strict adherence to the second-order advantage. Linear dependencies appear when multivariate data are produced by using the pH or a reaction time as one of the data dimensions, posing a challenge to classical multivariate calibration models. The presently discussed algorithm is useful for these latter systems.
Higher-order differential phase shift keyed modulation
NASA Astrophysics Data System (ADS)
Vanalphen, Deborah K.; Lindsey, William C.
1994-02-01
Advanced modulation/demodulation techniques which are robust in the presence of phase and frequency uncertainties continue to be of interest to communication engineers. We are particularly interested in techniques which accommodate slow channel phase and frequency variations with minimal performance degradation and which alleviate the need for phase and frequency tracking loops in the receiver. We investigate the performance sensitivity to frequency offsets of a modulation technique known as binary Double Differential Phase Shift Keying (DDPSK) and compare it to that of classical binary Differential Phase Shift Keying (DPSK). We also generalize our analytical results to include nth-order, M-ary DPSK. The DDPSK (n = 2) technique was first introduced in the Russian literature circa 1972 and was studied more thoroughly in the late 1970s by Pent and Okunev. Here, we present an expression for the symbol error probability that is easy to derive and to evaluate numerically. We also present graphical results that establish when, as a function of signal energy-to-noise ratio and normalized frequency offset, binary DDPSK is preferable to binary DPSK with respect to performance in additive white Gaussian noise. Finally, we provide insight into the optimum receiver from a detection theory viewpoint.
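A minimal Monte Carlo sketch of the comparison described above, assuming unit-energy symbols and a constant frequency offset per symbol (all values invented): the first-order phase difference used by DPSK absorbs the offset as a bias, while the second difference used by DDPSK cancels it.

    import numpy as np

    rng = np.random.default_rng(0)

    def ber(n_bits, ebn0_db, df, double_diff):
        """Bit error rate of binary DPSK / DDPSK with a frequency offset
        of df radians per symbol, in additive white Gaussian noise."""
        bits = rng.integers(0, 2, n_bits)
        dphi = np.pi * bits                  # information in phase increments
        if double_diff:
            dphi = np.cumsum(dphi)           # extra integration layer (DDPSK)
        phase = np.cumsum(dphi) + df * np.arange(1, n_bits + 1)
        sigma = np.sqrt(1.0 / (2.0 * 10**(ebn0_db / 10.0)))
        r = np.exp(1j * phase) + sigma * (rng.standard_normal(n_bits)
                                          + 1j * rng.standard_normal(n_bits))
        d1 = r[1:] * np.conj(r[:-1])         # first phase difference
        z = d1[1:] * np.conj(d1[:-1]) if double_diff else d1
        est = (np.real(z) < 0).astype(int)   # phase jump of pi -> bit 1
        ref = bits[2:] if double_diff else bits[1:]
        return np.mean(est != ref)

    for df in (0.0, 0.3):                    # offset in rad/symbol
        print(f"offset={df:.1f}  DPSK={ber(200000, 8, df, False):.4f}"
              f"  DDPSK={ber(200000, 8, df, True):.4f}")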
NASA Astrophysics Data System (ADS)
Larsen, Erik H.
1998-02-01
Achievement of optimum selectivity, sensitivity and robustness in speciation analysis using high performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP-MS) detection requires that each instrumental component is selected and optimized with a view to the ideal operating characteristics of the entire hyphenated system. An isocratic HPLC system, which employs an aqueous mobile phase with organic buffer constituents, is well suited for introduction into the ICP-MS because of the stability of the detector response and the high degree of analyte sensitivity attained. Anion and cation exchange HPLC systems, which meet these requirements, were used for the separation of selenium and arsenic species in crude extracts of biological samples. Furthermore, the signal-to-noise ratios obtained for these incompletely ionized elements in the argon ICP were further enhanced by a factor of four by continuously introducing carbon, as methanol, via the mobile phase into the ICP. Sources of error in the HPLC system (column overload), in the sample introduction system (memory effects from organic solvents) and in the ICP-MS (spectroscopic interferences), and their prevention, are also discussed. The optimized anion and cation exchange HPLC-ICP-MS systems were used for arsenic speciation in contaminated ground water and in an in-house shrimp reference sample. For the purpose of verification, HPLC coupled with tandem mass spectrometry with electrospray ionization was additionally used for arsenic speciation in the shrimp sample. With this analytical technique the HPLC retention time, in combination with mass analysis of the molecular ions and their collision-induced fragments, provides almost conclusive evidence of the identity of the analyte species. The speciation methods are validated by establishing a mass balance of the analytes in each fraction of the extraction procedure, by recovery of spikes, and by employing and comparing independent techniques. The urgent need for reference materials certified for elemental species is stressed.
Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2017-07-01
There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography has the capability to work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates (under visible light, fluorescence, and fluorescence quenching conditions, and using the visualization reagent phosphomolybdic acid) enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly and green chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.
Advanced DPSM approach for modeling ultrasonic wave scattering in an arbitrary geometry
NASA Astrophysics Data System (ADS)
Yadav, Susheel K.; Banerjee, Sourav; Kundu, Tribikram
2011-04-01
Several techniques are used to diagnose structural damage. In the ultrasonic technique structures are tested by analyzing ultrasonic signals scattered by the damage. The interpretation of these signals requires a good understanding of the interaction between ultrasonic waves and structures. Therefore, researchers need analytical or numerical techniques to gain a clear understanding of the interaction between ultrasonic waves and structural damage. However, modeling of the wave scattering phenomenon by conventional numerical techniques such as the finite element method requires a very fine mesh at high frequencies, necessitating heavy computational power. The distributed point source method (DPSM) is a newly developed robust mesh-free technique for simulating ultrasonic, electrostatic and electromagnetic fields. In most of the previous studies the DPSM technique has been applied to model two-dimensional surface geometries and simple three-dimensional scatterer geometries; it was difficult to perform the analysis for complex three-dimensional geometries. This technique has been extended here to model wave scattering in an arbitrary geometry. In this paper a channel section, idealized as a thin solid plate with several rivet holes, is formulated. The simulation has been carried out with and without cracks near the rivet holes. Further, a comparison study has also been carried out to characterize the crack. A computer code has been developed in C for modeling the ultrasonic field in a solid plate with and without cracks near the rivet holes.
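A minimal sketch of the DPSM superposition idea (not the authors' C code): the field is modeled as a sum of point-source Green's functions distributed slightly behind the radiating surface. Here the source strengths are simply set to unity; in the actual method they are solved from the boundary conditions on the transducer face and scatterer surfaces.

    import numpy as np

    k = 2 * np.pi / 1.5e-3                    # wavenumber for a 1.5 mm wavelength
    n_src = 40                                # point sources along the aperture
    src_x = np.linspace(-5e-3, 5e-3, n_src)   # 10 mm synthetic transducer
    src_z = np.full(n_src, -0.5e-3)           # sources slightly behind z = 0
    strength = np.ones(n_src)                 # uniform strengths (illustration only)

    def pressure(x, z):
        """Harmonic pressure at (x, z): superposition of spherical waves exp(ikr)/r."""
        r = np.hypot(x - src_x, z - src_z) + 1e-12
        return np.sum(strength * np.exp(1j * k * r) / r)

    # Sample the radiated field along the transducer axis.
    for z in (2e-3, 5e-3, 10e-3, 20e-3):
        print(f"z = {z * 1e3:4.1f} mm   |p| = {abs(pressure(0.0, z)):8.1f}")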
Code of Federal Regulations, 2013 CFR
2013-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2012 CFR
2012-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2014 CFR
2014-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Mischak, Harald; Vlahou, Antonia; Ioannidis, John P A
2013-04-01
Mass spectrometry platforms have attracted considerable interest over the last two decades as profiling tools for native peptides and proteins with clinical potential. However, limitations associated with reproducibility and analytical robustness, especially pronounced with the initial SELDI systems, hindered the application of such platforms in biomarker qualification and clinical implementation. The scope of this article is to give a short overview of the data available on the performance and analytical robustness of the different platforms for peptide profiling. Using the CE-MS platform as a paradigm, data on analytical performance are described, including reproducibility (short-term and intermediate repeatability), stability, interference, quantification capabilities (limits of detection), and inter-laboratory variability. We discuss these issues using as an example our experience with the development of a 273-peptide marker for chronic kidney disease. Finally, we discuss pros and cons and means for improvement, and emphasize the need to test, in terms of comparative clinical performance and impact, the different platforms that pass analytical validation tests reasonably well. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Qiu, Huazhang; Wu, Namei; Zheng, Yanjie; Chen, Min; Weng, Shaohuang; Chen, Yuanzhong; Lin, Xinhua
2015-01-01
A robust and versatile signal-on fluorescence sensing strategy was developed to provide label-free detection of various target analytes. The strategy used SYBR Green I dye and graphene oxide as signal reporter and signal-to-background ratio enhancer, respectively. Multidrug resistance protein 1 (MDR1) gene and mercury ion (Hg2+) were selected as target analytes to investigate the generality of the method. The linear relationship and specificity of the detections showed that the sensitive and selective analyses of target analytes could be achieved by the proposed strategy with low detection limits of 0.5 and 2.2 nM for MDR1 gene and Hg2+, respectively. Moreover, the strategy was used to detect real samples. Analytical results of MDR1 gene in the serum indicated that the developed method is a promising alternative approach for real applications in complex systems. Furthermore, the recovery of the proposed method for Hg2+ detection was acceptable. Thus, the developed label-free signal-on fluorescence sensing strategy exhibited excellent universality, sensitivity, and handling convenience. PMID:25565810
Net growth rate of continuum heterogeneous biofilms with inhibition kinetics.
Gonzo, Elio Emilio; Wuertz, Stefan; Rajal, Veronica B
2018-01-01
Biofilm systems can be modeled using a variety of analytical and numerical approaches, usually by making simplifying assumptions regarding biofilm heterogeneity and activity as well as effective diffusivity. Inhibition kinetics, albeit common in experimental systems, are rarely considered, and analytical approaches are either lacking or assume that the effective diffusivity of the substrate and the biofilm density remain constant. To address this obvious knowledge gap, an analytical procedure to estimate the effectiveness factor (dimensionless substrate mass flux at the biofilm-fluid interface) was developed for a continuum heterogeneous biofilm, covering kinetics ranging from multiple limiting-substrate Monod kinetics to different types of inhibition kinetics. The simple perturbation technique, previously validated to quantify biofilm activity, was applied to systems where either the substrate or the inhibitor is the limiting component, and to cases where the inhibitor is a reaction product or the substrate itself also acts as the inhibitor. Explicit analytical equations are presented for the effectiveness factor estimation and, therefore, for the calculation of the biomass growth rate or the limiting substrate/inhibitor consumption rate, for a given biofilm thickness. The robustness of the new biofilm model was tested using kinetic parameters experimentally determined for the growth of Pseudomonas putida CCRC 14365 on phenol. Several additional cases have been analyzed, including examples where the effectiveness factor can reach values greater than unity, characteristic of systems with inhibition kinetics. Criteria to establish when the effectiveness factor can exceed unity in each of the cases studied are also presented.
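The effectiveness factors greater than unity mentioned above can be reproduced numerically for a slab biofilm with Haldane (substrate-inhibition) kinetics. The sketch below solves the steady reaction-diffusion problem directly rather than using the authors' perturbation formulas, and all parameter values are invented for illustration: because the bulk concentration lies above the rate maximum, the lower concentrations inside the biofilm give higher local rates, so the flux exceeds the rate evaluated at bulk conditions.

    import numpy as np
    from scipy.integrate import solve_bvp

    # Illustrative parameters (S in g/m^3, x in m, D in m^2/s).
    qmax, Ks, Ki, X = 5e-4, 5.0, 20.0, 1.0e4
    D, L, Sb = 1.0e-9, 400e-6, 200.0          # diffusivity, thickness, bulk conc.

    def rate(S):
        # Haldane (substrate-inhibition) volumetric consumption rate.
        return qmax * X * S / (Ks + S + S**2 / Ki)

    def odes(x, y):
        # y[0] = S, y[1] = dS/dx ;  steady state: D S'' = rate(S)
        return np.vstack([y[1], rate(y[0]) / D])

    def bc(ya, yb):
        # Zero flux at the substratum (x = 0), bulk concentration at x = L.
        return np.array([ya[1], yb[0] - Sb])

    x = np.linspace(0.0, L, 200)
    y_guess = np.vstack([np.full_like(x, Sb), np.zeros_like(x)])
    sol = solve_bvp(odes, bc, x, y_guess, tol=1e-6)

    flux = D * sol.sol(L)[1]                  # substrate flux into the biofilm
    eta = flux / (rate(Sb) * L)               # effectiveness factor
    print(f"effectiveness factor = {eta:.2f}")   # exceeds 1 under inhibition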
2013-01-01
Influenza virus-like particle (VLP) vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious, which limits the risk of recombination with wild-type strains. By taking advantage of the advancements in cell culture technologies, the process from strain identification to manufacturing has the potential to be completed rapidly and easily at large scales. After closely reviewing the current research done on influenza VLPs, it is evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (i.e., hemagglutination assay (HA), single radial immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, electron microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, these analytical methods are impractical for in-line process monitoring because VLP concentration in crude samples generally falls outside their range of detection. This consequently impedes the development of robust influenza-VLP production and purification processes. Thus, the development of functional process analytical techniques, applicable at every stage during production and compatible with different production platforms, is greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219
DNA barcode-based delineation of putative species: efficient start for taxonomic workflows
Kekkonen, Mari; Hebert, Paul D N
2014-01-01
The analysis of DNA barcode sequences with varying techniques for cluster recognition provides an efficient approach for recognizing putative species (operational taxonomic units, OTUs). This approach accelerates and improves taxonomic workflows by exposing cryptic species and decreasing the risk of synonymy. This study tested the congruence of OTUs resulting from the application of three analytical methods (ABGD, BIN, GMYC) to sequence data for Australian hypertrophine moths. OTUs supported by all three approaches were viewed as robust, but 20% of the OTUs were only recognized by one or two of the methods. These OTUs were examined for three criteria to clarify their status. Monophyly and diagnostic nucleotides were both uninformative, but information on ranges was useful as sympatric sister OTUs were viewed as distinct, while allopatric OTUs were merged. This approach revealed 124 OTUs of Hypertrophinae, a more than twofold increase from the currently recognized 51 species. Because this analytical protocol is both fast and repeatable, it provides a valuable tool for establishing a basic understanding of species boundaries that can be validated with subsequent studies. PMID:24479435
Data informatics for the Detection, Characterization, and Attribution of Climate Extremes
NASA Astrophysics Data System (ADS)
Collins, W.; Wehner, M. F.; O'Brien, T. A.; Paciorek, C. J.; Krishnan, H.; Johnson, J. N.; Prabhat, M.
2015-12-01
The potential for increasing frequency and intensity of extreme phenomena including downpours, heat waves, and tropical cyclones constitutes one of the primary risks of climate change for society and the environment. The challenge of characterizing these risks is that extremes represent the "tails" of distributions of atmospheric phenomena and are, by definition, highly localized and typically relatively transient. Therefore very large volumes of observational data and projections of future climate are required to quantify their properties in a robust manner. Massive data analytics are required in order to detect individual extremes, accumulate statistics on their properties, quantify how these statistics are changing with time, and attribute the effects of anthropogenic global warming on these statistics. We describe examples of the suite of techniques the climate community is developing to address these analytical challenges. The techniques include massively parallel methods for detecting and tracking atmospheric rivers and cyclones; data-intensive extensions to generalized extreme value theory to summarize the properties of extremes; and multi-model ensembles of hindcasts to quantify the attributable risk of anthropogenic influence on individual extremes. We conclude by highlighting examples of these methods developed by our CASCADE (Calibrated and Systematic Characterization, Attribution, and Detection of Extremes) project.
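As a small illustration of the extreme-value component mentioned above, the sketch below fits a generalized extreme value (GEV) distribution to a synthetic series of annual maxima and reads off a 100-year return level. This is a generic textbook calculation, not CASCADE code, and all numbers are invented.

    import numpy as np
    from scipy import stats

    # Stand-in "annual maxima" series (e.g., yearly peak daily precipitation).
    annual_max = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=60,
                                      random_state=np.random.default_rng(1))

    # Fit a GEV distribution to the block maxima and estimate a return level.
    c, loc, scale = stats.genextreme.fit(annual_max)
    rl_100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, c, loc, scale)
    print(f"shape={c:.2f}  loc={loc:.1f}  scale={scale:.1f}  100-yr level={rl_100:.1f}")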
Valavala, Sriram; Seelam, Nareshvarma; Tondepu, Subbaiah; Jagarlapudi, V Shanmukha Kumar; Sundarmurthy, Vivekanandan
2018-01-01
A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets by considering the solubility of the dipyridamole and excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with flame ionization detection. Method validation was carried out according to the guideline Q2 on validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis. PMID:29686931
Multidimensional assessment of awareness in early-stage dementia: a cluster analytic approach.
Clare, Linda; Whitaker, Christopher J; Nelis, Sharon M; Martyr, Anthony; Markova, Ivana S; Roth, Ilona; Woods, Robert T; Morris, Robin G
2011-01-01
Research on awareness in dementia has yielded variable and inconsistent associations between awareness and other factors. This study examined awareness using a multidimensional approach and applied cluster analytic techniques to identify associations between the level of awareness and other variables. Participants were 101 individuals with early-stage dementia (PwD) and their carers. Explicit awareness was assessed at three levels: performance monitoring in relation to memory; evaluative judgement in relation to memory, everyday activities and socio-emotional functioning; and metacognitive reflection in relation to the experience and impact of the condition. Implicit awareness was assessed with an emotional Stroop task. The different measures of explicit awareness were related only to a limited extent. Cluster analysis yielded three groups with differing degrees of explicit awareness. These groups showed no differences in implicit awareness. Lower explicit awareness was associated with greater age, lower MMSE scores, poorer recall and naming scores, lower anxiety and greater carer stress. Multidimensional assessment offers a more robust approach to classifying PwD according to level of awareness and hence to examining correlates and predictors of awareness. Copyright © 2011 S. Karger AG, Basel.
Santoro, Valentina; Aigotti, Riccardo; Gastaldi, Daniela; Romaniello, Francesco; Forte, Emanuele; Magni, Martina; Baiocchi, Claudio
2018-01-01
Interesterification is an industrial transformation process that aims to change the physico-chemical properties of vegetable oils by redistributing fatty acid positions within the original constituent triglycerides. In the confectionery industry, controlling the degree of formation of positional isomers is important in order to obtain fats with the desired properties. Silver ion HPLC (high performance liquid chromatography) is the analytical technique usually adopted to separate triglycerides (TAGs) having different unsaturation degrees. However, separation of TAG positional isomers is a challenge when the number of double bonds is the same and the only difference is in their position within the triglyceride molecule. The TAG positional isomers involved in the present work have a structural specificity that requires a separation method tailored to the needs of the confectionery industry. The aim of this work was to obtain a chromatographic resolution that allows reliable qualitative and quantitative evaluation of TAG positional isomers within reasonably short retention times, robust with respect to repeatability and reproducibility. The resulting analytical procedure was applied both to confectionery raw materials and to final products. PMID:29462917
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ratcliffe, W.A.; Corrie, J.E.; Dalziel, A.H.
1982-06-01
Researchers compared two direct radioimmunoassays for progesterone in 50 microL of unextracted serum or plasma with assays involving extraction of serum. The direct assays include the use of either danazol at pH 7.4 or 8-anilino-1-naphthalenesulfonic acid at pH 4.0 to displace progesterone from serum binding-proteins. Progesterone is then assayed by using an antiserum to a progesterone 11 alpha-hemisuccinyl conjugate and the radioligand 125I-labeled progesterone 11 alpha-glucuronyl tyramine, with separation by double-antibody techniques. Direct assays with either displacing agent gave good analytical recovery of progesterone added to human serum, and progesterone values for patients' specimens correlated well (r greater than 0.96) with results of assays involving extraction of serum. Precision was similar with each displacing agent over the working range 2.5-100 nmol/L and superior to that of extraction assays. Researchers conclude that these direct assays of progesterone are analytically valid and more robust, precise, and technically convenient than many conventional methods involving extraction of serum.
Chen, Chaochao; Luo, Jiaxun; Li, Chenglong; Ma, Mingfang; Yu, Wenbo; Shen, Jianzhong; Wang, Zhanhui
2018-03-21
Chemical contaminants in food and the environment pose substantial risks to food safety and human health. Rapid, accurate, and inexpensive detection can effectively control the potential risks posed by these chemical contaminants. Among all detection methods, the immunoassay, based on the specific interaction of antibody and analyte, is one of the most widely used techniques in the field. However, the biological antibodies employed in immunoassays usually cannot tolerate extreme conditions, resulting in physical and chemical instability. Molecularly imprinted polymers (MIPs) are a class of polymers with specific molecular recognition abilities, which are highly robust, showing excellent operational stability under a wide variety of conditions. Recently, MIPs have been used in biomimetic immunoassays for chemical contaminants as an antibody substitute in food and the environment. Here, we reviewed these applications of MIPs incorporated in different analytical platforms, such as enzyme-linked immunosorbent assay, fluorescent immunoassay, chemiluminescent immunoassay, electrochemical immunoassay, microfluidic paper-based immunoassay, and homogeneous immunoassay, and discussed current challenges and future trends in the use of MIPs in biomimetic immunoassays.
Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R
2012-01-01
This study presents and validates a time-frequency technique for measuring two-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with those of two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of nonlinear and higher-than-second-order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimation of stiffness profiles is possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
Novel selective TOCSY method enables NMR spectral elucidation of metabolomic mixtures
NASA Astrophysics Data System (ADS)
MacKinnon, Neil; While, Peter T.; Korvink, Jan G.
2016-11-01
Complex mixture analysis is routinely encountered in NMR-based investigations. With the aim of component identification, spectral complexity may be addressed chromatographically or spectroscopically, the latter being favored to reduce sample handling requirements. An attractive experiment is selective total correlation spectroscopy (sel-TOCSY), which is capable of providing tremendous spectral simplification and thereby enhancing assignment capability. Unfortunately, isolating a well resolved resonance is increasingly difficult as the complexity of the mixture increases, and the assumption of single spin system excitation is no longer robust. We present TOCSY optimized mixture elucidation (TOOMIXED), a technique capable of performing spectral assignment particularly in the case where the assumption of single spin system excitation is relaxed. Key to the technique is the collection of a series of 1D sel-TOCSY experiments as a function of the isotropic mixing time (τm), resulting in a series of resonance intensities indicative of the underlying molecular structure. By comparing these τm-dependent intensity patterns with a library of pre-determined component spectra, one is able to regain assignment capability. After consideration of the technique's robustness, we tested TOOMIXED first on a model mixture. As a benchmark we were able to assign a molecule with high confidence in the case of selectively exciting an isolated resonance. Assignment confidence was not compromised when performing TOOMIXED on a resonance known to contain multiple overlapping signals, and in the worst case the method suggested a follow-up sel-TOCSY experiment to confirm an ambiguous assignment. TOOMIXED was then demonstrated on two realistic samples (whisky and urine), where under our conditions an approximate limit of detection of 0.6 mM was determined. Taking into account literature reports for the sel-TOCSY limit of detection, the technique should reach on the order of 10 μM sensitivity. We anticipate this technique will be highly attractive to various analytical fields facing mixture analysis, including metabolomics, foodstuff analysis, pharmaceutical analysis, and forensics.
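A toy sketch of the pattern-matching step with an invented three-component library: the observed τm-dependent intensity pattern is attributed to library components by non-negative least squares. The mixing times and library curves are made up, and the actual TOOMIXED matching procedure may differ in detail.

    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical library: one column per candidate component, one row per
    # isotropic mixing time; entries are sel-TOCSY intensities.
    tau_m = np.array([0.02, 0.04, 0.06, 0.08, 0.12])     # seconds (assumed)
    library = np.array([[0.10, 0.05, 0.00],
                        [0.35, 0.18, 0.04],
                        [0.55, 0.40, 0.10],
                        [0.60, 0.55, 0.22],
                        [0.50, 0.60, 0.45]])

    # Observed pattern from an excitation that hit two overlapping spin
    # systems (components 0 and 2), plus measurement noise.
    observed = 0.7 * library[:, 0] + 0.3 * library[:, 2]
    observed = observed + np.random.default_rng(2).normal(0.0, 0.01, observed.size)

    weights, residual = nnls(library, observed)
    print("component weights:", weights.round(2), " residual:", round(residual, 3))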
Improving entrepreneurial opportunity recognition through web content analytics
NASA Astrophysics Data System (ADS)
Bakar, Muhamad Shahbani Abu; Azmi, Azwiyati
2017-10-01
The ability to recognize and develop an opportunity into a venture defines an entrepreneur. Research in opportunity recognition has been robust and focuses mostly on explaining the processes involved in opportunity recognition. Factors such as prior knowledge and cognitive and creative capabilities are shown to affect opportunity recognition in entrepreneurs. Prior knowledge in areas such as customer problems, ways to serve the market, and technology has been shown in various studies to be a factor that facilitates entrepreneurs in identifying and recognizing opportunities. Findings from research also show that experienced entrepreneurs search and scan for information to discover opportunities. Searching and scanning for information has also been shown to help novice entrepreneurs who lack prior knowledge to narrow this gap and enable them to better identify and recognize opportunities. There is less focus in research on finding empirically proven techniques and methods to develop and enhance opportunity recognition in student entrepreneurs. This is important as the country pushes for more graduate entrepreneurs who can drive the economy. This paper discusses the Opportunity Recognition Support System (ORSS), an information support system to help especially student entrepreneurs in identifying and recognizing business opportunities. The ORSS aims to provide the necessary knowledge to student entrepreneurs to be able to better identify and recognize opportunities. Applying design research, theories in opportunity recognition are applied to identify the requirements for the support system, and the requirements in turn dictate the design of the support system. The paper proposes the use of web content mining and analytics as the two core components and techniques for the support system. Web content mining can mine the vast knowledge repositories available on the internet, and analytics can provide entrepreneurs with further insights into the information needed to recognize opportunities in a given market or industry.
Reschiglian, P; Roda, B; Zattoni, A; Tanase, M; Marassi, V; Serani, S
2014-02-01
The rapid development of protein-based pharmaceuticals highlights the need for robust analytical methods to ensure their quality and stability. Among proteins used in pharmaceutical applications, an important and ever increasing role is played by monoclonal antibodies and large proteins, which are often modified to enhance their activity or stability when used as drugs. The bioactivity and the stability of these proteins are closely related to the maintenance of their complex structure, which however is influenced by many external factors that can cause degradation and/or aggregation. The presence of aggregates in these drugs could reduce their bioactivity and bioavailability, and induce immunogenicity. The choice of the proper analytical method for the analysis of aggregates is fundamental to understand their size range, their amount, and whether they are present in the sample as the product of aggregation or as an artifact of the method itself. Size exclusion chromatography is one of the most important techniques for the quality control of pharmaceutical proteins; however, its application is limited to relatively low molar mass aggregates. Among the techniques for the size characterization of proteins, field-flow fractionation (FFF) represents a competitive choice because of its soft separation mechanism, due to the absence of a stationary phase, and its applicability in a broader size range, from nanometer- to micrometer-sized analytes. In this paper, the microcolumn variant of FFF, hollow-fiber flow FFF, was coupled online with multi-angle light scattering, and a method for the characterization of aggregates with high reproducibility and a low limit of detection was demonstrated employing an avidin derivative as a model sample.
NASA Astrophysics Data System (ADS)
Campbell, Ian S.; Ton, Alain T.; Mulligan, Christopher C.
2011-07-01
An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.
Generation of phase edge singularities by coplanar three-beam interference and their detection.
Patorski, Krzysztof; Sluzewski, Lukasz; Trusiak, Maciej; Pokorski, Krzysztof
2017-02-06
In recent years singular optics has gained considerable attention in science and technology. Up to now optical vortices (phase point dislocations) have been of main interest. This paper presents the first general analysis of the formation of phase edge singularities by coplanar three-beam interference. They can be generated, for example, by three-slit interference or by self-imaging in the Fresnel diffraction field of a sinusoidal grating. We derive a general condition for the ratio of amplitudes of the interfering beams resulting in phase edge dislocations; the lateral separation of the dislocations depends on this ratio as well. Analytically derived properties are corroborated by numerical and experimental studies. We develop a simple, robust, common-path optical self-imaging configuration aided by a coherent tilted reference wave and spatial filtering. Finally, we propose an automatic fringe pattern analysis technique for detecting phase edge dislocations, based on the continuous wavelet transform. The presented studies open new possibilities for developing grating-based sensing techniques for precision metrology of very small phase differences.
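The amplitude-ratio condition can be visualized with a one-line field model: for a central beam of amplitude a interfering with two symmetric unit-amplitude side beams, the resultant field along the transverse coordinate is a + 2cos(kx·x), and phase jumps of π (edge dislocations) exist only while |a| < 2, with their separation set by a. A minimal numerical check, with invented beam geometry:

    import numpy as np

    x = np.linspace(-20e-6, 20e-6, 2001)      # transverse coordinate, m
    wavelength = 633e-9
    theta = 0.02                              # side-beam tilt, rad (assumed)
    kx = 2 * np.pi / wavelength * np.sin(theta)

    for a in (0.5, 1.0, 2.5):
        u = a + 2.0 * np.cos(kx * x)          # coplanar three-beam resultant
        idx = np.where(np.abs(np.diff(np.angle(u))) > np.pi / 2)[0]
        if idx.size >= 2:
            sep = (x[idx[-1]] - x[idx[0]]) * 1e6
            print(f"a={a:3.1f}: {idx.size} edge dislocations, separation {sep:.1f} um")
        else:
            print(f"a={a:3.1f}: no edge dislocations (|a| >= 2)")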
Melvin, Elizabeth M.; Moore, Brandon R.; Gilchrist, Kristin H.; Grego, Sonia; Velev, Orlin D.
2011-01-01
The recent development of microfluidic “lab on a chip” devices requiring sample sizes <100 μL has given rise to the need to concentrate dilute samples and trap analytes, especially for surface-based detection techniques. We demonstrate a particle collection device capable of concentrating micron-sized particles in a predetermined area by combining AC electroosmosis (ACEO) and dielectrophoresis (DEP). The planar asymmetric electrode pattern uses ACEO pumping to induce equal, quadrilateral flow directed towards a stagnant region in the center of the device. A number of system parameters affecting particle collection efficiency were investigated including electrode and gap width, chamber height, applied potential and frequency, and number of repeating electrode pairs and electrode geometry. The robustness of the on-chip collection design was evaluated against varying electrolyte concentrations, particle types, and particle sizes. These devices are amenable to integration with a variety of detection techniques such as optical evanescent waveguide sensing. PMID:22662040
NASA Astrophysics Data System (ADS)
Vagh, Hardik A.; Baghai-Wadji, Alireza
2008-12-01
Current technological challenges in materials science and the high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities, various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves sophisticated averaging and asymptotic order analysis to obtain solutions; in the majority of cases only zero-order terms are constructed, owing to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher-order terms, thus guaranteeing higher accuracy and greater robustness of the numerical results.
Biochemical methane potential (BMP) tests: Reducing test time by early parameter estimation.
Da Silva, C; Astals, S; Peces, M; Campos, J L; Guerrero, L
2018-01-01
The biochemical methane potential (BMP) test is a key analytical technique to assess the implementation and optimisation of anaerobic biotechnologies. However, this technique is characterised by long testing times (from 20 to >100 days), which is not suitable for waste utilities, consulting companies or plant operators whose decision-making processes cannot be held up for such a long time. This study develops a statistically robust mathematical strategy using sensitivity functions for early prediction of BMP first-order model parameters, i.e. the methane yield (B0) and the kinetic rate constant (k). The minimum testing time for early parameter estimation showed a potential correlation with the k value, where (i) slowly biodegradable substrates (k ≤ 0.1 d^-1) have minimum testing times of ≥15 days, and (ii) moderately biodegradable substrates (0.1
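A minimal sketch of early parameter estimation with the first-order model B(t) = B0(1 - exp(-kt)): fit synthetic cumulative-methane data using only the first days of the test and compare with the full-length fit. The data are simulated, and the paper's sensitivity-function machinery for choosing the minimum testing time is not reproduced here.

    import numpy as np
    from scipy.optimize import curve_fit

    def bmp(t, B0, k):
        # First-order BMP model: cumulative methane yield.
        return B0 * (1.0 - np.exp(-k * t))

    # Synthetic cumulative methane data (mL CH4 / g VS), sampled daily.
    rng = np.random.default_rng(3)
    t = np.arange(0.0, 41.0)
    data = bmp(t, 350.0, 0.25) + rng.normal(0.0, 5.0, t.size)

    # Fit using only the first 10 days versus the full 40-day test.
    for days in (10, 40):
        m = t <= days
        (B0, k), _ = curve_fit(bmp, t[m], data[m], p0=(300.0, 0.1))
        print(f"{days:2d} d of data -> B0 = {B0:5.1f} mL/gVS, k = {k:.3f} 1/d")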
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum, and the organic matter from which petroleum is derived, are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring such geochemical data. Owing to progress in the development of new analytical techniques, many previously unresolved petroleum exploration problems have been solved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. The various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
Analytical techniques: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.
Chang, Yeong-Chan
2005-12-01
This paper addresses the problem of designing adaptive fuzzy-based (or neural-network-based) robust controllers for a large class of uncertain nonlinear time-varying systems. This class of systems can be perturbed by plant uncertainties, unmodeled perturbations, and external disturbances. A nonlinear H-infinity control technique, incorporating adaptive control and variable structure control (VSC) techniques, is employed to construct an intelligent robust stabilization controller such that H-infinity control performance is achieved. The problem of robust tracking control design for uncertain robotic systems is used to demonstrate the effectiveness of the developed robust stabilization control scheme. Therefore, an intelligent robust tracking controller for uncertain robotic systems in the presence of high-degree uncertainties can easily be implemented. Its solution requires only solving a linear algebraic matrix inequality, and satisfactory transient and asymptotic tracking performance is guaranteed. A simulation example is given to confirm the performance of the developed control algorithms.
Fontana, Ariel R; Lana, Nerina B; Martinez, Luis D; Altamirano, Jorgelina C
2010-06-30
An ultrasound-assisted leaching-dispersive solid-phase extraction followed by dispersive liquid-liquid microextraction (USAL-DSPE-DLLME) technique has been developed as a new analytical approach for extracting, cleaning up and preconcentrating polybrominated diphenyl ethers (PBDEs) from sediment samples prior to gas chromatography-tandem mass spectrometry (GC-MS/MS) analysis. First, PBDEs were leached from sediment samples using acetone. This extract was cleaned up by DSPE using activated silica gel as sorbent material. After clean-up, PBDEs were preconcentrated using the DLLME technique: 1 mL of the acetone extract (disperser solvent) and 60 microL of carbon tetrachloride (extraction solvent) were added to 5 mL of ultrapure water and DLLME was applied. Several variables that govern the proposed technique were studied and optimized. Under optimum conditions, the method detection limits (MDLs) of PBDEs, calculated as three times the signal-to-noise ratio (S/N), were within the range 0.02-0.06 ng g(-1). The relative standard deviations (RSDs) for five replicates were <9.8%. The calibration graphs were linear within the concentration ranges of 0.07-1000 ng g(-1) for BDE-47, 0.09-1000 ng g(-1) for BDE-100, 0.10-1000 ng g(-1) for BDE-99 and 0.19-1000 ng g(-1) for BDE-153, and the coefficients of estimation were ≥0.9991. Validation of the methodology was carried out by the standard addition method at two concentration levels (0.25 and 1 ng g(-1)) and by comparison with a reference Soxhlet technique. Recovery values were ≥80%, which showed a satisfactory robustness of the analytical methodology for the determination of low PBDE concentrations in sediment samples. Copyright 2010 Elsevier B.V. All rights reserved.
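A generic sketch of a detection limit estimated from the 3x signal-to-noise criterion used above; the peak height, baseline samples, and standard concentration are all invented.

    import numpy as np

    conc_std = 0.5                  # ng/g, low-level spiked standard (assumed)
    peak_height = 1250.0            # detector counts for that standard (assumed)
    baseline = np.array([18.0, 22.0, 15.0, 25.0, 20.0, 17.0, 23.0, 19.0])
    noise = baseline.std(ddof=1)    # rms baseline noise near the peak

    # MDL: concentration whose signal equals three times the baseline noise.
    mdl = 3.0 * noise * conc_std / peak_height
    print(f"S/N = {peak_height / noise:.0f},   MDL = {mdl:.4f} ng/g")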
Suvarapu, Lakshmi Narayana; Baek, Sung-Ok
2015-01-01
This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539
Extending the Distributed Lag Model framework to handle chemical mixtures.
Bello, Ghalib A; Arora, Manish; Austin, Christine; Horton, Megan K; Wright, Robert O; Gennings, Chris
2017-07-01
Distributed Lag Models (DLMs) are used in environmental health studies to analyze the time-delayed effect of an exposure on an outcome of interest. Given the increasing need for analytical tools for evaluation of the effects of exposure to multi-pollutant mixtures, this study attempts to extend the classical DLM framework to accommodate and evaluate multiple longitudinally observed exposures. We introduce 2 techniques for quantifying the time-varying mixture effect of multiple exposures on an outcome of interest. Lagged WQS, the first technique, is based on Weighted Quantile Sum (WQS) regression, a penalized regression method that estimates mixture effects using a weighted index. We also introduce Tree-based DLMs, a nonparametric alternative for assessment of lagged mixture effects. This technique is based on the Random Forest (RF) algorithm, a nonparametric, tree-based estimation technique that has shown excellent performance in a wide variety of domains. In a simulation study, we tested the feasibility of these techniques and evaluated their performance in comparison to standard methodology. Both methods exhibited relatively robust performance, accurately capturing pre-defined non-linear functional relationships in different simulation settings. Further, we applied these techniques to data on perinatal exposure to environmental metal toxicants, with the goal of evaluating the effects of exposure on neurodevelopment. Our methods identified critical neurodevelopmental windows showing significant sensitivity to metal mixtures. Copyright © 2017 Elsevier Inc. All rights reserved.
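A minimal sketch of the classical DLM idea that both proposed techniques extend: regress the outcome on the whole lagged-exposure matrix and read the lag-coefficient curve for a critical window. Here a small ridge penalty stands in for the smoothness constraints, WQS weighting, and tree ensembles used in practice, and the data are simulated.

    import numpy as np

    rng = np.random.default_rng(4)

    # Weekly exposures for n subjects over 12 lags; the true effect is
    # concentrated in lags 3-6 (the "critical window").
    n, lags = 500, 12
    X = rng.normal(0.0, 1.0, (n, lags))
    true_w = np.array([0, 0, 0, 0.4, 0.5, 0.4, 0.2, 0, 0, 0, 0, 0])
    y = X @ true_w + rng.normal(0.0, 1.0, n)

    # Ridge-regularized distributed lag regression.
    lam = 1.0
    beta = np.linalg.solve(X.T @ X + lam * np.eye(lags), X.T @ y)
    print("estimated lag effects:", beta.round(2))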
ERIC Educational Resources Information Center
Hough, Susan L.; Hall, Bruce W.
The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…
Robust electroencephalogram phase estimation with applications in brain-computer interface systems.
Seraj, Esmaeil; Sameni, Reza
2017-03-01
In this study, a robust method is developed for frequency-specific electroencephalogram (EEG) phase extraction using the analytic representation of the EEG. Based on recent theoretical findings in this area, it is shown that some of the phase variations previously attributed to the brain response are systematic side effects of the methods used for EEG phase calculation, especially during low-analytic-amplitude segments of the EEG. With this insight, the proposed method generates randomized ensembles of the EEG phase using minor perturbations in the zero-pole loci of narrow-band filters, followed by phase estimation using the signal's analytic form and ensemble averaging over the randomized ensembles to obtain a robust EEG phase and frequency. This Monte Carlo estimation method is shown to be very robust to noise and to minor changes of the filter parameters, and it reduces the effect of spurious EEG phase jumps that do not have a cerebral origin. As proof of concept, the proposed method is used to extract EEG phase features for a brain-computer interface (BCI) application. The results show significant improvement in classification rates using rather simple phase-related features and standard K-nearest-neighbor and random-forest classifiers on a standard BCI dataset. The average performance improved by 4-7% (in the absence of additive noise) and 8-12% (in the presence of additive noise). The significance of these improvements was statistically confirmed by a paired-sample t-test, with p-values of 0.01 and 0.03, respectively. The proposed method for EEG phase calculation is very generic and may be applied to other EEG phase-based studies.
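A minimal sketch of the ensemble idea under simplifying assumptions: the zero-pole perturbation is approximated by randomly perturbing the band edges of a Butterworth filter, the analytic form is obtained with a Hilbert transform, and the ensemble is averaged on the unit circle. Sampling rate, band, and perturbation scale are illustrative:

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    rng = np.random.default_rng(1)
    fs = 250.0                              # sampling rate, Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # toy 10 Hz rhythm

    band = (8.0, 12.0)                      # alpha band
    phases = []
    for _ in range(50):                     # randomized filter ensemble
        lo = band[0] + rng.normal(scale=0.2)    # minor perturbation of band edges,
        hi = band[1] + rng.normal(scale=0.2)    # a stand-in for zero-pole perturbation
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        phases.append(np.angle(hilbert(filtfilt(b, a, eeg))))

    # Circular (unit-vector) averaging over the ensemble yields the robust phase.
    robust_phase = np.angle(np.mean(np.exp(1j * np.array(phases)), axis=0))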
Robust control synthesis for uncertain dynamical systems
NASA Technical Reports Server (NTRS)
Byun, Kuk-Whan; Wie, Bong; Sunkel, John
1989-01-01
This paper presents robust control synthesis techniques for uncertain dynamical systems subject to structured parameter perturbation. Both QFT (quantitative feedback theory) and H-infinity control synthesis techniques are investigated. Although most H-infinity-related control techniques are not concerned with the structured parameter perturbation, a new way of incorporating the parameter uncertainty in the robust H-infinity control design is presented. A generic model of uncertain dynamical systems is used to illustrate the design methodologies investigated in this paper. It is shown that, for a certain noncolocated structural control problem, use of both techniques results in nonminimum phase compensation.
Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra
2015-11-01
A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
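A sketch of the probability-based design-space step under stated assumptions: a hypothetical fitted quadratic response surface for one critical resolution, coefficient uncertainty propagated by Monte Carlo, and the design space taken as the region where the criterion is met with high probability. All numbers are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(2)

    def resolution(conc, ph, beta):
        """Hypothetical quadratic response surface for a critical resolution."""
        b0, b1, b2, b11, b22, b12 = beta
        return b0 + b1 * conc + b2 * ph + b11 * conc**2 + b22 * ph**2 + b12 * conc * ph

    beta_hat = np.array([1.6, 0.15, 0.10, -0.08, -0.06, 0.05])  # coded-unit estimates (assumed)
    beta_se = 0.05 * np.ones(6)                                  # coefficient SEs (assumed)

    conc_grid = np.linspace(-1, 1, 41)   # coded buffer concentration
    ph_grid = np.linspace(-1, 1, 41)     # coded pH
    prob = np.zeros((conc_grid.size, ph_grid.size))

    n_mc = 2000
    betas = beta_hat + beta_se * rng.normal(size=(n_mc, 6))  # MC draws of the model
    for i, c in enumerate(conc_grid):
        for j, p in enumerate(ph_grid):
            rs = resolution(c, p, betas.T)        # vectorized over the MC draws
            prob[i, j] = np.mean(rs >= 1.5)       # probability the CQA criterion is met

    # The design space is the region where prob exceeds, e.g., 0.95.
    print("fraction of grid inside DS:", np.mean(prob >= 0.95))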
Ho, Tien D; Yehl, Peter M; Chetwyn, Nik P; Wang, Jin; Anderson, Jared L; Zhong, Qiqing
2014-09-26
Ionic liquids (ILs) were used as a new class of diluents for the analysis of two classes of genotoxic impurities (GTIs), namely alkyl/aryl halides and nitroaromatics, in small-molecule drug substances by headspace gas chromatography (HS-GC) coupled with electron capture detection (ECD). This novel approach using ILs as diluents greatly broadens the applicability of HS-GC to the determination of high-boiling (≥130°C) analytes, including GTIs, with limits of detection (LOD) ranging from 5 to 500 parts-per-billion (ppb) of analyte in a drug substance. This represents up to a tens-of-thousands-fold improvement over traditional HS-GC diluents such as dimethyl sulfoxide (DMSO) and dimethylacetamide (DMAC). Various ILs were screened to determine their suitability as diluents for the HS-GC/ECD analysis. Increasing the HS oven temperature resulted in varying responses for alkyl/aryl halides and a significant increase in response for all nitroaromatic GTIs. Linear ranges of up to five orders of magnitude were found for a number of analytes. The technique was validated on two active pharmaceutical ingredients with excellent recovery. This simple and robust methodology offers a key advantage in the ease of method transfer from development laboratories to quality-control environments, since conventional validated chromatographic data systems and GC instruments can be used. For many analytes, it is a cost-effective alternative to more complex trace analytical methodologies such as LC/MS and GC/MS, and it significantly reduces the training needed for operation. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
On adaptive robustness approach to Anti-Jam signal processing
NASA Astrophysics Data System (ADS)
Poberezhskiy, Y. S.; Poberezhskiy, G. Y.
An effective approach to exploiting statistical differences between desired and jamming signals, termed adaptive robustness, is proposed and analyzed in this paper. It combines conventional Bayesian, adaptive, and robust approaches that are complementary to each other. This combination strengthens the advantages and mitigates the drawbacks of the conventional approaches. Adaptive robustness is equally applicable to both jammers and their victim systems. The capabilities required for the realization of adaptive robustness in jammers and victim systems are determined. The employment of a specific nonlinear robust algorithm for anti-jam (AJ) processing is described and analyzed. Its effectiveness in practical situations has been proven analytically and confirmed by simulation. Since adaptive robustness can be used by both sides in electronic warfare, it is more advantageous for the faster and more intelligent side. Many results obtained and discussed in this paper are also applicable to commercial applications such as communications in unregulated or poorly regulated frequency ranges and systems with cognitive capabilities.
A fully Sinc-Galerkin method for Euler-Bernoulli beam models
NASA Technical Reports Server (NTRS)
Smith, R. C.; Bowers, K. L.; Lund, J.
1990-01-01
A fully Sinc-Galerkin method in both space and time is presented for fourth-order time-dependent partial differential equations with fixed and cantilever boundary conditions. The Sinc discretizations for the second-order temporal problem and the fourth-order spatial problems are presented. Alternate formulations for variable-parameter fourth-order problems are given, which prove especially useful when applying the forward techniques to parameter recovery problems. The discrete system corresponding to the time-dependent partial differential equations of interest is then formulated. Computational issues are discussed, and a robust and efficient algorithm for solving the resulting matrix system is outlined. Numerical results which highlight the method are given for problems with both analytic and singular solutions as well as fixed and cantilever boundary conditions.
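The Sinc-Galerkin machinery rests on the cardinal sinc basis S(k,h)(x) = sinc((x - kh)/h). A minimal sketch of truncated sinc interpolation on a uniform grid, illustrating the rapid convergence in the step size h that such methods exploit (the test function and step sizes are illustrative):

    import numpy as np

    def sinc_interp(f, h, n, x):
        """Approximate f(x) by the truncated cardinal series sum_k f(kh) sinc((x-kh)/h)."""
        k = np.arange(-n, n + 1)
        # np.sinc is the normalized sinc: sin(pi u) / (pi u)
        return sum(f(kk * h) * np.sinc((x - kk * h) / h) for kk in k)

    f = lambda x: np.exp(-x**2)          # smooth, rapidly decaying test function
    x = np.linspace(-3, 3, 201)

    for h in (1.0, 0.5, 0.25):
        n = int(6 / h)                   # truncate once the samples are negligible
        err = np.max(np.abs(sinc_interp(f, h, n, x) - f(x)))
        print(f"h = {h}: max error = {err:.2e}")   # error decays roughly like exp(-c/h)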
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristo, Michael J.; Gaffney, Amy M.; Marks, Naomi
Nuclear forensic science seeks to identify the origin of nuclear materials found outside regulatory control. It is increasingly recognized as an integral part of a robust nuclear security program. Our review highlights areas of active, evolving research in nuclear forensics, with a focus on analytical techniques commonly employed in Earth and planetary sciences. Applications of nuclear forensics to uranium ore concentrates (UOCs) are discussed first. UOCs have become an attractive target for nuclear forensic researchers because of their richness in impurities compared to materials produced later in the fuel cycle. The development of chronometric methods for age dating nuclear materials is then discussed, with an emphasis on improvements in accuracy that have been gained from measurements of multiple radioisotopic systems. Finally, papers that report on casework are reviewed, to provide a window into current scientific practice.
Lodi-Smith, Jennifer; Roberts, Brent W
2007-02-01
Investing in normative, age-graded social roles has broad implications for both the individual and society. The current meta-analysis examines the way in which personality traits relate to four such investments: work, family, religion, and volunteerism. The present study uses meta-analytic techniques (K = 94) to identify the cross-sectional patterns of relationships between social investment in these four roles and the personality trait domains of agreeableness, conscientiousness, and emotional stability. Results show that the extent of investment in social roles across these domains is positively related to agreeableness, conscientiousness, emotional stability, and low psychoticism. These findings are more robust when individuals are psychologically committed to rather than simply demographically associated with the investment role.
A gas-sensing array produced from screen-printed, zeolite-modified chromium titanate
NASA Astrophysics Data System (ADS)
Pugh, David C.; Hailes, Stephen M. V.; Parkin, Ivan P.
2015-08-01
Metal oxide semiconducting (MOS) gas sensors represent a cheap, robust, and sensitive technology for detecting volatile organic compounds. However, MOS sensors have consistently been shown to lack selectivity across a broad range of analytes, leading to false-positive errors. In this study, an array of five chromium titanate (CTO) thick-film sensors was produced. These were modified by incorporating a range of zeolites, namely β, Y, mordenite, and ZSM5, into the bulk sensor material. Sensors were exposed to three common reducing gases, namely acetone, ethanol, and toluene, and a machine learning technique was applied to differentiate between the gases. All sensors produced strong resistive responses (increases in resistance), and a support vector machine (SVM) was able to classify the data with a high degree of selectivity.
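A minimal sketch of the classification step, assuming a five-sensor response vector per exposure and synthetic (not the paper's) data; an RBF-kernel SVM separates the three gases:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)

    # Illustrative data: 5-sensor resistive response vectors for 3 gases (invented).
    n_per_gas, n_sensors = 60, 5
    centers = rng.uniform(1.0, 4.0, size=(3, n_sensors))   # mean response per (gas, sensor)
    X = np.vstack([c + 0.3 * rng.normal(size=(n_per_gas, n_sensors)) for c in centers])
    y = np.repeat(["acetone", "ethanol", "toluene"], n_per_gas)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))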
Foldnes, Njål; Olsson, Ulf Henning
2016-01-01
We present and investigate a simple way to generate nonnormal data using linear combinations of independent generator (IG) variables. The simulated data have prespecified univariate skewness and kurtosis and a given covariance matrix. In contrast to the widely used Vale-Maurelli (VM) transform, the obtained data are shown to have a non-Gaussian copula. We analytically obtain asymptotic robustness conditions for the IG distribution. We show empirically that popular test statistics in covariance analysis tend to reject true models more often under the IG transform than under the VM transform. This implies that overly optimistic evaluations of estimators and fit statistics in covariance structure analysis may be tempered by including the IG transform for nonnormal data generation. We provide an implementation of the IG transform in the R environment.
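A structural sketch of the IG idea: data are formed as linear combinations X = ZAᵀ of independent nonnormal generator variables Z, with A a Cholesky factor of the target covariance. Matching prespecified marginal skewness and kurtosis additionally requires solving moment equations for the generators, which this sketch omits; the chi-square generators below are an illustrative stand-in:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    sigma = np.array([[1.0, 0.5, 0.3],
                      [0.5, 1.0, 0.4],
                      [0.3, 0.4, 1.0]])      # target covariance
    A = np.linalg.cholesky(sigma)

    # Independent nonnormal generators with zero mean and unit variance
    # (standardized chi-square here; the IG method would instead solve for
    # generator moments that induce the prespecified marginal skew/kurtosis).
    df = 4
    n = 100_000
    Z = (rng.chisquare(df, size=(n, 3)) - df) / np.sqrt(2 * df)

    X = Z @ A.T                              # linear combination of generators

    print("sample covariance:\n", np.round(np.cov(X, rowvar=False), 2))
    print("marginal skewness:", np.round(stats.skew(X), 2))   # nonzero, non-Gaussian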
Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong
2016-01-19
Compared with routine microscopy imaging of a few analytes at a time, rapidly scanning the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage-scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders into a motorized stage as virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique which can search a coverslip area of 50 × 24 mm² in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in the X and Y directions. Alongside implementation of an autofocus unit that compensates for the tilt of a slide along the Z-axis in real time, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes for a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension-array assays of biotinylated-DNA-functionalized upconversion nanoparticles.
A robust and efficient stepwise regression method for building sparse polynomial chaos expansions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham, Simon, E-mail: Simon.Abraham@ulb.ac.be; Raisee, Mehrdad; Ghorbaniasl, Ghader
2017-03-01
Polynomial Chaos (PC) expansions are widely used in various engineering fields for quantifying uncertainties arising from uncertain parameters. The computational cost of classical PC solution schemes is unaffordable, as the number of deterministic simulations to be calculated grows dramatically with the number of stochastic dimensions. This considerably restricts the practical use of PC at the industrial level. A common approach to address such problems is to make use of sparse PC expansions. This paper presents a non-intrusive regression-based method for building sparse PC expansions. The most important PC contributions are detected sequentially through an automatic search procedure. The variable selection criterion is based on efficient tools relevant to the probabilistic method. Two benchmark analytical functions are used to validate the proposed algorithm. The computational efficiency of the method is then illustrated by a more realistic CFD application, consisting of the non-deterministic flow around a transonic airfoil subject to geometrical uncertainties. To assess the performance of the developed methodology, a detailed comparison is made with the well-established LAR-based selection technique. The results show that the developed sparse regression technique is able to identify the most significant PC contributions describing the problem. Moreover, the most important stochastic features are captured at a reduced computational cost compared to the LAR method. The results also demonstrate the superior robustness of the method by repeating the analyses using random experimental designs.
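A minimal sketch of sequential (stepwise) selection of sparse PC terms, assuming standard-normal inputs, a probabilists' Hermite basis, and a simple validation-error selection criterion in place of the paper's probabilistic criterion; the toy model and degree bound are illustrative:

    import numpy as np
    from numpy.polynomial.hermite_e import hermeval
    from itertools import product

    rng = np.random.default_rng(5)

    def psi(xi, deg):
        """Probabilists' Hermite polynomial He_deg evaluated at xi."""
        c = np.zeros(deg + 1); c[deg] = 1.0
        return hermeval(xi, c)

    # Toy model with 2 standard-normal inputs.
    def model(x):
        return 1.0 + 2.0 * x[:, 0] + 0.5 * x[:, 0] * x[:, 1] + 0.1 * x[:, 1] ** 3

    X = rng.normal(size=(200, 2))
    y = model(X)

    # Candidate multi-indices up to total degree 4.
    cands = [a for a in product(range(5), repeat=2) if sum(a) <= 4]
    basis = {a: psi(X[:, 0], a[0]) * psi(X[:, 1], a[1]) for a in cands}

    train, val = np.arange(0, 150), np.arange(150, 200)
    selected, best_err = [], np.inf
    while True:
        trial = None
        for a in cands:
            if a in selected:
                continue
            cols = selected + [a]
            Phi = np.column_stack([basis[c] for c in cols])
            coef, *_ = np.linalg.lstsq(Phi[train], y[train], rcond=None)
            err = np.mean((Phi[val] @ coef - y[val]) ** 2)
            if err < best_err:
                best_err, trial = err, a
        if trial is None:          # no candidate reduces validation error: stop
            break
        selected.append(trial)

    print("selected PC terms (multi-indices):", selected)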
Robust stability of fractional order polynomials with complicated uncertainty structure
Şenol, Bilal; Pekař, Libor
2017-01-01
The main aim of this article is to present a graphical approach to robust stability analysis for families of fractional order (quasi-)polynomials with complicated uncertainty structure. More specifically, the work emphasizes the multilinear, polynomial and general structures of uncertainty and, moreover, the retarded quasi-polynomials with parametric uncertainty are studied. Since the families with these complex uncertainty structures suffer from the lack of analytical tools, their robust stability is investigated by numerical calculation and depiction of the value sets and subsequent application of the zero exclusion condition. PMID:28662173
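A numerical sketch of the value-set/zero-exclusion test: sample the uncertain coefficients over their intervals, evaluate the fractional-order polynomial along s = jω, and check that the sampled value sets exclude the origin. The example family, orders, and intervals are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(6)

    # Family P(s; q) = s**2.2 + q1 * s**1.1 + q0, q1 in [0.8, 1.2], q0 in [1.5, 2.5].
    alphas = np.array([2.2, 1.1, 0.0])

    def P(s, q1, q0):
        coeffs = np.array([1.0, q1, q0])
        return np.sum(coeffs * s ** alphas)   # principal branch of s**alpha

    omegas = np.linspace(0.01, 10.0, 400)
    q_samples = np.column_stack([rng.uniform(0.8, 1.2, 2000),
                                 rng.uniform(1.5, 2.5, 2000)])

    min_mod = np.inf
    for w in omegas:
        s = 1j * w
        vals = np.array([P(s, q1, q0) for q1, q0 in q_samples])  # sampled value set
        min_mod = min(min_mod, np.abs(vals).min())

    # Zero exclusion: if the value set never touches the origin (and one member
    # of the family is stable), robust stability is indicated.
    print("minimum |P(jw)| over value sets:", min_mod)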
Fu, Li; Merabia, Samy; Joly, Laurent
2018-04-19
Following our recent theoretical prediction of the giant thermo-osmotic response of the water-graphene interface, we explore the practical implementation of waste heat harvesting with carbon-based membranes, focusing on model membranes of carbon nanotubes (CNT). To that aim, we combine molecular dynamics simulations and an analytical model considering the details of hydrodynamics in the membrane and at the tube entrances. The analytical model and the simulation results match quantitatively, highlighting the need to take into account both thermodynamics and hydrodynamics to predict thermo-osmotic flows through membranes. We show that, despite viscous entrance effects and a thermal short-circuit mechanism, CNT membranes can generate very fast thermo-osmotic flows, which can overcome the osmotic pressure of seawater. We then show that in small tubes confinement has a complex effect on the flow and can even reverse the flow direction. Beyond CNT membranes, our analytical model can guide the search for other membranes to generate fast and robust thermo-osmotic flows.
The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.
Enke, Christie G
2015-12-15
The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided an unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage. Rather it added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time have improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the doors for the development of new techniques in all areas of chemical analysis. Along with some interesting history, the multiple roles op amps play in modern instrumentation are described along with a number of examples of new areas of analysis that have been enabled by their improvements.
A rapid and sensitive analytical method for the determination of 14 pyrethroids in water samples.
Feo, M L; Eljarrat, E; Barceló, D
2010-04-09
A simple, efficient, and environmentally friendly analytical methodology is proposed for extracting and preconcentrating pyrethroids from water samples prior to gas chromatography-negative ion chemical ionization mass spectrometry (GC-NCI-MS) analysis. Fourteen pyrethroids were selected for this work: bifenthrin, cyfluthrin, lambda-cyhalothrin, cypermethrin, deltamethrin, esfenvalerate, fenvalerate, fenpropathrin, tau-fluvalinate, permethrin, phenothrin, resmethrin, tetramethrin and tralomethrin. The method is based on ultrasound-assisted emulsification-extraction (UAEE) of a water-immiscible solvent in an aqueous medium. Chloroform was used as the extraction solvent in the UAEE technique. Target analytes were quantitatively extracted, achieving an enrichment factor of 200 when a 20 mL aliquot of pure water spiked with pyrethroid standards was extracted. The method was also evaluated with tap water and river water samples. Method detection limits (MDLs) ranged from 0.03 to 35.8 ng L⁻¹ with RSD values ≤3-25% (n=5). The coefficients of determination of the calibration curves obtained following the proposed methodology were ≥0.998. Recovery values were in the range of 45-106%, showing satisfactory robustness of the method for analyzing pyrethroids in water samples. The proposed methodology was applied to the analysis of river water samples. Cypermethrin was detected at concentration levels ranging from 4.94 to 30.5 ng L⁻¹. Copyright 2010 Elsevier B.V. All rights reserved.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Green analytical chemistry--theory and practice.
Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek
2010-08-01
This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.
Combined sensing platform for advanced diagnostics in exhaled mouse breath
NASA Astrophysics Data System (ADS)
Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris
2013-03-01
Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for monitoring therapeutic progression, as quantitative compositional analysis of breath can be related to biomarker panels associated with specific physiological conditions invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since in particular volatile organic constituents (VOCs) in exhaled breath may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb or even lower concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee ratio (TTR; via IR), selected VOCs (via IR), and O2 (via luminescence) to be analyzed simultaneously in exhaled breath, thereby establishing a single diagnostic platform in which both sensors interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiber-optic luminescence sensors, to establish a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at potentially continuous analysis of the TTR via iHWGs and LS flow-through sensors requiring only minute (<1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.
The application of absolute quantitative (1)H NMR spectroscopy in drug discovery and development.
Singh, Suruchi; Roy, Raja
2016-07-01
The identification of a drug candidate and its structural determination is the most important step in the process of drug discovery, and for this, nuclear magnetic resonance (NMR) is one of the most selective analytical techniques. The present review illustrates the various perspectives of absolute quantitative ¹H NMR spectroscopy in drug discovery and development. It deals with the fundamentals of quantitative NMR (qNMR), the physicochemical properties affecting qNMR, and the latest referencing techniques used for quantification. The application of qNMR during various stages of drug discovery and development, namely natural product research, drug quantitation in dosage forms, drug metabolism studies, impurity profiling, and solubility measurements, is elaborated. To achieve this, the authors explore the literature on NMR in drug discovery and development between 1963 and 2015, also taking into account several other reviews on the subject. qNMR experiments are used throughout drug discovery and development because the technique is non-destructive, versatile, and robust, with low intra- and inter-operator variability. However, there are also limitations: qNMR of complex biological samples suffers from peak overlap and a low limit of quantification, which can be overcome by using hyphenated chromatographic techniques in addition to NMR.
Li, Guo-Sheng; Wei, Xian-Yong
2017-01-01
Elucidation of the chemical composition of biooil is essential for evaluating lignocellulosic biomass (LCBM) conversion and upgrading processes and for suggesting proper value-added utilization, such as producing fuels and feedstocks for fine chemicals. Although the main components of LCBM are cellulose, hemicelluloses, and lignin, the chemicals derived from LCBM differ significantly due to the various feedstocks and decomposition methods used. Biooil, produced from the pyrolysis of LCBM, contains hundreds of organic chemicals of various classes. This review covers the methodologies used for the componential analysis of biooil, including pretreatments and instrumental analysis techniques. The use of chromatographic and spectrometric methods is highlighted, covering conventional techniques such as gas chromatography, high-performance liquid chromatography, Fourier transform infrared spectroscopy, nuclear magnetic resonance, and mass spectrometry. The combination of preseparation methods and instrumental technologies is a robust pathway for the detailed componential characterization of biooil. The organic species in biooils can be classified into alkanes, alkenes, alkynes, benzene-ring-containing hydrocarbons, ethers, alcohols, phenols, aldehydes, ketones, esters, carboxylic acids, and other heteroatomic organic compounds. The recent development of high-resolution mass spectrometry and multidimensional hyphenated chromatographic and spectrometric techniques has considerably advanced the elucidation of biooil composition. PMID:29387086
Grabowski, Krzysztof; Gawronski, Mateusz; Baran, Ireneusz; Spychalski, Wojciech; Staszewski, Wieslaw J; Uhl, Tadeusz; Kundu, Tribikram; Packo, Pawel
2016-05-01
Acoustic emission, as used in non-destructive testing, focuses on the analysis of elastic waves propagating in mechanical structures. The information carried by the generated acoustic waves, recorded by a set of transducers, allows the integrity of these structures to be determined; material properties and geometry strongly impact the result. In this paper, a method for acoustic emission source localization in thin plates is presented. The approach is based on the Time-Distance Domain Transform, a wavenumber-frequency mapping technique for precise event localization. The major advantage of the technique is dispersion compensation through phase-shifting of the investigated waveforms to acquire the most accurate output, allowing source-sensor distance estimation using a single transducer. The accuracy and robustness of the above process are also investigated, including the influence of the Young's modulus value and of numerical parameters on damage detection. By merging the Time-Distance Domain Transform with an optimal distance selection technique, an identification-localization algorithm is achieved. The method is investigated analytically, numerically, and experimentally, the latter involving both laboratory and large-scale industrial tests. Copyright © 2016 Elsevier B.V. All rights reserved.
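A minimal sketch of the dispersion-compensation idea behind such distance estimation: back-propagating the received spectrum over trial distances recompresses the waveform only near the true distance. The dispersion relation, pulse, and distances are illustrative assumptions, not the paper's model:

    import numpy as np

    fs, n = 1.0e6, 4096                         # 1 MHz sampling; illustrative values
    t = np.arange(n) / fs
    freq = np.fft.rfftfreq(n, d=1 / fs)
    omega = 2 * np.pi * freq
    k = np.sqrt(omega) / 10.0                   # hypothetical dispersion relation k(w)

    # Gaussian-windowed tone burst propagated to an unknown source-sensor distance.
    pulse = np.exp(-((t - 1e-3) ** 2) / (2 * (1e-5) ** 2)) * np.sin(2 * np.pi * 1e5 * t)
    d_true = 0.5                                # metres
    received = np.fft.irfft(np.fft.rfft(pulse) * np.exp(-1j * k * d_true), n)

    # Back-propagating over trial distances recompresses the waveform only at the
    # true distance, so the peak amplitude of the compensated signal is maximal there.
    R = np.fft.rfft(received)
    trial_d = np.linspace(0.0, 1.0, 201)
    peaks = [np.abs(np.fft.irfft(R * np.exp(1j * k * d), n)).max() for d in trial_d]
    print("estimated distance:", trial_d[int(np.argmax(peaks))])  # ~0.5 m expected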
NASA Astrophysics Data System (ADS)
Bergholt, Mads Sylvest; Zheng, Wei; Lin, Kan; Ho, Khek Yu; Yeoh, Khay Guan; Teh, Ming; So, Jimmy Bok Yan; Huang, Zhiwei
2012-01-01
Raman spectroscopy is a vibrational analytical technique sensitive to changes in biomolecular composition and conformation occurring in tissue. With our recent development of near-infrared (NIR) Raman endoscopy integrated with diagnostic algorithms, in vivo real-time Raman diagnostics has been realized under multimodal wide-field imaging (i.e., white-light reflectance (WLR), narrow-band imaging (NBI), and autofluorescence imaging (AFI)) modalities. A selection of 177 patients who previously underwent Raman endoscopy (n=2510 spectra) was used to build two robust models based on partial least squares discriminant analysis (PLS-DA) for esophageal and gastric cancer diagnosis. The Raman endoscopy technique was then validated prospectively on 4 new gastric and esophageal patients for in vivo tissue diagnosis. The technique could identify esophageal cancer in vivo with a sensitivity of 88.9% (8/9) and specificity of 100.0% (11/11), and gastric cancer with a sensitivity of 77.8% (14/18) and specificity of 100.0% (13/13). This study realizes for the first time image-guided Raman endoscopy for real-time in vivo diagnosis of esophageal and gastric malignancies at the biomolecular level.
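A minimal sketch of PLS-DA as commonly implemented: regress a one-hot class indicator on the spectra with PLS and classify by the largest predicted score. The synthetic "spectra" stand in for the clinical data:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)

    # Illustrative "spectra": 300 samples x 500 wavenumber bins, two tissue classes
    # whose mean spectra differ in a few bands (not the clinical data).
    n, p = 300, 500
    y = rng.integers(0, 2, size=n)                      # 0 = normal, 1 = cancer
    signature = np.zeros(p); signature[[120, 240, 400]] = 1.0
    X = rng.normal(size=(n, p)) + np.outer(y, 3.0 * signature)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # PLS-DA: regress a one-hot class indicator on the spectra, classify by argmax.
    Y_tr = np.eye(2)[y_tr]
    pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
    y_pred = pls.predict(X_te).argmax(axis=1)
    print("accuracy:", (y_pred == y_te).mean())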
NASA Astrophysics Data System (ADS)
Taylor, Stephen R.; Simon, Joseph; Sampson, Laura
2017-01-01
The final parsec of supermassive black-hole binary evolution is subject to the complex interplay of stellar loss-cone scattering, circumbinary disk accretion, and gravitational-wave emission, with binary eccentricity affected by all of these. The strain spectrum of gravitational-waves in the pulsar-timing band thus encodes rich information about the binary population's response to these various environmental mechanisms. Current spectral models have heretofore followed basic analytic prescriptions, and attempt to investigate these final-parsec mechanisms in an indirect fashion. Here we describe a new technique to directly probe the environmental properties of supermassive black-hole binaries through "Bayesian model-emulation". We perform black-hole binary population synthesis simulations at a restricted set of environmental parameter combinations, compute the strain spectra from these, then train a Gaussian process to learn the shape of the spectrum at any point in parameter space. We describe this technique, demonstrate its efficacy with a program of simulated datasets, then illustrate its power by directly constraining final-parsec physics in a Bayesian analysis of the NANOGrav 5-year dataset. The technique is fast, flexible, and robust.
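A minimal sketch of the model-emulation step: train a Gaussian process on spectra computed at a restricted set of parameter points, then predict the spectrum anywhere in parameter space. The toy spectrum generator and design are illustrative stand-ins for the population-synthesis runs:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(8)

    # Stand-in for population-synthesis runs: log strain spectra over 20 frequency
    # bins as a smooth function of two environment parameters (purely illustrative).
    def synth_log_spectrum(theta):
        f = np.linspace(-1, 1, 20)                    # log-frequency grid (coded)
        return -2.0 / 3.0 * f + theta[0] * np.exp(-f**2) + 0.3 * theta[1] * f**2

    thetas = rng.uniform(-1, 1, size=(40, 2))         # restricted design points
    spectra = np.array([synth_log_spectrum(th) for th in thetas])

    # One multi-output GP "emulator" trained on the simulated spectra.
    kernel = ConstantKernel(1.0) * RBF(length_scale=[0.5, 0.5])
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(thetas, spectra)

    theta_new = np.array([[0.2, -0.4]])               # query anywhere in parameter space
    pred = gp.predict(theta_new)                      # interpolated spectrum shape
    print(pred.shape)                                 # (1, 20)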
Jones, Damon E; Feinberg, Mark E; Cleveland, Michael J; Cooper, Brittany Rhoades
2012-11-01
We examined the independent and combined influence of major risk and protective factors on youths' alcohol use. Five large data sets provided similar measures of alcohol use and risk or protective factors. We carried out analyses within each data set, separately for boys and girls in 8th and 10th grades. We included interaction and curvilinear predictive terms in final models if results were robust across data sets. We combined results using meta-analytic techniques. Individual, family, and peer risk factors and a community protective factor moderately predicted youths' alcohol use. Family and school protective factors did not predict alcohol use when combined with other factors. Youths' antisocial attitudes were more strongly associated with alcohol use for those also reporting higher levels of peer or community risk. For certain risk factors, the association with alcohol use varied across different risk levels. Efforts toward reducing youths' alcohol use should be based on robust estimates of the relative influence of risk and protective factors across adolescent environment domains. Public health advocates should focus on context (e.g., community factors) as a strategy for curbing underage alcohol use.
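A minimal sketch of combining per-dataset estimates with standard meta-analytic (inverse-variance, DerSimonian-Laird random-effects) formulas; the estimates and standard errors are invented for illustration:

    import numpy as np

    # Illustrative per-dataset estimates of a risk-factor/alcohol-use association
    # (standardized betas and standard errors; not the study's actual values).
    beta = np.array([0.21, 0.18, 0.25, 0.15, 0.22])
    se = np.array([0.03, 0.04, 0.05, 0.03, 0.04])

    # Fixed-effect (inverse-variance) pooling.
    w = 1 / se**2
    beta_fe = np.sum(w * beta) / np.sum(w)

    # DerSimonian-Laird between-study variance, then random-effects pooling.
    q = np.sum(w * (beta - beta_fe) ** 2)
    df = len(beta) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = 1 / (se**2 + tau2)
    beta_re = np.sum(w_re * beta) / np.sum(w_re)
    se_re = np.sqrt(1 / np.sum(w_re))

    print(f"pooled estimate: {beta_re:.3f} +/- {1.96 * se_re:.3f} (tau^2 = {tau2:.4f})")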
Vectorial mask optimization methods for robust optical lithography
NASA Astrophysics Data System (ADS)
Ma, Xu; Li, Yanqiu; Guo, Xuejia; Dong, Lisong; Arce, Gonzalo R.
2012-10-01
Continuous shrinkage of the critical dimension in integrated circuits impels the development of resolution enhancement techniques for low-k1 lithography. Recently, several pixelated optical proximity correction (OPC) and phase-shifting mask (PSM) approaches were developed under scalar imaging models to account for process variations. However, lithography systems with larger NA (NA > 0.6) are predominant for current technology nodes, rendering scalar models inadequate to describe the vector nature of the electromagnetic field that propagates through the optical lithography system. In addition, OPC and PSM algorithms based on scalar models can compensate for wavefront aberrations but are incapable of mitigating the polarization aberrations present in practical lithography systems, which can only be dealt with under a vector model. To this end, we focus on developing robust pixelated gradient-based OPC and PSM optimization algorithms aimed at compensating for defocus, dose variation, and wavefront and polarization aberrations under a vector model. First, an integrative and analytic vector imaging model is applied to formulate the optimization problem, where the effects of process variations are explicitly incorporated in the optimization framework. A steepest-descent algorithm is then used to iteratively optimize the mask patterns. Simulations show that the proposed algorithms can effectively improve the process windows of optical lithography systems.
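A minimal sketch of gradient-based pixelated mask optimization under a deliberately simplified scalar stand-in for the paper's vector model (Gaussian-blur aerial image, sigmoid resist); the kernel width, slope, and target pattern are illustrative:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Simplified imaging model: aerial image as a Gaussian blur of the mask,
    # resist as a sigmoid threshold. All parameters are assumed for illustration.
    N, sigma_k, a, th, lr = 64, 2.0, 25.0, 0.5, 0.5
    target = np.zeros((N, N)); target[24:40, 16:48] = 1.0   # simple line feature

    p = np.zeros((N, N))                        # unconstrained pixel parameters
    for it in range(300):
        m = 1 / (1 + np.exp(-p))                # mask transmission kept in [0, 1]
        I = gaussian_filter(m, sigma_k)         # aerial image (blur model)
        r = 1 / (1 + np.exp(-a * (I - th)))     # resist (sigmoid) image
        err = r - target
        # Chain rule: dL/dI, back through the (symmetric) blur, then the mask sigmoid.
        g_I = 2 * err * a * r * (1 - r)
        g_m = gaussian_filter(g_I, sigma_k)
        g_p = g_m * m * (1 - m)
        p -= lr * g_p                           # steepest-descent update

    m = 1 / (1 + np.exp(-p))
    r = 1 / (1 + np.exp(-a * (gaussian_filter(m, sigma_k) - th)))
    print("final pattern error:", np.sum((r - target) ** 2))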
Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.
ERIC Educational Resources Information Center
Heineman, William R.; Kissinger, Peter T.
1980-01-01
Reports developments involving the experimental aspects of finite and current analytical electrochemistry including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)
NASA Astrophysics Data System (ADS)
Mølgaard, Lasse L.; Buus, Ole T.; Larsen, Jan; Babamoradi, Hamid; Thygesen, Ida L.; Laustsen, Milan; Munk, Jens Kristian; Dossi, Eleftheria; O'Keeffe, Caroline; Lässig, Lina; Tatlow, Sol; Sandström, Lars; Jakobsen, Mogens H.
2017-05-01
We present a data-driven machine learning approach to detecting drug and explosives precursors using colorimetric sensor technology for air sampling. The sensing technology has been developed in the context of the CRIM-TRACK project. At present, a fully integrated portable prototype for air sampling with disposable sensing chips and automated data acquisition has been developed. The prototype allows for fast, user-friendly sampling, which has made it possible to produce large datasets of colorimetric data for different target analytes in laboratory and simulated real-world application scenarios. To make use of the highly multivariate data produced by the colorimetric chip, a number of machine learning techniques are employed to provide reliable classification of target analytes against confounders found in the air streams. We demonstrate that a data-driven machine learning method using dimensionality reduction in combination with a probabilistic classifier makes it possible to produce informative features and a high detection rate for analytes. Furthermore, the probabilistic machine learning approach provides a means of automatically identifying unreliable measurements that could produce false predictions. The robustness of the colorimetric sensor has been evaluated in a series of experiments focusing on the amphetamine precursor phenylacetone as well as the improvised-explosives precursor hydrogen peroxide. The analysis demonstrates that the system is able to detect analytes in clean air and mixed with substances that occur naturally in real-world sampling scenarios. The technology under development in CRIM-TRACK has the potential to be an effective tool for controlling trafficking of illegal drugs, explosives detection, or other law enforcement applications.
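A minimal sketch of the dimensionality-reduction-plus-probabilistic-classifier pipeline, including the rejection of low-confidence measurements; the synthetic colorimetric features and the 0.9 confidence threshold are illustrative assumptions:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(9)

    # Illustrative colorimetric data: color-change features per chip for three
    # classes (analyte A, analyte B, clean air); not CRIM-TRACK data.
    n_per, p = 80, 300
    means = rng.normal(scale=2.0, size=(3, p))
    X = np.vstack([mu + rng.normal(size=(n_per, p)) for mu in means])
    y = np.repeat([0, 1, 2], n_per)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
    clf = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
    clf.fit(X_tr, y_tr)

    # Probabilistic output allows flagging unreliable measurements instead of
    # forcing a prediction.
    proba = clf.predict_proba(X_te)
    confident = proba.max(axis=1) >= 0.9
    acc = (proba.argmax(axis=1)[confident] == y_te[confident]).mean()
    print(f"accuracy on confident samples: {acc:.3f}; flagged: {(~confident).mean():.1%}")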
TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siebers, J.
The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g., >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation in the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead, robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques, including the use of probability-weighted dose distributions, probability-weighted objective functions, and coverage-optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined, as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust planning as a clinical alternative to margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, H.
TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unkelbach, J.
TU-AB-BRB-00: New Methods to Ensure Target Coverage
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow high-throughput screening of drugs and/or their metabolites in complex body fluids such as plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples with direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure-assisted mixing followed by passive phase separation, coupled online to nanoelectrospray DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R² ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R² of 0.998) and precision (RSD of 9%). In conclusion, we present a proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
Variability of measurements of sweat sodium using the regional absorbent-patch method.
Dziedzic, Christine E; Ross, Megan L; Slater, Gary J; Burke, Louise M
2014-09-01
There is interest in including recommendations for the replacement of the sodium lost in sweat in individualized hydration plans for athletes. Although the regional absorbent-patch method provides a practical approach to measuring sweat sodium losses in field conditions, there is a need to understand the variability of estimates associated with this technique. Sweat samples were collected from the forearms, chest, scapula, and thigh of 12 cyclists during 2 standardized cycling time trials in the heat and 2 in temperate conditions. Single measure analysis of sodium concentration was conducted immediately by ion-selective electrodes (ISE). A subset of 30 samples was frozen for reanalysis of sodium concentration using ISE, flame photometry (FP), and conductivity (SC). Sweat samples collected in hot conditions produced higher sweat sodium concentrations than those from the temperate environment (P = .0032). A significant difference (P = .0048) in estimates of sweat sodium concentration was evident when calculated from the forearm average (mean ± 95% CL; 64 ± 12 mmol/L) compared with using a 4-site equation (70 ± 12 mmol/L). There was a high correlation between the values produced using different analytical techniques (r2 = .95), but mean values were different between treatments (frozen FP, frozen SC > immediate ISE > frozen ISE; P < .0001). Whole-body sweat sodium concentration estimates differed depending on the number of sites included in the calculation. Environmental testing conditions should be considered in the interpretation of results. The impact of sample freezing and subsequent analytical technique was small but statistically significant. Nevertheless, when undertaken using a standardized protocol, the regional absorbent-patch method appears to be a relatively robust field test.
Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael
2018-05-01
Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for the chemical and physical characterization of PM to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.
Linden, Ariel; Yarnold, Paul R
2016-12-01
Program evaluations often utilize various matching approaches to emulate the randomization process for group assignment in experimental studies. Typically, the matching strategy is implemented, and then covariate balance is assessed before estimating treatment effects. This paper introduces a novel analytic framework utilizing a machine learning algorithm called optimal discriminant analysis (ODA) for assessing covariate balance and estimating treatment effects, once the matching strategy has been implemented. This framework holds several key advantages over the conventional approach: application to any variable metric and number of groups; insensitivity to skewed data or outliers; and use of accuracy measures applicable to all prognostic analyses. Moreover, ODA accepts analytic weights, thereby extending the methodology to any study design where weights are used for covariate adjustment or more precise (differential) outcome measurement. One-to-one matching on the propensity score was used as the matching strategy. Covariate balance was assessed using standardized difference in means (conventional approach) and measures of classification accuracy (ODA). Treatment effects were estimated using ordinary least squares regression and ODA. Using empirical data, ODA produced results highly consistent with those obtained via the conventional methodology for assessing covariate balance and estimating treatment effects. When ODA is combined with matching techniques within a treatment effects framework, the results are consistent with conventional approaches. However, given that it provides additional dimensions and robustness to the analysis versus what can currently be achieved using conventional approaches, ODA offers an appealing alternative. © 2016 John Wiley & Sons, Ltd.
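A minimal sketch of the conventional side of this workflow (propensity-score estimation, 1:1 nearest-neighbor matching, and standardized-difference balance checks) on synthetic data; ODA itself is a separate machine-learning procedure not reproduced here:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, p = 400, 3
X = rng.normal(size=(n, p))                            # covariates
treat = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))    # treatment depends on X0

# 1. Propensity scores via logistic regression
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# 2. One-to-one greedy nearest-neighbor matching on the propensity score
treated = np.where(treat == 1)[0]
controls = list(np.where(treat == 0)[0])
pairs = []
for t in treated:
    j = min(controls, key=lambda c: abs(ps[t] - ps[c]))
    pairs.append((t, j))
    controls.remove(j)

# 3. Balance check: standardized difference in means after matching
idx_t = [t for t, _ in pairs]
idx_c = [c for _, c in pairs]
for k in range(p):
    d = (X[idx_t, k].mean() - X[idx_c, k].mean()) / X[:, k].std(ddof=1)
    print(f"covariate {k}: standardized difference = {d:+.3f}")
```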
Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R.
2012-01-01
This study presents and validates a time-frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness over a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second-order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimation of stiffness profiles is possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases. PMID:22448233
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
Automated Predictive Big Data Analytics Using Ontology Based Semantics
Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.
2017-01-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954
A Robust Zero-Watermarking Algorithm for Audio
NASA Astrophysics Data System (ADS)
Chen, Ning; Zhu, Jie
2007-12-01
In traditional watermarking algorithms, the insertion of a watermark into the host signal inevitably introduces some perceptible quality degradation. Another problem is the inherent conflict between imperceptibility and robustness. Zero-watermarking techniques can solve these problems successfully. Instead of embedding a watermark, a zero-watermarking technique extracts some essential characteristics from the host signal and uses them for watermark detection. However, most of the available zero-watermarking schemes are designed for still images, and their robustness is not satisfactory. In this paper, an efficient and robust zero-watermarking technique for audio signals is presented. The multiresolution characteristic of the discrete wavelet transform (DWT), the energy compaction characteristic of the discrete cosine transform (DCT), and the Gaussian noise suppression property of higher-order cumulants are combined to extract essential features from the host audio signal, which are then used for watermark recovery. Simulation results demonstrate the effectiveness of our scheme in terms of inaudibility, detection reliability, and robustness.
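A minimal sketch of the feature-extraction idea, assuming the DWT approximation band, DCT energy compaction, and an unbiased fourth-order cumulant estimate (scipy's k-statistic) as the noise-robust feature; frame sizes and the median thresholding are illustrative choices:

```python
import numpy as np
import pywt
from scipy.fftpack import dct
from scipy.stats import kstat

def zero_watermark_bits(audio, frame_len=1024, n_frames=64):
    """Derive a binary feature sequence from the host audio (nothing is
    embedded): DWT low band -> DCT energy compaction -> 4th-order cumulant."""
    cums = []
    for i in range(n_frames):
        frame = audio[i * frame_len:(i + 1) * frame_len]
        approx, *_ = pywt.wavedec(frame, "db4", level=3)     # DWT approximation band
        low = dct(approx, norm="ortho")[: len(approx) // 4]  # most-energetic DCT part
        cums.append(kstat(low, 4))   # unbiased 4th-order cumulant (noise-suppressing)
    med = np.median(cums)
    return np.array([c > med for c in cums], dtype=np.uint8)

rng = np.random.default_rng(0)
bits = zero_watermark_bits(rng.normal(size=1024 * 64))
# XORing `bits` with the watermark would give the secret detection key.
```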
DOE Office of Scientific and Technical Information (OSTI.GOV)
Partridge Jr, William P.; Choi, Jae-Soon
By directly resolving spatial and temporal species distributions within operating honeycomb monolith catalysts, spatially resolved capillary inlet mass spectrometry (SpaciMS) provides a uniquely enabling perspective for advancing automotive catalysis. Specifically, the ability to follow the spatiotemporal evolution of reactions throughout the catalyst is a significant advantage over inlet-and-effluent-limited analysis. Intracatalyst resolution elucidates numerous catalyst details including the network and sequence of reactions, clarifying reaction pathways; the relative rates of different reactions and impacts of operating conditions and catalyst state; and reaction dynamics and intermediate species that exist only within the catalyst. These details provide a better understanding of how the catalyst functions and have basic and practical benefits; e.g., catalyst system design; strategies for on-road catalyst state assessment, control, and on-board diagnostics; and creating robust and accurate predictive catalyst models. Moreover, such spatiotemporally distributed data provide for critical model assessment, and identification of improvement opportunities that might not be apparent from effluent assessment; i.e., while an incorrectly formulated model may provide correct effluent predictions, one that can accurately predict the spatiotemporal evolution of reactions along the catalyst channels will be more robust, accurate, and reliable. In such ways, intracatalyst diagnostics comprehensively enable improved design and development tools, and faster and lower-cost development of more efficient and durable automotive catalyst systems. Beyond these direct contributions, SpaciMS has spawned and been applied to enable other analytical techniques for resolving transient distributed intracatalyst performance. This chapter focuses on SpaciMS applications and associated catalyst insights and improvements, with specific sections related to lean NOx traps, selective catalytic reduction catalysts, oxidation catalysts, and particulate filters. The objective is to promote broader use and development of intracatalyst analytical methods, and thereby expand the insights resulting from this detailed perspective for advancing automotive catalyst technologies.
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.
1990-01-01
The current work was initiated in an effort to obtain an efficient, accurate, and robust algorithm for the numerical solution of the incompressible Navier-Stokes equations in two- and three-dimensional generalized curvilinear coordinates for both steady-state and time-dependent flow problems. This is accomplished with the use of the method of artificial compressibility and a high-order flux-difference splitting technique for the differencing of the convective terms. Time accuracy is obtained in the numerical solutions by subiterating the equations in pseudo-time for each physical time step. The system of equations is solved with a line-relaxation scheme which allows the use of very large pseudo-time steps, leading to fast convergence for steady-state problems as well as for the subiterations of time-dependent problems. Numerous laminar test flow problems are computed and presented with a comparison against analytically known solutions or experimental results. These include the flow in a driven cavity, the flow over a backward-facing step, the steady and unsteady flow over a circular cylinder, flow over an oscillating plate, flow through a one-dimensional inviscid channel with oscillating back pressure, the steady-state flow through a square duct with a 90 degree bend, and the flow through an artificial heart configuration with moving boundaries. An adequate comparison with the analytical or experimental results is obtained in all cases. Numerical comparisons of the upwind differencing with central differencing plus artificial dissipation indicate that the upwind differencing provides a much more robust algorithm, which requires significantly less computing time. The time-dependent problems require on the order of 10 to 20 subiterations, indicating that the elliptic nature of the problem does require a substantial amount of computing effort.
Twarog, Nathaniel R.; Low, Jonathan A.; Currier, Duane G.; Miller, Greg; Chen, Taosheng; Shelat, Anang A.
2016-01-01
Phenotypic screening through high-content automated microscopy is a powerful tool for evaluating the mechanism of action of candidate therapeutics. Despite more than a decade of development, however, high content assays have yielded mixed results, identifying robust phenotypes in only a small subset of compound classes. This has led to a combinatorial explosion of assay techniques, analyzing cellular phenotypes across dozens of assays with hundreds of measurements. Here, using a minimalist three-stain assay and only 23 basic cellular measurements, we developed an analytical approach that leverages informative dimensions extracted by linear discriminant analysis to evaluate similarity between the phenotypic trajectories of different compounds in response to a range of doses. This method enabled us to visualize biologically-interpretable phenotypic tracks populated by compounds of similar mechanism of action, cluster compounds according to phenotypic similarity, and classify novel compounds by comparing them to phenotypically active exemplars. Hierarchical clustering applied to 154 compounds from over a dozen different mechanistic classes demonstrated tight agreement with published compound mechanism classification. Using 11 phenotypically active mechanism classes, classification was performed on all 154 compounds: 78% were correctly identified as belonging to one of the 11 exemplar classes or to a different unspecified class, with accuracy increasing to 89% when less phenotypically active compounds were excluded. Importantly, several apparent clustering and classification failures, including rigosertib and 5-fluoro-2’-deoxycytidine, instead revealed more complex mechanisms or off-target effects verified by more recent publications. These results show that a simple, easily replicated, minimalist high-content assay can reveal subtle variations in the cellular phenotype induced by compounds and can correctly predict mechanism of action, as long as the appropriate analytical tools are used. PMID:26886014
Twarog, Nathaniel R; Low, Jonathan A; Currier, Duane G; Miller, Greg; Chen, Taosheng; Shelat, Anang A
2016-01-01
Phenotypic screening through high-content automated microscopy is a powerful tool for evaluating the mechanism of action of candidate therapeutics. Despite more than a decade of development, however, high content assays have yielded mixed results, identifying robust phenotypes in only a small subset of compound classes. This has led to a combinatorial explosion of assay techniques, analyzing cellular phenotypes across dozens of assays with hundreds of measurements. Here, using a minimalist three-stain assay and only 23 basic cellular measurements, we developed an analytical approach that leverages informative dimensions extracted by linear discriminant analysis to evaluate similarity between the phenotypic trajectories of different compounds in response to a range of doses. This method enabled us to visualize biologically-interpretable phenotypic tracks populated by compounds of similar mechanism of action, cluster compounds according to phenotypic similarity, and classify novel compounds by comparing them to phenotypically active exemplars. Hierarchical clustering applied to 154 compounds from over a dozen different mechanistic classes demonstrated tight agreement with published compound mechanism classification. Using 11 phenotypically active mechanism classes, classification was performed on all 154 compounds: 78% were correctly identified as belonging to one of the 11 exemplar classes or to a different unspecified class, with accuracy increasing to 89% when less phenotypically active compounds were excluded. Importantly, several apparent clustering and classification failures, including rigosertib and 5-fluoro-2'-deoxycytidine, instead revealed more complex mechanisms or off-target effects verified by more recent publications. These results show that a simple, easily replicated, minimalist high-content assay can reveal subtle variations in the cellular phenotype induced by compounds and can correctly predict mechanism of action, as long as the appropriate analytical tools are used.
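A minimal sketch of the analytical pipeline (discriminant projection, hierarchical clustering, and nearest-exemplar classification) on random stand-in data; the array shapes mirror the 154-compound, 23-measurement setting, but the values and labels are synthetic:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
X = rng.normal(size=(154, 23))        # 154 compounds x 23 cellular measurements
moa = rng.integers(0, 11, size=154)   # hypothetical mechanism-of-action labels

# 1. Extract informative dimensions with linear discriminant analysis
lda = LinearDiscriminantAnalysis(n_components=5).fit(X, moa)
Z = lda.transform(X)

# 2. Hierarchical clustering of compounds in the discriminant space
tree = linkage(Z, method="ward")
clusters = fcluster(tree, t=11, criterion="maxclust")

# 3. Classify a novel compound by proximity to exemplar class centroids
centroids = np.array([Z[moa == k].mean(axis=0) for k in range(11)])
novel = lda.transform(rng.normal(size=(1, 23)))
print("nearest class:", int(np.argmin(np.linalg.norm(centroids - novel, axis=1))))
```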
Robust optimization of front members in a full frontal car impact
NASA Astrophysics Data System (ADS)
Aspenberg (né Lönn), David; Jergeus, Johan; Nilsson, Larsgunnar
2013-03-01
In the search for lightweight automobile designs, it is necessary to assure that robust crashworthiness performance is achieved. Structures that are optimized to handle a finite number of load cases may perform poorly when subjected to various dispersions. Thus, uncertainties must be accounted for in the optimization process. This article presents an approach to optimization where all design evaluations include an evaluation of the robustness. Metamodel approximations are applied both to the design space and the robustness evaluations, using artificial neural networks and polynomials, respectively. The features of the robust optimization approach are displayed in an analytical example, and further demonstrated in a large-scale design example of front side members of a car. Different optimization formulations are applied and it is shown that the proposed approach works well. It is also concluded that a robust optimization puts higher demands on the finite element model performance than normally.
How Robust is Your System Resilience?
NASA Astrophysics Data System (ADS)
Homayounfar, M.; Muneepeerakul, R.
2017-12-01
Robustness and resilience are concepts in systems thinking that have grown in importance and popularity. For many complex social-ecological systems, however, robustness and resilience are difficult to quantify, and the connections and trade-offs between them are difficult to study. Most studies have either focused on qualitative approaches to discuss their connections or considered only one of them under particular classes of disturbances. In this study, we present an analytical framework to address the linkage between robustness and resilience more systematically. Our analysis is based on a stylized dynamical model that operationalizes a widely used conceptual framework for social-ecological systems. The model enables us to rigorously define robustness and resilience and consequently investigate their connections. The results reveal the trade-offs among performance, robustness, and resilience. They also show how the nature of such trade-offs varies with the choices of certain policies (e.g., taxation and investment in public infrastructure), internal stresses and external disturbances.
Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z
2015-12-01
Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commercially available analytical techniques were evaluated over a wide range of chlorine dioxide concentrations. With reference to pre-defined criteria, the superior analytical technique was determined. To discern the effectiveness of this technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed the highest accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which was adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric correlation rank tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In any case, analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior as compared to the other techniques for numerous key criteria including a complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Naha, Pratap C.; Lau, Kristen C.; Hsu, Jessica C.; Hajfathalian, Maryam; Mian, Shaameen; Chhour, Peter; Uppuluri, Lahari; McDonald, Elizabeth S.; Maidment, Andrew D. A.; Cormode, David P.
2016-07-01
Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique that has improved sensitivity for cancer detection. However, the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold : silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast that was comparable to silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that the formulations with 30% or higher gold content are cytocompatible to Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors. The results showed that GSAN produce strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated the excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast, and has potential for both blood pool imaging and for breast cancer screening.Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique that has improved sensitivity for cancer detection. However, the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold : silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast that was comparable to silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that the formulations with 30% or higher gold content are cytocompatible to Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors. The results showed that GSAN produce strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated the excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast, and has potential for both blood pool imaging and for breast cancer screening. Electronic supplementary information (ESI) available: Reactive oxygen species generation and DNA damage methods, stability of GSAN in PBS, step phantom images and a DEM image of a gold nanoparticle phantom, GSAN CT phantom results. See DOI: 10.1039/c6nr02618d
Li, Mingjie; Zhou, Ping; Zhao, Zhicheng; Zhang, Jinggang
2016-03-01
Recently, fractional order (FO) processes with dead-time have attracted increasing attention from researchers in the control field, but the FO-PID controller design techniques available for FO processes with dead-time lack direct systematic approaches. In this paper, a simple design and parameter tuning approach for a two-degree-of-freedom (2-DOF) FO-PID controller based on internal model control (IMC) is proposed for FO processes with dead-time. Conventional one-degree-of-freedom control exhibits the shortcoming that robustness and dynamic response performance are coupled; 2-DOF control overcomes this weakness by decoupling robustness from dynamic performance. The adjustable parameter η2 of the FO-PID controller is directly related to the robustness of the closed-loop system, and an analytical expression relating the maximum sensitivity specification Ms to η2 is given. In addition, the parameter η1 can easily be selected according to the dynamic performance requirements of the practical system. By approximating the dead-time term of the process model with a first-order Padé or Taylor series expansion, expressions for the 2-DOF FO-PID controller parameters are derived for three classes of FO processes with dead-time. Moreover, compared with other methods, the proposed method is simple and easy to implement. Finally, simulation results are given to illustrate the effectiveness of this method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
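For intuition, here is the classic integer-order analogue of this kind of design: textbook IMC-PID tuning for a first-order-plus-dead-time model using a first-order Padé approximation of the dead time. The filter constant lam plays a role loosely analogous to the η parameters above; this is the standard rule, not the paper's fractional-order derivation:

```python
def imc_pid_fopdt(K, tau, L, lam):
    """Integer-order IMC-PID rules for G(s) = K * exp(-L s) / (tau s + 1),
    with a first-order Pade approximation of exp(-L s).
    lam is the IMC filter time constant: smaller lam gives a faster but
    less robust closed loop, larger lam trades speed for robustness."""
    Kc = (tau + L / 2) / (K * (lam + L / 2))   # proportional gain
    tau_i = tau + L / 2                        # integral time
    tau_d = tau * L / (2 * tau + L)            # derivative time
    return Kc, tau_i, tau_d

# Example: slow process (tau = 10) with dead time L = 3
print(imc_pid_fopdt(K=2.0, tau=10.0, L=3.0, lam=4.0))
```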
Talluri, Murali V N Kumar; Kalariya, Pradipbhai D; Dharavath, Shireesha; Shaikh, Naeem; Garg, Prabha; Ramisetti, Nageswara Rao; Ragampeta, Srinivas
2016-09-01
A novel ultra high performance liquid chromatography method development strategy was implemented by applying a quality-by-design approach. The developed systematic approach was divided into five steps: (i) analytical target profile, (ii) critical quality attributes, (iii) risk assessment of critical parameters using design of experiments (screening and optimization phases), (iv) generation of the design space, and (v) process capability analysis (Cp) for the robustness study using Monte Carlo simulation. The complete quality-by-design-based method development was automated and expedited by employing a sub-2 μm particle column with an ultra high performance liquid chromatography system. Successful chromatographic separation of Coenzyme Q10 from its biotechnological process-related impurities was achieved on a Waters Acquity phenyl hexyl (100 mm × 2.1 mm, 1.7 μm) column with gradient elution of 10 mM ammonium acetate buffer (pH 4.0) and a mixture of acetonitrile/2-propanol (1:1) as the mobile phase. Through this study, a fast and organized method development workflow was established, and the robustness of the method was demonstrated. The method was validated for specificity, linearity, accuracy, precision, and robustness in compliance with the International Conference on Harmonization Q2(R1) guidelines. The impurities were identified by the atmospheric pressure chemical ionization mass spectrometry technique. Further, the in silico toxicity of the impurities was analyzed using TOPKAT and DEREK software. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
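A minimal sketch of the Monte Carlo capability step, assuming a hypothetical fitted response model and specification limit; the factor distributions and coefficients are illustrative, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical fitted response model from the optimization DoE:
# critical resolution Rs as a function of pH and gradient time (coded units)
def resolution(ph, grad):
    return 2.4 - 0.6 * (ph - 0.2) ** 2 + 0.3 * grad

# Monte Carlo robustness: perturb factors around the chosen working point
ph = rng.normal(0.2, 0.05, 100_000)
grad = rng.normal(0.0, 0.10, 100_000)
rs = resolution(ph, grad)

lsl = 1.5                                   # lower spec limit for resolution
cpk = (rs.mean() - lsl) / (3 * rs.std())    # one-sided process capability
print(f"mean Rs = {rs.mean():.2f}, Cpk = {cpk:.2f}, "
      f"P(Rs < LSL) = {(rs < lsl).mean():.4f}")
```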
NASA Astrophysics Data System (ADS)
Parvathi, S. P.; Ramanan, R. V.
2018-06-01
An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration; all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides realistic insight into the mission aspects. Moreover, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
Zhou, Xu; Yang, Long; Tan, Xiaoping; Zhao, Genfu; Xie, Xiaoguang; Du, Guanben
2018-07-30
Prostate specific antigen (PSA) is the most significant biomarker for the screening of prostate cancer in human serum. However, most methods for the detection of PSA require major laboratories, precise analytical instruments and complicated operations. Currently, the design and development of satisfactory electrochemical biosensors based on biomimetic materials (e.g. synthetic receptors) and nanotechnology is highly desired. Thus, we focused on the combination of molecular recognition and versatile nanomaterials in electrochemical devices to advance their analytical performance and robustness. Herein, using the presently prepared multifunctional hydroxyl pillar[5]arene@gold nanoparticles@graphitic carbon nitride (HP5@AuNPs@g-C3N4) hybrid nanomaterial as a robust biomimetic element, a high-performance electrochemical immunosensor for the detection of PSA was constructed. The as-prepared immunosensor, with the competitive advantages of low cost, simple preparation and fast detection, exhibited remarkable robustness, ultra-sensitivity, excellent selectivity and reproducibility. The limit of detection (LOD) and linear range were 0.12 pg mL-1 (S/N = 3) and 0.0005-10.00 ng mL-1, respectively. These results provide a promising approach for the clinical detection of PSA in human serum. Copyright © 2018 Elsevier B.V. All rights reserved.
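A minimal sketch of how an LOD of this kind can be estimated from a calibration curve and the S/N = 3 criterion; the calibration points and blank statistics below are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical calibration: PSA concentration (ng/mL) vs. current change (uA);
# immunosensors of this type are often linear in log10(concentration).
conc = np.array([0.0005, 0.005, 0.05, 0.5, 5.0, 10.0])
signal = np.array([0.8, 2.1, 3.4, 4.9, 6.2, 6.6])
slope, intercept = np.polyfit(np.log10(conc), signal, 1)

# S/N = 3 criterion: concentration whose signal exceeds the blank by 3 sigma
mean_blank, sigma_blank = 0.40, 0.02           # assumed blank statistics
lod = 10 ** ((mean_blank + 3 * sigma_blank - intercept) / slope)
print(f"sensitivity = {slope:.2f} uA/decade, estimated LOD ~ {lod:.2g} ng/mL")
```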
Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.
Rubert, Josep; Zachariasova, Milena; Hajslova, Jana
2015-01-01
Food authenticity has become a necessity for global food policies, since food placed on the market must be authentic. Verifying authenticity has always been a challenge: in the past, minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods allow food fingerprints to be acquired. At the same time, these methods have been combined with chemometrics, which uses statistical methods to verify food and to extract maximum information from chemical data. These sophisticated methods, based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has seen significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and therefore a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent metabolomics approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of the HRMS analytical platforms combined with chemometrics.
Determining optimal parameters in magnetic spacecraft stabilization via attitude feedback
NASA Astrophysics Data System (ADS)
Bruni, Renato; Celani, Fabio
2016-10-01
The attitude control of a spacecraft using magnetorquers can be achieved by a feedback control law with four design parameters. However, the practical determination of appropriate values for these parameters is a critical open issue. We propose an innovative systematic approach for finding these values: they should be those that minimize the convergence time to the desired attitude. This is a particularly difficult optimization problem, for several reasons: 1) the convergence time cannot be expressed in analytical form as a function of the parameters and initial conditions; 2) the design parameters may range over very wide intervals; 3) the convergence time also depends on the initial conditions of the spacecraft, which are not known in advance. To overcome these difficulties, we present a solution approach based on derivative-free optimization. These algorithms do not need the objective function in analytical form: they only need to evaluate it at a number of points. We also propose a fast probing technique to identify which regions of the search space have to be explored densely. Finally, we formulate a min-max model to find robust parameters, namely design parameters that minimize the convergence time under the worst initial conditions. Results are very promising.
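A minimal sketch of the min-max formulation with a derivative-free solver; convergence_time is a hypothetical stand-in for the closed-loop simulation the paper would actually run, and the four-parameter vector mirrors the four design parameters of the feedback law:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
# Sampled initial attitude/rate conditions (worst case is taken over these)
initial_conditions = rng.uniform(-1.0, 1.0, size=(20, 6))

def convergence_time(params, x0):
    """Hypothetical stand-in for the closed-loop simulation: returns the
    settling time from initial condition x0 under the four controller
    parameters in params. A real implementation would integrate the
    spacecraft attitude dynamics with magnetorquer feedback."""
    return float(np.linalg.norm(x0) * (1.0 + np.sum((params - 1.0) ** 2)))

def worst_case(params):
    # min-max objective: slowest convergence over all sampled initial conditions
    return max(convergence_time(params, x0) for x0 in initial_conditions)

# Derivative-free search: Nelder-Mead only ever evaluates the objective
res = minimize(worst_case, x0=np.full(4, 0.5), method="Nelder-Mead",
               options={"maxiter": 500, "xatol": 1e-4, "fatol": 1e-6})
print("robust parameters:", np.round(res.x, 3), "worst-case time:", round(res.fun, 3))
```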
Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya
2015-01-07
To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) has been used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, impacting the accuracy and precision of the spectral data were systematically optimized. Method validation was carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in the quantitative analysis of pharmaceutical analytes, the assay result was compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
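The core quantitation step in internal-standard qNMR reduces to a ratio calculation; a minimal sketch with illustrative integrals and nucleus counts (the counts below are placeholders, not the actual STG/Cipro signal assignments):

```python
def qnmr_concentration(i_analyte, i_is, n_analyte, n_is, conc_is):
    """Molar concentration of the analyte from NMR integrals against an
    internal standard (IS): C_a = C_IS * (I_a / I_IS) * (N_IS / N_a),
    where I is the integrated signal area and N the number of equivalent
    fluorine nuclei contributing to each integrated signal."""
    return conc_is * (i_analyte / i_is) * (n_is / n_analyte)

# Illustrative numbers only: analyte signal vs. internal-standard signal
print(qnmr_concentration(i_analyte=1.52, i_is=1.00,
                         n_analyte=3, n_is=1, conc_is=10.0), "mM")
```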
Wang, Lu; Qu, Haibin
2016-03-01
A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolutions and signal-to-noise ratios of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of the SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space of SPE-HPLC-UV/ELSD was then constructed from Monte Carlo-calculated probabilities. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight elements of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were assessed at a selected working point. These results revealed that QbD principles are suitable for the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
Depth-resolved monitoring of analytes diffusion in ocular tissues
NASA Astrophysics Data System (ADS)
Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.
2007-02-01
Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
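A minimal sketch of a permeability-coefficient estimate from depth-resolved front-arrival times, assuming hypothetical OCT monitoring data; the depth/time definition used here is one common convention and the numbers are illustrative:

```python
import numpy as np

# Hypothetical OCT data: time (min) at which the analyte-induced signal change
# reaches each monitored depth (um) in the cornea; numbers are illustrative.
depth_um = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
time_min = np.array([4.0, 8.5, 12.5, 17.0, 21.5])

# Depth-resolved permeability estimate: depth / arrival time at each depth
p_z = (depth_um * 1e-4) / (time_min * 60.0)            # cm/s
# Average permeability from the slope of the depth-vs-time fit
slope_um_per_s = np.polyfit(time_min * 60.0, depth_um, 1)[0]
print("P(z) [cm/s]:", np.round(p_z, 6))
print(f"slope-based P: {slope_um_per_s * 1e-4:.2e} cm/s")
```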
Effect of an angular trajectory kick in a high-gain free-electron laser
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baxevanis, Panagiotis; Huang, Zhirong; Stupakov, Gennady
In a free-electron laser, a transverse momentum offset (or “kick”) results in an oscillation of the centroid of the electron beam about the undulator axis. Studying the influence of this effect on the free-electron laser (FEL) interaction is important both from a tolerance point of view and for its potential diagnostic applications. In this paper, we present a self-consistent theoretical analysis of a high-gain FEL driven by such a “kicked” beam. In particular, we derive a solution to the three-dimensional, linearized initial value problem of the FEL through an orthogonal expansion technique and also describe a variational method for calculating the average FEL growth rate. Lastly, our results are benchmarked with GENESIS simulations and provide a robust theoretical background for a comparison with previous analytical results.
Effect of an angular trajectory kick in a high-gain free-electron laser
Baxevanis, Panagiotis; Huang, Zhirong; Stupakov, Gennady
2017-04-18
In a free-electron laser, a transverse momentum offset (or “kick”) results in an oscillation of the centroid of the electron beam about the undulator axis. Studying the influence of this effect on the free-electron laser (FEL) interaction is important both from a tolerance point of view and for its potential diagnostic applications. In this paper, we present a self-consistent theoretical analysis of a high-gain FEL driven by such a “kicked” beam. In particular, we derive a solution to the three-dimensional, linearized initial value problem of the FEL through an orthogonal expansion technique and also describe a variational method for calculating the average FEL growth rate. Lastly, our results are benchmarked with GENESIS simulations and provide a robust theoretical background for a comparison with previous analytical results.
Nuclear Forensic Science: Analysis of Nuclear Material Out of Regulatory Control
NASA Astrophysics Data System (ADS)
Kristo, Michael J.; Gaffney, Amy M.; Marks, Naomi; Knight, Kim; Cassata, William S.; Hutcheon, Ian D.
2016-06-01
Nuclear forensic science seeks to identify the origin of nuclear materials found outside regulatory control. It is increasingly recognized as an integral part of a robust nuclear security program. This review highlights areas of active, evolving research in nuclear forensics, with a focus on analytical techniques commonly employed in Earth and planetary sciences. Applications of nuclear forensics to uranium ore concentrates (UOCs) are discussed first. UOCs have become an attractive target for nuclear forensic researchers because of the richness in impurities compared to materials produced later in the fuel cycle. The development of chronometric methods for age dating nuclear materials is then discussed, with an emphasis on improvements in accuracy that have been gained from measurements of multiple radioisotopic systems. Finally, papers that report on casework are reviewed, to provide a window into current scientific practice.
NASA Astrophysics Data System (ADS)
Hart, Brian K.; Griffiths, Peter R.
1998-06-01
Partial least squares (PLS) regression has been evaluated as a robust calibration technique for over 100 hazardous air pollutants (HAPs) measured by open path Fourier transform infrared (OP/FT-IR) spectrometry. PLS has the advantage over the currently recommended calibration method, classical least squares (CLS), in that it can use the whole usable spectrum (700-1300 cm-1, 2000-2150 cm-1, and 2400-3000 cm-1) and detect several analytes simultaneously. Up to one hundred HAPs synthetically added to OP/FT-IR backgrounds have been simultaneously calibrated and detected using PLS. PLS also requires less preprocessing of spectra than CLS calibration schemes, allowing PLS to provide user-independent, real-time analysis of OP/FT-IR spectra.
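A minimal sketch of multi-analyte PLS calibration on synthetic mixture spectra using scikit-learn's PLSRegression; the reference spectra and concentrations are random stand-ins for real OP/FT-IR data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_spectra, n_points, n_analytes = 200, 600, 5

# Synthetic stand-in for OP/FT-IR data: spectra are linear mixtures of
# analyte reference spectra plus noise (Beer-Lambert-like behaviour).
ref = np.abs(rng.normal(size=(n_analytes, n_points)))
C = np.abs(rng.normal(size=(n_spectra, n_analytes)))   # path-averaged concentrations
spectra = C @ ref + 0.05 * rng.normal(size=(n_spectra, n_points))

# Multi-analyte PLS calibration over the whole (synthetic) spectral window
pls = PLSRegression(n_components=10).fit(spectra[:150], C[:150])
C_hat = pls.predict(spectra[150:])
rmse = float(np.sqrt(np.mean((C_hat - C[150:]) ** 2)))
print(f"simultaneous prediction RMSE over {n_analytes} analytes: {rmse:.3f}")
```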
Reduced-Order Modeling: Cooperative Research and Development at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Beran, Philip S.; Cesnik, Carlos E. S.; Guendel, Randal E.; Kurdila, Andrew; Prazenica, Richard J.; Librescu, Liviu; Marzocca, Piergiovanni; Raveh, Daniella E.
2001-01-01
Cooperative research and development activities at the NASA Langley Research Center (LaRC) involving reduced-order modeling (ROM) techniques are presented. Emphasis is given to reduced-order methods and analyses based on Volterra series representations, although some recent results using Proper Orthogonal Decomposition (POD) are discussed as well. Results are reported for a variety of computational and experimental nonlinear systems to provide clear examples of the use of reduced-order models, particularly within the field of computational aeroelasticity. The need for and the relative performance (speed, accuracy, and robustness) of reduced-order modeling strategies are documented. The development of unsteady aerodynamic state-space models directly from computational fluid dynamics analyses is presented in addition to analytical and experimental identifications of Volterra kernels. Finally, future directions for this research activity are summarized.
Second International Conference on Accelerating Biopharmaceutical Development
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme “Delivering cost-effective, robust processes and methods quickly and efficiently.” The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development. PMID:20065637
Photogrammetry Methodology Development for Gossamer Spacecraft Structures
NASA Technical Reports Server (NTRS)
Pappa, Richard S.; Jones, Thomas W.; Walford, Alan; Black, Jonathan T.; Robson, Stuart; Shortis, Mark R.
2002-01-01
Photogrammetry, the science of calculating 3D object coordinates from images, is a flexible and robust approach for measuring the static and dynamic characteristics of future ultralightweight and inflatable space structures (a.k.a. Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
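A minimal sketch of the core photogrammetric computation, linear (DLT) triangulation of a 3D point from two calibrated views, shown on a toy two-camera geometry:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two calibrated views.
    P1, P2: 3x4 camera projection matrices; x1, x2: 2D image points."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)     # least-squares null vector of A
    X = Vt[-1]
    return X[:3] / X[3]             # dehomogenize

# Toy example: two axis-aligned cameras observing the point (0, 0, 5)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.0, 0.0, 5.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, x1, x2))  # ~ [0, 0, 5]
```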
Loh, Leslie J; Bandara, Gayan C; Weber, Genevieve L; Remcho, Vincent T
2015-08-21
Due to the rapid expansion in hydraulic fracturing (fracking), there is a need for robust, portable and specific water analysis techniques. Early detection of contamination is crucial for the prevention of lasting environmental damage. Bromide can potentially function as an early indicator of water contamination by fracking waste, because there is a high concentration of bromide ions in fracking wastewaters. To facilitate this, a microfluidic paper-based analytical device (μPAD) has been developed and optimized for the quantitative colorimetric detection of bromide in water using a smartphone. A paper microfluidic platform offers the advantages of inexpensive fabrication, elimination of unstable wet reagents, portability and high adaptability for widespread distribution. These features make this assay an attractive option for a new field test for on-site determination of bromide.
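A minimal sketch of smartphone colorimetric quantitation of this kind: converting the detection zone's green-channel intensity to an effective absorbance and inverting a linear calibration; all numbers are illustrative assumptions:

```python
import numpy as np

# Hypothetical calibration: mean green-channel intensity of the detection
# zone (from smartphone photos) vs. bromide standard concentration (mg/L).
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
green = np.array([212.0, 196.0, 181.0, 152.0, 95.0])

# Colorimetric response is often linearized as an "effective absorbance"
absorb = -np.log10(green / green[0])
slope, intercept = np.polyfit(conc, absorb, 1)

def bromide_mg_per_l(green_value):
    a = -np.log10(green_value / green[0])
    return (a - intercept) / slope

print(f"unknown zone at G = 170 -> {bromide_mg_per_l(170.0):.1f} mg/L")
```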
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages
Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kharkov, B. B.; Chizhik, V. I.; Dvinskikh, S. V., E-mail: sergeid@kth.se
2016-01-21
Dipolar recoupling is an essential part of current solid-state NMR methodology for probing atomic-resolution structure and dynamics in solids and soft matter. The recently described magic-echo amplitude- and phase-modulated cross-polarization heteronuclear recoupling strategy aims at efficient and robust recoupling over the entire range of coupling constants, in both rigid and highly dynamic molecules. In the present study, the properties of this recoupling technique are investigated by theoretical analysis, spin-dynamics simulation, and experiment. The resonance conditions and the efficiency of suppressing rf field errors are examined and compared to those of other recoupling sequences based on similar principles. The experimental data obtained in a variety of rigid and soft solids illustrate the scope of the method and corroborate the results of analytical and numerical calculations. The technique benefits from dipolar resolution over a wider range of coupling constants than other state-of-the-art methods and thus is advantageous in studies of complex solids with a broad range of dynamic processes and degrees of molecular mobility.
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique
NASA Astrophysics Data System (ADS)
Riccò, R.; Nizzero, S.; Penna, E.; Meneghello, A.; Cretaio, E.; Enrichi, F.
2018-05-01
In modern biosensing and imaging, fluorescence-based methods constitute the most widespread approach to achieving optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially enhance the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating reaction parameters, controllable size tuning of particle diameters down to 10 nm was achieved. Particle morphology and optical response were evaluated, indicating possible single-molecule behaviour, without employing microemulsion methods to achieve similar results.
Semidefinite Relaxation-Based Optimization of Multiple-Input Wireless Power Transfer Systems
NASA Astrophysics Data System (ADS)
Lang, Hans-Dieter; Sarris, Costas D.
2017-11-01
An optimization procedure for multi-transmitter (MISO) wireless power transfer (WPT) systems based on tight semidefinite relaxation (SDR) is presented. This method ensures physical realizability of MISO WPT systems designed via convex optimization, a robust, semi-analytical and intuitive route to optimizing such systems. To that end, the nonconvex constraints requiring that power is fed into rather than drawn from the system via all transmitter ports are incorporated in a convex semidefinite relaxation, which is efficiently and reliably solvable by dedicated algorithms. A test of the solution then confirms that this modified problem is equivalent (tight relaxation) to the original (nonconvex) one and that the true global optimum has been found. This is a clear advantage over global optimization methods (e.g., genetic algorithms), where convergence to the true global optimum cannot be ensured or tested. Discussions of numerical results yielded by both the closed-form expressions and the refined technique illustrate the importance and practicability of the new method. It is shown that this technique offers a rigorous optimization framework for a broad range of current and emerging WPT applications.
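A minimal sketch of a semidefinite relaxation of this general shape using cvxpy, with random positive semidefinite matrices standing in for the quadratic forms that would be derived from the actual coupling network; the rank-1 check mirrors the tightness test described above:

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(6)
n = 3  # number of transmitters

def random_psd(n):
    # random PSD stand-in for a network-derived quadratic form
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return M @ M.conj().T

A = random_psd(n)   # received power = i^H A i (assumed model)
B = random_psd(n)   # total transmitter input power = i^H B i (assumed model)

# SDR: optimize over X = i i^H with the rank-1 constraint dropped; per-port
# "power in, not out" conditions would add further linear trace constraints.
X = cp.Variable((n, n), hermitian=True)
prob = cp.Problem(cp.Maximize(cp.real(cp.trace(A @ X))),
                  [X >> 0, cp.real(cp.trace(B @ X)) <= 1.0])
prob.solve()

# Tightness check: the relaxation is exact iff the optimal X is (near) rank one
eig = np.linalg.eigvalsh(X.value)
print("eigenvalues:", np.round(eig, 6), "rank-1?", eig[-1] / eig.sum() > 0.999)
```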
Robson, Philip M; Grant, Aaron K; Madhuranthakam, Ananth J; Lattanzi, Riccardo; Sodickson, Daniel K; McKenzie, Charles A
2008-10-01
Parallel imaging reconstructions result in spatially varying noise amplification characterized by the g-factor, precluding conventional measurements of noise from the final image. A simple Monte Carlo based method is proposed for all linear image reconstruction algorithms, which allows measurement of signal-to-noise ratio and g-factor and is demonstrated for SENSE and GRAPPA reconstructions for accelerated acquisitions that have not previously been amenable to such assessment. Only a simple "prescan" measurement of noise amplitude and correlation in the phased-array receiver, and a single accelerated image acquisition are required, allowing robust assessment of signal-to-noise ratio and g-factor. The "pseudo multiple replica" method has been rigorously validated in phantoms and in vivo, showing excellent agreement with true multiple replica and analytical methods. This method is universally applicable to the parallel imaging reconstruction techniques used in clinical applications and will allow pixel-by-pixel image noise measurements for all parallel imaging strategies, allowing quantitative comparison between arbitrary k-space trajectories, image reconstruction, or noise conditioning techniques. (c) 2008 Wiley-Liss, Inc.
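A minimal sketch of the pseudo multiple replica idea for any linear reconstruction; the recon function and the prescan noise covariance are assumptions supplied by the user, and the g-factor follows by comparing accelerated and unaccelerated noise maps:

```python
import numpy as np

def pseudo_multiple_replica(recon, kspace, noise_cov, n_rep=128, seed=0):
    """Monte Carlo SNR estimation for any *linear* reconstruction.

    recon:     function mapping (n_coil, ...) k-space data to a complex image
    noise_cov: coil noise covariance from a short "prescan" noise acquisition
    Returns the magnitude image and a pixelwise noise standard-deviation map;
    SNR is their ratio.
    """
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(noise_cov)    # imposes the measured coil correlations
    replicas = []
    for _ in range(n_rep):
        white = (rng.normal(size=kspace.shape)
                 + 1j * rng.normal(size=kspace.shape)) / np.sqrt(2)
        correlated = np.tensordot(L, white, axes=(1, 0))  # mix across coil axis
        replicas.append(recon(kspace + correlated))
    noise_map = np.std(np.stack(replicas), axis=0)  # pixelwise noise amplitude
    return np.abs(recon(kspace)), noise_map
```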
The determination of trace elements in crude oil and its heavy fractions by atomic spectrometry
NASA Astrophysics Data System (ADS)
Duyck, Christiane; Miekeley, Norbert; Porto da Silveira, Carmem L.; Aucélio, Ricardo Q.; Campos, Reinaldo C.; Grinberg, Patrícia; Brandão, Geisamanda P.
2007-09-01
A literature review on the determination of trace elements in crude oil and heavy molecular mass fractions (saturates, aromatics, resins and asphaltenes) by ICP-MS, ICP OES and AAS is presented. Metal occurrences, forms and distributions are examined, as well as their implications in terms of reservoir geochemistry, oil refining and the environment. The particular analytical challenges of determining metals in these complex matrices by spectrochemical techniques are discussed. Sample preparation based on ashing, microwave-assisted digestion and combustion decomposition procedures is noted as robust and long-established. However, the introduction of non-aqueous solvents and micro-emulsions into inductively coupled plasmas is cited as a new trend for achieving rapid and accurate analysis. Separation procedures for operationally defined fractions in crude oil are being applied more systematically for the observation of metal distributions and their implications. Chemical speciation is of growing interest, achieved by coupling high-efficiency separation techniques (e.g., HPLC and GC) to ICP-MS instrumentation, which allows the simultaneous determination of multiple organometallic species of geochemical and environmental importance.
Grimalt, Susana; Thompson, Dean G; Coppens, Melanie; Chartrand, Derek T; Shorney, Thomas; Meating, Joe; Scarr, Taylor
2011-08-10
A rapid and sensitive LC-ESI-MS method has been developed and validated for the quantitation of azadirachtin and 3-tigloylazadirachtol in deciduous tree matrices. The method involves automated extraction and simultaneous cleanup using an accelerated solvent technique with the matrix dispersed in solid phase over a layer of primary-secondary amine silica. The limits of quantification were 0.02 mg/kg for all matrices with the exception of Norway maple foliage (0.05 mg/kg). Validation at three levels (0.02, 0.1, and 1 mg/kg) demonstrated satisfactory recoveries (71-103%) with relative standard deviations <20%. Two in-source fragment ions were used for confirmation at levels above 0.1 mg/kg. Over a period of several months, quality control analyses showed the technique to be robust and effective in tracking the fate of these natural botanical insecticides following systemic injection into various tree species for control of invasive insect pest species such as the emerald ash borer and Asian longhorned beetle.
Recent Developments in Antibody-Based Assays for the Detection of Bacterial Toxins
Zhu, Kui; Dietrich, Richard; Didier, Andrea; Doyscher, Dominik; Märtlbauer, Erwin
2014-01-01
Considering the urgent demand for rapid and accurate determination of bacterial toxins and the recent promising developments in nanotechnology and microfluidics, this review summarizes new achievements of the past five years. Firstly, bacterial toxins are categorized according to their antibody binding properties into low and high molecular weight compounds. Secondly, the types of antibodies and new techniques for producing antibodies are discussed, including poly- and mono-clonal antibodies, single-chain variable fragments (scFv), as well as heavy-chain and recombinant antibodies. Thirdly, the use of different nanomaterials, such as gold nanoparticles (AuNPs), magnetic nanoparticles (MNPs), quantum dots (QDs) and carbon nanomaterials (graphene and carbon nanotubes), for labeling antibodies and toxins or for readout techniques is summarized. Fourthly, microscale analysis and miniaturized devices, for example microfluidics or lab-on-a-chip (LOC), which have attracted increasing attention in combination with immunoassays for robust detection and point-of-care testing (POCT), are reviewed. Finally, some new materials and analytical strategies that might be promising for analyzing toxins in the near future are briefly introduced. PMID:24732203
Percolation of localized attack on complex networks
NASA Astrophysics Data System (ADS)
Shao, Shuai; Huang, Xuqing; Stanley, H. Eugene; Havlin, Shlomo
2015-02-01
The robustness of complex networks against node failure and malicious attack has been of interest for decades, but most research has focused on random or hub-targeted attacks. In many real-world scenarios, however, attacks are neither random nor hub-targeted but localized: a group of neighboring nodes in a network is attacked and fails. In this paper we develop a percolation framework to analytically and numerically study the robustness of complex networks against such localized attack. In particular, we investigate this robustness in Erdős-Rényi networks, random-regular networks, and scale-free networks. Our results provide insight into how to better protect networks, enhance cybersecurity, and facilitate the design of more robust infrastructures.
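A small simulation in the spirit of the localized-attack model: a ball of neighboring nodes grown from a random seed (here in BFS order) is removed from an Erdős-Rényi network, and the surviving giant component is measured. This is a numerical companion to the percolation analysis, with all parameters illustrative.

```python
import networkx as nx
import numpy as np

def localized_attack(G, fraction, rng=np.random.default_rng(0)):
    """Remove a ball of neighboring nodes grown (BFS order) from a random seed."""
    n_remove = int(fraction * G.number_of_nodes())
    seed = int(rng.choice(G.number_of_nodes()))
    bfs_order = [seed] + [v for _, v in nx.bfs_edges(G, seed)]
    H = G.copy()
    H.remove_nodes_from(bfs_order[:n_remove])
    return H

G = nx.erdos_renyi_graph(10000, 4 / 10000, seed=1)   # ER network, mean degree ~ 4
for f in (0.1, 0.3, 0.5, 0.7):
    H = localized_attack(G, f)
    giant = max(nx.connected_components(H), key=len)
    print(f"remove {f:.0%} -> giant component holds "
          f"{len(giant) / G.number_of_nodes():.3f} of original nodes")
```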
Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models
2002-03-01
...such as the weighted sum method, weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only the weighted sum... different groups. They can be termed as deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the... weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most...
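For the AHP mentioned in this fragment, criteria weights are conventionally taken from the principal eigenvector of a reciprocal pairwise-comparison matrix, with a consistency ratio as a sanity check. A minimal sketch with a toy matrix:

```python
import numpy as np

# Reciprocal pairwise-comparison matrix on Saaty's 1-9 scale (toy 3-criteria case).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                        # criteria weights (principal eigenvector)

n = A.shape[0]
CI = (vals.real[k] - n) / (n - 1)   # consistency index
CR = CI / 0.58                      # Saaty's random index RI = 0.58 for n = 3
print("weights:", w.round(3), "| consistency ratio:", round(CR, 3))
```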
Teachable, high-content analytics for live-cell, phase contrast movies.
Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J
2010-09-01
CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the standard recipes by comparing their output with truth data created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for that image) and validated by independent review. The validation data show that the standard recipes' performance is comparable to the validated truth, with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays across broad cell types and laboratory settings.
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation was introduced to solve problems posed by systems, and two kinds of problem can be overcome with this technique. First, a problem that has an analytical solution but for which running a physical experiment would be costly in terms of money or lives. Second, a problem that has no analytical solution at all. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation methods to form a pseudo sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques, examines common misunderstandings about the two, and explains successful usages of both.
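The second problem class, where no analytical solution exists, is exactly where resampling shines. A minimal sketch, assuming a skewed sample and a trimmed-mean statistic whose sampling distribution has no convenient closed form:

```python
import numpy as np

rng = np.random.default_rng(42)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=30)   # skewed data, no tidy theory

def trimmed_mean(x, prop=0.1):
    """Mean after discarding the smallest and largest 10% of observations."""
    x = np.sort(x)
    k = int(prop * len(x))
    return x[k:len(x) - k].mean()

# Bootstrap a pseudo sampling distribution of the trimmed mean.
boots = np.array([trimmed_mean(rng.choice(sample, size=len(sample), replace=True))
                  for _ in range(10000)])
lo, hi = np.percentile(boots, [2.5, 97.5])             # percentile confidence interval
print(f"trimmed mean {trimmed_mean(sample):.3f}, 95% bootstrap CI ({lo:.3f}, {hi:.3f})")
```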
Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.
Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun
2017-07-08
Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Christhilf, David; Perry, Boyd, III
2012-01-01
An important objective of the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model program was the demonstration of Flutter Suppression (FS), Gust Load Alleviation (GLA), and Ride Quality Enhancement (RQE). It was critical to evaluate the stability and robustness of these control laws analytically before testing them and experimentally while testing them to ensure the safety of the model and the wind tunnel. MATLAB-based software was applied to evaluate the performance of closed-loop systems in terms of stability and robustness. Existing software tools were extended to use analytical representations of the S4T and the control laws to analyze and evaluate the control laws prior to testing. Lessons were learned about the complex wind-tunnel model and experimental testing. The open-loop flutter boundary was determined from the closed-loop systems. A MATLAB/Simulink simulation developed under the program is available for future work to improve the CPE process. This paper is one of a series that comprises a special session summarizing the S4T wind-tunnel program.
da Rosa, Hemerson S; Koetz, Mariana; Santos, Marí Castro; Jandrey, Elisa Helena Farias; Folmer, Vanderlei; Henriques, Amélia Teresinha; Mendez, Andreas Sebastian Loureiro
2018-04-01
Sida tuberculata (ST) is a Malvaceae species widely distributed in Southern Brazil. In traditional medicine, ST has been employed as a hypoglycemic, hypocholesterolemic, anti-inflammatory and antimicrobial agent. Additionally, this species is chemically characterized mainly by flavonoids, alkaloids and phytoecdysteroids. The present work aimed to optimize the extractive technique and to validate an UHPLC method for the determination of 20-hydroxyecdysone (20HE) in ST leaves. A Box-Behnken Design (BBD) was used in method optimization. The extractive methods tested were: static and dynamic maceration, ultrasound, ultra-turrax and reflux. In the Box-Behnken design, three parameters were evaluated at three levels (-1, 0, +1): particle size, time and plant:solvent ratio. In the method validation, the parameters of selectivity, specificity, linearity, limits of detection and quantification (LOD, LOQ), precision, accuracy and robustness were evaluated. The results indicate static maceration as the best technique for maximizing the 20HE peak area in ST extract. The optimal extraction from response surface methodology was achieved with a granulometry of 710 nm, 9 days of maceration and a plant:solvent ratio of 1:54 (w/v). The developed UHPLC-PDA analytical method showed full viability of performance, proving to be selective, linear, precise, accurate and robust for 20HE detection in ST leaves. The average content of 20HE was 0.56% in the dry extract. Thus, the optimization of the extractive method for ST leaves increased the concentration of 20HE in the crude extract, and a reliable method was successfully developed according to validation requirements and in agreement with current legislation. Copyright © 2018 Elsevier Inc. All rights reserved.
Shaper-Based Filters for the compensation of the load cell response in dynamic mass measurement
NASA Astrophysics Data System (ADS)
Richiedei, Dario; Trevisani, Alberto
2018-01-01
This paper proposes a novel model-based signal filtering technique for dynamic mass measurement through load cells. Load cells are sensors with an underdamped oscillatory response which usually imposes a long settling time. Real-time filtering is therefore necessary to compensate for such dynamics and to quickly retrieve the mass of the measurand (the steady-state value of the load cell response) before the measured signal actually settles. This problem has a big impact on the throughput of industrial weighing machines. In this paper a novel solution to this problem is developed: a model-based filtering technique is proposed to ensure accurate, robust and rapid estimation of the mass of the measurand. The digital filters proposed are referred to as Shaper-Based Filters (SBFs) and are based on the convolution of the load cell output signal with a sequence of a few impulses (typically between 2 and 5). The amplitudes and the instants of application of these impulses are computed through the analytical development of the load cell step response, by imposing the admissible residual oscillation in the steady-state filtered signal and by requiring the desired sensitivity of the filter. The inclusion of robustness specifications tackles effectively the unavoidable uncertainty and variability in the load cell frequency and damping. The effectiveness of the proposed filters is proved experimentally through an industrial set-up: the load-cell-instrumented weigh bucket of a multihead weighing machine for packaging. A performance comparison with other benchmark filters is provided and discussed too.
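A two-impulse zero-vibration shaper is the simplest member of the impulse-sequence family described above. The sketch below applies one to a simulated underdamped load-cell step response; all parameters are assumed, and the paper's robust multi-impulse variants are not reproduced.

```python
import numpy as np

# Load cell modeled as an underdamped 2nd-order system (assumed parameters).
wn, zeta, fs = 2 * np.pi * 30.0, 0.10, 5000.0    # natural freq, damping, sample rate
t = np.arange(0, 0.5, 1 / fs)
wd = wn * np.sqrt(1 - zeta**2)
step = 1 - np.exp(-zeta * wn * t) * (np.cos(wd * t)
        + zeta / np.sqrt(1 - zeta**2) * np.sin(wd * t))
mass_signal = 0.250 * step                       # 250 g object on the cell

# Two-impulse zero-vibration (ZV) shaper tuned to the cell's oscillatory mode.
K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta**2))
amps = np.array([1.0, K]) / (1.0 + K)            # impulse amplitudes (sum to 1)
delay = int(round((np.pi / wd) * fs))            # half damped period, in samples
kernel = np.zeros(delay + 1)
kernel[0], kernel[-1] = amps
filtered = np.convolve(mass_signal, kernel)[:len(t)]

# The shaped signal reaches the steady-state mass far sooner than the raw one.
tol = 0.001 * 0.250                              # 0.1% settling band
settle = lambda y: t[np.nonzero(np.abs(y - 0.250) > tol)[0][-1]]
print(f"raw settling {settle(mass_signal)*1e3:.0f} ms -> "
      f"shaped {settle(filtered)*1e3:.0f} ms")
```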
NASA Astrophysics Data System (ADS)
Ngom, Ndèye Fatou; Monga, Olivier; Ould Mohamed, Mohamed Mahmoud; Garnier, Patricia
2012-02-01
This paper focuses on the modeling of soil microstructures using generalized cylinders, with a specific application to pore space. The geometric modeling of these microstructures is a recent area of study, made possible by the improved performance of computed tomography techniques. X-ray scanners provide very-high-resolution 3D volume images (3-5 μm) of soil samples in which pore spaces can be extracted by thresholding. However, in most cases, the pore space defines a complex volume shape that cannot be approximated using simple analytical functions. We propose representing this shape using a compact, stable, and robust piecewise approximation by means of generalized cylinders. This intrinsic shape representation conserves its topological and geometric properties. Our algorithm includes three main processing stages. The first stage consists in describing the volume shape using a minimum number of balls included within the shape, such that their union recovers the shape skeleton. The second stage involves the optimal extraction of simply connected chains of balls. The final stage copes with the approximation of each optimal simply connected chain using generalized cylinders: circular generalized cylinders, tori, cylinders, and truncated cones. This technique was applied to several data sets formed by real volume computed tomography soil samples. It was possible to demonstrate that our geometric representation supplied a good approximation of the pore space. We also stress the compactness and robustness of this method with respect to any changes affecting the initial data, as well as its coherence with the intuitive notion of pores. In future studies, this geometric pore space representation will be used to simulate biological dynamics.
Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment
ERIC Educational Resources Information Center
Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James
2010-01-01
The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…
Robust Feature Selection Technique using Rank Aggregation.
Sarkar, Chandrima; Cooley, Sarah; Srivastava, Jaideep
2014-01-01
Although feature selection is a well-developed research area, there is an ongoing need to develop methods to make classifiers more efficient. One important challenge is the lack of a universal feature selection technique which produces similar outcomes with all types of classifiers. This is because all feature selection techniques have individual statistical biases while classifiers exploit different statistical properties of data for evaluation. In numerous situations this can put researchers into a dilemma as to which feature selection method and classifier to choose from a vast range of choices. In this paper, we propose a technique that aggregates the consensus properties of various feature selection methods to develop a better solution. The ensemble nature of our technique makes it more robust across various classifiers. In other words, it is stable towards achieving similar and ideally higher classification accuracy across a wide variety of classifiers. We quantify this concept of robustness with a measure known as the Robustness Index (RI). We perform an extensive empirical evaluation of our technique on eight data sets with different dimensions, including Arrhythmia, Lung Cancer, Madelon, mfeat-fourier, internet-ads, Leukemia-3c and Embryonal Tumor, and a real-world data set, Acute Myeloid Leukemia (AML). We demonstrate not only that our algorithm is more robust, but also that, compared to other techniques, it improves the classification accuracy by approximately 3-4% (in data sets with fewer than 500 features) and by more than 5% (in data sets with more than 500 features), across a wide range of classifiers.
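A compact illustration of the consensus-by-rank-aggregation idea (here a simple Borda summation over three selectors with different statistical biases; the paper's exact aggregation scheme and Robustness Index are not reproduced):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif, mutual_info_classif

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

# Three feature-scoring methods with different statistical biases.
scores = [
    mutual_info_classif(X, y, random_state=0),
    f_classif(X, y)[0],
    RandomForestClassifier(random_state=0).fit(X, y).feature_importances_,
]

# Borda-style rank aggregation: convert each score list to ranks, then sum.
ranks = np.array([np.argsort(np.argsort(-s)) for s in scores])  # 0 = best feature
consensus = ranks.sum(axis=0)
print("consensus top-5 features:", np.argsort(consensus)[:5])
```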
Nonlinear estimation for arrays of chemical sensors
NASA Astrophysics Data System (ADS)
Yosinski, Jason; Paffenroth, Randy
2010-04-01
Reliable detection of hazardous materials is a fundamental requirement of any national security program. Such materials can take a wide range of forms including metals, radioisotopes, volatile organic compounds, and biological contaminants. In particular, detection of hazardous materials in highly challenging conditions - such as in cluttered ambient environments, where complex collections of analytes are present, and with sensors lacking specificity for the analytes of interest - is an important part of a robust security infrastructure. Sophisticated single sensor systems provide good specificity for a limited set of analytes but often have cumbersome hardware and environmental requirements. On the other hand, simple, broadly responsive sensors are easily fabricated and efficiently deployed, but such sensors individually have neither the specificity nor the selectivity to address analyte differentiation in challenging environments. However, arrays of broadly responsive sensors can provide much of the sensitivity and selectivity of sophisticated sensors but without the substantial hardware overhead. Unfortunately, arrays of simple sensors are not without their challenges - the selectivity of such arrays can only be realized if the data is first distilled using highly advanced signal processing algorithms. In this paper we will demonstrate how the use of powerful estimation algorithms, based on those commonly used within the target tracking community, can be extended to the chemical detection arena. Herein our focus is on algorithms that not only provide accurate estimates of the mixture of analytes in a sample, but also provide robust measures of ambiguity, such as covariances.
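In the target-tracking spirit described above, even a single linear Kalman update yields both a mixture estimate and a covariance-based ambiguity measure. A toy sketch with an assumed 8-sensor, 3-analyte response matrix:

```python
import numpy as np

# Linear measurement model: y = H x + v, where x is the analyte-mixture vector,
# H the (known) response matrix of the sensor array, and v noise with covariance R.
def kalman_update(x_prior, P_prior, y, H, R):
    S = H @ P_prior @ H.T + R                       # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_post = x_prior + K @ (y - H @ x_prior)        # mixture estimate
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior  # ambiguity measure
    return x_post, P_post

rng = np.random.default_rng(1)
H = rng.uniform(0.2, 1.0, size=(8, 3))              # 8 broadly responsive sensors
x_true = np.array([0.5, 0.0, 1.2])                  # 3 analyte concentrations
y = H @ x_true + rng.normal(0, 0.05, size=8)
x, P = kalman_update(np.zeros(3), np.eye(3), y, H, 0.05**2 * np.eye(8))
print("estimate:", x.round(2), "| 1-sigma:", np.sqrt(np.diag(P)).round(3))
```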
Petruzziello, Filomena; Grand-Guillaume Perrenoud, Alexandre; Thorimbert, Anita; Fogwill, Michael; Rezzi, Serge
2017-07-18
Analytical solutions enabling the quantification of circulating levels of liposoluble micronutrients such as vitamins and carotenoids are currently limited to either a single analyte or a reduced panel of analytes. The requirement to use multiple approaches hampers the investigation of biological variability across a large number of samples in a time- and cost-efficient manner. With the goal of developing high-throughput and robust quantitative methods for the profiling of micronutrients in human plasma, we introduce a novel, validated workflow for the determination of 14 fat-soluble vitamins and carotenoids in a single run. Automated supported liquid extraction was optimized and implemented to process 48 samples in parallel in 1 h, and the analytes were measured using ultrahigh-performance supercritical fluid chromatography coupled to tandem mass spectrometry in less than 8 min. Improved mass spectrometry interface hardware was built to minimize the post-decompression volume and to allow better control of the chromatographic effluent density on its route toward and into the ion source. In addition, a specific make-up solvent condition was developed to ensure the solubility of both analytes and matrix constituents after mobile phase decompression. The optimized interface resulted in improved spray plume stability and conserved matrix compound solubility, leading to enhanced hyphenation robustness while ensuring suitable analytical repeatability and improved detection sensitivity. The overall developed methodology gives recoveries within 85-115%, as well as within- and between-day coefficients of variation of 2 and 14%, respectively.
Automated UHPLC separation of 10 pharmaceutical compounds using software-modeling.
Zöldhegyi, A; Rieger, H-J; Molnár, I; Fekhretdinova, L
2018-03-20
Human mistakes are still one of the main underlying causes of regulatory findings and, for compliance with FDA's Data Integrity guidance and Analytical Quality by Design (AQbD), they must be eliminated. To develop smooth, fast and robust methods that are free of human failures, a state-of-the-art automation approach is presented. For the scope of this study, a commercial software package (DryLab) and a model mixture of 10 drugs were subjected to testing. Following AQbD principles, the best available working point was selected and confirmatory experimental runs, i.e. the six worst cases of the conducted robustness calculation, were performed. Simulated results were found to be in excellent agreement with the experimental ones, proving the usefulness and effectiveness of automated, software-assisted analytical method development. Copyright © 2018. Published by Elsevier B.V.
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Borchers, Christoph H
2013-07-01
An emerging approach for multiplexed targeted proteomics involves bottom-up LC-MRM-MS, with stable isotope-labeled internal standard peptides, to accurately quantitate panels of putative disease biomarkers in biofluids. In this paper, we used this approach to quantitate 27 candidate cancer-biomarker proteins in human plasma that had not been treated by immunoaffinity depletion or enrichment techniques. These proteins have been reported as biomarkers for a variety of human cancers, from laryngeal to ovarian, with breast cancer having the highest correlation. We implemented measures to minimize the analytical variability, improve the quantitative accuracy, and increase the feasibility and applicability of this MRM-based method. We have demonstrated excellent retention time reproducibility (median interday CV: 0.08%) and signal stability (median interday CV: 4.5% for the analytical platform and 6.1% for the bottom-up workflow) for the 27 biomarker proteins (represented by 57 interference-free peptides). The linear dynamic range for the MRM assays spanned four orders of magnitude, with 25 assays covering a 10(3)-10(4) range in protein concentration. The lowest abundance quantifiable protein in our biomarker panel was insulin-like growth factor 1 (calculated concentration: 127 ng/mL). Overall, the analytical performance of this assay demonstrates high robustness and sensitivity, and provides the necessary throughput and multiplexing capabilities required to verify and validate cancer-associated protein biomarker panels in human plasma, prior to clinical use. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Granzotto, Clara; Sutherland, Ken
2017-03-07
This paper reports an improved method for the identification of Acacia gum in cultural heritage samples using matrix assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) after enzymatic digestion of the polysaccharide component. The analytical strategy was optimized using a reference Acacia gum (gum arabic, sp. A. senegal) and provided an unambiguous MS profile of the gum, characterized by specific and recognized oligosaccharides, from as little as 0.1 μg of material. The enhanced experimental approach with reduced detection limit was successfully applied to the analysis of naturally aged (∼80 year) gum arabic samples, pure and mixed with lead white pigment, and allowed the detection of gum arabic in samples from a late painting (1949/1954) by Georges Braque in the collection of the Art Institute of Chicago. This first application of the technique to characterize microsamples from a painting, in conjunction with analyses by gas chromatography/mass spectrometry (GC/MS), provided important insights into Braque's unusual mixed paint media that are also helpful to inform appropriate conservation treatments for his works. The robustness of the analytical strategy due to the reproducibility of the gum MS profile, even in the presence of other organic and inorganic components, together with the minimal sample size required, demonstrate the value of this new MALDI-TOF MS method as an analytical tool for the identification of gum arabic in microsamples from museum artifacts.
Matsche, Mark A; Arnold, Jill; Jenkins, Erin; Townsend, Howard; Rosemary, Kevin
2014-09-01
The imperiled status of Atlantic sturgeon (Acipenser oxyrinchus oxyrinchus), a large, long-lived, anadromous fish found along the Atlantic coast of North America, has prompted efforts at captive propagation for research and stock enhancement. The purpose of this study was to establish hematology and plasma chemistry reference intervals of captive Atlantic sturgeon maintained under different culture conditions. Blood specimens were collected from a total of 119 fish at 3 hatcheries: Lamar, PA (n = 36, ages 10-14 years); Chalk Point, MD (n = 40, siblings of Lamar); and Horn Point, Cambridge, MD (n = 43, mixed population from Chesapeake Bay). Reference intervals (using robust techniques), median, mean, and standard deviations were determined for WBC, RBC, thrombocytes, PCV, HGB, MCV, MCH, MCHC, and absolute counts for lymphocytes (L), neutrophils (N), monocytes, and eosinophils. Chemistry analytes included concentrations of total proteins, albumin, glucose, urea, calcium, phosphate, sodium, potassium, chloride, and globulins, AST, CK, and LDH activities, and osmolality. Mean concentrations of total proteins, albumin, and glucose were at or below the analytic range. Statistical comparisons showed significant differences among hatcheries for each remaining plasma chemistry analyte and for PCV, RBC, MCHC, MCH, eosinophil and monocyte counts, and N:L ratio throughout all 3 groups. Therefore, reference intervals were calculated separately for each population. Reference intervals for fish maintained under differing conditions should be established per population. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
Enhancement of integrated photonic biosensing by magnetic controlled nano-particles
NASA Astrophysics Data System (ADS)
Peserico, N.; Sharma, P. Pratim; Belloni, A.; Damin, F.; Chiari, M.; Bertacco, R.; Melloni, A.
2018-02-01
Integrated Mach-Zehnder interferometers, ring resonators, Bragg reflectors or simple waveguides are commonly used as photonic biosensing elements. They can be used for label-free detection, relating changes in the optical signal in real time, such as optical power or spectral response, to the presence and even the quantity of a target analyte on the surface of the photonic waveguide. The label-free method has advantages in terms of sample preparation, but it is more sensitive to spurious effects such as temperature and sample refractive index variation, biological noise, etc. Labeled methods can be more robust, more sensitive, and able to manipulate the biological targets. In this work, we present an innovative labeled biosensing technique exploiting magnetic nano-beads for enhancement of sensitivity over integrated optic microrings. A sandwich binding is exploited to bring the magnetic labels close to the surface of the optical waveguide, where they interact with the optical evanescent field. The proximity and the quantity of the magnetic nano-beads are seen as a shift in the resonance of the microring. Detection of antibodies reaches a high level of sensitivity, down to 8 pM, with a high confidence level. The sizes of the nano-beads are 50 to 250 nm. Furthermore, time-varying magnetic fields permit manipulation of the beads and can even induce specific signals on the detected light to ease the processing and provide reliable identification of the presence of the desired analyte. Multiple-analyte detection is also possible.
Rodriguez, Estrella Sanz; Poynter, Sam; Curran, Mark; Haddad, Paul R; Shellie, Robert A; Nesterenko, Pavel N; Paull, Brett
2015-08-28
Preservation of ionic species within Antarctic ice yields a unique proxy record of the Earth's climate history. Studies have until now focused on two proxies: the ionic components of sea salt aerosol and methanesulfonic acid. Measurement of all of the major ionic species in ice core samples is typically carried out by ion chromatography. Previous methods, whilst providing suitable detection limits, have been based upon off-column preconcentration techniques, requiring larger sample volumes, with potential for sample contamination and/or carryover. Here, a new capillary ion chromatography based analytical method has been developed for quantitative analysis of limited-volume Antarctic ice core samples. The developed analytical protocol applies capillary ion chromatography (with suppressed conductivity detection) and direct on-column sample injection and focusing, thus eliminating the requirement for off-column sample preconcentration. This limits the total sample volume needed to 300 μL per analysis, allowing for triplicate sample analysis with <1 mL of sample. This new approach provides a reliable and robust analytical method for the simultaneous determination of organic and inorganic anions, including fluoride, methanesulfonate, chloride, sulfate and nitrate. Application to composite ice-core samples is demonstrated, with coupling of the capillary ion chromatograph to high-resolution mass spectrometry used to confirm the presence and purity of the observed methanesulfonate peak. Copyright © 2015 Elsevier B.V. All rights reserved.
Microemulsification: an approach for analytical determinations.
Lima, Renato S; Shiroma, Leandro Y; Teixeira, Alvaro V N C; de Toledo, José R; do Couto, Bruno C; de Carvalho, Rogério M; Carrilho, Emanuel; Kubota, Lauro T; Gobbi, Angelo L
2014-09-16
We address a novel method for analytical determinations that combines simplicity, rapidity, low consumption of chemicals, and portability with high analytical performance, taking into account parameters such as precision, linearity, robustness, and accuracy. This approach relies on the effect of the analyte content on the Gibbs free energy of dispersions, affecting the thermodynamic stabilization of emulsions or Winsor systems to form microemulsions (MEs). This phenomenon is expressed by the minimum volume fraction of amphiphile required to form a microemulsion (Φ(ME)), which is the analytical signal of the method. Thus, the measurements can be taken by visually monitoring the transition of the dispersions from cloudy to transparent during the microemulsification, like a titration, bypassing the need for electric energy. The studies performed were: phase behavior, droplet dimension by dynamic light scattering, analytical curve, and robustness tests. The reliability of the method was evaluated by determining water in ethanol fuels and monoethylene glycol in complex samples of liquefied natural gas. The dispersions were composed of water-chlorobenzene (water analysis) and water-oleic acid (monoethylene glycol analysis) with ethanol as the hydrotrope phase. The mean hydrodynamic diameter values for the nanostructures in the droplet-based water-chlorobenzene MEs were in the range of 1 to 11 nm. The microemulsification procedures were conducted by adding ethanol to water-oleic acid (W-O) mixtures with the aid of a micropipette and shaking. The Φ(ME) measurements were performed in a thermostatic water bath at 23 °C by direct observation, that is, by visual analysis of the media. The experiments to determine water demonstrated that the analytical performance depends on the composition of the ME, showing the flexibility of the developed method. The linear range was fairly broad, with limits of linearity up to 70.00% water in ethanol. For monoethylene glycol in water, in turn, the linear range was observed throughout the volume fraction of analyte. The best limits of detection were 0.32% v/v water in ethanol and 0.30% v/v monoethylene glycol in water. Furthermore, the accuracy was highly satisfactory. The natural gas samples provided by Petrobras exhibited color, particulate material, high ionic strength, and diverse compounds such as metals, carboxylic acids, and anions. These samples had conductivities of up to 2630 μS cm(-1); the conductivity of pure monoethylene glycol is only 0.30 μS cm(-1). Despite such downsides, the method allowed accurate measurements bypassing steps such as extraction, preconcentration, and dilution of the sample. In addition, the levels of robustness were promising. This parameter was evaluated by investigating the effects of (i) deviations in the volumetric preparation of the dispersions and (ii) changes in temperature on the analyte contents recorded by the method.
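Because the analytical signal Φ(ME) is read out visually, the quantitative step reduces to a conventional calibration curve. A sketch with hypothetical titration data (not the paper's measurements):

```python
import numpy as np

# Hypothetical calibration: minimum amphiphile fraction (phi_ME) measured for
# W-O mixtures spiked with known water contents (% v/v in ethanol).
water_pct = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0])
phi_me    = np.array([0.312, 0.335, 0.359, 0.380, 0.405, 0.428, 0.451, 0.476])

slope, intercept = np.polyfit(water_pct, phi_me, 1)   # linear analytical curve
r = np.corrcoef(water_pct, phi_me)[0, 1]
print(f"phi_ME = {slope:.5f} * water% + {intercept:.4f}, r = {r:.4f}")

# Invert the curve to assay an unknown sample from its measured phi_ME.
phi_unknown = 0.392
print(f"estimated water content: {(phi_unknown - intercept) / slope:.1f} % v/v")
```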
The influence of parametric and external noise in act-and-wait control with delayed feedback.
Wang, Jiaxing; Kuske, Rachel
2017-11-01
We apply several novel semi-analytic approaches for characterizing and calculating the effects of noise in a system with act-and-wait control. For concrete illustration, we apply these to a canonical balance model for an inverted pendulum to study the combined effect of delay and noise within the act-and-wait setting. While the act-and-wait control facilitates strong stabilization through deadbeat control, a comparison of different models with continuous vs. discrete updating of the control strategy in the active period illustrates how delays combined with the imprecise application of the control can seriously degrade the performance. We give several novel analyses of a generalized act-and-wait control strategy, allowing flexibility in the updating of the control strategy, in order to understand the sensitivities to delays and random fluctuations. In both the deterministic and stochastic settings, we give analytical and semi-analytical results that characterize and quantify the dynamics of the system. These results include the size and shape of stability regions, densities for the critical eigenvalues that capture the rate of reaching the desired stable equilibrium, and amplification factors for sustained fluctuations in the context of external noise. They also provide the dependence of these quantities on the length of the delay and the active period. In particular, we see that the combined influence of delay, parametric error, or external noise and on-off control can qualitatively change the dynamics, thus reducing the robustness of the control strategy. We also capture the dependence on how frequently the control is updated, allowing an interpolation between continuous and frequent updating. In addition to providing insights for these specific models, the methods we propose are generalizable to other settings with noise, delay, and on-off control, where analytical techniques are otherwise severely scarce.
Robust global image registration based on a hybrid algorithm combining Fourier and spatial domain techniques
Crabtree, Peter N.; Seanor, Collin
2012-09-01
...demonstrate performance of a hybrid algorithm. These results are from analysis of a set of images of an ISO 12233 [12] resolution chart captured in the...
Image Alignment for Multiple Camera High Dynamic Range Microscopy.
Eastwood, Brian S; Childs, Elisabeth C
2012-01-09
This paper investigates the problem of image alignment for multiple camera high dynamic range (HDR) imaging. HDR imaging combines information from images taken with different exposure settings. Combining information from multiple cameras requires an alignment process that is robust to the intensity differences in the images. HDR applications that use a limited number of component images require an alignment technique that is robust to large exposure differences. We evaluate the suitability for HDR alignment of three exposure-robust techniques. We conclude that image alignment based on matching feature descriptors extracted from radiant power images from calibrated cameras yields the most accurate and robust solution. We demonstrate the use of this alignment technique in a high dynamic range video microscope that enables live specimen imaging with a greater level of detail than can be captured with a single camera.
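A condensed sketch of the winning strategy above, feature-descriptor matching plus robust homography estimation, using ORB on intensity images. Note the paper matches descriptors on radiant power images from calibrated cameras; that radiometric calibration step is omitted here.

```python
import cv2
import numpy as np

def align_exposures(img_ref, img_mov):
    """Homography alignment of two differently exposed grayscale frames using
    ORB feature descriptors, which tolerate large intensity differences."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img_ref, None)
    k2, d2 = orb.detectAndCompute(img_mov, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # RANSAC rejects outliers
    return cv2.warpPerspective(img_mov, H, img_ref.shape[::-1])
```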
Yang, Clayton S C; Jin, Feng; Swaminathan, Siva R; Patel, Sita; Ramer, Evan D; Trivedi, Sudhir B; Brown, Ei E; Hommerich, Uwe; Samuels, Alan C
2017-10-30
This is the first report of a simultaneous ultraviolet/visible/NIR and longwave infrared laser-induced breakdown spectroscopy (UVN + LWIR LIBS) measurement. To study the feasibility of combining the newly developed rapid LWIR LIBS linear array detection system with existing rapid analytical techniques for a wide range of chemical analysis applications, two different solid pharmaceutical tablets, Tylenol Arthritis Pain and Bufferin, were studied using both a recently designed simultaneous UVN + LWIR LIBS detection system and a fast AOTF NIR (1200 to 2200 nm) spectrometer. Every simultaneous UVN + LWIR LIBS emission spectrum in this work was initiated by a single laser pulse-induced micro-plasma in the ambient air atmosphere. Distinct atomic and molecular LIBS emission signatures of the target compounds, measured simultaneously in the UVN (200 to 1100 nm) and LWIR (5.6 to 10 µm) spectral regions, are readily detected and identified without the need to employ complex data processing. In depth-profiling studies of these two pharmaceutical tablets without any sample preparation, one can easily monitor the transition of the dominant LWIR emission signatures from coating ingredients gradually to the pharmaceutical ingredients underneath the coating. The observed LWIR LIBS emission signatures provide complementary molecular information to the UVN LIBS signatures, thus adding robustness to identification procedures. LIBS techniques are more surface specific, while NIR spectroscopy can probe more of the bulk material with its greater penetration depth. Both UVN + LWIR LIBS and NIR absorption spectroscopy have shown the capability of acquiring useful target analyte spectral signatures on comparably short time scales. The addition of a rapid LWIR spectroscopic probe to these widely used optical analytical methods, such as NIR spectroscopy and UVN LIBS, may greatly enhance the capability and accuracy of the combined system for a comprehensive analysis.
Roy, Sharmili; Wei, Sim Xiao; Ying, Jean Liew Zhi; Safavieh, Mohammadali; Ahmed, Minhaz Uddin
2016-12-15
Electrochemiluminescence (ECL) has been widely used for nucleic acid testing. Here, we integrate loop-mediated isothermal amplification (LAMP) with the ECL technique for DNA detection and quantification. The target LAMP DNA bound electrostatically with [Ru(bpy)3](+2) on the carbon electrode surface, and an ECL reaction was triggered by tripropylamine (TPrA) to yield luminescence. We illustrate this method as a new and highly sensitive strategy for the detection of sequence-specific DNA from different meat species at picogram levels. The proposed strategy leverages the signal amplification capacity of TPrA and combines LAMP with the inherently high sensitivity of the ECL technique to facilitate the detection of low quantities of DNA. Using this technique, target DNA of Sus scrofa (pork) meat was detected at levels as low as 1 pg/µL (3.43×10(-1) copies/µL). In addition, the proposed technique was applied to the detection of Bacillus subtilis DNA samples, and a detection limit of 10 pg/µL (2.2×10(3) copies/µL) was achieved. The advantages of being isothermal, sensitive and robust, with the ability for multiplexed detection of bio-analytes, make this method a facile and appealing sensing modality for hand-held devices to be used at the point-of-care (POC). Copyright © 2016 Elsevier B.V. All rights reserved.
Robust nonlinear control of vectored thrust aircraft
NASA Technical Reports Server (NTRS)
Doyle, John C.; Murray, Richard; Morris, John
1993-01-01
An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.
Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty
NASA Astrophysics Data System (ADS)
Armah, Stephen Kofi
Autonomous control of mobile robots has attracted considerable attention of researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is the development of platforms that robustly operate in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust, and/or adaptive navigation control systems. In spite of the enormous amount of reported work on autonomous navigation control systems for mobile robots, achieving the goal above is still an open problem. Robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research work develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulation, and experiment. The control systems are designed to meet specified transient and steady-state specifications. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for the X80SV using logic control by combining 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, and PD-feedback control is used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using a MATLAB S-function to generate the state output. Secondly, the white-box and black-box approaches are used to obtain linearized second-order altitude models for the quadrotor, the AR.Drone 2.0. Proportional (P), pole placement or proportional plus velocity (PV), linear quadratic regulator (LQR), and model reference adaptive control (MRAC) controllers are designed and validated through simulations using MATLAB/Simulink. Control input saturation and time delay in the controlled systems are also studied. MATLAB graphical user interface (GUI) and Simulink programs are developed to implement the controllers on the drone. Thirdly, the time delay in the drone's control system is estimated using analytical and experimental methods. In the experimental approach, the transient properties of the experimental altitude responses are compared to those of simulated responses. The analytical approach makes use of the Lambert W function to obtain analytical solutions of scalar first-order delay differential equations (DDEs). A time-delayed P-feedback control system (retarded type) is used in estimating the time delay. An improved system performance is then obtained by incorporating the estimated time delay in the design of the PV control system (neutral type) and the PV-MRAC control system. Furthermore, the stability of a parametrically perturbed linear time-invariant (LTI) retarded-type system is studied. This is done by analytically calculating the stability radius of the system. Simulation of the control system is conducted to confirm the stability. This robust control design and uncertainty analysis are conducted for first-order and second-order quadrotor models.
Lastly, the robustly designed PV and PV-MRAC control systems are used to autonomously track multiple waypoints. Also, the robustness of the PV-MRAC controller is tested against a baseline PV controller using the payload capability of the drone. It is shown that the PV-MRAC offers several benefits over the fixed-gain approach of the PV controller. The adaptive control is found to offer enhanced robustness to the payload fluctuations.
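The Lambert W step mentioned above has a compact closed form for a scalar first-order DDE: the characteristic roots of x'(t) = a x(t) + b x(t - tau) are s_k = a + W_k(b tau e^{-a tau})/tau, and for this real-coefficient scalar case the principal branch yields the rightmost root. A sketch for a retarded P-feedback loop (gain and delays illustrative):

```python
import numpy as np
from scipy.special import lambertw

def rightmost_root(a, b, tau):
    """Rightmost characteristic root of x'(t) = a x(t) + b x(t - tau):
    s = a + W0(b * tau * exp(-a * tau)) / tau (principal Lambert W branch)."""
    return a + lambertw(b * tau * np.exp(-a * tau), k=0) / tau

# Retarded P-feedback altitude loop: x'(t) = -k x(t - tau), i.e. a = 0, b = -k.
k = 1.0
for tau in (0.1, 0.5, 1.0, 1.6):
    s = rightmost_root(0.0, -k, tau)
    verdict = "stable" if s.real < 0 else "unstable"
    print(f"tau = {tau:.1f} s: rightmost root {s:.3f} -> {verdict}")
# The loop crosses into instability near k*tau = pi/2, matching classical DDE theory.
```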
Doerrer, Nancy; Ladics, Gregory; McClain, Scott; Herouet-Guicheney, Corinne; Poulsen, Lars K; Privalle, Laura; Stagg, Nicola
2010-12-01
The International Life Sciences Institute Health and Environmental Sciences Institute Protein Allergenicity Technical Committee hosted an international workshop November 16-17, 2009, in Paris, France, with over 60 participants from academia, government, and industry to review and discuss the potential utility of "-omics" technologies for assessing the variability in plant gene, protein, and metabolite expression. The goal of the workshop was to illustrate how a plant's constituent makeup and phenotypic processes can be surveyed analytically. Presentations on the "-omics" techniques (i.e., genomics, proteomics, and metabolomics) highlighted the workshop, and summaries of these presentations are published separately in this supplemental issue. This paper summarizes key messages, as well as the consensus points reached, in a roundtable discussion on eight specific questions posed during the final session of the workshop. The workshop established some common, though not unique, challenges for all "-omics" techniques, and include (a) standardization of separation/extraction and analytical techniques; (b) difficulty in associating environmental impacts (e.g., planting, soil texture, location, climate, stress) with potential alterations in plants at genomic, proteomic, and metabolomic levels; (c) many independent analytical measurements, but few replicates/subjects--poorly defined accuracy and precision; and (d) bias--a lack of hypothesis-driven science. Information on natural plant variation is critical in establishing the utility of new technologies due to the variability in specific analytes that may result from genetic differences (crop genotype), different crop management practices (conventional high input, low input, organic), interaction between genotype and environment, and the use of different breeding methods. For example, variations of several classes of proteins were reported among different soybean, rice, or wheat varieties or varieties grown at different locations. Data on the variability of allergenic proteins are important in defining the risk of potential allergenicity. Once established as a standardized assay, survey approaches such as the "-omics" techniques can be considered in a hypothesis-driven analysis of plants, such as determining unintended effects in genetically modified (GM) crops. However, the analysis should include both the GM and control varieties that have the same breeding history and exposure to the same environmental conditions. Importantly, the biological relevance and safety significance of changes in "-omic" data are still unknown. Furthermore, the current compositional assessment for evaluating the substantial equivalence of GM crops is robust, comprehensive, and a good tool for food safety assessments. The overall consensus of the workshop participants was that many "-omics" techniques are extremely useful in the discovery and research phases of biotechnology, and are valuable for hypothesis generation. However, there are many methodological shortcomings identified with "-omics" approaches, a paucity of reference materials, and a lack of focused strategy for their use that currently make them not conducive for the safety assessment of GM crops. Copyright © 2010 Elsevier Inc. All rights reserved.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
Geochemical Constraints for Mercury's PCA-Derived Geochemical Terranes
NASA Astrophysics Data System (ADS)
Stockstill-Cahill, K. R.; Peplowski, P. N.
2018-05-01
PCA provides a robust, analytical means of defining geochemical terranes on Mercury using strictly geochemical inputs. Using the end members derived in this way, we are able to assess the geochemical implications for Mercury.
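A minimal sketch of the terrane-definition pattern, PCA on elemental-ratio inputs followed by clustering in score space. The ratio data here are synthetic placeholders, not MESSENGER measurements:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Hypothetical elemental-ratio maps (pixels x ratios, e.g. Mg/Si, Al/Si, S/Si, Ca/Si),
# drawn as three synthetic compositional units.
rng = np.random.default_rng(3)
ratios = np.vstack([rng.normal(m, 0.05, size=(500, 4))
                    for m in ([0.5, 0.25, 0.10, 0.06],
                              [0.7, 0.20, 0.15, 0.05],
                              [0.4, 0.30, 0.05, 0.08])])

scores = PCA(n_components=2).fit_transform(ratios)   # geochemical end members
terranes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("pixels per terrane:", np.bincount(terranes))
```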
Reddy, B P; Kelly, M P; Thokala, P; Walters, S J; Duenas, A
2014-10-01
The Centre for Public Health (CPH) at the United Kingdom's National Institute for Health and Care Excellence (NICE) is responsible for producing national guidance relating to the promotion of good health and the prevention and treatment of disease. Given the challenges of developing guidance in this area, choosing the most appropriate topics for further study is of fundamental importance. This paper explores the current prioritisation process and describes how the Analytic Hierarchy Process (AHP), a multi-criteria decision analysis (MCDA) technique, might be used to do so. A proposed approach is outlined, which was tested in a proof-of-concept pilot. This consisted of eight participants with experience of related NICE committees building scores for each topic together in a 'decision conference' setting. Criteria were identified and subsequently weighted to indicate the relative importance of each. Participants then collaboratively estimated the performance of each topic on each criterion. Total scores for each topic were calculated, which could be ranked and used as the basis for a better-informed discussion when prioritising topics to recommend to the Minister for future guidance. Sensitivity analyses of the dataset found it to be robust. Choosing the right topics for guidance at the earliest possible time is of fundamental importance to public health guidance, and judgement is likely to play an important part in doing so. MCDA techniques offer a potentially useful approach to structuring the problem in a rational and transparent way. NICE should consider carefully whether such an approach might be worth pursuing in the future.
One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.
Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz
2009-07-15
The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, is proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as in temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated for by the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
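A sketch of the kinetic-calibration arithmetic under the stated isotropy assumption, extended with a rate-constant ratio r so one calibrant can correct every analyte. The variable names and the exact form of the one-calibrant correction are my assumptions, not necessarily the authors' protocol:

```python
import numpy as np

def equilibrium_amount(n_t, q_t, q0, r=1.0):
    """Kinetic-calibration arithmetic (sketch, not the authors' exact protocol).

    Isotropic first-order kinetics give, for the preloaded calibrant,
    q_t/q0 = exp(-a_cal * t), and for an analyte n_t/n_e = 1 - exp(-a_i * t).
    With a lab-determined rate-constant ratio r = a_i / a_cal, the analyte's
    exp(-a_i * t) equals (q_t/q0)**r, so one calibrant corrects every analyte.
    """
    return n_t / (1.0 - (q_t / q0) ** r)

# 40% of the calibrant remains on the fiber; this analyte exchanges 1.3x faster.
print(equilibrium_amount(n_t=12.5, q_t=0.40, q0=1.0, r=1.3))  # equilibrium amount n_e
```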
Simulation of laser generated ultrasound with application to defect detection
NASA Astrophysics Data System (ADS)
Pantano, A.; Cerniglia, D.
2008-06-01
Laser-generated ultrasound holds substantial promise as a tool for defect detection in remote inspection thanks to its ability to produce frequencies in the MHz range, enabling fine spatial resolution of defects. Despite the potential impact of laser-generated ultrasound in many areas of science and industry, robust tools for studying the phenomenon are lacking, limiting the design and optimization of non-destructive testing and evaluation techniques. The propagation of laser-generated ultrasound in complex structures is an intricate phenomenon that is extremely hard to analyze; only simple geometries can be studied analytically. Numerical techniques found in the literature have proved limited in their applicability by the MHz-range frequencies and very short wavelengths involved. The objective of this research is to show that by using an explicit integration rule together with diagonal element mass matrices, instead of the almost universally adopted implicit integration rule, to integrate the equations of motion in a dynamic analysis, it is possible to efficiently and accurately solve ultrasound wave propagation problems with frequencies in the MHz range travelling in relatively large bodies. Presented results on NDE testing of rails demonstrate that the proposed FE technique can provide a valuable tool for studying laser-generated ultrasound propagation.
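The computational point, that an explicit update with a lumped (diagonal) mass matrix needs no linear solve per time step, can be illustrated with a 1D finite-difference analogue of the wave equation. The geometry, material constants, and source below are placeholders, not the paper's rail model.

import numpy as np

# 1D wave equation u_tt = c^2 u_xx, explicit central differences.
# With a lumped (diagonal) mass matrix each step is an element-wise
# update with no linear solve, which is what makes MHz-range
# simulations in large bodies tractable. Parameters are illustrative.
c, L, n = 5000.0, 0.1, 400            # wave speed (m/s), length (m), nodes
dx = L / (n - 1)
dt = 0.9 * dx / c                      # CFL-limited stable time step
u_prev = np.zeros(n)
u = np.zeros(n)
u[n // 2] = 1e-9                       # tiny initial "laser" perturbation

for _ in range(1000):
    u_next = np.empty_like(u)
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + (c * dt / dx) ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_next[0] = u_next[-1] = 0.0       # fixed ends
    u_prev, u = u, u_next

print("pulse peak now at node", int(np.argmax(np.abs(u))))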
Zou, Wei; She, Jianwen; Tolstikov, Vladimir V.
2013-01-01
Current available biomarkers lack sensitivity and/or specificity for early detection of cancer. To address this challenge, a robust and complete workflow for metabolic profiling and data mining is described in detail. Three independent and complementary analytical techniques for metabolic profiling are applied: hydrophilic interaction liquid chromatography (HILIC–LC), reversed-phase liquid chromatography (RP–LC), and gas chromatography (GC). All three techniques are coupled to a mass spectrometer (MS) in the full-scan acquisition mode, and both unsupervised and supervised methods are used for data mining. Univariate and multivariate feature selection are used to determine subsets of potentially discriminative predictors. These predictors are further identified by obtaining accurate masses and isotopic ratios using selected ion monitoring (SIM) and data-dependent MS/MS and/or accurate-mass MSn ion tree scans utilizing high-resolution MS. A list combining all of the identified potential biomarkers generated from the different platforms and algorithms is used for pathway analysis. Such a workflow, combining comprehensive metabolic profiling and advanced data mining techniques, may provide a powerful approach for metabolic pathway analysis and biomarker discovery in cancer research. Two case studies with previously published data are adapted and included in the context to elucidate the application of the workflow. PMID:24958150
Schmidt, Paul; Schmid, Volker J; Gaser, Christian; Buck, Dorothea; Bührlen, Susanne; Förschler, Annette; Mühlau, Mark
2013-01-01
Aiming at iron-related T2-hypointensity, which is related to normal aging and neurodegenerative processes, we here present two practicable approaches, based on Bayesian inference, for preprocessing and statistical analysis of a complex set of structural MRI data. In particular, Markov Chain Monte Carlo methods were used to simulate posterior distributions. First, we rendered a segmentation algorithm that uses outlier detection based on model checking techniques within a Bayesian mixture model. Second, we rendered an analytical tool comprising a Bayesian regression model with smoothness priors (in the form of Gaussian Markov random fields) mitigating the necessity to smooth data prior to statistical analysis. For validation, we used simulated data and MRI data of 27 healthy controls (age: [Formula: see text]; range, [Formula: see text]). We first observed robust segmentation of both simulated T2-hypointensities and gray-matter regions known to be T2-hypointense. Second, simulated data and images of segmented T2-hypointensity were analyzed. We found not only robust identification of simulated effects but also a biologically plausible age-related increase of T2-hypointensity primarily within the dentate nucleus but also within the globus pallidus, substantia nigra, and red nucleus. Our results indicate that fully Bayesian inference can successfully be applied for preprocessing and statistical analysis of structural MRI data.
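As a minimal illustration of "Markov Chain Monte Carlo methods were used to simulate posterior distributions," the sketch below runs a random-walk Metropolis sampler for the posterior of a single Gaussian mean. The paper's mixture and Gaussian-Markov-random-field models are far richer; all numbers here are synthetic.

import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(1.2, 0.8, 50)        # synthetic observations, known noise

def log_post(mu):
    # N(0, 10^2) prior on mu plus Gaussian likelihood with sigma = 0.8.
    return -0.5 * (mu / 10.0) ** 2 - 0.5 * ((data - mu) ** 2 / 0.8 ** 2).sum()

samples, mu = [], 0.0
for _ in range(5000):
    prop = mu + rng.normal(0.0, 0.3)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                      # accept
    samples.append(mu)

print(f"posterior mean ~ {np.mean(samples[1000:]):.2f}")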
Fine, Daniel; Grattoni, Alessandro; Hosali, Sharath; Ziemys, Arturas; De Rosa, Enrica; Gill, Jaskaran; Medema, Ryan; Hudson, Lee; Kojic, Milos; Milosevic, Miljan; Brousseau Iii, Louis; Goodall, Randy; Ferrari, Mauro; Liu, Xuewu
2010-11-21
This manuscript demonstrates a mechanically robust implantable nanofluidic membrane capable of tunable long-term zero-order release of therapeutic agents in ranges relevant for clinical applications. The membrane, with nanochannels as small as 5 nm, allows for the independent control of both dosage and mechanical strength through the integration of high-density short nanochannels parallel to the membrane surface with perpendicular micro- and macrochannels for interfacing with the ambient solutions. These nanofluidic membranes are created using precision silicon fabrication techniques on silicon-on-insulator substrates enabling exquisite control over the monodispersed nanochannel dimensions and surface roughness. Zero-order release of analytes is achieved by exploiting molecule to surface interactions which dominate diffusive transport when fluids are confined to the nanoscale. In this study we investigate the nanofluidic membrane performance using custom diffusion and gas testing apparatuses to quantify molecular release rate and process uniformity as well as mechanical strength using a gas based burst test. The kinetics of the constrained zero-order release is probed with molecules presenting a range of sizes, charge states, and structural conformations. Finally, an optimal ratio of the molecular hydrodynamic diameter to the nanochannel dimension is determined to assure zero-order release for each tested molecule.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step for isolating and concentrating desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and may therefore influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS formats (on- and off-line), sorbents, and experimental protocols; factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.
2018-02-01
While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.
NASA Astrophysics Data System (ADS)
Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.
2015-10-01
We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
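A sketch of the idea, assuming the same additive-white-noise setting: the cosine similarity score is computed against a reference shape, and the detection threshold is taken from the score distribution at the current noise level. Here that distribution is estimated by Monte Carlo rather than by the paper's closed-form expression, and the reference "ECG cycle" is a stand-in sinusoid.

import numpy as np

def cosine_score(reference, observed):
    return np.dot(reference, observed) / (
        np.linalg.norm(reference) * np.linalg.norm(observed))

def adaptive_threshold(reference, noise_sigma, alpha=0.01, trials=20000):
    # Score distribution for "same shape + white noise at this level";
    # its alpha-quantile is the anomaly-detection threshold.
    rng = np.random.default_rng(0)
    scores = [cosine_score(reference,
                           reference + rng.normal(0.0, noise_sigma,
                                                  reference.size))
              for _ in range(trials)]
    return np.quantile(scores, alpha)

t = np.linspace(0, 1, 200)
ref = np.sin(2 * np.pi * t)            # stand-in for a reference ECG cycle
thr = adaptive_threshold(ref, noise_sigma=0.5)
print(f"flag a cycle as anomalous if its score < {thr:.3f}")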
Comparing Four Instructional Techniques for Promoting Robust Knowledge
ERIC Educational Resources Information Center
Richey, J. Elizabeth; Nokes-Malach, Timothy J.
2015-01-01
Robust knowledge serves as a common instructional target in academic settings. Past research identifying characteristics of experts' knowledge across many domains can help clarify the features of robust knowledge as well as ways of assessing it. We review the expertise literature and identify three key features of robust knowledge (deep,…
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, with both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
Experimental and analytical determination of stability parameters for a balloon tethered in a wind
NASA Technical Reports Server (NTRS)
Redd, L. T.; Bennett, R. M.; Bland, S. R.
1973-01-01
Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.
Electrochemical sensors and devices for heavy metals assay in water: the French groups' contribution
Pujol, Luca; Evrard, David; Groenen-Serrano, Karine; Freyssinier, Mathilde; Ruffien-Cizsak, Audrey; Gros, Pierre
2014-01-01
A great challenge in the area of heavy metal trace detection is the development of electrochemical techniques and devices that are user-friendly, robust, and selective, with low detection limits and fast analyses. This review presents the major contribution of the French academic community to the field of electrochemical sensors and electroanalytical methods within the last 20 years. From the well-known polarography to the latest generation of functionalized interfaces, the different strategies dedicated to improving analytical performance are presented: stripping voltammetry, solid mercury-free electrodes, ion-selective sensors, carbon-based materials, chemically modified electrodes, and nano-structured surfaces. The paper particularly emphasizes their advantages and limits with respect to the latest Water Framework Directive devoted to the Environmental Quality Standards for heavy metals. Recent trends in trace metal speciation as well as in automatic "on-line" monitoring devices are also discussed. PMID:24818124
PECAN: library-free peptide detection for data-independent acquisition tandem mass spectrometry data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ting, Ying S.; Egertson, Jarrett D.; Bollinger, James G.
Data-independent acquisition (DIA) is an emerging mass spectrometry (MS)-based technique for unbiased and reproducible measurement of protein mixtures. DIA tandem mass spectrometry spectra are often highly multiplexed, containing product ions from multiple cofragmenting precursors. Detecting peptides directly from DIA data is therefore challenging; most DIA data analyses require spectral libraries. Here we present PECAN (http://pecan.maccosslab.org), a library-free, peptide-centric tool that robustly and accurately detects peptides directly from DIA data. PECAN reports evidence of detection based on product ion scoring, which enables detection of low-abundance analytes with poor precursor ion signal. We demonstrate the chromatographic peak picking accuracy and peptide detection capability of PECAN, and we further validate its detections with data-dependent acquisition and targeted analyses. Lastly, we used PECAN to build a plasma proteome library from DIA data and to query known sequence variants.
Tiscione, Nicholas B; Yeatman, Dustin Tate; Shan, Xiaoqin; Kahl, Joseph H
2013-10-01
Volatiles are frequently abused as inhalants. The methods used for identification are generally nonspecific if analyzed concurrently with ethanol or require an additional analytical procedure that employs mass spectrometry. A previously published technique utilizing a capillary flow technology splitter to simultaneously quantitate and confirm ethyl alcohol by flame ionization and mass spectrometric detection after headspace sampling and gas chromatographic separation was evaluated for the detection of inhalants. Methanol, isopropanol, acetone, acetaldehyde, toluene, methyl ethyl ketone, isoamyl alcohol, isobutyl alcohol, n-butyl alcohol, 1,1-difluoroethane, 1,1,1-trifluoroethane, 1,1,1,2-tetrafluoroethane (Norflurane, HFC-134a), chloroethane, trichlorofluoromethane (Freon®-11), dichlorodifluoromethane (Freon®-12), dichlorofluoromethane (Freon®-21), chlorodifluoromethane (Freon®-22) and 1,2-dichlorotetrafluoroethane (Freon®-114) were validated for qualitative identification by this method. The validation for qualitative identification included evaluation of matrix effects, sensitivity, carryover, specificity, repeatability and ruggedness/robustness.
Drevinskas, Tomas; Mickienė, Rūta; Maruška, Audrius; Stankevičius, Mantas; Tiso, Nicola; Mikašauskaitė, Jurgita; Ragažinskienė, Ona; Levišauskas, Donatas; Bartkuvienė, Violeta; Snieškienė, Vilija; Stankevičienė, Antanina; Polcaro, Chiara; Galli, Emanuela; Donati, Enrica; Tekorius, Tomas; Kornyšova, Olga; Kaškonienė, Vilma
2016-02-01
The miniaturization and optimization of a white rot fungal bioremediation experiment is described in this paper. The optimized procedure allows determination of the degradation kinetics of anthracene. The miniaturized procedure requires only 2.5 ml of culture medium, and the experiment is more precise, robust, and better controlled compared with classical tests in flasks. Using this technique, the different parts, i.e., the culture medium, the fungi, and the cotton seal, can be analyzed. A simple sample preparation speeds up the analytical process. The experiments performed show degradation of anthracene of up to approximately 60% by Irpex lacteus and up to approximately 40% by Pleurotus ostreatus in 25 days. Bioremediation of anthracene by the consortium of I. lacteus and P. ostreatus shows biodegradation of up to approximately 56% in 23 days. At the end of the experiment, the surface tension of the culture medium had decreased compared with the blank, indicating the generation of surfactant compounds.
Nuclear Forensic Science: Analysis of Nuclear Material Out of Regulatory Control
Kristo, Michael J.; Gaffney, Amy M.; Marks, Naomi; ...
2016-05-11
Nuclear forensic science seeks to identify the origin of nuclear materials found outside regulatory control. It is increasingly recognized as an integral part of a robust nuclear security program. Our review highlights areas of active, evolving research in nuclear forensics, with a focus on analytical techniques commonly employed in Earth and planetary sciences. Applications of nuclear forensics to uranium ore concentrates (UOCs) are discussed first. UOCs have become an attractive target for nuclear forensic researchers because of their richness in impurities compared to materials produced later in the fuel cycle. The development of chronometric methods for age dating nuclear materials is then discussed, with an emphasis on improvements in accuracy that have been gained from measurements of multiple radioisotopic systems. Finally, papers that report on casework are reviewed, to provide a window into current scientific practice.
Dutta, Sibasish; Saikia, Gunjan Prasad; Sarma, Dhruva Jyoti; Gupta, Kuldeep; Das, Priyanka; Nath, Pabitra
2017-05-01
In this paper the utilization of a smartphone as a detection platform for colorimetric quantification of biological macromolecules has been demonstrated. Using the V-channel of HSV color space, the quantification of BSA protein, catalase enzyme and carbohydrate (using D-glucose) has been successfully investigated. A custom-designed Android application has been developed for estimating the total concentration of biological macromolecules. The results have been compared with those of a standard spectrophotometer, which is generally used for colorimetric quantification in laboratory settings by measuring absorbance at a specific wavelength, and are found to be similar. The designed sensor is low cost and robust, and we envision that it could promote diverse fields of bio-analytical investigation. [Graphical abstract: Schematic illustration of the smartphone sensing mechanism for colorimetric analysis of biomolecular samples.] © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
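A rough sketch of the V-channel readout follows, under the assumption (hypothetical here, since the paper's calibration is not reproduced) of a linear relation between the V signal and concentration. In HSV, V is simply the maximum of the R, G, B channels.

import numpy as np

def v_channel_mean(rgb_image):
    # Mean V (value) channel of an RGB region: in HSV, V = max(R, G, B).
    return rgb_image.astype(float).max(axis=2).mean() / 255.0

def concentration(v_signal, slope, intercept):
    # Hypothetical linear calibration from V signal to concentration.
    return (v_signal - intercept) / slope

img = np.zeros((10, 10, 3), dtype=np.uint8)
img[:] = (120, 40, 200)                      # a synthetic assay-well colour
v = v_channel_mean(img)                      # 200 / 255 ~ 0.784
print(f"V = {v:.3f}, c = {concentration(v, slope=-0.4, intercept=0.9):.2f} mg/mL")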
Reichert, Janice M; Jacob, Nitya; Amanullah, Ashraf
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme "Delivering cost-effective, robust processes and methods quickly and efficiently." The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development.
Uncertainty Quantification in Aeroelasticity
NASA Astrophysics Data System (ADS)
Beran, Philip; Stanford, Bret; Schrock, Christopher
2017-01-01
Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.
SWCNTs-based nanocomposites as sensitive coatings for advanced fiber optic chemical nanosensors
NASA Astrophysics Data System (ADS)
Consales, M.; Crescitelli, A.; Penza, M.; Aversa, P.; Giordano, M.; Cutolo, A.; Cusano, A.
2008-04-01
In this work, the feasibility of exploiting novel cadmium arachidate (CdA)/single-walled carbon nanotube (SWCNT) composites as sensitive coatings for the development of robust, high-performance optoelectronic chemosensors able to work in liquid environments has been investigated and demonstrated. Nano-composite sensing layers were transferred onto the distal end of standard optical fibers by the Langmuir-Blodgett (LB) technique. Reflectance measurements were carried out to monitor ppm concentrations of chemicals in water through the changes in the optical and geometrical features of the sensing overlay induced by interaction with the analyte molecules. Preliminary experimental results show that such nanoscale coatings integrated with optical fiber technology offer great potential for the room-temperature detection of chemical traces in water and lead to significant improvements over traditional fiber optic sensors based on SWCNT layers.
Hao, Li-Ying; Park, Ju H; Ye, Dan
2017-09-01
In this paper, a new robust fault-tolerant compensation control method for uncertain linear systems over networks is proposed, where only quantized signals are assumed to be available. This approach is based on the integral sliding mode (ISM) method, in which two kinds of integral sliding surfaces are constructed. One is the continuous-state-dependent surface, used for sliding mode stability analysis; the other is the quantization-state-dependent surface, used for ISM controller design. A scheme that combines the adaptive ISM controller and a quantization parameter adjustment strategy is then proposed. By utilizing H∞ control analysis techniques, it is shown that once the system is in the sliding mode, disturbance attenuation and fault tolerance are achieved from the initial time without requiring any fault information. Finally, the effectiveness of the proposed ISM fault-tolerant control schemes against quantization errors is demonstrated in simulation.
Dynamic one-dimensional modeling of secondary settling tanks and system robustness evaluation.
Li, Ben; Stenstrom, M K
2014-01-01
One-dimensional secondary settling tank models are widely used in current engineering practice for design and optimization, and usually can be expressed as a nonlinear hyperbolic or nonlinear strongly degenerate parabolic partial differential equation (PDE). Reliable numerical methods are needed to produce approximate solutions that converge to the exact analytical solutions. In this study, we introduced a reliable numerical technique, the Yee-Roe-Davis (YRD) method as the governing PDE solver, and compared its reliability with the prevalent Stenstrom-Vitasovic-Takács (SVT) method by assessing their simulation results at various operating conditions. The YRD method also produced a similar solution to the previously developed Method G and Enquist-Osher method. The YRD and SVT methods were also used for a time-to-failure evaluation, and the results show that the choice of numerical method can greatly impact the solution. Reliable numerical methods, such as the YRD method, are strongly recommended.
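The YRD scheme itself is not reproduced here, but the kind of problem it solves can be sketched with a generic first-order shock-capturing (local Lax-Friedrichs) update for the batch-settling conservation law C_t + f(C)_x = 0 with a Vesilind-type flux. All parameters and the initial profile are illustrative.

import numpy as np

# Scalar conservation law C_t + f(C)_x = 0 with settling flux
# f(C) = v0 * C * exp(-k * C), advanced by a first-order local
# Lax-Friedrichs (Rusanov) scheme, a simple stand-in for the
# shock-capturing solvers (e.g. YRD) discussed above.
v0, k = 4.0, 0.5                        # illustrative flux parameters
f = lambda C: v0 * C * np.exp(-k * C)
df = lambda C: v0 * np.exp(-k * C) * (1 - k * C)

n = 200
dx = 1.0 / n
C = np.where(np.linspace(0, 1, n) < 0.5, 3.0, 0.0)   # initial profile
dt = 0.4 * dx / np.max(np.abs(df(np.linspace(0, 8, 100))))

for _ in range(300):
    Cl, Cr = C[:-1], C[1:]
    a = np.maximum(np.abs(df(Cl)), np.abs(df(Cr)))    # local wave speeds
    flux = 0.5 * (f(Cl) + f(Cr)) - 0.5 * a * (Cr - Cl)
    C[1:-1] -= dt / dx * (flux[1:] - flux[:-1])       # interior update;
    # boundary cells are left unchanged (crude closed boundaries)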
NASA Astrophysics Data System (ADS)
Banegas, Frederic; Michelucci, Dominique; Roelens, Marc; Jaeger, Marc
1999-05-01
We present a robust method for automatically constructing an ellipsoidal skeleton (e-skeleton) from a set of 3D points taken from NMR or TDM images. To ensure steadiness and accuracy, all points of the objects are taken into account, including the inner ones, which is different from the existing techniques. This skeleton will be essentially useful for object characterization, for comparisons between various measurements and as a basis for deformable models. It also provides good initial guess for surface reconstruction algorithms. On output of the entire process, we obtain an analytical description of the chosen entity, semantically zoomable (local features only or reconstructed surfaces), with any level of detail (LOD) by discretization step control in voxel or polygon format. This capability allows us to handle objects at interactive frame rates once the e-skeleton is computed. Each e-skeleton is stored as a multiscale CSG implicit tree.
Social Context of First Birth Timing in a Rapidly Changing Rural Setting
Ghimire, Dirgha J.
2016-01-01
This article examines the influence of social context on the rate of first birth. Drawing on socialization models, I develop a theoretical framework to explain how different aspects of social context (i.e., neighbors) may affect the rate of first birth. Neighbors, who in the study setting comprise individuals' immediate social context, have an important influence on the rate of first birth. To test my hypotheses, I leverage a setting, measures, and analytical techniques designed to study the impact of macro-level social contexts on micro-level individual behavior. The results show that neighbors' age at first birth, travel to the capital city, and media exposure tend to reduce the first birth rate, while neighbors' non-family work experience increases it. These effects are independent of neighborhood characteristics and are robust against several key variations in model specification. PMID:27886737
Photogrammetry Methodology Development for Gossamer Spacecraft Structures
NASA Technical Reports Server (NTRS)
Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.
2002-01-01
Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
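A standard photogrammetric building block, two-view linear triangulation by direct linear transform (DLT), gives the flavour of "calculating 3D object coordinates from images." This generic sketch is not NASA Langley's specific pipeline; the unit-focal cameras below are idealized.

import numpy as np

def triangulate(P1, P2, x1, x2):
    # Linear two-view triangulation: each image point contributes two rows
    # of a homogeneous system A X = 0; the 3D point is the right singular
    # vector of A with the smallest singular value.
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]

# Two idealized unit-focal cameras, the second shifted 1 m along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, 0.2, 4.0, 1.0])           # ground-truth 3D point
x1 = (P1 @ point)[:2] / (P1 @ point)[2]          # its two projections
x2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate(P1, P2, x1, x2))               # ~ [0.5, 0.2, 4.0]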
Troy, Declan J; Ojha, Kumari Shikha; Kerry, Joseph P; Tiwari, Brijesh K
2016-10-01
New and emerging robust technologies can play an important role in ensuring a more resilient meat value chain and satisfying consumer demands and needs. This paper outlines various novel thermal and non-thermal technologies which have shown potential for meat processing applications. A number of process analytical techniques which have shown potential for rapid, real-time assessment of meat quality are also discussed. The commercial uptake and consumer acceptance of novel technologies in meat processing have been subjects of great interest over the past decade. Consumer focus group studies have shown that consumer expectations and liking for novel technologies, applicable to meat processing applications, vary significantly. This overview also highlights the necessity for meat processors to address consumer risk-benefit perceptions, knowledge and trust in order to be commercially successful in the application of novel technologies within the meat sector. Copyright © 2016. Published by Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Lin, Yi; Bunker, Christopher E.; Fernandos, K. A. Shiral; Connell, John W.
2012-01-01
The impurity-free aqueous dispersions of boron nitride nanosheets (BNNS) allowed the facile preparation of silver (Ag) nanoparticle-decorated BNNS by chemical reduction of an Ag salt with hydrazine in the presence of BNNS. The resultant Ag-BNNS nanohybrids remained dispersed in water, allowing convenient subsequent solution processing. By using substrate transfer techniques, Ag-BNNS nanohybrid thin film coatings on quartz substrates were prepared and evaluated as reusable surface enhanced Raman spectroscopy (SERS) sensors that were robust against repeated solvent washing. In addition, because of the unique thermal oxidation-resistant properties of the BNNS, the sensor devices may be readily recycled by short-duration high temperature air oxidation to remove residual analyte molecules in repeated runs. The limiting factor associated with the thermal oxidation recycling process was the Ostwald ripening effect of Ag nanostructures.
Analytical Chemistry: A Literary Approach.
ERIC Educational Resources Information Center
Lucy, Charles A.
2000-01-01
Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)
Axially and radially viewed inductively coupled plasmas — a critical review
NASA Astrophysics Data System (ADS)
Brenner, I. B.; Zander, A. T.
2000-08-01
The present status of axially viewed inductively coupled plasmas (ICP) is reviewed with special emphasis placed on the analytical performance of currently available systems. Descriptions are given of the various designs of the plasma-spectrometer configuration. Conventional figures of merit, such as limits of detection, background behavior, interferences due to easily ionized elements (EIE), Ca and acids, and the Mg II 280.270 nm/Mg I 285.213 nm intensity ratio, are used to compare the performance of axially viewed and radially viewed ICPs. Various modes of sample introduction, including conventional pneumatic and ultrasonic nebulization (USN), thermospray and a direct injection probe, are described. For axially viewed ICPs, limits of detection (LOD) are improved by factors varying from approximately 2 to 30. Additional improvements by factors of 2-20 can be obtained using USN. The improvement factors generally depend on the energy potentials of the spectral lines and the element. Although limits of detection in the presence of Ca and Na are degraded 10-30-fold relative to an aqueous solution, USN LODs using an axially viewed ICP are improved relative to those obtained using a pneumatic nebulizer for solutions containing Ca and Na. With normal aerosol load and under robust plasma conditions (as evidenced by Mg II/Mg I intensity ratios >8), EIE, Ca and mineral acid induced interferences are relatively small and are similar in axial and conventional radial configurations. However, interferences due to Ca are larger than those caused by Na, owing to the larger amount of energy required to dissociate the matrix. Matrix effects increase considerably when a USN is employed. For robust plasmas, ICP operating conditions and performance for multi-element quantitative analysis do not differ significantly from those of conventional radial configurations. In cases where robustness decreases, matrix interferences should be taken into account when establishing optimum operating conditions. In robust axially viewed ICPs, a single internal standard can compensate for ionic line intensity suppression due to Na. However, owing to the variable influence of Ca on spectral response, more than one internal standard is required to compensate for these matrix effects. In this situation, linear energy potential-interference functions can be used to improve accuracy using spectral lines varying over wide ranges of energy potentials. In axially viewed ICPs, Mg II/Mg I ratios vary widely as a function of applied RF power, aerosol flow rates and load, diameter of the central torch injector, and composition of the aspirated solution. The highest values of 9-13 have been observed for a pure aqueous solution using conventional nebulization and argon carrier flow rates (0.5-0.7 ml min⁻¹) and forward powers of 1.2-1.5 kW. Mg II/Mg I ratios decrease when the RF power decreases, when Na and Ca are added to the plasma, and when the aerosol load is increased. A low value of 2 was obtained when the carrier gas flow rate was high and when the aerosol load was high using a USN. The use of a copper metal skimmer below the analytical observation zone, to isolate the axial channel of the ICP and to deflect the outer cool fringe, results in a 5-20 times improvement of the LODs compared to those obtained using a conventional radially viewed ICP.
A direct He-purged plasma-spectrometer interface for end-on detection of the vacuum UV (VUV) emission from the axial region of an ICP allows the determination of Cl, Br and other analytes in the μg l⁻¹ range. The characteristics of a secondary discharge at the orifice of a Cu cone, when the axial channel of the ICP is extracted into a vacuum chamber, are discussed, and the emission in the Mach disk region extracted from the axial column is surveyed. Several applications and techniques are described: determination of major, minor and trace elements in geological, environmental and biological materials; analysis of brines, nuclear materials and organic solvents and solutions; elemental speciation; and determination of the halides and other analytes with VUV spectral lines using the He-purged direct plasma-spectrometer interface. Direct solids analysis using slurries, laser and spark ablation, and direct solids insertion further extends the scope of axially viewed ICPs.
2014-09-30
… resulted in the identification of metabolite patterns indicative of flight line exposure when compared to non-flight line control subjects … virtually non-invasive sample collection, minimal sample processing, robust and stable analytical platform, with excellent analytical and biological … Regardless of fuel (JP-4 or …
Optomechanical frequency combs
NASA Astrophysics Data System (ADS)
Miri, Mohammad-Ali; D’Aguanno, Giuseppe; Alù, Andrea
2018-04-01
We study the formation of frequency combs in a single-mode optomechanical cavity. The comb is composed of equidistant spectral lines centered at the pump laser frequency and located at different harmonics of the mechanical resonator. We investigate the classical nonlinear dynamics of such a system and find analytically the onset of parametric instability, resulting in the breakdown of a stationary continuous-wave intracavity field into a periodic train of pulses, which in the Fourier domain gives rise to a broadband frequency comb. Different dynamical regimes, including a stationary state, frequency comb generation and chaos, and their dependence on the system parameters, are studied both analytically and numerically. Interestingly, comb generation is found to be more robust in the poor-cavity limit, where optical loss equals or exceeds the mechanical resonance frequency. Our results show that optomechanical resonators open exciting opportunities for microwave photonics as compact and robust sources of frequency combs with megahertz line spacing.
Sokoliess, Torsten; Köller, Gerhard
2005-06-01
A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.
Neural robust stabilization via event-triggering mechanism and adaptive learning technique.
Wang, Ding; Liu, Derong
2018-06-01
The robust control synthesis of continuous-time nonlinear systems with an uncertain term is investigated via an event-triggering mechanism and an adaptive critic learning technique. We focus on combining the event-triggering mechanism with adaptive critic designs to solve the nonlinear robust control problem. This not only makes better use of computation and communication resources, but also approaches controller design from the viewpoint of intelligent optimization. Through theoretical analysis, nonlinear robust stabilization can be achieved by obtaining an event-triggered optimal control law of the nominal system with a newly defined cost function and a certain triggering condition. The adaptive critic technique is employed to facilitate the event-triggered control design, where a neural network is introduced as an approximator in the learning phase. The performance of the event-triggered robust control scheme is validated via simulation studies and comparisons. The present method extends the application domain of both event-triggered control and adaptive critic control to nonlinear systems possessing dynamical uncertainties. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhyaya, Belle R.; Hines, J. Wesley; Lu, Baofu
2005-06-03
The overall purpose of this Nuclear Engineering Education Research (NEER) project was to integrate new, innovative, and existing technologies to develop a fault diagnostics and characterization system for nuclear plant steam generators (SG) and heat exchangers (HX). Issues related to system-level degradation of SG and HX tubing, including tube fouling, performance under reduced heat transfer area, and the damage caused by stress corrosion cracking, are important factors that influence overall plant operation, maintenance, and the economic viability of nuclear power systems. The research at The University of Tennessee focused on the development of techniques for monitoring the process and structural integrity of steam generators and heat exchangers. This report summarizes the research and development activities, results, and accomplishments from June 2001 through September 2004, during which all of the project objectives were accomplished through the following tasks. Development and testing of a high-fidelity nodal model of a U-tube steam generator (UTSG) to simulate the effects of fouling and to generate a database representing normal and degraded process conditions. Application of the group method of data handling (GMDH) method for process variable prediction. Development of a laboratory test module to simulate particulate fouling of HX tubes and its effect on overall thermal resistance. Application of the GMDH technique to predict HX fluid temperatures, for comparison with the calculated thermal resistance. Development of a hybrid modeling technique for process diagnosis and its evaluation using laboratory heat exchanger test data. Development and testing of a sensor suite using piezo-electric devices for monitoring the structural integrity of both flat plates (beams) and tubing; experiments were performed in air, and in water with and without bubbly flow. Development of advanced signal processing methods using wavelet transforms and image processing techniques for isolating flaw types. Development and implementation of a new nonlinear and non-stationary signal processing method, the Hilbert-Huang transform (HHT), for flaw detection and location, a more robust and adaptive approach than the wavelet transform. Implementation of a moving-window technique in the time domain for detecting and quantifying flaw types in tubular structures, along with a window zooming technique for flaw location in tubes. Theoretical study of elastic wave propagation (longitudinal and shear waves) in metallic flat plates and tubing with and without flaws. Simulation of Lamb wave propagation using the finite-element code ABAQUS, which enabled verification of the experimental results. The research tasks included both analytical research and experimental studies. The experimental results helped to enhance the robustness of the fault monitoring methods and to provide a systematic verification of the analytical results. The results of this research were disseminated in scientific meetings. The journal manuscript titled "Structural Integrity Monitoring of Steam Generator Tubing Using Transient Acoustic Signal Analysis" was published in IEEE Transactions on Nuclear Science, Vol. 52, No. 1, February 2005. The new findings of this research have potential applications in aerospace and civil structures. The report contains a complete bibliography that was developed during the course of the project.
Enhancing robustness and immunization in geographical networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang Liang; Department of Physics, Lanzhou University, Lanzhou 730000; Yang Kongqing
2007-03-15
We find that different geographical structures of networks lead to varied percolation thresholds, although these networks may have similar abstract topological structures. Thus, strategies for enhancing robustness and immunization of a geographical network are proposed. Using the generating function formalism, we obtain an explicit form of the percolation threshold q_c for networks containing arbitrary-order cycles. For three-cycles, the dependence of q_c on the clustering coefficients is ascertained. The analysis substantiates the validity of the strategies with analytical evidence.
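For the cycle-free baseline that the paper generalizes, the generating-function calculation reduces to the classic Molloy-Reed/Cohen criterion, which is easy to evaluate from a degree sequence; the cycle corrections derived in the paper are not reproduced here, and the degree distribution below is illustrative.

import numpy as np

def critical_removed_fraction(degree_sequence):
    # Tree-like (cycle-free) estimate of the fraction of randomly removed
    # nodes at which the giant component disappears:
    #   q_c = 1 - 1 / (kappa - 1),  kappa = <k^2> / <k>.
    k = np.asarray(degree_sequence, dtype=float)
    kappa = (k ** 2).mean() / k.mean()
    return 1.0 - 1.0 / (kappa - 1.0)

rng = np.random.default_rng(1)
degrees = rng.poisson(4.0, 100_000)    # ER-like network, mean degree 4
print(f"estimated q_c = {critical_removed_fraction(degrees):.3f}")  # ~0.75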
Kim, SungHwan; Lin, Chien-Wei; Tseng, George C
2016-07-01
Supervised machine learning is widely applied to transcriptomic data to predict disease diagnosis, prognosis or survival. Robust and interpretable classifiers with high accuracy are usually favored for their clinical and translational potential. The top scoring pair (TSP) algorithm is an example that applies a simple rank-based algorithm to identify rank-altered gene pairs for classifier construction. Although many classification methods perform well in cross-validation of a single expression profile, performance usually drops greatly in cross-study validation (i.e., when the prediction model is established in a training study and applied to an independent test study) for all machine learning methods, including TSP. The failure of cross-study validation has largely diminished the potential translational and clinical value of such models. The purpose of this article is to develop a meta-analytic top scoring pair (MetaKTSP) framework that combines multiple transcriptomic studies and generates a robust prediction model applicable to independent test studies. We proposed two frameworks, averaging TSP scores or combining P-values from individual studies, to select the top gene pairs for model construction. We applied the proposed methods to simulated data sets and three large-scale real applications in breast cancer, idiopathic pulmonary fibrosis and pan-cancer methylation. The results showed superior cross-study validation accuracy and biomarker selection for the new meta-analytic framework. In conclusion, combining multiple omics data sets in the public domain increases the robustness and accuracy of the classification model, which will ultimately improve disease understanding and clinical treatment decisions to benefit patients. An R package, MetaKTSP, is available online (http://tsenglab.biostat.pitt.edu/software.htm). Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
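The core TSP score is simple enough to state in code: for every gene pair, compare the probability of a rank flip between the two classes and keep the best pair. MetaKTSP would then average such scores, or combine p-values, across studies, which is omitted here; the data below are synthetic.

import numpy as np
from itertools import combinations

def top_scoring_pair(X, y):
    # For each gene pair (i, j), score = |P(X_i < X_j | class 0)
    # - P(X_i < X_j | class 1)|; the winning pair defines the rule
    # "predict class by whether gene i < gene j". Brute force over
    # pairs, so illustrative rather than efficient.
    best_score, best_pair = 0.0, None
    for i, j in combinations(range(X.shape[1]), 2):
        less = X[:, i] < X[:, j]
        score = abs(less[y == 0].mean() - less[y == 1].mean())
        if score > best_score:
            best_score, best_pair = score, (i, j)
    return best_score, best_pair

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 50))           # 60 samples x 50 "genes"
y = np.repeat([0, 1], 30)
X[y == 1, 3] += 1.5                     # plant a rank flip involving gene 3
score, pair = top_scoring_pair(X, y)
print(f"best pair {pair}, score {score:.2f}")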
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
Bao, Yuanwu; Chen, Ceng; Newburg, David S.
2012-01-01
Defining the biologic roles of human milk oligosaccharides (HMOS) requires an efficient, simple, reliable, and robust analytical method for simultaneous quantification of oligosaccharide profiles from multiple samples. The HMOS fraction of milk is a complex mixture of polar, highly branched, isomeric structures that contain no intrinsic facile chromophore, making their resolution and quantification challenging. A liquid chromatography-mass spectrometry (LC-MS) method was devised to resolve and quantify 11 major neutral oligosaccharides of human milk simultaneously. Crude HMOS fractions are reduced, resolved by porous graphitic carbon HPLC with a water/acetonitrile gradient, detected by mass spectrometric specific ion monitoring, and quantified. The HPLC separates isomers of identical molecular weight, allowing 11 peaks to be fully resolved and quantified by monitoring mass-to-charge (m/z) ratios of the deprotonated negative ions. The standard curve for each of the 11 oligosaccharides is linear from 0.078 or 0.156 to 20 μg/mL (R2 > 0.998). Precision (CV) ranges from 1% to 9%. Accuracy is from 86% to 104%. This analytical technique provides sensitive, precise, accurate quantification for each of the 11 milk oligosaccharides and allows measurement of differences in milk oligosaccharide patterns between individuals and at different stages of lactation. PMID:23068043
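A minimal sketch of the quantification step: fit a calibration curve over the concentration range the abstract quotes, check linearity (R²), and back-calculate accuracy for a QC sample. The peak areas and response factor are invented.

import numpy as np

# Calibration levels (μg/mL) spanning the abstract's linear range.
levels = np.array([0.156, 0.3125, 0.625, 1.25, 2.5, 5.0, 10.0, 20.0])
# Simulated peak areas: linear response plus a little noise.
areas = 1200.0 * levels + np.random.default_rng(2).normal(0, 150, levels.size)

slope, intercept = np.polyfit(levels, areas, 1)
pred = slope * levels + intercept
r2 = 1 - ((areas - pred) ** 2).sum() / ((areas - areas.mean()) ** 2).sum()

qc_area = 1200.0 * 2.5 * 0.95            # a QC sample spiked at 2.5 μg/mL
back_calc = (qc_area - intercept) / slope
print(f"R^2 = {r2:.4f}, accuracy = {100 * back_calc / 2.5:.1f}%")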
Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I
2016-01-01
Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
NASA Astrophysics Data System (ADS)
Márquez, Andrés; Francés, Jorge; Martínez, Francisco J.; Gallego, Sergi; Álvarez, Mariela L.; Calzado, Eva M.; Pascual, Inmaculada; Beléndez, Augusto
2018-03-01
Simplified analytical models with predictive capability enable simpler and faster optimization of the performance in applications of complex photonic devices. We recently demonstrated the most simplified analytical model still showing predictive capability for parallel-aligned liquid crystal on silicon (PA-LCoS) devices, which provides the voltage-dependent retardance for a very wide range of incidence angles and any wavelength in the visible. We further show that the proposed model is not only phenomenological but also physically meaningful, since two of its parameters provide the correct values for important internal properties of these devices related to the birefringence, cell gap, and director profile. Therefore, the proposed model can be used as a means to inspect internal physical properties of the cell. As an innovation, we also show the applicability of the split-field finite-difference time-domain (SF-FDTD) technique for phase-shift and retardance evaluation of PA-LCoS devices under oblique incidence. As a simplified model for PA-LCoS devices, we also consider the exact description of homogeneous birefringent slabs. However, we show that, despite its higher degree of simplification, the proposed model is more robust, providing unambiguous and physically meaningful solutions when fitting its parameters.
McElearney, Kyle; Ali, Amr; Gilbert, Alan; Kshirsagar, Rashmi; Zang, Li
2016-01-01
Chemically defined media have been widely used in the biopharmaceutical industry to enhance cell culture productivities and ensure process robustness. These media, which are quite complex, often contain a mixture of many components such as vitamins, amino acids, metals and other chemicals. Some of these components are known to be sensitive to various stress factors including photodegradation. Previous work has shown that small changes in impurity concentrations induced by these potential stresses can have a large impact on the cell culture process including growth and product quality attributes. Furthermore, it has been shown to be difficult to detect these modifications analytically due to the complexity of the cell culture media and the trace level of the degradant products. Here, we describe work performed to identify the specific chemical(s) in photodegraded medium that affect cell culture performance. First, we developed a model system capable of detecting changes in cell culture performance. Second, we used these data and applied an LC-MS analytical technique to characterize the cell culture media and identify degradant products which affect cell culture performance. Riboflavin limitation and N-formylkynurenine (NFK), a tryptophan oxidation catabolite, were identified as chemicals which results in a reduction in cell culture performance. © 2015 American Institute of Chemical Engineers.
Big genomics and clinical data analytics strategies for precision cancer prognosis.
Ow, Ghim Siong; Kuznetsov, Vladimir A
2016-11-07
The field of personalized and precise medicine in the era of big data analytics is growing rapidly. Previously, we proposed our model of patient classification termed Prognostic Signature Vector Matching (PSVM) and identified a 37-variable signature comprising 36 let-7b associated prognostic significant mRNAs and the age risk factor that stratified large high-grade serous ovarian cancer patient cohorts into three survival-significant risk groups. Here, we investigated the predictive performance of PSVM via optimization of the prognostic variable weights, which represent the relative importance of one prognostic variable over the others. In addition, we compared several multivariate prognostic models based on PSVM with classical machine learning techniques such as K-nearest-neighbor, support vector machine, random forest, neural networks and logistic regression. Our results revealed that negative log-rank p-values provide more robust weight values than other quantities such as hazard ratios, fold change, or a combination of those factors. PSVM, together with the classical machine learning classifiers, was combined in an ensemble (multi-test) voting system, which collectively provides a more precise and reproducible patient stratification. The use of the multi-test system approach, rather than the search for the ideal classification/prediction method, might help to address limitations of the individual classification algorithms in specific situations.
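A minimal sketch of the weighting and multi-test voting ideas described above, assuming synthetic data and hypothetical per-gene log-rank p-values; the actual PSVM algorithm and the published signature are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

# Hypothetical inputs: X (patients x genes), y (risk labels), and
# per-gene log-rank p-values from a prior survival screen.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 36))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=2, size=200) > 0).astype(int)
pvals = rng.uniform(1e-6, 0.05, size=36)

# Weight each variable by its negative log p-value (more significant
# variables get larger weights), as the abstract suggests.
w = -np.log10(pvals)
Xw = X * w                              # weighted feature matrix

# Multi-test (ensemble) voting across heterogeneous classifiers.
models = [KNeighborsClassifier(5), SVC(), RandomForestClassifier(100),
          LogisticRegression(max_iter=1000)]
votes = np.array([m.fit(Xw[:150], y[:150]).predict(Xw[150:]) for m in models])
consensus = (votes.mean(axis=0) > 0.5).astype(int)   # majority vote
print(consensus[:10])
```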
Guale, Fessessework; Shahreza, Shahriar; Walterscheid, Jeffrey P; Chen, Hsin-Hung; Arndt, Crystal; Kelly, Anna T; Mozayani, Ashraf
2013-01-01
Liquid chromatography time-of-flight mass spectrometry (LC-TOF-MS) analysis provides an expansive technique for identifying many known and unknown analytes. This study developed a screening method that utilizes automated solid-phase extraction to purify a wide array of analytes involving stimulants, benzodiazepines, opiates, muscle relaxants, hypnotics, antihistamines, antidepressants and newer synthetic "Spice/K2" cannabinoids and cathinone "bath salt" designer drugs. The extract was applied to LC-TOF-MS analysis, implementing a 13 min chromatography gradient with mobile phases of ammonium formate and methanol using positive mode electrospray. Several common drugs and metabolites can share the same mass and chemical formula among unrelated compounds, but they are structurally different. In this method, the LC-TOF-MS was able to resolve many isobaric compounds by accurate mass correlation within 15 ppm mass units and a narrow retention time interval of less than 10 s of separation. Drug recovery yields varied among spiked compounds, but resulted in overall robust area counts to deliver an average match score of 86 when compared to the retention time and mass of authentic standards. In summary, this method represents a rapid, enhanced screen for blood and urine specimens in postmortem, driving under the influence, and drug facilitated sexual assault forensic toxicology casework.
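The screening logic described above (an accurate-mass tolerance plus a narrow retention-time window) reduces to a simple acceptance test; the sketch below uses illustrative numbers, not the study's reference library.

```python
def matches(obs_mz, ref_mz, obs_rt, ref_rt, ppm_tol=15.0, rt_tol=10.0):
    """Accept a hit only if the mass error (ppm) and retention-time
    difference (s) both fall inside the screening tolerances."""
    ppm_err = abs(obs_mz - ref_mz) / ref_mz * 1e6
    return ppm_err <= ppm_tol and abs(obs_rt - ref_rt) <= rt_tol

# Two isobaric drugs share a nominal mass but are resolved by exact
# mass plus retention time (values illustrative).
print(matches(obs_mz=286.1438, ref_mz=286.1443, obs_rt=312.0, ref_rt=309.5))
```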
Chow, Jason C; Ekholm, Erik; Coleman, Heather
2018-05-24
The purpose of this article is to estimate the overall weighted mean effect of the relation between early language skills and later behavior problems in school-aged children. A systematic literature search yielded 19,790 unduplicated reports, and a structured search strategy and identification procedure yielded 25 unique data sets, with 114 effect sizes for analysis. Eligible reports were then coded, and effect sizes were extracted and synthesized via robust variance estimation and random-effects meta-analytic techniques. The overall correlation between early language and later behavior problems was negative and small (r = -.14, 95% confidence interval [CI] [-.16, -.11]), and controlling for demographic variables did not reduce the magnitude of the inverse relationship between language skill and problem behavior (r = -.16). Moderator analyses identified receptive language, parent-reported behavior measures, gender, and age as significant predictors of the association between language and behavior. This article corroborates the consistent findings of previous meta-analytic and longitudinal studies and further identifies areas, particularly around measurement, for future research. Furthermore, prospective longitudinal evaluations of the relations between language deficits and behavior problems with different types of measures (teacher-/parent-report, direct assessment, classroom observation) are warranted. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
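For readers unfamiliar with how a pooled correlation such as r = -.14 is obtained, the sketch below shows a simplified inverse-variance pooling of Fisher z-transformed correlations; the article's robust variance estimation and random-effects machinery are more involved, and the effect sizes here are invented.

```python
import numpy as np

# Illustrative study-level correlations and sample sizes only; the
# article's actual data and RVE weighting are not reproduced here.
r = np.array([-0.10, -0.18, -0.12, -0.20, -0.09])
n = np.array([150, 90, 300, 60, 220])

z = np.arctanh(r)               # Fisher z-transform of each r
v = 1.0 / (n - 3)               # sampling variance of each z
w = 1.0 / v                     # inverse-variance weights
z_bar = np.sum(w * z) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
r_bar = np.tanh(z_bar)          # back-transformed pooled effect
ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
print(round(r_bar, 3), np.round(ci, 3))
```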
Smart, Kathleen F; Aggio, Raphael B M; Van Houtte, Jeremy R; Villas-Bôas, Silas G
2010-09-01
This protocol describes an analytical platform for the analysis of intra- and extracellular metabolites of microbial cells (yeast, filamentous fungi and bacteria) using gas chromatography-mass spectrometry (GC-MS). The protocol is subdivided into sampling, sample preparation, chemical derivatization of metabolites, GC-MS analysis and data processing and analysis. This protocol uses two robust quenching methods for microbial cultures, the first of which, cold glycerol-saline quenching, causes reduced leakage of intracellular metabolites, thus allowing a more reliable separation of intra- and extracellular metabolites with simultaneous stopping of cell metabolism. The second, fast filtration, is specifically designed for quenching filamentous micro-organisms. These sampling techniques are combined with an easy sample-preparation procedure and a fast chemical derivatization reaction using methyl chloroformate. This reaction takes place at room temperature, in aqueous medium, and is less prone to matrix effect compared with other derivatizations. This protocol takes an average of 10 d to complete and enables the simultaneous analysis of hundreds of metabolites from the central carbon metabolism (amino and nonamino organic acids, phosphorylated organic acids and fatty acid intermediates) using an in-house MS library and a data analysis pipeline consisting of two free software programs (Automated Mass Deconvolution and Identification System (AMDIS) and R).
Gika, Helen G; Theodoridis, Georgios A; Earll, Mark; Wilson, Ian D
2012-09-01
An approach to the determination of day-to-day analytical robustness of LC-MS-based methods for global metabolic profiling using a pooled QC sample is presented for the evaluation of metabonomic/metabolomic data. A set of 60 urine samples were repeatedly analyzed on five different days and the day-to-day reproducibility of the data obtained was determined. Multivariate statistical analysis was performed with the aim of evaluating variability and selected peaks were assessed and validated in terms of retention time stability, mass accuracy and intensity. The methodology enables the repeatability/reproducibility of extended analytical runs in large-scale studies to be determined, allowing the elimination of analytical (as opposed to biological) variability, in order to discover true patterns and correlations within the data. The day-to-day variability of the data revealed by this process suggested that, for this particular system, 3 days continuous operation was possible without the need for maintenance and cleaning. Variation was generally based on signal intensity changes over the 7-day period of the study, and was mainly a result of source contamination.
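A minimal sketch of the kind of pooled-QC check described, assuming a small matrix of peak intensities measured on each day; the 20% RSD acceptance threshold below is a common profiling convention, not a value taken from the paper.

```python
import numpy as np

# peaks: QC peak intensities, shape (n_days, n_peaks); illustrative data.
rng = np.random.default_rng(1)
peaks = rng.normal(loc=1000, scale=50, size=(5, 8))

# Per-peak relative standard deviation across days; peaks failing the
# threshold would be flagged as analytically (not biologically) variable.
rsd = peaks.std(axis=0, ddof=1) / peaks.mean(axis=0) * 100
stable = rsd < 20.0
print(np.round(rsd, 1), stable)
```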
Heterodimer Autorepression Loop: A Robust and Flexible Pulse-Generating Genetic Module
NASA Astrophysics Data System (ADS)
Lannoo, B.; Carlon, E.; Lefranc, M.
2016-07-01
We investigate the dynamics of the heterodimer autorepression loop (HAL), a small genetic module in which a protein A acts as an autorepressor and binds to a second protein B to form an AB dimer. For suitable values of the rate constants, the HAL produces pulses of A alternating with pulses of B. By means of analytical and numerical calculations, we show that the duration of A pulses is extremely robust against variation of the rate constants while the duration of the B pulses can be flexibly adjusted. The HAL is thus a minimal genetic module generating robust pulses with a tunable duration, an interesting property for cellular signaling.
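A minimal ODE sketch of such a module, assuming Hill-type autorepression of A and mass-action dimerization with B; all rate constants are illustrative, and whether sustained alternating pulses appear depends on where they sit in parameter space, as the paper shows.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal HAL sketch: A represses its own synthesis and titrates B into
# an AB dimer. Constants below are illustrative, not the paper's values.
def hal(t, s, beta=50.0, K=1.0, n=4, kb=5.0, kon=10.0, dA=1.0, dB=1.0):
    A, B = s
    dAdt = beta / (1.0 + (A / K) ** n) - kon * A * B - dA * A
    dBdt = kb - kon * A * B - dB * B
    return [dAdt, dBdt]

sol = solve_ivp(hal, (0, 50), [0.0, 0.0], max_step=0.01)
# Inspect sol.y[0] (A) and sol.y[1] (B) for alternating pulse behavior.
print(sol.y[:, -1])
```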
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
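As a concrete instance of the simplest of these techniques, the sketch below estimates a definite integral by Monte Carlo sampling and reports the associated statistical error bar.

```python
import numpy as np

# Monte Carlo estimate of int_0^1 exp(-x^2) dx, the basic building
# block behind the simulation applications described above.
rng = np.random.default_rng(42)
N = 100_000
x = rng.uniform(0.0, 1.0, N)
f = np.exp(-x**2)                      # integrand samples on [0, 1]
est = f.mean()                         # integral estimate
err = f.std(ddof=1) / np.sqrt(N)       # 1-sigma Monte Carlo error
print(f"{est:.5f} +/- {err:.5f}")      # exact value ~0.74682
```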
Thermoelectrically cooled water trap
Micheels, Ronald H [Concord, MA
2006-02-21
A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas-monitoring instrumentation.
Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.
Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs
2018-01-01
While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
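A toy illustration of the underlying idea, using additive secret sharing, one classic building block of secure multi-party computation; real deployments such as the pilots discussed use full MPC protocols, not this simplified sum.

```python
import secrets

P = 2**61 - 1   # prime modulus for additive secret sharing

def share(x, n=3):
    """Split x into n additive shares mod P; any n-1 shares alone
    reveal nothing about x."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

# Two hospitals jointly compute a total caseload without revealing
# their individual counts: each secret-shares its value, the parties
# add shares component-wise, and only the final sum is reconstructed.
a, b = share(120), share(85)
sums = [(sa + sb) % P for sa, sb in zip(a, b)]
print(sum(sums) % P)   # 205, with neither input disclosed
```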
Designing Phononic Crystals with Wide and Robust Band Gaps
NASA Astrophysics Data System (ADS)
Jia, Zian; Chen, Yanyu; Yang, Haoxiang; Wang, Lifeng
2018-04-01
Phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter being dominant), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the randomness effect of each design parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offer great potential in designing flexible and deformable phononic devices.
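A hedged 1D analogue of the Bragg-scattering mechanism mentioned above: the classic bilayer (Rytov) dispersion relation, in which frequencies where |cos(qΛ)| > 1 admit no real Bloch wavenumber and hence lie in a gap. Materials and dimensions are illustrative; the paper's designs add local resonances on top of this mechanism.

```python
import numpy as np

# 1D bilayer Bragg band-gap sketch via the Rytov dispersion relation:
# cos(q*L) = cos(k1 d1) cos(k2 d2)
#            - 0.5 (Z1/Z2 + Z2/Z1) sin(k1 d1) sin(k2 d2)
rho1, c1, d1 = 1200.0, 1800.0, 0.004   # polymer-like layer
rho2, c2, d2 = 7800.0, 5900.0, 0.004   # steel-like layer
Z1, Z2 = rho1 * c1, rho2 * c2          # acoustic impedances

f = np.linspace(1.0, 600e3, 5000)      # frequency sweep (Hz)
k1, k2 = 2 * np.pi * f / c1, 2 * np.pi * f / c2
rhs = (np.cos(k1 * d1) * np.cos(k2 * d2)
       - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))
in_gap = np.abs(rhs) > 1.0             # no real Bloch wavenumber -> gap
# First and last frequencies of the gap region(s) found in the sweep:
print(f[in_gap][[0, -1]] if in_gap.any() else "no gap in range")
```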
Ide, A H; Nogueira, J M F
2018-05-10
The present contribution aims to design new-generation bar adsorptive microextraction (BAμE) devices that provide an innovative and much more user-friendly analytical approach. The novel BAμE devices were lab-made, prepared with smaller dimensions by using flexible nylon-based supports (7.5 × 1.0 mm) coated with convenient sorbents (≈0.5 mg). This advance allows effective microextraction and back-extraction (a single liquid desorption step) stages, as well as enhanced interfacing with the instrumental systems dedicated to routine analysis. To evaluate these improvements, four antidepressant agents (bupropion, citalopram, amitriptyline and trazodone) were used as model compounds in aqueous media combined with liquid chromatography (LC) systems. By using an N-vinylpyrrolidone-based polymer phase, good selectivity and efficiency were obtained. Assays performed on 25 mL spiked aqueous samples yielded average recoveries between 67.8 ± 12.4% (bupropion) and 88.3 ± 12.1% (citalopram) under optimized experimental conditions. The analytical performance also showed convenient precision (RSD < 12%) and detection limits (50 ng L(-1)), as well as linear dynamic ranges (160-2000 ng L(-1)) with suitable determination coefficients (r(2) > 0.9820). The application of the proposed analytical approach to biological fluids showed negligible matrix effects by using the standard addition methodology. From the data obtained, the new-generation BAμE devices presented herein provide an innovative and robust analytical cycle, are simple to prepare, cost-effective, user-friendly and compatible with current LC autosampler systems. Furthermore, the novel devices were designed to be disposable and to require only negligible amounts of organic solvents (100 μL) during back-extraction, in compliance with green analytical chemistry principles. In short, the new-generation BAμE devices proved to be an eco- and user-friendly approach for trace analysis of priority compounds in biological fluids and a versatile alternative to other well-established sorption-based microextraction techniques. Copyright © 2018 Elsevier B.V. All rights reserved.
Surface enhanced Raman spectroscopy based nanoparticle assays for rapid, point-of-care diagnostics
NASA Astrophysics Data System (ADS)
Driscoll, Ashley J.
Nucleotide assays and immunoassays are important tools for disease diagnostics. Many of the current laboratory-based analytical diagnostic techniques require multiple assay steps and long incubation times before results are acquired. In the development of bioassays designed for detecting the emergence and spread of diseases in point-of-care (POC) and remote settings, more rapid and portable analytical methods are necessary. Nanoparticles provide simple and reproducible synthetic methods for the preparation of substrates that can be applied in colloidal assays, providing gains in kinetics due to miniaturization and plasmonic substrates for surface enhanced spectroscopies. Specifically, surface enhanced Raman spectroscopy (SERS) is finding broad application as a signal transduction method in immunological and nucleotide assays due to the production of narrow spectral peaks from the scattering molecules and the potential for simultaneous multiple analyte detection. The application of SERS to a no-wash, magnetic capture assay for the detection of West Nile Virus Envelope and Rift Valley Fever Virus N antigens is described. The platform utilizes colloid-based capture of the target antigen in solution, magnetic collection of the immunocomplexes and acquisition of SERS spectra by a handheld Raman spectrometer. The reagents for a core-shell nanoparticle, SERS-based assay designed for the capture of target microRNA implicated in acute myocardial infarction are also characterized. Several new, small-molecule Raman scatterers are introduced and used to analyze the enhancing properties of the synthesized gold-coated magnetic nanoparticles. Nucleotide and immunoassay platforms have shown improvements in speed and analyte capture through the miniaturization of the capture surface, and particle-based capture systems can provide a route to further surface miniaturization. A reaction-diffusion model of the colloidal assay platform is presented to understand the interplay of system parameters such as particle diameter, initial analyte concentration and dissociation constants. The projected sensitivities over a broad range of assay conditions are examined and the governing regime of particle systems reported. The results provide metrics in the design of more robust analytics that are of particular interest for POC diagnostics.
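A minimal sketch of the binding kinetics that such a reaction-diffusion model builds on, assuming well-mixed bimolecular capture with illustrative rate constants; diffusion limitation and particle geometry, central to the dissertation's model, are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Simple bimolecular capture kinetics for a particle-based assay:
# bound complex B forms from free analyte C and capture sites.
kon, koff = 1e5, 1e-3        # association 1/(M*s), dissociation 1/s
C0, Bmax = 1e-9, 5e-10       # initial analyte and site concentrations (M)

def kinetics(t, y):
    B = y[0]
    C = C0 - B               # mass balance on the analyte
    return [kon * C * (Bmax - B) - koff * B]

sol = solve_ivp(kinetics, (0, 3600), [0.0], max_step=1.0)
print(sol.y[0, -1] / Bmax)   # fractional site occupancy after 1 h
```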
Determination of Perfluorinated Compounds in the Upper Mississippi River Basin
Despite ongoing efforts to develop robust analytical methods for the determination of perfluorinated compounds (PFCs) such as perfluorooctanesulfonate (PFOS) and perfluorooctanoic acid (PFOA) in surface water, comparatively little has been published on method performance, and the...
Accuracy of selected techniques for estimating ice-affected streamflow
Walker, John F.
1991-01-01
This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
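The three performance measures are straightforward to compute once a baseline record exists; a sketch with invented discharge values follows.

```python
import numpy as np

# The three performance measures used above, computed for one
# estimated record against the baseline (illustrative data).
baseline = np.array([12.0, 11.5, 10.8, 10.2, 9.9, 9.7])   # m^3/s
estimate = np.array([12.4, 11.1, 10.9, 9.8, 10.3, 9.2])

errors = estimate - baseline
avg_discharge_err = estimate.mean() - baseline.mean()     # period average
mean_daily_err = errors.mean()
sd_daily_err = errors.std(ddof=1)
print(avg_discharge_err, mean_daily_err, sd_daily_err)
```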
GenInfoGuard--a robust and distortion-free watermarking technique for genetic data.
Iftikhar, Saman; Khan, Sharifullah; Anwar, Zahid; Kamran, Muhammad
2015-01-01
Genetic data, in digital format, is used in different biological phenomena such as DNA translation, mRNA transcription and protein synthesis. The accuracy of these biological phenomena depends on genetic codes and all subsequent processes. To computerize the biological procedures, different domain experts are provided with authorized access to the genetic codes; as a consequence, the ownership protection of such data is inevitable. For this purpose, watermarks serve as the proof of ownership of data. While protecting data, embedded hidden messages (watermarks) influence the genetic data; therefore, the accurate execution of the relevant processes and the overall result becomes questionable. Most of the DNA-based watermarking techniques modify the genetic data and are therefore vulnerable to information loss. Distortion-free techniques make sure that no modifications occur during watermarking; however, they are fragile to malicious attacks and therefore cannot be used for ownership protection (particularly, in the presence of a threat model). Therefore, there is a need for a technique that is robust and also prevents unwanted modifications. In this spirit, a watermarking technique with the aforementioned characteristics is proposed in this paper. The proposed technique makes sure that: (i) the ownership rights are protected by means of a robust watermark; and (ii) the integrity of genetic data is preserved. The proposed technique, GenInfoGuard, ensures its robustness through "watermark encoding" in permuted values, and exhibits high decoding accuracy against various malicious attacks.
Uludağ, Yildiz; Piletsky, Sergey A; Turner, Anthony P F; Cooper, Matthew A
2007-11-01
Biomimetic recognition elements employed for the detection of analytes are commonly based on proteinaceous affibodies, immunoglobulins, single-chain and single-domain antibody fragments or aptamers. The alternative supra-molecular approach using a molecularly imprinted polymer now has proven utility in numerous applications ranging from liquid chromatography to bioassays. Despite inherent advantages compared with biochemical/biological recognition (which include robustness, storage endurance and lower costs) there are few contributions that describe quantitative analytical applications of molecularly imprinted polymers for relevant small molecular mass compounds in real-world samples. There is, however, significant literature describing the use of low-power, portable piezoelectric transducers to detect analytes in environmental monitoring and other application areas. Here we review the combination of molecularly imprinted polymers as recognition elements with piezoelectric biosensors for quantitative detection of small molecules. Analytes are classified by type and sample matrix presentation and various molecularly imprinted polymer synthetic fabrication strategies are also reviewed.
Khani, Rouhollah; Ghasemi, Jahan B; Shemirani, Farzaneh
2014-03-25
A powerful and efficient signal-preprocessing technique that combines local and multiscale properties of the wavelet prism with the global filtering capability of orthogonal signal correction (OSC) is applied for pretreatment of spectroscopic data of parabens as model compounds after their preconcentration by a robust ionic liquid-based dispersive liquid-liquid microextraction method (IL-DLLME). In the proposed technique, a mixture of a water-immiscible ionic liquid, [Hmim][PF6] (as extraction solvent), and a disperser solvent is injected into an aqueous sample solution containing one of the IL's ions, NaPF6, as a common ion source. After preconcentration, the absorbance of the extracted compounds was measured in the wavelength range of 200-700 nm. The wavelet orthogonal signal correction with partial least squares (WOSC-PLS) method was then applied for simultaneous determination of each individual compound. Effective parameters, such as the amount of IL, the volume of the disperser solvent and the amount of NaPF6, were inspected by central composite design to identify the most important parameters and their interactions. The effect of pH on the sensitivity and selectivity was studied according to the net analyte signal (NAS) for each component. Under optimum conditions, enrichment factors of the studied compounds were 75 for methyl paraben (MP) and 71 for propyl paraben (PP). Limits of detection for MP and PP were 4.2 and 4.8 ng mL(-1), respectively. The root mean square errors of prediction for MP and PP were 0.1046 and 0.1275 μg mL(-1), respectively. The practical applicability of the developed method was examined using hygienic, cosmetic, pharmaceutical and natural water samples. Copyright © 2013 Elsevier B.V. All rights reserved.
Balabin, Roman M; Lomakina, Ekaterina I
2011-04-21
In this study, we make a general comparison of the accuracy and robustness of five multivariate calibration models: partial least squares (PLS) regression or projection to latent structures, polynomial partial least squares (Poly-PLS) regression, artificial neural networks (ANNs), and two novel techniques based on support vector machines (SVMs) for multivariate data analysis: support vector regression (SVR) and least-squares support vector machines (LS-SVMs). The comparison is based on fourteen (14) different datasets: seven sets of gasoline data (density, benzene content, and fractional composition/boiling points), two sets of ethanol gasoline fuel data (density and ethanol content), one set of diesel fuel data (total sulfur content), three sets of petroleum (crude oil) macromolecules data (weight percentages of asphaltenes, resins, and paraffins), and one set of petroleum resins data (resins content). Vibrational (near-infrared, NIR) spectroscopic data are used to predict the properties and quality coefficients of gasoline, biofuel/biodiesel, diesel fuel, and other samples of interest. The four systems presented here range greatly in composition, properties, strength of intermolecular interactions (e.g., van der Waals forces, H-bonds), colloid structure, and phase behavior. Due to the high diversity of chemical systems studied, general conclusions about SVM regression methods can be made. We try to answer the following question: to what extent can SVM-based techniques replace ANN-based approaches in real-world (industrial/scientific) applications? The results show that both SVR and LS-SVM methods are comparable to ANNs in accuracy. Due to the much higher robustness of the former, the SVM-based approaches are recommended for practical (industrial) application. This has been shown to be especially true for complicated, highly nonlinear objects.
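A compact version of such a model comparison using scikit-learn, with synthetic stand-in spectra; KernelRidge with an RBF kernel is used as a rough stand-in for LS-SVM, which has no scikit-learn implementation, and none of the paper's fuel datasets are reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.kernel_ridge import KernelRidge       # stand-in for LS-SVM
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for NIR spectra (samples x wavelengths) and a
# property vector to be predicted.
rng = np.random.default_rng(7)
X = rng.normal(size=(120, 200))
y = X[:, :10].sum(axis=1) + 0.1 * rng.normal(size=120)

models = {
    "PLS": PLSRegression(n_components=10),
    "SVR": SVR(C=10.0, epsilon=0.01),
    "LS-SVM-like": KernelRidge(kernel="rbf", alpha=0.1),
    "ANN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
}
for name, m in models.items():
    score = cross_val_score(m, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {score:.3f}")
```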
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
Analytical methods in multivariate highway safety exposure data estimation
DOT National Transportation Integrated Search
1984-01-01
Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting and the expectation maximization algorithm.
Robust numerical electromagnetic eigenfunction expansion algorithms
NASA Astrophysics Data System (ADS)
Sainath, Kamalesh
This thesis summarizes developments in rigorous, full-wave, numerical spectral-domain (integral plane wave eigenfunction expansion [PWE]) evaluation algorithms concerning time-harmonic electromagnetic (EM) fields radiated by generally-oriented and positioned sources within planar and tilted-planar layered media exhibiting general anisotropy, thickness, layer number, and loss characteristics. The work is motivated by the need to accurately and rapidly model EM fields radiated by subsurface geophysical exploration sensors probing layered, conductive media, where complex geophysical and man-made processes can lead to micro-laminate and micro-fractured geophysical formations exhibiting, at the lower (sub-2MHz) frequencies typically employed for deep EM wave penetration through conductive geophysical media, bulk-scale anisotropic (i.e., directional) electrical conductivity characteristics. When the planar-layered approximation (layers of piecewise-constant material variation and transversely-infinite spatial extent) is locally, near the sensor region, considered valid, numerical spectral-domain algorithms are suitable due to their strong low-frequency stability characteristic, and ability to numerically predict time-harmonic EM field propagation in media with response characterized by arbitrarily lossy and (diagonalizable) dense, anisotropic tensors. If certain practical limitations are addressed, PWE can robustly model sensors with general position and orientation that probe generally numerous, anisotropic, lossy, and thick layers. The main thesis contributions, leading to a sensor and geophysical environment-robust numerical modeling algorithm, are as follows: (1) Simple, rapid estimator of the region (within the complex plane) containing poles, branch points, and branch cuts (critical points) (Chapter 2), (2) Sensor and material-adaptive azimuthal coordinate rotation, integration contour deformation, integration domain sub-region partition and sub-region-dependent integration order (Chapter 3), (3) Integration partition-extrapolation-based (Chapter 3) and Gauss-Laguerre Quadrature (GLQ)-based (Chapter 4) evaluations of the deformed, semi-infinite-length integration contour tails, (4) Robust in-situ-based (i.e., at the spectral-domain integrand level) direct/homogeneous-medium field contribution subtraction and analytical curbing of the source current spatial spectrum function's ill behavior (Chapter 5), and (5) Analytical re-casting of the direct-field expressions when the source is embedded within a NBAM, short for non-birefringent anisotropic medium (Chapter 6). The benefits of these contributions are, respectively, (1) Avoiding computationally intensive critical-point location and tracking (computation time savings), (2) Sensor and material-robust curbing of the integrand's oscillatory and slow decay behavior, as well as preventing undesirable critical-point migration within the complex plane (computation speed, precision, and instability-avoidance benefits), (3) sensor and material-robust reduction (or, for GLQ, elimination) of integral truncation error, (4) robustly stable modeling of scattered fields and/or fields radiated from current sources modeled as spatially distributed (10 to 1000-fold compute-speed acceleration also realized for distributed-source computations), and (5) numerically stable modeling of fields radiated from sources within NBAM layers. Having addressed these limitations, are PWE algorithms applicable to modeling EM waves in tilted planar-layered geometries too? 
This question is explored in Chapter 7 using a Transformation Optics-based approach, allowing one to model wave propagation through layered media that (in the sensor's vicinity) possess tilted planar interfaces. The technique leads to spurious wave scattering, however, whose induced computation accuracy degradation requires analysis. Mathematical exhibition of this novel tilted-layer modeling formulation, together with an exhaustive simulation-based study and analysis of its limitations, is Chapter 7's main contribution.
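As a small, self-contained illustration of the Gauss-Laguerre tail evaluation mentioned in contribution (3), the sketch below integrates a toy exponentially damped oscillatory integrand over a semi-infinite interval and compares with the analytic value; the thesis' layered-medium kernels are of course far more involved.

```python
import numpy as np

# Gauss-Laguerre evaluation of a semi-infinite, exponentially decaying
# tail integral, int_0^inf g(t) exp(-a t) dt, of the kind obtained
# after deforming a spectral-domain contour. Toy integrand only.
a = 2.0
g = lambda t: np.cos(3.0 * t)

x, w = np.polynomial.laguerre.laggauss(40)   # nodes/weights for exp(-x)
# Substitute t = x / a:  int g(t) e^{-a t} dt = (1/a) sum_i w_i g(x_i/a)
tail = np.sum(w * g(x / a)) / a
print(tail, a / (a**2 + 9.0))                # analytic: a/(a^2+b^2), b=3
```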
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
Asymptotic Linearity of Optimal Control Modification Adaptive Law with Analytical Stability Margins
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2010-01-01
Optimal control modification has been developed to improve the robustness of model-reference adaptive control. For systems with linear matched uncertainty, the optimal control modification adaptive law can be shown by a singular perturbation argument to possess an outer solution that exhibits a linear asymptotic property. Analytical expressions of phase and time delay margins for the outer solution can be obtained. Using the gradient projection operator, a free design parameter of the adaptive law can be selected to satisfy stability margins.
Card, Alan J; Simsekler, Mecit Can Emre; Clark, Michael; Ward, James R; Clarkson, P John
2014-01-01
Risk assessment is widely used to improve patient safety, but healthcare workers are not trained to design robust solutions to the risks they uncover. This leads to an overreliance on the weakest category of risk control recommendations: administrative controls. Increasing the proportion of non-administrative risk control options (NARCOs) generated would enable (though not ensure) the adoption of more robust solutions. The objective of this study was to experimentally assess a method for generating stronger risk controls: the Generating Options for Active Risk Control (GO-ARC) Technique. Participants generated risk control options in response to two patient safety scenarios. Scenario 1 (baseline): all participants used current practice (unstructured brainstorming). Scenario 2: the control group used current practice, while the intervention group used the GO-ARC Technique. To control for individual differences between participants, analysis focused on the change in the proportion of NARCOs for each group. In the control group, the proportion of NARCOs decreased from 0.18 at baseline to 0.12. In the intervention group, the proportion increased from 0.10 at baseline to 0.29 using the GO-ARC Technique. Results were statistically significant. There was no decrease in the number of administrative controls generated by the intervention group. The GO-ARC Technique appears to lead to more robust risk control options.
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
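A minimal example of one such rolling-window analytic, implemented so that each incoming event updates the statistic in O(1); this is a generic sketch, not code from the reference architecture.

```python
from collections import deque

class RollingStats:
    """Rolling-window mean for a numeric stream, the kind of
    incremental analytic a streaming dashboard recomputes per event."""
    def __init__(self, window: int):
        self.buf = deque(maxlen=window)
        self.total = 0.0

    def push(self, x: float) -> float:
        if len(self.buf) == self.buf.maxlen:
            self.total -= self.buf[0]   # evict oldest from running sum
        self.buf.append(x)
        self.total += x
        return self.total / len(self.buf)

rs = RollingStats(window=3)
print([rs.push(v) for v in [10, 20, 30, 40]])  # [10.0, 15.0, 20.0, 30.0]
```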
Robust Vision-Based Pose Estimation Algorithm for AN Uav with Known Gravity Vector
NASA Astrophysics Data System (ADS)
Kniaz, V. V.
2016-06-01
Accurate estimation of camera external orientation with respect to a known object is one of the central problems in photogrammetry and computer vision. In recent years this problem has been gaining increasing attention in the field of UAV autonomous flight. Such an application requires real-time performance and robustness of the external orientation estimation algorithm. The accuracy of the solution is strongly dependent on the number of reference points visible in the given image. The problem only has an analytical solution if 3 or more reference points are visible. However, in limited visibility conditions it is often necessary to perform external orientation with only 2 visible reference points. In such a case the solution can be found if the gravity vector direction in the camera coordinate system is known. A number of algorithms for external orientation estimation for the case of 2 known reference points and a gravity vector have been developed to date. Most of these algorithms provide an analytical solution in the form of a polynomial equation that is subject to large errors in the case of complex reference point configurations. This paper is focused on the development of a new computationally efficient and robust algorithm for external orientation based on the positions of 2 known reference points and a gravity vector. The algorithm's implementation for guidance of a Parrot AR.Drone 2.0 micro-UAV is discussed. The experimental evaluation of the algorithm proved its computational efficiency and robustness against errors in reference point positions and complex configurations.
Das, Saptarshi; Pan, Indranil; Das, Shantanu; Gupta, Amitava
2012-03-01
Genetic algorithm (GA) has been used in this study for a new approach of suboptimal model reduction in the Nyquist plane and optimal time domain tuning of proportional-integral-derivative (PID) and fractional-order (FO) PI(λ)D(μ) controllers. Simulation studies show that the new Nyquist-based model reduction technique outperforms the conventional H(2)-norm-based reduced parameter modeling technique. With the tuned controller parameters and reduced-order model parameter dataset, optimum tuning rules have been developed with a test-bench of higher-order processes via genetic programming (GP). The GP performs a symbolic regression on the reduced process parameters to evolve a tuning rule which provides the best analytical expression to map the data. The tuning rules are developed for a minimum time domain integral performance index described by a weighted sum of error index and controller effort. From the reported Pareto optimal front of the GP-based optimal rule extraction technique, a trade-off can be made between the complexity of the tuning formulae and the control performance. The efficacy of the single-gene and multi-gene GP-based tuning rules has been compared with the original GA-based control performance for the PID and PI(λ)D(μ) controllers, handling four different classes of representative higher-order processes. These rules are very useful for process control engineers, as they inherit the power of the GA-based tuning methodology, but can be easily calculated without the requirement for running the computationally intensive GA every time. Three-dimensional plots of the required variation in PID/fractional-order PID (FOPID) controller parameters with reduced process parameters have been shown as a guideline for the operator. Parametric robustness of the reported GP-based tuning rules has also been shown with credible simulation examples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
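A heavily simplified stand-in for the GA stage described above: random search over PID gains minimizing integral absolute error on a reduced first-order model. The plant, cost, and search strategy are illustrative; the paper's GA/GP machinery and its extracted tuning rules are not reproduced.

```python
import numpy as np

def iae(gains, tau=5.0, T=30.0, dt=0.01):
    """Integral absolute error of a unity-feedback PID loop around a
    reduced first-order model 1/(tau*s + 1), stepped with Euler."""
    Kp, Ki, Kd = gains
    y, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - y                       # unit setpoint step
        integ += e * dt
        u = Kp * e + Ki * integ + Kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (u - y) / tau           # Euler step of the plant
        if not np.isfinite(y):
            return float("inf")           # reject unstable candidates
        cost += abs(e) * dt
    return cost

# Crude random-search stand-in for the paper's GA: sample gain triples
# and keep the best one found.
rng = np.random.default_rng(3)
cands = rng.uniform([0.1, 0.01, 0.0], [20.0, 5.0, 2.0], size=(300, 3))
best = min(cands, key=iae)
print("Kp, Ki, Kd =", best, "IAE =", round(iae(best), 3))
```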
Emergence of robustness in networks of networks
NASA Astrophysics Data System (ADS)
Roth, Kevin; Morone, Flaviano; Min, Byungjoon; Makse, Hernán A.
2017-06-01
A model of interdependent networks of networks (NONs) was introduced recently [Proc. Natl. Acad. Sci. (USA) 114, 3849 (2017), 10.1073/pnas.1620808114] in the context of brain activation to identify the neural collective influencers in the brain NON. Here we investigate the emergence of robustness in such a model, and we develop an approach to derive an exact expression for the random percolation transition in Erdös-Rényi NONs of this kind. Analytical calculations are in agreement with numerical simulations, and highlight the robustness of the NON against random node failures, which thus presents a new robust universality class of NONs. The key aspect of this robust NON model is that a node can be activated even if it does not belong to the giant mutually connected component, thus allowing the NON to be built from below the percolation threshold, which is not possible in previous models of interdependent networks. Interestingly, the phase diagram of the model unveils particular patterns of interconnectivity for which the NON is most vulnerable, thereby marking the boundary above which the robustness of the system improves with increasing dependency connections.
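For intuition, the sketch below runs the single-layer baseline of such a robustness experiment: random node failures on an Erdős-Rényi graph, tracking the giant connected component. The NON model couples several such layers through dependency links, which this sketch omits.

```python
import numpy as np
import networkx as nx

# Random-failure percolation on one Erdos-Renyi layer: remove a growing
# fraction of nodes and track the giant component fraction.
n = 2000
G = nx.erdos_renyi_graph(n, 4.0 / n, seed=5)        # mean degree ~4
rng = np.random.default_rng(5)

for frac in [0.0, 0.3, 0.6, 0.75, 0.9]:
    H = G.copy()
    kill = rng.choice(list(H.nodes), size=int(frac * n), replace=False)
    H.remove_nodes_from(kill)
    giant = max(nx.connected_components(H), key=len) if len(H) else set()
    print(f"removed {frac:.2f} -> giant fraction {len(giant) / n:.3f}")
```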
Biswas, A K; Tandon, S; Beura, C K
2016-06-01
The aim of this study was to develop a simple, specific and rapid analytical method for accurate identification of calpain and calpastatin from chicken blood and muscle samples. The method is based on a liquid-liquid extraction technique followed by casein zymography detection. The target compounds were extracted from blood and meat samples with tris buffer, and purified and separated by anion exchange chromatography. It was observed that buffer (pH 6.7) containing 50 mM tris-base is an excellent extractant, as the activity of the analytes was maximal for all samples. The concentrations of μ-calpain, m-calpain and calpastatin detected in the extracts of blood, breast and thigh samples were 0.28-0.55, 1.91-2.05 and 1.38-1.52 Unit/g, respectively. For robustness, the analytical method was applied to determine the activity of calpains (μ and m) in eighty postmortem muscle samples. It was observed that μ-calpain activity in breast and thigh muscles declined very rapidly at 48 h and 24 h, respectively, while the activity of m-calpain remained stable. Shear force values also declined with increasing post-mortem aging, indicating adequate tenderness of the breast and thigh muscles. Finally, it is concluded that the method standardized for the detection of calpain and calpastatin has the potential to be applied to identify post-mortem aging of chicken meat samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Peng, R; Sonner, Z; Hauke, A; Wilder, E; Kasting, J; Gaillard, T; Swaille, D; Sherman, F; Mao, X; Hagen, J; Murdock, R; Heikenfeld, J
2016-11-01
Wearable sweat biosensing technology has predominantly relied on techniques which place planar sensors or fluid-capture materials directly onto the skin surface. This 'on-skin' approach can result in sample volumes in the μL regime, due to the roughness of skin and/or the presence of hair. Not only does this increase the required sampling time to tens of minutes or more, but it also increases the time that sweat spends on skin and therefore increases the amount of analyte contamination coming from the skin surface. Reported here is a first demonstration of a new paradigm in sweat sampling and sensing, where sample volumes are reduced from the μL to the nL regime, and where analyte contamination from skin is reduced or even eliminated. A micro-porous membrane is constructed such that it is porous to sweat only. To complete a working device, a cosmetic-grade oil is first placed onto the skin, then the membrane, and finally the sensors. As a result, spreading of sweat is isolated to only the regions above the sweat glands before it reaches the sensors. Best-case sampling intervals are on the order of several minutes, and the majority of hydrophilic (low oil solubility) contaminants from the skin surface are blocked. In vitro validation of this new approach is performed with an improved artificial skin including human hair. In vivo tests show strikingly consistent results, and reveal that the oil/membrane is robust enough to even allow horizontal sliding of a sensor.
Robust control algorithms for Mars aerobraking
NASA Technical Reports Server (NTRS)
Shipley, Buford W., Jr.; Ward, Donald T.
1992-01-01
Four atmospheric guidance concepts have been adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. The first two offer improvements to the Analytic Predictor Corrector (APC) to increase its robustness to density variations. The second two are variations of a new Liapunov tracking exit phase algorithm, developed to guide the vehicle along a reference trajectory. These four new controllers are tested using a six-degree-of-freedom computer simulation to evaluate their robustness. MARSGRAM is used to develop realistic atmospheres for the study. When square wave density pulses perturb the atmosphere, all four controllers are successful. The algorithms are tested against atmospheres where the inbound and outbound density functions are different. Square wave density pulses are again used, but only for the outbound leg of the trajectory. Additionally, sine waves are used to perturb the density function. The new algorithms are found to be more robust than any previously tested, and a Liapunov controller is selected as the most robust control algorithm examined.
Improving Marine Ecosystem Models with Biochemical Tracers
NASA Astrophysics Data System (ADS)
Pethybridge, Heidi R.; Choy, C. Anela; Polovina, Jeffrey J.; Fulton, Elizabeth A.
2018-01-01
Empirical data on food web dynamics and predator-prey interactions underpin ecosystem models, which are increasingly used to support strategic management of marine resources. These data have traditionally derived from stomach content analysis, but new and complementary forms of ecological data are increasingly available from biochemical tracer techniques. Extensive opportunities exist to improve the empirical robustness of ecosystem models through the incorporation of biochemical tracer data and derived indices, an area that is rapidly expanding because of advances in analytical developments and sophisticated statistical techniques. Here, we explore the trophic information required by ecosystem model frameworks (species, individual, and size based) and match them to the most commonly used biochemical tracers (bulk tissue and compound-specific stable isotopes, fatty acids, and trace elements). Key quantitative parameters derived from biochemical tracers include estimates of diet composition, niche width, and trophic position. Biochemical tracers also provide powerful insight into the spatial and temporal variability of food web structure and the characterization of dominant basal and microbial food web groups. A major challenge in incorporating biochemical tracer data into ecosystem models is scale and data type mismatches, which can be overcome with greater knowledge exchange and numerical approaches that transform, integrate, and visualize data.
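One of the simplest tracer-derived quantities mentioned above is trophic position from bulk δ15N; the sketch below uses the canonical ~3.4‰ per-step enrichment and invented isotope values.

```python
# Standard bulk-isotope estimate of trophic position: a consumer is
# placed relative to a baseline organism using an assumed trophic
# discrimination factor of ~3.4 per mil in delta-15N per step.
def trophic_position(d15n_consumer, d15n_base, tp_base=2.0, tdf=3.4):
    return tp_base + (d15n_consumer - d15n_base) / tdf

# Illustrative values: a fish at 14.2 per mil vs a primary-consumer
# baseline at 6.8 per mil gives a trophic position near 4.2.
print(trophic_position(d15n_consumer=14.2, d15n_base=6.8))
```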
Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.
Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn
2017-01-01
The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
Use of the Hugoniot elastic limit in laser shockwave experiments to relate velocity measurements
NASA Astrophysics Data System (ADS)
Smith, James A.; Lacy, Jeffrey M.; Lévesque, Daniel; Monchalin, Jean-Pierre; Lord, Martin
2016-02-01
The US National Nuclear Security Administration (NNSA) has a Global Threat Reduction Initiative (GTRI) with the goal of reducing the worldwide use of high-enriched uranium (HEU). A salient component of that initiative is the conversion of research reactors from HEU to low-enriched uranium (LEU) fuels. An innovative fuel is being developed to replace HEU in high-power research reactors. The new LEU fuel is a monolithic fuel made from a U-Mo alloy foil encapsulated in Al-6061 cladding. In order to support the fuel qualification process, the Laser Shockwave Technique (LST) is being developed to characterize the clad-clad and fuel-clad interface strengths in fresh and irradiated fuel plates. This fuel-cladding interface qualification will ensure the survivability of the fuel plates in the harsh reactor environment even under abnormal operating conditions. One of the concerns of the project is the difficulty of calibrating and standardizing the laser shock technique. An analytical study under development and experimental testing support the hypothesis that the Hugoniot Elastic Limit (HEL) in materials can be a robust and simple benchmark to compare stresses generated by different laser shock systems.
Fontana, Ariel R; Patil, Sangram H; Banerjee, Kaushik; Altamirano, Jorgelina C
2010-04-28
A fast and effective microextraction technique is proposed for preconcentration of 2,4,6-trichloroanisole (2,4,6-TCA) from wine samples prior to gas chromatography-tandem mass spectrometry (GC-MS/MS) analysis. The proposed technique is based on ultrasonication (US) for favoring the emulsification phenomenon during the extraction stage. Several variables influencing the relative response of the target analyte were studied and optimized. Under optimal experimental conditions, 2,4,6-TCA was quantitatively extracted, achieving enhancement factors (EF) ≥ 400 and limits of detection (LODs) of 0.6-0.7 ng L(-1) with relative standard deviations (RSDs) ≤ 11.3% when a 10 ng L(-1) 2,4,6-TCA standard-wine sample blend was analyzed. The calibration graphs for white and red wine were linear within the range of 5-1000 ng L(-1), and determination coefficients (r(2)) were ≥ 0.9995. Validation of the methodology was carried out by the standard addition method at two concentrations (10 and 50 ng L(-1)), achieving recoveries >80% and indicating satisfactory robustness of the method. The methodology was successfully applied for determination of 2,4,6-TCA in different wine samples.
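A sketch of how calibration-based figures of merit like those reported are typically derived, with invented calibration data; the 3.3σ/slope LOD rule and the slope-ratio enrichment factor (against an assumed direct-injection slope) are standard conventions, not the authors' exact procedure.

```python
import numpy as np

# Illustrative 2,4,6-TCA calibration set (not the paper's data).
conc = np.array([5, 50, 100, 500, 1000])          # ng/L
signal = np.array([2.1, 20.5, 41.0, 204.8, 410.1])

slope, intercept = np.polyfit(conc, signal, 1)    # linear calibration
sigma_blank = 0.08                                # assumed blank sd
lod = 3.3 * sigma_blank / slope                   # ICH-style LOD estimate
ef = slope / 0.001                                # vs assumed slope without
                                                  # preconcentration
print(f"LOD ~ {lod:.2f} ng/L, enrichment factor ~ {ef:.0f}")
```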
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Holland, Paul M.; Takeuchi, Norishige
2006-01-01
In situ exploration of the solar system to identify its early chemistry as preserved in icy bodies and to look for compelling evidence of astrobiology will require new technology for chemical analysis. Chemical measurements in space flight environments highlight the need for a high level of positive identification of chemical compounds, since re-measurement by alternative techniques for confirmation will not be feasible. It also may not be possible to anticipate all chemical species that are observed, and important species may be present only at trace levels where they can be masked by complex chemical backgrounds. Up to now, the only techniques providing independent sample identification of GC-separated components across a wide range of chemical species have been Mass Spectrometry (MS) and Ion Mobility Spectrometry (IMS). We describe here the development of a versatile and robust miniature GC detector based on Penning Ionization Electron Spectroscopy (PIES), for use with the miniature GC systems being developed for planetary missions. PIES identifies the sample molecule through spectra related to its ionization potential. The combination of miniature GC technology with the primary identification capabilities of PIES provides an analytical approach ideal for planetary analyses.
Robust model predictive control for multi-step short range spacecraft rendezvous
NASA Astrophysics Data System (ADS)
Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei
2018-07-01
This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the specific short range phase concerned, the chaser is supposed to be initially outside the line-of-sight (LOS) cone. Therefore, the rendezvous process naturally includes two steps: the first step is to transfer the chaser into the LOS cone and the second step is to transfer the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named the Mixed MPC (M-MPC) is proposed, which is the combination of the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than separated artificially, and its computation workload is acceptable for the usually low-power processors onboard spacecraft. Then, considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and attached to the above M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints can be satisfied with a prescribed probability. It improves on the robust technique proposed by Gavilan et al., because it eliminates unnecessary conservativeness by explicitly incorporating known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
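A minimal sketch of the chance-constrained tightening idea, assuming Gaussian navigation error: a linear state constraint is enforced with probability at least 1 - eps by pulling the bound in proportionally to the constraint-direction standard deviation. All values below are illustrative.

```python
import numpy as np
from scipy.stats import norm

# Chance-constrained tightening: enforce c^T x <= b with probability
# >= 1 - eps for Gaussian state x ~ N(x_hat, Sigma) by shrinking the
# deterministic bound applied to the mean x_hat.
def tightened_bound(c, b, Sigma, eps=0.01):
    margin = norm.ppf(1.0 - eps) * np.sqrt(c @ Sigma @ c)
    return b - margin        # then require c^T x_hat <= b - margin

c = np.array([1.0, 0.0, 0.0])          # e.g., a LOS half-plane normal
Sigma = np.diag([0.04, 0.04, 0.01])    # illustrative position covariance
print(tightened_bound(c, b=5.0, Sigma=Sigma))   # 5.0 - 2.326*0.2 ~ 4.53
```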
Passman, Dina B.
2013-01-01
Objective The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods Fusion Analytics serves as a one-stop shop for web-based data visualizations of multiple real-time data sources within ASPR. Its 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending to support leveraging this data for decision support with robust analytics and visualizations. Fusion Analytics provides an opportunity for attendees to see how various types of data are integrated into a single application for population health decision support. It also can provide them with ideas of how they can use their own staff to create analyses and reports that support their public health activities.
Oliva, Alexis; Fariña, José B; Llabrés, Matías
2013-10-15
A simple and reproducible UPLC method was developed and validated for the quantitative analysis of finasteride in low-dose drug products. Method validation demonstrated the reliability and consistency of analytical results. Given the regulatory requirements of pharmaceutical analysis in particular, evaluation of robustness is vital to predict how small variations in operating conditions affect the responses. Response surface methodology was used as an optimization technique to evaluate robustness. For this, a central composite design was implemented around the nominal conditions. Statistical treatment of the responses (retention factor and drug concentrations expressed as percentage of label claim) showed that the methanol content of the mobile phase and the flow rate were the most influential factors. In the optimization process, the compromise decision support problem (cDSP) strategy was used. Construction of the robust domain from response surfaces provided tolerance windows for the factors affecting the effectiveness of the method. The specified limits of the USP uniformity of dosage units assay (98.5-101.5%) and the purely experimental variation based on the repeatability test for center points (repetitions at nominal conditions) were used as criteria to establish the tolerance windows, which allowed the design space (DS) of the analytical method to be defined. The acceptance value (AV) proposed by the USP uniformity assay then depends only on the sampling error. If the variation in the responses corresponds to approximately twice the repeatability standard deviation, individual values for the percentage label claim (%LC) response may lie outside the specified limits; this implies the data are not centered between the specified limits, and this term, in addition to the sampling error, affects the AV value. To avoid this, the limits specified by the uniformity of dosage units assay (i.e., 98.5-101.5%) must be taken into consideration when fixing the tolerance windows for each factor. All these results were verified by Monte Carlo simulation. In conclusion, the level of variability for the different factors must be calculated for each case rather than set arbitrarily whenever a variation higher than the repeatability for center points is found, and the %LC response must lie inside the specified limits (i.e., 98.5-101.5%); if not, the UPLC method must be re-developed. © 2013 Elsevier B.V. All rights reserved.
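To illustrate the experimental design the abstract describes, the sketch below builds a two-factor central composite design (coded levels for methanol content and flow rate are illustrative) and fits a full quadratic response surface by least squares; the simulated response stands in for the measured % label claim:

```python
import numpy as np
from itertools import product

# Two coded factors: x1 = methanol content, x2 = flow rate (illustrative).
alpha = np.sqrt(2.0)
factorial = np.array(list(product([-1.0, 1.0], repeat=2)))
axial = np.array([[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]])
center = np.zeros((5, 2))                  # replicated center points
X = np.vstack([factorial, axial, center])

rng = np.random.default_rng(0)
# Hypothetical response (e.g., % label claim), used only to exercise the fit.
y = 100 - 0.8*X[:, 0] - 0.5*X[:, 1] - 0.3*X[:, 0]*X[:, 1] \
    + rng.normal(0, 0.1, len(X))

# Full quadratic response-surface model: 1, x1, x2, x1*x2, x1^2, x2^2.
D = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]*X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```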
Using dynamic mode decomposition for real-time background/foreground separation in video
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutz, Jose Nathan; Grosek, Jacob; Brunton, Steven
The technique of dynamic mode decomposition (DMD) is disclosed herein for the purpose of robustly separating video frames into background (low-rank) and foreground (sparse) components in real-time. Foreground/background separation is achieved at the computational cost of just one singular value decomposition (SVD) and one linear equation solve, thus producing results orders of magnitude faster than robust principal component analysis (RPCA). Additional techniques, including techniques for analyzing the video for multi-resolution time-scale components, and techniques for reusing computations to allow processing of streaming video in real time, are also described herein.
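The abstract's claim of one SVD plus one linear solve can be seen directly in a minimal (non-streaming) DMD sketch; the rank, tolerance, and toy video below are illustrative, and the patented streaming/multi-resolution refinements are not reproduced:

```python
import numpy as np

def dmd_background(frames, r=10, tol=1e-2):
    """Separate a (pixels x time) video matrix into background and
    foreground with a rank-r exact DMD: modes whose eigenvalues lie
    near 1 (zero temporal frequency) give the low-rank background."""
    X1, X2 = frames[:, :-1], frames[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)   # the single SVD
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    Atilde = (U.conj().T @ X2 @ Vh.conj().T) / s        # projected operator
    lam, W = np.linalg.eig(Atilde)
    Phi = (X2 @ Vh.conj().T / s) @ W                    # DMD modes
    b = np.linalg.lstsq(Phi, frames[:, 0], rcond=None)[0]  # the linear solve
    bg = np.abs(np.log(lam)) < tol                      # near-zero frequency
    t = np.arange(frames.shape[1])
    background = (Phi[:, bg] * b[bg]) @ np.power.outer(lam[bg], t)
    return background.real, frames - background.real

# Toy usage: static gradient background plus a moving bright blob.
T, n = 60, 400
video = np.tile(np.linspace(0, 1, n)[:, None], (1, T))
for k in range(T):
    video[(5 * k) % n, k] += 2.0
bg, fg = dmd_background(video, r=5)
print("foreground energy:", np.linalg.norm(fg))
```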
New software solutions for analytical spectroscopists
NASA Astrophysics Data System (ADS)
Davies, Antony N.
1999-05-01
Analytical spectroscopists must be computer literate to carry out the tasks assigned to them effectively. This has often been resisted within organizations: insufficient funds to equip staff properly, a lack of desire to deliver the essential training, and a basic resistance amongst staff to learning the new techniques required for computer-assisted analysis. In the past these problems were compounded by seriously flawed software being sold for spectroscopic applications. Owing to the limited market for such complex products, the analytical spectroscopist was often faced with buying incomplete and unstable tools if the price was to remain reasonable. Long product lead times meant spectrometer manufacturers often ended up offering systems running under outdated and sometimes obscure operating systems. Not only did this mean special staff training for each instrument, with the knowledge gained on one system not transferable to the neighbouring system, but these spectrometers were often only capable of running in a stand-alone mode, cut off from the rest of the laboratory environment. Fortunately, a number of developments in recent years have substantially changed this depressing picture. A true multi-tasking operating system with a simple graphical user interface, Microsoft Windows NT4, has been widely introduced into the spectroscopic computing environment, providing a desktop operating system that has proved more stable and robust while demanding better programming techniques of software vendors. The opening up of the Internet has provided an easy way to access new tools for data handling and has forced a substantial re-think about results delivery (for example, Chemical MIME types and IUPAC spectroscopic data exchange standards). Improved computing power and cheaper hardware now allow large spectroscopic data sets to be handled without too many problems, including the ability to carry out chemometric operations in minutes rather than hours. Fast networks now enable data analysis of even multi-dimensional spectroscopic data sets remote from the measuring instrument. A strong tendency towards a more unified, substantially more user-friendly graphical user interface allows even inexperienced users to become rapidly acquainted with complex mathematical analyses. Some examples of new spectroscopic software products are given to demonstrate these points and highlight the ease of integration into a modern analytical spectroscopy workplace.
An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Collis, Peter
2012-01-01
Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…
NASA Astrophysics Data System (ADS)
Tinianov, Brandon D.; Nakagawa, Masami; Muñoz, David R.
2006-02-01
This article describes a novel technique for the measurement of the thermal conductivity of low-density (12-18 kg/m3) fiberglass insulation and other related fibrous insulation materials using a noninvasive acoustic apparatus. The experimental method is an extension of earlier acoustic methods based upon the evaluation of the propagation constant from the acoustic pressure transfer function across the test material. To accomplish this, an analytical model is employed that describes the behavior of sound waves at the outlet of a baffled waveguide. The model accounts for the behavior of the mixed impedance interface introduced by the test material. Current results show that the technique is stable for a broad range of absorber thicknesses and densities. Experimental results obtained in the laboratory show excellent correlation between the thermal conductivity and both the real and imaginary components of the propagation constant. Correlation of calculated propagation constant magnitude versus measured thermal conductivity gave an R2 of 0.94 for the bulk density range (12-18 kg/m3) typical of manufactured fiberglass batt materials. As an improvement to earlier acoustic techniques, measurement is now possible in noisy manufacturing environments with a moving test material. Given the promise of such highly correlated measurements in a robust method, the acoustic technique is well suited to continuously measuring the thermal conductivity of the material during its production, replacing current expensive off-line methods. Test cycle time is reduced from hours to seconds.
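Under the simplest plane-wave reading of the method, the transfer function across a sample of thickness d is H = exp(-γd), so γ = -ln(H)/d, and thermal conductivity can be regressed on Re(γ) and Im(γ). A hedged numpy sketch with invented numbers, not the paper's data or its full baffled-waveguide model:

```python
import numpy as np

# Illustrative data: complex pressure transfer function H across a sample
# of thickness d, at one frequency, for several fiberglass batts.
d = 0.05                                   # sample thickness, m (assumed)
H = np.array([0.62 - 0.31j, 0.58 - 0.35j, 0.55 - 0.38j, 0.51 - 0.42j])
k_meas = np.array([0.044, 0.042, 0.040, 0.038])  # conductivity, W/(m K)

# Plane-wave model H = exp(-gamma*d)  =>  gamma = -ln(H)/d.
gamma = -np.log(H) / d
# Linear correlation of conductivity with Re(gamma) and Im(gamma);
# the tiny dataset is only meant to show the shape of the calculation.
D = np.column_stack([np.ones_like(k_meas), gamma.real, gamma.imag])
coef, *_ = np.linalg.lstsq(D, k_meas, rcond=None)
k_pred = D @ coef
r2 = 1 - np.sum((k_meas - k_pred)**2) / np.sum((k_meas - k_meas.mean())**2)
print("R^2 of conductivity correlation:", round(r2, 4))
```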
NASA Astrophysics Data System (ADS)
Azzawi, Wessam Al; Epaarachchi, J. A.; Islam, Mainul; Leng, Jinsong
2017-12-01
Shape memory polymers (SMPs) offer a unique ability to undergo substantial shape deformation and subsequently recover the original shape when exposed to a particular external stimulus. Comparatively low mechanical properties are the major drawback limiting extended use of SMPs in engineering applications; however, the inclusion of reinforcing fibres into SMPs improves mechanical properties significantly while retaining the intrinsic shape memory effect. The implementation of shape memory polymer composites (SMPCs) in any engineering application is a unique task which requires profound materials and design optimization. However, currently available analytical tools have critical limitations for accurate analysis/simulation of SMPC structures, which slows the transformation of breakthrough research outcomes into real-life applications. Many finite element (FE) models have been presented, but the majority of them require complicated user subroutines to integrate with standard FE software packages. Furthermore, those subroutines are problem specific and difficult to use for a wider range of SMPC materials and related structures. This paper presents an FE simulation technique to model the thermomechanical behaviour of SMPCs using the commercial FE software ABAQUS. The proposed technique incorporates the material's time-dependent viscoelastic behaviour. The ability of the proposed technique to predict shape fixity and shape recovery was evaluated against experimental data acquired by bending an SMPC cantilever beam. The excellent correlation between the experimental and FE simulation results confirmed the robustness of the proposed technique.
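The time-dependent viscoelastic input that such FE models require is commonly expressed as a Prony-series relaxation modulus; a generic sketch with illustrative coefficients, not the paper's calibrated material data:

```python
import numpy as np

def relaxation_modulus(t, E_inf, prony):
    """Prony-series relaxation modulus E(t) = E_inf + sum_i E_i*exp(-t/tau_i),
    the standard form used to feed time-dependent viscoelasticity to FE codes."""
    return E_inf + sum(E_i * np.exp(-t / tau_i) for E_i, tau_i in prony)

t = np.logspace(-2, 3, 6)                  # seconds
prony = [(800.0, 1.0), (400.0, 50.0)]      # (E_i [MPa], tau_i [s]), illustrative
print(relaxation_modulus(t, 100.0, prony)) # glassy-to-rubbery decay
```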
NASA Technical Reports Server (NTRS)
Bozeman, Robert E.
1987-01-01
An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
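The rotating spherical exponential density model at the core of the described representation reduces to a one-line formula; the sketch below adds a simple diurnal modulation peaking in the early afternoon, with all constants as illustrative placeholders rather than the paper's calibrated values:

```python
import numpy as np

def density(h_km, lst_hours, rho0=4e-13, h0=400.0, H=60.0, F=0.3):
    """Exponential atmosphere rho0*exp(-(h-h0)/H) with a simple diurnal
    modulation peaking near 14 h local solar time. Constants are
    illustrative placeholders, not the paper's model parameters."""
    diurnal = 1.0 + F * np.cos(2*np.pi*(lst_hours - 14.0)/24.0)
    return rho0 * np.exp(-(h_km - h0)/H) * diurnal   # kg/m^3

print(density(420.0, 14.0), density(420.0, 2.0))     # day vs night bulge
```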
Ren, Yao; Walczyk, Thomas
2014-09-01
Ferritin is a hollow sphere protein composed of 24 subunits that can store up to 4500 iron atoms in its inner cavity. It is mainly found in the liver and spleen but also in serum at trace levels. Serum ferritin is considered the best single indicator for assessing body iron stores short of liver or bone marrow biopsy. However, it is confounded by other disease conditions. Ferritin bound iron (FBI) and ferritin saturation have been suggested as more robust biomarkers. The current techniques for FBI determination are limited by low antibody specificity, low instrument sensitivity and possible analyte losses during sample preparation. The need for a highly sensitive and reliable method is widely recognized. Here we describe a novel technique to detect serum FBI using species-specific isotope dilution mass spectrometry (SS-IDMS). [(57)Fe]-ferritin was produced by biosynthesis and in vitro labeling with the (57)Fe spike in the form of [(57)Fe]-citrate after cell lysis and heat treatment. [(57)Fe]-ferritin for sample spiking was further purified by fast protein liquid chromatography. Serum ferritin and added [(57)Fe]-ferritin were separated from other iron species by ultrafiltration, followed by isotopic analysis of FBI using negative thermal ionization mass spectrometry. Repeatability of our assay is 8% with an absolute detection limit of 18 ng FBI in the sample. Compared to other speciation techniques, SS-IDMS offers maximum control over sample losses and species conversion during analysis. The described technique may therefore serve as a reference technique for clinical applications of FBI as a new biomarker for assessing body iron status.
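The quantification step of SS-IDMS reduces to a two-isotope balance: measuring the 57Fe/56Fe ratio of the spiked serum fixes the sample's iron content relative to the known spike. A sketch with illustrative numbers (apart from the approximate natural ratio, none are from the paper):

```python
# Two-isotope IDMS arithmetic at the heart of SS-IDMS quantification.
# R = n(57Fe)/n(56Fe); all values below are illustrative.
R_sample = 0.0231      # approx. natural 57Fe/56Fe ratio of the serum FBI
R_spike  = 45.0        # highly enriched [57Fe]-ferritin spike (assumed)
R_mix    = 0.50        # measured ratio of the spiked, equilibrated serum

n_spike_56 = 1.0e-9    # mol of 56Fe contributed by the spike (assumed known)
# Isotope balance of the blend, solved for the sample's 56Fe content:
n_sample_56 = n_spike_56 * (R_spike - R_mix) / (R_mix - R_sample)
print(f"ferritin-bound 56Fe in sample: {n_sample_56:.3e} mol")
```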
Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin
2018-01-01
Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethylphosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar and, depending on pH, even ionic functional groups, and lacking a chromophore, it is difficult to analyze with chromatographic techniques; its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also on derivatization reactions. Its ability to form chelates with metal cations is another obstacle to precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.
Connecting Core Percolation and Controllability of Complex Networks
Jia, Tao; Pósfai, Márton
2014-01-01
Core percolation is a fundamental structural transition in complex networks related to a wide range of important problems. Recent advances have provided us with an analytical framework of core percolation in uncorrelated random networks with arbitrary degree distributions. Here we apply these tools to the analysis of network controllability. We confirm analytically that the emergence of the bifurcation in control coincides with the formation of the core, and that the structure of the core determines the control mode of the network. We also derive the analytical expression related to controllability robustness by extending the deduction in core percolation. These findings help us better understand the interesting interplay between the structural and dynamical properties of complex networks. PMID:24946797
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-11
A critical overview on automation of modern liquid phase microextraction (LPME) approaches based on the liquid impregnation of porous sorbents and membranes is presented. It is the continuation of part 1, in which non-dispersive LPME techniques based on the use of the extraction phase (EP) in the form of a drop, plug, film, or microflow were surveyed. Compared to the approaches described in part 1, porous materials provide an improved support for the EP. Simultaneously, they enlarge its contact surface and reduce the risk of loss by incident flow or by components of the surrounding matrix. Solvent-impregnated membranes or hollow fibres are further ideally suited for analyte extraction with simultaneous or subsequent back-extraction. Their use can therefore improve procedure robustness and reproducibility, and it "opens the door" to new operation modes and fields of application. However, additional work and time are required for membrane replacement and renewed impregnation. Automation of porous support-based and membrane-based approaches plays an important role in achieving better reliability, rapidness, and reproducibility compared to manual assays. Automated renewal of the extraction solvent and coupling of sample pretreatment with the detection instrumentation can be named as examples. The different LPME methodologies using impregnated membranes and porous supports for the extraction phase, the different strategies of their automation, and their analytical applications are comprehensively described and discussed in this part. Finally, an outlook on future demands and perspectives of LPME techniques from both parts as a promising area in the field of sample pretreatment is given. Copyright © 2015 Elsevier B.V. All rights reserved.
Robust quantum control using smooth pulses and topological winding
NASA Astrophysics Data System (ADS)
Barnes, Edwin; Wang, Xin
2015-03-01
Perhaps the greatest challenge in achieving control of microscopic quantum systems is the decoherence induced by the environment, a problem which pervades experimental quantum physics and is particularly severe in the context of solid state quantum computing and nanoscale quantum devices because of the inherently strong coupling to the surrounding material. We present an analytical approach to constructing intrinsically robust driving fields which automatically cancel the leading-order noise-induced errors in a qubit's evolution exactly. We address two of the most common types of non-Markovian noise that arise in qubits: slow fluctuations of the qubit energy splitting and fluctuations in the driving field itself. We demonstrate our method by constructing robust quantum gates for several types of spin qubits, including phosphorous donors in silicon and nitrogen-vacancy centers in diamond. Our results constitute an important step toward achieving robust generic control of quantum systems, bringing their novel applications closer to realization. Work supported by LPS-CMTC.
NASA Astrophysics Data System (ADS)
Cabello, Violeta
2017-04-01
This communication will present the advancement of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate (WEFC) Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST, and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations in finding, accessing or estimating them will be presented alongside a reflection on the relation between analytical scales and data availability.
Explicit robust schemes for implementation of general principal value-based constitutive models
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Saleeb, A. F.; Tan, H. Q.; Zhang, Y.
1993-01-01
The issue of developing effective and robust schemes to implement general hyperelastic constitutive models is addressed. To this end, special purpose functions are used to symbolically derive, evaluate, and automatically generate the associated FORTRAN code for the explicit forms of the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid for the entire deformation range. The analytical form of these explicit expressions is given here for the case in which the strain-energy potential is taken as a nonseparable polynomial function of the principal stretches.
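The flavor of the symbolic pipeline (derive stress and tangent stiffness from the potential, then emit FORTRAN) can be reproduced with sympy in place of the paper's special-purpose functions; the polynomial potential below is a simple stand-in, not the paper's form:

```python
import sympy as sp

l1, l2, l3 = sp.symbols('lambda1 lambda2 lambda3', positive=True)
c = sp.symbols('c1:4')                     # material constants c1, c2, c3

# A simple nonseparable polynomial strain-energy potential in the
# principal stretches -- a stand-in for the paper's general form.
W = c[0]*(l1 + l2 + l3 - 3) + c[1]*(l1*l2 + l2*l3 + l3*l1 - 3) \
    + c[2]*(l1*l2*l3 - 1)**2

# Principal nominal stresses P_i = dW/dlambda_i and the symmetric
# tangent stiffness d2W/dlambda_i dlambda_j, derived symbolically.
P = [sp.diff(W, li) for li in (l1, l2, l3)]
K = sp.Matrix(3, 3, lambda i, j: sp.diff(P[i], (l1, l2, l3)[j]))
print(sp.simplify(P[0]))

# Counterpart of the paper's automatic code generation, here to Fortran:
print(sp.fcode(sp.simplify(P[0]), assign_to='P1'))
```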
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Audrey Noreen
Homeland security relies heavily on analytical chemistry to identify suspicious materials and persons. Traditionally this role has focused on attribution, determining the type and origin of an explosive, for example. But as technology advances, analytical chemistry can and will play an important role in the prevention and preemption of terrorist attacks. More sensitive and selective detection techniques can allow suspicious materials and persons to be identified even before a final destructive product is made. The work presented herein focuses on the use of commercial and novel detection techniques for application to the prevention of terrorist activities. Although drugs are not commonly thought of when discussing terrorism, narcoterrorism has become a significant threat in the 21st century. The role of the drug trade in the funding of terrorist groups is prevalent; thus, reducing the trafficking of illegal drugs can play a role in the prevention of terrorism by cutting off much needed funding. To do so, sensitive, specific, and robust analytical equipment is needed to quickly identify a suspected drug sample no matter what matrix it is in. Single Particle Aerosol Mass Spectrometry (SPAMS) is a novel technique that has previously been applied to biological and chemical detection. The current work applies SPAMS to drug analysis, identifying the active ingredients in single component, multi-component, and multi-tablet drug samples in a relatively non-destructive manner. In order to do so, a sampling apparatus was created to allow particle generation from drug tablets with on-line introduction to the SPAMS instrument. Rules trees were developed to automate the identification of drug samples on a single particle basis. A novel analytical scheme was also developed to identify suspect individuals based on chemical signatures in human breath. Human breath was sampled using an RTube™ and the trace volatile organic compounds (VOCs) were preconcentrated using solid phase microextraction (SPME) and identified using gas chromatography - mass spectrometry (GC-MS). Modifications to the sampling apparatus allowed for increased VOC collection efficiency, and reduced the time of sampling and analysis by over 25%. The VOCs are present in breath due to either endogenous production, or exposure to an external source through absorption, inhalation, or ingestion. Detection of these exogenous chemicals can provide information on the prior location and activities of the subject. Breath samples collected before and after exposure in a hardware store and nail salon were analyzed to investigate the prior location of a subject; breath samples collected before and after oral exposure to terpenes and terpenoid compounds, pseudoephedrine, and inhalation exposure to hexamine and other explosive related compounds were analyzed to investigate the prior activity of a subject. The elimination of such compounds from the body was also monitored. In application, this technique may provide an early warning system to identify persons of interest in the prevention and preemption stages of homeland security.
Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert
2009-01-01
Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...
Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A
2007-01-01
The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
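GTM is not part of the common scientific Python stack, so the sketch below uses PCA as a plainly labeled stand-in to show the screening pattern: project 14-analyte patient records to two dimensions and flag records far from the bulk as candidate anomalies (data simulated):

```python
import numpy as np
from sklearn.decomposition import PCA

# Project 14-analyte patient records to 2-D and flag outlying records.
# PCA is a readily available stand-in for the generative topographic
# mapping (GTM) the study used; the records here are simulated.
rng = np.random.default_rng(0)
records = rng.normal(size=(13670, 14))
records[:25] += rng.normal(5, 1, size=(25, 14))   # planted "errors"

xy = PCA(n_components=2).fit_transform(records)
dist = np.linalg.norm(xy - xy.mean(axis=0), axis=1)
flagged = np.argsort(dist)[-25:]                  # most anomalous records
print("flagged record indices:", np.sort(flagged)[:10], "...")
```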
Insights into the Streaming Instability in Protoplanetary Disks
NASA Astrophysics Data System (ADS)
Youdin, Andrew N.; Lin, Min-Kai; Li, Rixin
2017-10-01
The streaming instability is a leading mechanism for concentrating particles in protoplanetary disks, thereby triggering planetesimal formation. I will present recent analytical and numerical work on the origin of the streaming instability and its robustness. Our recent analytic work examines the origin of, and relationship between, a variety of drag-induced instabilities, including the streaming instability as well as secular gravitational instability, a drag instability driven by self-gravity. We show that drag instabilities are driven by a specific phase relationship between gas pressure and particle concentration, which powers the instability via pressure work. This mechanism is analogous to pulsating instabilities in stars and differs qualitatively from other leading particle concentration mechanisms in pressure bumps and vortices. Our recent numerical work investigates the numerical robustness of non-linear particle clumping by the streaming instability, especially with regard to the location and boundary condition of vertical boundaries. We find that particle clumping is robust to these choices in boxes that are not too short. However, hydrodynamic activity away from the particle-dominated midplane is significantly affected by vertical boundary conditions. This activity affects the observationally significant lofting of small dust grains. We thus emphasize the need for larger scale simulations which connect disk surface layers, including outflowing winds, to the planet-forming midplane.
Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.
ERIC Educational Resources Information Center
Hercules, David M.; Hercules, Shirley H.
1984-01-01
Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H; Liang, X; Kalbasi, A
2014-06-01
Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-12-15
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
The availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive but also time consuming, require maintenance, and need replacement of damaged essential parts, all of which are serious concerns. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every location. Therefore, a technique based on pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to the immobilization technique, which is simple, user friendly, highly effective, inexpensive, and time efficient; a small vial (10 g - 20 g) is easy to carry to the experimental field/site, as has been demonstrated.
Ghani, Milad; Saraji, Mohammad; Maya, Fernando; Cerdà, Víctor
2016-05-06
Herein we present a simple, rapid and low cost strategy for the preparation of robust stir bar coatings based on the combination of montmorillonite with epoxy resin. The composite stir bar was implemented in a novel automated multisyringe stir bar sorptive extraction system (MS-SBSE), and applied to the extraction of four chlorophenols (4-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol and pentachlorophenol) as model compounds, followed by high performance liquid chromatography-diode array detection. The different experimental parameters of the MS-SBSE, such as sample volume, selection of the desorption solvent, desorption volume, desorption time, sample solution pH, salt effect and extraction time, were studied. Under the optimum conditions, the detection limits were between 0.02 and 0.34 μg L(-1). Relative standard deviations (RSD) of the method for the analytes at the 10 μg L(-1) concentration level ranged from 3.5% to 4.1% (as intra-day RSD) and from 3.9% to 4.3% (as inter-day RSD at the 50 μg L(-1) concentration level). Batch-to-batch reproducibility for three different stir bars was 4.6-5.1%. The enrichment factors were between 30 and 49. In order to investigate the capability of the developed technique for real sample analysis, well water, wastewater and leachates from a solid waste treatment plant were satisfactorily analyzed. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Michalik-Onichimowska, Aleksandra; Kern, Simon; Riedel, Jens; Panne, Ulrich; King, Rudibert; Maiwald, Michael
2017-04-01
Driven mostly by the search for chemical syntheses under biocompatible conditions, so-called "click" chemistry rapidly became a growing field of research. The resulting simple one-pot reactions are so far only scarcely accompanied by adequate optimization via comparably straightforward and robust analysis techniques possessing short set-up times. Here, we report on a fast and reliable calibration-free online NMR monitoring approach for technical mixtures. It combines a versatile fluidic system, continuous-flow measurement of 1H spectra with a time interval of 20 s per spectrum, and a robust, fully automated algorithm to interpret the obtained data. As a proof of concept, the thiol-ene coupling between N-Boc cysteine methyl ester and allyl alcohol was conducted in a variety of non-deuterated solvents while its time-resolved behaviour was characterized with step tracer experiments. Overlapping signals in online spectra during thiol-ene coupling could be deconvoluted with a spectral model using indirect hard modeling and were subsequently converted to either molar ratios (using a calibration-free approach) or absolute concentrations (using 1-point calibration). For various solvents the kinetic constant k for the pseudo-first-order reaction was estimated to be 3.9 h-1 at 25 °C. The obtained results were compared with direct integration of non-overlapping signals and showed good agreement with the implemented mass balance.
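The reported rate constant corresponds to a standard pseudo-first-order fit, which is easy to reproduce on a simulated concentration-time series sampled at the paper's 20 s spectral interval; all numbers below are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

# Pseudo-first-order decay c(t) = c0*exp(-k*t) fitted to a simulated
# concentration-time series, one point per 20 s spectrum over 1 h.
t = np.arange(0, 1.0, 20/3600)                  # time, h
k_true, c0 = 3.9, 0.50                          # h^-1, mol L^-1 (illustrative)
rng = np.random.default_rng(3)
c = c0*np.exp(-k_true*t) + rng.normal(0, 0.005, t.size)

model = lambda t, c0, k: c0*np.exp(-k*t)
popt, pcov = curve_fit(model, t, c, p0=(0.4, 1.0))
print(f"estimated k = {popt[1]:.2f} h^-1 (true {k_true})")
```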
NASA Astrophysics Data System (ADS)
Singh, Jaskaran; Darpe, A. K.; Singh, S. P.
2018-02-01
Local damage in rolling element bearings usually generates periodic impulses in vibration signals. The severity, repetition frequency and the fault-excited resonance zone of these impulses are the key indicators for diagnosing bearing faults. In this paper, a methodology based on the overcomplete rational dilation wavelet transform (ORDWT) is proposed, as it enjoys good shift invariance. ORDWT offers flexibility in partitioning the frequency spectrum to generate a number of subbands (filters) with diverse bandwidths. The selection of the optimal filter that perfectly overlaps with the bearing fault excited resonance zone is based on the maximization of a proposed impulse detection measure, "temporal energy operated auto correlated kurtosis". The proposed indicator is robust and consistent in evaluating the impulsiveness of fault signals in the presence of interfering vibration such as heavy background noise or sporadic shocks unrelated to the fault or normal operation. The structure of the proposed indicator enables it to be sensitive to fault severity. For enhanced fault classification, an autocorrelation of the energy time series of the signal filtered through the optimal subband is proposed. The application of the proposed methodology is validated on simulated and experimental data. The study shows that the performance of the proposed technique is more robust and consistent in comparison to the original fast kurtogram and wavelet kurtogram.
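A simplified stand-in for the proposed indicator conveys the pipeline: filter a subband (a brick-wall FFT filter here instead of ORDWT), form the envelope energy series, autocorrelate it, and take the kurtosis; the subband containing the fault-excited resonance should maximize the measure. All signal parameters below are invented:

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import kurtosis

def band_indicator(x, fs, f_lo, f_hi):
    """Kurtosis of the autocorrelated envelope energy of x filtered to
    [f_lo, f_hi]; a simplified stand-in for the paper's indicator
    computed over ORDWT subbands."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1/fs)
    X[(f < f_lo) | (f > f_hi)] = 0            # brick-wall subband filter
    xb = np.fft.irfft(X, n=x.size)
    e = np.abs(hilbert(xb))**2                # envelope energy series
    e = e - e.mean()
    ac = np.correlate(e, e, mode='full')[e.size - 1:]
    return kurtosis(ac)

# Toy fault signal: 120 Hz impulse train ringing a 3 kHz resonance + noise.
fs, T = 20000, 1.0
t = np.arange(int(fs*T))/fs
rng = np.random.default_rng(7)
sig = rng.normal(0, 1, t.size)
for t0 in np.arange(0, T, 1/120):
    idx = t >= t0
    sig[idx] += 5*np.exp(-2000*(t[idx]-t0))*np.sin(2*np.pi*3000*(t[idx]-t0))

for band in [(500, 1500), (2500, 3500), (5000, 6000)]:
    print(band, round(band_indicator(sig, fs, *band), 1))
```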
Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration
NASA Technical Reports Server (NTRS)
Merritt, D. A.; Brand, W. A.; Hayes, J. M.
1994-01-01
In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques and the results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source; or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (sigma = 0.00006 at.% 13C or 0.06% for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ sigma ≤ 0.0002 at.% or 0.1 ≤ sigma ≤ 0.2%).
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance the inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
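As an illustration of the first technique, the sketch below fits a random-intercept multilevel model to simulated AA-style data (repeated prompts nested within persons) with statsmodels; variable names and effect sizes are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated ambulatory-assessment data: 40 people x 50 prompts, with a
# person-specific intercept and a within-person effect of negative
# affect on craving. Names and effect sizes are illustrative.
rng = np.random.default_rng(42)
n_people, n_obs = 40, 50
person = np.repeat(np.arange(n_people), n_obs)
affect = rng.normal(size=n_people*n_obs)
intercepts = rng.normal(0, 1, n_people)
craving = 2.0 + intercepts[person] + 0.6*affect \
    + rng.normal(0, 1, person.size)

df = pd.DataFrame({'person': person, 'affect': affect, 'craving': craving})
# Random-intercept multilevel model: craving ~ affect, grouped by person.
fit = smf.mixedlm('craving ~ affect', df, groups=df['person']).fit()
print(fit.params[['Intercept', 'affect']])
```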
Synthesis Methods for Robust Passification and Control
NASA Technical Reports Server (NTRS)
Kelkar, Atul G.; Joshi, Suresh M. (Technical Monitor)
2000-01-01
The research effort under this cooperative agreement has been essentially a continuation of the work from previous grants. The ongoing work has primarily focused on developing passivity-based control techniques for Linear Time-Invariant (LTI) systems. During this period, significant progress has been made in the area of passivity-based control of LTI systems, and some preliminary results have also been obtained for nonlinear systems. The prior work addressed optimal control design for inherently passive as well as non-passive linear systems. For exploiting the robustness characteristics of passivity-based controllers, a passification methodology was developed for LTI systems that are not inherently passive. Various methods of passification were first proposed and then further developed. The robustness of passification was addressed for multi-input multi-output (MIMO) systems for certain classes of uncertainties using frequency-domain methods. For MIMO systems, a state-space approach using a Linear Matrix Inequality (LMI)-based formulation was presented for passification of non-passive LTI systems. An LMI-based robust passification technique was presented for systems with redundant actuators and sensors. The redundancy in actuators and sensors was used effectively for robust passification using the LMI formulation. The passification was designed to be robust to interval-type uncertainties in system parameters. The passification techniques were used to design a robust controller for the Benchmark Active Control Technology wing under parametric uncertainties. The results on passive nonlinear systems, however, are very limited to date. Our recent work in this area was presented, wherein some stability results were obtained for passive nonlinear systems that are affine in control.
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. This difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error that occur during chemical analysis with the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity owing to their interesting properties. The advantages and disadvantages of these techniques are presented, as are the trends in the development of direct analysis using them.
Per- and Polyfluoroalkyl Substances (PFAS): Sampling ...
Per- and polyfluoroalkyl substances (PFAS) are a large group of manufactured compounds used in a variety of industries, such as aerospace, automotive, textiles, and electronics, and are used in some food packaging and firefighting materials. For example, they may be used to make products more resistant to stains, grease and water. In the environment, some PFAS break down very slowly, if at all, allowing bioaccumulation (concentration) to occur in humans and wildlife. Some have been found to be toxic to laboratory animals, producing reproductive, developmental, and systemic effects in laboratory tests. The U.S. Environmental Protection Agency's (EPA) methods for analyzing PFAS in environmental media are in various stages of development. This technical brief summarizes the work being done to develop robust analytical methods for groundwater, surface water, wastewater, and solids, including soils, sediments, and biosolids.
Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène
2015-05-01
The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L(-1)) compatible with toxicity threshold values. The solid phase extraction (SPE) step, optimized with respect to the nature of the extraction phase, sampling volume, and elution solvent, was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces high variability during the SPE step, therefore requiring the use of (13)C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty about 25% (k = 2) at the limit of quantification level). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.
A robust molecular probe for Ångstrom-scale analytics in liquids
Nirmalraj, Peter; Thompson, Damien; Dimitrakopoulos, Christos; Gotsmann, Bernd; Dumcenco, Dumitru; Kis, Andras; Riel, Heike
2016-01-01
Traditionally, nanomaterial profiling using a single-molecule-terminated scanning probe is performed at the vacuum–solid interface, often at a few kelvin, and is not a notion immediately associated with the liquid–solid interface at room temperature. Here, using a scanning tunnelling probe functionalized with a single C60 molecule stabilized in a high-density liquid, we resolve low-dimensional surface defects, atomic interfaces and capture Ångstrom-level bond-length variations in single-layer graphene and MoS2. Atom-by-atom controllable imaging contrast is demonstrated at room temperature and the electronic structure of the C60–metal probe complex within the encompassing liquid molecules is clarified using density functional theory. Our findings demonstrate that operating a robust single-molecular probe is not restricted to ultra-high vacuum and cryogenic settings. Hence the scope of high-precision analytics can be extended towards resolving sub-molecular features of organic elements and gauging the ambient compatibility of emerging layered materials with atomic-scale sensitivity under experimentally less stringent conditions. PMID:27516157
Orion Orbit Control Design and Analysis
NASA Technical Reports Server (NTRS)
Jackson, Mark; Gonzalez, Rodolfo; Sims, Christopher
2007-01-01
The analysis of candidate thruster configurations for the Crew Exploration Vehicle (CEV) is presented. Six candidate configurations were considered for the prime contractor baseline design. The analysis included analytical assessments of control authority, control precision, efficiency and robustness, as well as simulation assessments of control performance. The principles used in the analytic assessments of controllability, robustness and fuel performance are covered and results provided for the configurations assessed. Simulation analysis was conducted using a pulse-width-modulated, 6-DOF reaction control system law with a simplex-based thruster selection algorithm. Control laws were automatically derived from hardware configuration parameters including thruster locations, directions, magnitudes and specific impulse, as well as vehicle mass properties. This parameterized controller allowed rapid assessment of multiple candidate layouts. Simulation results are presented for final phase rendezvous and docking, as well as low lunar orbit attitude hold. Finally, on-going analysis to consider alternate Service Module designs and to assess the pilot-ability of the baseline design is discussed to provide a status of orbit control design work to date.
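The simplex-based thruster selection the abstract mentions is, at its core, a small linear program: choose nonnegative thruster activities that realize a commanded force/torque with minimum propellant. A sketch with an invented planar six-thruster geometry, not the CEV layout:

```python
import numpy as np
from scipy.optimize import linprog

# Minimal simplex-style thruster selection: find nonnegative thruster
# activities u realizing a commanded planar force/torque ("wrench") with
# minimum total propellant. Geometry is invented for illustration.
pos  = np.array([[1, 0], [1, 0], [-1, 0], [-1, 0], [0, 0], [0, 0]], float)
dirn = np.array([[0, 1], [0, -1], [0, 1], [0, -1], [1, 0], [-1, 0]], float)

B = np.zeros((3, 6))                     # rows: Fx, Fy, Tz
B[0:2, :] = dirn.T
B[2, :] = pos[:, 0]*dirn[:, 1] - pos[:, 1]*dirn[:, 0]   # z-torque, r x f

wrench_cmd = np.array([0.3, 0.0, 0.2])   # commanded [Fx, Fy, Tz]
res = linprog(c=np.ones(6), A_eq=B, b_eq=wrench_cmd,
              bounds=[(0, 1)]*6, method='highs')
print("thruster activities:", np.round(res.x, 3))   # fuel-minimal selection
```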
Werling, Donna M; Brand, Harrison; An, Joon-Yong; Stone, Matthew R; Zhu, Lingxue; Glessner, Joseph T; Collins, Ryan L; Dong, Shan; Layer, Ryan M; Markenscoff-Papadimitriou, Eirene; Farrell, Andrew; Schwartz, Grace B; Wang, Harold Z; Currall, Benjamin B; Zhao, Xuefang; Dea, Jeanselle; Duhn, Clif; Erdman, Carolyn A; Gilson, Michael C; Yadav, Rachita; Handsaker, Robert E; Kashin, Seva; Klei, Lambertus; Mandell, Jeffrey D; Nowakowski, Tomasz J; Liu, Yuwen; Pochareddy, Sirisha; Smith, Louw; Walker, Michael F; Waterman, Matthew J; He, Xin; Kriegstein, Arnold R; Rubenstein, John L; Sestan, Nenad; McCarroll, Steven A; Neale, Benjamin M; Coon, Hilary; Willsey, A Jeremy; Buxbaum, Joseph D; Daly, Mark J; State, Matthew W; Quinlan, Aaron R; Marth, Gabor T; Roeder, Kathryn; Devlin, Bernie; Talkowski, Michael E; Sanders, Stephan J
2018-05-01
Genomic association studies of common or rare protein-coding variation have established robust statistical approaches to account for multiple testing. Here we present a comparable framework to evaluate rare and de novo noncoding single-nucleotide variants, insertions/deletions, and all classes of structural variation from whole-genome sequencing (WGS). Integrating genomic annotations at the level of nucleotides, genes, and regulatory regions, we define 51,801 annotation categories. Analyses of 519 autism spectrum disorder families did not identify association with any categories after correction for 4,123 effective tests. Without appropriate correction, biologically plausible associations are observed in both cases and controls. Despite excluding previously identified gene-disrupting mutations, coding regions still exhibited the strongest associations. Thus, in autism, the contribution of de novo noncoding variation is probably modest in comparison to that of de novo coding variants. Robust results from future WGS studies will require large cohorts and comprehensive analytical strategies that consider the substantial multiple-testing burden.
Characterization of Phase Separation Propensity for Amorphous Spray Dried Dispersions.
McNamara, Daniel; Yin, Shawn; Pan, Duohai; Crull, George; Timmins, Peter; Vig, Balvinder
2017-02-06
A generalized screening approach, applying isothermal calorimetry at 37 °C/100% RH, to formulations of spray dried dispersions (SDDs) for two active pharmaceutical ingredients (APIs) (BMS-903452 and BMS-986034) is demonstrated. APIs 452 and 034, with similar chemotypes, were synthesized and promoted during development for oral dosing. Both APIs were formulated as SDDs for animal exposure studies using the polymer hydroxypropylmethylcellulose acetyl succinate M grade (HPMCAS-M). 452 formulated at 30% (wt/wt) was an extremely robust SDD that was able to withstand 40 °C/75% RH open storage conditions for 6 months with no physical evidence of crystallization or loss of dissolution performance. Though 034 was a chemical analogue with physicochemical properties similar to 452, a physically stable SDD of 034 could not be formulated in HPMCAS-M at any of the drug loads attempted. This study was used to develop experience with specific physical characterization laboratory techniques to evaluate the physical stability of SDDs and to characterize the propensity of SDDs to phase separate and possibly crystallize. The screening strategy adopted was to stress the formulated SDDs with a temperature/humidity screen within the calorimeter and to apply orthogonal analytical techniques to gain a more informed understanding of why these SDDs formulated with HPMCAS-M demonstrated such different physical stability. Isothermal calorimetry (thermal activity monitor, TAM) was employed as a primary stress screen wherein the SDD formulations were monitored for 3 days at 37 °C/100% RH for signs of phase separation and possible crystallization of API. Powder X-ray diffraction (pXRD), modulated differential scanning calorimetry (mDSC), Fourier transform infrared spectroscopy (FTIR), and solid state nuclear magnetic resonance (ssNMR) were all used to examine formulated SDDs and neat amorphous drug. 452 SDDs formulated at 30% (wt/wt) or less did not show phase separation behavior upon exposure to 37 °C/100% RH for 3 days. 034 SDD formulations from 10 through 50% (wt/wt) all demonstrated thermal traces consistent with exothermic phase separation events over 3 days at 37 °C/100% RH in the TAM. However, only the 15, 30, and 50% 034 samples showed pXRD patterns consistent with crystalline material in post-TAM samples. Isothermal calorimetry is a useful screening tool to probe robust SDD physical performance and help investigate the level of drug-polymer miscibility under humid stress. Orthogonal analytical techniques such as pXRD, ssNMR, and FTIR were key in this SDD formulation screening to gain physical understanding and confirm or refute whether physical changes occurred during the observed thermal events characterized by the calorimetric screening experiments.
Common aspects influencing the translocation of SERS to Biomedicine.
Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu
2018-01-04
In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering (SERS), motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of this technique relevant for bioanalysis in vitro and in vivo and discuss how it goes beyond the state of the art of traditional analytical, labelling, and healthcare diagnostic technologies. Copyright © Bentham Science Publishers; For any queries, please email epub@benthamscience.org.
Robust dynamical decoupling for quantum computing and quantum memory.
Souza, Alexandre M; Alvarez, Gonzalo A; Suter, Dieter
2011-06-17
Dynamical decoupling (DD) is a popular technique for protecting qubits from the environment. However, unless special care is taken, experimental errors in the control pulses used in this technique can destroy the quantum information instead of preserving it. Here, we investigate techniques for making DD sequences robust against different types of experimental errors while retaining good decoupling efficiency in a fluctuating environment. We present experimental data from solid-state nuclear spin qubits and introduce a new DD sequence that is suitable for quantum computing and quantum memory.
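As a hedged, self-contained illustration of why pulse-error robustness matters (a toy calculation, not the paper's solid-state experiment), the following Python sketch compares a plain train of X pulses with the XY-4 sequence under a systematic flip-angle error, ignoring the environment so that any deviation from the identity is pure control error.

import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

def pulse(axis, angle):
    # Rotation by `angle` about x or y; a pi pulse with flip-angle error
    # is pulse(axis, np.pi + eps).
    op = {"x": sx, "y": sy}[axis]
    return np.cos(angle / 2) * np.eye(2) - 1j * np.sin(angle / 2) * op

def fidelity(u):
    # Gate fidelity of u against the identity, up to a global phase.
    return abs(np.trace(u)) / 2

eps = 0.05 * np.pi  # 5% over-rotation on every pi pulse
for n_cycles in (1, 5, 25):
    all_x = np.linalg.matrix_power(pulse("x", np.pi + eps), 4 * n_cycles)
    xy4 = np.linalg.matrix_power(
        pulse("y", np.pi + eps) @ pulse("x", np.pi + eps)
        @ pulse("y", np.pi + eps) @ pulse("x", np.pi + eps), n_cycles)
    print(n_cycles, f"all-X: {fidelity(all_x):.4f}", f"XY-4: {fidelity(xy4):.4f}")

The all-X train accumulates the over-rotation coherently, while XY-4 cancels it to first order per cycle, which is one reason sequences of this family are preferred for quantum memory.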
Parametric robust control and system identification: Unified approach
NASA Technical Reports Server (NTRS)
Keel, Leehyun
1994-01-01
Despite significant advancement in the area of robust parametric control, the synthesis of such controllers remains a wide open problem. Thus, we attempt to give a solution to this important problem. Our approach captures the parametric uncertainty as an H(sub infinity) unstructured uncertainty so that H(sub infinity) synthesis techniques are applicable. Although these techniques cannot cope with the exact parametric uncertainty, they give a reasonable guideline for modeling an unstructured uncertainty that contains the parametric uncertainty. An additional loop shaping technique is also introduced to relax the design's conservatism.
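The covering step described here, bounding a family of parameter-perturbed plants by a single unstructured multiplicative uncertainty weight, can be sketched numerically; the plant and weight below are toy placeholders, not the paper's model.

import numpy as np

w = np.logspace(-2, 3, 400)  # frequency grid, rad/s
s = 1j * w

def plant(k):
    # Toy plant G(s) = k / (s^2 + 0.8 s + k) with uncertain parameter k.
    return k / (s**2 + 0.8 * s + k)

G0 = plant(4.0)  # nominal plant
rel_err = [np.abs(plant(k) / G0 - 1) for k in np.linspace(3.0, 5.0, 21)]
envelope = np.max(rel_err, axis=0)  # worst-case multiplicative error

# Candidate first-order bounding weight W(s) = 0.3 (s/0.5 + 1) / (s/50 + 1):
W = np.abs(0.3 * (s / 0.5 + 1) / (s / 50 + 1))
print("weight covers the parametric family:", bool(np.all(W >= envelope)))

An H(sub infinity) synthesis would then use W as the unstructured uncertainty bound; robust stability over the whole parametric family follows from a small-gain condition on the weighted complementary sensitivity.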
Liu, Wei; Li, Yupeng; Li, Xiaoqiang; Cao, Wenhua; Zhang, Xiaodong
2012-01-01
Purpose: The distal edge tracking (DET) technique in intensity-modulated proton therapy (IMPT) allows for high energy efficiency, fast and simple delivery, and simple inverse treatment planning; however, it is highly sensitive to uncertainties. In this study, the authors explored the application of DET in IMPT (IMPT-DET) and conducted robust optimization of IMPT-DET to see if the planning technique's sensitivity to uncertainties was reduced. They also compared conventional and robust optimization of IMPT-DET with three-dimensional IMPT (IMPT-3D) to gain understanding about how plan robustness is achieved. Methods: They compared the robustness of IMPT-DET and IMPT-3D plans to uncertainties by analyzing plans created for a typical prostate cancer case and a base of skull (BOS) cancer case (using data for patients who had undergone proton therapy at the authors' institution). Spots from the highest and second-highest energy layers were chosen so that the Bragg peaks would fall at the distal edge of the targets; IMPT-DET plans used 36 equally spaced beam angles, whereas IMPT-3D plans used 3 beams with angles chosen by a beam-angle optimization algorithm. Dose contributions for a number of range and setup uncertainties were calculated, and a worst-case robust optimization was performed. A robust quantification technique was used to evaluate the plans' sensitivity to uncertainties. Results: Without robust optimization, DET plans were less robust to uncertainties than 3D plans but offered better normal-tissue protection. Robust optimization accounting for range and setup uncertainties improved the robustness of IMPT plans; however, the extent of improvement varied. Conclusions: IMPT's sensitivity to uncertainties can be improved by using robust optimization. They found two possible mechanisms that made improvements possible: (1) a localized single-field uniform dose distribution (LSFUD) mechanism, in which the optimization algorithm attempts to produce a single-field uniform dose distribution while minimizing the patching field as much as possible; and (2) perturbed dose distribution, which follows the change in anatomical geometry. Multiple-instance optimization has more knowledge of the influence matrices; this greater knowledge improves IMPT plans' ability to retain robustness despite the presence of uncertainties. PMID:22755694
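A minimal worst-case ("minimax") robust optimization sketch in the spirit of the approach described above, with random placeholder influence matrices rather than clinical dose calculations:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n_voxels, n_beamlets, n_scenarios = 40, 12, 5
# One influence matrix (dose per unit beamlet weight) per uncertainty scenario.
A = rng.uniform(0.0, 1.0, size=(n_scenarios, n_voxels, n_beamlets))
d_target = np.full(n_voxels, 60.0)  # prescribed dose in every target voxel

def worst_case_objective(weights):
    doses = A @ weights  # shape (n_scenarios, n_voxels)
    return max(np.sum((d - d_target) ** 2) for d in doses)

w0 = np.full(n_beamlets, 10.0)
res = minimize(worst_case_objective, w0, method="Nelder-Mead",
               bounds=[(0, None)] * n_beamlets,  # bounds need SciPy >= 1.7
               options={"maxiter": 20000, "fatol": 1e-8})
print("worst-case objective after optimization:", res.fun)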
Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T
2016-01-01
The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of the summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique is a hybrid of the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures a low mathematical burden and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in time-domain behavior and robustness over the conventional PID controller. PMID:27807444
A secure and robust information hiding technique for covert communication
NASA Astrophysics Data System (ADS)
Parah, S. A.; Sheikh, J. A.; Hafiz, A. M.; Bhat, G. M.
2015-08-01
The unprecedented advancement of multimedia and the growth of the internet have made it possible to reproduce and distribute digital media easier and faster. This has given birth to information security issues, especially when the information pertains to national security, e-banking transactions, etc. The disguised form of encrypted data makes an adversary suspicious and increases the chance of attack. Information hiding overcomes this inherent problem of cryptographic systems and is emerging as an effective means of securing sensitive data being transmitted over insecure channels. In this paper, a secure and robust information hiding technique referred to as Intermediate Significant Bit Plane Embedding (ISBPE) is presented. The data to be embedded are scrambled, and embedding is carried out using the concept of a Pseudorandom Address Vector (PAV) and a Complementary Address Vector (CAV) to enhance the security of the embedded data. The proposed ISBPE technique is fully immune to Least Significant Bit (LSB) removal/replacement attack. Experimental investigations reveal that the proposed technique is more robust to various image processing attacks, such as JPEG compression, Additive White Gaussian Noise (AWGN), and low-pass filtering, than conventional LSB techniques. The various advantages offered by the ISBPE technique make it a good candidate for covert communication.
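The following Python sketch conveys the flavor of keyed intermediate-bit-plane embedding; it is an illustrative reconstruction, not the authors' exact PAV/CAV algorithm, and the plane index and key are arbitrary.

import numpy as np

def embed_isb(image, bits, key, plane=3):
    # Write `bits` into bit plane `plane` (0 = LSB) at keyed pseudorandom
    # pixel addresses, after keyed scrambling of the bit order.
    rng = np.random.default_rng(key)
    flat = image.flatten()
    addresses = rng.choice(flat.size, size=bits.size, replace=False)
    scrambled = bits[rng.permutation(bits.size)]
    flat[addresses] = (flat[addresses] & ~np.uint8(1 << plane)) | (
        scrambled.astype(np.uint8) << plane)
    return flat.reshape(image.shape)

def extract_isb(stego, n_bits, key, plane=3):
    # Regenerate the same addresses and permutation from the key,
    # then undo the scrambling.
    rng = np.random.default_rng(key)
    flat = stego.flatten()
    addresses = rng.choice(flat.size, size=n_bits, replace=False)
    perm = rng.permutation(n_bits)
    bits = np.empty(n_bits, dtype=np.uint8)
    bits[perm] = (flat[addresses] >> plane) & 1
    return bits

cover = np.random.default_rng(7).integers(0, 256, (64, 64), dtype=np.uint8)
payload = np.random.default_rng(8).integers(0, 2, 200, dtype=np.uint8)
stego = embed_isb(cover, payload, key=1234)
assert np.array_equal(extract_isb(stego, payload.size, key=1234), payload)

Because the payload sits above the least significant planes, wholesale LSB removal or replacement leaves it untouched, which is the immunity property claimed above.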
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Surface-Enhanced Raman Spectroscopy.
ERIC Educational Resources Information Center
Garrell, Robin L.
1989-01-01
Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics due to the fact that the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali
2016-03-01
The most important objectives frequently found in bioanalytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil from biological and pharmaceutical samples. The prospective study revealed that the implementation of molecularly imprinted polymers as solid-phase materials for the sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences, from the complex matrices in real samples such as blood plasma and serum.
A Robust Absorbing Boundary Condition for Compressible Flows
NASA Technical Reports Server (NTRS)
Loh, Ching Y.; Jorgenson, Philip C. E.
2005-01-01
An absorbing non-reflecting boundary condition (NRBC) for practical computations in fluid dynamics and aeroacoustics is presented with theoretical proof. This paper is a continuation and improvement of a previous paper by the author. The absorbing NRBC technique is based on a first principle of non-reflection, which contains the essential physics that a plane wave solution of the Euler equations remains intact across the boundary. The technique is theoretically shown to work for a large class of finite volume approaches. When combined with the hyperbolic conservation laws, the NRBC is simple, robust, and truly multi-dimensional; no additional implementation is needed except the prescribed physical boundary conditions. Several numerical examples in multi-dimensional spaces using two different finite volume schemes are illustrated to demonstrate its robustness in practical computations. Limitations and remedies of the technique are also discussed.
Hybrid approach for robust diagnostics of cutting tools
NASA Astrophysics Data System (ADS)
Ramamurthi, K.; Hough, C. L., Jr.
1994-03-01
A new multisensor-based hybrid technique has been developed for robust diagnosis of cutting tools. The technique combines the concepts of pattern classification and real-time knowledge-based systems (RTKBS) and draws upon their strengths: the learning facility of pattern classification and the higher level of reasoning of RTKBS. It eliminates some of their major drawbacks: false alarms and delayed or missed diagnoses in the case of pattern classification, and tedious knowledge-base generation in the case of RTKBS. It utilizes a dynamic distance classifier, developed upon a new separability criterion and a new definition of robust diagnosis, to achieve these benefits. The promise of this technique has been proven concretely through on-line diagnosis of drill wear. Its suitability for practical implementation is substantiated by the use of practical, inexpensive, machine-mounted sensors and low-cost delivery systems.
NASA Astrophysics Data System (ADS)
Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.
2002-04-01
Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB-based steganographic techniques for a given false probability of detection. In this paper we look at adaptive steganographic techniques, which take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
Van Dam, Debby; Vermeiren, Yannick; Aerts, Tony; De Deyn, Peter Paul
2014-08-01
A fast and simple RP-HPLC method with electrochemical detection (ECD) and ion-pair chromatography was developed, optimized, and validated in order to simultaneously determine eight different biogenic amines and metabolites in post-mortem human brain tissue in a single-run analytical approach. The compounds of interest are the indolamine serotonin (5-hydroxytryptamine, 5-HT), the catecholamines dopamine (DA) and (nor)epinephrine ((N)E), as well as their respective metabolites, i.e., 3,4-dihydroxyphenylacetic acid (DOPAC) and homovanillic acid (HVA), 5-hydroxy-3-indoleacetic acid (5-HIAA), and 3-methoxy-4-hydroxyphenylglycol (MHPG). A two-level fractional factorial experimental design was applied to study the effect of five experimental factors (the ion-pairing counter-ion concentration, the level of organic modifier, the pH of the mobile phase, the temperature of the column, and the voltage setting of the detector) on the chromatographic behaviour. The cross effects between the five quantitative factors and the capacity and separation factors of the analytes were then analysed using a standard least squares model. The optimized method was fully validated according to the requirements of the SFSTP (Société Française des Sciences et Techniques Pharmaceutiques). Our human brain tissue sample preparation procedure is straightforward and relatively short, which allows samples to be loaded onto the HPLC system within approximately 4 h. Additionally, a high sample throughput was achieved after optimization due to a total runtime of at most 40 min per sample. The conditions and settings of the HPLC system were found to be accurate, with high intra- and inter-assay repeatability, recovery, and accuracy rates. The robust analytical method results in very low detection limits and good separation for all eight biogenic amines and metabolites in this complex mixture of biological analytes. Copyright © 2014 Elsevier B.V. All rights reserved.
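The design-generation step can be sketched generically; the sketch below builds a two-level 2^(5-1) fractional factorial (16 runs) for the five factors named in the abstract, aliasing the fifth factor with the four-way interaction. The coded -1/+1 levels are placeholders, not the study's actual settings.

import itertools
import numpy as np

factors = ["counter-ion concentration", "organic modifier", "mobile-phase pH",
           "column temperature", "detector voltage"]
base = np.array(list(itertools.product([-1, 1], repeat=4)))  # full 2^4 design
design = np.column_stack([base, base.prod(axis=1)])  # generator E = ABCD
print(f"{len(design)} runs")
for run in design:
    print(dict(zip(factors, run)))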
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL
The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
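The superposition idea can be illustrated in a few lines; this is a generic analytic element sketch (uniform regional flow plus Thiem-type wells) with invented parameter values, not WhAEM's actual code or defaults.

import numpy as np

Q0 = 0.5  # regional discharge per unit width (placeholder)
wells = [(0.0, 0.0, 400.0), (300.0, 150.0, 250.0)]  # (x, y, pumping rate)

def potential(x, y):
    # Discharge potential: uniform flow plus a logarithmic (Thiem) element
    # for each pumping well, combined by superposition.
    phi = -Q0 * x
    for xw, yw, Qw in wells:
        phi += (Qw / (2 * np.pi)) * np.log(np.hypot(x - xw, y - yw))
    return phi

xs, ys = np.meshgrid(np.linspace(-500, 500, 200), np.linspace(-500, 500, 200))
phi = potential(xs, ys)  # equipotential contours map the combined flow field
print(phi.shape)

Time-of-travel capture zones would then be delineated by tracing particles backward through the velocity field implied by this potential.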
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, which are examined towards matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Study of Cyclodextrin-Based Polymers to Extract Patulin from Apple Juice
USDA-ARS?s Scientific Manuscript database
Synthetic sorbents offer a means to develop more robust materials to detect analytes in complex matrices, including methods to detect naturally occurring contaminants in agricultural commodities. Patulin is a mold metabolite associated with rotting apples and poses health risks to humans and animal...
Resonance Ionization Mass Spectrometry.
ERIC Educational Resources Information Center
Young, J. P.; And Others
1989-01-01
Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)
Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods
ERIC Educational Resources Information Center
Zhang, Ying
2011-01-01
Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
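The first (pooling) stage can be sketched under a simple fixed-effect assumption, averaging study correlation matrices element-wise with sample-size weights; real MASEM implementations typically work with Fisher-z transforms or GLS-based TSSEM instead, so this is only a schematic.

import numpy as np

def pool_correlations(R_list, n_list):
    # Sample-size-weighted element-wise average of study correlation matrices.
    weights = np.asarray(n_list, dtype=float)
    return np.einsum("k,kij->ij", weights, np.stack(R_list)) / weights.sum()

R1 = np.array([[1.0, 0.30, 0.20], [0.30, 1.0, 0.45], [0.20, 0.45, 1.0]])
R2 = np.array([[1.0, 0.38, 0.12], [0.38, 1.0, 0.50], [0.12, 0.50, 1.0]])
pooled = pool_correlations([R1, R2], n_list=[120, 340])
print(pooled.round(3))  # this pooled matrix would then be fitted by SEM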
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
Analytic proof of the existence of the Lorenz attractor in the extended Lorenz model
NASA Astrophysics Data System (ADS)
Ovsyannikov, I. I.; Turaev, D. V.
2017-01-01
We give an analytic (free of computer assistance) proof of the existence of a classical Lorenz attractor for an open set of parameter values of the Lorenz model in the form of Yudovich-Morioka-Shimizu. The proof is based on detection of a homoclinic butterfly with a zero saddle value and rigorous verification of one of the Shilnikov criteria for the birth of the Lorenz attractor; we also supply a proof for this criterion. The results are applied in order to give an analytic proof for the existence of a robust, pseudohyperbolic strange attractor (the so-called discrete Lorenz attractor) for an open set of parameter values in a 4-parameter family of 3D Henon-like diffeomorphisms.
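For readers who want a feel for the object of the theorem, the Shimizu-Morioka system is commonly written as x' = y, y' = x - lambda*y - x*z, z' = -alpha*z + x^2; the following purely numerical sketch (placeholder parameter values, and of course no substitute for the analytic proof above) integrates it with SciPy.

import numpy as np
from scipy.integrate import solve_ivp

lam, alpha = 0.85, 0.5  # placeholder parameter values

def rhs(t, state):
    x, y, z = state
    return [y, x - lam * y - x * z, -alpha * z + x * x]

sol = solve_ivp(rhs, (0.0, 200.0), [0.1, 0.0, 0.0], rtol=1e-9, atol=1e-12)
print(sol.y[:, -1])
# Plotting x against z shows the long-time behavior; butterfly-like
# attractors appear for suitable parameter choices.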
CMB weak-lensing beyond the Born approximation: a numerical approach
NASA Astrophysics Data System (ADS)
Fabbian, Giulio; Calabrese, Matteo; Carbone, Carmelita
2018-02-01
We perform a complete study of the gravitational lensing effect beyond the Born approximation on the Cosmic Microwave Background (CMB) anisotropies using a multiple-lens raytracing technique through cosmological N-body simulations of the DEMNUni suite. The impact of second-order effects accounting for the non-linear evolution of large-scale structures is evaluated by propagating, for the first time, the full CMB lensing jacobian together with the light-ray trajectories. We carefully investigate the robustness of our approach against several numerical effects in the raytracing procedure and in the N-body simulation itself, and find no evidence of large contaminations. We discuss the impact of beyond-Born corrections on lensed CMB observables, compare our results with recent analytical predictions that appeared in the literature, finding good agreement, and extend these results to smaller angular scales. We measure the gravitationally-induced CMB polarization rotation that appears in the geodesic equation at second order, and compare this result with the latest analytical predictions. We then present the detection prospects for beyond-Born effects with the future CMB-S4 experiment. We show that corrections to the temperature power spectrum can be measured only if good control of the extragalactic foregrounds is achieved. Conversely, the beyond-Born corrections to the E- and B-mode power spectra will be much more difficult to detect.
NASA Astrophysics Data System (ADS)
Smith, Katharine A.; Schlag, Zachary; North, Elizabeth W.
2018-07-01
Coupled three-dimensional circulation and biogeochemical models predict changes in water properties that can be used to define fish habitat, including physiologically important parameters such as temperature, salinity, and dissolved oxygen. However, methods for calculating the volume of habitat defined by the intersection of multiple water properties are not well established for coupled three-dimensional models. The objectives of this research were to examine multiple methods for calculating habitat volume from three-dimensional model predictions, select the most robust approach, and provide an example application of the technique. Three methods were assessed: the "Step," "Ruled Surface", and "Pentahedron" methods, the latter of which was developed as part of this research. Results indicate that the analytical Pentahedron method is exact, computationally efficient, and preserves continuity in water properties between adjacent grid cells. As an example application, the Pentahedron method was implemented within the Habitat Volume Model (HabVol) using output from a circulation model with an Arakawa C-grid and physiological tolerances of juvenile striped bass (Morone saxatilis). This application demonstrates that the analytical Pentahedron method can be successfully applied to calculate habitat volume using output from coupled three-dimensional circulation and biogeochemical models, and it indicates that the Pentahedron method has wide application to aquatic and marine systems for which these models exist and physiological tolerances of organisms are known.
Lima, Marcelo B; Barreto, Inakã S; Andrade, Stéfani Iury E; Almeida, Luciano F; Araújo, Mário C U
2012-10-15
In this study, a micro-flow-batch analyzer (μFBA) with solenoid micro-pumps for the photometric determination of iodate in table salt is described. The method is based on the reaction of iodate with iodide to form molecular iodine, followed by the reaction with N,N-diethyl-p-phenylenediamine (DPD). The analytical signal was measured at 520 nm using a green LED integrated into the μFBA built in the urethane-acrylate resin. The analytical curve for iodate was linear in the range of 0.01-10.0 mg L(-1) with a correlation coefficient of 0.997. The limit of detection and relative standard deviation were estimated at 0.004 mg L(-1) and <1.5% (n=3), respectively. The accuracy was assessed through a recovery test (97.6-103.5%) and independent analysis by a conventional titrimetric method. Comparing this technique with the conventional method, no statistically significant differences were observed when applying the paired t-test at a 95% confidence level. The proposed microsystem using solenoid micro-pumps presented satisfactory robustness and a high sampling rate (170 h(-1)), with low reagent consumption and a low cost to build the device. The proposed microsystem is a new alternative for the automatic determination of iodate in table salt, comparing satisfactorily with recently reported flow systems. Copyright © 2012 Elsevier B.V. All rights reserved.
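The calibration arithmetic behind figures like these can be sketched generically (the standards and signals below are invented, not the paper's data): fit a line, report the correlation coefficient, and estimate the detection limit from the residual scatter.

import numpy as np

conc = np.array([0.01, 0.1, 0.5, 1.0, 2.5, 5.0, 10.0])  # standards, mg/L
signal = np.array([0.004, 0.041, 0.20, 0.41, 1.02, 2.04, 4.05])  # absorbance

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)
r = np.corrcoef(conc, signal)[0, 1]
lod = 3.3 * residual_sd / slope  # common 3.3*s/slope convention
print(f"slope={slope:.4f}  r={r:.4f}  LOD~{lod:.4f} mg/L")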
NASA Astrophysics Data System (ADS)
Spencer, R. G.; Moura, J. M. S.; Mitsuya, M.; Peucker-Ehrenbrink, B.; Holmes, R. M.; Galy, V.; Drake, T.
2017-12-01
Rivers integrate over a fixed and definable area (the watershed), with their discharge and chemistry at any given point a function of upstream processes. As a consequence, examination of riverine discharge and chemistry can provide powerful indicators of change within a watershed. To assess the validity of this approach, long-term datasets are required from fluvial environments around the globe. The Amazon River delivers one-fifth of the total freshwater discharged to the ocean and so represents a fundamentally important site for examination of long-term major ion, trace element, nutrient, and organic matter (OM) export. Here we describe data from a multi-year, monthly sampling campaign of the Amazon River at Obidos (Para, Brazil). Clear seasonality in all analyte fluxes is apparent and is linked to hydrology; however, dissolved OM composition appears dominated by allochthonous sources throughout the year, as evidenced by optical parameters indicative of high molecular weight and high relative aromatic content. Annual loads of some analytes for 2011-2013 inclusive varied by up to 50%, highlighting significant year-to-year variability in flux that was linked to inter-annual hydrologic shifts (i.e., higher fluxes in wetter years). Finally, encompassing both intra- and inter-annual variability, a robust correlation was observed between chromophoric dissolved OM (CDOM) absorbance and dissolved organic carbon (DOC) concentration, highlighting the potential to improve DOC flux estimates at this globally significant site via CDOM measurements from in situ technologies or remote sensing techniques.
NASA Astrophysics Data System (ADS)
Trainoff, Steven
2009-03-01
Many modern pharmaceuticals and naturally occurring biomolecules consist of complexes of proteins and polyethylene glycol or carbohydrates. In the case of vaccine development, these complexes are often used to induce or amplify immune responses. For protein therapeutics they are used to modify solubility and function, or to control the rate of degradation and elimination of a drug from the body. Characterizing the stoichiometry of these complexes is an important industrial problem that presents a formidable challenge to analytical instrument designers. Traditional analytical methods, such as fluorescent tagging, chemical assays, and mass spectrometry, perturb the system so dramatically that the complexes are often destroyed or uncontrollably modified by the measurement. A solution to this problem consists of fractionating the samples and then measuring the fractions using sequential non-invasive detectors that are sensitive to different components of the complex. We present results using UV absorption, which is primarily sensitive to the protein fraction; light scattering, which measures the total weight-average molar mass; and refractive index detection, which measures the net concentration. We also present a solution to the inter-detector band-broadening problem that has heretofore made this approach impractical. Presented will be instrumentation and an analysis method that overcome these obstacles and make this technique a reliable and robust way of non-invasively characterizing these industrially important compounds.
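The three-detector algebra for a two-component conjugate can be sketched as follows; the extinction coefficient, dn/dc values, instrument constants, and signals are all invented for illustration, and the light-scattering relation is the textbook LS ~ M*c*(dn/dc)^2 form, not this instrument's calibration.

def conjugate_composition(uv, ri, ls, k_uv=1.0, k_ri=1.0, k_ls=1.0):
    eps_protein = 0.66     # UV extinction coefficient of the protein (assumed)
    dndc_protein = 0.185   # refractive index increment of the protein (assumed)
    dndc_modifier = 0.135  # refractive index increment of PEG (assumed)
    c_protein = uv / (k_uv * eps_protein)  # UV sees only the protein
    # RI sees everything: ri = k_ri*(c_protein*dndc_p + c_modifier*dndc_m).
    c_modifier = (ri / k_ri - c_protein * dndc_protein) / dndc_modifier
    c = c_protein + c_modifier
    w = c_protein / c  # protein weight fraction of the complex
    dndc_complex = w * dndc_protein + (1 - w) * dndc_modifier
    molar_mass = ls / (k_ls * c * dndc_complex**2)
    return c, w, molar_mass

c, w, M = conjugate_composition(uv=0.33, ri=0.165, ls=2.6e-4, k_ls=1e-7)
print(f"c={c:.3f}, protein fraction={w:.2f}, M~{M:.0f}")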
Analytical Challenges in Biotechnology.
ERIC Educational Resources Information Center
Glajch, Joseph L.
1986-01-01
Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)
An analytical and experimental evaluation of a Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. A.; Cosby, R. M.
1976-01-01
An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 °C range is presented. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly devoid of discussion of Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals that are very different from those in business and that require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of requirements for data analytics tools/techniques that would support specific ESDA-type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples
NASA Astrophysics Data System (ADS)
Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi
2016-10-01
The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/mL. However, quantifying analytes related to the various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/mL, and the prostate-specific antigen level in the early stage of prostate cancer is about 1 ng/mL. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumors, it is required to quantify various levels of analytes ranging from ng/mL to pg/mL. To accommodate these critical needs in current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (
A Case Study: Analyzing City Vitality with Four Pillars of Activity-Live, Work, Shop, and Play.
Griffin, Matt; Nordstrom, Blake W; Scholes, Jon; Joncas, Kate; Gordon, Patrick; Krivenko, Elliott; Haynes, Winston; Higdon, Roger; Stewart, Elizabeth; Kolker, Natali; Montague, Elizabeth; Kolker, Eugene
2016-03-01
This case study evaluates and tracks the vitality of a city (Seattle), based on a data-driven approach, using strategic, robust, and sustainable metrics. This case study was collaboratively conducted by the Downtown Seattle Association (DSA) and CDO Analytics teams. The DSA is a nonprofit organization focused on making the city of Seattle and its Downtown a healthy and vibrant place to Live, Work, Shop, and Play. DSA primarily operates through public policy advocacy, community and business development, and marketing. In 2010, the organization turned to CDO Analytics (cdoanalytics.org) to develop a process that can guide and strategically focus DSA efforts and resources for maximal benefit to the city of Seattle and its Downtown. CDO Analytics was asked to develop clear, easily understood, and robust metrics for a baseline evaluation of the health of the city, as well as for ongoing monitoring and comparisons of the vitality, sustainability, and growth. The DSA and CDO Analytics teams strategized on how to effectively assess and track the vitality of Seattle and its Downtown. The two teams filtered a variety of data sources, and evaluated the veracity of multiple diverse metrics. This iterative process resulted in the development of a small number of strategic, simple, reliable, and sustainable metrics across four pillars of activity: Live, Work, Shop, and Play. Data during the 5 years before 2010 were used for the development of the metrics and model and its training, and data during the 5 years from 2010 and on were used for testing and validation. This work enabled DSA to routinely track these strategic metrics, use them to monitor the vitality of Downtown Seattle, prioritize improvements, and identify new value-added programs. As a result, the four-pillar approach became an integral part of the data-driven decision-making and execution of the Seattle community's improvement activities. The approach described in this case study is actionable, robust, inexpensive, and easy to adopt and sustain. It can be applied to cities, districts, counties, regions, states, or countries, enabling cross-comparisons and improvements of vitality, sustainability, and growth.
Lewis, Nathan S
2004-09-01
Arrays of broadly cross-reactive vapor sensors provide a man-made implementation of an olfactory system, in which an analyte elicits a response from many receptors and each receptor responds to a variety of analytes. Pattern recognition methods are then used to detect analytes based on the collective response of the sensor array. With the use of this architecture, arrays of chemically sensitive resistors made from composites of conductors and insulating organic polymers have been shown to robustly classify, identify, and quantify a diverse collection of organic vapors, even though no individual sensor responds selectively to a particular analyte. The properties and functioning of these arrays are inspired by advances in the understanding of biological olfaction, and in turn, evaluation of the performance of the man-made array provides suggestions regarding some of the fundamental odor detection principles of the mammalian olfactory system.
Creating analytically divergence-free velocity fields from grid-based data
NASA Astrophysics Data System (ADS)
Ravu, Bharath; Rudman, Murray; Metcalfe, Guy; Lester, Daniel R.; Khakhar, Devang V.
2016-10-01
We present a method, based on B-splines, to calculate a C2 continuous analytic vector potential from discrete 3D velocity data on a regular grid. A continuous analytically divergence-free velocity field can then be obtained from the curl of the potential. This field can be used to robustly and accurately integrate particle trajectories in incompressible flow fields. Based on the method of Finn and Chacon (2005) [10] this new method ensures that the analytic velocity field matches the grid values almost everywhere, with errors that are two to four orders of magnitude lower than those of existing methods. We demonstrate its application to three different problems (each in a different coordinate system) and provide details of the specifics required in each case. We show how the additional accuracy of the method results in qualitatively and quantitatively superior trajectories that results in more accurate identification of Lagrangian coherent structures.
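The identity the method exploits, that the curl of any sufficiently smooth vector potential is exactly divergence-free, can be checked symbolically; the polynomial/trigonometric potential below is an arbitrary stand-in for the paper's B-spline potential.

import sympy as sp

x, y, z = sp.symbols("x y z")
A = sp.Matrix([y**2 * z, sp.sin(x) * z**2, x * y])  # example vector potential

u = sp.Matrix([
    sp.diff(A[2], y) - sp.diff(A[1], z),  # (curl A)_x
    sp.diff(A[0], z) - sp.diff(A[2], x),  # (curl A)_y
    sp.diff(A[1], x) - sp.diff(A[0], y),  # (curl A)_z
])
div = sp.simplify(sp.diff(u[0], x) + sp.diff(u[1], y) + sp.diff(u[2], z))
print(u.T, "divergence =", div)  # the divergence simplifies to 0 identically

The numerical content of the paper then lies in constructing a B-spline potential whose curl also matches the gridded velocity data, which the symbolic identity alone does not provide.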
Bär, David; Debus, Heiko; Brzenczek, Sina; Fischer, Wolfgang; Imming, Peter
2018-03-20
Near-infrared spectroscopy is frequently used by the pharmaceutical industry to monitor and optimize several production processes. In combination with chemometrics, a mathematical-statistical technique, near-infrared spectroscopy is a fast, non-destructive, non-invasive, and economical analytical method. One of the most advanced and popular chemometric techniques is the partial least squares algorithm, noted for its applicability in routine use and for its results. The required reference analytics enable the analysis of various parameters of interest, for example, moisture content, particle size, and many others. Parameters such as the correlation coefficient, root mean square error of prediction, root mean square error of calibration, and root mean square error of validation have been used to evaluate the applicability and robustness of the analytical methods developed. This study deals with investigating a Naproxen Sodium granulation process using near-infrared spectroscopy and the development of water content and particle-size methods. For the water content method, one should consider a maximum water content of about 21% in the granulation process, which must be confirmed by loss on drying. Further influences to be considered are the constantly changing product temperature, rising to about 54 °C, the creation of hydrated states of Naproxen Sodium when using a maximum of about 21% water content, and the large quantity of about 87% Naproxen Sodium in the formulation. A combination of these influences was taken into account in developing the near-infrared spectroscopy method for the water content of Naproxen Sodium granules. The root mean square error was 0.25% for the calibration dataset and 0.30% for the validation dataset, obtained after different stages of optimization by multiplicative scatter correction and the first derivative. Using laser diffraction, the granules were analyzed for particle size, yielding the cumulative sieve fractions >63 μm and >100 μm. The following influences should be considered for application in routine production: constant changes in water content up to 21% and a product temperature up to 54 °C. The different stages of optimization resulted in a root mean square error of 2.54% for the calibration dataset and 3.53% for the validation set, using the Kubelka-Munk conversion and first derivative, for the near-infrared spectroscopy method for particle size >63 μm. For the method for particle size >100 μm, the root mean square error was 3.47% for the calibration dataset and 4.51% for the validation set, using the same pre-treatments. The robustness and suitability of this methodology have already been demonstrated by its recent successful implementation in a routine granulate production process. Copyright © 2018 Elsevier B.V. All rights reserved.
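A generic chemometric sketch of the PLS step (synthetic spectra and water contents, no MSC or derivative pretreatment, not the study's data) shows how RMSE values for calibration and validation sets are obtained:

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 350
water = rng.uniform(2.0, 21.0, n_samples)  # % water content, synthetic
basis = rng.normal(size=n_wavelengths)
spectra = np.outer(water, basis) + rng.normal(scale=0.5,
                                              size=(n_samples, n_wavelengths))

X_cal, X_val, y_cal, y_val = train_test_split(spectra, water, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
print(f"RMSEC={rmsec:.2f}%  RMSEP={rmsep:.2f}%")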
Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community
2016-01-01
What Are Structured Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the... more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2012 CFR
2012-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2011 CFR
2011-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D
2017-11-07
Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed no significant differences were observed in the charge heterogeneity profile of a mAb through both at-line and on-line sampling and that the on-line method has the ability to rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed shorter strong cation exchange (SCX) assay. Both methods provided similar results with the distribution of percent acidic, main, and basic species remaining unchanged over a 2 week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to control in biomanufacturing.
Determination of cocaine and metabolites in hair by column-switching LC-MS-MS analysis.
Alves, Marcela Nogueira Rabelo; Zanchetti, Gabriele; Piccinotti, Alberto; Tameni, Silvia; De Martinis, Bruno Spinosa; Polettini, Aldo
2013-07-01
A method for rapid, selective, and robust determination of cocaine (CO) and metabolites in 5-mg hair samples was developed and fully validated using a column-switching liquid chromatography-tandem mass spectrometry (LC-MS-MS) system. Hair samples were decontaminated, segmented, incubated overnight in diluted HCl, and centrifuged, and the diluted (1:10 with distilled water) extracts were analyzed in positive ionization mode monitoring two reactions per analyte. Quantifier transitions were m/z 304.2→182.2 for CO, m/z 290.1→168.1 for benzoylecgonine (BE), and m/z 318.2→196.2 for cocaethylene (CE). The lower limit of quantification (LLOQ) was set at 0.05 ng/mg for CO and CE, and 0.012 ng/mg for BE. Imprecision and inaccuracy at the LLOQ were lower than 20% for all analytes. Linearity ranged between 0.05 and 50.0 ng/mg for CO and CE and between 0.012 and 12.50 ng/mg for BE. Selectivity, matrix effect, process efficiency, recovery, carryover, cross talk, and autosampler stability were also evaluated during validation. Eighteen real hair samples and five samples from a commercial proficiency testing program were comparatively examined with the proposed multidimensional chromatography coupled with tandem mass spectrometry procedure and our reference gas chromatography-mass spectrometry (GC-MS) method. Compared with the reference GC-MS method, the column-switching technique and the high sensitivity of the tandem mass spectrometry detection system allowed the sample amount to be reduced tenfold with a twofold gain in sensitivity and a fourfold gain in sample throughput, simplified sample preparation, and prevented interfering compounds and ions from impairing the ionization and detection of the analytes and deteriorating the performance of the ion source.
Northrop, Paul W. C.; Pathak, Manan; Rife, Derek; ...
2015-03-09
Lithium-ion batteries are an important technology to facilitate efficient energy storage and enable a shift from petroleum-based energy to more environmentally benign sources. Such systems can be utilized most efficiently if good understanding of performance can be achieved for a range of operating conditions. Mathematical models can be useful to predict battery behavior to allow for optimization of design and control. An analytical solution is ideally preferred to solve the equations of a mathematical model, as it eliminates the error that arises when using numerical techniques and is usually computationally cheap. An analytical solution provides insight into the behavior of the system and also explicitly shows the effects of different parameters on the behavior. However, most engineering models, including the majority of battery models, cannot be solved analytically due to non-linearities in the equations and state-dependent transport and kinetic parameters. The numerical method used to solve the system of equations describing a battery operation can have a significant impact on the computational cost of the simulation. In this paper, a reformulation of the porous electrode pseudo-three-dimensional (P3D) model, which significantly reduces the computational cost of lithium-ion battery simulation while maintaining high accuracy, is discussed. This reformulation enables the use of the P3D model in applications that would otherwise be too computationally expensive to justify its use, such as online control, optimization, and parameter estimation. Furthermore, the P3D model has proven to be robust enough to allow for the inclusion of additional physical phenomena as understanding improves. In this study, the reformulated model is used to allow for more complicated physical phenomena to be considered for study, including thermal effects.
Scadding, Cameron J; Watling, R John; Thomas, Allen G
2005-08-15
The majority of crimes result in the generation of some form of physical evidence, which is available for collection by crime scene investigators or police. However, this debris is often limited in amount as modern criminals become more aware of its potential value to forensic scientists. The requirement to obtain robust evidence from increasingly small samples has required refinement and modification of old analytical techniques and the development of new ones. This paper describes a new method for the analysis of oxy-acetylene debris left behind at a crime scene, and the establishment of its co-provenance with single particles of equivalent debris found on the clothing of persons of interest (POI). The ability to rapidly determine and match the elemental distribution patterns of debris collected from crime scenes to those recovered from persons of interest is essential in ensuring successful prosecution. Traditionally, relatively large amounts of sample (up to several milligrams) have been required to obtain a reliable elemental fingerprint of this type of material [R.J. Watling, B.F. Lynch, D. Herring, J. Anal. At. Spectrom. 12 (1997) 195]. However, this quantity of material is unlikely to be recovered from a POI. This paper describes the development and application of laser ablation inductively coupled plasma time-of-flight mass spectrometry (LA-ICP-TOF-MS) as an analytical protocol that can be applied more appropriately to the analysis of micro-debris than conventional quadrupole-based mass spectrometry. The resulting data, for debris as small as 70 μm in diameter, allowed a single spherule recovered from a POI to be unambiguously matched to a spherule recovered from the scene of crime, in an analytical procedure taking less than 5 min.
Quantification of 4 antidepressants and a metabolite by LC-MS for therapeutic drug monitoring.
Choong, Eva; Rudaz, Serge; Kottelat, Astrid; Haldemann, Sophie; Guillarme, Davy; Veuthey, Jean-Luc; Eap, Chin B
2011-06-01
A liquid chromatography method coupled to mass spectrometry was developed for the quantification of bupropion, its metabolite hydroxy-bupropion, moclobemide, reboxetine and trazodone in human plasma. The validation of the analytical procedure was assessed according to Société Française des Sciences et Techniques Pharmaceutiques and the latest Food and Drug Administration guidelines. The sample preparation was performed with 0.5 mL of plasma extracted on a cation-exchange solid phase 96-well plate. The separation was achieved in 14 min on a C18 XBridge column (2.1 mm×100 mm, 3.5 μm) using a 50 mM ammonium acetate pH 9/acetonitrile mobile phase in gradient mode. The compounds of interest were analysed in the single ion monitoring mode on a single quadrupole mass spectrometer working in positive electrospray ionisation mode. Two ions were selected per molecule to increase the number of identification points and to avoid as much as possible any false positives. Since selectivity is always a critical point for routine therapeutic drug monitoring, more than sixty common comedications for the psychiatric population were tested. For each analyte, the analytical procedure was validated to cover the common range of concentrations measured in plasma samples: 1-400 ng/mL for reboxetine and bupropion, 2-2000 ng/mL for hydroxy-bupropion, moclobemide, and trazodone. For all investigated compounds, reliable performance in terms of accuracy, precision, trueness, recovery, selectivity and stability was obtained. One year after its implementation in a routine process, this method demonstrated a high robustness with accurate values over the wide concentration range commonly observed among a psychiatric population. Copyright © 2011 Elsevier B.V. All rights reserved.
Non-negative Tensor Factorization for Robust Exploratory Big-Data Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Boian; Vesselinov, Velimir Valentinov; Djidjev, Hristo Nikolov
Currently, large multidimensional datasets are being accumulated in almost every field. Data are: (1) collected by distributed sensor networks in real-time all over the globe, (2) produced by large-scale experimental measurements or engineering activities, (3) generated by high-performance simulations, and (4) gathered by electronic communications and social-network activities, etc. Simultaneous analysis of these ultra-large heterogeneous multidimensional datasets is often critical for scientific discoveries, decision-making, emergency response, and national and global security. The importance of such analyses mandates the development of the next generation of robust machine learning (ML) methods and tools for big-data exploratory analysis.
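As a concrete, minimal instance of the method class named in the title, the sketch below factorizes a synthetic 3-way tensor by non-negative CP (PARAFAC) with multiplicative updates; it is a bare-bones illustration, not the authors' software.

import numpy as np

def unfold(T, mode):
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1), order="F")

def khatri_rao(U, V):
    # Column-wise Kronecker product, row order matching unfold() above.
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def ncp(T, rank, n_iter=500, eps=1e-12):
    # Non-negative CP via Lee-Seung style multiplicative updates.
    rng = np.random.default_rng(0)
    factors = [rng.uniform(size=(dim, rank)) for dim in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            A = factors[mode]
            others = [factors[m] for m in range(3) if m != mode]
            kr = khatri_rao(others[1], others[0])
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            A *= (unfold(T, mode) @ kr) / (A @ gram + eps)
    return factors

rng = np.random.default_rng(1)
a, b, c = (rng.uniform(size=(n, 2)) for n in (6, 5, 4))
T = np.einsum("ir,jr,kr->ijk", a, b, c)  # exact rank-2 non-negative tensor
A, B, C = ncp(T, rank=2)
approx = np.einsum("ir,jr,kr->ijk", A, B, C)
print("relative error:", np.linalg.norm(T - approx) / np.linalg.norm(T))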
Lucena, Rafael; Cárdenas, Soledad; Gallego, Mercedes; Valcárcel, Miguel
2006-03-01
Monitoring the exhaustion of alkaline degreasing baths is one of the main aspects of industrial process control in metal machining. The global levels of surfactant and, mainly, grease can be used as ageing indicators. In this paper, an attenuated total reflection-Fourier transform infrared (ATR-FTIR) membrane-based sensor is presented for the determination of these parameters. The system is based on a micro liquid-liquid extraction of the analytes through a polymeric membrane from the aqueous phase to the organic solvent layer, which is in close contact with the internal reflection element and continuously monitored. Samples are automatically processed using a simple, robust sequential injection analysis (SIA) configuration coupled on-line to the instrument. The global signals obtained for the two families of compounds are processed via a multivariate calibration technique (partial least squares, PLS). Excellent correlation was obtained between the values given by the proposed method and those of the gravimetric reference method, with very low error values for both calibration and validation.
Celetti, Giorgia; Natale, Concetta Di; Causa, Filippo; Battista, Edmondo; Netti, Paolo A
2016-09-01
Polymeric microparticles represent a robust platform for the detection of clinically relevant analytes in biological samples; they can be functionalized by encapsulating multiple types of biological entities, enhancing their applications as a new class of colloidal materials. Microfluidics offers a versatile platform for the synthesis of monodisperse and engineered microparticles. In this work, we report the microfluidic synthesis of novel polymeric microparticles endowed with a specific peptide, chosen for its superior specificity for target binding in complex media. A peptide sequence was efficiently encapsulated into the polymeric network, and protein binding occurred with high affinity (KD 0.1-0.4 μM). Fluid dynamics simulation was performed to optimize the production conditions for monodisperse and stable functionalized microgels. The results demonstrate the easy and fast realization, in a single step, of functionalized monodisperse microgels using a droplet-microfluidic technique, and how the inclusion of the peptide within the polymeric network improves both the affinity and the specificity of protein capture. Copyright © 2016 Elsevier B.V. All rights reserved.
Assessment of active methods for removal of LEO debris
NASA Astrophysics Data System (ADS)
Hakima, Houman; Emami, M. Reza
2018-03-01
This paper investigates the applicability of five active methods for the removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission-level criteria are then utilized to assess the performance of each removal method in light of the results obtained from a Monte Carlo simulation. The simulation provides insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: the Analytic Hierarchy Process and a utility-based approach. A third assessment technique, potential-loss analysis, is utilized to highlight the effect of risks in each removal method.
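The criterion-weighting step of the Analytic Hierarchy Process can be sketched in a few lines: priorities are the normalized principal eigenvector of a reciprocal pairwise-comparison matrix. The matrix below is illustrative only, not the paper's data.

import numpy as np

# Hypothetical pairwise comparisons for three criteria, e.g. removal time
# vs. performance robustness vs. propellant mass (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalized priority weights

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3.
n = len(A)
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", w.round(3), " CR:", round(ci / 0.58, 3))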
Chaos Suppression in Fractional order Permanent Magnet Synchronous Generator in Wind Turbine Systems
NASA Astrophysics Data System (ADS)
Rajagopal, Karthikeyan; Karthikeyan, Anitha; Duraisamy, Prakash
2017-06-01
In this paper, we investigate the control of a three-dimensional non-autonomous fractional-order uncertain model of a permanent magnet synchronous generator (PMSG) via an adaptive control technique. We derive a dimensionless fractional-order model of the PMSG from the integer-order model presented in the literature. Various dynamic properties of the fractional-order model, such as eigenvalues, Lyapunov exponents, bifurcation and bicoherence, are investigated. The system's chaotic behavior for various fractional orders is presented. An adaptive controller is derived to suppress the chaotic oscillations of the fractional-order model. As direct Lyapunov stability analysis of the robust controller is difficult for a fractional-order first derivative, we derive a new lemma to analyze the stability of the system. Numerical simulations of the proposed chaos-suppression methodology are given to validate the analytical results; they show that, with the derived adaptive controller and parameter update law, the origin of the system is asymptotically stable for any bounded initial conditions.
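Numerical simulation of a fractional-order system typically relies on a discretization such as the explicit Grünwald-Letnikov scheme. The sketch below applies it to a toy scalar system; the paper's PMSG equations and adaptive controller are not reproduced here, so the dynamics are purely illustrative.

import numpy as np

alpha, h, n_steps = 0.95, 0.005, 4000    # fractional order, step, horizon

# Grunwald-Letnikov coefficients c_j = (-1)^j * C(alpha, j), via recursion.
c = np.empty(n_steps + 1)
c[0] = 1.0
for j in range(1, n_steps + 1):
    c[j] = c[j - 1] * (1.0 - (1.0 + alpha) / j)

def f(x, t):
    # Toy nonlinear dynamics; a stand-in for the PMSG state equations.
    return -x + np.sin(t) - 0.5 * x**3

x = np.zeros(n_steps + 1)
for n in range(1, n_steps + 1):
    # Explicit GL step: x_n = h^alpha * f(x_{n-1}) - sum_j c_j * x_{n-j};
    # the memory term is what distinguishes fractional from integer order.
    memory = np.dot(c[1:n + 1], x[n - 1::-1])
    x[n] = h**alpha * f(x[n - 1], (n - 1) * h) - memory

print("final state:", x[-1])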
PECAN: Library Free Peptide Detection for Data-Independent Acquisition Tandem Mass Spectrometry Data
Ting, Ying S.; Egertson, Jarrett D.; Bollinger, James G.; Searle, Brian C.; Payne, Samuel H.; Noble, William Stafford; MacCoss, Michael J.
2017-01-01
In mass spectrometry-based shotgun proteomics, data-independent acquisition (DIA) is an emerging technique for unbiased and reproducible measurement of protein mixtures. Without targeting specific precursor ions, DIA MS/MS spectra are often highly multiplexed, containing product ions from multiple co-fragmenting precursors. Thus, detecting peptides directly from DIA data is challenging; most DIA data analyses require spectral libraries. Here we present PECAN, a new library-free, peptide-centric tool that detects peptides directly from DIA data. PECAN reports evidence of detection based on product ion scoring, enabling detection of low-abundance analytes with poor precursor ion signal. We benchmarked PECAN in terms of chromatographic peak-picking accuracy and peptide detection capability, and further validated its detections with data-dependent acquisition and targeted analyses. Lastly, we used PECAN to build a library from DIA data and to query sequence variants. Together, these results show that PECAN detects peptides robustly and accurately from DIA data without using a library. PMID:28783153
Kovács, Béla; Kántor, Lajos Kristóf; Croitoru, Mircea Dumitru; Kelemen, Éva Katalin; Obreja, Mona; Nagy, Előd Ernő; Székely-Szentmiklósi, Blanka; Gyéresi, Árpád
2018-06-01
A reversed-phase HPLC (RP-HPLC) method was developed for strontium ranelate using a full factorial screening experimental design. The analytical procedure was validated according to international guidelines for linearity, selectivity, sensitivity, accuracy and precision. A separate experimental design was used to demonstrate the robustness of the method. Strontium ranelate eluted at 4.4 minutes and, at 321 nm, showed no interference with the excipients used in the formulation. The method is linear in the range of 20-320 μg mL-1 (R2 = 0.99998). Recovery, tested in the range of 40-120 μg mL-1, was 96.1-102.1%. Intra-day and intermediate precision RSDs ranged from 1.0 to 1.4% and from 1.2 to 1.4%, respectively. The limit of detection and limit of quantitation were 0.06 and 0.20 μg mL-1, respectively. The technique is fast, cost-effective, reliable and reproducible, and is proposed for the routine analysis of strontium ranelate.
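LOD and LOQ figures like those above are commonly estimated from the calibration line using the ICH formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The Python sketch below shows the computation on hypothetical data points, not the study's measurements.

import numpy as np

conc = np.array([20, 70, 120, 170, 220, 270, 320], dtype=float)  # ug/mL
signal = np.array([0.101, 0.352, 0.601, 0.853, 1.102, 1.351, 1.603])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")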
Exploring the evolution of London's street network in the information space: A dual approach
NASA Astrophysics Data System (ADS)
Masucci, A. Paolo; Stanilov, Kiril; Batty, Michael
2014-01-01
We study the growth of London's street network in its dual representation, as the city has evolved over the past 224 years. The dual representation of a planar graph is a content-based network, where each node is a set of edges of the planar graph and represents a transportation unit in the so-called information space, i.e., the space where information is handled in order to navigate through the city. First, we discuss a novel hybrid technique to extract dual graphs from planar graphs, called the hierarchical intersection continuity negotiation principle. Then we show that the growth of the network can be analytically described by logistic laws and that the topological properties of the network are governed by robust log-normal distributions characterizing the network's connectivity and small-world properties that are consistent over time. Moreover, we find that the double-Pareto-like distributions for the connectivity emerge for major roads and can be modeled via a stochastic content-based network model using simple space-filling principles.
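The simplest edge-to-node ("dual") transformation is the line graph, sketched below with networkx on a toy planar grid. Note that the paper's hierarchical intersection continuity negotiation principle first merges edges into continuous streets, a step this sketch deliberately omits.

import networkx as nx

primal = nx.grid_2d_graph(4, 4)          # toy planar street network
dual = nx.line_graph(primal)             # each primal edge becomes a dual node

print(primal.number_of_edges(), "street segments ->",
      dual.number_of_nodes(), "dual nodes")
# Connectivity in the information space is read off the dual degrees.
print("dual degree distribution:", sorted(d for _, d in dual.degree()))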
High density lipoproteins: Measurement techniques and potential biomarkers of cardiovascular risk
Hafiane, Anouar; Genest, Jacques
2015-01-01
Plasma high density lipoprotein (HDL) comprises a heterogeneous family of lipoprotein species, differing in surface charge, size, and lipid and protein composition. While HDL cholesterol (HDL-C) mass is a strong, graded and coherent biomarker of cardiovascular risk, genetic and clinical trial data suggest that HDL-C itself may be neither causal in preventing atherosclerosis nor reflective of HDL functionality. Indeed, the measurement of HDL-C may be a biomarker of cardiovascular health. To assess HDL function as a potential therapeutic target, robust and simple analytical methods are required. The complex pleiotropic effects of HDL make the development of a single measurement challenging. Laboratory assays that accurately measure HDL function must be developed, validated and brought to high throughput for clinical purposes. This review discusses the limitations of current laboratory methods that separate and quantify HDL and their potential application to predicting CVD, with an emphasis on emergent approaches as potential biomarkers in clinical practice. PMID:26674734
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real time and presenting them in a manner that facilitates decision making. It provides cost savings by alerting and predicting when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gain insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and Big Data technologies, predictive algorithms and operational analytics are used to process the data and predict sensor degradations. The study uses data products that would commonly be analyzed at a site, and builds on a Big Data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
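The core alerting idea, projecting a degradation trend forward to estimate when it crosses a critical threshold, can be sketched as follows. The data are synthetic and the logic is a deliberately simplified stand-in for the study's architecture.

import numpy as np

rng = np.random.default_rng(2)
t = np.arange(500.0)
# Synthetic sensor health metric with slow linear decay plus noise.
health = 1.0 - 0.0012 * t + rng.normal(scale=0.02, size=t.size)
threshold = 0.5                          # hypothetical critical level

# Fit the recent trend and extrapolate to estimate time-to-threshold.
recent = slice(-100, None)
slope, intercept = np.polyfit(t[recent], health[recent], 1)
if slope < 0:
    t_cross = (threshold - intercept) / slope
    print(f"alert: projected threshold crossing at t = {t_cross:.0f}")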
Cortez, Juliana; Pasquini, Celio
2013-02-05
The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited as a simple yet highly efficient and green procedure for analyte preconcentration prior to determination by currently available microanalytical techniques. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is held in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center toward a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, about 2.0 cm in diameter, is formed on the paper, containing most of the analytes originally present in the sample volume. Preconcentration factors can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure were evaluated for concentrating Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour using the microanalytical technique of laser-induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)% were obtained for Na, Fe, and Cu, respectively, in fuel ethanol. The technique, coupled to modern microanalytical and multianalyte methods, can be anticipated to apply to several analytical problems requiring analyte preconcentration and/or sample stabilization.
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
NASA Astrophysics Data System (ADS)
Yazdchi, K.; Salehi, M.; Shokrieh, M. M.
2009-03-01
A new simplified 3D representative volume element for wavy carbon nanotubes is introduced, and an analytical model is developed to study stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in good agreement with corresponding results for straight nanotubes.
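For orientation, the following sketch evaluates the classical shear-lag (Cox-type) stress-transfer profile for a straight fiber: axial stress peaks at midspan and interfacial shear peaks at the fiber ends. The paper's model extends this picture to wavy nanotubes, and those waviness terms are not reproduced here; all numbers are illustrative, normalized units.

import numpy as np

E_f, strain, r, L, beta = 1000.0, 0.01, 1.0, 100.0, 0.08   # normalized units
x = np.linspace(-L / 2, L / 2, 201)

# Cox shear-lag: axial fiber stress and interfacial shear along the fiber.
sigma = E_f * strain * (1 - np.cosh(beta * x) / np.cosh(beta * L / 2))
tau = 0.5 * r * E_f * strain * beta * np.sinh(beta * x) / np.cosh(beta * L / 2)

print(f"max axial stress: {sigma.max():.1f} (at midspan), "
      f"max interfacial shear: {abs(tau).max():.2f} (at the ends)")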
The identification and characterization of genetic and environmental factors that predict common, complex disease is a major goal of human genetics. The ubiquitous nature of epistatic interaction in the underlying genetic etiology of such disease presents a difficult analytical ...
Partners in research exceed the sum of the parts: partners > parts
USDA-ARS?s Scientific Manuscript database
The overriding goal of analytical chemistry research has always been and will always be the same: develop and validate approaches to achieve the needed quality of results that fit the purpose of the analysis in the fastest, easiest, safest, most economical, robust, and environmentally-friendly way ...