Sample records for improved analytical methods

  1. Major advances in testing of dairy products: milk component and dairy product attribute testing.

    PubMed

    Barbano, D M; Lynch, J M

    2006-04-01

    Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.

  2. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat the complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.

  3. Improved full analytical polygon-based method using Fourier analysis of the three-dimensional affine transformation.

    PubMed

    Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia

    2014-03-01

    Previous research [Appl. Opt. 52, A290 (2013)] revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue that research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of the primitive and arbitrary triangles, together with the pseudo-inverse matrix, were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. The algorithm discards low-level, angular-dependent computations. To add diffusive reflection to each arbitrary surface, we also propose a whole-matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate the shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed over the conventional full analytical approach. Optical experimental results demonstrate that the proposed method can effectively reconstruct three-dimensional scenes.
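    A minimal sketch of the core geometric step described above, assuming a flat two-dimensional setting and illustrative vertex coordinates: the affine matrix relating a primitive triangle to an arbitrary triangle is recovered from stacked homogeneous vertex vectors via the Moore-Penrose pseudo-inverse. The published method additionally propagates the primitive triangle's spectrum through this matrix, which is not reproduced here.

    ```python
    import numpy as np

    def affine_from_triangles(primitive, target):
        """Return the 3x3 matrix A with A @ p = t for each homogeneous
        vertex pair (illustrative sketch; the paper works with 3-D vertex
        vectors and uses A together with the primitive spectrum)."""
        P = np.vstack([np.asarray(primitive, float).T, np.ones(3)])  # 3 x 3
        T = np.vstack([np.asarray(target, float).T, np.ones(3)])     # 3 x 3
        return T @ np.linalg.pinv(P)

    # Unit right triangle mapped onto an arbitrary triangle (made-up points).
    primitive = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
    target = [(2.0, 1.0), (4.0, 1.5), (2.5, 3.0)]
    A = affine_from_triangles(primitive, target)
    print(np.allclose(A @ np.array([1.0, 0.0, 1.0]), [4.0, 1.5, 1.0]))  # True
    ```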

  4. Interactive Management and Updating of Spatial Data Bases

    NASA Technical Reports Server (NTRS)

    French, P.; Taylor, M.

    1982-01-01

    The decision-making process, whether for power plant siting, load forecasting or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by implementing techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer-aided planning (CAP) programs and the selection of a predominant data structure can improve the decision-making process is discussed.

  5. Performance Evaluation of an Improved GC-MS Method to Quantify Methylmercury in Fish.

    PubMed

    Watanabe, Takahiro; Kikuchi, Hiroyuki; Matsuda, Rieko; Hayashi, Tomoko; Akaki, Koichi; Teshima, Reiko

    2015-01-01

    Here, we set out to improve our previously developed methylmercury analytical method, involving phenyl derivatization and gas chromatography-mass spectrometry (GC-MS). In the improved method, phenylation of methylmercury with sodium tetraphenylborate was carried out in a toluene/water two-phase system, instead of in water alone. The modification enabled derivatization at optimum pH, and the formation of by-products was dramatically reduced. In addition, adsorption of methyl phenyl mercury in the GC system was suppressed by co-injection of PEG200, enabling continuous analysis without loss of sensitivity. The performance of the improved analytical method was independently evaluated by three analysts using certified reference materials and methylmercury-spiked fresh fish samples. The present analytical method was validated as suitable for determination of compliance with the provisional regulation value for methylmercury in fish, set in the Food Sanitation Law.

  6. An improved 3D MoF method based on analytical partial derivatives

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Zhang, Xiong

    2016-12-01

    The MoF (Moment of Fluid) method is one of the most accurate approaches among the various surface reconstruction algorithms. Like other second-order methods, the MoF method needs to solve an implicit optimization problem to obtain the optimal approximate surface. Therefore, the partial derivatives of the objective function have to be involved during the iteration for efficiency and accuracy. However, to the best of our knowledge, the derivatives are currently estimated numerically by finite-difference approximation, because it is very difficult to obtain the analytical derivatives of the objective function for an implicit optimization problem. Employing numerical derivatives in an iteration not only increases the computational cost, but also deteriorates the convergence rate and robustness of the iteration due to their numerical error. In this paper, the analytical first-order partial derivatives of the objective function are deduced for 3D problems. The analytical derivatives can be calculated accurately, so they are incorporated into the MoF method to improve its accuracy, efficiency and robustness. Numerical studies show that by using the analytical derivatives the iterations converge in all mixed cells, with an efficiency improvement of 3 to 4 times.
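    The efficiency argument above is generic to gradient-based optimization. A minimal sketch, not the actual MoF objective (which integrates moments over the truncated cell): supplying an analytical gradient through SciPy's jac argument lets the optimizer avoid finite-difference derivative estimates. The interface-angle parameter and reference centroid below are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Stand-in objective: squared distance between a reference centroid and
    # the centroid implied by an interface-normal angle theta (illustrative
    # only; the real MoF objective depends on the cell geometry).
    x_ref = np.array([0.30, 0.55])

    def centroid(theta):
        return 0.5 + 0.25 * np.array([np.cos(theta), np.sin(theta)])

    def objective(theta):
        d = centroid(theta[0]) - x_ref
        return float(d @ d)

    def gradient(theta):
        # Analytical derivative of the objective with respect to theta.
        d = centroid(theta[0]) - x_ref
        dc = 0.25 * np.array([-np.sin(theta[0]), np.cos(theta[0])])
        return np.array([2.0 * d @ dc])

    # With jac=gradient the optimizer uses exact derivatives instead of
    # finite differences, which is the efficiency/robustness point above.
    res = minimize(objective, x0=[0.1], jac=gradient, method="BFGS")
    print(res.x, res.nit)
    ```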

  7. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    …reduce paperwork, and improve customer service. Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability…

  8. Improving LC-MS sensitivity through increases in chromatographic performance: comparisons of UPLC-ES/MS/MS to HPLC-ES/MS/MS.

    PubMed

    Churchwell, Mona I; Twaddle, Nathan C; Meeker, Larry R; Doerge, Daniel R

    2005-10-25

    Recent technological advances have made available reverse phase chromatographic media with a 1.7 μm particle size along with a liquid handling system that can operate such columns at much higher pressures. This technology, termed ultra performance liquid chromatography (UPLC), offers significant theoretical advantages in resolution, speed, and sensitivity for analytical determinations, particularly when coupled with mass spectrometers capable of high-speed acquisitions. This paper explores the differences in LC-MS performance by conducting a side-by-side comparison of UPLC for several methods previously optimized for HPLC-based separation and quantification of multiple analytes with maximum throughput. In general, UPLC produced significant improvements in method sensitivity, speed, and resolution. Sensitivity increases with UPLC, which were found to be analyte-dependent, were as large as 10-fold and improvements in method speed were as large as 5-fold under conditions of comparable peak separations. Improvements in chromatographic resolution with UPLC were apparent from generally narrower peak widths and from a separation of diastereomers not possible using HPLC. Overall, the improvements in LC-MS method sensitivity, speed, and resolution provided by UPLC show that further advances can be made in analytical methodology to add significant value to hypothesis-driven research.

  9. Synthesized airfoil data method for prediction of dynamic stall and unsteady airloads

    NASA Technical Reports Server (NTRS)

    Gangwani, S. T.

    1983-01-01

    A detailed analysis of dynamic stall experiments has led to a set of relatively compact analytical expressions, called synthesized unsteady airfoil data, which accurately describe in the time-domain the unsteady aerodynamic characteristics of stalled airfoils. An analytical research program was conducted to expand and improve this synthesized unsteady airfoil data method using additional available sets of unsteady airfoil data. The primary objectives were to reduce these data to synthesized form for use in rotor airload prediction analyses and to generalize the results. Unsteady drag data were synthesized which provided the basis for successful expansion of the formulation to include computation of the unsteady pressure drag of airfoils and rotor blades. Also, an improved prediction model for airfoil flow reattachment was incorporated in the method. Application of this improved unsteady aerodynamics model has resulted in an improved correlation between analytic predictions and measured full scale helicopter blade loads and stress data.

  10. System identification of analytical models of damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J.-S.; Chen, S.-Y.; Berman, A.

    1984-01-01

    A procedure is presented for identifying a linear, nonproportionally damped system. The system damping is assumed to be representable by a real symmetric matrix. Analytical mass, stiffness and damping matrices which constitute an approximate representation of the system are assumed to be available. Also given is an incomplete set of measured natural frequencies, damping ratios and complex mode shapes of the structure, normally obtained from test data. A method is developed to find the smallest changes in the analytical model so that the improved model can exactly predict the measured modal parameters. The present method uses the orthogonality relationships to improve the mass and damping matrices and the dynamic equation to find the improved stiffness matrix.
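    A toy illustration of the "smallest change" idea on a 2-DOF undamped system, assuming made-up matrices and one measured eigenpair; the minimum-norm least-squares correction below is only a proxy for the formulation in the paper, which also updates the mass and damping matrices via the orthogonality relationships.

    ```python
    import numpy as np

    # Toy 2-DOF system: analytical mass and stiffness matrices (made up).
    M = np.diag([1.0, 2.0])
    K = np.array([[400.0, -200.0],
                  [-200.0, 300.0]])

    # One "measured" eigenpair (eigenvalue = omega^2, mode shape) that the
    # analytical model does not reproduce exactly.
    lam_meas = 95.0
    phi_meas = np.array([0.62, 0.55])

    # Residual the correction must absorb: (K + dK) phi = lam M phi.
    r = lam_meas * M @ phi_meas - K @ phi_meas

    # Parameterise a symmetric correction dK = [[a, b], [b, c]] and solve the
    # underdetermined system with lstsq; its minimum-norm solution serves as
    # a proxy for the "smallest change" sought by the method described above.
    A = np.array([[phi_meas[0], phi_meas[1], 0.0],
                  [0.0, phi_meas[0], phi_meas[1]]])
    a, b, c = np.linalg.lstsq(A, r, rcond=None)[0]
    dK = np.array([[a, b], [b, c]])

    K_improved = K + dK
    print(np.allclose(K_improved @ phi_meas, lam_meas * M @ phi_meas))  # True
    ```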

  11. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    ERIC Educational Resources Information Center

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  12. Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources

    NASA Astrophysics Data System (ADS)

    Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi

    2017-01-01

    Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, the correlation among different random power injections (such as wind power) makes PPF difficult to calculate. Monte Carlo simulation (MCS) and analytical methods are the two approaches commonly used to solve PPF. MCS has high accuracy but is very time consuming. Analytical methods such as the cumulants method (CM) have high computing efficiency, but calculating the cumulants is not convenient when the wind power output does not follow any typical distribution, especially when correlated wind sources are considered. In this paper, an Improved Monte Carlo Simulation method (IMCS) is proposed. The joint empirical distribution is applied to model the different wind power outputs. This method combines the advantages of both MCS and analytical methods. It not only has high computing efficiency, but also provides solutions with sufficient accuracy, which makes it very suitable for on-line analysis.
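    A sketch of one way to realise a joint empirical distribution for correlated wind injections, assuming placeholder historical data, an assumed correlation value and a Gaussian copula; the power-flow evaluation is a stand-in function, not an actual network solution.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # Historical output of two wind farms (placeholder data) defines the
    # empirical marginals; rho is their assumed correlation.
    hist_w1 = rng.weibull(2.0, 5000) * 40.0   # MW
    hist_w2 = rng.weibull(2.2, 5000) * 60.0   # MW
    rho = 0.7

    def sample_correlated_wind(n):
        """Gaussian-copula sampling that keeps each farm's empirical marginal
        while imposing the target correlation (one way to realise a joint
        empirical distribution; not necessarily the paper's construction)."""
        L = np.linalg.cholesky([[1.0, rho], [rho, 1.0]])
        z = rng.standard_normal((n, 2)) @ L.T      # correlated normals
        u = norm.cdf(z)                            # correlated uniforms
        w1 = np.quantile(hist_w1, u[:, 0])         # empirical inverse CDF
        w2 = np.quantile(hist_w2, u[:, 1])
        return w1, w2

    def power_flow(w1, w2):
        # Placeholder for a real AC power-flow solve; returns a line loading.
        return 0.8 * w1 + 0.5 * w2

    w1, w2 = sample_correlated_wind(10000)
    loading = power_flow(w1, w2)
    print(np.corrcoef(w1, w2)[0, 1], np.percentile(loading, 95))
    ```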

  13. Analyte sensing mediated by adapter/carrier molecules

    DOEpatents

    Bayley, Hagan; Braha, Orit; Gu, LiQun

    2002-07-30

    This invention relates to an improved method and system for sensing of one or more analytes. A host molecule, which serves as an adapter/carrier, is used to facilitate interaction between the analyte and the sensor element. A detectable signal is produced reflecting the identity and concentration of analyte present.

  14. Improvement of analytical dynamic models using modal test data

    NASA Technical Reports Server (NTRS)

    Berman, A.; Wei, F. S.; Rao, K. V.

    1980-01-01

    A method developed to determine minimum changes in analytical mass and stiffness matrices to make them consistent with a set of measured normal modes and natural frequencies is presented. The corrected model will be an improved base for studies of physical changes, boundary condition changes, and for prediction of forced responses. The method features efficient procedures that do not require solution of the eigenvalue problem, and the ability to handle more degrees of freedom than the test data. In addition, modal displacements are obtained for all analytical degrees of freedom, and the frequency dependence of the coordinate transformations is properly treated.

  15. High resolution x-ray CMT: Reconstruction methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J.K.

    This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
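    A minimal sketch of the iterative mechanism contrasted with analytic inversion above, using a Landweber-type update on a toy system matrix; the real CT system matrix is far larger and geometry-dependent, and the data below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy imaging model: projections b = A @ x_true + noise, with A standing
    # in for the (much larger, geometry-dependent) CT system matrix.
    A = rng.random((60, 25))
    x_true = rng.random(25)
    b = A @ x_true + 0.01 * rng.standard_normal(60)

    # Landweber iteration: repeatedly correct the estimate with the
    # back-projected residual, the "iteratively improve image estimates"
    # mechanism contrasted with analytic (direct-inversion) algorithms.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(25)
    for _ in range(500):
        x += step * A.T @ (b - A @ x)

    print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```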

  16. IMPROVED METHOD FOR THE STORAGE OF GROUND WATER SAMPLES CONTAINING VOLATILE ORGANIC ANALYTES

    EPA Science Inventory

    The sorption of volatile organic analytes from water samples by the Teflon septum surface used with standard glass 40-ml sample collection vials was investigated. Analytes tested included alkanes, isoalkanes, olefins, cycloalkanes, a cycloalkene, monoaromatics, a polynuclear arom...

  17. An improved method for predicting the lightning performance of high and extra-high-voltage substation shielding

    NASA Astrophysics Data System (ADS)

    Vinh, T.

    1980-08-01

    There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering usage. In this study, analytical lightning models were developed along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra-high-voltage substations from direct strokes.

  18. Non-enzymatic browning in citrus juice: chemical markers, their detection and ways to improve product quality.

    PubMed

    Bharate, Sonali S; Bharate, Sandip B

    2014-10-01

    Citrus juices are widely consumed due to their nutritional benefits and variety of pharmacological properties. Non-enzymatic browning (NEB) is one of the most important chemical reactions responsible for quality and color changes during the heating or prolonged storage of citrus products. The present review covers various aspects of NEB in citrus juice, viz. the chemistry of NEB, identifiable markers of NEB, analytical methods to identify NEB markers, and ways to improve the quality of citrus juice. 2,5-Dimethyl-4-hydroxy-3(2H)-furanone (DMHF) is one of the most promising markers formed during the browning process, with a number of analytical methods reported for its analysis; it can therefore be used as an indicator of the NEB process. Amongst the analytical methods reported, RP-HPLC is the more sensitive and accurate method and can be used as a routine analytical tool. NEB can be prevented by removal of amino acids/proteins (via ion exchange treatment) or by targeting NEB reactions (e.g. blockage of furfural/HMF by sulphiting agents).

  19. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education through the realization of variables capable of enhancing human perception and cognition of complex curriculum data. The positive results derived from our evaluation of a medical curriculum, albeit at a small scale, signify the need to expand this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  20. Analysis of polymeric phenolics in red wines using different techniques combined with gel permeation chromatography fractionation.

    PubMed

    Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén

    2006-04-21

    A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.

  1. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  2. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  3. Review of recent advances in analytical techniques for the determination of neurotransmitters

    PubMed Central

    Perry, Maura; Li, Qiang; Kennedy, Robert T.

    2009-01-01

    Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitters and related compounds may be monitored by either in vivo sampling coupled to analytical methods or implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages of spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single cells and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472

  4. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  5. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  6. Analysis of a virtual memory model for maintaining database views

    NASA Technical Reports Server (NTRS)

    Kinsley, Kathryn C.; Hughes, Charles E.

    1992-01-01

    This paper presents an analytical model for predicting the performance of a new support strategy for database views. This strategy, called the virtual method, is compared with traditional methods for supporting views. The analytical model's predictions of improved performance by the virtual method are then validated by comparing these results with those achieved in an experimental implementation.

  7. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355

  8. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.
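    The spiking strategy above is closely related to classical standard addition. A minimal quantitation sketch under that classical model, with invented response values: fit signal versus added amount and read the endogenous amount off the x-intercept (assumes a linear response and negligible background).

    ```python
    import numpy as np

    # Signal (arbitrary units) measured after spiking known exogenous amounts
    # of the target analyte into aliquots of the same sample (made-up values).
    added = np.array([0.0, 5.0, 10.0, 20.0])         # e.g. fmol spiked
    signal = np.array([210.0, 390.0, 585.0, 940.0])  # observed responses

    # Linear fit signal = slope*added + intercept; with a linear response the
    # magnitude of the x-intercept (intercept/slope) estimates the endogenous
    # amount already present in the sample.
    slope, intercept = np.polyfit(added, signal, 1)
    print(f"estimated endogenous analyte ~ {intercept / slope:.1f} fmol")
    ```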

  9. Learning Analytics in Higher Education Development: A Roadmap

    ERIC Educational Resources Information Center

    Adejo, Olugbenga; Connolly, Thomas

    2017-01-01

    The increase in education data and advance in technology are bringing about enhanced teaching and learning methodology. The emerging field of Learning Analytics (LA) continues to seek ways to improve the different methods of gathering, analysing, managing and presenting learners' data with the sole aim of using it to improve the student learning…

  10. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to a quantitative analytical chemistry course, referred to as the "Analytical Chemistry II" course, especially the part related to essential oil analysis. The learning outcomes of this course include understanding of the lectures, the skills to apply the course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent tasks and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts that have been studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course has been proven to improve students' knowledge, skills, ability and attitude. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand material and problems in English.

  11. Method for improving the limit of detection in a data signal

    DOEpatents

    Synovec, Robert E.; Yueng, Edward S.

    1989-10-17

    A method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal.
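    A sketch of the described steps on a synthetic trace, with all signal, noise and baseline values invented: centre the baseline noise about zero, then integrate; the correlated analytical signal accumulates while the zero-mean noise largely cancels.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic chromatogram-like trace: a small Gaussian peak riding on a
    # noisy, offset baseline (all values illustrative).
    t = np.arange(1000)
    peak = 0.05 * np.exp(-0.5 * ((t - 500) / 10.0) ** 2)
    trace = 0.2 + 0.01 * rng.standard_normal(t.size) + peak

    # Step 1: designate a baseline region and centre the noise about zero.
    centred = trace - trace[:300].mean()

    # Step 2: integrate (cumulative sum); the correlated analyte signal
    # accumulates while the zero-mean noise tends to cancel.
    print(np.cumsum(trace)[-1], np.cumsum(centred)[-1])
    # Without centring, the 0.2 baseline offset (~200 counts) swamps the peak
    # area (~1.25); after centring, the integral is dominated by the analyte.
    ```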

  12. Method for improving the limit of detection in a data signal

    DOEpatents

    Synovec, R.E.; Yueng, E.S.

    1989-10-17

    Disclosed is a method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal. 8 figs.

  13. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.

  14. Regarding on the prototype solutions for the nonlinear fractional-order biological population model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baskonus, Haci Mehmet, E-mail: hmbaskonus@gmail.com; Bulut, Hasan

    2016-06-08

    In this study, we present a newly extended method, called the Improved Bernoulli sub-equation function method, based on the Bernoulli Sub-ODE method. The proposed analytical scheme is laid out in explicit steps. We obtain some new analytical solutions to the nonlinear fractional-order biological population model by using this technique. Two- and three-dimensional surfaces of the analytical solutions have been drawn with Wolfram Mathematica 9. Finally, a conclusion is presented, highlighting the important findings of this study.

  15. PARTNERING TO IMPROVE HUMAN EXPOSURE METHODS

    EPA Science Inventory

    Methods development research is an application-driven scientific area that addresses programmatic needs. The goals are to reduce measurement uncertainties, address data gaps, and improve existing analytical procedures for estimating human exposures. Partnerships have been develop...

  16. Improved Design of Tunnel Supports : Executive Summary

    DOT National Transportation Integrated Search

    1979-12-01

    This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...

  17. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  18. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies

    PubMed Central

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-01-01

    Background: The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. Objective: To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods: The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results: Application of the visual analytic and visualisation platform is presented as a solution for improved access to heterogeneous data sources, enhanced data exploration and analysis, effective data communication, and decision-making support. Conclusions: Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006

  19. Quantitative interference by cysteine and N-acetylcysteine metabolites during the LC-MS/MS bioanalysis of a small molecule.

    PubMed

    Barricklow, Jason; Ryder, Tim F; Furlong, Michael T

    2009-08-01

    During LC-MS/MS quantification of a small molecule in human urine samples from a clinical study, an unexpected peak was observed to nearly co-elute with the analyte of interest in many study samples. Improved chromatographic resolution revealed the presence of at least 3 non-analyte peaks, which were identified as cysteine metabolites and N-acetyl (mercapturic acid) derivatives thereof. These metabolites produced artifact responses in the parent compound MRM channel due to decomposition in the ionization source of the mass spectrometer. Quantitative comparison of the analyte concentrations in study samples using the original chromatographic method and the improved chromatographic separation method demonstrated that the original method substantially over-estimated the analyte concentration in many cases. The substitution of electrospray ionization (ESI) for atmospheric pressure chemical ionization (APCI) nearly eliminated the source instability of these metabolites, which would have mitigated their interference in the quantification of the analyte, even without chromatographic separation. These results 1) demonstrate the potential for thiol metabolite interferences during the quantification of small molecules in pharmacokinetic samples, and 2) underscore the need to carefully evaluate LC-MS/MS methods for molecules that can undergo metabolism to thiol adducts to ensure that they are not susceptible to such interferences during quantification.

  20. Analytical and Numerical Studies of Active and Passive Microwave Ocean Remote Sensing

    DTIC Science & Technology

    2001-09-30

    …of both analytical and efficient numerical methods for electromagnetics and hydrodynamics. New insights regarding these phenomena can then be applied to improve microwave active and passive remote sensing of the ocean surface.

  1. Annual banned-substance review: Analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans

    2018-01-01

    Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Analytical strategy for the determination of non-steroidal anti-inflammatory drugs in plasma and improved analytical strategy for the determination of authorized and non-authorized non-steroidal anti-inflammatory drugs in milk by LC-MS/MS.

    PubMed

    Dowling, Geraldine; Malone, Edward; Harbison, Tom; Martin, Sheila

    2010-07-01

    A sensitive and selective method for the determination of six non-steroidal anti-inflammatory drugs (NSAIDs) in bovine plasma was developed. An improved method for the determination of authorized and non-authorized residues of 10 NSAIDs in milk was also developed. Analytes were separated and acquired by high-performance liquid chromatography coupled with an electrospray ionisation tandem mass spectrometer (ESI-MS/MS). Target compounds were acidified in plasma, plasma and milk samples were extracted with acetonitrile, and both extracts were purified using an improved solid-phase extraction procedure utilising Evolute ABN cartridges. The accuracy of the methods for milk and plasma was between 73 and 109%. The precision of the methods for authorized and non-authorized NSAIDs in milk and plasma, expressed as % RSD for within-laboratory reproducibility, was less than 16%. The % RSD for authorized NSAIDs at their associated MRL(s) in milk was less than 10% for meloxicam, flunixin and tolfenamic acid, and less than 25% for hydroxy flunixin. The methods were validated according to Commission Decision 2002/657/EC.

  3. A Newton-Krylov method with an approximate analytical Jacobian for implicit solution of Navier-Stokes equations on staggered overset-curvilinear grids with immersed boundaries.

    PubMed

    Asgharzadeh, Hafez; Borazjani, Iman

    2017-02-15

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one to another depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future.
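    A minimal Jacobian-free Newton-Krylov sketch using SciPy on a stand-in 1-D nonlinear problem; the paper solves the incompressible Navier-Stokes equations on overset curvilinear grids and forms its preconditioner from the analytical Jacobian, neither of which is reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    # Stand-in problem: -u'' + u**3 = 1 on a uniform grid with zero boundary
    # values (illustrative only; not the flow solver described above).
    n = 100
    h = 1.0 / (n + 1)

    def residual(u):
        upad = np.concatenate(([0.0], u, [0.0]))
        lap = (upad[:-2] - 2.0 * upad[1:-1] + upad[2:]) / h**2
        return -lap + u**3 - 1.0

    # Jacobian-free Newton-Krylov: the Jacobian is never formed explicitly;
    # a preconditioner (e.g. built from an analytical Jacobian) could be
    # supplied through the inner_M argument.
    u = newton_krylov(residual, np.zeros(n), method="lgmres", f_tol=1e-8)
    print(np.max(np.abs(residual(u))))
    ```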

  4. A Newton–Krylov method with an approximate analytical Jacobian for implicit solution of Navier–Stokes equations on staggered overset-curvilinear grids with immersed boundaries

    PubMed Central

    Asgharzadeh, Hafez; Borazjani, Iman

    2016-01-01

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one to another depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future. PMID:28042172

  5. A Newton-Krylov method with an approximate analytical Jacobian for implicit solution of Navier-Stokes equations on staggered overset-curvilinear grids with immersed boundaries

    NASA Astrophysics Data System (ADS)

    Asgharzadeh, Hafez; Borazjani, Iman

    2017-02-01

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for non-linear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one to another depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future.

  6. Comparing an analytical spacetime metric for a merging binary to a fully nonlinear numerical evolution using curvature scalars

    NASA Astrophysics Data System (ADS)

    Sadiq, Jam; Zlochower, Yosef; Nakano, Hiroyuki

    2018-04-01

    We introduce a new geometrically invariant prescription for comparing two different spacetimes based on geodesic deviation. We use this method to compare a family of recently introduced analytical spacetimes representing inspiraling black-hole binaries to fully nonlinear numerical solutions to the Einstein equations. Our method can be used to improve analytical spacetime models by providing a local measure of the effects that violations of the Einstein equations will have on timelike geodesics and, indirectly, on gas dynamics. We also discuss the advantages and limitations of this method.

  7. A simplified analytic form for generation of axisymmetric plasma boundaries

    DOE PAGES

    Luce, Timothy C.

    2017-02-23

    An improved method has been formulated for generating analytic boundary shapes as input for axisymmetric MHD equilibria. This method uses the family of superellipses as the basis function, as previously introduced. The improvements are a simplified notation, reduction of the number of simultaneous nonlinear equations to be solved, and the realization that not all combinations of input parameters admit a solution to the nonlinear constraint equations. The method tests for the existence of a self-consistent solution and, when no solution exists, it uses a deterministic method to find a nearby solution. As a result, examples of generation of boundaries, including tests with an equilibrium solver, are given.
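
    As a rough illustration of the basis shape named in the abstract (not the paper's constrained formulation, whose shaping parameters and nonlinear constraint equations are not reproduced here), a superellipse |x/a|^n + |y/b|^n = 1 can be traced with the standard parametrization:

```python
# Rough illustration only: generate points on a superellipse
# |x/a|^n + |y/b|^n = 1, the basis shape mentioned in the abstract.
import numpy as np

def superellipse(a=2.0, b=1.0, n=2.5, npts=200):
    t = np.linspace(0.0, 2.0 * np.pi, npts)
    x = a * np.sign(np.cos(t)) * np.abs(np.cos(t)) ** (2.0 / n)
    y = b * np.sign(np.sin(t)) * np.abs(np.sin(t)) ** (2.0 / n)
    return x, y

x, y = superellipse()
# Check that every generated point satisfies the implicit equation.
assert np.allclose(np.abs(x / 2.0) ** 2.5 + np.abs(y / 1.0) ** 2.5, 1.0)
```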

  8. A simplified analytic form for generation of axisymmetric plasma boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luce, Timothy C.

    An improved method has been formulated for generating analytic boundary shapes as input for axisymmetric MHD equilibria. This method uses the family of superellipses as the basis function, as previously introduced. The improvements are a simplified notation, reduction of the number of simultaneous nonlinear equations to be solved, and the realization that not all combinations of input parameters admit a solution to the nonlinear constraint equations. The method tests for the existence of a self-consistent solution and, when no solution exists, it uses a deterministic method to find a nearby solution. As a result, examples of generation of boundaries, including tests with an equilibrium solver, are given.

  9. Compensating for Effects of Humidity on Electronic Noses

    NASA Technical Reports Server (NTRS)

    Homer, Margie; Ryan, Margaret A.; Manatt, Kenneth; Zhou, Hanying; Manfreda, Allison

    2004-01-01

    A method of compensating for the effects of humidity on the readouts of electronic noses has been devised and tested. The method is especially appropriate for use in environments in which humidity is not or cannot be controlled, for example, in the vicinity of a chemical spill, which can be accompanied by large local changes in humidity. Heretofore, it has been common practice to treat water vapor as merely another analyte, the concentration of which is determined, along with that of the other analytes, in a computational process based on deconvolution. This practice works well but leaves room for improvement: changes in humidity can give rise to large changes in electronic-nose responses. If corrections for humidity are not made, the large humidity-induced responses may swamp smaller responses associated with low concentrations of analytes. The present method offers an improvement. The underlying concept is simple: one augments an electronic nose with a separate humidity sensor and a separate temperature sensor. The outputs of the humidity and temperature sensors are used to generate values that are subtracted from the readings of the other sensors in the electronic nose to correct for the temperature-dependent contributions of humidity to those readings. Hence, in principle, what remains after correction is the contribution of the analytes only. Laboratory experiments on a first-generation electronic nose have shown that this method is effective and improves the success rate of identification of analyte/water mixtures. Work on a second-generation device was in progress at the time of reporting the information for this article.
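
    A minimal sketch of the correction idea follows. It assumes a simple linear, temperature-dependent humidity response per chemical sensor; the functional form and all coefficients are assumptions for illustration, not the calibration model used on the actual hardware.

```python
# Hedged sketch of the compensation idea: each chemical sensor's reading is
# corrected by subtracting its (assumed linear, temperature-dependent)
# response to water vapor alone. The per-sensor coefficients a, b, c would
# come from humidity-only calibration runs; the values below are made up.
import numpy as np

# Assumed calibration model per sensor: R_humidity = a + b*RH + c*RH*T
calib = np.array([[0.02, 0.015, 0.0004],    # sensor 1: a, b, c
                  [0.01, 0.009, 0.0002],    # sensor 2
                  [0.05, 0.021, 0.0007]])   # sensor 3

def compensate(raw, rh, temp_c):
    """Subtract the predicted humidity contribution from raw sensor readings."""
    a, b, c = calib[:, 0], calib[:, 1], calib[:, 2]
    humidity_response = a + b * rh + c * rh * temp_c
    return raw - humidity_response

raw_readings = np.array([0.90, 0.42, 1.30])   # example raw responses
print(compensate(raw_readings, rh=45.0, temp_c=23.0))
```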

  10. An Improved Method of AGM for High Precision Geolocation of SAR Images

    NASA Astrophysics Data System (ADS)

    Zhou, G.; He, C.; Yue, T.; Huang, W.; Huang, Y.; Li, X.; Chen, Y.

    2018-05-01

    In order to take full advantage of SAR images, it is necessary to obtain high precision geolocation of the image. During the geometric correction of images, precise image geolocation is important to ensure the accuracy of the correction and to extract effective mapping information from the images. This paper presents an improved analytical geolocation method (IAGM) that determines the high precision geolocation of each pixel in a digital SAR image. The method is based on the analytical geolocation method (AGM) proposed by X. K. Yuan, which aims at solving the RD model. Tests were conducted using a RADARSAT-2 SAR image. Comparing the predicted feature geolocation with the position determined from a high precision orthophoto, the results indicate that an accuracy of 50 m is attainable with this method. Error sources are analyzed and some recommendations for improving image location accuracy in future spaceborne SARs are given.

  11. Simple and Robust N-Glycan Analysis Based on Improved 2-Aminobenzoic Acid Labeling for Recombinant Therapeutic Glycoproteins.

    PubMed

    Jeong, Yeong Ran; Kim, Sun Young; Park, Young Sam; Lee, Gyun Min

    2018-03-21

    N-glycans of therapeutic glycoproteins are critical quality attributes that should be monitored throughout all stages of biopharmaceutical development. To reduce both the time for sample preparation and the variations in analytical results, we have developed an N-glycan analysis method that includes improved 2-aminobenzoic acid (2-AA) labeling to easily remove deglycosylated proteins. Using this analytical method, 15 major 2-AA-labeled N-glycans of Enbrel® were separated into single peaks in hydrophilic interaction chromatography mode and therefore could be quantitated. 2-AA-labeled N-glycans were also highly compatible with in-line quadrupole time-of-flight mass spectrometry (MS) for structural identification. The structures of 15 major and 18 minor N-glycans were identified from their mass values determined by quadrupole time-of-flight MS. Furthermore, the structures of 14 major N-glycans were confirmed by interpreting the MS/MS data of each N-glycan. This analytical method was also successfully applied to neutral N-glycans of Humira® and highly sialylated N-glycans of NESP®. Furthermore, the analysis data of Enbrel® that were accumulated for 2.5 years demonstrated the high-level consistency of this analytical method. Taken together, the results show that a wide repertoire of N-glycans of therapeutic glycoproteins can be analyzed with high efficiency and consistency using the improved 2-AA labeling-based N-glycan analysis method. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  12. Analytical N beam position monitor method

    NASA Astrophysics Data System (ADS)

    Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.

    2017-11-01

    Measurement and correction of focusing errors is of great importance for the performance and machine protection of circular accelerators. Furthermore, the LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β-function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.

  13. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    PubMed Central

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has long-lasting societal impact. PMID:27740470

  14. Generation of gas-phase ions from charged clusters: an important ionization step causing suppression of matrix and analyte ions in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Lou, Xianwen; van Dongen, Joost L J; Milroy, Lech-Gustav; Meijer, E W

    2016-12-30

    Ionization in matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a very complicated process. It has been reported that quaternary ammonium salts show extremely strong matrix and analyte suppression effects which cannot satisfactorily be explained by charge transfer reactions. Further investigation of the reasons causing these effects can be useful to improve our understanding of the MALDI process. The dried-droplet and modified thin-layer methods were used as sample preparation methods. In the dried-droplet method, analytes were co-crystallized with the matrix, whereas in the modified thin-layer method analytes were deposited on the surface of matrix crystals. Model compounds, tetrabutylammonium iodide ([N(Bu)4]I), cesium iodide (CsI), trihexylamine (THA) and polyethylene glycol 600 (PEG 600), were selected as the test analytes given their ability to generate exclusively pre-formed ions, protonated ions and metal ion adducts, respectively, in MALDI. The strong matrix suppression effect (MSE) observed using the dried-droplet method might disappear using the modified thin-layer method, which suggests that the incorporation of analytes in matrix crystals contributes to the MSE. By depositing analytes on the matrix surface instead of incorporating them in the matrix crystals, the competition for evaporation/ionization from charged matrix/analyte clusters could be weakened, resulting in reduced MSE. Further supporting evidence for this inference was found by studying the analyte suppression effect using the same two sample deposition methods. By comparing differences between the mass spectra obtained via the two sample preparation methods, we present evidence suggesting that the generation of gas-phase ions from charged matrix/analyte clusters may induce significant suppression of matrix and analyte ions. The results suggest that the generation of gas-phase ions from charged matrix/analyte clusters is an important ionization step in MALDI-MS. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Analytical concepts for health management systems of liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Williams, Richard; Tulpule, Sharayu; Hawman, Michael

    1990-01-01

    Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.

  16. Sample Size and Power Estimates for a Confirmatory Factor Analytic Model in Exercise and Sport: A Monte Carlo Approach

    ERIC Educational Resources Information Center

    Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying

    2011-01-01

    Monte Carlo methods can be used in data analytic situations (e.g., validity studies) to make decisions about sample size and to estimate power. The purpose of using Monte Carlo methods in a validity study is to improve the methodological approach within a study where the primary focus is on construct validity issues and not on advancing…
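
    The general Monte Carlo logic (simulate data at a candidate sample size, analyze each replicate, count rejections) can be sketched with a deliberately simple target. The example below uses a single correlation as a stand-in for the confirmatory factor analytic model, which in practice would be fit in dedicated SEM software; the sample sizes, effect size, and replicate count are illustrative.

```python
# Illustrates only the generic Monte Carlo power logic (simulate -> analyze ->
# count rejections), using a single correlation as a stand-in target; a real
# CFA power study would fit the factor model in SEM software instead.
import numpy as np
from scipy import stats

def mc_power(n, rho=0.3, alpha=0.05, n_rep=2000, seed=1):
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    hits = 0
    for _ in range(n_rep):
        x = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        r, p = stats.pearsonr(x[:, 0], x[:, 1])
        hits += p < alpha
    return hits / n_rep

for n in (50, 85, 120):
    print(n, round(mc_power(n), 3))   # empirical power at each candidate sample size
```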

  17. Improving the efficiency of quantitative (1)H NMR: an innovative external standard-internal reference approach.

    PubMed

    Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V

    2014-01-01

    The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte from the measurement of a solution containing both the analyte and a standard. Therefore, the standard must meet the requirements of chemical compatibility and lack of resonance interference with the analyte, as well as a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard with a known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating the matter of chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of the NMR tubes, must be kept the same; any deviations will compromise the accuracy of the results. An innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance must only be chemically compatible and free of resonance interference with the analyte or external standard, whereas the external standard must only be of a known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method significantly reduces the burden of searching for an appropriate standard for each analyte. Therefore, the efficiency of the qNMR purity assay increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
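
    The abstract does not give the working equation, but one plausible arrangement of the external standard-internal reference calculation is sketched below: the reference, added in the same quantity to the analyte and external-standard solutions, links the two separate measurements, so its own purity cancels. The proton numbers, molar masses, weighed masses, and integral ratios are illustrative.

```python
# Hedged arrangement of the purity calculation (the abstract does not state the
# exact equation): the internal reference, added in the same amount to the
# analyte tube and the external-standard tube, links the two measurements, so
# its own purity/concentration drops out. All numbers below are illustrative.
def qnmr_purity(ia_over_ir, is_over_ir,      # integral ratios vs. the reference
                n_a, n_s,                    # protons per quantified signal
                mw_a, mw_s,                  # molar masses (g/mol)
                m_a, m_s,                    # weighed masses (mg)
                purity_s):                   # known purity of the external standard
    return (ia_over_ir / is_over_ir) * (n_s / n_a) * (mw_a / mw_s) * (m_s / m_a) * purity_s

print(qnmr_purity(ia_over_ir=1.52, is_over_ir=1.60,
                  n_a=2, n_s=3, mw_a=250.3, mw_s=180.2,
                  m_a=10.1, m_s=9.8, purity_s=0.998))
```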

  18. ADVANCED EMISSIONS SPECIATION METHODOLOGIES FOR THE AUTO/OIL AIR QUALITY IMPROVEMENT RESEARCH PROGRAM - II. ALDEHYDES, KETONES, AND ALCOHOLS

    EPA Science Inventory

    Analytical methods for determining individual aldehyde, ketone, and alcohol emissions from gasoline-, methanol-, and variable-fueled vehicles are described. These methods were used in the Auto/Oil Air quality Improvement Research Program to provide emission data for comparison of...

  19. Lubrication and cooling for high speed gears

    NASA Technical Reports Server (NTRS)

    Townsend, D. P.

    1985-01-01

    The problems and failures occurring in the operation of high speed gears are discussed. The gearing losses associated with high speed gearing, such as tooth mesh friction, bearing friction, churning, and windage, are discussed, along with various ways to help reduce these losses and thereby improve efficiency. Several different methods of oil jet lubrication for high speed gearing are given, such as into-mesh, out-of-mesh, and radial jet lubrication. The experimental and analytical results for the various methods of oil jet lubrication are shown, with the strengths and weaknesses of each method discussed. The analytical and experimental results of gear lubrication and cooling at various test conditions are presented. These results show the clear need for improved methods of gear cooling at high speed and high load conditions.

  20. [The water content reference material of water saturated octanol].

    PubMed

    Wang, Haifeng; Ma, Kang; Zhang, Wei; Li, Zhanyuan

    2011-03-01

    The national standards for biofuels specify the technical specifications and analytical methods. A water content certified reference material based on water-saturated octanol was developed in order to satisfy the needs of instrument calibration and method validation and to assure the accuracy and consistency of results in water content measurements of biofuels. Three analytical methods based on different principles were employed to certify the water content of the reference material: Karl Fischer coulometric titration, Karl Fischer volumetric titration and quantitative nuclear magnetic resonance. The consistency of the coulometric and volumetric titrations was achieved through improvement of the methods. The accuracy of the certified result was improved by the introduction of the new method of quantitative nuclear magnetic resonance. Finally, the certified value of the reference material is 4.76% with an expanded uncertainty of 0.09%.

  1. DEVELOPMENT AND VALIDATION OF ANALYTICAL METHODS FOR ENUMERATION OF FECAL INDICATORS AND EMERGING CHEMICAL CONTAMINANTS IN BIOSOLIDS

    EPA Science Inventory

    In 2002 the National Research Council (NRC) issued a report which identified a number of issues regarding biosolids land application practices and pointed out the need for improved and validated analytical techniques for regulated indicator organisms and pathogens. They also call...

  2. A new frequency approach for light flicker evaluation in electric power systems

    NASA Astrophysics Data System (ADS)

    Feola, Luigi; Langella, Roberto; Testa, Alfredo

    2015-12-01

    In this paper, a new analytical estimator for light flicker in the frequency domain is proposed, which is able to take into account the frequency components neglected by the classical methods in the literature. The proposed analytical solutions apply to any generic stationary signal affected by interharmonic distortion. The proposed light flicker analytical estimator is applied to numerous numerical case studies with the goal of showing (i) the correctness and the improvements of the proposed analytical approach with respect to the other methods in the literature and (ii) the accuracy of the results compared to those obtained by means of the classical International Electrotechnical Commission (IEC) flickermeter. The usefulness of the proposed analytical approach is that it can be included in signal processing tools for interharmonic penetration studies for the integration of renewable energy sources in future smart grids.

  3. A new multi-step technique with differential transform method for analytical solution of some nonlinear variable delay differential equations.

    PubMed

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2016-01-01

    This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. This method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it possesses a simple procedure based on a few straightforward steps and can be combined with any analytical method other than the DTM, like the homotopy perturbation method.

  4. Back analysis of geomechanical parameters in underground engineering using artificial bee colony.

    PubMed

    Zhu, Changxing; Zhao, Hongbo; Zhao, Ming

    2014-01-01

    Accurate geomechanical parameters are critical in tunnel excavation, design, and support. In this paper, a displacement back analysis based on the artificial bee colony (ABC) algorithm is proposed to identify geomechanical parameters from monitored displacements. ABC was used as a global optimization algorithm to search for the unknown geomechanical parameters in problems with an analytical solution. For problems without an analytical solution, optimization-based back analysis is time-consuming, so a least squares support vector machine (LSSVM) was used to build the relationship between the unknown geomechanical parameters and the displacements and to improve the efficiency of the back analysis. The proposed method was applied to a tunnel with an analytical solution and a tunnel without one. The results show that the proposed method is feasible.
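
    A compact stand-in for the displacement back analysis is sketched below. It deliberately substitutes SciPy's differential evolution for the artificial bee colony used in the paper, and uses the classical elastic solution for a circular tunnel in a hydrostatic stress field, u(r) = p0*a^2*(1+nu)/(E*r), as the forward model with a single unknown parameter; all values are illustrative.

```python
# Stand-in sketch: the paper couples an artificial bee colony (ABC) with an
# analytical solution; here SciPy's differential_evolution plays the role of
# the global optimizer (a deliberate substitution). A single parameter, the
# rock mass modulus E, is identified from "monitored" radial displacements of
# a circular tunnel via the classical elastic solution
# u(r) = p0 * a**2 * (1 + nu) / (E * r); all numbers are illustrative.
import numpy as np
from scipy.optimize import differential_evolution

a, nu, p0 = 3.0, 0.25, 5.0              # radius (m), Poisson ratio, in-situ stress (MPa)
r_obs = np.array([3.0, 4.0, 6.0])       # measurement radii (m)

def forward(E_mpa):
    return p0 * a**2 * (1.0 + nu) / (E_mpa * r_obs)

rng = np.random.default_rng(0)
u_obs = forward(2000.0) + rng.normal(0.0, 1e-4, r_obs.size)   # synthetic monitoring data

def misfit(x):
    return float(np.sum((forward(x[0]) - u_obs) ** 2))

res = differential_evolution(misfit, bounds=[(500.0, 5000.0)], seed=1)
print("back-analysed E (MPa):", res.x[0])
```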

  5. Single excitation-emission fluorescence spectrum (EEF) for determination of cetane improver in diesel fuel.

    PubMed

    Insausti, Matías; Fernández Band, Beatriz S

    2015-04-05

    A highly sensitive spectrofluorimetric method has been developed for the determination of 2-ethylhexyl nitrate in diesel fuel. This compound is commonly used as an additive to improve the cetane number. The analytical method consists of building the chemometric model as a first step. Then, it is possible to quantify the analyte by recording only a single excitation-emission fluorescence spectrum (EEF), whose data are introduced into the chemometric model mentioned above. Another important characteristic of this method is that the fuel sample was used without any pre-treatment for EEF. This work provides an interesting improvement to fluorescence techniques using the rapid and easily applicable EEF approach to analyze such complex matrices. Exploiting EEF was the key to a successful determination, obtaining a detection limit of 0.00434% (v/v) and a limit of quantification of 0.01446% (v/v). Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    NASA Astrophysics Data System (ADS)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improve image quality or reduce radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A fixed size and contrast lesion was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal performance criterion to derive the potential dose reduction factor of IR. In general, there was good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than with RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
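
    The CHO mechanics described above can be sketched on synthetic data (the XCAT phantom and vendor reconstructions are not reproducible here): channelize each image with a few rotationally symmetric channels, build the Hotelling template from training statistics, score test images, and estimate the AUC by pairwise comparison of the scores. The channel widths, noise model, and signal amplitude are assumptions.

```python
# Minimal CHO sketch on synthetic data (not the paper's XCAT/SAFIRE study):
# channelize images with rotationally symmetric Gaussian channels, form the
# Hotelling template from training statistics, and compute AUC on test scores.
import numpy as np

rng = np.random.default_rng(0)
npix, n_train, n_test = 32, 200, 200
yy, xx = np.mgrid[:npix, :npix]
r2 = (xx - npix / 2) ** 2 + (yy - npix / 2) ** 2
signal = 0.6 * np.exp(-r2 / (2 * 3.0 ** 2))            # known, weak Gaussian lesion

# Three rotationally symmetric Gaussian channels of different widths.
channels = np.stack([np.exp(-r2 / (2 * s ** 2)).ravel() for s in (2.0, 4.0, 8.0)], axis=1)

def make(n, with_signal):
    imgs = rng.normal(0.0, 1.0, (n, npix, npix))        # white-noise background stand-in
    if with_signal:
        imgs += signal
    return imgs.reshape(n, -1) @ channels                # channel outputs, shape (n, 3)

v_sig, v_bkg = make(n_train, True), make(n_train, False)
S = 0.5 * (np.cov(v_sig, rowvar=False) + np.cov(v_bkg, rowvar=False))
w = np.linalg.solve(S, v_sig.mean(0) - v_bkg.mean(0))   # Hotelling template in channel space

t_sig, t_bkg = make(n_test, True) @ w, make(n_test, False) @ w
auc = (t_sig[:, None] > t_bkg[None, :]).mean()          # pairwise (Mann-Whitney) AUC
print("CHO AUC:", round(float(auc), 3))
```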

  7. Determination of acrylamide in various food matrices: evaluation of LC and GC mass spectrometric methods.

    PubMed

    Becalski, Adam; Lau, Benjamin P Y; Lewis, David; Seaman, Stephen W; Sun, Wing F

    2005-01-01

    Recent concerns surrounding the presence of acrylamide in many types of thermally processed food have brought about the need for the development of analytical methods suitable for determination of acrylamide in diverse matrices, with the goals of improving overall confidence in analytical results and better understanding of method capabilities. Consequently, the results are presented of acrylamide testing in commercially available food products (potato fries, potato chips, crispbread, instant coffee, coffee beans, cocoa, chocolate and peanut butter) obtained by using the same sample extract. The results obtained by using LC-MS/MS, GC/MS (EI) and GC/HRMS (EI), with or without derivatization, and with different analytical columns, are discussed and compared with respect to matrix-borne interferences, detection limits and method complexities.

  8. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  9. The decline of soil due to the pile of highway project Medan-Kualanamu (STA 35 + 901) with the finite element method

    NASA Astrophysics Data System (ADS)

    Hastuty, I. P.; Roesyanto; Sihite, A. B.

    2018-02-01

    Consolidation is the process of discharge of water from the ground through the pore spaces. Consolidation occurs in soft or unstable soil, which can be improved in order to make it more stable. The use of Prefabricated Vertical Drains (PVD) is one way to improve unstable soils; a PVD works like a sand column that drains water vertically. This study aims to determine the settlement, pore water pressure, and consolidation rate with and without PVD, both analytically and with the finite element method, which govern the time required for the soil to reach 90% consolidation, i.e., the point at which the soil effectively stops settling. Based on the analytical calculation, the settlement obtained is 0.47 m, whereas the result calculated with the finite element method is 0.45 m. The consolidation time obtained from the analytical calculation is 19 days with PVD and 115 days without PVD; from the finite element method it is 63 days with PVD and 110 days without PVD. The pore water pressure is 0.92 kN/m2.
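
    For orientation, the classical closed-form estimates behind such comparisons can be written down directly: Terzaghi's one-dimensional theory gives the time to 90% consolidation without drains (T_v of about 0.848), and Barron's equal-strain radial theory gives it with vertical drains, U_r = 1 - exp(-8 T_r / F(n)). The soil and drain parameters below are assumptions for illustration, not the values of the Medan-Kualanamu project.

```python
# Hedged example: classical consolidation-time estimates of the kind compared
# in the abstract. Soil and drain parameters below are assumptions, not the
# project values; only the formulas (Terzaghi vertical, Barron radial) are standard.
import math

# Assumed parameters
cv = 0.07            # vertical coefficient of consolidation (m^2/day)
ch = 2.0 * cv        # horizontal coefficient of consolidation (m^2/day)
H_dr = 3.0           # vertical drainage path length (m), single drainage
d_e = 1.13 * 1.2     # equivalent influence diameter for a 1.2 m square PVD grid (m)
d_w = 0.066          # equivalent drain diameter (m)

# Without PVD: Terzaghi 1-D theory, T_v ~ 0.848 for U = 90 %
t_no_pvd = 0.848 * H_dr**2 / cv

# With PVD: Barron's radial consolidation, U_r = 1 - exp(-8 T_r / F(n))
n = d_e / d_w
F = (n**2 / (n**2 - 1.0)) * math.log(n) - (3.0 * n**2 - 1.0) / (4.0 * n**2)
T_r = -F / 8.0 * math.log(1.0 - 0.90)
t_pvd = T_r * d_e**2 / ch

print(f"t90 without PVD: {t_no_pvd:.0f} days, with PVD: {t_pvd:.1f} days")
```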

  10. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    PubMed

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  11. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

    Currently, near infrared spectroscopy (NIRS) has been considered as an efficient tool for achieving process analytical technology(PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of relative technologies was also presented, which contained the establishment of the monitoring methods for the elution of polyamide resin and macroporous resin chromatography processes, as well as the rapid analysis method for finished products. Based on author's experience of research and work, several issues in the application of NIRS to the process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building the technical team for process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of process analytical system. Finally, the prospect for the application of NIRS in the TCM industry was put forward. Copyright© by the Chinese Pharmaceutical Association.

  12. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  13. Gasoline and Diesel Fuel Test Methods Additional Resources

    EPA Pesticide Factsheets

    Supporting documents on the Direct Final Rule that allows refiners and laboratories to use more current and improved fuel testing procedures for twelve American Society for Testing and Materials analytical test methods.

  14. Improved sample management in the cylindrical-tube microelectrophoresis method

    NASA Technical Reports Server (NTRS)

    Smolka, A. J. K.

    1980-01-01

    A modification to an analytical microelectrophoresis system is described that improves the manipulation of the sample particles and fluid. The apparatus modification and improved operational procedure should yield more accurate measurements of particle mobilities and permit less skilled operators to use the apparatus.

  15. Electrodialytic in-line preconcentration for ionic solute analysis.

    PubMed

    Ohira, Shin-Ichi; Yamasaki, Takayuki; Koda, Takumi; Kodama, Yuko; Toda, Kei

    2018-04-01

    Preconcentration is an effective way to improve analytical sensitivity, and many types of methods are used for enrichment of ionic solute analytes. However, current methods are batchwise and include procedures such as trapping and elution. In this manuscript, we propose in-line electrodialytic enrichment of ionic solutes. The method can enrich ionic solutes within seconds by quantitative transfer of analytes from the sample solution to the acceptor solution under an electric field. Because of the quantitative ion transfer, the enrichment factor (the ratio of the concentration in the obtained acceptor solution to that in the sample) depends only on the flow rate ratio of the sample solution to the acceptor solution. The ratios of the concentrations and flow rates are equal for ratios up to 70, 20, and 70 for the tested ionic solutes of inorganic cations, inorganic anions, and heavy metal ions, respectively. The sensitivity of ionic solute determinations is also improved according to the enrichment factor. The method can also simultaneously achieve matrix isolation and enrichment. The method was successfully applied to determine the concentrations of trace amounts of chloroacetic acids in tap water. The regulated concentration levels cannot be determined by conventional high-performance liquid chromatography with ultraviolet detection (HPLC-UV) without enrichment; however, enrichment with the present method improves the limits of detection of HPLC-UV and makes it effective for determination of tap water quality. The standard addition test with real tap water samples shows good recoveries (94.9-109.6%). Copyright © 2017 Elsevier B.V. All rights reserved.
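
    The quantitative-transfer relation stated above lends itself to a small worked example: with complete ion transfer, the enrichment factor equals the sample-to-acceptor flow-rate ratio, and the attainable detection limit improves by the same factor. The flow rates and the baseline detection limit below are illustrative, not values from the paper.

```python
# Worked example of the relation described in the abstract: with quantitative
# ion transfer, enrichment factor = sample flow / acceptor flow, and the
# attainable detection limit scales down by the same factor (values illustrative).
sample_flow_uL_min = 700.0
acceptor_flow_uL_min = 10.0
lod_without_enrichment_ug_L = 20.0          # e.g., a plain HPLC-UV detection limit

enrichment_factor = sample_flow_uL_min / acceptor_flow_uL_min
effective_lod = lod_without_enrichment_ug_L / enrichment_factor
print(enrichment_factor, effective_lod)     # 70.0, ~0.29 ug/L
```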

  16. Analytical modelling of Halbach linear generator incorporating pole shifting and piece-wise spring for ocean wave energy harvesting

    NASA Astrophysics Data System (ADS)

    Tan, Yimin; Lin, Kejian; Zu, Jean W.

    2018-05-01

    The Halbach permanent magnet (PM) array has attracted tremendous research attention in the development of electromagnetic generators for its unique properties. This paper proposes a generalized analytical model for linear generators, combining slotted stator pole-shifting and the implementation of a Halbach array for the first time. Initially, the magnetization components of the Halbach array were determined using Fourier decomposition. Then, based on the magnetic scalar potential method, the magnetic field distribution was derived employing specially treated boundary conditions. FEM analysis was conducted to verify the analytical model. A slotted linear PM generator with a Halbach PM array was constructed to validate the model and further improved using piece-wise springs to trigger full-range reciprocating motion. A dynamic model was developed to characterize the dynamic behavior of the slider. This analytical method provides an effective tool for the development and optimization of Halbach PM generators. The experimental results indicate that piece-wise springs can be employed to improve generator performance under low excitation frequency.

  17. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies.

    PubMed

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-04-01

    The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. The objective of this paper is to introduce the conceptual bases of data visualisation and to propose a visual analytic and visualisation platform for public health surveillance for injury prevention and control. The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. The application of a visual analytic and visualisation platform is presented as a solution for improved access to heterogeneous data sources, enhanced data exploration and analysis, effective data communication, and support for decision-making. Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance, improving data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  18. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates the technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each may be modularly formulated by differing departments and be solved by modular analytical services. The result demonstrates that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.
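
    A toy two-level coordination in the spirit of ATC is sketched below: the system level sets targets for the linking variables, each subproblem trades its local objective against matching its target through a quadratic penalty, and the loop repeats until targets and responses agree. This is a simplification (the penalty weight is fixed, whereas full ATC formulations typically update the weights, e.g., with an augmented Lagrangian), and the objectives are invented for illustration.

```python
# Toy two-level coordination in the spirit of ATC (simplified: fixed quadratic
# penalty weight, no augmented-Lagrangian weight update). The system level sets
# targets t for the linking variables; each subproblem trades its local cost
# against matching its target; iteration continues until targets and responses agree.
import numpy as np
from scipy.optimize import minimize

w = 10.0                 # consistency penalty weight (assumed fixed)
c = np.array([1.0, 3.0]) # local cost coefficients of the two subproblems

def subproblem(i, target):
    # min_x  c[i]*x**2 + w*(x - target)**2   (closed-form minimizer)
    return w * target / (c[i] + w)

def system_step(responses):
    # min_t (t1 + t2 - 10)**2 + w*||t - responses||**2
    obj = lambda t: (t[0] + t[1] - 10.0) ** 2 + w * np.sum((t - responses) ** 2)
    return minimize(obj, responses).x

targets = np.array([5.0, 5.0])
for _ in range(50):
    responses = np.array([subproblem(i, targets[i]) for i in range(2)])
    targets = system_step(responses)

print("targets:", targets, "responses:", responses, "sum:", responses.sum())
```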

  19. On-line focusing of flavin derivatives using Dynamic pH junction-sweeping capillary electrophoresis with laser-induced fluorescence detection.

    PubMed

    Britz-McKibbin, Philip; Otsuka, Koji; Terabe, Shigeru

    2002-08-01

    Simple yet effective methods to enhance concentration sensitivity are needed for capillary electrophoresis (CE) to become a practical method for analyzing trace levels of analytes in real samples. In this report, the development of a novel on-line preconcentration technique combining dynamic pH junction and sweeping modes of focusing is applied to the sensitive and selective analysis of three flavin derivatives: riboflavin, flavin mononucleotide (FMN) and flavin adenine dinucleotide (FAD). Picomolar (pM) detectability of flavins by CE with laser-induced fluorescence (LIF) detection is demonstrated through effective focusing of large sample volumes (up to 22% of the capillary length) using a dual pH junction-sweeping focusing mode. This results in greater than a 1,200-fold improvement in sensitivity relative to conventional injection methods, giving a limit of detection (S/N = 3) of approximately 4.0 pM for FAD and FMN. Flavin focusing is examined in terms of analyte mobility dependence on buffer pH, borate complexation and SDS interaction. Dynamic pH junction-sweeping extends on-line focusing to both neutral (hydrophobic) and weakly acidic (hydrophilic) species and is considered useful in cases when either conventional sweeping or dynamic pH junction techniques used alone are less effective for certain classes of analytes. Enhanced focusing performance by this hyphenated method was demonstrated by greater than a 4-fold reduction in flavin bandwidth, as compared to either sweeping or dynamic pH junction alone, reflected by analyte detector bandwidths <0.20 cm. Novel on-line focusing strategies are required to improve sensitivity in CE, which may be applied toward more effective biochemical analysis methods for diverse types of analytes.

  20. Reports of the AAAI 2009 Spring Symposia: Technosocial Predictive Analytics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.

    2009-10-01

    The Technosocial Predictive Analytics AAAI symposium was held at Stanford University, Stanford, CA, March 23-25, 2009. The goal of this symposium was to explore new methods for anticipatory analytical thinking that provide decision advantage through the integration of human and physical models. Special attention was also placed on how to leverage supporting disciplines to (a) facilitate the achievement of knowledge inputs, (b) improve the user experience, and (c) foster social intelligence through collaborative/competitive work.

  1. [Continual improvement of quantitative analytical method development of Panax notoginseng saponins based on quality by design].

    PubMed

    Dai, Sheng-Yun; Xu, Bing; Shi, Xin-Yuan; Xu, Xiang; Sun, Ying-Qiang; Qiao, Yan-Jiang

    2017-03-01

    This study aims to propose a continual improvement strategy based on quality by design (QbD). As an example, an ultra-high performance liquid chromatography (UPLC) method was developed to accomplish the method transfer from HPLC to UPLC for Panax notoginseng saponins (PNS) and to achieve the continual improvement of the PNS method based on QbD. Plackett-Burman screening design and Box-Behnken optimization design were employed to further understand the relationship between the critical method parameters (CMPs) and critical method attributes (CMAs), and the Bayesian design space was then built. With a method chosen from the design space (20% initial acetonitrile concentration, 10 min of isocratic time and a gradient slope of 6%•min⁻¹), the resolution of the critical peaks (ginsenoside Rg₁ and ginsenoside Re) was over 2.0 and the analysis time was less than 17 min. Finally, the optimal method was validated by an accuracy profile. Based on the same analytical target profile (ATP), the comparison of HPLC and UPLC, including the chromatographic method, CMA identification, the CMP-CMA model and the system suitability test (SST), indicated that the UPLC method could shorten the analysis time, improve the critical separation and satisfy the requirements of the SST. In all, the HPLC method could be replaced by UPLC for the quantitative analysis of PNS. Copyright© by the Chinese Pharmaceutical Association.

  2. On-line solid-phase microextraction of triclosan, bisphenol A, chlorophenols, and selected pharmaceuticals in environmental water samples by high-performance liquid chromatography-ultraviolet detection.

    PubMed

    Kim, Dalho; Han, Jungho; Choi, Yongwook

    2013-01-01

    A method using on-line solid-phase microextraction (SPME) on a carbowax-templated fiber followed by liquid chromatography (LC) with ultraviolet (UV) detection was developed for the determination of triclosan in environmental water samples. Along with triclosan, other selected phenolic compounds, bisphenol A, and acidic pharmaceuticals were studied. Previous SPME/LC or stir-bar sorptive extraction/LC-UV methods for polar analytes showed a lack of sensitivity. In this study, the calculated octanol-water distribution coefficient (log D) values of the target analytes at different pH values were used to estimate the polarity of the analytes. The lack of sensitivity observed in earlier studies is attributed to incomplete desorption caused by strong polar-polar interactions between the analyte and the solid phase, and the calculated log D values were useful for understanding and predicting this interaction. Under the optimized conditions, the method detection limits of the selected analytes using the on-line SPME-LC-UV method ranged from 5 to 33 ng L(-1), except for the very polar 3-chlorophenol and 2,4-dichlorophenol, which were obscured in wastewater samples by an interfering substance. This level of detection represents a remarkable improvement over conventional existing methods. The on-line SPME-LC-UV method, which did not require derivatization of analytes, was applied to the determination of triclosan, phenolic compounds and acidic pharmaceuticals in tap water, river water and municipal wastewater samples.

  3. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education

    PubMed Central

    Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research. PMID:25469323

  4. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research.

  5. Evaluation of available analytical techniques for monitoring the quality of space station potable water

    NASA Technical Reports Server (NTRS)

    Geer, Richard D.

    1989-01-01

    To assure the quality of potable water (PW) on the Space Station (SS) a number of chemical and physical tests must be conducted routinely. After reviewing the requirements for potable water, both direct and indirect analytical methods are evaluated that could make the required tests and improvements compatible with the Space Station operation. A variety of suggestions are made to improve the analytical techniques for SS operation. The most important recommendations are: (1) the silver/silver chloride electrode (SB) method of removing I sub 2/I (-) biocide from the water, since it may interfere with analytical procedures for PW and also its end uses; (2) the orbital reactor (OR) method of carrying out chemistry and electrochemistry in microgravity by using a disk shaped reactor on an orbital table to impart artificial G force to the contents, allowing solution mixing and separation of gases and liquids; and (3) a simple ultra low volume highly sensitive electrochemical/conductivity detector for use with a capillary zone electrophoresis apparatus. It is also recommended, since several different conductivity and resistance measurements are made during the analysis of PW, that the bipolar pulse measuring circuit be used in all these applications for maximum compatibility and redundancy of equipment.

  6. Evaluation of strength and failure of brittle rock containing initial cracks under lithospheric conditions

    NASA Astrophysics Data System (ADS)

    Li, Xiaozhao; Qi, Chengzhi; Shao, Zhushan; Ma, Chao

    2018-02-01

    Natural brittle rock contains numerous randomly distributed microcracks, and crack initiation, growth, and coalescence play a predominant role in evaluating the strength and failure of brittle rocks. A new analytical method is proposed to predict the strength and failure of brittle rocks containing initial microcracks. The formulation of this method is based on an improved wing crack model and a suggested micro-macro relation. In this improved wing crack model, the crack angle is introduced explicitly as a variable, and the analytical stress-crack relation accounting for the crack angle effect is obtained. Coupling the proposed stress-crack relation with the suggested micro-macro relation describing the relation between crack growth and axial strain, the stress-strain constitutive relation is obtained to predict rock strength and failure. Considering different initial microcrack sizes, friction coefficients and confining pressures, the effects of crack angle on the tensile wedge force acting on the initial crack interface are studied, and the effects of crack angle on the stress-strain constitutive relation of rocks are also analyzed. The strength and crack initiation stress under different crack angles are discussed, and the most unfavourable angle triggering crack initiation and rock failure is found. The analytical results are similar to published study results, verifying the rationality of the proposed analytical method.

  7. Amie Sluiter | NREL

    Science.gov Websites

    … biomass analysis methods and is primary author on 11 Laboratory Analytical Procedures, as well as … spectroscopic analysis methods. These methods allow analysts to predict the composition of feedstock and process …. Patent No. 6,737,258 (2002). Featured Publications: "Improved methods for the determination of drying…"

  8. Student Receivables Management: Opportunities for Improved Practices.

    ERIC Educational Resources Information Center

    Jacquin, Jules C.; Goyal, Anil K.

    1995-01-01

    The college or university's business office can help reduce problems with student receivables through procedural review of the tuition revenue process, application of analytical methods, and improved operating practices. Admissions, financial aid, and billing offices must all be involved. (MSE)

  9. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  10. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation on Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their ability to fulfil current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and to the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. Recently, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  12. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  13. Assessing the Importance of Treatment Goals in Patients with Psoriasis: Analytic Hierarchy Process vs. Likert Scales.

    PubMed

    Gutknecht, Mandy; Danner, Marion; Schaarschmidt, Marthe-Lisa; Gross, Christian; Augustin, Matthias

    2018-02-15

    To define treatment benefit, the Patient Benefit Index contains a weighting of patient-relevant treatment goals using the Patient Needs Questionnaire, which includes a 5-point Likert scale ranging from 0 ("not important at all") to 4 ("very important"). These treatment goals have been assigned to five health dimensions. The importance of each dimension can be derived by averaging the importance ratings on the Likert scales of the associated treatment goals. As the use of a Likert scale does not allow for a relative assessment of importance, the objective of this study was to estimate relative importance weights for health dimensions and associated treatment goals in patients with psoriasis by using the analytic hierarchy process and to compare these weights with the weights resulting from the Patient Needs Questionnaire. Furthermore, patients' judgments on the difficulty of the two methods were investigated. Dimensions of the Patient Benefit Index and their treatment goals were mapped into a hierarchy of criteria and sub-criteria to develop the analytic hierarchy process questionnaire. Adult patients with psoriasis starting a new anti-psoriatic therapy in the outpatient clinic of the Institute for Health Services Research in Dermatology and Nursing at the University Medical Center Hamburg (Germany) were recruited and completed both methods (analytic hierarchy process, Patient Needs Questionnaire). Ratings of treatment goals on the Likert scales (Patient Needs Questionnaire) were summarized within each dimension to assess the importance of the respective health dimension/criterion. Following the analytic hierarchy process approach, consistency in judgments was assessed using a standardized measure (consistency ratio). At the analytic hierarchy process level of criteria, 78 of 140 patients achieved the accepted consistency. Using the analytic hierarchy process, the dimension "improvement of physical functioning" was most important, followed by "improvement of social functioning". In the Patient Needs Questionnaire results, these dimensions were ranked in second and fifth position, whereas "strengthening of confidence in the therapy and in a possible healing" was ranked most important, which was least important in the analytic hierarchy process ranking. In both methods, "improvement of psychological well-being" and "reduction of impairments due to therapy" were equally ranked in positions three and four. In contrast, at the level of sub-criteria, the analytic hierarchy process and the Patient Needs Questionnaire produced largely similar rankings of treatment goals. From the patients' point of view, the Likert scales (Patient Needs Questionnaire) were easier to complete than the analytic hierarchy process pairwise comparisons. Patients with psoriasis assign different importance to health dimensions and associated treatment goals. When choosing a method to assess the importance of health dimensions and/or treatment goals, it must be considered that the resulting importance weights may differ depending on the method used. In this study, however, the observed discrepancies in importance weights of the health dimensions were most likely caused by the different methodological approaches: the Patient Needs Questionnaire assesses the importance of health dimensions through treatment goals, whereas the analytic hierarchy process assesses the health dimensions directly.
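
    A minimal sketch of how analytic hierarchy process priority weights and the consistency ratio mentioned above are typically computed from a pairwise comparison matrix (the principal-eigenvector method with Saaty's random-index table; the matrix values and function names are illustrative, not study data):

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights and consistency ratio for an AHP pairwise matrix."""
          A = np.asarray(pairwise, dtype=float)
          n = A.shape[0]
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()                          # normalized priority weights
          ci = (eigvals[k].real - n) / (n - 1)  # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # Saaty's random index
          return w, ci / ri                     # weights, consistency ratio

      # Toy 5-criteria reciprocal matrix (illustrative values only)
      A = np.array([[1,   3,   5,   3,   7],
                    [1/3, 1,   3,   1,   5],
                    [1/5, 1/3, 1,   1/3, 3],
                    [1/3, 1,   3,   1,   5],
                    [1/7, 1/5, 1/3, 1/5, 1]])
      w, cr = ahp_weights(A)
      print(np.round(w, 3), round(cr, 3))  # CR <= 0.10 is the usual consistency threshold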

  14. Guided-inquiry laboratory experiments to improve students' analytical thinking skills

    NASA Astrophysics Data System (ADS)

    Wahyuni, Tutik S.; Analita, Rizki N.

    2017-12-01

    This study aims to improve the quality of experiment implementation and the analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. It was a classroom action research study conducted in three cycles with 38 second-semester undergraduate students of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets, and the students' experimental procedures. Research data were analyzed using a quantitative-descriptive method, and the increase in analytical thinking skills was measured using the normalized gain score and a paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the quality of experiment implementation and analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03 (a low-gain category), as indicated by the experimental reports. Some undergraduate students had difficulty detecting the relation of one part to another and to an overall structure. The findings suggest that feedback on procedural knowledge and experimental reports is important, and that the experimental procedure should be revised and supplemented with scaffolding questions.
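
    For clarity, the normalized gain referred to above is conventionally computed as Hake's gain (stated here as the standard formula and categories, not the study's data):

      \langle g \rangle \;=\; \frac{\%\,\mathrm{posttest} - \%\,\mathrm{pretest}}{100\% - \%\,\mathrm{pretest}},
      \qquad g < 0.3 \ \text{(low)}, \quad 0.3 \le g < 0.7 \ \text{(medium)}, \quad g \ge 0.7 \ \text{(high)}.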

  15. Robust electroencephalogram phase estimation with applications in brain-computer interface systems.

    PubMed

    Seraj, Esmaeil; Sameni, Reza

    2017-03-01

    In this study, a robust method is developed for frequency-specific electroencephalogram (EEG) phase extraction using the analytic representation of the EEG. Based on recent theoretical findings in this area, it is shown that some of the phase variations previously attributed to the brain response are systematic side effects of the methods used for EEG phase calculation, especially during low-analytic-amplitude segments of the EEG. With this insight, the proposed method generates randomized ensembles of the EEG phase using minor perturbations in the zero-pole loci of narrow-band filters, followed by phase estimation using the signal's analytic form and ensemble averaging over the randomized ensembles to obtain a robust EEG phase and frequency. This Monte Carlo estimation method is shown to be very robust to noise and to minor changes of the filter parameters, and it reduces the effect of spurious EEG phase jumps, which do not have a cerebral origin. As proof of concept, the proposed method is used for extracting EEG phase features for a brain-computer interface (BCI) application. The results show significant improvement in classification rates using rather simple phase-related features and standard K-nearest-neighbors and random-forest classifiers on a standard BCI dataset. The average performance was improved by 4-7% (in the absence of additive noise) and by 8-12% (in the presence of additive noise). The significance of these improvements was statistically confirmed by a paired-sample t-test, with p-values of 0.01 and 0.03, respectively. The proposed method for EEG phase calculation is very generic and may be applied to other EEG phase-based studies.
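
    A minimal sketch of the ensemble idea, assuming a simplified variant that perturbs the band edges of the narrow-band filter rather than its zero-pole loci directly (band, filter order, jitter level, and names are illustrative assumptions):

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def robust_phase(eeg, fs, band=(8.0, 12.0), n_ensemble=50, jitter=0.02, seed=0):
          """Ensemble-averaged instantaneous phase of a band-limited EEG signal."""
          rng = np.random.default_rng(seed)
          phasors = np.zeros(len(eeg), dtype=complex)
          for _ in range(n_ensemble):
              # Randomly perturb the filter band for this Monte Carlo realization
              lo = band[0] * (1 + jitter * rng.standard_normal())
              hi = band[1] * (1 + jitter * rng.standard_normal())
              b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
              analytic = hilbert(filtfilt(b, a, eeg))   # analytic representation
              phasors += np.exp(1j * np.angle(analytic))
          return np.angle(phasors / n_ensemble)         # circular mean of ensemble phases

      # Example call (fs and the EEG channel are placeholders):
      # phase = robust_phase(raw_eeg_channel, fs=250.0)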

  16. Centrifugal ultrafiltration of human serum for improving immunoglobulin A quantification using attenuated total reflectance infrared spectroscopy.

    PubMed

    Elsohaby, Ibrahim; McClure, J Trenton; Riley, Christopher B; Bryanton, Janet; Bigsby, Kathryn; Shaw, R Anthony

    2018-02-20

    Attenuated total reflectance infrared (ATR-IR) spectroscopy is a simple, rapid and cost-effective method for the analysis of serum. However, the complex nature of serum remains a limiting factor for the reliability of this method. We investigated the benefits of coupling centrifugal ultrafiltration with ATR-IR spectroscopy for quantification of the human serum IgA concentration. Human serum samples (n = 196) were analyzed for IgA using an immunoturbidimetric assay. ATR-IR spectra were acquired for whole serum samples and for the retentate (residue) reconstituted with saline following 300 kDa centrifugal ultrafiltration. IR-based analytical methods were developed for each of the two spectroscopic datasets, and the accuracies of the two methods were compared. The analytical methods were based upon partial least squares regression (PLSR) calibration models: one with 5 PLS factors (for whole serum) and the second with 9 PLS factors (for the reconstituted retentate). Comparison of the two sets of IR-based analytical results to reference IgA values revealed improvements in the Pearson correlation coefficient (from 0.66 to 0.76) and in the root mean squared error of prediction of IR-based IgA concentrations (from 102 to 79 mg/dL) for the ultrafiltration retentate-based method compared with the method built upon whole-serum spectra. Depleting human serum of low molecular weight proteins using a 300 kDa centrifugal filter thus enhances the accuracy of IgA quantification by ATR-IR spectroscopy. Further evaluation and optimization of this general approach may ultimately lead to routine analysis of a range of high molecular-weight analytical targets that are otherwise unsuitable for IR-based analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
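
    A minimal sketch of a PLSR calibration of the kind described above, using scikit-learn (the 9-factor choice mirrors the retentate model; data loading, preprocessing, and variable names are assumptions):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      def fit_pls_calibration(X, y, n_factors=9, cv=10):
          """Fit a PLSR model and report cross-validated RMSEP and Pearson r.

          X: absorbance spectra (n_samples x n_wavenumbers)
          y: reference IgA concentrations, e.g. from an immunoturbidimetric assay
          """
          pls = PLSRegression(n_components=n_factors)
          y_cv = cross_val_predict(pls, X, y, cv=cv).ravel()
          rmsep = float(np.sqrt(np.mean((y_cv - y) ** 2)))   # root mean squared error of prediction
          r = float(np.corrcoef(y_cv, y)[0, 1])              # Pearson correlation coefficient
          pls.fit(X, y)                                      # final model trained on all samples
          return pls, rmsep, r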

  17. An Improved Method for the Extraction and Thin-Layer Chromatography of Chlorophyll A and B from Spinach

    ERIC Educational Resources Information Center

    Quach, Hao T.; Steeper, Robert L.; Griffin, William G.

    2004-01-01

    A simple and fast method is described that resolves chlorophyll a and b from spinach leaves on analytical plates while minimizing the appearance of chlorophyll degradation products. An improved mobile phase for the thin-layer chromatographic analysis of spinach extract allows for the complete resolution of the common plant pigments found in…

  18. A review of analytical methods for the treatment of flows with detached shocks

    NASA Technical Reports Server (NTRS)

    Busemann, Adolf

    1949-01-01

    The transonic flow theory has been considerably improved in recent years. The problems at subsonic speeds of a moving body concern chiefly the drag and the problems at supersonic speeds, the detached and attached shock waves. Inasmuch as the literature contains some information that is valuable and some other information that is misleading, the purpose of this paper is to discuss those analytical methods and their applications which are regarded as reliable in the transonic range. After these methods are reviewed, a short discussion without details and proofs follows to round out the picture. (author)

  19. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines on the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study were the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. CONCLUSION: This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
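
    The two figures of merit used above are simple to compute; a minimal sketch with the usual definitions (the numeric example is illustrative, not taken from the study's data):

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          """Sigma metric = (total allowable error - bias) / CV, all in percent."""
          return (tea_pct - bias_pct) / cv_pct

      def quality_goal_index(bias_pct, cv_pct):
          """QGI = bias / (1.5 * CV); <0.8 points to imprecision, >1.2 to inaccuracy."""
          return bias_pct / (1.5 * cv_pct)

      print(sigma_metric(tea_pct=10.0, bias_pct=2.0, cv_pct=1.5))    # ~5.3 sigma
      print(quality_goal_index(bias_pct=2.0, cv_pct=1.5))            # ~0.89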

  20. Microfluidic photoinduced chemical oxidation for Ru(bpy)3(3+) chemiluminescence - A comprehensive experimental comparison with on-chip direct chemical oxidation.

    PubMed

    Kadavilpparampu, Afsal Mohammed; Al Lawati, Haider A J; Suliman, Fakhr Eldin O

    2017-08-05

    For the first time, the analytical figures of merit and detection capabilities of the little-explored photoinduced chemical oxidation method for Ru(bpy)3(2+) CL have been investigated in detail using 32 structurally different analytes. The investigation was carried out on-chip using peroxydisulphate and visible light and compared with the well-known direct chemical oxidation approach using Ce(IV). The analytes belong to various chemical classes such as tertiary amines, secondary amines, sulphonamides, betalactams, thiols and benzothiadiazines. The influence of the detection environment on CL emission with respect to the method of oxidation was evaluated by changing the buffers and pH. Photoinduced chemical oxidation exhibited a more universal nature for Ru(bpy)3(2+) CL detection of the selected analytes: no additional enhancers, reagents, or modifications of the instrumental configuration were required. Wide detectability and enhanced emission were observed for analytes from all the chemical classes when photoinduced chemical oxidation was employed. Some of these analytes, including compounds from the sulphonamide, betalactam, thiol and benzothiadiazine classes, are reported for the first time under photoinduced chemical oxidation. On the other hand, many of the selected analytes, including tertiary and secondary amines such as cetirizine, azithromycin, fexofenadine and proline, did not produce any analytically useful CL signal (S/N = 3 or above for a 1 microg mL(-1) analyte concentration) under chemical oxidation. The most striking observations were in the detection limits; for example, the ofloxacin signal was 15 times more intense, with a detection limit of 5.81 x 10(-10) M compared with the lowest previously reported value of 6 x 10(-9) M. Earlier, penicillamine was detected at 0.1 microg mL(-1) after derivatization using photoinduced chemical oxidation, but in this study the limit was improved to 5.82 ng mL(-1) without any prior derivatization. The detection limits of many other analytes were also found to be improved by several orders of magnitude under photoinduced chemical oxidation. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Microfluidic photoinduced chemical oxidation for Ru(bpy)3(3+) chemiluminescence - A comprehensive experimental comparison with on-chip direct chemical oxidation

    NASA Astrophysics Data System (ADS)

    Kadavilpparampu, Afsal Mohammed; Al Lawati, Haider A. J.; Suliman, Fakhr Eldin O.

    2017-08-01

    For the first time, the analytical figures of merit and detection capabilities of the little-explored photoinduced chemical oxidation method for Ru(bpy)3(2+) CL have been investigated in detail using 32 structurally different analytes. The investigation was carried out on-chip using peroxydisulphate and visible light and compared with the well-known direct chemical oxidation approach using Ce(IV). The analytes belong to various chemical classes such as tertiary amines, secondary amines, sulphonamides, betalactams, thiols and benzothiadiazines. The influence of the detection environment on CL emission with respect to the method of oxidation was evaluated by changing the buffers and pH. Photoinduced chemical oxidation exhibited a more universal nature for Ru(bpy)3(2+) CL detection of the selected analytes: no additional enhancers, reagents, or modifications of the instrumental configuration were required. Wide detectability and enhanced emission were observed for analytes from all the chemical classes when photoinduced chemical oxidation was employed. Some of these analytes, including compounds from the sulphonamide, betalactam, thiol and benzothiadiazine classes, are reported for the first time under photoinduced chemical oxidation. On the other hand, many of the selected analytes, including tertiary and secondary amines such as cetirizine, azithromycin, fexofenadine and proline, did not produce any analytically useful CL signal (S/N = 3 or above for a 1 microg mL(-1) analyte concentration) under chemical oxidation. The most striking observations were in the detection limits; for example, the ofloxacin signal was 15 times more intense, with a detection limit of 5.81 x 10(-10) M compared with the lowest previously reported value of 6 x 10(-9) M. Earlier, penicillamine was detected at 0.1 microg mL(-1) after derivatization using photoinduced chemical oxidation, but in this study the limit was improved to 5.82 ng mL(-1) without any prior derivatization. The detection limits of many other analytes were also found to be improved by several orders of magnitude under photoinduced chemical oxidation.

  2. Accuracy verification and identification of matrix effects. The College of American Pathologists' Protocol.

    PubMed

    Eckfeldt, J H; Copeland, K R

    1993-04-01

    Proficiency testing using stabilized control materials has been used for decades as a means of monitoring and improving performance in the clinical laboratory. Often, the commonly used proficiency testing materials exhibit "matrix effects" that cause them to behave differently from fresh human specimens in certain clinical analytic systems. Because proficiency testing is the primary method in which regulatory agencies have chosen to evaluate clinical laboratory performance, the College of American Pathologists (CAP) has proposed guidelines for investigating the influence of matrix effects on their Survey results. The purpose of this investigation was to determine the feasibility, usefulness, and potential problems associated with this CAP Matrix Effect Analytical Protocol, in which fresh patient specimens and CAP proficiency specimens are analyzed simultaneously by a field method and a definitive, reference, or other comparative method. The optimal outcome would be that both the fresh human and CAP Survey specimens agree closely with the comparative method result. However, this was not always the case. Using several different analytic configurations, we were able to demonstrate matrix and calibration biases for several of the analytes investigated.

  3. Development of the Ion Exchange-Gravimetric Method for Sodium in Serum as a Definitive Method

    PubMed Central

    Moody, John R.; Vetter, Thomas W.

    1996-01-01

    An ion exchange-gravimetric method, previously developed as a National Committee for Clinical Laboratory Standards (NCCLS) reference method for the determination of sodium in human serum, has been re-evaluated and improved. Sources of analytical error in this method have been examined more critically and the overall uncertainties decreased. Additionally, greater accuracy and repeatability have been achieved by the application of this definitive method to a sodium chloride reference material. In this method sodium in serum is ion-exchanged, selectively eluted and converted to a weighable precipitate as Na2SO4. Traces of sodium eluting before or after the main fraction, and precipitate contaminants are determined instrumentally. Co-precipitating contaminants contribute less than 0.1 % while the analyte lost to other eluted ion-exchange fractions contributes less than 0.02 % to the total precipitate mass. With improvements, the relative expanded uncertainty (k = 2) of the method, as applied to serum, is 0.3 % to 0.4 % and is less than 0.1 % when applied to a sodium chloride reference material. PMID:27805122
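
    As a simple illustration of the gravimetric step (standard atomic masses; this is not the authors' uncertainty analysis), the sodium mass follows from the weighed Na2SO4 precipitate via the gravimetric factor:

      m(\mathrm{Na}) \;=\; \frac{2\,M(\mathrm{Na})}{M(\mathrm{Na_2SO_4})}\; m(\mathrm{Na_2SO_4})
      \;\approx\; \frac{2 \times 22.990}{142.04}\; m(\mathrm{Na_2SO_4})
      \;\approx\; 0.3237\, m(\mathrm{Na_2SO_4}).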

  4. Quantitative Electron Probe Microanalysis: State of the Art

    NASA Technical Reports Server (NTRS)

    Carpernter, P. K.

    2005-01-01

    Quantitative electron-probe microanalysis (EPMA) has improved due to better instrument design and X-ray correction methods. Design improvements of the electron column and X-ray spectrometers have resulted in measurement precision that exceeds analytical accuracy. Wavelength-dispersive spectrometers (WDS) have layered-dispersive diffraction crystals with improved light-element sensitivity. Newer energy-dispersive spectrometers (EDS) have Si-drift detector elements, thin window designs, and digital processing electronics with X-ray throughput approaching that of WDS systems. Using these systems, digital X-ray mapping coupled with spectrum imaging is a powerful compositional mapping tool. Improvements in analytical accuracy are due to better X-ray correction algorithms, mass absorption coefficient data sets, and analysis methods for complex geometries. ZAF algorithms have been superseded by Phi(pz) algorithms that better model the depth distribution of primary X-ray production. Complex thin-film and particle geometries are treated using Phi(pz) algorithms, and results agree well with Monte Carlo simulations. For geological materials, X-ray absorption dominates the corrections and depends on the accuracy of mass absorption coefficient (MAC) data sets. However, few MACs have been experimentally measured, and the use of fitted coefficients continues due to the general success of the analytical technique. A polynomial formulation of the Bence-Albee alpha-factor technique, calibrated using Phi(pz) algorithms, is used to critically evaluate accuracy issues; accuracy is on the order of 2% relative and is limited by measurement precision for ideal cases, but for many elements the analytical accuracy is unproven. The EPMA technique has improved to the point where it is frequently used instead of the petrographic microscope for reconnaissance work. Examples of stagnant research areas are WDS detector design, characterization of calibration standards, and the need for a more complete treatment of the continuum X-ray fluorescence correction.
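
    For orientation, quantification in EPMA rests on the measured k-ratio and matrix corrections; a standard textbook statement of this relation (not specific to this record) is:

      k \;=\; \frac{I_{\mathrm{specimen}}}{I_{\mathrm{standard}}}, \qquad
      C_{\mathrm{specimen}} \;\approx\; k \; C_{\mathrm{standard}} \; [ZAF],

    where [ZAF] denotes the atomic-number, absorption, and fluorescence corrections, nowadays evaluated with Phi(pz) depth-distribution models.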

  5. Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.

    PubMed

    Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel

    2018-06-05

    In the present work, we demonstrate a novel approach to improve the sensitivity of "out of lab" portable capillary electrophoretic measurements. Nowadays, many signal enhancement methods are (i) underused (nonoptimal), (ii) overused (distorting the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. Contactless conductivity detection was used as a model for the development of the signal processing method and the demonstration of its impact on sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified: higher electrophoretic mobility analytes exhibit higher-frequency peaks, whereas lower electrophoretic mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. Employing the developed algorithm, each data point is processed depending on the migration time of the analyte. With the implemented migration velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for a sampling frequency of 4.6 Hz and up to 22 times for a sampling frequency of 25 Hz. This paper could potentially be used as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
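
    A minimal sketch of a migration-time-adaptive moving average of the kind described above (the window scaling, variable names, and uniform-sampling assumption are illustrative, not the authors' exact implementation):

      import numpy as np

      def adaptive_moving_average(signal, times, base_window_s, ref_time_s):
          """Moving average whose window widens with migration time.

          Later-migrating (slower) analytes produce broader, lower-frequency
          peaks, so a proportionally wider averaging window can be used
          without distorting them."""
          dt = times[1] - times[0]                  # assumes uniform sampling
          out = np.empty(len(signal), dtype=float)
          for i, t in enumerate(times):
              half = max(1, int(round(base_window_s * (t / ref_time_s) / (2 * dt))))
              lo, hi = max(0, i - half), min(len(signal), i + half + 1)
              out[i] = np.mean(signal[lo:hi])
          return out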

  6. Investigation of prediction methods for the loads and stresses of Apollo type spacecraft parachutes. Volume 1: Loads

    NASA Technical Reports Server (NTRS)

    Mickey, F. E.; Mcewan, A. J.; Ewing, E. G.; Huyler, W. C., Jr.; Khajeh-Nouri, B.

    1970-01-01

    An analysis was conducted with the objective of upgrading and improving the loads, stress, and performance prediction methods for Apollo spacecraft parachutes. The subjects considered were: (1) methods for a new theoretical approach to the parachute opening process, (2) new experimental-analytical techniques to improve the measurement of pressures, stresses, and strains in inflight parachutes, and (3) a numerical method for analyzing the dynamical behavior of rapidly loaded pilot chute risers.

  7. Analytical close-form solutions to the elastic fields of solids with dislocations and surface stress

    NASA Astrophysics Data System (ADS)

    Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed

    2013-07-01

    The concept of eigenstrain is adopted to derive a general analytical framework to solve the elastic field for 3D anisotropic solids with general defects by considering the surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions to the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and with those of the complex variable method. The stress fields from this work demonstrate the impact of the surface stress as the size of the nanowire shrinks, an effect that becomes negligible at the macroscopic scale. Compared with the power series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially improves the study of general 3D anisotropic materials with surface effects.

  8. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood

    ERIC Educational Resources Information Center

    Karabatsos, George

    2017-01-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon…

  9. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle of quality for an analytical laboratory, and therefore variability is considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of this trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed and Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, representing both a management discipline and a standardised approach to problem solving and process optimisation.

  10. Approximate analytical description of the elastic strain field due to an inclusion in a continuous medium with cubic anisotropy

    NASA Astrophysics Data System (ADS)

    Nenashev, A. V.; Koshkarev, A. A.; Dvurechenskii, A. V.

    2018-03-01

    We suggest an approach to the analytical calculation of the strain distribution due to an inclusion in elastically anisotropic media for the case of cubic anisotropy. The idea consists in the approximate reduction of the anisotropic problem to a (simpler) isotropic problem. This gives, for typical semiconductors, an improvement in accuracy by an order of magnitude, compared to the isotropic approximation. Our method allows using, in the case of elastically anisotropic media, analytical solutions obtained for isotropic media only, such as analytical formulas for the strain due to polyhedral inclusions. The present work substantially extends the applicability of analytical results, making them more suitable for describing real systems, such as epitaxial quantum dots.

  11. Matrix vapor deposition/recrystallization and dedicated spray preparation for high-resolution scanning microprobe matrix-assisted laser desorption/ionization imaging mass spectrometry (SMALDI-MS) of tissue and single cells.

    PubMed

    Bouschen, Werner; Schulz, Oliver; Eikel, Daniel; Spengler, Bernhard

    2010-02-01

    Matrix preparation techniques such as air spraying or vapor deposition were investigated with respect to lateral migration, integration of analyte into matrix crystals and achievable lateral resolution for the purpose of high-resolution biological imaging. The accessible mass range was found to be beyond 5000 u with sufficient analytical sensitivity. Gas-assisted spraying methods (using oxygen-free gases) provide a good compromise between crystal integration of analyte and analyte migration within the sample. Controlling preparational parameters with this method, however, is difficult. Separation of the preparation procedure into two steps, instead, leads to an improved control of migration and incorporation. The first step is a dry vapor deposition of matrix onto the investigated sample. In a second step, incorporation of analyte into the matrix crystal is enhanced by a controlled recrystallization of matrix in a saturated water atmosphere. With this latter method an effective analytical resolution of 2 microm in the x and y direction was achieved for scanning microprobe matrix-assisted laser desorption/ionization imaging mass spectrometry (SMALDI-MS). Cultured A-498 cells of human renal carcinoma were successfully investigated by high-resolution MALDI imaging using the new preparation techniques. Copyright 2010 John Wiley & Sons, Ltd.

  12. Interactions between butterfly-shaped pulses in the inhomogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Wen-Jun; Beijing National Laboratory for Condensed Matter Physics, Institute of Physics, Chinese Academy of Sciences, Beijing 100190; Huang, Long-Gang

    2014-10-15

    Pulse interactions affect pulse qualities during the propagation. Interactions between butterfly-shaped pulses are investigated to improve pulse qualities in the inhomogeneous media. In order to describe the interactions between butterfly-shaped pulses, analytic two-soliton solutions are derived. Based on those solutions, influences of corresponding parameters on pulse interactions are discussed. Methods to control the pulse interactions are suggested. - Highlights: • Interactions between butterfly-shaped pulses are investigated. • Methods to control the pulse interactions are suggested. • Analytic two-soliton solutions for butterfly-shaped pulses are derived.

  13. Studies toward the synthesis of linear triazole linked pseudo oligosaccharides and the use of ferrocene as analytical probe.

    PubMed

    Schmidt, Magnus S; Götz, Kathrin H; Koch, Wolfgang; Grimm, Tanja; Ringwald, Markus

    2016-04-29

    Three different building blocks have been synthesised and used for the synthesis of linear triazole linked pseudo oligosaccharides with copper(I)-catalysed cycloaddition (CuAAC). Ethynylferrocene has been used as analytical probe to improve the UV/Vis properties and HPLC methods have been used and optimised for the analysis of the pseudo oligosaccharides. The smallest ones have been isolated and characterised by analytical HPLC, NMR, ESI-MS and elemental analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Analytical Methods of Decoupling the Automotive Engine Torque Roll Axis

    NASA Astrophysics Data System (ADS)

    JEONG, TAESEOK; SINGH, RAJENDRA

    2000-06-01

    This paper analytically examines the multi-dimensional mounting schemes of an automotive engine-gearbox system when excited by oscillating torques. In particular, the issue of torque roll axis decoupling is analyzed in significant detail since it is poorly understood. New dynamic decoupling axioms are presented and compared with the conventional elastic axis mounting and focalization methods. A linear time-invariant system assumption is made in addition to a proportionally damped system. Only rigid-body modes of the powertrain are considered and the chassis elements are assumed to be rigid. Several simplified physical systems are considered and new closed-form solutions for symmetric and asymmetric engine-mounting systems are developed. These clearly explain the design concepts for the 4-point mounting scheme. Our analytical solutions match the existing design formulations that are only applicable to symmetric geometries. Spectra for all six rigid-body motions are predicted using the alternate decoupling methods and the closed-form solutions are verified. Also, our method is validated by comparing modal solutions with prior experimental and analytical studies. Parametric design studies are carried out to illustrate the methodology. Chief contributions of this research include the development of new or refined analytical models and closed-form solutions along with improved design strategies for the torque roll axis decoupling.

  15. Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions

    NASA Astrophysics Data System (ADS)

    Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus

    2017-10-01

    We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.
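
    For orientation, a standard statement of the analytic continuation problem and of the maximum entropy objective for a diagonal (non-negative) spectral function, whose generalization to matrix-valued and possibly negative off-diagonal spectra is the subject of this work, reads:

      G(i\omega_n) \;=\; \int d\omega\, \frac{A(\omega)}{i\omega_n - \omega}, \qquad
      Q[A] \;=\; \alpha S[A] \;-\; \tfrac{1}{2}\chi^2[A], \qquad
      S[A] \;=\; \int d\omega \left[ A(\omega) - D(\omega) - A(\omega)\ln\frac{A(\omega)}{D(\omega)} \right],

    where D(omega) is the default model and chi^2 measures the misfit to the Matsubara data.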

  16. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and alpha-spectrometry.

    PubMed

    Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.

  17. Electrochemical determination of inorganic mercury and arsenic--A review.

    PubMed

    Zaib, Maria; Athar, Muhammad Makshoof; Saeed, Asma; Farooq, Umar

    2015-12-15

    Inorganic mercury and arsenic is a term encompassing the As(III), As(V) and Hg(II) species. These metal ions have been extensively studied because of their toxicity-related issues. Different analytical methods are used to monitor inorganic mercury and arsenic in a variety of samples at trace level. The present study reviews the various analytical techniques available for the detection of inorganic mercury and arsenic, with particular emphasis on electrochemical methods, especially stripping voltammetry. A detailed critical evaluation of the methods, the advantages of electrochemical methods over other analytical methods, and the various electrode materials available for mercury and arsenic analysis is presented in this review. Modified carbon-paste electrodes provide better determination owing to better deposition, with a linear and improved response under the studied set of conditions. Biological materials may be a potent and economical alternative to macro-electrodes and chemically modified carbon-paste electrodes in the stripping analysis of inorganic mercury and arsenic. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of chromium in water by graphite furnace atomic absorption spectrophotometry

    USGS Publications Warehouse

    McLain, B.J.

    1993-01-01

    Graphite furnace atomic absorption spectrophotometry is a sensitive, precise, and accurate method for the determination of chromium in natural water samples. The detection limit for this analytical method is 0.4 microg/L with a working linear limit of 25.0 microg/L. The precision at the detection limit ranges from 20 to 57 percent relative standard deviation (RSD) with an improvement to 4.6 percent RSD for concentrations more than 3 microg/L. Accuracy of this method was determined for a variety of reference standards that was representative of the analytical range. The results were within the established standard deviations. Samples were spiked with known concentrations of chromium with recoveries ranging from 84 to 122 percent. In addition, a comparison of data between graphite furnace atomic absorption spectrophotometry and direct-current plasma atomic emission spectrometry resulted in suitable agreement between the two methods, with an average deviation of +/- 2.0 microg/L throughout the analytical range.

  19. Analytical energy gradient based on spin-free infinite-order Douglas-Kroll-Hess method with local unitary transformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakajima, Yuya; Seino, Junji; Nakai, Hiromi, E-mail: nakai@waseda.jp

    In this study, the analytical energy gradient for the spin-free infinite-order Douglas-Kroll-Hess (IODKH) method at the levels of the Hartree-Fock (HF), density functional theory (DFT), and second-order Møller-Plesset perturbation theory (MP2) is developed. Furthermore, adopting the local unitary transformation (LUT) scheme for the IODKH method improves the efficiency in computation of the analytical energy gradient. Numerical assessments of the present gradient method are performed at the HF, DFT, and MP2 levels for the IODKH with and without the LUT scheme. The accuracies are examined for diatomic molecules such as hydrogen halides, halogen dimers, coinage metal (Cu, Ag, and Au) halides, and coinage metal dimers, and 20 metal complexes, including the fourth–sixth row transition metals. In addition, the efficiencies are investigated for one-, two-, and three-dimensional silver clusters. The numerical results confirm the accuracy and efficiency of the present method.

  20. Multisyringe flow injection analysis hyphenated with liquid core waveguides for the development of cleaner spectroscopic analytical methods: improved determination of chloride in waters.

    PubMed

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor

    2009-07-01

    In this work, the hyphenation of the multisyringe flow injection analysis technique with a 100-cm-long pathlength liquid core waveguide has been accomplished. The Cl-/Hg(SCN)2/Fe3+ reaction system for the spectrophotometric determination of chloride (Cl-) in waters was used as the chemical model. As a result, this classic analytical methodology has been improved, dramatically minimizing the consumption of reagents, in particular that of the highly biotoxic chemical Hg(SCN)2. The proposed method features a two-step linear dynamic range of (1) 0.2-2 and (2) 2-8 mg Cl- L(-1), with extended applicability (up to 400 mg Cl- L(-1)) due to on-line sample dilution. It also presents improved limits of detection and quantification of 0.06 and 0.20 mg Cl- L(-1), respectively. The coefficient of variation and the injection throughput were 1.3% (n = 10, 2 mg Cl- L(-1)) and 21 h(-1), respectively. Furthermore, a very low consumption of reagents per Cl- determination of 0.2 microg Hg(II) and 28 microg Fe3+ has been achieved. The method was successfully applied to the determination of Cl- in different types of water samples. Finally, the proposed system is critically compared, from a green analytical chemistry point of view, against other flow systems for the same purpose.

  1. Homeland Security Research Improves the Nation's Ability to ...

    EPA Pesticide Factsheets

    Technical Brief Homeland Security (HS) Research develops data, tools, and technologies to minimize the impact of accidents, natural disasters, terrorist attacks, and other incidents that can result in toxic chemical, biological or radiological (CBR) contamination. HS Research develops ways to detect contamination, sampling strategies, sampling and analytical methods, cleanup methods, waste management approaches, exposure assessment methods, and decision support tools (including water system models). These contributions improve EPA’s response to a broad range of environmental disasters.

  2. Multi-residue pesticide analysis (gas chromatography-tandem mass spectrometry detection)-Improvement of the quick, easy, cheap, effective, rugged, and safe method for dried fruits and fat-rich cereals-Benefit and limit of a standardized apple purée calibration (screening).

    PubMed

    Rasche, Claudia; Fournes, Britta; Dirks, Uwe; Speer, Karl

    2015-07-17

    Some steps of the QuEChERS method for the analysis of pesticides with GC-MS/MS in cereals and dried fruits were improved or simplified. For the latter, a mixing vessel with stator-rotor-system proved to be advantageous. The extraction procedure of dried fruits is much easier and safer than the Ultra Turrax and results in excellent validation data at a concentration level of 0.01mg/kg (116 of 118 analytes with recoveries in the range of 70-120%, 117 of 118 analytes with RSD <20%). After qualifying problematic lipophilic pesticides in fat-rich cereals (fat content >7%), predominantly organochlorines showed recoveries of <70% in quantification when the standard QuEChERS method with water was used. A second extraction was carried out analogous to the QuEChERS method, however, without the addition of water. With this simple modification, the problematic lipophilic pesticides, which had been strongly affected by the fat content of the commodities, could be determined with recoveries above 70% even at a concentration level of 0.01mg/kg. Moreover, a GC-MS/MS screening method for 120 pesticides at a concentration level of 0.01mg/kg was established by employing analyte protectants (ethylglycerol, gulonolactone, and sorbitol). The use of only one standardized calibration, made of an apple purée extract in combination with analyte protectants, allowed for a qualitative and quantitative analysis of 120 pesticides in different matrix extracts (tomato, red pepper, sour cherries, dried apples, black currant powder, raisins, wheat flour, rolled oats, wheat germ). The analyte protectants leveled the differences in the matrix-induced protection effect of the analyzed extracts over a wide range. The majority of the pesticides were analyzed with good analytical results (recoveries in the range of 70-120% and RSD <20%). Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Phase-recovery improvement using analytic wavelet transform analysis of a noisy interferogram cepstrum.

    PubMed

    Etchepareborda, Pablo; Vadnjal, Ana Laura; Federico, Alejandro; Kaufmann, Guillermo H

    2012-09-15

    We evaluate the extension of the exact nonlinear reconstruction technique developed for digital holography to the phase-recovery problems presented by other optical interferometric methods, which use carrier modulation. It is shown that the introduction of an analytic wavelet analysis in the ridge of the cepstrum transformation corresponding to the analyzed interferogram can be closely related to the well-known wavelet analysis of the interferometric intensity. Subsequently, the phase-recovery process is improved. The advantages and limitations of this framework are analyzed and discussed using numerical simulations in singular scalar light fields and in temporal speckle pattern interferometry.

  4. High-performance heat pipes for heat recovery applications

    NASA Technical Reports Server (NTRS)

    Saaski, E. W.; Hartl, J. H.

    1980-01-01

    Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high transport capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open tube designs; analytical models were developed for this device and incorporated into a computer program HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.

  5. Solid Lubrication Fundamentals and Applications. Chapter 2

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    1998-01-01

    This chapter describes powerful analytical techniques capable of sampling tribological surfaces and solid-film lubricants. Some of these techniques may also be used to determine the locus of failure in a bonded structure or coated substrate; such information is important when seeking improved adhesion between a solid-film lubricant and a substrate and when seeking improved performance and long life expectancy of solid lubricants. Many examples are given here and throughout the book on the nature and character of solid surfaces and their significance in lubrication, friction, and wear. The analytical techniques used include the latest spectroscopic methods.

  6. Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.

    PubMed

    Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo

    2018-02-23

    The study aim was to evaluate and compare the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-TnI, using the DxI platform with those of the Access AccuTnI+3 method and of the high-sensitivity (hs) cTnI method for the ARCHITECT platform. The limits of blank (LoB), detection (LoD) and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and comparison of cTnI results, heparinized plasma samples collected from healthy subjects and patients with cardiac diseases, as well as quality control samples distributed in external quality assessment programs, were used. The LoB, LoD and LoQ at 20% and 10% CV of the Access hs-cTnI method were 0.6, 1.3, 2.1 and 5.3 ng/L, respectively. The Access hs-cTnI method showed analytical performance significantly better than that of the Access AccuTnI+3 method and similar to that of the hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with the Access hs-cTnI method showed close linear regressions with both the Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between the methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between cTnI values measured in EDTA and in heparin plasma samples. The Access hs-cTnI method has analytical sensitivity parameters that are significantly improved compared with the Access AccuTnI+3 method and similar to those of the high-sensitivity method using the ARCHITECT platform.
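
    For reference, the limits evaluated above are commonly obtained following the CLSI EP17 convention (stated here as the general formulas, not this study's specific protocol or numbers):

      \mathrm{LoB} \;=\; \bar{x}_{\mathrm{blank}} + 1.645\, s_{\mathrm{blank}}, \qquad
      \mathrm{LoD} \;=\; \mathrm{LoB} + 1.645\, s_{\mathrm{low\text{-}level\ sample}},

    with the LoQ taken as the lowest concentration at which the imprecision goal (here 20% or 10% CV) is met.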

  7. Integrated pest management of "Golden Delicious" apples.

    PubMed

    Simončič, A; Stopar, M; Velikonja Bolta, Š; Bavčar, D; Leskovšek, R; Baša Česnik, H

    2015-01-01

    Monitoring of plant protection product (PPP) residues in "Golden Delicious" apples was performed in 2011-2013, where 216 active substances were analysed with three analytical methods. Integrated pest management (IPM) production and improved IPM production were compared. Results were in favour of improved IPM production. Some active compounds determined in IPM production (boscalid, pyraclostrobin, thiacloprid and thiametoxam) were not found in improved IPM production. Besides that, in 2011 and 2012, captan residues were lower in improved IPM production. Risk assessment was also performed. Chronic exposure of consumers was low in general, but showed no major differences for IPM and improved IPM production for active substances determined in both types of production. Analytical results were compared with the European Union report of 2010 where 1.3% of apple samples exceeded maximum residue levels (MRLs), while MRL exceedances were not observed in this survey.

  8. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    The component element method was used to develop a transient dynamic analysis computer program which is essentially based on modal synthesis combined with a central, finite difference, numerical integration scheme. The methodology leads to a modular or building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.

  9. Determination of dimethyltryptamine and β-carbolines (ayahuasca alkaloids) in plasma samples by LC-MS/MS.

    PubMed

    Oliveira, Carolina Dizioli Rodrigues; Okai, Guilherme Gonçalves; da Costa, José Luiz; de Almeida, Rafael Menck; Oliveira-Silva, Diogo; Yonamine, Mauricio

    2012-07-01

    Ayahuasca is a psychoactive plant beverage originally used by indigenous people throughout the Amazon Basin, long before its modern use by syncretic religious groups established in Brazil, the USA and European countries. The objective of this study was to develop a method for quantification of dimethyltryptamine and β-carbolines in human plasma samples. The analytes were extracted by means of C18 cartridges and injected into LC-MS/MS, operated in positive ion mode and multiple reaction monitoring. The LOQs obtained for all analytes were below 0.5 ng/ml. By using the weighted least squares linear regression, the accuracy of the analytical method was improved at the lower end of the calibration curve (from 0.5 to 100 ng/ml; r(2)> 0.98). The method proved to be simple, rapid and useful to estimate administered doses for further pharmacological and toxicological investigations of ayahuasca exposure.
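
    A minimal sketch of the weighted least squares calibration idea (1/x^2 weighting is assumed here for illustration; the calibrator levels and responses are invented numbers, not study data):

      import numpy as np

      def weighted_linear_calibration(x, y, weights=None):
          """Weighted least-squares calibration line (intercept, slope).

          Down-weighting high-concentration calibrators (e.g. w = 1/x^2)
          improves accuracy at the low end of the calibration curve."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          w = 1.0 / x**2 if weights is None else np.asarray(weights, float)
          W = np.diag(w)
          X = np.column_stack([np.ones_like(x), x])
          # Solve the weighted normal equations (X^T W X) beta = X^T W y
          intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
          return intercept, slope

      conc = np.array([0.5, 1, 5, 10, 50, 100])                 # ng/mL, illustrative
      resp = np.array([0.012, 0.021, 0.105, 0.20, 1.04, 2.02])  # peak-area ratios, illustrative
      print(weighted_linear_calibration(conc, resp))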

  10. Improvement of the accuracy of noise measurements by the two-amplifier correlation method.

    PubMed

    Pellegrini, B; Basso, G; Fiori, G; Macucci, M; Maione, I A; Marconcini, P

    2013-10-01

    We present a novel method for device noise measurement, based on a two-channel cross-correlation technique and a direct "in situ" measurement of the transimpedance of the device under test (DUT), which allows improved accuracy with respect to what is available in the literature, in particular when the DUT is a nonlinear device. Detailed analytical expressions for the total residual noise are derived, and an experimental investigation of the increased accuracy provided by the method is performed.
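
    The cross-correlation idea can be sketched numerically: noise common to both channels (the DUT contribution) survives averaging of the cross-spectrum, while each amplifier's own uncorrelated noise averages toward zero. The snippet below is a simplified illustration only; it omits the in situ transimpedance measurement that the paper uses to calibrate the result, and all values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n, n_avg = 1.0e5, 4096, 200                 # sample rate, segment length, averages

cross = np.zeros(n // 2 + 1, dtype=complex)
for _ in range(n_avg):
    dut = rng.normal(0.0, 1.0, n)               # noise common to both channels (DUT)
    ch1 = dut + rng.normal(0.0, 3.0, n)         # channel 1 = DUT + amplifier-1 noise
    ch2 = dut + rng.normal(0.0, 3.0, n)         # channel 2 = DUT + amplifier-2 noise
    cross += np.fft.rfft(ch1) * np.conj(np.fft.rfft(ch2))

# Averaged cross-spectrum: the uncorrelated amplifier noise cancels on average,
# leaving an estimate proportional to the DUT's own power spectral density.
psd_dut = np.real(cross) / (n_avg * n * fs)
print(psd_dut.mean())
```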

  11. Quantitative Profiling of Endogenous Fat-Soluble Vitamins and Carotenoids in Human Plasma Using an Improved UHPSFC-ESI-MS Interface.

    PubMed

    Petruzziello, Filomena; Grand-Guillaume Perrenoud, Alexandre; Thorimbert, Anita; Fogwill, Michael; Rezzi, Serge

    2017-07-18

    Analytical solutions enabling the quantification of circulating levels of liposoluble micronutrients such as vitamins and carotenoids are currently limited to either single or a reduced panel of analytes. The requirement to use multiple approaches hampers the investigation of the biological variability on a large number of samples in a time- and cost-efficient manner. With the goal to develop high-throughput and robust quantitative methods for the profiling of micronutrients in human plasma, we introduce a novel, validated workflow for the determination of 14 fat-soluble vitamins and carotenoids in a single run. Automated supported liquid extraction was optimized and implemented to process 48 samples in parallel in 1 h, and the analytes were measured using ultrahigh-performance supercritical fluid chromatography coupled to tandem mass spectrometry in less than 8 min. An improved mass spectrometry interface was built to minimize the post-decompression volume and to allow better control of the chromatographic effluent density on its route toward and into the ion source. In addition, a specific make-up solvent condition was developed to ensure the solubility of both analytes and matrix constituents after mobile phase decompression. The optimized interface resulted in improved spray plume stability and conserved matrix compound solubility, leading to enhanced hyphenation robustness while ensuring both suitable analytical repeatability and improved detection sensitivity. The overall developed methodology gives recoveries within 85-115%, as well as within-day and between-day coefficients of variation of 2% and 14%, respectively.

  12. Improving Sample Distribution Homogeneity in Three-Dimensional Microfluidic Paper-Based Analytical Devices by Rational Device Design.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Milan, Luis Aparecido; Stockton, Amanda M; Carrilho, Emanuel

    2017-05-02

    Paper-based devices are a portable, user-friendly, and affordable technology that is one of the best analytical tools for inexpensive diagnostic devices. Three-dimensional microfluidic paper-based analytical devices (3D-μPADs) are an evolution of single layer devices and they permit effective sample dispersion, individual layer treatment, and multiplex analytical assays. Here, we present the rational design of a wax-printed 3D-μPAD that enables more homogeneous permeation of fluids along the cellulose matrix than other existing designs in the literature. Moreover, we show the importance of the rational design of channels on these devices using glucose oxidase, peroxidase, and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) reactions. We present an alternative method for layer stacking using a magnetic apparatus, which facilitates fluidic dispersion and improves the reproducibility of tests performed on 3D-μPADs. We also provide the optimized designs for printing, facilitating further studies using 3D-μPADs.

  13. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2014-01-01

    Monitoring the misuse of drugs and the abuse of substances and methods potentially or evidently improving athletic performance by analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to the continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, several of which present a temptation for sportsmen and women due to assumed/attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing that was published between October 2012 and September 2013 is summarized and reviewed with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.

  14. Review and assessment of the HOST turbine heat transfer program

    NASA Technical Reports Server (NTRS)

    Gladden, Herbert J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena occurring in high-performance gas turbine engines and to assess and improve the analytical methods used to predict the fluid dynamics and heat transfer phenomena. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. Therefore, a building-block approach was utilized, with research ranging from the study of fundamental phenomena and analytical modeling to experiments in simulated real-engine environments. Experimental research accounted for 75 percent of the project, and analytical efforts accounted for approximately 25 percent. Extensive experimental datasets were created depicting the three-dimensional flow field, high free-stream turbulence, boundary-layer transition, blade tip region heat transfer, film cooling effects in a simulated engine environment, rough-wall cooling enhancement in a rotating passage, and rotor-stator interaction effects. In addition, analytical modeling of these phenomena was initiated using boundary-layer assumptions as well as Navier-Stokes solutions.

  15. The combination of four analytical methods to explore skeletal muscle metabolomics: Better coverage of metabolic pathways or a marketing argument?

    PubMed

    Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H

    2018-01-30

    Metabolomics is an emerging science based on diverse high throughput methods that are rapidly evolving to improve metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact on metabolic coverage of such a strategy. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and to evaluate the impact of combining methods for more exhaustive metabolite coverage. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound, i.e., analytical variability (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we reported the common and distinct metabolites identified based on the number and identity of the compounds detected with low analytical variability (coefficient of variation < 30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvents for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. This combination also showed that the contribution of GC-MS was low when used in combination with other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods. However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
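
    The <30% coefficient-of-variation criterion used above to retain metabolites can be expressed in a few lines. The sketch below computes a per-metabolite CV across repeated quality-control injections and keeps only the compounds below the threshold; the metabolite names and intensities are invented for illustration.

```python
import pandas as pd

# Hypothetical table: rows = QC injections, columns = metabolite intensities
qc = pd.DataFrame({
    "alanine":   [1020, 980, 1005, 995],
    "glutamate": [310, 450, 260, 520],
    "citrate":   [89, 91, 88, 93],
})

cv_percent = 100.0 * qc.std(ddof=1) / qc.mean()
kept = cv_percent[cv_percent < 30.0].index.tolist()   # metabolites passing the filter
print(cv_percent.round(1).to_dict(), kept)
```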

  16. Efficient alignment-free DNA barcode analytics.

    PubMed

    Kuksa, Pavel; Pavlovic, Vladimir

    2009-11-10

    In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations which model sequences as collections of short sequence fragments (features). The methods use fixed-length representations (spectra) for barcode sequences to measure similarities or dissimilarities between sequences coming from the same or different species. The spectrum-based representation not only allows for accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci distinguishing barcodes of different sample groups. New alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species, with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, important and relevant analytical tasks in many practical applications (adverse species movement monitoring, sampling surveys for unknown or pathogenic species identification, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, Fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running time improvements over the state-of-the-art methods. Our results show that newly developed alignment-free methods for DNA barcoding can efficiently and with high accuracy identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding.
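
    The fixed-length "spectrum" representation at the core of these alignment-free methods is simply a vector of short k-mer counts. The sketch below builds a 4-mer spectrum and compares two barcodes by cosine similarity; the choice of k and the downstream classifier are illustrative assumptions, not the authors' exact settings.

```python
from itertools import product
import numpy as np

def kmer_spectrum(seq, k=4, alphabet="ACGT"):
    """Fixed-length k-mer count vector (the 'spectrum') of a DNA barcode."""
    index = {"".join(p): i for i, p in enumerate(product(alphabet, repeat=k))}
    vec = np.zeros(len(index))
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in index:                     # skips ambiguous bases such as 'N'
            vec[index[kmer]] += 1
    return vec

a = kmer_spectrum("ACGTACGTTACGAAGTC")
b = kmer_spectrum("ACGTACGTAACGAAGTT")
# Cosine similarity between spectra; classification/clustering operate on such vectors
print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```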

  17. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for analysis and quantitation of some selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa), over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to the conventional chemical-analytical methodology, like gas/liquid chromatography-mass spectrometry (GC/LC-MS), and GC-LC/tandem mass spectrometry (MSMS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrix. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrix. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2, EE2, is, in decreasing order: LC-MSMS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MSMS method was found to lack the required sensitivity for meeting environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, in South Africa is required. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentration values of the steroid estrogens appears to be appropriate for use in their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, raw and treated wastewater, the use of bioassays, with trigger values, is a useful screening tool option to decide whether further examination of specific endocrine activity may be warranted, or whether concentrations of such activity are of low priority, with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods, like ELISA, used for compound quantitation, and standardization of the method for measuring E2 equivalents (EEQs) used for biological activity (endocrine: e.g., estrogenic) are some areas for future EDC research.

  18. A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Lopes, Leonard V.

    2017-01-01

    More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method deriving the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable impedance liners is explored using a multi-objective optimization.

  19. Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics

    PubMed Central

    Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna

    2016-01-01

    Determining thawing times of frozen foods is a challenging problem as the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed. The proposed solutions range from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper, analytical, empirical and graphical models are presented and critically reviewed. The conditions of solution, limitations and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Using numerical methods is not always feasible, as the calculations are time-consuming and the specialized software and equipment can be costly. For these reasons, analytical-empirical models are more practical for engineering use. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction. Consequently, simplified methods are needed for thawing time estimation of agricultural and food products. The review reveals the need for further improvement of the existing solutions or development of new ones that will enable accurate determination of thawing time within a wide range of practical conditions of heat transfer during processing. PMID:27904387

  20. Combining Heterogeneous Correlation Matrices: Simulation Analysis of Fixed-Effects Methods

    ERIC Educational Resources Information Center

    Hafdahl, Adam R.

    2008-01-01

    Monte Carlo studies of several fixed-effects methods for combining and comparing correlation matrices have shown that two refinements improve estimation and inference substantially. With rare exception, however, these simulations have involved homogeneous data analyzed using conditional meta-analytic procedures. The present study builds on…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tukey, J.W.; Bloomfield, P.

    In its most general terms, the work carried out under the contract consists of the development of new data analytic methods and the improvement of existing methods, their implementation on computer, especially minicomputers, and the development of non-statistical, systems-level software to support these activities. The work reported or completed is reviewed. (GHT)

  2. Characterization of structural connections for multicomponent systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Huckelbridge, Arthur A.

    1988-01-01

    This study explores combining Component Mode Synthesis methods for coupling structural components with Parameter Identification procedures for improving the analytical modeling of the connections. Improvements in the connection stiffness and damping properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chhiber, R; Usmanov, AV; Matthaeus, WH

    Simple estimates of the number of Coulomb collisions experienced by the interplanetary plasma to the point of observation, i.e., the “collisional age”, can be usefully employed in the study of non-thermal features of the solar wind. Usually these estimates are based on local plasma properties at the point of observation. Here we improve the method of estimation of the collisional age by employing solutions obtained from global three-dimensional magnetohydrodynamics simulations. This enables evaluation of the complete analytical expression for the collisional age without using approximations. The improved estimation of the collisional timescale is compared with turbulence and expansion timescales to assess the relative importance of collisions. The collisional age computed using the approximate formula employed in previous work is compared with the improved simulation-based calculations to examine the validity of the simplified formula. We also develop an analytical expression for the evaluation of the collisional age and we find good agreement between the numerical and analytical results. Finally, we briefly discuss the implications for an improved estimation of collisionality along spacecraft trajectories, including Solar Probe Plus.
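
    For orientation, the simple local estimate that this work refines treats the collisional age as the proton-proton Coulomb collision frequency multiplied by the transit time to the observation point. The sketch below uses an NRL-formulary-style scaling for the collision frequency; the numerical coefficient, the fixed Coulomb logarithm, and the input values are rough assumptions, not the paper's complete expression.

```python
def collisional_age(n_p, T_p, r_au, V_sw, coulomb_log=20.0):
    """Local estimate A_c ~ nu_pp * (r / V_sw) of the solar wind collisional age.
    n_p  : proton density [cm^-3]
    T_p  : proton temperature [eV]
    r_au : heliocentric distance [AU]
    V_sw : solar wind speed [km/s]
    The collision frequency uses the approximate scaling
    nu_pp ~ 4.8e-8 * n * lnLambda * T**-1.5  [s^-1] (order-of-magnitude assumption)."""
    nu_pp = 4.8e-8 * n_p * coulomb_log * T_p**-1.5
    transit_time = r_au * 1.496e8 / V_sw          # km / (km/s) = s
    return nu_pp * transit_time

# Typical slow-wind values near 1 AU give an age of order 0.1-1
print(collisional_age(n_p=8.0, T_p=5.0, r_au=1.0, V_sw=400.0))
```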

  4. QSPR studies on the photoinduced-fluorescence behaviour of pharmaceuticals and pesticides.

    PubMed

    López-Malo, D; Bueso-Bordils, J I; Duart, M J; Alemán-López, P A; Martín-Algarra, R V; Antón-Fos, G M; Lahuerta-Zamora, L; Martínez-Calatayud, J

    2017-07-01

    Fluorimetric analysis is still a growing line of research in the determination of a wide range of organic compounds, including pharmaceuticals and pesticides, which makes it necessary to develop new strategies aimed at improving the performance of fluorescence determinations as well as the sensitivity and, especially, the selectivity of the newly developed analytical methods. This paper presents applications of a useful and growing tool suitable for fostering and improving research in the analytical field. Experimental screening, molecular connectivity and discriminant analysis are applied to organic compounds to predict their fluorescent behaviour after photodegradation by UV irradiation in a continuous flow manifold (multicommutation flow assembly). The screening was based on online fluorimetric measurement and comprised pre-selected compounds with different molecular structures (pharmaceuticals and some pesticides with known 'native' fluorescent behaviour) to study their changes in fluorescent behaviour after UV irradiation. Theoretical predictions agree with the results from the experimental screening and could be used to develop selective analytical methods, as well as helping to reduce the need for expensive, time-consuming and trial-and-error screening procedures.

  5. Asymmetric flow field-flow fractionation (AF4) for the quantification of nanoparticle release from tablets during dissolution testing.

    PubMed

    Engel, A; Plöger, M; Mulac, D; Langer, K

    2014-01-30

    Nanoparticles composed of poly(DL-lactide-co-glycolide) (PLGA) represent promising colloidal drug carriers for improved drug targeting. Although most research activities focus on intravenous application of these carriers, peroral administration has been described to improve the bioavailability of poorly soluble drugs. Based on these insights, the manuscript describes a model tablet formulation for PLGA-nanoparticles and especially its analytical characterisation with regard to a nanosized drug carrier. Besides physico-chemical tablet characterisation according to pharmacopoeias, the main goal of the study was the development of a suitable analytical method for the quantification of nanoparticle release from tablets. An analytical flow field-flow fractionation (AF4) method was established and validated which enables determination of nanoparticle content in solid dosage forms as well as quantification of particle release during dissolution testing. For particle detection, a multi-angle light scattering (MALS) detector was coupled to the AF4 system. After dissolution testing, the presence of unaltered PLGA-nanoparticles was successfully proved by dynamic light scattering and scanning electron microscopy. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Innovations in coating technology.

    PubMed

    Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut

    2008-01-01

    Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to sophisticated control of the site and rate of drug release. The high expectations for different coating technologies have required great efforts regarding the development of reproducible and controllable production processes. Basically, improvements in coating methods have focused on particle movement, spraying systems, and air and energy transport. Thereby, homogeneous distribution of coating material and increased drying efficiency should be accomplished in order to achieve high end-product quality. Moreover, given the FDA's expectation that end-product quality be designed into the manufacturing process (Quality by Design), the development of analytical methods for the analysis, management and control of coating processes has attracted special attention during recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology and intends to first familiarize the reader with the available procedures and to subsequently explain the application of different analytical tools. Aiming to structure this comprehensive field, coating technologies are primarily divided into pan and fluidized bed coating methods. Regarding pan coating procedures, pans rotating around inclined, horizontal and vertical axes are reviewed separately. On the other hand, fluidized bed technologies are subdivided into those involving fluidized and spouted beds. Then, continuous processing techniques and improvements in spraying systems are discussed in dedicated chapters. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section of the review.

  7. Laboratory Methods for the Measurement of Pollutants in Water and Waste Effluents

    NASA Technical Reports Server (NTRS)

    Ballinger, Dwight G.

    1971-01-01

    The need for accurate, precise, and rapid analytical procedures for the examination of water and waste samples requires the use of a variety of instruments. The instrumentation in water laboratories includes atomic absorption, UV-visible, and infrared spectrophotometers, automatic colorimetric analyzers, gas chromatographs and mass spectrometers. Because of the emphasis on regulatory action, attention is being directed toward quality control of analytical results. Among the challenging problems are the differentiation of metallic species in water at nanogram concentrations, rapid measurement of free cyanide and free ammonia, more sensitive methods for arsenic and selenium, and improved characterization of organic contaminants.

  8. Accommodating subject and instrument variations in spectroscopic determinations

    DOEpatents

    Haas, Michael J [Albuquerque, NM; Rowe, Robert K [Corrales, NM; Thomas, Edward V [Albuquerque, NM

    2006-08-29

    A method and apparatus for measuring a biological attribute, such as the concentration of an analyte, particularly a blood analyte in tissue such as glucose. The method utilizes spectrographic techniques in conjunction with an improved instrument-tailored or subject-tailored calibration model. In a calibration phase, calibration model data is modified to reduce or eliminate instrument-specific attributes, resulting in a calibration data set modeling intra-instrument or intra-subject variation. In a prediction phase, the prediction process is tailored for each target instrument separately using a minimal number of spectral measurements from each instrument or subject.

  9. Polysialylated N-Glycans Identified in Human Serum Through Combined Developments in Sample Preparation, Separations and Electrospray ionization-mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kronewitter, Scott R.; Marginean, Ioan; Cox, Jonathan T.

    The N-glycan diversity of human serum glycoproteins, i.e. the human blood serum N-glycome, is complex due to the range of glycan structures potentially synthesizable by human glycosylation enzymes. The reported glycome, however, is limited by methods of sample preparation, available analytical platforms, e.g., based upon electrospray ionization-mass spectrometry (ESI-MS), and software tools for data analysis. In this report, several improvements have been implemented in sample preparation and analysis to extend ESI-MS glycan characterization and to provide an improved view of glycan diversity. Sample preparation improvements, including acidified, microwave-accelerated PNGase F N-glycan release and sodium borohydride reduction, were optimized to improve quantitative yields and conserve the number of glycoforms detected. Two-stage desalting (during solid phase extraction and on the analytical column) increased the sensitivity by reducing analyte signal division between multiple reducing-end forms or cation adducts. On-line separations were improved by using extended-length graphitized carbon columns and adding TFA as an acid modifier to a formic acid/reversed phase gradient, which provides additional resolving power and significantly improved desorption of both large and heavily sialylated glycans. To improve MS sensitivity and provide gentler ionization conditions at the source-MS interface, subambient pressure ionization with nanoelectrospray (SPIN) was utilized. When these method improvements are combined with the recently described Glycomics Quintavariate Informed Quantification (GlyQ-IQ), they demonstrate the ability to significantly extend glycan detection sensitivity and provide expanded glycan coverage. We demonstrate application of these advances in the context of the human serum glycome, where our initial observations include detection of a new class of heavily sialylated N-glycans, including polysialylated N-glycans.

  10. Advances in Adaptive Control Methods

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2009-01-01

    This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.

  11. Improving the Analysis of Anthocyanidins from Blueberries Using Response Surface Methodology

    USDA-ARS?s Scientific Manuscript database

    Background: Recent interest in the health promoting potential of anthocyanins points to the need for robust and reliable analytical methods. It is essential to know that the health promoting chemicals are present in juices and other products processed from whole fruit. Many different methods have be...

  12. The calculation of transport properties in quantum liquids using the maximum entropy numerical analytic continuation method: Application to liquid para-hydrogen

    PubMed Central

    Rabani, Eran; Reichman, David R.; Krilov, Goran; Berne, Bruce J.

    2002-01-01

    We present a method based on augmenting an exact relation between a frequency-dependent diffusion constant and the imaginary time velocity autocorrelation function, combined with the maximum entropy numerical analytic continuation approach to study transport properties in quantum liquids. The method is applied to the case of liquid para-hydrogen at two thermodynamic state points: a liquid near the triple point and a high-temperature liquid. Good agreement for the self-diffusion constant and for the real-time velocity autocorrelation function is obtained in comparison to experimental measurements and other theoretical predictions. Improvement of the methodology and future applications are discussed. PMID:11830656

  13. Tributyltin--critical pollutant in whole water samples--development of traceable measurement methods for monitoring under the European Water Framework Directive (WFD) 2000/60/EC.

    PubMed

    Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert

    2015-07-01

    Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input in the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.

  14. Supervised Variational Relevance Learning, An Analytic Geometric Feature Selection with Applications to Omic Datasets.

    PubMed

    Boareto, Marcelo; Cesar, Jonatas; Leite, Vitor B P; Caticha, Nestor

    2015-01-01

    We introduce Supervised Variational Relevance Learning (Suvrel), a variational method to determine metric tensors that define distance-based similarity in pattern classification, inspired by relevance learning. The variational method is applied to a cost function that penalizes large intraclass distances and favors small interclass distances. We find analytically the metric tensor that minimizes the cost function. Preprocessing the patterns by applying linear transformations using the metric tensor yields a dataset which can be more efficiently classified. We test our methods on publicly available datasets with several standard classifiers. Among these datasets, two were tested by the MAQC-II project and, even without the use of further preprocessing, our results improve on their performance.

  15. Assessment of eutrophication in estuaries: Pressure-state-response and source apportionment

    Treesearch

    David Whitall; Suzanne Bricker

    2006-01-01

    The National Estuarine Eutrophication Assessment (NEEA) Update Program is a management oriented program designed to improve monitoring and assessment efforts through the development of type specific classification of estuaries that will allow improved assessment methods and development of analytical and research models and tools for managers which will help guide and...

  16. A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare

    ERIC Educational Resources Information Center

    Yahav, Inbal

    2010-01-01

    In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…

  17. Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice

    NASA Astrophysics Data System (ADS)

    Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.

    2013-10-01

    Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool in quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This, in part, is due to better curve linearity offered by HPLC than IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics.

  18. Analysis of polyphosphates in fish and shrimps tissues by two different ion chromatography methods: implications on false-negative and -positive findings.

    PubMed

    Kaufmann, A; Maden, K; Leisser, W; Matera, M; Gude, T

    2005-11-01

    Inorganic polyphosphates (di-, tri- and higher polyphosphates) can be used to treat fish, fish fillets and shrimps in order to improve their water-binding capacity. The practical relevance of this treatment is a significant gain of weight caused by the retention/uptake of water and natural juice into the fish tissues. This practice is legal; however, the use of phosphates has to be declared. The routine control testing of fish for the presence of polyphosphates produced some results that were difficult to explain. One of the two analytical methods used determined low diphosphate concentrations in a number of untreated samples, while the other ion chromatography (IC) method did not detect them. This initiated a number of investigations: results showed that polyphosphates in fish and shrimp tissue undergo a rapid enzymatic degradation, producing the ubiquitous orthophosphate. This led to the conclusion that sensitive analytical methods are required in order to detect previous polyphosphate treatment of a sample. The polyphosphate concentrations detected by one of the analytical methods could not be explained by the degradation of endogenous high-energy nucleotides like ATP into diphosphate, but by a coeluting compound. Further investigations by LC-MS-MS proved that the substance responsible for the observed peak was inosine monophosphate (IMP) and not, as initially thought, the inorganic diphosphate. The method producing the false-positive result was modified and both methods were ultimately able to detect polyphosphates well separated from natural nucleotides. Polyphosphates could no longer be detected (<0.5 mg/kg) after modification of the analytical methodology. The relevance of these findings lies in the fact that similar analytical methods are employed in various control laboratories, which might lead to false interpretation of measurements.

  19. Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.

    PubMed

    Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim

    2016-04-01

    Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well-suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validation of FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF, and show that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs, and represents a distinctive groundwork that is able to sustain future refinements. This paper suggests important features that may be included to improve model realism.

  20. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and α-spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of 'real' environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously. 24 refs., 2 figs., 2 tabs.

  1. Students' science process skill and analytical thinking ability in chemistry learning

    NASA Astrophysics Data System (ADS)

    Irwanto, Rohaeti, Eli; Widjajanti, Endang; Suyanta

    2017-08-01

    Science process skill and analytical thinking ability are needed in chemistry learning in the 21st century. Analytical thinking is related to science process skill, which is used by students to solve complex and unstructured problems. Thus, this research aims to determine the science process skill and analytical thinking ability of senior high school students in chemistry learning. The research was conducted in Tiga Maret Yogyakarta Senior High School, Indonesia, in the middle of the first semester of the 2015/2016 academic year, using the survey method. The survey involved 21 grade XI students as participants. Students were given a set of test questions consisting of 15 essay questions. The result indicated that the science process skill and analytical thinking ability were relatively low, i.e., 30.67%. Therefore, teachers need to improve the students' cognitive and psychomotor domains effectively in the learning process.

  2. Miniaturizing and automation of free acidity measurements for uranium (VI)-HNO3 solutions: Development of a new sequential injection analysis for a sustainable radio-analytical chemistry.

    PubMed

    Néri-Quiroz, José; Canto, Fabrice; Guillerme, Laurent; Couston, Laurent; Magnaldo, Alastair; Dugas, Vincent

    2016-10-01

    A miniaturized and automated approach for the determination of free acidity in solutions containing uranium (VI) is presented. The measurement technique is based on the concept of sequential injection analysis with on-line spectroscopic detection. The proposed methodology relies on the complexation and alkalimetric titration of nitric acid using a pH 5.6 sodium oxalate solution. The titration process is followed by UV/VIS detection at 650 nm thanks to the addition of Congo red as a universal pH indicator. The mixing sequence as well as the method validity was investigated by numerical simulation. This new analytical design allows fast (2.3 min), reliable and accurate free acidity determination of low volume samples (10 µL) containing a uranium/[H⁺] mole ratio of 1:3, with a relative standard deviation of <7.0% (n=11). The linearity range of the free nitric acid measurement is excellent up to 2.77 mol L⁻¹, with a correlation coefficient (R²) of 0.995. The method is specific; the presence of actinide ions up to 0.54 mol L⁻¹ does not interfere with the determination of free nitric acid. In addition to automation, the developed sequential injection analysis method greatly improves the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousandfold, the nuclear waste per analysis fortyfold, and the analysis time eightfold. These analytical parameters are important especially in nuclear-related applications to improve laboratory safety, reduce personnel exposure to radioactive samples, and drastically reduce environmental impacts and analytical radioactive waste. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data used were the IQC coefficient of variation percentage (CV%) and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
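
    The sigma metric and quality goal index referred to above are simple ratios of the allowable total error, bias, and imprecision. The sketch below shows the conventional formulas; the TEa, bias, and CV values are hypothetical and not taken from the study.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric = (TEa - |bias|) / CV, with all terms in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = |bias| / (1.5 * CV): <0.8 points to imprecision, >1.2 to inaccuracy."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Hypothetical analyte: allowable total error 10%, EQAS bias 2%, IQC CV 4%
print(sigma_metric(10.0, 2.0, 4.0))        # 2.0 -> below 3 sigma, needs improvement
print(quality_goal_index(2.0, 4.0))        # ~0.33 -> imprecision is the main issue
```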

  4. Use of LC-HRMS in full scan-XIC mode for multi-analyte urine drug testing - a step towards a 'black-box' solution?

    PubMed

    Stephanson, N N; Signell, P; Helander, A; Beck, O

    2017-08-01

    The influx of new psychoactive substances (NPS) has created a need for improved methods for drug testing in toxicology laboratories. The aim of this work was to design, validate and apply a multi-analyte liquid chromatography-high-resolution mass spectrometry (LC-HRMS) method for screening of 148 target analytes belonging to the NPS class, plant alkaloids and new psychoactive therapeutic drugs. The analytical method used a fivefold dilution of urine with nine deuterated internal standards and injection of 2 μl. The LC system involved a 2.0 μm 100 × 2.0 mm YMC-UltraHT Hydrosphere-C18 column and gradient elution with a flow rate of 0.5 ml/min and a total analysis time of 6.0 min. Solvent A consisted of 10 mmol/l ammonium formate and 0.005% formic acid, pH 4.8, and Solvent B was methanol with 10 mmol/l ammonium formate and 0.005% formic acid. The HRMS (Q Exactive, Thermo Scientific) used a heated electrospray interface and was operated in positive mode with 70 000 resolution. The scan range was 100-650 Da, and data for extracted ion chromatograms used a ±10 ppm tolerance. Product ion monitoring was applied for confirmation analysis and, for some selected analytes, also for screening. Method validation demonstrated limited influence from urine matrix, linear response within the measuring range (typically 0.1-1.0 μg/ml) and acceptable imprecision in quantification (CV <15%). A few analytes were found to be unstable in urine upon storage. The method was successfully applied for routine drug testing of 17 936 unknown samples, of which 2715 (15%) contained 52 of the 148 analytes. It is concluded that the method design based on simple dilution of urine and using LC-HRMS in extracted ion chromatogram mode may offer an analytical system for urine drug testing that fulfils the requirement of a 'black box' solution and can replace immunochemical screening applied on autoanalyzers. Copyright © 2017 John Wiley & Sons, Ltd.
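
    In full scan-XIC screening, each target is searched by summing signal inside a narrow mass window around its exact m/z. The helper below computes the ±10 ppm window mentioned above; the target m/z is a hypothetical protonated-molecule value used only to show the arithmetic.

```python
def xic_window(target_mz, tol_ppm=10.0):
    """Lower/upper m/z bounds of an extracted ion chromatogram at +/- tol_ppm."""
    delta = target_mz * tol_ppm * 1e-6
    return target_mz - delta, target_mz + delta

lo, hi = xic_window(276.1594)      # hypothetical [M+H]+ target
print(round(lo, 4), round(hi, 4))  # 276.1566 .. 276.1622
```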

  5. Shared decision-making – transferring research into practice: the Analytic Hierarchy Process (AHP)

    PubMed Central

    Dolan, James G.

    2008-01-01

    Objective: To illustrate how the Analytic Hierarchy Process (AHP) can be used to promote shared decision-making and enhance clinician-patient communication. Methods: Tutorial review. Results: The AHP promotes shared decision making by creating a framework that is used to define the decision, summarize the information available, prioritize information needs, elicit preferences and values, and foster meaningful communication among decision stakeholders. Conclusions: The AHP and related multi-criteria methods have the potential for improving the quality of clinical decisions and overcoming current barriers to implementing shared decision making in busy clinical settings. Further research is needed to determine the best way to implement these tools and to determine their effectiveness. Practice Implications: Many clinical decisions involve preference-based trade-offs between competing risks and benefits. The AHP is a well-developed method that provides a practical approach for improving patient-provider communication, clinical decision-making, and the quality of patient care in these situations. PMID:18760559
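
    The core AHP computation behind the framework described above derives priority weights from a pairwise comparison matrix and checks its consistency. The sketch below uses the principal-eigenvector approach on an illustrative 3x3 matrix of Saaty-scale judgments; the values are not taken from the tutorial.

```python
import numpy as np

# Pairwise comparisons of three decision criteria on the 1-9 Saaty scale (illustrative)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                                  # normalized priority weights

lam_max = np.real(vals).max()
ci = (lam_max - 3) / (3 - 1)                     # consistency index
cr = ci / 0.58                                   # random index for a 3x3 matrix is 0.58
print(np.round(w, 3), round(cr, 3))              # CR < 0.1 means acceptably consistent
```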

  6. Combining an Analytic Hierarchy Process and TOPSIS for Selecting Postharvest Technology Method for Selayar Citrus in Indonesia

    NASA Astrophysics Data System (ADS)

    Dirpan, Andi

    2018-05-01

    This research was intended to select the best handling method or postharvest technology for maintaining the quality of citrus fruit in Selayar, South Sulawesi, Indonesia, among (1) modified atmosphere packaging (MAP), (2) controlled atmosphere storage (CAS), (3) coatings, (4) hot water treatment (HWT), and (5) hot calcium dip (HCD), by using a combination of the analytic hierarchy process (AHP) and TOPSIS. Improving quality, applicability, increasing shelf life and reducing cost were used as the criteria to determine the best postharvest technology. The results show that the most important criterion for selecting postharvest technology is improving quality, followed by increasing shelf life, reducing cost and applicability. Furthermore, by using TOPSIS, it is clear that the postharvest technology with the lowest (best) rank number is modified atmosphere packaging (MAP), followed by controlled atmosphere storage (CAS), coatings, hot calcium dip (HCD) and hot water treatment (HWT). Therefore, it can be concluded that the best postharvest technology method for Selayar citrus is modified atmosphere packaging (MAP).
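
    The TOPSIS step referred to above ranks the alternatives by their relative closeness to an ideal solution, using criterion weights such as those produced by AHP. The sketch below is illustrative only: the scores, weights, and benefit/cost designations are invented, not the study's data.

```python
import numpy as np

# Decision matrix: rows = MAP, CAS, coatings, HWT, HCD; columns = quality,
# shelf life, cost, applicability (illustrative 1-9 scores; cost = lower is better)
X = np.array([[9, 9, 3, 6],
              [9, 9, 5, 4],
              [7, 6, 4, 8],
              [5, 5, 5, 8],
              [6, 6, 5, 7]], dtype=float)
w = np.array([0.40, 0.30, 0.17, 0.13])             # e.g. AHP-derived weights
benefit = np.array([True, True, False, True])      # cost is a non-benefit criterion

R = X / np.linalg.norm(X, axis=0)                  # vector normalization per criterion
V = R * w                                          # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
closeness = (np.linalg.norm(V - anti, axis=1)
             / (np.linalg.norm(V - ideal, axis=1) + np.linalg.norm(V - anti, axis=1)))
print(np.argsort(-closeness))                      # alternative indices, best first
```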

  7. JPRS Report, Science & Technology, Japan

    DTIC Science & Technology

    1988-10-05

    collagen, we are conducting research on the immobilization, through chemical bond rather than physical absorption, of collagen on synthetic material...of a large number of samples are conducted by using automated apparatus and enzymatic reagents, it is natural to devise a method to use natural...improvement of enzymatic analytical methods; 3) development of reaction system and instrumentation system; 4) research on sample treatment methods; and

  8. An orientation measurement method based on Hall-effect sensors for permanent magnet spherical actuators with 3D magnet array.

    PubMed

    Yan, Liang; Zhu, Bo; Jiao, Zongxia; Chen, Chin-Yin; Chen, I-Ming

    2014-10-24

    An orientation measurement method based on Hall-effect sensors is proposed for permanent magnet (PM) spherical actuators with a three-dimensional (3D) magnet array. As there is no contact between the measurement system and the rotor, this method can effectively avoid the friction torque and additional inertial moment present in conventional approaches. A curved surface fitting method based on exponential approximation is proposed to formulate the magnetic field distribution in 3D space. The comparison with the conventional modeling method shows that it helps to improve the model accuracy. The Hall-effect sensors are distributed around the rotor with PM poles to detect the flux density at different points, and thus the rotor orientation can be computed from the measured results and analytical models. Experiments have been conducted on the developed research prototype of the spherical actuator to validate the accuracy of the analytical equations relating the rotor orientation and the value of magnetic flux density. The experimental results show that the proposed method can measure the rotor orientation precisely, and the measurement accuracy can be improved by the novel 3D magnet array. The study results could be used for real-time motion control of PM spherical actuators.
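
    The "curved surface fitting based on exponential approximation" mentioned above amounts to fitting a smooth analytical surface to flux-density samples so that the mapping from flux readings to rotor orientation can be evaluated in real time. The sketch below fits a simple exponential-in-angle surface to synthetic Hall-sensor data with least squares; the functional form, parameters, and data are illustrative assumptions, not the paper's actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic "measurements": flux density sampled over rotor latitude/longitude angles
alpha, beta = np.meshgrid(np.linspace(-0.6, 0.6, 13), np.linspace(-np.pi, np.pi, 25))
B_meas = 0.35 * np.exp(-2.1 * alpha**2) * np.cos(4 * beta)
B_meas += np.random.default_rng(1).normal(0, 0.002, B_meas.shape)   # sensor noise

def model(angles, B0, k, p):
    a, b = angles
    return B0 * np.exp(-k * a**2) * np.cos(p * b)   # illustrative exponential surface

xdata = np.vstack([alpha.ravel(), beta.ravel()])
popt, _ = curve_fit(model, xdata, B_meas.ravel(), p0=[0.3, 2.0, 4.0])
print(np.round(popt, 3))   # fitted (B0, k, p); the surface is then used to relate flux to orientation
```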

  9. Improved DNA hybridization parameters by Twisted Intercalating Nucleic Acid (TINA).

    PubMed

    Schneider, Uffe Vest

    2012-01-01

    This thesis establishes oligonucleotide design rules and applications of a novel group of DNA-stabilizing molecules collectively called Twisted Intercalating Nucleic Acid - TINA. Three peer-reviewed publications form the basis for the thesis. One publication describes an improved and rapid method for determination of DNA melting points, and two publications describe the effects of positioning TINA molecules in parallel triplex helix and antiparallel duplex helix forming DNA structures. The third publication establishes that oligonucleotides containing TINA molecules improve the analytical sensitivity of an antiparallel duplex hybridization-based capture assay compared to conventional DNA oligonucleotides. Clinical microbiology is traditionally based on culture of pathogenic microorganisms and serological tests. The introduction of DNA target amplification methods like PCR has improved the analytical sensitivity and total turnaround time involved in clinical diagnostics of infections. Due to the relatively weak hybridization between the two strands of double-stranded DNA, a number of nucleic acid stabilizing molecules have been developed to improve the sensitivity of DNA-based diagnostics through superior binding properties. A short introduction is given to Watson-Crick and Hoogsteen based DNA binding and the derived DNA structures. A number of other nucleic acid stabilizing molecules are described. The stabilizing effect of TINA molecules on different DNA structures is discussed and considered in relation to other nucleic acid stabilizing molecules and in relation to future use of TINA-containing oligonucleotides in clinical diagnostics and therapy. In conclusion, design of TINA-modified oligonucleotides for antiparallel duplex helixes and parallel triplex helixes follows simple purpose-dependent rules. TINA molecules are well suited for improving multiplex PCR assays and can be used as part of novel technologies. Future research should test whether combinations of TINA molecules and other nucleic acid stabilizing molecules can increase analytical sensitivity whilst maintaining nucleobase mismatch discrimination in triplex helix based diagnostic assays.

  10. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    PubMed

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set and to discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage nurse-initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after enrolment of 58% of the original study sample, in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
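
    The control-chart analysis described here can be reproduced in outline with a Shewhart individuals (XmR) chart; the sketch below uses invented monthly length-of-stay values and the standard 3-sigma limits derived from the average moving range (the data and the exact chart rules used in the study are assumptions).

        import numpy as np

        # Hypothetical monthly mean ED length-of-stay values (minutes); illustrative only.
        monthly_los = np.array([128, 131, 125, 122, 118, 112, 110, 109, 111, 108, 107], float)

        # Individuals (X) chart: centre line and limits from the average moving range.
        centre = monthly_los.mean()
        moving_range = np.abs(np.diff(monthly_los))
        sigma_hat = moving_range.mean() / 1.128          # d2 constant for n = 2
        ucl, lcl = centre + 3 * sigma_hat, centre - 3 * sigma_hat

        # A simple special-cause signal: any point outside the 3-sigma limits.
        signals = np.where((monthly_los > ucl) | (monthly_los < lcl))[0]
        print(round(centre, 1), round(ucl, 1), round(lcl, 1), signals)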

  11. A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.

    PubMed

    Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema

    2016-01-01

    A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method used in industry that targets near-zero error (3.4 errors per million events). The five main phases of Six Sigma are define, measure, analyse, improve, and control (DMAIC). Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology for error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, the administrative supervisor, and the head of the department. Using Six Sigma methodology, the error rate was measured monthly and the distribution of errors across the pre-analytical, analytical, and post-analytical phases was analysed. Improvement strategies were reviewed in the monthly intradepartmental meetings, and the units with high error rates were monitored. Fifty-six (52.4%) of the 107 recorded errors occurred in the pre-analytical phase, 45 (42%) in the analytical phase, and 6 (5.6%) in the post-analytical phase. Two of the 45 analytical errors were major, irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, a decrease of 79.77%. The Six Sigma trial in our pathology laboratory reduced error rates mainly in the pre-analytical and analytical phases.
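
    For readers unfamiliar with the arithmetic, the conversion from an error count to defects per million opportunities (DPMO) and an approximate sigma level looks like the sketch below; the counts are invented and the conventional 1.5-sigma shift is an assumption of the standard Six Sigma tables, not a figure from the study.

        from scipy.stats import norm

        def sigma_level(errors, opportunities):
            """Convert an observed error count into DPMO and an approximate sigma level."""
            dpmo = errors / opportunities * 1_000_000
            # Conventional 1.5-sigma shift used in Six Sigma tables.
            return dpmo, norm.ppf(1 - dpmo / 1_000_000) + 1.5

        # Illustrative numbers only; the study reports rates per million, not raw counts.
        print(sigma_level(56, 8_000_000))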

  12. Efficient alignment-free DNA barcode analytics

    PubMed Central

    Kuksa, Pavel; Pavlovic, Vladimir

    2009-01-01

    Background In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations that model sequences as collections of short sequence fragments (features). The methods use fixed-length representations (spectra) of barcode sequences to measure similarities or dissimilarities between sequences from the same or different species. The spectrum-based representation not only allows accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci that distinguish barcodes of different sample groups. Results The new alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species, with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, which are important analytical tasks in many practical applications (monitoring of adverse species movement, sampling surveys for identification of unknown or pathogenic species, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, Fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running-time improvements over the state-of-the-art methods. Conclusion Our results show that newly developed alignment-free methods for DNA barcoding can efficiently, and with high accuracy, identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding. PMID:19900305
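
    A minimal sketch of the spectrum (k-mer count) representation and one possible alignment-free similarity score is shown below; the choice of k, the cosine score, and the toy sequences are illustrative assumptions rather than the exact formulation used by the authors.

        from collections import Counter
        from itertools import product
        import numpy as np

        def spectrum(seq, k=4):
            """Fixed-length k-mer count vector (the 'spectrum') of a DNA barcode."""
            index = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=k))}
            counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
            vec = np.zeros(len(index))
            for kmer, c in counts.items():
                if kmer in index:            # skips k-mers with ambiguous bases
                    vec[index[kmer]] = c
            return vec

        def similarity(a, b, k=4):
            """Cosine similarity between two barcode spectra."""
            va, vb = spectrum(a, k), spectrum(b, k)
            return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

        print(similarity("ACGTACGTTGCA" * 5, "ACGTACGATGCA" * 5))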

  13. Development and validation of an improved method for the determination of chloropropanols in paperboard food packaging by GC-MS.

    PubMed

    Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G

    2015-01-01

    The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking with a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation with a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethylsilyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised, and the cost and time of the analysis were reduced by using 10 times less sample, solvent and reagent than in previously described methods. Overall, the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) in the aqueous extract was 0.010 mg kg⁻¹ (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).

  14. Simultaneous specimen and stage cleaning device for analytical electron microscope

    DOEpatents

    Zaluzec, Nestor J.

    1996-01-01

    An improved method and apparatus are provided for cleaning a specimen stage, a specimen, and the interior of an analytical electron microscope (AEM). The apparatus comprises a plasma chamber for containing a gas plasma and an air lock coupled to the plasma chamber for admitting the specimen stage and specimen into the chamber while maintaining an airtight seal. The specimen stage and specimen are subjected to a reactive plasma gas that is either DC or RF excited. The apparatus can be mounted on the AEM to clean the interior of the microscope.

  15. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A.

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  16. Control research in the NASA high-alpha technology program

    NASA Technical Reports Server (NTRS)

    Gilbert, William P.; Nguyen, Luat T.; Gera, Joseph

    1990-01-01

    NASA is conducting a focused technology program, known as the High-Angle-of-Attack Technology Program, to accelerate the development of flight-validated technology applicable to the design of fighters with superior stall and post-stall characteristics and agility. A carefully integrated effort is underway combining wind tunnel testing, analytical predictions, piloted simulation, and full-scale flight research. A modified F-18 aircraft has been extensively instrumented for use as the NASA High-Angle-of-Attack Research Vehicle for flight verification of new methods and concepts. The program stresses the importance of providing improved aircraft control capabilities both through powered control (such as thrust vectoring) and through innovative aerodynamic control concepts. It is accomplishing extensive coordinated ground and flight testing to assess and improve available experimental and analytical methods and to develop new concepts for enhanced aerodynamics and for the effective control, guidance, and cockpit displays essential for pilot utilization of the increased agility.

  17. Rapid determination of quinolones in cosmetic products by ultra high performance liquid chromatography with tandem mass spectrometry.

    PubMed

    Liu, Shao-Ying; Huang, Xi-Hui; Wang, Xiao-Fang; Jin, Quan; Zhu, Guo-Nian

    2014-05-01

    This study developed an improved analytical method for the simultaneous quantification of 13 quinolones in cosmetics by ultra-high performance liquid chromatography combined with ESI triple quadrupole MS/MS in multiple reaction monitoring mode. The analytes were extracted and purified using an SPE cartridge. The limits of quantification ranged from 0.03 to 3.02 μg/kg, and the precision (RSD) for determining the quinolones was below 19.39%. The method was successfully applied to the determination of quinolones in real cosmetic samples. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Evaluation of Superparamagnetic Silica Nanoparticles for Extraction of Triazines in Magnetic in-Tube Solid Phase Microextraction Coupled to Capillary Liquid Chromatography

    PubMed Central

    González-Fuenzalida, R. A.; Moliner-Martínez, Y.; Prima-Garcia, Helena; Ribera, Antonio; Campins-Falcó, P.; Zaragozá, Ramon J.

    2014-01-01

    The use of magnetic nanomaterials for analytical applications has increased in recent years. In particular, magnetic nanomaterials have shown great potential as adsorbent phases in several extraction procedures owing to significant advantages over conventional methods. In the present work, the influence of magnetic forces on the extraction efficiency of triazines using superparamagnetic silica nanoparticles (NPs) in magnetic in-tube solid phase microextraction (Magnetic-IT-SPME) coupled to CapLC has been evaluated. Atrazine, terbutylazine and simazine were selected as target analytes. The superparamagnetic silica nanomaterial (SiO2-Fe3O4) deposited onto the surface of a capillary column gave rise to a magnetic extraction phase for IT-SPME that enhanced the extraction efficiency for triazines. This improvement is based on two phenomena: the superparamagnetic behavior of the Fe3O4 NPs and the diamagnetic repulsions that take place in a microfluidic device such as a capillary column. A systematic study of analyte adsorption and desorption was conducted as a function of the magnetic field and of its relationship with the magnetic susceptibility of the triazines. The positive influence of magnetism on the extraction procedure was demonstrated. The analytical characteristics of the optimized procedure were established, and the method was applied to the determination of the target analytes in water samples with satisfactory results. When coupling Magnetic-IT-SPME with CapLC, improved adsorption efficiencies (60%–63%) were achieved compared with conventional adsorption materials (0.8%–3%). PMID:28344221

  19. Sensitive ionization of non-volatile analytes using protein solutions as spray liquid in desorption electrospray ionization mass spectrometry.

    PubMed

    Zhu, Zhiqiang; Han, Jing; Zhang, Yan; Zhou, Yafei; Xu, Ning; Zhang, Bo; Gu, Haiwei; Chen, Huanwen

    2012-12-15

    Desorption electrospray ionization (DESI) is the most popular ambient ionization technique for direct analysis of complex samples without sample pretreatment. However, for many applications, especially for trace analysis, it is of interest to improve the sensitivity of DESI-mass spectrometry (MS). In traditional DESI-MS, a mixture of methanol/water/acetic acid is usually used to generate the primary ions. In this article, dilute protein solutions were electrosprayed in the DESI method to create multiply charged primary ions for the desorption ionization of trace analytes on various surfaces (e.g., filter paper, glass, Al-foil) without any sample pretreatment. The analyte ions were then detected and structurally characterized using a LTQ XL mass spectrometer. Compared with the methanol/water/acetic acid (49:49:2, v/v/v) solution, protein solutions significantly increased the signal levels of non-volatile compounds such as benzoic acid, TNT, o-toluidine, peptide and insulin in either positive or negative ion detection mode. For all the analytes tested, the limits of detection (LODs) were reduced to about half of the original values which were obtained using traditional DESI. The results showed that the signal enhancement is highly correlated with the molecular weight of the proteins and the selected solid surfaces. The proposed DESI method is a universal strategy for rapid and sensitive detection of trace amounts of strongly bound and/or non-volatile analytes, including explosives, peptides, and proteins. The results indicate that the sensitivity of DESI can be further improved by selecting larger proteins and appropriate solid surfaces. Copyright © 2012 John Wiley & Sons, Ltd.

  20. Analytical interference of 4-hydroxy-3-methoxymethamphetamine with the measurement of plasma free normetanephrine by ultra-high pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Dunand, Marielle; Donzelli, Massimiliano; Rickli, Anna; Hysek, Cédric M; Liechti, Matthias E; Grouzmann, Eric

    2014-08-01

    The diagnosis of pheochromocytoma relies on the measurement of plasma free metanephrines, an assay whose reliability has been considerably improved by ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Here we report an analytical interference between 4-hydroxy-3-methoxymethamphetamine (HMMA), a metabolite of 3,4-methylenedioxymethamphetamine (MDMA, "Ecstasy"), and normetanephrine (NMN), which share a common pharmacophore and therefore yield the same product ion after fragmentation. Synthetic HMMA was spiked into plasma samples containing various concentrations of NMN, and the intensity of the interference was determined by UHPLC-MS/MS before and after improvement of the analytical method. By carefully adjusting the chromatographic conditions, including a change of the UHPLC analytical column, we were able to distinguish the two compounds. HMMA interference in NMN determination should be considered seriously, since MDMA activates the sympathetic nervous system and, if confounded with NMN, may lead to false-positive results in the differential diagnosis of pheochromocytoma. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  1. Lack of correlation between reaction speed and analytical sensitivity in isothermal amplification reveals the value of digital methods for optimization: validation using digital real-time RT-LAMP

    PubMed Central

    Khorosheva, Eugenia M.; Karymov, Mikhail A.; Selck, David A.; Ismagilov, Rustem F.

    2016-01-01

    In this paper, we asked whether it is possible to identify the best primers and reaction conditions based on improvements in reaction speed when optimizing isothermal reactions. We used digital single-molecule, real-time analyses of both the speed and the efficiency of isothermal amplification reactions, which revealed that improvements in the speed of isothermal amplification did not always correlate with improvements in digital efficiency (the fraction of molecules that amplify) or with analytical sensitivity. However, we observed that the speeds of amplification for single-molecule (in a digital device) and multi-molecule (e.g. in a PCR well plate) formats always correlated for the same conditions. Also, digital efficiency correlated with the analytical sensitivity of the same reaction performed in a multi-molecule format. Our findings were supported experimentally with examples of primer design, the use or exclusion of loop primers in different combinations, and the use of different enzyme mixtures in one-step reverse-transcription loop-mediated amplification (RT-LAMP). Our results show that measuring the digital efficiency of amplification of single template molecules allows quick, reliable comparisons of the analytical sensitivity of reactions under any two tested conditions, independent of the speeds of the isothermal amplification reactions. PMID:26358811
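
    Digital efficiency rests on counting the fraction of partitions that amplify and correcting it for partitions that received more than one template; a minimal sketch of that Poisson correction is given below (partition counts, volumes, and the independently known loading are invented for illustration).

        import numpy as np

        def amplifiable_concentration(positive, total_partitions, partition_volume_ul):
            """Estimate amplifiable template concentration from a digital experiment."""
            p = positive / total_partitions
            lam = -np.log(1.0 - p)              # mean amplifiable molecules per partition
            return lam / partition_volume_ul    # molecules per microlitre

        detected = amplifiable_concentration(positive=412, total_partitions=1280,
                                             partition_volume_ul=0.005)
        loaded = 120.0   # molecules/uL assumed known from an independent quantification
        print(round(detected, 1), round(detected / loaded, 2))   # digital efficiency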

  2. Analytical and experimental studies of an optimum multisegment phased liner noise suppression concept

    NASA Technical Reports Server (NTRS)

    Sawdy, D. T.; Beckemeyer, R. J.; Patterson, J. D.

    1976-01-01

    Results are presented from detailed analytical studies made to define methods for obtaining improved multisegment lining performance by taking advantage of relative placement of each lining segment. Properly phased liner segments reflect and spatially redistribute the incident acoustic energy and thus provide additional attenuation. A mathematical model was developed for rectangular ducts with uniform mean flow. Segmented acoustic fields were represented by duct eigenfunction expansions, and mode-matching was used to ensure continuity of the total field. Parametric studies were performed to identify attenuation mechanisms and define preliminary liner configurations. An optimization procedure was used to determine optimum liner impedance values for a given total lining length, Mach number, and incident modal distribution. Optimal segmented liners are presented and it is shown that, provided the sound source is well-defined and flow environment is known, conventional infinite duct optimum attenuation rates can be improved. To confirm these results, an experimental program was conducted in a laboratory test facility. The measured data are presented in the form of analytical-experimental correlations. Excellent agreement between theory and experiment verifies and substantiates the analytical prediction techniques. The results indicate that phased liners may be of immediate benefit in the development of improved aircraft exhaust duct noise suppressors.

  3. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, because in order to simplify the expressions and subsequent computations not all the relevant forces are taken into account and only low-order terms are considered, and because mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by combining three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to capture the effect produced by the flattening of the Earth. The three analytical components are the integration of the Kepler problem, a first-order analytical theory, and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
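
    As a rough illustration of the prediction component, the sketch below hand-rolls an additive Holt-Winters scheme and applies it to an invented residual series (analytical theory minus reference) with a periodic signature; the smoothing constants, period, and data are assumptions, not values from the paper.

        import numpy as np

        def holt_winters_additive(x, period, alpha=0.3, beta=0.05, gamma=0.2, horizon=10):
            """Minimal additive Holt-Winters: fit level, trend and seasonal components
            to a residual series and extrapolate it forward."""
            level = x[:period].mean()
            trend = (x[period:2 * period].mean() - x[:period].mean()) / period
            season = list(x[:period] - level)
            for t in range(period, len(x)):
                last_level = level
                level = alpha * (x[t] - season[t - period]) + (1 - alpha) * (level + trend)
                trend = beta * (level - last_level) + (1 - beta) * trend
                season.append(gamma * (x[t] - level) + (1 - gamma) * season[t - period])
            return np.array([level + (h + 1) * trend + season[len(x) - period + h % period]
                             for h in range(horizon)])

        t = np.arange(200)
        residuals = 0.002 * t + 0.05 * np.sin(2 * np.pi * t / 20) + np.random.normal(0, 0.005, 200)
        print(holt_winters_additive(residuals, period=20)[:5])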

  4. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa

    PubMed Central

    Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Background Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. Objectives This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Methods Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Results Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. Conclusions The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care. PMID:28879108

  5. Evaluation of the effect of a health education campaign of HIV by using an analytical hierarchy process method.

    PubMed

    Tan, Xiaodong; Lin, Jianyan; Wang, Fengjie; Luo, Hong; Luo, Lan; Wu, Lei

    2007-09-01

    This study was designed to assess the status of HIV/AIDS knowledge, attitudes and practices (KAP) among different populations and to provide scientific evidence for further health education. Three rounds of questionnaires were administered among service industry workers selected through stratified cluster sampling. Study subjects included hotel attendants, employees of beauty parlors, and service workers in the transportation industry. Data were analyzed using the analytical hierarchy process. All groups demonstrated high overall KAP, and the synthetic scoring indexes of the three surveys were above 75%. However, the rates of correct responses to questions on whether mosquito bites can transmit HIV/AIDS and on the relationship between STDs and HIV were lower than expected, and attitudes towards people living with HIV and AIDS need to be improved. Moreover, the effect of health education on these groups was unclear. In conclusion, the analytical hierarchy process is a valid method for estimating the overall effect of HIV/AIDS health education. Although the present status of HIV/AIDS KAP among the service industry workers was relatively good, greater efforts should be made to improve their knowledge of HIV transmission and their understanding of the relationship between STDs and HIV.

  6. Photonic crystal-based optical biosensor: a brief investigation

    NASA Astrophysics Data System (ADS)

    Divya, J.; Selvendran, S.; Sivanantha Raja, A.

    2018-06-01

    In this paper, a two-dimensional photonic crystal biosensor for medical applications, based on two waveguides and a nanocavity, was explored with different shoulder-coupled nanocavity structures, with which the most important biosensor parameters, such as the sensitivity and quality factor, can be significantly improved. By injecting an analyte into a sensing hole, the refractive index of the hole is changed; this refractive-index biosensor senses the change and shifts its operating wavelength accordingly. The transmission characteristics of light in the biosensor under different refractive indices, corresponding to changes in analyte concentration, are analyzed by the finite-difference time-domain method, and the band gap of each structure is designed and observed by the plane wave expansion method. The proposed structures are designed for an analyte refractive index range of about 1–1.5 in an optical wavelength range of 1.250–1.640 µm. Accordingly, an improved sensitivity of 136.6 nm RIU⁻¹ and a quality factor as high as 3915 are achieved. An important feature of this structure is its very small dimensions. Such a combination of attributes makes the designed structure a promising element for label-free biosensing applications.

  7. Approach to method development and validation in capillary electrophoresis for enantiomeric purity testing of active basic pharmaceutical ingredients.

    PubMed

    Sokoliess, Torsten; Köller, Gerhard

    2005-06-01

    A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.

  8. Method of sections in analytical calculations of pneumatic tires

    NASA Astrophysics Data System (ADS)

    Tarasov, V. N.; Boyarkina, I. V.

    2018-01-01

    Analytical calculations in pneumatic tire theory are preferable to experimental methods. The method of sections applied to a pneumatic tire shell makes it possible to obtain equations for the intensities of the internal forces in carcass elements and bead rings. Analytical dependencies for the intensity of the distributed forces have been obtained at tire equator points, on the side walls (poles), and at the bead rings. For the first time, cylindrical surfaces are used as secant surfaces alongside secant planes. The tire capacity equation has been obtained using the method of sections, in which the contact body is cut off from the tire carcass along the contact perimeter by a surface normal to the bearing surface. It has been established that the Laplace equation for this class of pneumatic tire problems contains two unknowns, which requires additional equations to be generated. The developed computational schemes of pneumatic tire sections and the new equations accelerate improvement of the pneumatic tire structure during design.
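
    The Laplace equation referred to here is, in membrane shell theory, the relation between the two in-plane force intensities, the principal radii of curvature, and the inflation pressure. In standard notation (the symbols below are the conventional ones, not necessarily those used in the paper), it reads

        \frac{N_1}{R_1} + \frac{N_2}{R_2} = p,

    where N_1 and N_2 are the meridional and circumferential force intensities per unit length, R_1 and R_2 are the principal radii of curvature, and p is the inflation pressure. Being a single scalar equation in the two unknowns N_1 and N_2, it must be supplemented by an additional relation, which the sectioning procedure supplies.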

  9. Propeller flow visualization techniques

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.

    1982-01-01

    Propeller flow visualization techniques were tested. The actual operating blade shape, which determines the actual propeller performance and noise, was established. The ability to photographically determine advanced propeller blade tip deflections and local flow-field conditions, and to gain insight into aeroelastic instability, is demonstrated. The analytical prediction methods being developed can be compared with these experimental data; such comparisons contribute to the verification of the improved methods and give improved capability for designing future advanced propellers with enhanced performance and noise characteristics.

  10. On improvement of the series convergence in the problem of the vibrations of an orthotropic rectangular prism

    NASA Astrophysics Data System (ADS)

    Lyashko, A. D.

    2017-11-01

    A new analytical representation of the solution for steady-state oscillations of an orthotropic rectangular prism is found. The corresponding infinite system of linear algebraic equations has been deduced by the superposition method. A countable set of precise eigenfrequencies and elementary eigenforms is found. Identities are derived that make it possible to improve the convergence of all the infinite series in the solution of the problem, and all the infinite series in the representation of the solution are summed analytically. Numerical calculations of stresses have been performed for a rectangular orthotropic prism with a load that is uniform along the border and harmonic in time applied on two opposite faces.

  11. Customization of UWB 3D-RTLS Based on the New Uncertainty Model of the AoA Ranging Technique

    PubMed Central

    Jachimczyk, Bartosz; Dziak, Damian; Kulesza, Wlodek J.

    2017-01-01

    The increased potential and effectiveness of Real-time Locating Systems (RTLSs) substantially influence their application spectrum. They are widely used, inter alia, in the industrial sector, healthcare, home care, and in logistic and security applications. This research aims to develop an analytical method to customize UWB-based RTLSs in order to improve their localization performance in terms of accuracy and precision. The analytical uncertainty model of Angle of Arrival (AoA) localization in a 3D indoor space, which is the foundation of the customization concept, is established in a working environment, and a suitable angular-based 3D localization algorithm is introduced. The paper investigates the influence of the proposed correction vector on localization accuracy, and the impact of the system’s configuration and the location sensors’ (LS) relative deployment on the localization precision distribution map. The advantages of the method are verified by comparison with a reference commercial RTLS localization engine. The results of simulations and physical experiments prove the value of the proposed customization method. The research confirms that the analytical uncertainty model is a valid representation of RTLS localization uncertainty in terms of accuracy and precision and can be useful for performance improvement. It shows that AoA localization in a 3D indoor space using the simple angular-based localization algorithm and correction vector improves localization accuracy and precision to the point that the system challenges the reference hardware’s advanced localization engine. Moreover, the research guides the deployment of location sensors to enhance localization precision. PMID:28125056
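
    The least-squares core of angular (AoA) 3D localization can be sketched as follows: each sensor contributes a bearing ray, and the position estimate minimizes the summed squared perpendicular distances to those rays (sensor layout, angles, and the plain least-squares solver are illustrative assumptions; the paper's algorithm additionally applies a correction vector that is not reproduced here).

        import numpy as np

        def direction(azimuth, elevation):
            """Unit bearing vector from azimuth/elevation angles (radians)."""
            return np.array([np.cos(elevation) * np.cos(azimuth),
                             np.cos(elevation) * np.sin(azimuth),
                             np.sin(elevation)])

        def locate(sensors, azimuths, elevations):
            """Least-squares intersection of the bearing rays from each location sensor."""
            A, b = np.zeros((3, 3)), np.zeros(3)
            for s, az, el in zip(sensors, azimuths, elevations):
                d = direction(az, el)
                P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
                A += P
                b += P @ s
            return np.linalg.solve(A, b)

        sensors = np.array([[0, 0, 3.0], [6, 0, 3.0], [0, 6, 3.0], [6, 6, 3.0]])
        target = np.array([2.5, 3.0, 1.2])
        az = [np.arctan2(*(target - s)[[1, 0]]) for s in sensors]
        el = [np.arcsin((target - s)[2] / np.linalg.norm(target - s)) for s in sensors]
        print(locate(sensors, az, el))   # recovers the target up to numerical error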

  12. Customization of UWB 3D-RTLS Based on the New Uncertainty Model of the AoA Ranging Technique.

    PubMed

    Jachimczyk, Bartosz; Dziak, Damian; Kulesza, Wlodek J

    2017-01-25

    The increased potential and effectiveness of Real-time Locating Systems (RTLSs) substantially influence their application spectrum. They are widely used, inter alia, in the industrial sector, healthcare, home care, and in logistic and security applications. This research aims to develop an analytical method to customize UWB-based RTLSs in order to improve their localization performance in terms of accuracy and precision. The analytical uncertainty model of Angle of Arrival (AoA) localization in a 3D indoor space, which is the foundation of the customization concept, is established in a working environment, and a suitable angular-based 3D localization algorithm is introduced. The paper investigates the influence of the proposed correction vector on localization accuracy, and the impact of the system's configuration and the location sensors' (LS) relative deployment on the localization precision distribution map. The advantages of the method are verified by comparison with a reference commercial RTLS localization engine. The results of simulations and physical experiments prove the value of the proposed customization method. The research confirms that the analytical uncertainty model is a valid representation of RTLS localization uncertainty in terms of accuracy and precision and can be useful for performance improvement. It shows that AoA localization in a 3D indoor space using the simple angular-based localization algorithm and correction vector improves localization accuracy and precision to the point that the system challenges the reference hardware's advanced localization engine. Moreover, the research guides the deployment of location sensors to enhance localization precision.

  13. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa.

    PubMed

    Govender, Kerusha; Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care.

  14. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    PubMed

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to different gas chromatography (GC) conditions, to other steps involved in the method, and to soil properties. In addition, there are differences in the interpretation of the GC results, which affect the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate an analytical method for faster quantitative analysis of TPH in contaminated soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the greatest impact on the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a difference of 2.78% in analysis time. This research reduced the method's standard analytical time from 20 min to 8 min, with all the carbon fractions eluting, and the method was successfully applied to fast TPH analysis of Bunker C oil-contaminated soil. The reduced analytical time offers many benefits, including improved laboratory reporting times and improved overall clean-up efficiency. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
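
    The screening-then-optimization sequence can be illustrated with the layout of a two-factor central composite design for the two retained factors; the coded design points are standard, but the physical low/high settings below are invented placeholders rather than the paper's values.

        import numpy as np

        def central_composite(levels, alpha=np.sqrt(2), centre_points=3):
            """Coded two-factor central composite design mapped to physical units.
            levels = {factor: (low, high)} gives the -1/+1 settings of each factor."""
            factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
            axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
            centre = [(0, 0)] * centre_points
            coded = np.array(factorial + axial + centre, float)
            names = list(levels)
            mids = np.array([(lo + hi) / 2 for lo, hi in levels.values()])
            half = np.array([(hi - lo) / 2 for lo, hi in levels.values()])
            return names, coded * half + mids

        # Placeholder ranges for carrier gas flow rate and oven ramp.
        names, runs = central_composite({"flow_mL_min": (1.0, 3.0), "ramp_C_min": (10.0, 30.0)})
        for run in runs:
            print(dict(zip(names, np.round(run, 2))))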

  15. Preliminary numerical analysis of improved gas chromatograph model

    NASA Technical Reports Server (NTRS)

    Woodrow, P. T.

    1973-01-01

    A mathematical model for the gas chromatograph was developed which incorporates the heretofore neglected transport mechanisms of intraparticle diffusion and rates of adsorption. Because a closed-form analytical solution to the model does not appear realizable, techniques for the numerical solution of the model equations are being investigated. Criteria were developed for using a finite terminal boundary condition in place of an infinite boundary condition used in analytical solution techniques. The class of weighted residual methods known as orthogonal collocation is presently being investigated and appears promising.

  16. Solutions of the Dirac Equation with the Shifted DENG-FAN Potential Including Yukawa-Like Tensor Interaction

    NASA Astrophysics Data System (ADS)

    Yahya, W. A.; Falaye, B. J.; Oluwadare, O. J.; Oyewumi, K. J.

    2013-08-01

    By using the Nikiforov-Uvarov method, we give approximate analytical solutions of the Dirac equation with the shifted Deng-Fan potential including a Yukawa-like tensor interaction under the spin and pseudospin symmetry conditions. After applying an improved approximation scheme, we solved the resulting Schrödinger-like equation analytically. Numerical results for the energy eigenvalues are also obtained; as expected, the tensor interaction removes the degeneracies between spin and pseudospin doublets.
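
    In this line of work the "improved approximation scheme" usually refers to a Greene-Aldrich-type replacement of the centrifugal term by an exponential expression valid for short-range potentials; one common improved form (quoted as an illustration, since the abstract does not spell out the exact scheme used) is

        \frac{1}{r^{2}} \approx \alpha^{2}\left(\frac{1}{12} + \frac{e^{-\alpha r}}{\left(1 - e^{-\alpha r}\right)^{2}}\right),

    where \alpha is the screening parameter; the constant 1/12 corrects the leading error of the plain approximation for small \alpha r, and with this substitution the radial equation takes an exactly solvable Schrödinger-like form.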

  17. Simple functionalization method for single conical pores with a polydopamine layer

    NASA Astrophysics Data System (ADS)

    Horiguchi, Yukichi; Goda, Tatsuro; Miyahara, Yuji

    2018-04-01

    Resistive pulse sensing (RPS) is an interesting analytical system in which micro- to nanosized pores are used to evaluate particles or small analytes. Recently, molecular immobilization techniques to improve the performance of RPS have been reported. The problem in functionalization for RPS is that molecular immobilization by chemical reaction is restricted by the pore material type. Herein, a simple functionalization is performed using mussel-inspired polydopamine as an intermediate layer to connect the pore material with functional molecules.

  18. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  19. Improving a complex finite-difference ground water flow model through the use of an analytic element screening model

    USGS Publications Warehouse

    Hunt, R.J.; Anderson, M.P.; Kelson, V.A.

    1998-01-01

    This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.

  20. Calculation of Sensitivity Derivatives in an MDAO Framework

    NASA Technical Reports Server (NTRS)

    Moore, Kenneth T.

    2012-01-01

    During gradient-based optimization of a system, it is necessary to generate the derivatives of each objective and constraint with respect to each design parameter. If the system is multidisciplinary, it may consist of a set of smaller "components" with some arbitrary data interconnection and process workflow. Analytical derivatives in these components can be used to improve the speed and accuracy of the derivative calculation over a purely numerical calculation; however, a multidisciplinary system may include both components for which derivatives are available and components for which they are not. Three methods to calculate the sensitivity of a mixed multidisciplinary system are presented: the finite difference method, where the derivatives are calculated numerically; the chain rule method, where the derivatives are successively cascaded along the system's network graph; and the analytic method, where the derivatives come from the solution of a linear system of equations. Some improvements to these methods, to accommodate mixed multidisciplinary systems, are also presented; in particular, a new method is introduced to allow existing derivatives to be used inside of finite difference. All three methods are implemented and demonstrated in the open-source MDAO framework OpenMDAO. It was found that each has advantages depending on the system being solved.
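
    A toy example of the mixed situation, assuming two hypothetical single-input components (one with an analytic derivative, one treated as a black box), shows how a chain-rule cascade combines analytic and finite-difference partials and how it compares with differencing the whole system; none of this reproduces OpenMDAO's actual API.

        import numpy as np

        def comp_a(x):            # y = x**2, analytic derivative available
            return x ** 2

        def d_comp_a(x):
            return 2.0 * x

        def comp_b(y):            # z = sin(y), treated as a black box
            return np.sin(y)

        def fd(f, u, h=1e-6):     # forward finite difference for black-box components
            return (f(u + h) - f(u)) / h

        # Chain rule across the workflow x -> y -> z: dz/dx = dz/dy * dy/dx,
        # mixing the analytic derivative of comp_a with a numerical one for comp_b.
        x0 = 1.3
        y0 = comp_a(x0)
        dz_dx_mixed = fd(comp_b, y0) * d_comp_a(x0)

        # Pure finite difference over the whole system, for comparison.
        dz_dx_fd = fd(lambda x: comp_b(comp_a(x)), x0)
        print(dz_dx_mixed, dz_dx_fd, np.cos(x0 ** 2) * 2 * x0)   # exact value last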

  1. An Improved Method for Determination of Cyanide Content in Bitter Almond Oil.

    PubMed

    Chen, Jia; Liu, Lei; Li, Mengjun; Yu, Xiuzhu; Zhang, Rui

    2018-01-01

    An improved colorimetric method for the determination of cyanide content in bitter almond oil was developed. The optimal determination parameters were as follows: volume ratio of hydrochloric acid to bitter almond oil (v/v), 1.5:1; holding time for hydrolysis, 120 min; and volume ratio of distillation solution to bitter almond oil (v/v), 8:1. Analytical results showed that the relative standard deviations (RSDs) of the determinations were less than 10%, which satisfies the test requirements. The results of the colorimetric method and high-performance liquid chromatography measurements exhibited a significant correlation (R = 0.9888, SD = 0.2015). Therefore, the improved colorimetric method can be used to determine the cyanide content of bitter almond oil.

  2. Fast analytical scatter estimation using graphics processing units.

    PubMed

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions of 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds per projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.
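
    One plausible form of the scaled root-mean-square difference metric mentioned above is sketched here (the exact scaling used by the authors is not given in the abstract, so the normalization by the mean of the reference and the toy images are assumptions).

        import numpy as np

        def scaled_rmsd(estimate, reference):
            """RMS difference between two scatter images, scaled by the reference mean."""
            diff = np.asarray(estimate, float) - np.asarray(reference, float)
            return np.sqrt(np.mean(diff ** 2)) / np.mean(reference)

        # Toy comparison: an "analytical" scatter map against a noisier "Monte Carlo" one.
        mc = np.abs(np.random.normal(100.0, 10.0, size=(256, 256)))
        analytical = mc + np.random.normal(0.0, 2.0, size=mc.shape)
        print(round(scaled_rmsd(analytical, mc), 4))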

  3. An improved method of chemical analysis for low levels of nitrogen in forest streams or in rainwater.

    Treesearch

    Elly E. Holcombe; Duane G. Moore; Richard L. Fredriksen

    1986-01-01

    A modification of the macro-Kjeldahl method that provides increased sensitivity was developed for determining very low levels of nitrogen in forest streams and in rainwater. The method is suitable as a routine laboratory procedure. Analytical range of the method is 0.02 to 1.5 mg/L with high recovery and excellent precision and accuracy. The range can be increased to...

  4. Simplex-stochastic collocation method with improved scalability

    NASA Astrophysics Data System (ADS)

    Edeling, W. N.; Dwight, R. P.; Cinnella, P.

    2016-04-01

    The Simplex-Stochastic Collocation (SSC) method is a robust tool used to propagate uncertain input distributions through a computer code. However, it becomes prohibitively expensive for problems with dimensions higher than five. The main purpose of this paper is to identify the bottlenecks and to improve upon this poor scalability. To do so, we propose an alternative interpolation stencil technique based upon the Set-Covering problem, and we integrate the SSC method into the High-Dimensional Model-Reduction framework. In addition, we address the issue of ill-conditioned sample matrices, and we present an analytical map to facilitate uniformly distributed simplex sampling.

  5. Application of Analytic Hierarchy Process (AHP) in the analysis of the fuel efficiency in the automobile industry with the utilization of Natural Fiber Polymer Composites (NFPC)

    NASA Astrophysics Data System (ADS)

    Jayamani, E.; Perera, D. S.; Soon, K. H.; Bakri, M. K. B.

    2017-04-01

    A systematic method of material analysis aimed at improving fuel efficiency through the use of natural fiber reinforced polymer matrix composites in the automobile industry is proposed. Multi-factor decision criteria based on the Analytical Hierarchy Process (AHP) were used and executed in MATLAB to achieve improved fuel efficiency through weight reduction of vehicular components, by effectively comparing two engine hood designs. The reduction was simulated by utilizing natural fiber polymer composites with thermoplastic polypropylene (PP) as the matrix polymer and benchmarked against a synthetic-fiber-based composite component. Results showed that PP with 35% flax fiber loading achieved a 0.4% improvement in fuel efficiency, the highest among the 27 candidate fibers.
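
    The AHP weighting step itself reduces to extracting the principal eigenvector of a pairwise-comparison matrix and checking its consistency; the sketch below uses an invented 3 x 3 comparison matrix (the study's actual criteria, matrix entries, and MATLAB implementation are not reproduced).

        import numpy as np

        # Hypothetical pairwise-comparison matrix for three selection criteria
        # (e.g. weight saving, cost, stiffness), on Saaty's 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # Criteria weights = normalized principal eigenvector of the matrix.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency ratio (CR < 0.1 is the usual threshold); RI = 0.58 for n = 3.
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        print(np.round(w, 3), round(ci / 0.58, 3))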

  6. Improvement of sound insulation performance of double-glazed windows by using viscoelastic connectors

    NASA Astrophysics Data System (ADS)

    Takahashi, D.; Sawaki, S.; Mu, R.-L.

    2016-06-01

    A new method for improving the sound insulation performance of double-glazed windows is proposed. This technique uses viscoelastic materials as connectors between the two glass panels to ensure that the appropriate spacing is maintained. An analytical model is developed that makes it possible to assess the effects of the spacing, contact area, and viscoelastic properties of the connectors on the sound insulation performance. The validity of the model is verified by comparing its results with measured data. Numerical experiments using this analytical model show the importance of the connectors' ability to maintain the appropriate spacing and of their viscoelastic properties, both of which are necessary for improving the sound insulation performance. In addition, it was shown that the most effective factor is damping: the stronger the damping, the more the insulation performance increases.

  7. An improved clenbuterol detection by immunochromatographic assay with bacteria@Au composite as signal amplifier.

    PubMed

    Huang, Qiong; Bu, Tong; Zhang, Wentao; Yan, Lingzhi; Zhang, Mengyue; Yang, Qingfeng; Huang, Lunjie; Yang, Baowei; Hu, Na; Suo, Yourui; Wang, Jianlong; Zhang, Daohong

    2018-10-01

    Immunochromatographic assays (ICAs) are most frequently used for on-site rapid screening of clenbuterol. To improve sensitivity, a novel probe using bacteria as signal carriers was developed. Bacteria can load a large number of gold nanoparticles (AuNPs) on their surface, so far fewer antibodies are needed to produce clearly visible results, while the low antibody concentration also triggers fierce competition between the free analyte and the immobilized antigen. Thus, a limited number of antibodies was key to significantly improved sensitivity. Analytical conditions, including the bacterial species, coupling method, and concentration, were optimized. The visual detection limit (VDL) for clenbuterol was 0.1 ng/mL, a 20-fold improvement in sensitivity compared with traditional strips. This work opens up a new route for signal amplification and improved performance of ICAs. Furthermore, inactivated bacteria could also serve as environment-friendly and robust signal carriers for other biosensors. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described to enumerate and characterize the physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  9. Evolution of microbiological analytical methods for dairy industry needs.

    PubMed

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show promising analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards.

  10. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  11. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting the most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and the pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  12. The effect of porosity on the mechanical properties of porous titanium scaffolds: comparative study on experimental and analytical values

    NASA Astrophysics Data System (ADS)

    Khodaei, Mohammad; Fathi, Mohammadhossein; Meratian, Mahmood; Savabi, Omid

    2018-05-01

    Reducing the elastic modulus and also improving biological fixation to the bone is possible by using porous scaffolds. In the present study, porous titanium scaffolds containing different porosities were fabricated using the space holder method. The pore distribution, formed phases, and mechanical properties of the titanium scaffolds were studied by scanning electron microscopy (SEM), X-ray diffraction (XRD), and cold compression testing. The results of the compression tests were then compared to the Gibson-Ashby model. Both the experimentally measured and the analytically calculated elastic moduli of the porous titanium scaffolds decreased with increasing porosity. The agreement between the experimentally measured and analytically calculated elastic moduli of the titanium scaffolds also improved with increasing porosity.
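
    The Gibson-Ashby relation referenced above can be evaluated directly. The sketch below is illustrative only: it assumes a solid-titanium modulus of about 110 GPa and a geometric constant C of 1, which are common textbook values rather than figures taken from the paper.

    ```python
    # Illustrative sketch of the Gibson-Ashby estimate for an open-cell porous scaffold:
    # E = C * E_solid * (relative density)^2, with relative density = 1 - porosity.
    # e_solid_gpa and c are assumed values, not taken from the study above.
    def gibson_ashby_modulus(porosity, e_solid_gpa=110.0, c=1.0):
        relative_density = 1.0 - porosity
        return c * e_solid_gpa * relative_density ** 2

    for p in (0.3, 0.5, 0.7):
        print(f"porosity {p:.0%}: E ≈ {gibson_ashby_modulus(p):.1f} GPa")
    ```

    As the porosity rises, the predicted modulus falls quadratically, which is the trend the abstract reports for both the measured and the calculated values.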

  13. Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.

    PubMed

    Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel

    2015-01-01

    Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state herein a strategy for internal quality control management. These recommendations will be regularly updated.

  14. Glycidyl fatty acid esters in food by LC-MS/MS: method development.

    PubMed

    Becalski, A; Feng, S Y; Lau, B P-Y; Zhao, T

    2012-07-01

    An improved method based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) for the analysis of glycidyl fatty acid esters in oils was developed. The method incorporates stable isotope dilution analysis (SIDA) for quantifying the five target analytes: glycidyl esters of palmitic (C16:0), stearic (C18:0), oleic (C18:1), linoleic (C18:2) and linolenic acid (C18:3). For the analysis, a 10 mg sample of edible oil or fat is dissolved in acetone, spiked with deuterium-labelled analogues of the glycidyl esters and purified by two-step chromatography on C18 and normal silica solid phase extraction (SPE) cartridges using methanol and 5% ethyl acetate in hexane, respectively. If the concentration of analytes is expected to be below 0.5 mg/kg, a 0.5 g sample of oil is first pre-concentrated using a silica column. The dried final extract is re-dissolved in 250 μL of a mixture of methanol/isopropanol (1:1, v/v), 15 μL is injected onto the analytical C18 LC column and the analytes are eluted with 100% methanol. Detection of the target glycidyl fatty acid esters is accomplished by LC-MS/MS using positive-ion atmospheric pressure chemical ionization operating in multiple reaction monitoring mode, monitoring two ion transitions for each analyte. The method was tested on replicates of a virgin olive oil which was free of glycidyl esters. The method detection limit was calculated to be in the range of 70-150 μg/kg for each analyte using a 10 mg sample and 1-3 μg/kg using a 0.5 g sample of oil. Average recoveries of the 5 glycidyl esters spiked at 10, 1 and 0.1 mg/kg were in the range 84% to 108%. The major advantage of our method is the use of SIDA for all analytes with commercially available internal standards, and detection limits that are lower by a factor of 5-10 than published methods when a 0.5 g sample of oil is used. Additionally, MS/MS mass chromatograms offer greater specificity than liquid chromatography-mass spectrometry operated in selected ion monitoring mode. The method will be applied to a survey of glycidyl fatty acid esters in food products on the Canadian market.
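
    Stable isotope dilution quantification of the kind described above reduces, at its simplest, to scaling the analyte/internal-standard peak-area ratio by the spiked amount of the labelled analogue. The sketch below is a minimal illustration under an assumed single-point response factor; the function name, numbers, and units are hypothetical and not taken from the paper.

    ```python
    # Minimal SIDA sketch: concentration from the analyte / deuterated-internal-standard
    # peak-area ratio. response_factor corrects for any response difference between the
    # analyte and its labelled analogue (assumed 1.0 here).
    def sida_concentration_mg_per_kg(area_analyte, area_istd, istd_amount_ng,
                                     sample_mass_mg, response_factor=1.0):
        analyte_ng = (area_analyte / area_istd) * istd_amount_ng / response_factor
        return analyte_ng / sample_mass_mg   # ng per mg is numerically equal to mg/kg

    # e.g. a 10 mg oil sample spiked with 5 ng of a deuterated glycidyl ester analogue:
    print(f"{sida_concentration_mg_per_kg(12000, 15000, 5.0, 10.0):.2f} mg/kg")
    ```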

  15. A note on improved F-expansion method combined with Riccati equation applied to nonlinear evolution equations.

    PubMed

    Islam, Md Shafiqul; Khan, Kamruzzaman; Akbar, M Ali; Mastroberardino, Antonio

    2014-10-01

    The purpose of this article is to present an analytical method, namely the improved F-expansion method combined with the Riccati equation, for finding exact solutions of nonlinear evolution equations. The present method is capable of calculating all branches of solutions simultaneously, even if multiple solutions are very close and thus difficult to distinguish with numerical techniques. To verify the computational efficiency, we consider the modified Benjamin-Bona-Mahony equation and the modified Korteweg-de Vries equation. Our results reveal that the method is a very effective and straightforward way of formulating the exact travelling wave solutions of nonlinear wave equations arising in mathematical physics and engineering.
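
    For readers unfamiliar with the technique, one common way of setting up an F-expansion ansatz with a Riccati auxiliary equation is sketched below; the exact form used in the paper may differ, so this is only an orienting example.

    ```latex
    % Travelling-wave ansatz with a Riccati auxiliary equation (illustrative form).
    \begin{aligned}
    &u(x,t) = u(\xi), \qquad \xi = x - Vt,\\
    &u(\xi) = a_0 + \sum_{i=1}^{N}\bigl[a_i\,F(\xi)^{\,i} + b_i\,F(\xi)^{-i}\bigr],\\
    &F'(\xi) = p + q\,F(\xi)^{2}, \qquad
      F(\xi) = \sqrt{\tfrac{p}{q}}\tan\!\bigl(\sqrt{pq}\,\xi\bigr)\ (pq>0), \qquad
      F(\xi) = \sqrt{-\tfrac{p}{q}}\tanh\!\bigl(\sqrt{-pq}\,\xi\bigr)\ (pq<0).
    \end{aligned}
    ```

    The integer N is fixed by balancing the highest-order derivative against the strongest nonlinear term, and substituting the ansatz turns the evolution equation into an algebraic system for the coefficients a_i, b_i and the wave speed V.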

  16. A note on improved F-expansion method combined with Riccati equation applied to nonlinear evolution equations

    PubMed Central

    Islam, Md. Shafiqul; Khan, Kamruzzaman; Akbar, M. Ali; Mastroberardino, Antonio

    2014-01-01

    The purpose of this article is to present an analytical method, namely the improved F-expansion method combined with the Riccati equation, for finding exact solutions of nonlinear evolution equations. The present method is capable of calculating all branches of solutions simultaneously, even if multiple solutions are very close and thus difficult to distinguish with numerical techniques. To verify the computational efficiency, we consider the modified Benjamin–Bona–Mahony equation and the modified Korteweg-de Vries equation. Our results reveal that the method is a very effective and straightforward way of formulating the exact travelling wave solutions of nonlinear wave equations arising in mathematical physics and engineering. PMID:26064530

  17. An improved method to measure nitrate/nitrite with an NO-selective electrochemical sensor

    PubMed Central

    Boo, Yong Chool; Tressel, Sarah L.; Jo, Hanjoong

    2007-01-01

    Nitric oxide produced from nitric oxide synthase(s) is an important cell signaling molecule in physiology and pathophysiology. In the present study, we describe a very sensitive and convenient analytical method to measure NOx (nitrite plus nitrate) in culture media by employing an ultra-sensitive nitric oxide-selective electrochemical sensor which became commercially available recently. An aliquot of conditioned culture media was first treated with nitrate reductase/NADPH/glucose-6-phosphate dehydrogenase/glucose-6-phosphate to convert nitrate to nitrite quantitatively. The nitrite (that is present originally plus the reduced nitrate) was then reduced to equimolar NO in an acidic iodide bath while NO was being detected by the sensor. This analytical method appears to be very useful to assess basal and stimulated NO release from cultured cells. PMID:17056288
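
    Because the assay measures nitrite directly and total NOx after enzymatic reduction, nitrate follows by simple difference once the sensor signal has been converted through a calibration curve. The arithmetic below is only a sketch; the slope, signals, and units are hypothetical.

    ```python
    # Nitrate-by-difference sketch: one aliquot gives pre-existing nitrite, a
    # reductase-treated aliquot gives total NOx (nitrite + nitrate).
    def nitrate_by_difference_uM(nox_signal, nitrite_signal, slope_per_uM, intercept=0.0):
        nox_uM = (nox_signal - intercept) / slope_per_uM
        nitrite_uM = (nitrite_signal - intercept) / slope_per_uM
        return nox_uM - nitrite_uM

    print(f"nitrate ≈ {nitrate_by_difference_uM(850.0, 300.0, slope_per_uM=42.0):.1f} µM")
    ```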

  18. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2010-04-01

    The annual update of the list of prohibited substances and doping methods as issued by the World Anti-Doping Agency (WADA) allows the implementation of most recent considerations of performance manipulation and emerging therapeutics into human sports doping control programmes. The annual banned-substance review for human doping controls critically summarizes recent innovations in analytical approaches that support the efforts of convicting cheating athletes by improved or newly established methods that focus on known as well as newly outlawed substances and doping methods. In the current review, literature published between October 2008 and September 2009 reporting on new and/or enhanced procedures and techniques for doping analysis, as well as aspects relevant to the doping control arena, was considered to complement the 2009 annual banned-substance review.

  19. Second derivatives for approximate spin projection methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Lee M.; Hratchian, Hrant P., E-mail: hhratchian@ucmerced.edu

    2015-02-07

    The use of broken-symmetry electronic structure methods is required in order to obtain correct behavior of electronically strained open-shell systems, such as transition states, biradicals, and transition metals. This approach often has issues with spin contamination, which can lead to significant errors in predicted energies, geometries, and properties. Approximate projection schemes are able to correct for spin contamination and can often yield improved results. To fully make use of these methods and to carry out exploration of the potential energy surface, it is desirable to develop an efficient second energy derivative theory. In this paper, we formulate the analytical second derivatives for the Yamaguchi approximate projection scheme, building on recent work that has yielded an efficient implementation of the analytical first derivatives.
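
    For orientation, the Yamaguchi approximate projection referred to above is usually written in terms of the broken-symmetry (BS) and high-spin (HS) energies and their ⟨S²⟩ expectation values; the standard expressions are reproduced below. Sign conventions depend on the chosen Heisenberg Hamiltonian, and this is not the paper's derivative formulation itself.

    ```latex
    % Yamaguchi approximate spin projection (standard form).
    \begin{aligned}
    J &= \frac{E_{\mathrm{BS}} - E_{\mathrm{HS}}}
              {\langle S^{2}\rangle_{\mathrm{HS}} - \langle S^{2}\rangle_{\mathrm{BS}}},\\[4pt]
    E_{\mathrm{AP}} &= E_{\mathrm{BS}}
       + \frac{\langle S^{2}\rangle_{\mathrm{BS}}}
              {\langle S^{2}\rangle_{\mathrm{HS}} - \langle S^{2}\rangle_{\mathrm{BS}}}
         \,\bigl(E_{\mathrm{BS}} - E_{\mathrm{HS}}\bigr).
    \end{aligned}
    ```

    The second derivatives formulated in the paper are, in essence, analytical Hessians of E_AP, which involve derivatives of both the energies and the ⟨S²⟩ values with respect to nuclear displacements.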

  20. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  1. Development of a Standardized Approach for Assessing Potential Risks to Amphibians Exposed to Sediment and Hydric Soils

    DTIC Science & Technology

    2004-05-01

    following digestion using method 3005A. Copper concentrations were verified using atomic absorption spectroscopy/graphite furnace. Each chamber...1995. Ammonia Variation in Sediments: Spatial, Temporal and Method-Related Effects. Environ. Toxicol. Chem. 14:1499-1506. Savage, W.K., F.W...Regulator Approved Methods and Protocols for Conducting Marine and Terrestrial Risk Assessments 1.III.01.k - Improved Field Analytical Sensors

  2. "Dip-and-read" paper-based analytical devices using distance-based detection with color screening.

    PubMed

    Yamada, Kentaro; Citterio, Daniel; Henry, Charles S

    2018-05-15

    An improved paper-based analytical device (PAD) using color screening to enhance device performance is described. Current detection methods for PADs relying on the distance-based signalling motif can be slow due to the assay time being limited by capillary flow rates that wick fluid through the detection zone. For traditional distance-based detection motifs, analysis can take up to 45 min for a channel length of 5 cm. By using a color screening method, quantification with a distance-based PAD can be achieved in minutes through a "dip-and-read" approach. A colorimetric indicator line deposited onto a paper substrate using inkjet-printing undergoes a concentration-dependent colorimetric response for a given analyte. This color intensity-based response has been converted to a distance-based signal by overlaying a color filter with a continuous color intensity gradient matching the color of the developed indicator line. As a proof-of-concept, Ni quantification in welding fume was performed as a model assay. The results of multiple independent user testing gave mean absolute percentage error and average relative standard deviations of 10.5% and 11.2% respectively, which were an improvement over analysis based on simple visual color comparison with a read guide (12.2%, 14.9%). In addition to the analytical performance comparison, an interference study and a shelf life investigation were performed to further demonstrate practical utility. The developed system demonstrates an alternative detection approach for distance-based PADs enabling fast (∼10 min), quantitative, and straightforward assays.

  3. An Orientation Measurement Method Based on Hall-effect Sensors for Permanent Magnet Spherical Actuators with 3D Magnet Array

    PubMed Central

    Yan, Liang; Zhu, Bo; Jiao, Zongxia; Chen, Chin-Yin; Chen, I-Ming

    2014-01-01

    An orientation measurement method based on Hall-effect sensors is proposed for permanent magnet (PM) spherical actuators with three-dimensional (3D) magnet array. As there is no contact between the measurement system and the rotor, this method could effectively avoid friction torque and additional inertial moment existing in conventional approaches. Curved surface fitting method based on exponential approximation is proposed to formulate the magnetic field distribution in 3D space. The comparison with conventional modeling method shows that it helps to improve the model accuracy. The Hall-effect sensors are distributed around the rotor with PM poles to detect the flux density at different points, and thus the rotor orientation can be computed from the measured results and analytical models. Experiments have been conducted on the developed research prototype of the spherical actuator to validate the accuracy of the analytical equations relating the rotor orientation and the value of magnetic flux density. The experimental results show that the proposed method can measure the rotor orientation precisely, and the measurement accuracy could be improved by the novel 3D magnet array. The study result could be used for real-time motion control of PM spherical actuators. PMID:25342000
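
    The "exponential approximation" used for the flux-density model can be illustrated with a one-dimensional log-linear least-squares fit; the real curved-surface fit in the paper is three-dimensional and more elaborate, and all data values below are invented.

    ```python
    # Toy fit of B(d) ≈ a * exp(-b * d) to flux-density samples via log-linear regression.
    import numpy as np

    d = np.array([4.0, 6.0, 8.0, 10.0, 12.0])     # sensor-to-magnet distance (mm), assumed
    B = np.array([0.52, 0.31, 0.19, 0.11, 0.07])  # measured flux density (T), assumed

    slope, intercept = np.polyfit(d, np.log(B), 1)  # ln B = ln a - b * d
    a, b = np.exp(intercept), -slope
    print(f"B(d) ≈ {a:.3f} * exp(-{b:.3f} * d)")
    ```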

  4. An integrative 'omics' solution to the detection of recombinant human erythropoietin and blood doping.

    PubMed

    Pitsiladis, Yannis P; Durussel, Jérôme; Rabin, Olivier

    2014-05-01

    Administration of recombinant human erythropoietin (rHumanEPO) improves sporting performance and hence is frequently subject to abuse by athletes, although rHumanEPO is prohibited by WADA. Approaches to detect rHumanEPO doping have improved significantly in recent years but remain imperfect. A new transcriptomic-based longitudinal screening approach is being developed that has the potential to improve the analytical performance of current detection methods. In particular, studies are being funded by WADA to identify a 'molecular signature' of rHumanEPO doping, and preliminary results are promising. In the first systematic study to be conducted, the expression of hundreds of genes was found to be altered by rHumanEPO, with numerous gene transcripts being differentially expressed after the first injection and further transcripts profoundly upregulated during and subsequently downregulated up to 4 weeks postadministration of the drug; the same transcriptomic pattern was observed in all participants. The identification of a blood 'molecular signature' of rHumanEPO administration is the strongest evidence to date that gene biomarkers have the potential to substantially improve the analytical performance of current antidoping methods such as the Athlete Biological Passport for rHumanEPO detection. Given the early promise of transcriptomics, research using an 'omics'-based approach involving genomics, transcriptomics, proteomics and metabolomics should be intensified in order to achieve improved detection of rHumanEPO and other doping substances and methods that are difficult to detect, such as recombinant human growth hormone and blood transfusions.

  5. Experimental and Analytical Determinations of Spiral Bevel Gear-Tooth Bending Stress Compared

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.

    2000-01-01

    Spiral bevel gears are currently used in all main-rotor drive systems for rotorcraft produced in the United States. Applications such as these need spiral bevel gears to turn the corner from the horizontal gas turbine engine to the vertical rotor shaft. These gears must typically operate at extremely high rotational speeds and carry high power levels. With these difficult operating conditions, an improved analytical capability is paramount to increasing aircraft safety and reliability. Also, literature on the analysis and testing of spiral bevel gears has been very sparse in comparison to that for parallel axis gears. This is due to the complex geometry of this type of gear and to the specialized test equipment necessary to test these components. To develop an analytical model of spiral bevel gears, researchers use differential geometry methods to model the manufacturing kinematics. A three-dimensional spiral bevel gear modeling method was developed that uses finite elements for the structural analysis. This method was used to analyze the three-dimensional contact pattern between the test pinion and gear used in the Spiral Bevel Gear Test Facility at the NASA Glenn Research Center at Lewis Field. Results of this analysis are illustrated in the preceding figure. The development of the analytical method was a joint endeavor between NASA Glenn, the U.S. Army Research Laboratory, and the University of North Dakota.

  6. A novel optimization algorithm for MIMO Hammerstein model identification under heavy-tailed noise.

    PubMed

    Jin, Qibing; Wang, Hehe; Su, Qixin; Jiang, Beiyan; Liu, Qie

    2018-01-01

    In this paper, we study the system identification of multi-input multi-output (MIMO) Hammerstein processes under the typical heavy-tailed noise. To the best of our knowledge, there is no general analytical method to solve this identification problem. Motivated by this, we propose a general identification method to solve this problem based on a Gaussian-Mixture Distribution intelligent optimization algorithm (GMDA). The nonlinear part of Hammerstein process is modeled by a Radial Basis Function (RBF) neural network, and the identification problem is converted to an optimization problem. To overcome the drawbacks of analytical identification method in the presence of heavy-tailed noise, a meta-heuristic optimization algorithm, Cuckoo search (CS) algorithm is used. To improve its performance for this identification problem, the Gaussian-mixture Distribution (GMD) and the GMD sequences are introduced to improve the performance of the standard CS algorithm. Numerical simulations for different MIMO Hammerstein models are carried out, and the simulation results verify the effectiveness of the proposed GMDA. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  7. HOST turbine heat transfer program summary

    NASA Technical Reports Server (NTRS)

    Gladden, Herbert J.; Simoneau, Robert J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding with the remainder going to analytical efforts. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  8. Improving the analyte ion signal in matrix-assisted laser desorption/ionization imaging mass spectrometry via electrospray deposition by enhancing incorporation of the analyte in the matrix.

    PubMed

    Malys, Brian J; Owens, Kevin G

    2017-05-15

    Matrix-assisted laser desorption/ionization (MALDI) is widely used as the ionization method in high-resolution chemical imaging studies that seek to visualize the distribution of analytes within sectioned biological tissues. This work extends the use of electrospray deposition (ESD) to apply matrix with an additional solvent spray to incorporate and homogenize analyte within the matrix overlayer. Analytes and matrix are sequentially and independently applied by ESD to create a sample from which spectra are collected, mimicking a MALDI imaging mass spectrometry (IMS) experiment. Subsequently, an incorporation spray consisting of methanol is applied by ESD to the sample and another set of spectra are collected. The spectra prior to and after the incorporation spray are compared to evaluate the improvement in the analyte signal. Prior to the incorporation spray, samples prepared using α-cyano-4-hydroxycinnamic acid (CHCA) and 2,5-dihydroxybenzoic acid (DHB) as the matrix showed low signal while the sample using sinapinic acid (SA) initially exhibited good signal. Following the incorporation spray, the sample using SA did not show an increase in signal; the sample using DHB showed moderate gain factors of 2-5 (full ablation spectra) and 12-336 (raster spectra), while CHCA samples saw large increases in signal, with gain factors of 14-172 (full ablation spectra) and 148-1139 (raster spectra). The use of an incorporation spray to apply solvent by ESD to a matrix layer already deposited by ESD provides an increase in signal by both promoting incorporation of the analyte within and homogenizing the distribution of the incorporated analyte throughout the matrix layer. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Lack of correlation between reaction speed and analytical sensitivity in isothermal amplification reveals the value of digital methods for optimization: validation using digital real-time RT-LAMP.

    PubMed

    Khorosheva, Eugenia M; Karymov, Mikhail A; Selck, David A; Ismagilov, Rustem F

    2016-01-29

    In this paper, we asked if it is possible to identify the best primers and reaction conditions based on improvements in reaction speed when optimizing isothermal reactions. We used digital single-molecule, real-time analyses of both speed and efficiency of isothermal amplification reactions, which revealed that improvements in the speed of isothermal amplification reactions did not always correlate with improvements in digital efficiency (the fraction of molecules that amplify) or with analytical sensitivity. However, we observed that the speeds of amplification for single-molecule (in a digital device) and multi-molecule (e.g. in a PCR well plate) formats always correlated for the same conditions. Also, digital efficiency correlated with the analytical sensitivity of the same reaction performed in a multi-molecule format. Our finding was supported experimentally with examples of primer design, the use or exclusion of loop primers in different combinations, and the use of different enzyme mixtures in one-step reverse-transcription loop-mediated amplification (RT-LAMP). Our results show that measuring the digital efficiency of amplification of single-template molecules allows quick, reliable comparisons of the analytical sensitivity of reactions under any two tested conditions, independent of the speeds of the isothermal amplification reactions. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
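
    The notion of digital efficiency used above can be made concrete with a little Poisson bookkeeping: the fraction of positive partitions, corrected for the chance of multiple templates landing in one partition, estimates how many of the loaded molecules actually amplified. The numbers in the sketch are hypothetical.

    ```python
    # Digital-efficiency sketch: positive-partition fraction -> Poisson-corrected
    # estimate of amplifying molecules -> fraction of the expected input that amplified.
    import math

    def digital_efficiency(positive, total_partitions, expected_molecules):
        p = positive / total_partitions
        lam = -math.log(1.0 - p)              # mean amplifying templates per partition
        amplified = lam * total_partitions    # estimated number of amplifying molecules
        return amplified / expected_molecules

    # e.g. 300 of 1000 partitions positive when ~500 template molecules were loaded:
    print(f"digital efficiency ≈ {digital_efficiency(300, 1000, 500):.2f}")
    ```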

  10. Accurate determination of arsenic in arsenobetaine standard solutions of BCR-626 and NMIJ CRM 7901-a by neutron activation analysis coupled with internal standard method.

    PubMed

    Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki

    2010-09-15

    Neutron activation analysis (NAA) coupled with an internal standard method was applied for the determination of As in the certified reference material (CRM) of arsenobetaine (AB) standard solutions to verify their certified values. Gold was used as an internal standard to compensate for the difference in neutron exposure within an irradiation capsule and to improve the sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 μg of As. The analytical reliability of the proposed method was evaluated by k₀-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499 ± 55) mg kg⁻¹ (k = 2) and (10.16 ± 0.15) mg kg⁻¹ (k = 2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, although that of NMIJ CRM 7901-a was almost negligible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
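
    The role of the gold internal standard can be pictured as a simple ratio correction: because the analyte and the spike experience the same neutron flux, the As/Au count-rate ratio, rather than the absolute count rate, carries the analytical information. The sketch below is purely illustrative; the calibration factor and count rates are invented.

    ```python
    # Internal-standard normalization sketch for NAA (all numbers are assumptions).
    def arsenic_mass_ug(count_rate_as, count_rate_au, au_mass_ug, calibration_factor):
        # The ratio cancels position-dependent neutron-flux differences in the capsule.
        return (count_rate_as / count_rate_au) * au_mass_ug * calibration_factor

    print(f"{arsenic_mass_ug(5200.0, 4100.0, 1.0, 0.65):.2f} µg As")
    ```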

  11. A decision support system using analytical hierarchy process (AHP) for the optimal environmental reclamation of an open-pit mine

    NASA Astrophysics Data System (ADS)

    Bascetin, A.

    2007-04-01

    The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives in order to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region of Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. It is also found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
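
    The core AHP calculation behind such a model is the derivation of a priority vector from a pairwise comparison matrix, typically via its principal eigenvector, together with a consistency check. The criteria and judgments in the sketch below are made up; only the mechanics follow the standard method.

    ```python
    # Illustrative AHP priority calculation for three hypothetical reclamation criteria.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],    # pairwise comparison matrix (Saaty scale, assumed)
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                      # priority vector
    ci = (eigvals.real[k] - len(A)) / (len(A) - 1)    # consistency index
    print("priorities:", np.round(w, 3), " CI =", round(ci, 3))
    ```

    In a group setting, each specialist's judgments can be aggregated (for example by the geometric mean of the comparison values) before the priorities are computed.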

  12. Green approach using monolithic column for simultaneous determination of coformulated drugs.

    PubMed

    Yehia, Ali M; Mohamed, Heba M

    2016-06-01

    Green chemistry and sustainability are now embraced across the majority of pharmaceutical companies and research labs. Researchers' attention is increasingly drawn toward implementing the principles of green analytical chemistry for more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of an analytical procedure; by using safer solvents, the greenness profile of the methodology can be improved remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using a monolithic column and green solvents as the mobile phase. The use of a monolithic column allows efficient separation at higher flow rates, which results in a short analysis time. A two-factor, three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using the eco-scale as a green metric, and the method was found to be an excellent green method with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and the amount of waste produced. The proposed method improved the environmental impact without compromising the analytical performance criteria and could be used as a safer alternative for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Analytical method for measuring cosmogenic 35S in natural waters

    DOE PAGES

    Uriostegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.; ...

    2015-05-18

    Here, cosmogenic sulfur-35 in water as dissolved sulfate (³⁵SO₄) has successfully been used as an intrinsic hydrologic tracer in low-SO₄, high-elevation basins. Its application in environmental waters containing high SO₄ concentrations has been limited because only small amounts of SO₄ can be analyzed using current liquid scintillation counting (LSC) techniques. We present a new analytical method for analyzing large amounts of BaSO₄ for ³⁵S. We quantify efficiency gains when suspending BaSO₄ precipitate in Insta-Gel Plus cocktail, purify the BaSO₄ precipitate to remove dissolved organic matter, mitigate interference from radium-226 and its daughter products by selection of high-purity barium chloride, and optimize LSC counting parameters for ³⁵S determination in larger masses of BaSO₄. Using this improved procedure, we achieved counting efficiencies that are comparable to published LSC techniques despite a 10-fold increase in the SO₄ sample load. ³⁵SO₄ was successfully measured in high-SO₄ surface waters and groundwaters containing low ratios of ³⁵S activity to SO₄ mass, demonstrating that this new analytical method expands the analytical range of ³⁵SO₄ and broadens the utility of ³⁵SO₄ as an intrinsic tracer in hydrologic settings.

  14. Potential role of gold nanoparticles for improved analytical methods: an introduction to characterizations and applications.

    PubMed

    Wu, Chung-Shu; Liu, Fu-Ken; Ko, Fu-Hsiang

    2011-01-01

    Nanoparticle-based materials are a revolutionary scientific and engineering venture that will invariably impact existing analytical separation and preconcentration for a variety of analytes. Nanoparticles can be regarded as a hybrid between small molecules and bulk material. A material on the nanoscale exhibits considerable changes in various properties, making them size- and shape-dependent. Gold nanoparticles (Au NPs), one of the wide variety of core materials available, coupled with tunable surface properties in the form of inorganic or inorganic-organic hybrids, have been reported as an excellent platform for a broad range of analytical methods. This review aims to introduce the basic principles, examples, and descriptions of methods for the characterization of Au NPs using chromatography, electrophoresis, and self-assembly strategies for separation science. Some of the latest important applications of Au NPs as stationary phases in open-tubular capillary electrochromatography, gas chromatography, and liquid chromatography, as well as their roles as run-buffer additives to enhance separation and preconcentration in chromatographic, electrophoretic, and chip-based systems, are reviewed. Additionally, we review Au NPs-assisted state-of-the-art techniques involving the use of micellar electrokinetic chromatography, an online diode array detector, solid-phase extraction, and mass spectrometry for the preconcentration of some chemical compounds and biomolecules.

  15. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  16. A Newton method for the magnetohydrodynamic equilibrium equations

    NASA Astrophysics Data System (ADS)

    Oliver, Hilary James

    We have developed and implemented a (J, B) space Newton method to solve the full nonlinear three dimensional magnetohydrodynamic equilibrium equations in toroidal geometry. Various cases have been run successfully, demonstrating significant improvement over Picard iteration, including a 3D stellarator equilibrium at β = 2%. The algorithm first solves the equilibrium force balance equation for the current density J, given a guess for the magnetic field B. This step is taken from the Picard-iterative PIES 3D equilibrium code. Next, we apply Newton's method to Ampere's Law by expansion of the functional J(B), which is defined by the first step. An analytic calculation in magnetic coordinates, of how the Pfirsch-Schlüter currents vary in the plasma in response to a small change in the magnetic field, yields the Newton gradient term (analogous to ∇f · δx in Newton's method for f(x) = 0). The algorithm is computationally feasible because we do this analytically, and because the gradient term is flux surface local when expressed in terms of a vector potential in an A_r = 0 gauge. The equations are discretized by a hybrid spectral/offset grid finite difference technique, and leading order radial dependence is factored from Fourier coefficients to improve finite-difference accuracy near the polar-like origin. After calculating the Newton gradient term we transfer the equation from the magnetic grid to a fixed background grid, which greatly improves the code's performance.
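
    The abstract contrasts Newton iteration with Picard iteration; the generic Newton step for a nonlinear system f(x) = 0 is sketched below on a dense-Jacobian toy problem. This is only an illustration of the iteration itself, not of the (J, B)-space formulation or the analytic gradient term described above.

    ```python
    # Generic Newton iteration: solve J(x) dx = -f(x), update x, repeat until converged.
    import numpy as np

    def newton(f, jac, x0, tol=1e-10, max_iter=50):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            dx = np.linalg.solve(jac(x), -f(x))
            x += dx
            if np.linalg.norm(dx) < tol:
                break
        return x

    # Toy system with solution (1, 2): x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0
    f = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
    J = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
    print(newton(f, J, [1.0, 1.0]))
    ```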

  17. Performance specifications for the extra-analytical phases of laboratory testing: Why and how.

    PubMed

    Plebani, Mario

    2017-07-01

    An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet in the literature no data are available on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also provide guidance for promoting improvement processes and guaranteeing quality care to patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. At-line process analytical technology (PAT) for more efficient scale up of biopharmaceutical microfiltration unit operations.

    PubMed

    Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I

    2016-01-01

    Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.

  19. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    PubMed

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of laboratories to the main causes of analytical errors and, for educational purposes, suggests practical solutions to avoid them. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.

  20. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  1. An integrated approach of AHP and DEMATEL methods in evaluating the criteria of auto spare parts industry

    NASA Astrophysics Data System (ADS)

    Wu, Hsin-Hung; Tsai, Ya-Ning

    2012-11-01

    This study uses both analytic hierarchy process (AHP) and decision-making trial and evaluation laboratory (DEMATEL) methods to evaluate the criteria of the auto spare parts industry in Taiwan. Traditionally, AHP does not consider indirect effects for each criterion and assumes that criteria are independent, without further addressing the interdependence between or among the criteria. Thus, the importance computed by AHP can be viewed as a short-term improvement opportunity. By contrast, the DEMATEL method not only evaluates the importance of criteria but also depicts the causal relations among them. By observing the causal diagrams, improvements based on cause-oriented criteria may enhance performance effectively and efficiently from a long-term perspective. As a result, the major advantage of integrating the AHP and DEMATEL methods is that the decision maker can continuously improve suppliers' performance from both short-term and long-term viewpoints.
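
    The DEMATEL step that separates cause-oriented from effect-oriented criteria boils down to normalizing the expert-rated direct-influence matrix and computing the total-relation matrix. The 3x3 judgments below are invented; only the algebra follows the method.

    ```python
    # Illustrative DEMATEL computation on a made-up 3x3 direct-influence matrix.
    import numpy as np

    D = np.array([[0, 3, 2],
                  [1, 0, 3],
                  [2, 1, 0]], dtype=float)

    N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())   # normalized direct matrix
    T = N @ np.linalg.inv(np.eye(3) - N)                     # total relation matrix

    r, c = T.sum(axis=1), T.sum(axis=0)
    print("prominence (r+c):", np.round(r + c, 3))
    print("relation   (r-c):", np.round(r - c, 3))  # positive values flag cause criteria
    ```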

  2. Multiple reaction monitoring with multistage fragmentation (MRM3) detection enhances selectivity for LC-MS/MS analysis of plasma free metanephrines.

    PubMed

    Wright, Michael J; Thomas, Rebecca L; Stanford, Phoebe E; Horvath, Andrea R

    2015-03-01

    LC-MS/MS with multiple reaction monitoring (MRM) is a powerful tool for quantifying target analytes in complex matrices. However, the technique lacks selectivity when plasma free metanephrines are measured. We propose the use of multistage fragmentation (MRM(3)) to improve the analytical selectivity of plasma free metanephrine measurement. Metanephrines were extracted from plasma with weak cation exchange solid-phase extraction before separation by hydrophilic interaction liquid chromatography. We quantified normetanephrine and metanephrine by either MRM or the MRM(3) transitions m/z 166→134→79 and m/z 180→149→121, respectively. Over a 6-month period, approximately 1% (n = 21) of patient samples showed uncharacterized coeluting substances that interfered with the routine assay, resulting in an inability to report results. Quantification with MRM(3) removed these interferences and enabled measurement of the target compounds. For patient samples unaffected by interferences, Deming regression analysis demonstrated a correlation between the MRM(3) and MRM methods of y = 1.00x - 0.00 nmol/L for normetanephrine and y = 0.99x + 0.03 nmol/L for metanephrine. Between the MRM(3) method and the median of all LC-MS/MS laboratories enrolled in a quality assurance program, the correlations were y = 0.97x + 0.03 nmol/L for normetanephrine and y = 1.03x - 0.04 nmol/L for metanephrine. Imprecision for the MRM(3) method was 6.2%-7.0% for normetanephrine and 6.1%-9.9% for metanephrine (n = 10). The lower limits of quantification for the MRM(3) method were 0.20 nmol/L for normetanephrine and 0.16 nmol/L for metanephrine. The use of MRM(3) technology improves the analytical selectivity of plasma free metanephrine quantification by LC-MS/MS while maintaining adequate analytical sensitivity and acceptable imprecision. © 2014 American Association for Clinical Chemistry.
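
    The method comparison reported above relies on Deming regression, which, unlike ordinary least squares, allows measurement error in both methods. A minimal version with the error-variance ratio assumed equal to 1 is sketched below on hypothetical paired results.

    ```python
    # Deming regression sketch (error-variance ratio delta assumed to be 1).
    import numpy as np

    def deming(x, y, delta=1.0):
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
        sxy = np.cov(x, y, ddof=1)[0, 1]
        slope = ((syy - delta * sxx) + np.sqrt((syy - delta * sxx) ** 2
                 + 4 * delta * sxy ** 2)) / (2 * sxy)
        return slope, y.mean() - slope * x.mean()

    nmn_mrm  = [0.4, 0.7, 1.1, 1.8, 2.5]       # hypothetical normetanephrine, MRM (nmol/L)
    nmn_mrm3 = [0.41, 0.69, 1.12, 1.79, 2.52]  # hypothetical paired MRM(3) results
    print("slope, intercept:", deming(nmn_mrm, nmn_mrm3))
    ```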

  3. Towards European urinalysis guidelines. Introduction of a project under European Confederation of Laboratory Medicine.

    PubMed

    Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G

    2000-07-01

    Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.

  4. Long-term variability in sugarcane bagasse feedstock compositional methods: Sources and magnitude of analytical variability

    DOE PAGES

    Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie; ...

    2016-10-18

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of training and good analytical procedures needed to generate this data. As a result, when combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.

  5. Long-term variability in sugarcane bagasse feedstock compositional methods: Sources and magnitude of analytical variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of training and good analytical procedures needed to generate this data. As a result, when combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.

  6. Determination of a risk management primer at petroleum-contaminated sites: developing new human health risk assessment strategy.

    PubMed

    Park, In-Sun; Park, Jae-Woo

    2011-01-30

    Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (e.g., aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22 and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. The stepwise ultrasonication-based analytical process was established to measure TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies in TPH, aliphatic, and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL-contamination. Copyright © 2010 Elsevier B.V. All rights reserved.

  7. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization

    PubMed Central

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution. PMID:28045981

  8. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization.

    PubMed

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael; Ambur, Ole Herman

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution.

  9. Laminar flow control perforated wing panel development

    NASA Technical Reports Server (NTRS)

    Fischler, J. E.

    1986-01-01

    Many structural concepts for a wing leading edge laminar flow control hybrid panel were analytically investigated. After many small, medium, and large tests, the selected design was verified. New analytic methods were developed for porous titanium sheet bonded to a substructure of fiberglass and carbon/epoxy cloth. At -65 and +160 F test conditions, the critical bond of the porous titanium to the composite failed at lower than anticipated test loads. New cure cycles, design improvements, and test improvements significantly improved the strength and reduced the deflections from thermal and lateral loadings. The wave tolerance limits for turbulence were not exceeded. Beam-column midbay deflections arising from combined axial and lateral loadings and thermal bowing at -65 F, room temperature, and +160 F were also considered. Many lap shear tests were performed for several cure cycles. Results indicate that sufficient verification was obtained to fabricate a demonstration vehicle.

  10. Analytical study of exact solutions of the nonlinear Korteweg-de Vries equation with space-time fractional derivatives

    NASA Astrophysics Data System (ADS)

    Liu, Jiangen; Zhang, Yufeng

    2018-01-01

    This paper gives an analytical study of the dynamic behavior of exact solutions of the nonlinear Korteweg-de Vries equation with space-time local fractional derivatives. By using the improved (G′/G)-expansion method, explicit traveling wave solutions, including periodic solutions, dark soliton solutions, soliton solutions and soliton-like solutions, are obtained for the first time. These solutions help further the understanding of the physical phenomena involved and provide a basis for subsequent study. Some solutions are also illustrated with 3D graphs.
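
    For context, a general sketch of the classical (G′/G)-expansion ansatz is given below in LaTeX; the paper's improved, space-time fractional variant modifies the travelling-wave reduction, and those details are not reproduced here.

      u(\xi) = a_0 + \sum_{i=1}^{m} a_i \left( \frac{G'}{G} \right)^{i},
      \qquad G''(\xi) + \lambda\, G'(\xi) + \mu\, G(\xi) = 0,

    where the balance number m is fixed by matching the highest-order derivative against the strongest nonlinear term, and the resulting solutions are hyperbolic (soliton-type), trigonometric (periodic), or rational according to the sign of \lambda^{2} - 4\mu.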

  11. Big–deep–smart data in imaging for guiding materials design

    DOE PAGES

    Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.

    2015-09-23

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  12. Inversion of the anomalous diffraction approximation for variable complex index of refraction near unity. [numerical tests for water-haze aerosol model

    NASA Technical Reports Server (NTRS)

    Smith, C. B.

    1982-01-01

    The Fymat analytic inversion method for retrieving a particle-area distribution function from anomalous diffraction multispectral extinction data and total area is generalized to the case of a variable complex refractive index m(lambda) near unity depending on spectral wavelength lambda. Inversion tests are presented for a water-haze aerosol model. An upper phase-shift limit of 5 pi/2 retrieved an accurate peak area distribution profile. Analytical corrections using both the total number and total area improved the inversion.

  13. Big-deep-smart data in imaging for guiding materials design.

    PubMed

    Kalinin, Sergei V; Sumpter, Bobby G; Archibald, Richard K

    2015-10-01

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  14. Big-deep-smart data in imaging for guiding materials design

    NASA Astrophysics Data System (ADS)

    Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.

    2015-10-01

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  15. Big–deep–smart data in imaging for guiding materials design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  16. Improving Conceptions in Analytical Chemistry: The Central Limit Theorem

    ERIC Educational Resources Information Center

    Rodriguez-Lopez, Margarita; Carrasquillo, Arnaldo, Jr.

    2006-01-01

    This article describes the central limit theorem (CLT) and its relation to analytical chemistry. The pedagogic rational, which argues for teaching the CLT in the analytical chemistry classroom, is discussed. Some analytical chemistry concepts that could be improved through an understanding of the CLT are also described. (Contains 2 figures.)
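
    To illustrate the behavior the CLT predicts for replicate analytical measurements, a short Python simulation follows; the exponential "single measurement" population is an assumed, purely illustrative choice.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Skewed population of single measurements (exponential), illustrative only.
      population = rng.exponential(scale=1.0, size=100_000)

      for n in (1, 5, 30):
          # Empirical distribution of the mean of n replicate measurements:
          # the spread shrinks as 1/sqrt(n) and the skewness approaches zero.
          means = rng.choice(population, size=(10_000, n)).mean(axis=1)
          print(f"n={n:2d}: mean={means.mean():.3f}  sd={means.std(ddof=1):.3f}  skew={stats.skew(means):.2f}")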

  17. Multiplexed biosensors for detection of mycotoxins

    USDA-ARS?s Scientific Manuscript database

    As analytical methods have improved it has become apparent that mycotoxins exist in many forms within a commodity or food. For the established toxins there has been increased interest in the presence of metabolites that might also harbor toxicity. These include biosynthetic precursors as well as pro...

  18. Verification and application of the Iosipescu shear test method

    NASA Technical Reports Server (NTRS)

    Walrath, D. E.; Adams, D. F.

    1984-01-01

    Finite element models were used to study the effects of notch angle variations on the stress state within an Iosipescu shear test specimen. These analytical results were also studied to determine the feasibility of using strain gage rosettes and a modified extensometer to measure shear strains in this test specimen. Analytical results indicate that notch angle variations produced only small differences in simulated shear properties. Both strain gage rosettes and the modified extensometer were shown to be feasible shear strain transducers for the test method. The Iosipescu shear test fixture was redesigned to incorporate several improvements. These improvements include accommodation of a 50 percent larger specimen for easier measurement of shear strain, a clamping mechanism to relax strict tolerances on specimen width, and a self-contained alignment tool for use during specimen installation. A set of in-plane and interlaminar shear properties was measured for three graphite fabric/epoxy composites made from T300/934 material. The three weave patterns were Oxford, 5-harness satin, and 8-harness satin.

  19. Meta-analytic framework for sparse K-means to identify disease subtypes in multiple transcriptomic studies

    PubMed Central

    Huo, Zhiguang; Ding, Ying; Liu, Silvia; Oesterreich, Steffi; Tseng, George

    2016-01-01

    Disease phenotyping by omics data has become a popular approach that potentially can lead to better personalized treatment. Identifying disease subtypes via unsupervised machine learning is the first step towards this goal. In this paper, we extend a sparse K-means method towards a meta-analytic framework to identify novel disease subtypes when expression profiles of multiple cohorts are available. The lasso regularization and meta-analysis identify a unique set of gene features for subtype characterization. An additional pattern matching reward function guarantees consistent subtype signatures across studies. The method was evaluated by simulations and leukemia and breast cancer data sets. The identified disease subtypes from meta-analysis were characterized with improved accuracy and stability compared to single study analysis. The breast cancer model was applied to an independent METABRIC dataset and generated improved survival difference between subtypes. These results provide a basis for diagnosis and development of targeted treatments for disease subgroups. PMID:27330233
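
    A simplified, single-study sketch of the sparse K-means idea that this framework builds on is given below in Python; it follows the Witten-Tibshirani style of alternating weighted K-means with soft-thresholding of per-feature between-cluster sums of squares, and it omits the meta-analytic pattern-matching reward described above. The toy data and all tuning values are illustrative assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      def sparse_kmeans(X, k, s, n_iter=10, seed=0):
          """Sparse K-means sketch: s (>= 1) bounds the l1 norm of the unit-l2 feature weights."""
          n, p = X.shape
          w = np.full(p, 1.0 / np.sqrt(p))                  # start with equal feature weights
          for _ in range(n_iter):
              # K-means on features scaled by sqrt(w) minimizes the weighted within-cluster SS.
              labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X * np.sqrt(w)).labels_
              # Per-feature between-cluster sum of squares = total SS - within-cluster SS.
              tss = ((X - X.mean(axis=0)) ** 2).sum(axis=0)
              wss = np.zeros(p)
              for c in range(k):
                  Xc = X[labels == c]
                  wss += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
              bcss = np.maximum(tss - wss, 0.0)
              # Binary search for the soft-threshold giving an l1 norm of about s (lasso-type).
              lo, hi = 0.0, bcss.max()
              for _ in range(50):
                  delta = 0.5 * (lo + hi)
                  w_try = np.maximum(bcss - delta, 0.0)
                  if w_try.sum() > 0 and (w_try / np.linalg.norm(w_try)).sum() > s:
                      lo = delta
                  else:
                      hi = delta
              w_new = np.maximum(bcss - lo, 0.0)
              w = w_new / np.linalg.norm(w_new)
          return labels, w

      # Toy example: 3 informative features out of 20.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 20))
      X[:40, :3] += 2.5
      X[40:80, :3] -= 2.5
      labels, w = sparse_kmeans(X, k=3, s=2.0)
      print("selected features:", np.where(w > 1e-6)[0])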

  20. Utilization management affects health care practices at Walter Reed Army Medical Center: analytical methods applied to decrease length of stay and assign appropriate level of care.

    PubMed

    Phillips, J S; Hamm, C K; Pierce, J R; Kussman, M J

    1999-12-01

    The Department of Defense has embraced utilization management (UM) as an important tool to control and possibly decrease medical costs. Budgetary withholds have been taken by the Office of the Assistant Secretary of Defense (Health Affairs) to encourage the military services to implement UM programs. In response, Walter Reed Army Medical Center implemented a UM program along with other initiatives to effect changes in the delivery of inpatient care. This paper describes this UM program and other organizational initiatives, such as the introduction of new levels of care in an attempt to effect reductions in length of stay and unnecessary admissions. We demonstrate the use of a diversity of databases and analytical methods to quantify improved utilization and management of resources. The initiatives described significantly reduced hospital length of stay and inappropriate inpatient days. Without solid command and clinical leadership support and empowerment of the professional staffs, these significant changes and improvements could not have occurred.

  1. An improved dispersive solid-phase extraction clean-up method for the gas chromatography-negative chemical ionisation tandem mass spectrometric determination of multiclass pesticide residues in edible oils.

    PubMed

    Deme, Pragney; Azmeera, Tirupathi; Prabhavathi Devi, B L A; Jonnalagadda, Padmaja R; Prasad, R B N; Vijaya Sarathi, U V R

    2014-01-01

    An improved sample preparation using dispersive solid-phase extraction clean-up was proposed for the trace level determination of 35 multiclass pesticide residues (organochlorine, organophosphorus and synthetic pyrethroids) in edible oils. Quantification of the analytes was carried out by gas chromatography-mass spectrometry in negative chemical ionisation mode (GC-NCI-MS/MS). The limit of detection and limit of quantification of residues were in the range of 0.01-1 ng/g and 0.05-2 ng/g, respectively. The analytes showed recoveries between 62% and 110%, and the matrix effect was observed to be less than 25% for most of the pesticides. Crude edible oil samples showed endosulfan isomers, p,p'-DDD, α-cypermethrin, chlorpyrifos, and diazinon residues in the range of 0.56-2.14 ng/g. However, no pesticide residues in the detection range of the method were observed in refined oils. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Meta-analytic framework for sparse K-means to identify disease subtypes in multiple transcriptomic studies.

    PubMed

    Huo, Zhiguang; Ding, Ying; Liu, Silvia; Oesterreich, Steffi; Tseng, George

    Disease phenotyping by omics data has become a popular approach that potentially can lead to better personalized treatment. Identifying disease subtypes via unsupervised machine learning is the first step towards this goal. In this paper, we extend a sparse K-means method towards a meta-analytic framework to identify novel disease subtypes when expression profiles of multiple cohorts are available. The lasso regularization and meta-analysis identify a unique set of gene features for subtype characterization. An additional pattern matching reward function guarantees consistent subtype signatures across studies. The method was evaluated by simulations and leukemia and breast cancer data sets. The identified disease subtypes from meta-analysis were characterized with improved accuracy and stability compared to single study analysis. The breast cancer model was applied to an independent METABRIC dataset and generated improved survival difference between subtypes. These results provide a basis for diagnosis and development of targeted treatments for disease subgroups.

  3. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  4. Similarity solution of the Boussinesq equation

    NASA Astrophysics Data System (ADS)

    Lockington, D. A.; Parlange, J.-Y.; Parlange, M. B.; Selker, J.

    Similarity transforms of the Boussinesq equation in a semi-infinite medium are available when the boundary conditions are a power of time. The Boussinesq equation is reduced from a partial differential equation to a boundary-value problem. Chen et al. [Trans Porous Media 1995;18:15-36] use a hodograph method to derive an integral equation formulation of the new differential equation which they solve by numerical iteration. In the present paper, the convergence of their scheme is improved such that numerical iteration can be avoided for all practical purposes. However, a simpler analytical approach is also presented which is based on Shampine's transformation of the boundary value problem to an initial value problem. This analytical approximation is remarkably simple and yet more accurate than the analytical hodograph approximations.

  5. Study on application of aerospace technology to improve surgical implants

    NASA Technical Reports Server (NTRS)

    Johnson, R. E.; Youngblood, J. L.

    1982-01-01

    The areas where aerospace technology could be used to improve the reliability and performance of metallic orthopedic implants were assessed. Specifically, the material controls, design approaches, analytical methods, and inspection approaches used in the implant industry were compared with those used for aerospace hardware. Several areas for possible improvement were noted, such as increased use of finite element stress analysis and fracture control programs on devices where the needs exist for maximum reliability and high structural performance.

  6. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    PubMed

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used to determine if the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery as well as some additional testing would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.
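
    The quoted quantitation ranges follow directly from the spiked masses and the 60 L sampled air volume; a quick unit-conversion check in Python:

      # Equivalent airborne concentration for a mass collected on the OVS tube,
      # given the sampled air volume (unit conversion only).
      def air_conc_mg_m3(mass_ug, volume_L):
          return (mass_ug / 1000.0) / (volume_L / 1000.0)   # ug -> mg, L -> m3

      print(air_conc_mg_m3(6.0, 60.0))     # 6 ug on the tube, 60 L sampled -> 0.1 mg/m3
      print(air_conc_mg_m3(0.06, 60.0))    # 0.06 ug -> 0.001 mg/m3 (low end of the LC/MS range)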

  7. Surface engineered nanoparticles for improved surface enhanced Raman scattering applications and method for preparing same

    DOEpatents

    Simmons, Blake A [San Francisco, CA]; Talin, Albert Alec [Livermore, CA]

    2009-11-27

    A method for producing metal nanoparticles that, when associated with an analyte material, will generate an amplified SERS spectrum when the analyte material is illuminated by a light source and a spectrum is recorded. The method for preparing the metal nanoparticles comprises the steps of (i) forming a water-in-oil microemulsion comprising a bulk oil phase, a dilute water phase, and one or more surfactants, wherein the water phase comprises a transition metal ion; (ii) adding an aqueous solution comprising a mild reducing agent to the water-in-oil microemulsion; (iii) stirring the water-in-oil microemulsion and aqueous solution to initiate a reduction reaction resulting in the formation of a fine precipitate dispersed in the water-in-oil microemulsion; and (iv) separating the precipitate from the water-in-oil microemulsion.

  8. Validation of Rapid Radiochemical Method for Californium ...

    EPA Pesticide Factsheets

    Technical Brief: In the event of a radiological/nuclear contamination event, the response community would need tools and methodologies to rapidly assess the nature and the extent of contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.

  9. A microfluidic paper-based analytical device for the assay of albumin-corrected fructosamine values from whole blood samples.

    PubMed

    Boonyasit, Yuwadee; Laiwattanapaisal, Wanida

    2015-01-01

    A method for acquiring albumin-corrected fructosamine values from whole blood using a microfluidic paper-based analytical system that offers substantial improvement over previous methods is proposed. The time required to quantify both serum albumin and fructosamine is shortened to 10 min with detection limits of 0.50 g dl(-1) and 0.58 mM, respectively (S/N = 3). The proposed system also exhibited good within-run and run-to-run reproducibility. The results of the interference study revealed that the acceptable recoveries ranged from 95.1 to 106.2%. The system was compared with currently used large-scale methods (n = 15), and the results demonstrated good agreement among the techniques. The microfluidic paper-based system has the potential to continuously monitor glycemic levels in low resource settings.

  10. Reaction wheel low-speed compensation using a dither signal

    NASA Astrophysics Data System (ADS)

    Stetson, John B., Jr.

    1993-08-01

    A method for improving low-speed reaction wheel performance on a three-axis controlled spacecraft is presented. The method combines a constant amplitude offset with an unbiased, oscillating dither to harmonically linearize rolling solid friction dynamics. The complete, nonlinear rolling solid friction dynamics, using an analytic modification to the experimentally verified Dahl solid friction model, were analyzed using the dual-input describing function method to assess the benefits of dither compensation. The modified analytic solid friction model was experimentally verified with a small dc servomotor-actuated reaction wheel assembly. With dither compensation, abrupt static friction disturbances are eliminated and nearly linear behavior through zero rate can be achieved. Simulated vehicle response to a wheel rate reversal shows that when the dither and offset compensation is used, elastic modes are not significantly excited and the uncompensated attitude error is reduced by a factor of 34.
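
    A minimal Python sketch of the compensation signal described above: a constant torque offset plus a zero-mean sinusoidal dither added to the commanded wheel torque. The amplitudes and dither frequency are placeholder assumptions, not values from the paper.

      import numpy as np

      def dither_command(t, torque_cmd, offset=0.002, dither_amp=0.001, dither_hz=5.0):
          """Wheel torque command (N*m) with a constant offset and an unbiased dither."""
          return torque_cmd + offset + dither_amp * np.sin(2.0 * np.pi * dither_hz * t)

      t = np.linspace(0.0, 2.0, 1000)
      cmd = dither_command(t, torque_cmd=0.0)
      # The dither averages to zero, so the mean added torque is simply the offset.
      print(f"mean added torque = {cmd.mean():.4f} N*m, peak = {cmd.max():.4f} N*m")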

  11. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both a Markov chain model and closed formulas. The Markov chain model and its applications in healthcare are introduced, and the approach used to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181

  12. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method has low accuracy when evaluating the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. A new weight assignment model is first established, combining compatibility matrix analysis from the analytic hierarchy process (AHP) with the entropy value method: once the compatibility matrix analysis meets the consistency requirement, any difference between the subjective and objective weights is reconciled by moderately adjusting their proportions, and a fuzzy evaluation matrix is then constructed on this basis for the performance evaluation. Simulation experiments show that, compared with the traditional entropy value and compatibility matrix analysis methods, the proposed performance evaluation model of mining projects based on the improved entropy value method has higher assessment accuracy.
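
    For reference, a small Python sketch of the classical entropy value method for objective weights, blended with subjective AHP weights as the abstract suggests; the decision matrix, the AHP weights, and the blending ratio are all illustrative assumptions, not taken from the paper.

      import numpy as np

      # Hypothetical decision matrix: rows = mining projects, columns = benefit-type
      # indicators (all entries assumed strictly positive).
      X = np.array([[0.82, 120.0, 3.1],
                    [0.74, 150.0, 2.6],
                    [0.91,  95.0, 3.8],
                    [0.65, 180.0, 2.2]])

      # 1. Normalize each column so the entries sum to 1.
      P = X / X.sum(axis=0)

      # 2. Entropy of each indicator.
      m = X.shape[0]
      k = 1.0 / np.log(m)
      E = -k * (P * np.log(P)).sum(axis=0)

      # 3. Objective (entropy) weights: more dispersion -> lower entropy -> larger weight.
      w_entropy = (1 - E) / (1 - E).sum()

      # 4. Blend with assumed subjective AHP weights; alpha is a free design choice.
      w_ahp = np.array([0.5, 0.3, 0.2])
      alpha = 0.5
      w = alpha * w_ahp + (1 - alpha) * w_entropy
      print(np.round(w_entropy, 3), np.round(w / w.sum(), 3))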

  13. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  14. New robust bilinear least squares method for the analysis of spectral-pH matrix data.

    PubMed

    Goicoechea, Héctor C; Olivieri, Alejandro C

    2005-07-01

    A new second-order multivariate method has been developed for the analysis of spectral-pH matrix data, based on a bilinear least-squares (BLLS) model achieving the second-order advantage and handling multiple calibration standards. A simulated Monte Carlo study of synthetic absorbance-pH data allowed comparison of the newly proposed BLLS methodology with constrained parallel factor analysis (PARAFAC) and with the combination multivariate curve resolution-alternating least-squares (MCR-ALS) technique under different conditions of sample-to-sample pH mismatch and analyte-background ratio. The results indicate an improved prediction ability for the new method. Experimental data generated by measuring absorption spectra of several calibration standards of ascorbic acid and samples of orange juice were subjected to second-order calibration analysis with PARAFAC, MCR-ALS, and the new BLLS method. The results indicate that the latter method provides the best analytical results in regard to analyte recovery in samples of complex composition requiring strict adherence to the second-order advantage. Linear dependencies appear when multivariate data are produced by using the pH or a reaction time as one of the data dimensions, posing a challenge to classical multivariate calibration models. The presently discussed algorithm is useful for these latter systems.

  15. A method for direct, semi-quantitative analysis of gas phase samples using gas chromatography-inductively coupled plasma-mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, Kimberly E; Gerdes, Kirk

    2013-07-01

    A new and complete GC–ICP-MS method is described for direct analysis of trace metals in a gas phase process stream. The proposed method is derived from standard analytical procedures developed for ICP-MS, which are regularly exercised in standard ICP-MS laboratories. In order to implement the method, a series of empirical factors were generated to calibrate detector response with respect to a known concentration of an internal standard analyte. Calibrated responses are ultimately used to determine the concentration of metal analytes in a gas stream using a semi-quantitative algorithm. The method was verified using a traditional gas injection from a GC sampling valve and a standard gas mixture containing either a 1 ppm Xe + Kr mix with helium balance or 100 ppm Xe with helium balance. Data collected for Xe and Kr gas analytes revealed that agreement of 6–20% with the actual concentration can be expected for various experimental conditions. To demonstrate the method using a relevant "unknown" gas mixture, experiments were performed for continuous 4 and 7 hour periods using a Hg-containing sample gas that was co-introduced into the GC sample loop with the xenon gas standard. System performance and detector response to the dilute concentration of the internal standard were pre-determined, which allowed semi-quantitative evaluation of the analyte. The calculated analyte concentrations varied during the course of the 4 hour experiment, particularly during the first hour of the analysis, where the actual Hg concentration was underpredicted by up to 72%. Calculated concentration improved to within 30–60% for data collected after the first hour of the experiment. Similar results were seen during the 7 hour test, with the deviation from the actual concentration being 11–81% during the first hour and then decreasing for the remaining period. The method detection limit (MDL) was determined for mercury by injecting the sample gas into the system following a period of equilibration. The MDL for Hg was calculated as 6.8 μg·m⁻³. This work describes the first complete GC–ICP-MS method to directly analyze gas phase samples, and detailed sample calculations and comparisons to conventional ICP-MS methods are provided.
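
    A minimal Python sketch of the kind of internal-standard, semi-quantitative evaluation described above, assuming a single pre-determined relative response factor between the analyte and the internal standard; the function name, counts, and response factor are illustrative, not the paper's calibration data.

      def semi_quant(analyte_counts, istd_counts, istd_conc, rrf):
          """Analyte concentration = (analyte signal / ISTD signal) * ISTD concentration / RRF."""
          return (analyte_counts / istd_counts) * istd_conc / rrf

      hg_conc = semi_quant(analyte_counts=4.2e4, istd_counts=9.8e5,
                           istd_conc=1.0,     # ppm Xe internal standard in the sample gas (assumed)
                           rrf=0.05)          # assumed Hg-vs-Xe relative response factor
      print(f"estimated Hg concentration ~ {hg_conc:.3f} ppm")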

  16. Remote Sensing of Soils for Environmental Assessment and Management.

    NASA Technical Reports Server (NTRS)

    DeGloria, Stephen D.; Irons, James R.; West, Larry T.

    2014-01-01

    The next generation of imaging systems integrated with complex analytical methods will revolutionize the way we inventory and manage soil resources across a wide range of scientific disciplines and application domains. This special issue highlights those systems and methods for the direct benefit of environmental professionals and students who employ imaging and geospatial information for improved understanding, management, and monitoring of soil resources.

  17. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results because of the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and pharmaceutical analytical technology (PAT).

  18. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results because of the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  19. Improved sample preparation and rapid UHPLC analysis of SO2 binding carbonyls in wine by derivatisation to 2,4-dinitrophenylhydrazine.

    PubMed

    Jackowetz, J N; Mira de Orduña, R

    2013-08-15

    Sulphur dioxide (SO2) is essential for the preservation of wines. The presence of SO2 binding compounds in musts and wines may limit sulphite efficacy leading to higher total SO2 additions, which may exceed SO2 limits permitted by law and pose health risks for sensitive individuals. An improved method for the quantification of significant wine SO2 binding compounds is presented that applies a novel sample treatment approach and rapid UHPLC separation. Glucose, galacturonic acid, alpha-ketoglutarate, pyruvate, acetoin and acetaldehyde were derivatised with 2,4-dinitrophenylhydrazine and separated using a solid core C18 phase by ultra high performance liquid chromatography. Addition of EDTA to samples prevented de novo acetaldehyde formation from ethanol oxidation. Optimised derivatisation duration enhanced reproducibility and allowed for glucose and galacturonic acid quantification. High glucose residues were found to interfere with the recovery of other SO2 binders, but practical SO2 concentrations and red wine pigments did not affect derivatisation efficiency. The calibration range, method accuracy, precision and limits of detection were found to be satisfactory for routine analysis of SO2 binders in wines. The current method represents a significant improvement in the comprehensive analysis of SO2 binding wine carbonyls. It allows for the quantification of major SO2 binders at practical analyte concentrations, and uses a simple sample treatment method that prevents treatment artifacts. Equipment utilisation could be reduced by rapid LC separation while maintaining analytical performance parameters. The improved method will be a valuable addition for the analysis of total SO2 binder pools in oenological samples. Published by Elsevier Ltd.

  20. The Use of Life Cycle Tools to Support Decision Making for Sustainable Nanotechnologies

    EPA Science Inventory

    Nanotechnology is a broad-impact technology with applications ranging from materials and electronics to analytical methods and metrology. The many benefits that can be realized through the utilization of nanotechnology are intended to lead to an improved quality of life. However,...

  1. THE APPLICATION OF BIOMONITORING DATA IN RISK ASSESSMENT: AN EXPANDED CASE-STUDY WITH BENZENE

    EPA Science Inventory

    Improved analytical methods permit the measurement of low levels of chemicals in human tissues. Despite evidence that chemicals are absorbed, it is unclear whether the relatively low levels detected in human tissue represent a potential adverse health risk. Furthermore, without...

  2. Analytical and Experimental Evaluation of the Heat Transfer Distribution over the Surfaces of Turbine Vanes

    NASA Technical Reports Server (NTRS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-01-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the database indicates a significant improvement in predictive capability.

  3. An analytical model for pressure of volume fractured tight oil reservoir with horizontal well

    NASA Astrophysics Data System (ADS)

    Feng, Qihong; Dou, Kaiwen; Zhang, Xianmin; Xing, Xiangdong; Xia, Tian

    2017-05-01

    Tight oil reservoirs have poorer properties than the conventional reservoirs commonly encountered before: porosity and permeability are low, and fluid flow is complex, so ordinary depletion methods perform poorly. Volume fracturing goes beyond conventional EOR mechanisms by enlarging the contact area between the fractures and the reservoir, thereby improving the production of each individual well. To forecast production effectively, we use the traditional dual-porosity model to build an analytical model for the production of a volume-fractured tight oil reservoir with a horizontal well and obtain the analytical solution in the Laplace domain. We then construct the log-log plot of dimensionless pressure versus time by numerical inversion of the Laplace-domain solution. After that, we discuss the factors that influence pressure; factors such as cross flow, skin factor, and threshold pressure gradient are analyzed in the article. This model provides a useful method for tight oil production forecasting and offers guidance for production capacity prediction and dynamic analysis.
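
    The numerical inversion step referenced above (the original text's garbled "stiffest conversion" is read here, as an assumption, as the Stehfest algorithm commonly used in well-test analysis) can be sketched generically in Python, independent of the paper's dual-porosity model; the sketch is checked against a known Laplace transform pair.

      import numpy as np
      from math import factorial, log

      def stehfest_coeffs(N):
          """Stehfest weights V_i for even N."""
          assert N % 2 == 0
          half = N // 2
          V = np.zeros(N)
          for i in range(1, N + 1):
              s = 0.0
              for k in range((i + 1) // 2, min(i, half) + 1):
                  s += (k ** half * factorial(2 * k)) / (
                      factorial(half - k) * factorial(k) * factorial(k - 1)
                      * factorial(i - k) * factorial(2 * k - i))
              V[i - 1] = (-1) ** (i + half) * s
          return V

      def stehfest_invert(F, t, N=12):
          """Approximate f(t) from its Laplace transform F(s)."""
          V = stehfest_coeffs(N)
          a = log(2.0) / t
          return a * sum(V[i] * F((i + 1) * a) for i in range(N))

      # Check against a known pair: L{1 - exp(-t)} = 1/s - 1/(s + 1)
      F = lambda s: 1.0 / s - 1.0 / (s + 1.0)
      print(stehfest_invert(F, t=1.0), 1 - np.exp(-1.0))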

  4. Optimization of a Precolumn OPA Derivatization HPLC Assay for Monitoring of l-Asparagine Depletion in Serum during l-Asparaginase Therapy.

    PubMed

    Zhang, Mei; Zhang, Yong; Ren, Siqi; Zhang, Zunjian; Wang, Yongren; Song, Rui

    2018-06-06

    A method for monitoring l-asparagine (ASN) depletion in patients' serum using reversed-phase high-performance liquid chromatography with precolumn o-phthalaldehyde and ethanethiol (ET) derivatization is described. In order to improve the signal and stability of the analytes, several important factors, including the precipitant reagent, derivatization conditions, and detection wavelengths, were optimized. The recovery of the analytes from the biological matrix was highest when 4% sulfosalicylic acid (1:1, v/v) was used as the precipitant reagent. Optimal fluorescence detection parameters were determined as λex = 340 nm and λem = 444 nm for maximal signal. The signal of the analytes was highest when the reagent ET and a borate buffer of pH 9.9 were used in the derivatization solution, and the corresponding derivatives were stable for up to 19 h. The validated method has been successfully applied to monitor ASN depletion and l-aspartic acid, l-glutamine, and l-glutamic acid levels in pediatric patients during l-asparaginase therapy.

  5. Analytical and experimental evaluation of the heat transfer distribution over the surfaces of turbine vanes

    NASA Astrophysics Data System (ADS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-05-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the database indicates a significant improvement in predictive capability.

  6. Novel permanent magnet linear motor with isolated movers: analytical, numerical and experimental study.

    PubMed

    Yan, Liang; Peng, Juanjuan; Jiao, Zongxia; Chen, Chin-Yin; Chen, I-Ming

    2014-10-01

    This paper proposes a novel permanent magnet linear motor possessing two movers and one stator. The two movers are isolated and can interact with the stator poles to generate independent forces and motions. Compared with a conventional multiple-motor drive system, this arrangement increases system compactness and thus improves power density and working efficiency. The magnetic field distribution is obtained by using the equivalent magnetic circuit method. Following that, the formulation of force output considering armature reaction is carried out. The inductances are then analyzed with the finite element method to investigate the relationship between the two movers. It is found that the mutual inductances are nearly equal to zero, and thus the interaction between the two movers is negligible. A research prototype of the linear motor and a thrust force measurement apparatus have been developed. Both numerical computation and experimental measurement are conducted to validate the analytical model of thrust force. Comparison shows that the analytical model matches the numerical and experimental results well.

  7. Analytical Optimization of the Net Residual Dispersion in SPM-Limited Dispersion-Managed Systems

    NASA Astrophysics Data System (ADS)

    Xiao, Xiaosheng; Gao, Shiming; Tian, Yu; Yang, Changxi

    2006-05-01

    Dispersion management is an effective technique to suppress the nonlinear impairment in fiber transmission systems, which includes tuning the amounts of precompensation, residual dispersion per span (RDPS), and net residual dispersion (NRD) of the systems. For self-phase modulation (SPM)-limited systems, optimizing the NRD is necessary because it can greatly improve the system performance. In this paper, an analytical method is presented to optimize NRD for SPM-limited dispersion-managed systems. The method is based on the correlation between the nonlinear impairment and the output pulse broadening of SPM-limited systems; therefore, dispersion-managed systems can be optimized through minimizing the output single-pulse broadening. A set of expressions is derived to calculate the output pulse broadening of the SPM-limited dispersion-managed system, from which the analytical result of optimal NRD is obtained. Furthermore, with the expressions of pulse broadening, how the nonlinear impairment depends on the amounts of precompensation and RDPS can be revealed conveniently.

  8. Improving Causal Inferences in Meta-analyses of Longitudinal Studies: Spanking as an Illustration.

    PubMed

    Larzelere, Robert E; Gunnoe, Marjorie Lindner; Ferguson, Christopher J

    2018-05-24

    To evaluate and improve the validity of causal inferences from meta-analyses of longitudinal studies, two adjustments for Time-1 outcome scores and a temporally backwards test are demonstrated. Causal inferences would be supported by robust results across both adjustment methods, distinct from results run backwards. A systematic strategy for evaluating potential confounds is also introduced. The methods are illustrated by assessing the impact of spanking on subsequent externalizing problems (child age: 18 months to 11 years). Significant results indicated a small risk or a small benefit of spanking, depending on the adjustment method. These meta-analytic methods are applicable for research on alternatives to spanking and other developmental science topics. The underlying principles can also improve causal inferences in individual studies. © 2018 Society for Research in Child Development.
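
    One common form of the Time-1 adjustment discussed above is an ANCOVA-style regression in which the Time-1 outcome score enters as a covariate; the Python sketch below uses simulated data to show how controlling for the baseline score changes the estimated effect. It is a generic illustration, not either of the paper's two specific adjustment methods, and all coefficients are assumed values.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 500
      outcome_t1 = rng.normal(size=n)                              # baseline externalizing score
      predictor = 0.3 * outcome_t1 + rng.normal(size=n)            # predictor correlated with baseline
      outcome_t2 = 0.6 * outcome_t1 + 0.1 * predictor + rng.normal(size=n)

      # Unadjusted: regress the Time-2 outcome on the predictor only (confounded by baseline).
      b_unadj = np.polyfit(predictor, outcome_t2, 1)[0]

      # Adjusted: include the Time-1 outcome score as a covariate.
      X = np.column_stack([np.ones(n), predictor, outcome_t1])
      b_adj = np.linalg.lstsq(X, outcome_t2, rcond=None)[0][1]

      print(f"unadjusted slope {b_unadj:.2f} vs baseline-adjusted slope {b_adj:.2f}")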

  9. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field of view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  10. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  11. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Improvement of LOD in Fluorescence Detection with Spectrally Nonuniform Background by Optimization of Emission Filtering.

    PubMed

    Galievsky, Victor A; Stasheuski, Alexander S; Krylov, Sergey N

    2017-10-17

    The limit of detection (LOD) in analytical instruments with fluorescence detection can be improved by reducing the noise of the optical background. Efficiently reducing optical background noise in systems with a spectrally nonuniform background requires complex optimization of the emission filter, the main element of spectral filtration. Here, we introduce a filter-optimization method which utilizes an expression for the signal-to-noise ratio (SNR) as a function of (i) all noise components (dark, shot, and flicker), (ii) the emission spectrum of the analyte, (iii) the emission spectrum of the optical background, and (iv) the transmittance spectrum of the emission filter. In essence, the noise components and the emission spectra are determined experimentally and substituted into the expression. This leaves a single variable, the transmittance spectrum of the filter, which is optimized numerically by maximizing the SNR. Maximizing the SNR provides an accurate way of filter optimization, while a previously used approach based on maximizing the signal-to-background ratio (SBR) is an approximation that can lead to a much poorer LOD, specifically in the detection of fluorescently labeled biomolecules. The proposed filter-optimization method will be an indispensable tool for developing new and improving existing fluorescence-detection systems aiming at ultimately low LODs.
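
    A toy Python sketch of the SNR-maximizing idea, restricted for simplicity to an ideal band-pass filter with variable cut-on and cut-off wavelengths; the spectra, noise constants, and the specific noise model are illustrative assumptions, not the paper's experimentally determined quantities.

      import numpy as np

      wl = np.arange(500.0, 701.0, 1.0)                                     # wavelength grid, nm
      analyte = np.exp(-0.5 * ((wl - 560.0) / 15.0) ** 2)                   # assumed analyte emission
      background = 0.3 + 0.4 * np.exp(-0.5 * ((wl - 610.0) / 40.0) ** 2)    # nonuniform background

      dark_noise = 5.0        # detector dark noise (counts, rms) - assumed
      flicker = 0.02          # flicker (proportional) noise coefficient - assumed

      best = None
      for lo in wl[:-1]:
          for hi in wl[wl > lo]:
              band = (wl >= lo) & (wl <= hi)
              S = analyte[band].sum()                       # collected analyte signal
              B = background[band].sum()                    # collected background
              # Combine dark, shot (Poisson), and flicker noise in quadrature.
              noise = np.sqrt(dark_noise**2 + (S + B) + (flicker * B)**2)
              snr = S / noise
              if best is None or snr > best[0]:
                  best = (snr, lo, hi)

      print(f"best band-pass {best[1]:.0f}-{best[2]:.0f} nm, SNR = {best[0]:.1f}")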

  13. NBOMe: new potent hallucinogens--pharmacology, analytical methods, toxicities, fatalities: a review.

    PubMed

    Kyriakou, C; Marinelli, E; Frati, P; Santurro, A; Afxentiou, M; Zaami, S; Busardo, F P

    2015-09-01

    NBOMe is a class of emerging new psychoactive substances that has recently gained prominence in the drug abuse market. NBOMes are N-2-methoxybenzyl-substituted members of the 2C class of hallucinogens, currently marketed online as "research chemicals" under various names: N-bomb, Smiles, Solaris, and Cimbi. This article reviews the available literature on the pharmacology, the analytical methods currently used for the detection and quantification of NBOMe in biological matrices and blotters, and intoxication cases and NBOMe-related fatalities. Relevant scientific articles were identified from Medline, Cochrane Central, Scopus, Web of Science, Science Direct, EMBASE and Google Scholar, through June 2015, using the following keywords: "NBOMe", "Nbomb", "Smiles", "intoxication", "toxicity", "fatalities", "death", "pharmacology", "5-HT2A receptor", "analysis" and "analytical methods". The main keyword "NBOMe" was individually searched in association with each of the others. The review of the literature allowed us to identify 43 citations on pharmacology, analytical methods, and NBOMe-related toxicities and fatalities. The high potency of NBOMes, which are potent agonists of the 5-HT2A receptor, has led to several severe intoxications, overdoses and traumatic fatalities; thus, their increasing prevalence raises significant public health concerns. Moreover, due to their high potency and ease of synthesis, it is likely that their recreational use will become more widespread in the future. The publication of new data, case reports and evaluations of NBOMe metabolites is necessary in order to improve knowledge and awareness within the forensic community.

  14. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    PubMed

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has since governed the transfer of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternate strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. It is conducted to control the most important consumer's risks involved, at two levels, in analytical decisions within transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process established within our company for better integration of the transfer study into the method life cycle, as well as proposals for generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
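
    A minimal Python sketch of an equivalence-based comparative test of the kind referred to above, implemented as two one-sided tests (TOST) via confidence-interval inclusion: the receiving site is accepted only if the 90% CI of the mean difference falls entirely within pre-set acceptance limits. The data, the acceptance limit, and the simple pooled-degrees-of-freedom approximation are illustrative assumptions, not the company's actual criteria.

      import numpy as np
      from scipy import stats

      sending = np.array([99.8, 100.2, 99.6, 100.1, 99.9, 100.0])     # % label claim (assumed)
      receiving = np.array([99.1, 99.5, 99.7, 99.0, 99.4, 99.6])
      delta = 2.0                                                     # acceptance limit, % (assumed)

      diff = receiving.mean() - sending.mean()
      se = np.sqrt(sending.var(ddof=1) / len(sending) + receiving.var(ddof=1) / len(receiving))
      df = len(sending) + len(receiving) - 2                          # simple pooled-df approximation
      t_crit = stats.t.ppf(0.95, df)
      ci = (diff - t_crit * se, diff + t_crit * se)                   # 90% CI = two 5% one-sided tests

      verdict = "equivalent" if ci[0] > -delta and ci[1] < delta else "not shown equivalent"
      print(f"mean difference {diff:.2f}%, 90% CI ({ci[0]:.2f}, {ci[1]:.2f}) -> {verdict}")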

  15. Determination of steroid hormones and related compounds in filtered and unfiltered water by solid-phase extraction, derivatization, and gas chromatography with tandem mass spectrometry

    USGS Publications Warehouse

    Foreman, William T.; Gray, James L.; ReVello, Rhiannon C.; Lindley, Chris E.; Losche, Scott A.; Barber, Larry B.

    2012-01-01

    A new analytical method has been developed and implemented at the U.S. Geological Survey National Water Quality Laboratory that determines a suite of 20 steroid hormones and related compounds in filtered water (using laboratory schedule 2434) and in unfiltered water (using laboratory schedule 4434). This report documents the procedures and initial performance data for the method and provides guidance on application of the method and considerations of data quality in relation to data interpretation. The analytical method determines 6 natural and 3 synthetic estrogen compounds, 6 natural androgens, 1 natural and 1 synthetic progestin compound, and 2 sterols: cholesterol and 3-beta-coprostanol. These two sterols have limited biological activity but typically are abundant in wastewater effluents and serve as useful tracers. Bisphenol A, an industrial chemical used primarily to produce polycarbonate plastic and epoxy resins and that has been shown to have estrogenic activity, also is determined by the method. A technique referred to as isotope-dilution quantification is used to improve quantitative accuracy by accounting for sample-specific procedural losses in the determined analyte concentration. Briefly, deuterium- or carbon-13-labeled isotope-dilution standards (IDSs), all of which are direct or chemically similar isotopic analogs of the method analytes, are added to all environmental and quality-control and quality-assurance samples before extraction. Method analytes and IDS compounds are isolated from filtered or unfiltered water by solid-phase extraction onto an octadecylsilyl disk, overlain with a graded glass-fiber filter to facilitate extraction of unfiltered sample matrices. The disks are eluted with methanol, and the extract is evaporated to dryness, reconstituted in solvent, passed through a Florisil solid-phase extraction column to remove polar organic interferences, and again evaporated to dryness in a reaction vial. The method compounds are reacted with activated N-methyl-N-(trimethylsilyl)trifluoroacetamide at 65 degrees Celsius for 1 hour to form trimethylsilyl or trimethylsilyl-enol ether derivatives that are more amenable to gas chromatographic separation than the underivatized compounds. Analysis is carried out by gas chromatography with tandem mass spectrometry using calibration standards that are derivatized concurrently with the sample extracts. Analyte concentrations are quantified relative to specific IDS compounds in the sample, which directly compensate for procedural losses (incomplete recovery) in the determined and reported analyte concentrations. Thus, reported analyte concentrations (or analyte recoveries for spiked samples) are corrected based on recovery of the corresponding IDS compound during the quantification process. Recovery for each IDS compound is reported for each sample and represents an absolute recovery in a manner comparable to surrogate recoveries for other organic methods used by the National Water Quality Laboratory. Thus, IDS recoveries provide a useful tool for evaluating sample-specific analytical performance from an absolute mass recovery standpoint. IDS absolute recovery will differ from and typically be lower than the corresponding analyte's method recovery in spiked samples. However, additional correction of reported analyte concentrations is unnecessary and inappropriate because the analyte concentration (or recovery) already is compensated for by the isotope-dilution quantification procedure.
    Method analytes were spiked at 10 and 100 nanograms per liter (ng/L) for most analytes (10 times greater spike levels were used for bisphenol A and 100 times greater spike levels were used for 3-beta-coprostanol and cholesterol) into the following validation-sample matrices: reagent water, wastewater-affected surface water, a secondary-treated wastewater effluent, and a primary (no biological treatment) wastewater effluent. Overall method recovery for all analytes in these matrices averaged 100 percent, with overall relative standard deviation of 28 percent. Mean recoveries of the 20 individual analytes for spiked reagent-water samples prepared along with field samples and analyzed in 2009–2010 ranged from 84 to 104 percent, with relative standard deviations of 6–36 percent. Concentrations for two analytes, equilin and progesterone, are reported as estimated because these analytes had excessive bias or variability, or both. Additional database coding is applied to other reported analyte data as needed, based on sample-specific IDS recovery performance. Detection levels were derived statistically by fortifying reagent water at six different levels (0.1 to 4 ng/L) and range from about 0.4 to 4 ng/L for 16 analytes. Interim reporting levels applied to analytes in this report range from 0.8 to 8 ng/L. Bisphenol A and the sterols (cholesterol and 3-beta-coprostanol) were consistently detected in laboratory and field blanks. The minimum reporting levels were set at 100 ng/L for bisphenol A and at 200 ng/L for the two sterols to prevent any bias associated with the presence of these compounds in the blanks. A minimum reporting level of 2 ng/L was set for 11-ketotestosterone to minimize false positive risk from an interfering siloxane compound emanating as chromatographic-column bleed, from vial septum material, or from other sources at no more than 1 ng/L.
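
    As an illustration of the isotope-dilution quantification principle described above (not the laboratory's actual software), the sketch below shows how an analyte concentration follows from its peak-area ratio to the corresponding IDS, so that procedural losses cancel, while the IDS absolute recovery is reported separately as a per-sample quality metric. All compound names and numbers are hypothetical.

    # Minimal sketch of isotope-dilution quantification: the analyte is quantified
    # against its labeled internal standard (IDS), so procedural losses cancel in
    # the area ratio. All values below are hypothetical.

    def isotope_dilution_conc(analyte_area, ids_area, ids_ng_added,
                              sample_volume_l, response_factor=1.0):
        """Concentration (ng/L) from the analyte/IDS peak-area ratio.

        response_factor is the relative response of analyte vs. IDS, obtained
        from calibration standards derivatized along with the sample extracts.
        """
        area_ratio = analyte_area / ids_area
        analyte_ng = area_ratio * ids_ng_added / response_factor
        return analyte_ng / sample_volume_l

    def ids_absolute_recovery(ids_area_sample, ids_area_standard):
        """Absolute IDS recovery (%), reported per sample as a QC metric."""
        return 100.0 * ids_area_sample / ids_area_standard

    # Example: a deuterated estrogen IDS spiked at 20 ng into a 1-L sample (hypothetical)
    conc = isotope_dilution_conc(analyte_area=5.2e4, ids_area=4.0e4,
                                 ids_ng_added=20.0, sample_volume_l=1.0)
    rec = ids_absolute_recovery(ids_area_sample=4.0e4, ids_area_standard=6.5e4)
    print(f"analyte: {conc:.1f} ng/L (IDS absolute recovery {rec:.0f}%)")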

  16. Determination of glycols in air: development of sampling and analytical methodology and application to theatrical smokes.

    PubMed

    Pendergrass, S M

    1999-01-01

    Glycol-based fluids are used in the production of theatrical smokes in theaters, concerts, and other stage productions. The fluids are heated and dispersed in aerosol form to create the effect of a smoke, mist, or fog. There have been reports of adverse health effects such as respiratory irritation, chest tightness, shortness of breath, asthma, and skin rashes. Previous attempts to collect and quantify the aerosolized glycols used in fogging agents have been plagued by inconsistent results, both in the efficiency of collection and in the chromatographic analysis of the glycol components. The development of improved sampling and analytical methodology for aerosolized glycols was required to assess workplace exposures more effectively. An Occupational Safety and Health Administration versatile sampler tube was selected for the collection of ethylene glycol, propylene glycol, 1,3-butylene glycol, diethylene glycol, triethylene glycol, and tetraethylene glycol aerosols. Analytical methodology for the separation, identification, and quantitation of the six glycols using gas chromatography/flame ionization detection is described. Limits of detection of the glycol analytes ranged from 7 to 16 micrograms/sample. Desorption efficiencies for all glycol compounds were determined over the range of study and averaged greater than 90%. Storage stability results were acceptable after 28 days for all analytes except ethylene glycol, which was stable at ambient temperature for 14 days. Based on the results of this study, the new glycol method was published in the NIOSH Manual of Analytical Methods.

  17. Potential energy surface fitting by a statistically localized, permutationally invariant, local interpolating moving least squares method for the many-body potential: Method and application to N{sub 4}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Jason D.; Doraiswamy, Sriram; Candler, Graham V., E-mail: truhlar@umn.edu, E-mail: candler@aem.umn.edu

    2014-02-07

    Fitting potential energy surfaces to analytic forms is an important first step for efficient molecular dynamics simulations. Here, we present an improved version of the local interpolating moving least squares method (L-IMLS) for such fitting. Our method has three key improvements. First, pairwise interactions are modeled separately from many-body interactions. Second, permutational invariance is incorporated in the basis functions, using permutationally invariant polynomials in Morse variables, and in the weight functions. Third, computational cost is reduced by statistical localization, in which we statistically correlate the cutoff radius with data point density. We motivate our discussion in this paper with a review of global and local least-squares-based fitting methods in one dimension. Then, we develop our method in six dimensions, and we note that it allows the analytic evaluation of gradients, a feature that is important for molecular dynamics. The approach, which we call statistically localized, permutationally invariant, local interpolating moving least squares fitting of the many-body potential (SL-PI-L-IMLS-MP, or, more simply, L-IMLS-G2), is used to fit a potential energy surface to an electronic structure dataset for N{sub 4}. We discuss its performance on the dataset and give directions for further research, including applications to trajectory calculations.
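
    The one-dimensional review mentioned above suggests a compact illustration. The following sketch implements a generic interpolating moving least squares fit in one dimension, with a hard cutoff radius standing in for the statistical localization; the weight exponent, cutoff, and test function are hypothetical choices, and this is not the authors' full SL-PI-L-IMLS-MP scheme.

    import numpy as np

    def imls_1d(x_eval, x_data, y_data, degree=2, cutoff=1.0, eps=1e-12):
        """Local interpolating moving least squares fit in one dimension.

        At each evaluation point a low-order polynomial is fitted to nearby
        data, with weights that blow up at data points (interpolating) and
        are truncated beyond a cutoff radius (localization).
        """
        x_data = np.asarray(x_data, float)
        y_data = np.asarray(y_data, float)
        x_eval = np.atleast_1d(np.asarray(x_eval, float))
        out = np.empty_like(x_eval)
        for i, xe in enumerate(x_eval):
            r = np.abs(x_data - xe)
            mask = r < cutoff                    # crude stand-in for statistical localization
            w = 1.0 / (r[mask] ** 4 + eps)       # singular weights -> interpolation at data points
            # weighted polynomial least squares centered at xe
            V = np.vander(x_data[mask] - xe, degree + 1, increasing=True)
            W = np.diag(w)
            coeff = np.linalg.solve(V.T @ W @ V, V.T @ W @ y_data[mask])
            out[i] = coeff[0]                    # value of the local fit at xe
        return out

    # Recover a Morse-like curve from scattered samples (hypothetical data)
    x = np.linspace(0.8, 4.0, 40)
    y = (1 - np.exp(-1.5 * (x - 1.2))) ** 2
    xq = np.linspace(1.0, 3.5, 5)
    print(imls_1d(xq, x, y, degree=2, cutoff=0.8))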

  18. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  19. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  20. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  1. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  2. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  3. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  4. Assessment of Orbital-Optimized Third-Order Møller-Plesset Perturbation Theory and Its Spin-Component and Spin-Opposite Scaled Variants for Thermochemistry and Kinetics.

    PubMed

    Soydaş, Emine; Bozkaya, Uğur

    2013-03-12

    An assessment of the OMP3 method and its spin-component and spin-scaled variants for thermochemistry and kinetics is presented. For reaction energies of closed-shell systems, the CCSD, SCS-MP3, and SCS-OMP3 methods show better performance than the other considered methods, and no significant improvement is observed due to orbital optimization. For barrier heights, OMP3 and SCS-OMP3 provide the lowest mean absolute deviations. The MP3 method yields considerably higher errors, and the spin scaling approaches do not help to improve upon MP3, but worsen it. For radical stabilization energies, the CCSD, OMP3, and SCS-OMP3 methods exhibit noticeably better performance than MP3 and its variants. Our results demonstrate that if the reference wave function suffers from spin contamination, then the MP3 methods dramatically fail. On the other hand, the OMP3 method and its variants can tolerate spin contamination in the reference wave function. For overall evaluation, we conclude that OMP3 is quite helpful, especially in electronically challenging systems, such as free radicals or transition states, where spin contamination dramatically deteriorates the quality of the canonical MP3 and SCS-MP3 methods. Both OMP3 and CCSD methods scale as n^6, where n is the number of basis functions. However, the OMP3 method generally converges in far fewer iterations than CCSD. In practice, OMP3 is several times faster than CCSD in energy computations. Further, the stationary properties of OMP3 make it much more favorable than CCSD in the evaluation of analytic derivatives. For OMP3, the analytic gradient computations are much less expensive than for CCSD. For the frequency computation, both methods require the evaluation of the perturbed amplitudes and orbitals. However, in the OMP3 case there are still significant computational time savings due to simplifications in the analytic Hessian expression owing to the stationary property of OMP3. Hence, the OMP3 method emerges as a very useful tool for computational quantum chemistry.

  5. Metformin: A Review of Characteristics, Properties, Analytical Methods and Impact in the Green Chemistry.

    PubMed

    da Trindade, Mariana Teixeira; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-01-02

    Diabetes mellitus (DM) is considered a public health problem. The initial treatment consists of improving the lifestyle and making changes in the diet. When these changes are not enough, the use of medication becomes necessary. Metformin reduces hepatic glucose production and is the preferred treatment for type 2 diabetes. The objective of this review is to survey the characteristics and properties of metformin and to discuss the existing analytical methods in relation to green chemistry and their impact on both the operator and the environment. For this survey, data searches were conducted in scientific papers in the literature as well as in official compendia. The characteristics and properties are presented, along with methods using liquid chromatography, titration, and absorption spectrophotometry in the ultraviolet and infrared regions. Most of the methods presented are not oriented towards green chemistry. Awareness by everyone involved is necessary so that the methods applied for the determination of metformin can be optimized through the implementation of green chemistry.

  6. JT8D and JT9D jet engine performance improvement program. Task 1: Feasibility analysis

    NASA Technical Reports Server (NTRS)

    Gaffin, W. O.; Webb, D. E.

    1979-01-01

    JT8D and JT9D component performance improvement concepts which have a high probability of incorporation into production engines were identified and ranked. An evaluation method based on airline payback period was developed for the purpose of identifying the most promising concepts. The method used available test data and analytical models along with conceptual/preliminary designs to predict the performance improvements, weight, installation characteristics, cost for new production and retrofit, maintenance cost, and qualitative characteristics of candidate concepts. These results were used to arrive at the concept payback period, which is the time required for an airline to recover the investment cost of concept implementation.

  7. Development of an Advanced HPLC–MS/MS Method for the Determination of Carotenoids and Fat-Soluble Vitamins in Human Plasma

    PubMed Central

    Hrvolová, Barbora; Martínez-Huélamo, Miriam; Colmán-Martínez, Mariel; Hurtado-Barroso, Sara; Lamuela-Raventós, Rosa Maria; Kalina, Jiří

    2016-01-01

    The concentration of carotenoids and fat-soluble vitamins in human plasma may play a significant role in numerous chronic diseases such as age-related macular degeneration and some types of cancer. Although these compounds are of utmost interest for human health, methods for their simultaneous determination are scarce. A new high pressure liquid chromatography (HPLC)-tandem mass spectrometry (MS/MS) method for the quantification of selected carotenoids and fat-soluble vitamins in human plasma was developed, validated, and then applied in a pilot dietary intervention study with healthy volunteers. In 50 min, 16 analytes were separated with an excellent resolution and suitable MS signal intensity. The proposed HPLC–MS/MS method led to improvements in the limits of detection (LOD) and quantification (LOQ) for all analyzed compounds compared to the most often used HPLC–DAD methods, in some cases being more than 100-fold lower. LOD values were between 0.001 and 0.422 µg/mL and LOQ values ranged from 0.003 to 1.406 µg/mL, according to the analyte. The accuracy, precision, and stability met with the acceptance criteria of the AOAC (Association of Official Analytical Chemists) International. According to these results, the described HPLC-MS/MS method is adequately sensitive, repeatable and suitable for the large-scale analysis of compounds in biological fluids. PMID:27754400

  8. Development of an Advanced HPLC-MS/MS Method for the Determination of Carotenoids and Fat-Soluble Vitamins in Human Plasma.

    PubMed

    Hrvolová, Barbora; Martínez-Huélamo, Miriam; Colmán-Martínez, Mariel; Hurtado-Barroso, Sara; Lamuela-Raventós, Rosa Maria; Kalina, Jiří

    2016-10-14

    The concentration of carotenoids and fat-soluble vitamins in human plasma may play a significant role in numerous chronic diseases such as age-related macular degeneration and some types of cancer. Although these compounds are of utmost interest for human health, methods for their simultaneous determination are scarce. A new high pressure liquid chromatography (HPLC)-tandem mass spectrometry (MS/MS) method for the quantification of selected carotenoids and fat-soluble vitamins in human plasma was developed, validated, and then applied in a pilot dietary intervention study with healthy volunteers. In 50 min, 16 analytes were separated with an excellent resolution and suitable MS signal intensity. The proposed HPLC-MS/MS method led to improvements in the limits of detection (LOD) and quantification (LOQ) for all analyzed compounds compared to the most often used HPLC-DAD methods, in some cases being more than 100-fold lower. LOD values were between 0.001 and 0.422 µg/mL and LOQ values ranged from 0.003 to 1.406 µg/mL, according to the analyte. The accuracy, precision, and stability met with the acceptance criteria of the AOAC (Association of Official Analytical Chemists) International. According to these results, the described HPLC-MS/MS method is adequately sensitive, repeatable and suitable for the large-scale analysis of compounds in biological fluids.

  9. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  10. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  11. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  12. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  13. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  14. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  15. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  16. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...

  17. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  18. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  19. Differential Mobility Spectrometry for Improved Selectivity in Hydrophilic Interaction Liquid Chromatography-Tandem Mass Spectrometry Analysis of Paralytic Shellfish Toxins

    NASA Astrophysics Data System (ADS)

    Beach, Daniel G.

    2017-08-01

    Paralytic shellfish toxins (PSTs) are neurotoxins produced by dinoflagellates and cyanobacteria that cause paralytic shellfish poisoning in humans. PST quantitation by LC-MS is challenging because of their high polarity, lability as gas-phase ions, and large number of potentially interfering analogues. Differential mobility spectrometry (DMS) has the potential to improve the performance of LC-MS methods for PSTs in terms of selectivity and limits of detection. This work describes a comprehensive investigation of the separation of 16 regulated PSTs by DMS and the development of highly selective LC-DMS-MS methods for PST quantitation. The effects of all DMS parameters on the separation of PSTs from one another were first investigated in detail. The labile nature of 11α-gonyautoxin epimers gave unique insight into fragmentation of labile analytes before, during, and after the DMS analyzer. Two sets of DMS parameters were identified that either optimized the resolution of PSTs from one another or transmitted them at a limited number of compensation voltage (CV) values corresponding to structural subclasses. These were used to develop multidimensional LC-DMS-MS/MS methods using existing HILIC-MS/MS parameters. In both cases, improved selectivity was observed when using DMS, and the quantitative capabilities of a rapid UPLC-DMS-MS/MS method were evaluated. Limits of detection of the developed method were similar to those without DMS, and differences were highly analyte-dependent. Analysis of shellfish matrix reference materials showed good agreement with established methods. The developed methods will be useful in cases where specific matrix interferences are encountered in the LC-MS/MS analysis of PSTs in complex biological samples.

  20. An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics

    ERIC Educational Resources Information Center

    Abedtash, Hamed

    2017-01-01

    Precision medicine refers to the delivering of customized treatment to patients based on their individual characteristics, and aims to reduce adverse events, improve diagnostic methods, and enhance the efficacy of therapies. Among efforts to achieve the goals of precision medicine, researchers have used observational data for developing predictive…

  1. Chemical imaging of secondary cell wall development in cotton fibers using a mid-infrared focal-plane array detector

    USDA-ARS?s Scientific Manuscript database

    Market demands for cotton varieties with improved fiber properties also call for the development of fast, reliable analytical methods for monitoring fiber development and measuring their properties. Currently, cotton breeders rely on instrumentation that can require significant amounts of sample, w...

  2. Determination of zilpaterol in sheep urine and tissues using immunochromatographic assay

    USDA-ARS?s Scientific Manuscript database

    Introduction: Zilpaterol is a feed additive used to increase weight gain, improve feed efficiency, and increase carcass leanness in cattle. An on-site analytical method is needed to determine zilpaterol exposure in animals to assist producer and trade groups in avoiding un-necessary animal or carca...

  3. Advanced bridge safety initiative, task 1 : development of improved analytical load rating procedures for flat-slab concrete bridges - a thesis and guidelines.

    DOT National Transportation Integrated Search

    2010-01-01

    Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects, which is generally regarded as overly conservative by many professional engineers. A...

  4. 40 CFR 161.180 - Enforcement analytical method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Enforcement analytical method. 161.180... DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements § 161.180 Enforcement analytical method. An analytical method suitable for enforcement purposes must be...

  5. Rapid analysis of aminoglycoside antibiotics in bovine tissues using disposable pipette extraction and ultrahigh performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Lehotay, Steven J; Mastovska, Katerina; Lightfield, Alan R; Nuñez, Alberto; Dutko, Terry; Ng, Chilton; Bluhm, Louis

    2013-10-25

    A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction under previously validated conditions, cleanup using disposable pipette extraction, and analysis by a 3 min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. The drug analytes include neomycin, streptomycin, dihydrostreptomycin, and spectinomycin, which have residue tolerances in bovine tissues in the US, and kanamycin, gentamicin, apramycin, amikacin, and hygromycin, which do not have US tolerances established in bovine tissues. Tobramycin was used as an internal standard. An additional drug, paromomycin, also was validated in the method, but it was dropped during implementation due to conversion of neomycin into paromomycin. Proposed fragmentation patterns for the monitored ions of each analyte were elucidated with the aid of high resolution MS using a quadrupole-time-of-flight instrument. Recoveries from spiking experiments at regulatory levels of concern showed that all analytes averaged 70-120% recoveries in all tissues, except hygromycin, which averaged 61% recovery. Lowest calibrated levels were as low as 0.005 μg/g in matrix extracts, which approximately corresponded to the limit of detection for screening purposes. Drug identifications at levels <0.05 μg/g were made in spiked and/or real samples for all analytes and tissues tested. Analyses of 60 samples from 20 slaughtered cattle previously screened positive for aminoglycosides showed that this method worked well in practice. The UHPLC-MS/MS method has several advantages compared to the previous microbial inhibition screening assay, especially for distinguishing individual drugs from a mixture and improving identification of gentamicin in tissue samples. Published by Elsevier B.V.

  6. Advantages of using tetrahydrofuran-water as mobile phases in the quantitation of cyclosporin A in monkey and rat plasma by liquid chromatography-tandem mass spectrometry.

    PubMed

    Li, Austin C; Li, Yinghe; Guirguis, Micheal S; Caldwell, Robert G; Shou, Wilson Z

    2007-01-04

    A new analytical method is described here for the quantitation of the anti-inflammatory drug cyclosporin A (CyA) in monkey and rat plasma. The method used tetrahydrofuran (THF)-water mobile phases to elute the analyte and the internal standard, cyclosporin C (CyC). The gradient mobile phase program successfully eluted CyA as a sharp peak and therefore improved resolution between the analyte and possible interfering materials compared with previously reported analytical approaches, where CyA was eluted as a broad peak due to the rapid conversion between different conformers. The sharp peak resulting from this method facilitated the quantitative calculation, as multiple smoothing and large bunching factors were not necessary. The chromatography in the new method was performed at 30 degrees C instead of 65-70 degrees C as reported previously. Other advantages of the method included simple and fast sample extraction by protein precipitation, direct injection of the extraction supernatant onto the column for analysis, and elimination of the evaporation and reconstitution steps, which were needed in previously reported solid-phase extraction or liquid-liquid extraction procedures. This method is amenable to high-throughput analysis with a total chromatographic run time of 3 min. This approach has been verified as sensitive, linear (0.977-4000 ng/mL), accurate and precise for the quantitation of CyA in monkey and rat plasma. However, compared with conventional mobile phases, the only drawback of this approach was the reduced detection response from the mass spectrometer, possibly caused by poor desolvation in the ionization source. This is the first report to demonstrate the advantages of using THF-water mobile phases to elute CyA in liquid chromatography.

  7. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

    Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
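
    The ASTM D664-style total acid number underlying AMTAN reduces to a single formula, shown in the hedged sketch below; the titrant volumes, blank, normality, and sample mass are hypothetical, and the aqueous-extraction step that distinguishes AMTAN changes the sample preparation rather than this calculation.

    def total_acid_number(vol_koh_ml, blank_ml, koh_normality, sample_mass_g):
        """Total acid number (mg KOH per g of sample), ASTM D664-style calculation.

        56.1 is the molar mass of KOH in g/mol; volumes are titrant volumes to
        the endpoint for the sample and for a reagent blank.
        """
        return (vol_koh_ml - blank_ml) * koh_normality * 56.1 / sample_mass_g

    # Hypothetical titration of an aqueous bio-oil extract
    print(f"AMTAN = {total_acid_number(4.85, 0.10, 0.1, 1.02):.1f} mg KOH/g")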

  8. Estimating the Health Effects of Greenhouse Gas Mitigation Strategies: Addressing Parametric, Model, and Valuation Challenges

    PubMed Central

    Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid

    2014-01-01

    Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270

  9. Teaching dermatoscopy of pigmented skin tumours to novices: comparison of analytic vs. heuristic approach.

    PubMed

    Tschandl, P; Kittler, H; Schmid, K; Zalaudek, I; Argenziano, G

    2015-06-01

    There are two strategies to approach the dermatoscopic diagnosis of pigmented skin tumours, namely the verbal-based analytic method and the more visual-global heuristic method. It is not known if one or the other is more efficient in teaching dermatoscopy. To compare two teaching methods in short-term training of dermatoscopy to medical students. Fifty-seven medical students in the last year of the curriculum were given a 1-h lecture based on either the heuristic or the analytic approach to teaching dermatoscopy. Before and after this session, they were shown the same 50 lesions and asked to diagnose them and rate the chance of malignancy. Test lesions consisted of melanomas, basal cell carcinomas, nevi, seborrhoeic keratoses, benign vascular tumours and dermatofibromas. Performance measures were diagnostic accuracy regarding malignancy, as measured by the area under the receiver operating characteristic curve (range: 0-1), as well as per cent correct diagnoses (range: 0-100%). Diagnostic accuracy and per cent correct diagnoses increased by +0.21 and +32.9% (heuristic teaching) and +0.19 and +35.7% (analytic teaching) respectively (P for all <0.001). Neither for diagnostic accuracy (P = 0.585) nor for per cent correct diagnoses (P = 0.298) was there a difference between the two groups. Short-term training of dermatoscopy to medical students allows significant improvement in diagnostic abilities. Choosing a heuristic or analytic method does not have an influence on this effect in short training using common pigmented skin lesions. © 2014 European Academy of Dermatology and Venereology.

  10. Impact of Advanced Propeller Technology on Aircraft/Mission Characteristics of Several General Aviation Aircraft

    NASA Technical Reports Server (NTRS)

    Keiter, I. D.

    1982-01-01

    Studies of several General Aviation aircraft indicated that the application of advanced technologies to General Aviation propellers can reduce fuel consumption in future aircraft by a significant amount. Propeller blade weight reductions achieved through the use of composites, together with propeller efficiency and noise improvements achieved through the use of advanced concepts and improved propeller analytical design methods, result in aircraft with lower operating cost, acquisition cost and gross weight.

  11. The Literature Review of Analytical Support to Defence Transformation: Lessons Learned from Turkish Air Force Transformation Activities

    DTIC Science & Technology

    2010-04-01

    available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12]. [Table 3: DMAIC Methodology (5-Phase Methodology): Define, Measure, Analyze, Improve, Control; listed tools include Project Charter, Prioritization Matrix, and 5 Whys Analysis.] [Table fragment: methodology scope of DMAIC vs. PDCA [13].] Developing performance priorities is a preliminary stage that precedes specific improvement projects, and the aim

  12. An improved UHPLC-UV method for separation and quantification of carotenoids in vegetable crops.

    PubMed

    Maurer, Megan M; Mein, Jonathan R; Chaudhuri, Swapan K; Constant, Howard L

    2014-12-15

    Carotenoid identification and quantitation is critical for the development of improved nutrition plant varieties. Industrial analysis of carotenoids is typically carried out on multiple crops with potentially thousands of samples per crop, placing critical needs on speed and broad utility of the analytical methods. Current chromatographic methods for carotenoid analysis have had limited industrial application due to their low throughput, requiring up to 60 min for complete separation of all compounds. We have developed an improved UHPLC-UV method that resolves all major carotenoids found in broccoli (Brassica oleracea L. var. italica), carrot (Daucus carota), corn (Zea mays), and tomato (Solanum lycopersicum). The chromatographic method is completed in 13.5 min allowing for the resolution of the 11 carotenoids of interest, including the structural isomers lutein/zeaxanthin and α-/β-carotene. Additional minor carotenoids have also been separated and identified with this method, demonstrating the utility of this method across major commercial food crops. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. An implementation of an aeroacoustic prediction model for broadband noise from a vertical axis wind turbine using a CFD informed methodology

    NASA Astrophysics Data System (ADS)

    Botha, J. D. M.; Shahroki, A.; Rice, H.

    2017-12-01

    This paper presents an enhanced method for predicting aerodynamically generated broadband noise produced by a Vertical Axis Wind Turbine (VAWT). The method improves on existing work for VAWT noise prediction and incorporates recently developed airfoil noise prediction models. Inflow-turbulence and airfoil self-noise mechanisms are both considered. Airfoil noise predictions depend on aerodynamic input data, and time-dependent Computational Fluid Dynamics (CFD) calculations are carried out to provide the aerodynamic solution. Analytical flow methods are also benchmarked against the CFD-informed noise prediction results to quantify errors in the former approach. Comparisons to experimental noise measurements for an existing turbine are encouraging. A parameter study is performed and shows the sensitivity of overall noise levels to changes in inflow velocity and inflow turbulence. Noise sources are characterised, and the location and mechanism of the primary sources are determined; inflow-turbulence noise is seen to be the dominant source. The use of CFD calculations is seen to improve the accuracy of noise predictions compared with the analytic flow solution, and shows that, for inflow-turbulence noise sources, blade-generated turbulence dominates the atmospheric inflow turbulence.

  14. Monitoring occupational exposure to cancer chemotherapy drugs

    NASA Technical Reports Server (NTRS)

    Baker, E. S.; Connor, T. H.

    1996-01-01

    Reports of the health effects of handling cytotoxic drugs and compliance with guidelines for handling these agents are briefly reviewed, and studies using analytical and biological methods of detecting exposure are evaluated. There is little conclusive evidence of detrimental health effects from occupational exposure to cytotoxic drugs. Work practices have improved since the issuance of guidelines for handling these drugs, but compliance with the recommended practices is still inadequate. Of 64 reports published since 1979 on studies of workers' exposure to these drugs, 53 involved studies of changes in cellular or molecular endpoints (biological markers) and 12 described chemical analyses of drugs or their metabolites in urine (2 involved both, and 2 reported the same study). The primary biological markers used were urine mutagenicity, sister chromatid exchange, and chromosomal aberrations; other studies involved formation of micronuclei and measurements of urinary thioethers. The studies had small sample sizes, and the methods were qualitative, nonspecific, subject to many confounders, and possibly not sensitive enough to detect most occupational exposures. Since none of the currently available biological and analytical methods is sufficiently reliable or reproducible for routine monitoring of exposure in the workplace, further studies using these methods are not recommended; efforts should focus instead on wide-spread implementation of improved practices for handling cytotoxic drugs.

  15. 40 CFR 158.355 - Enforcement analytical method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Enforcement analytical method. 158.355... DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An analytical method suitable for enforcement purposes must be provided for each active ingredient in the...

  16. [Physical, chemical and morphological urine examination guidelines for the Analytical Phase from the Intersociety Urinalysis Group].

    PubMed

    Manoni, Fabio; Gessoni, Gianluca; Fogazzi, Giovanni Battista; Alessio, Maria Grazia; Caleffi, Alberta; Gambaro, Giovanni; Epifani, Maria Grazia; Pieretti, Barbara; Perego, Angelo; Ottomano, Cosimo; Saccani, Graziella; Valverde, Sara; Secchiero, Sandra

    2016-01-01

    With these guidelines, the Intersociety Urinalysis Group (GIAU) aims to stimulate the following aspects: improvement and standardization of the analytical approach to the physical, chemical and morphological urine examination (ECMU); improvement of the chemical analysis of urine, with particular regard to reconsidering the diagnostic significance of the parameters traditionally evaluated by dipstick analysis, together with an increasing awareness of the limits of sensitivity and specificity of this analytical method; increased awareness of the importance of professional skills in the field of urinary morphology and of the relationship with clinicians; implementation of a policy of analytical quality evaluation using, in addition to traditional internal and external controls, a program for the evaluation of morphological competence; and stimulation of the diagnostics industry to focus research and development of methods and instruments on the needs of clinical diagnosis. A further aim is to emphasize the value added to ECMU by automated analyzers for the study of the morphology of the corpuscular fraction of urine. The hope is to revalue the enormous diagnostic potential of ECMU, implementing a urinalysis tailored to the personalized diagnostic needs of each patient.

  17. Velocity gap mode of capillary electrophoresis developed for high-resolution chiral separations.

    PubMed

    Li, Xue; Li, Youxin; Zhao, Lumeng; Shen, Jianguo; Zhang, Yong; Bao, James J

    2014-10-01

    A new CE method based on velocity gap (VG) theory has been developed for high-resolution chiral separations. In VG, two consecutive electric fields are adopted to drive analytes through two capillaries, which are linked together through a joint. The joint is immersed in another buffer vial, which is in conductive contact with the buffer inside the capillary. By adjusting the field strengths applied to the two capillaries, an analyte travels at different velocities in the two capillaries, giving a net velocity change (NVC) for the same analyte. Different analytes may have different NVCs, which is particularly meaningful for enantioseparations because enantiomers are usually hard to resolve. By taking advantage of this NVC, it is possible to enhance the resolution of a chiral separation if a proper voltage program is applied. The feasibility of using NVC to enhance chiral separation was demonstrated in the separations of three pairs of enantiomers: terbutaline, chlorpheniramine, and promethazine. All separations started as partial separations in conventional CE and were significantly improved under the same experimental conditions. The results indicated that VG has the potential to be used to improve the resolving power of CE in chiral separations. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
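
    As a toy illustration of the velocity gap idea (the mobilities and field strengths below are hypothetical, and the sketch ignores electroosmotic-flow details), the net velocity change of an analyte between the two capillaries follows directly from its apparent mobility and the two applied field strengths:

    def migration_velocity(mu_app_cm2_per_vs, field_v_per_cm):
        """Apparent migration velocity (cm/s) = apparent mobility x field strength."""
        return mu_app_cm2_per_vs * field_v_per_cm

    def net_velocity_change(mu_app, field_1, field_2):
        """Net velocity change (NVC) of one analyte between the two capillaries."""
        return migration_velocity(mu_app, field_2) - migration_velocity(mu_app, field_1)

    # Hypothetical apparent mobilities of two partially resolved enantiomers
    mu_r, mu_s = 2.05e-4, 2.00e-4        # cm^2 / (V s)
    e1, e2 = 300.0, 450.0                # field strengths (V/cm) in the two capillaries
    print("NVC(R) =", net_velocity_change(mu_r, e1, e2), "cm/s")
    print("NVC(S) =", net_velocity_change(mu_s, e1, e2), "cm/s")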

  18. Maneuver Planning for Conjunction Risk Mitigation with Ground-track Control Requirements

    NASA Technical Reports Server (NTRS)

    McKinley, David

    2008-01-01

    The planning of conjunction Risk Mitigation Maneuvers (RMM) in the presence of ground-track control requirements is analyzed. Past RMM planning efforts on the Aqua, Aura, and Terra spacecraft have demonstrated that only small maneuvers are available when ground-track control requirements are maintained. Assuming small maneuvers, analytical expressions for the effect of a given maneuver on conjunction geometry are derived. The analytical expressions are used to generate a large trade space for initial RMM design. This trade space represents a significant improvement in initial maneuver planning over existing methods that employ high fidelity maneuver models and propagation.

  19. High Sensitivity Analysis of Nanoliter Volumes of Volatile and Nonvolatile Compounds using Matrix Assisted Ionization (MAI) Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Hoang, Khoa; Pophristic, Milan; Horan, Andrew J.; Johnston, Murray V.; McEwen, Charles N.

    2016-10-01

    First results are reported using a simple, fast, and reproducible matrix-assisted ionization (MAI) sample introduction method that provides substantial improvements relative to previously published MAI methods. The sensitivity of the new MAI method, which requires no laser, high voltage, or nebulizing gas, is comparable to that reported for MALDI-TOF and n-ESI. High resolution full acquisition mass spectra with low chemical background are acquired from low nanoliters of solution using only a few femtomoles of analyte. The limit of detection for angiotensin II is less than 50 amol on an Orbitrap Exactive mass spectrometer. Analyses of peptides, including a bovine serum albumin digest, and drugs, including drugs in urine without a purification step, are reported using a 1 μL zero dead volume syringe in which only the analyte solution wetting the walls of the syringe needle is used in the analysis.

  20. Determination of tamoxifen and endoxifen in dried blood spots using LC-MS/MS and the effect of coated DBS cards on recovery and matrix effects.

    PubMed

    Jager, Nynke Gl; Rosing, Hilde; Schellens, Jan Hm; Beijnen, Jos H

    2014-01-01

    We developed an HPLC-MS/MS method to quantify tamoxifen (2.5-250 ng/ml) and its metabolite (Z)-endoxifen (0.5-50 ng/ml) in dried blood spots. Extraction recovery of both analytes from Whatman DMPK-A cards was 100% and consistent over time, however, recovery of (Z)-endoxifen from Whatman 903 cards was incomplete and increased upon storage. When SDS, a constituent of the DMPK-A coating, was present during the extraction, recovery improved. The method using DMPK-A cards was validated using bioanalytical guidelines. Additionally, influence of haematocrit (0.29-0.48 L/L), spot volume (20-50 µl) and homogeneity was within limits and both analytes were stable in DBS for at least 4 months. The method for the quantification of tamoxifen and (Z)-endoxifen in DBS collected on DMPK-A cards was successfully validated.

  1. Economic benefit evaluation for renewable energy transmitted by HVDC based on production simulation (PS) and analytic hierarchy process (AHP)

    NASA Astrophysics Data System (ADS)

    Zhang, Jinfang; Zheng, Kuan; Liu, Jun; Huang, Xinting

    2018-02-01

    In order to support renewable energy (RE) development in North and West China and to keep RE accommodation at a reasonably high level, traditional HVDC operation curves need to change to follow the output characteristics of RE, which helps to reduce both the curtailed electricity and the curtailment ratio of RE. In this paper, an economic benefit analysis method based on production simulation (PS) and the analytic hierarchy process (AHP) is proposed. PS is the basic tool used to analyze the chosen power system operating situation, and the AHP method provides a suitable comparison among the candidate schemes. Based on four different transmission curve combinations, the related economic benefits have been evaluated by PS and AHP. The results and related indices show the efficiency of the suggested method, and it is validated that an HVDC operation curve that follows RE output can help decrease the RE curtailment level and improve economic operation.
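
    The AHP step above can be made concrete with a small sketch. The criteria, pairwise judgments, and weights below are hypothetical (the paper's actual comparison matrix is not reproduced here); the snippet only shows the standard eigenvector-based weight calculation and consistency check used in AHP.

    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights and consistency ratio from an AHP pairwise comparison matrix."""
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                               # normalized priority weights
        lam_max = eigvals[k].real
        ci = (lam_max - n) / (n - 1)               # consistency index
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
        cr = ci / ri if ri else 0.0                # consistency ratio (should be < 0.1)
        return w, cr

    # Hypothetical criteria: RE curtailment ratio, transmission revenue, operating cost
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    w, cr = ahp_weights(A)
    print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))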

  2. A Comparison of Analytical and Data Preprocessing Methods for Spectral Fingerprinting

    PubMed Central

    LUTHRIA, DEVANAND L.; MUKHOPADHYAY, SUDARSAN; LIN, LONG-ZE; HARNLY, JAMES M.

    2013-01-01

    Spectral fingerprinting, as a method of discriminating between plant cultivars and growing treatments for a common set of broccoli samples, was compared for six analytical instruments. Spectra were acquired for finely powdered solid samples using Fourier transform infrared (FT-IR) and Fourier transform near-infrared (NIR) spectrometry. Spectra were also acquired for unfractionated aqueous methanol extracts of the powders using molecular absorption in the ultraviolet (UV) and visible (VIS) regions and mass spectrometry with negative (MS−) and positive (MS+) ionization. The spectra were analyzed using nested one-way analysis of variance (ANOVA) and principal component analysis (PCA) to statistically evaluate the quality of discrimination. All six methods showed statistically significant differences between the cultivars and treatments. The significance of the statistical tests was improved by the judicious selection of spectral regions (IR and NIR), masses (MS+ and MS−), and derivatives (IR, NIR, UV, and VIS). PMID:21352644
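
    The PCA part of the workflow described above can be sketched in a few lines. The data below are synthetic stand-ins for instrument fingerprints (the real spectra and group structure are not reproduced here); the snippet shows only the generic mean-centering and SVD projection used to obtain scores for discrimination.

    import numpy as np

    def pca_scores(spectra, n_components=2):
        """Mean-center a (samples x wavelengths) matrix and project it onto
        the first principal components via singular value decomposition."""
        X = np.asarray(spectra, dtype=float)
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T            # scores used for discrimination

    # Hypothetical fingerprints: two groups with a small spectral offset
    rng = np.random.default_rng(0)
    group_a = rng.normal(1.00, 0.02, size=(5, 200))
    group_b = rng.normal(1.05, 0.02, size=(5, 200))
    scores = pca_scores(np.vstack([group_a, group_b]))
    print(scores[:, 0])                             # first-PC scores separate the two groups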

  3. The detection of problem analytes in a single proficiency test challenge in the absence of the Health Care Financing Administration rule violations.

    PubMed

    Cembrowski, G S; Hackney, J R; Carey, N

    1993-04-01

    The Clinical Laboratory Improvement Act of 1988 (CLIA 88) has dramatically changed proficiency testing (PT) practices by mandating (1) satisfactory PT for certain analytes as a condition of laboratory operation, (2) fixed PT limits for many of these "regulated" analytes, and (3) an increased number of PT specimens (n = 5) for each testing cycle. For many of these analytes, the fixed limits are much broader than the previously employed Standard Deviation Index (SDI) criteria. Paradoxically, there may be less incentive to identify and evaluate analytically significant outliers to improve the analytical process. Previously described "control rules" to evaluate these PT results are unworkable as they consider only two or three results. We used Monte Carlo simulations of Kodak Ektachem analyzers participating in PT to determine optimal control rules for the identification of PT results that are inconsistent with those from other laboratories using the same methods. The analysis of three representative analytes (potassium, creatine kinase, and iron) was simulated with varying intrainstrument and interinstrument standard deviations (si and sg, respectively) obtained from the College of American Pathologists (Northfield, Ill) Quality Assurance Services data and Proficiency Test data, respectively. Analytical errors were simulated in each of the analytes and evaluated in terms of multiples of the interlaboratory SDI. Simple control rules for detecting systematic and random error were evaluated with power function graphs, that is, graphs of the probability of error detection versus the magnitude of error. Based on the simulation results, we recommend screening all analytes for the occurrence of two or more observations exceeding the same +/- 1 SDI limit. For any analyte satisfying this condition, the mean of the observations should be calculated. For analytes with sg/si ratios between 1.0 and 1.5, a significant systematic error is signaled by the mean exceeding 1.0 SDI. Significant random error is signaled by one observation exceeding the +/- 3-SDI limit or the range of the observations exceeding 4 SDIs. For analytes with higher sg/si, significant systematic or random error is signaled by violation of the screening rule (having at least two observations exceeding the same +/- 1 SDI limit). Random error can also be signaled by one observation exceeding the +/- 1.5-SDI limit or the range of the observations exceeding 3 SDIs. We present a practical approach to the workup of apparent PT errors.
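
    The decision rules recommended above translate almost directly into code. The sketch below is a hedged reading of those rules (the thresholds follow the abstract; the example data and the handling of edge cases are assumptions), with each five-specimen PT challenge expressed in interlaboratory SDI units.

    import numpy as np

    def evaluate_pt_results(sdi_values, sg_over_si):
        """Flag systematic/random error in a 5-specimen PT challenge.

        sdi_values: the five results expressed in interlaboratory SDI units.
        sg_over_si: ratio of interlaboratory to intralaboratory standard deviation.
        """
        x = np.asarray(sdi_values, dtype=float)
        flags = []
        # screening rule: two or more observations beyond the same +/- 1 SDI limit
        screened = (np.sum(x > 1.0) >= 2) or (np.sum(x < -1.0) >= 2)
        if not screened:
            return flags
        if 1.0 <= sg_over_si <= 1.5:
            if abs(x.mean()) > 1.0:
                flags.append("systematic error")
            if np.any(np.abs(x) > 3.0) or (x.max() - x.min()) > 4.0:
                flags.append("random error")
        else:  # higher interlaboratory/intralaboratory ratio
            flags.append("systematic or random error (screening rule violated)")
            if np.any(np.abs(x) > 1.5) or (x.max() - x.min()) > 3.0:
                flags.append("random error")
        return flags

    # Hypothetical potassium PT challenge, results in SDI units
    print(evaluate_pt_results([1.4, 1.1, 0.6, 1.8, 0.9], sg_over_si=1.2))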

  4. Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses

    PubMed Central

    Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.

    2010-01-01

    Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
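
    As a rough illustration of the kind of absolute photon-budget calculation such a methodology entails, the sketch below propagates an analyte concentration through Beer-Lambert absorption, fluorescence quantum yield, collection and filter losses, and detector quantum efficiency to a detected signal. The parameter names and values are illustrative assumptions, not the equations or numbers from the paper.

        # Hedged sketch: a generic photon budget for one fluorescence channel.
        def detected_photoelectrons(conc_molar,
                                    source_photons_per_s=1e15,   # photons/s reaching the sample
                                    path_length_cm=0.01,         # microchannel depth
                                    molar_absorptivity=5e4,      # L mol^-1 cm^-1
                                    quantum_yield=0.9,
                                    collection_efficiency=0.05,  # solid angle captured by the optics
                                    filter_transmission=0.8,
                                    detector_qe=0.6,
                                    integration_time_s=1.0):
            """Estimate detector photoelectrons for a fluorescent analyte concentration."""
            absorbance = molar_absorptivity * conc_molar * path_length_cm
            absorbed_fraction = 1.0 - 10 ** (-absorbance)        # Beer-Lambert law
            emitted = source_photons_per_s * absorbed_fraction * quantum_yield
            detected = emitted * collection_efficiency * filter_transmission * detector_qe
            return detected * integration_time_s

        # A calibration curve is this function evaluated over a concentration range.
        for c in (1e-9, 1e-7, 1e-5, 1e-3):
            print(f"{c:.0e} M -> {detected_photoelectrons(c):.3e} photoelectrons")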

  5. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  6. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  7. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  8. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  9. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  10. Efficient use of retention time for the analysis of 302 drugs in equine plasma by liquid chromatography-MS/MS with scheduled multiple reaction monitoring and instant library searching for doping control.

    PubMed

    Liu, Ying; Uboh, Cornelius E; Soma, Lawrence R; Li, Xiaoqing; Guan, Fuyu; You, Youwen; Chen, Jin-Wen

    2011-09-01

    Multiple drug target analysis (MDTA) used in doping control is more efficient than single drug target analysis (SDTA). The number of drugs with the potential for abuse is so extensive that full coverage is not possible with SDTA. To address this problem, a liquid chromatography tandem mass spectrometric method was developed for simultaneous analysis of 302 drugs using a scheduled multiple reaction monitoring (s-MRM) algorithm. With a known retention time of an analyte, the s-MRM algorithm monitors each MRM transition only around its expected retention time. Analytes were recovered from plasma by liquid-liquid extraction. Information-dependent acquisition (IDA) functionality was used to combine s-MRM with enhanced product ion (EPI) scans within the same chromatographic analysis. An EPI spectrum library was also generated for rapid identification of analytes. Analysis time for the 302 drugs was 7 min. Scheduled MRM improved the quality of the chromatograms, signal response, reproducibility, and enhanced signal-to-noise ratio (S/N), resulting in more data points. Reduction in total cycle time from 2.4 s in conventional MRM (c-MRM) to 1 s in s-MRM allowed completion of the EPI scan at the same time. The speed for screening and identification of multiple drugs in equine plasma for doping control analysis was greatly improved by this method.
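
    The scheduling idea is easy to make concrete: with a known retention time for each analyte, a transition is only monitored inside a window around that time, so far fewer transitions are acquired concurrently than in conventional MRM. The retention times and window width below are hypothetical, not values from the study.

        # Illustrative sketch of retention-time scheduling for MRM transitions.
        def max_concurrent_transitions(retention_times_s, window_s=60.0,
                                       run_length_s=420.0, step_s=1.0):
            """Maximum number of transitions monitored at any instant of the run."""
            max_active, t = 0, 0.0
            while t <= run_length_s:
                active = sum(abs(t - rt) <= window_s / 2 for rt in retention_times_s)
                max_active = max(max_active, active)
                t += step_s
            return max_active

        # Hypothetical retention times (s) for a handful of monitored analytes.
        rts = [45, 60, 95, 120, 150, 180, 210, 240, 300, 330, 360, 390]
        print("conventional MRM monitors", len(rts), "transitions for the whole run")
        print("scheduled MRM monitors at most", max_concurrent_transitions(rts), "at a time")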

  11. First-order analytic propagation of satellites in the exponential atmosphere of an oblate planet

    NASA Astrophysics Data System (ADS)

    Martinusi, Vladimir; Dell'Elce, Lamberto; Kerschen, Gaëtan

    2017-04-01

    The paper offers the fully analytic solution to the motion of a satellite orbiting under the influence of the two major perturbations, due to the oblateness and the atmospheric drag. The solution is presented in a time-explicit form, and takes into account an exponential distribution of the atmospheric density, an assumption that is reasonably close to reality. The approach involves two essential steps. The first one concerns a new approximate mathematical model that admits a closed-form solution with respect to a set of new variables. The second step is the determination of an infinitesimal contact transformation that allows to navigate between the new and the original variables. This contact transformation is obtained in exact form, and afterwards a Taylor series approximation is proposed in order to make all the computations explicit. The aforementioned transformation accommodates both perturbations, improving the accuracy of the orbit predictions by one order of magnitude with respect to the case when the atmospheric drag is absent from the transformation. Numerical simulations are performed for a low Earth orbit starting at an altitude of 350 km, and they show that the incorporation of drag terms into the contact transformation generates an error reduction by a factor of 7 in the position vector. The proposed method aims at improving the accuracy of analytic orbit propagation and transforming it into a viable alternative to the computationally intensive numerical methods.
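
    For orientation, an exponential density law of the kind assumed here has the form rho(h) = rho0 * exp(-(h - h0) / H). The sketch below evaluates that law and the resulting drag acceleration; the reference density, scale height and ballistic coefficient are illustrative assumptions, not the paper's constants.

        import math

        def atmospheric_density(h_km, rho0=2.0e-12, h0_km=350.0, scale_height_km=55.0):
            """Exponential density (kg/m^3) referenced to a 350 km baseline altitude."""
            return rho0 * math.exp(-(h_km - h0_km) / scale_height_km)

        def drag_acceleration(h_km, speed_m_s, ballistic_coeff_m2_per_kg=0.005):
            """Magnitude of the drag acceleration 0.5 * rho * (Cd*A/m) * v^2."""
            return 0.5 * atmospheric_density(h_km) * ballistic_coeff_m2_per_kg * speed_m_s ** 2

        print(drag_acceleration(350.0, 7700.0))  # rough low-Earth-orbit case, in m/s^2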

  12. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools into the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  13. TopicLens: Efficient Multi-Level Visual Topic Exploration of Large-Scale Document Collections.

    PubMed

    Kim, Minjeong; Kang, Kyeongpil; Park, Deokgun; Choo, Jaegul; Elmqvist, Niklas

    2017-01-01

    Topic modeling, which reveals underlying topics of a document corpus, has been actively adopted in visual analytics for large-scale document collections. However, due to its significant processing time and non-interactive nature, topic modeling has so far not been tightly integrated into a visual analytics workflow. Instead, most such systems are limited to utilizing a fixed, initial set of topics. Motivated by this gap in the literature, we propose a novel interaction technique called TopicLens that allows a user to dynamically explore data through a lens interface where topic modeling and the corresponding 2D embedding are efficiently computed on the fly. To support this interaction in real time while maintaining view consistency, we propose a novel efficient topic modeling method and a semi-supervised 2D embedding algorithm. Our work is based on improving state-of-the-art methods such as nonnegative matrix factorization and t-distributed stochastic neighbor embedding. Furthermore, we have built a web-based visual analytics system integrated with TopicLens. We use this system to measure the performance and the visualization quality of our proposed methods. We provide several scenarios showcasing the capability of TopicLens using real-world datasets.
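
    For readers unfamiliar with the underlying model, the following minimal sketch extracts topics by nonnegative matrix factorization from a toy corpus with scikit-learn; it is not the TopicLens implementation, which builds efficiency and interactivity refinements on top of such methods.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import NMF

        docs = [
            "topic modeling reveals themes in large document collections",
            "matrix factorization decomposes the term document matrix",
            "interactive visual analytics supports document exploration",
            "embedding projects documents into two dimensions for visualization",
        ]
        vectorizer = TfidfVectorizer(stop_words="english")
        X = vectorizer.fit_transform(docs)           # documents x terms
        nmf = NMF(n_components=2, init="nndsvda", random_state=0)
        doc_topic = nmf.fit_transform(X)             # documents x topics
        terms = vectorizer.get_feature_names_out()
        for k, comp in enumerate(nmf.components_):   # topics x terms
            top = comp.argsort()[-4:][::-1]
            print(f"topic {k}:", ", ".join(terms[i] for i in top))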

  14. An overview of the environmental applicability of vermicompost: from wastewater treatment to the development of sensitive analytical methods.

    PubMed

    Pereira, Madson de Godoi; Neta, Lourdes Cardoso de Souza; Fontes, Maurício Paulo Ferreira; Souza, Adriana Nascimento; Matos, Thaionara Carvalho; Sachdev, Raquel de Lima; dos Santos, Arnaud Victor; da Guarda Souza, Marluce Oliveira; de Andrade, Marta Valéria Almeida Santana; Paulo, Gabriela Marinho Maciel; Ribeiro, Joselito Nardy; Ribeiro, Araceli Verónica Flores Nardy

    2014-01-01

    The use of vermicompost (humified material) for treating wastewaters, remediating polluted soils, improving agricultural productivity, protecting crop production, and developing sensitive analytical methods is reviewed here, covering the past 17 years. The main advantages of vermicompost, considering all applications covered in this paper, comprise (i) easy acquisition, (ii) low costs, (iii) structural, chemical, and biological characteristics responsible for exceptional adsorptive capacities as well as pollutant degradation, and (iv) the promotion of biocontrol. Specifically, for wastewater decontamination, a considerable number of works have verified the adsorption of toxic metals, but the application of vermicompost is still scarce for the retention of organic compounds. Problems related to the final disposal of enriched vermicompost (after treatment steps) are often found, in spite of some successful destinations such as organic fertilizer. For decontaminating soils, the use of vermicompost is quite scarce, mainly for inorganic pollutants. In agricultural productivity and biocontrol, vermicompost imparts remarkable benefits regarding soil aggregation, plant nutrition, and the development of beneficial microorganisms against phytopathogens. Finally, the use of vermicompost in sensitive analytical methods for quantifying toxic metals is the newest application of this adsorbent.

  15. An Overview of the Environmental Applicability of Vermicompost: From Wastewater Treatment to the Development of Sensitive Analytical Methods

    PubMed Central

    Pereira, Madson de Godoi; Cardoso de Souza Neta, Lourdes; Fontes, Maurício Paulo Ferreira; Souza, Adriana Nascimento; Carvalho Matos, Thaionara; de Lima Sachdev, Raquel; dos Santos, Arnaud Victor; Oliveira da Guarda Souza, Marluce; de Andrade, Marta Valéria Almeida Santana; Marinho Maciel Paulo, Gabriela; Ribeiro, Joselito Nardy; Verónica Flores Nardy Ribeiro, Araceli

    2014-01-01

    The use of vermicompost (humified material) for treating wastewaters, remediating polluted soils, improving agricultural productivity, protecting crop production, and developing sensitive analytical methods is reviewed here, covering the past 17 years. The main advantages of vermicompost, considering all applications covered in this paper, comprise (i) easy acquisition, (ii) low costs, (iii) structural, chemical, and biological characteristics responsible for exceptional adsorptive capacities as well as pollutant degradation, and (iv) the promotion of biocontrol. Specifically, for wastewater decontamination, a considerable number of works have verified the adsorption of toxic metals, but the application of vermicompost is still scarce for the retention of organic compounds. Problems related to the final disposal of enriched vermicompost (after treatment steps) are often found, in spite of some successful destinations such as organic fertilizer. For decontaminating soils, the use of vermicompost is quite scarce, mainly for inorganic pollutants. In agricultural productivity and biocontrol, vermicompost imparts remarkable benefits regarding soil aggregation, plant nutrition, and the development of beneficial microorganisms against phytopathogens. Finally, the use of vermicompost in sensitive analytical methods for quantifying toxic metals is the newest application of this adsorbent. PMID:24578668

  16. Base catalytic transesterification of vegetable oil.

    PubMed

    Mainali, Kalidas

    2012-01-01

Sustainable economic and industrial growth requires safe, sustainable resources of energy. Biofuel is becoming increasingly important as an alternative fuel for the diesel engine. The use of non-edible vegetable oils for biofuel production is significant because of the increasing demand for edible oils as food. With the recent debate of food versus fuel, some non-edible oils like soapnut and Jatropha (Jatropha curcas L.) are being investigated as possible sources of biofuel. Recent research has focused on the application of heterogeneous catalysis. This review considers catalytic transesterification and the possibility of heterogeneous base catalysts. The process of transesterification and the effects of parameters, mechanism, and kinetics are reviewed. Although chromatographic methods (GC and HPLC) are the analytical methods most often used for biofuel characterization, other techniques and some improvements to analytical methods are discussed.

  17. Decision Support System for Determining Scholarship Selection using an Analytical Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Puspitasari, T. D.; Sari, E. O.; Destarianto, P.; Riskiawan, H. Y.

    2018-01-01

A decision support system is a computer application that analyzes data and presents it so that users can make decisions more easily. Scholarship selection, studied here for a senior high school in East Java, was not straightforward: an application was needed to solve the problem, to improve the accuracy with which poor students are targeted as prospective beneficiaries, and to speed up the screening process. This research builds a system that uses the Analytical Hierarchy Process (AHP), a method that decomposes a complex and unstructured problem into groups, organizes the groups into a hierarchical order, assigns numerical values in place of human perception for the relative comparisons, and ultimately, through synthesis, determines which elements have the highest priority. The accuracy of the system developed in this research is 90%.
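
    A minimal sketch of the AHP computation at the core of such a system is given below: criterion weights are taken from the principal eigenvector of a pairwise-comparison matrix and checked with Saaty's consistency ratio. The criteria and the judgments in the matrix are hypothetical, not those used in the study.

        import numpy as np

        def ahp_weights(pairwise):
            """Return (priority_vector, consistency_ratio) for a reciprocal comparison matrix."""
            A = np.asarray(pairwise, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w = w / w.sum()                               # principal eigenvector, normalized
            ci = (eigvals[k].real - n) / (n - 1)          # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
            return w, ci / ri

        # Hypothetical criteria for scholarship selection: family income, grades, distance.
        pairwise = [[1,   3,   5],
                    [1/3, 1,   3],
                    [1/5, 1/3, 1]]
        weights, cr = ahp_weights(pairwise)
        print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))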

  18. Co-elution effects can influence molar mass determination of large macromolecules with asymmetric flow field-flow fractionation coupled to multiangle light scattering.

    PubMed

    Perez-Rea, Daysi; Zielke, Claudia; Nilsson, Lars

    2017-07-14

Starch, and hence amylopectin, is an important biomacromolecule in both the human diet and in technical applications. Therefore, accurate and reliable analytical methods for its characterization are needed. A suitable method for analyzing macromolecules with ultra-high molar mass, branched structure, and high polydispersity is asymmetric flow field-flow fractionation (AF4) in combination with multiangle light scattering (MALS) detection. In this paper we illustrate how co-elution of low quantities of very large analytes in AF4 may cause disturbances in the MALS data which, in turn, cause an overestimation of the size. Furthermore, it is shown how pre-injection filtering of the sample can improve the results. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    PubMed

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

Background: Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective: The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators’ decision making. Methods: A deductive case study approach was applied to develop the conceptual model. Results: The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions: The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840

  1. 7 CFR 94.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.4 Section 94.4 Agriculture... POULTRY AND EGG PRODUCTS Mandatory Analyses of Egg Products § 94.4 Analytical methods. The majority of analytical methods used by the USDA laboratories to perform mandatory analyses for egg products are listed as...

  2. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  3. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  4. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  5. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  6. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there are no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances although endogenous analytes can be important in Forensic Toxicology (alcohol consumption marker, congener alcohols, gamma hydroxy butyric acid, human insulin and C-peptide, creatinine, postmortal clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances will be discussed which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. Especially the validation parameters calibration model, analytical limits, accuracy (bias and precision) and matrix effects and recovery have to be approached differently. Highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Improved Analytical Sensitivity of Lateral Flow Assay using Sponge for HBV Nucleic Acid Detection.

    PubMed

    Tang, Ruihua; Yang, Hui; Gong, Yan; Liu, Zhi; Li, XiuJun; Wen, Ting; Qu, ZhiGuo; Zhang, Sufeng; Mei, Qibing; Xu, Feng

    2017-05-02

Hepatitis B virus (HBV) infection is a serious public health problem, which can be transmitted through various routes (e.g., blood donation) and cause hepatitis, liver cirrhosis and liver cancer. Hence, it is necessary to perform diagnostic screening for high-risk HBV patients in these transmission routes. Nowadays, protein-based technologies have been used for HBV testing, which however involve the issues of large sample volume, antibody instability and poor specificity. Nucleic acid hybridization-based lateral flow assay (LFA) holds great potential to address these limitations due to its low-cost, rapid, and simple features, but the poor analytical sensitivity of LFA restricts its application. In this study, we developed a low-cost, simple and easy-to-use method to improve analytical sensitivity by integrating a sponge shunt into the LFA to decrease the fluid flow rate. The thickness, length and hydrophobicity of the sponge shunt were sequentially optimized, achieving a 10-fold signal enhancement in nucleic acid testing of HBV as compared to the unmodified LFA. The enhancement was further confirmed by using HBV clinical samples, where we achieved a detection limit of 10^3 copies/ml as compared to 10^4 copies/ml in the unmodified LFA. The improved LFA holds great potential for disease diagnostics, food safety control and environmental monitoring at the point-of-care.

  8. Improvement and validation of the method to determine neutral detergent fiber in feed.

    PubMed

    Hiraoka, Hisaaki; Fukunaka, Rie; Ishikuro, Eiichi; Enishi, Osamu; Goto, Tetsuhisa

    2012-10-01

    To improve the performance of the analytical method for neutral detergent fiber in feed with heat-stable α-amylase treatment (aNDFom), the process of adding heat-stable α-amylase, as well as other analytical conditions, were examined. In this new process, the starch in the samples was removed by adding amylase to neutral detergent (ND) solution twice, just after the start of heating and immediately after refluxing. We also examined the effects of the use of sodium sulfite, and drying and ashing conditions for aNDFom analysis by this modified amylase addition method. A collaborative study to validate this new method was carried out with 15 laboratories. These laboratories analyzed two samples, alfalfa pellet and dairy mixed feed, with blind duplicates. Ten laboratories used a conventional apparatus and five used a Fibertec(®) type apparatus. There were no significant differences in aNDFom values between these two refluxing apparatuses. The aNDFom values in alfalfa pellet and dairy mixed feed were 388 g/kg and 145 g/kg, the coefficients of variation for the repeatability and reproducibility (CV(r) and CV(R) ) were 1.3% and 2.9%, and the HorRat values were 0.8 and 1.1, respectively. This new method was validated with 5.8% uncertainty (k = 2) from the collaborative study. © 2012 The Authors. Animal Science Journal © 2012 Japanese Society of Animal Science.

  9. Addressing unmeasured confounding in comparative observational research.

    PubMed

    Zhang, Xiang; Faries, Douglas E; Li, Hu; Stamey, James D; Imbens, Guido W

    2018-04-01

    Observational pharmacoepidemiological studies can provide valuable information on the effectiveness or safety of interventions in the real world, but one major challenge is the existence of unmeasured confounder(s). While many analytical methods have been developed for dealing with this challenge, they appear under-utilized, perhaps due to the complexity and varied requirements for implementation. Thus, there is an unmet need to improve understanding the appropriate course of action to address unmeasured confounding under a variety of research scenarios. We implemented a stepwise search strategy to find articles discussing the assessment of unmeasured confounding in electronic literature databases. Identified publications were reviewed and characterized by the applicable research settings and information requirements required for implementing each method. We further used this information to develop a best practice recommendation to help guide the selection of appropriate analytical methods for assessing the potential impact of unmeasured confounding. Over 100 papers were reviewed, and 15 methods were identified. We used a flowchart to illustrate the best practice recommendation which was driven by 2 critical components: (1) availability of information on the unmeasured confounders; and (2) goals of the unmeasured confounding assessment. Key factors for implementation of each method were summarized in a checklist to provide further assistance to researchers for implementing these methods. When assessing comparative effectiveness or safety in observational research, the impact of unmeasured confounding should not be ignored. Instead, we suggest quantitatively evaluating the impact of unmeasured confounding and provided a best practice recommendation for selecting appropriate analytical methods. Copyright © 2018 John Wiley & Sons, Ltd.

  10. [Spectral scatter correction of coal samples based on quasi-linear local weighted method].

    PubMed

    Lei, Meng; Li, Ming; Ma, Xiao-Ping; Miao, Yan-Zi; Wang, Jian-Sheng

    2014-07-01

The present paper puts forth a new spectral correction method based on quasi-linear expressions and a local weighted function. The first stage of the method is to examine three quasi-linear expressions (quadratic, cubic and growth curve) to replace the original linear expression in the MSC method. The local weighted function is then constructed by introducing four kernel functions: Gaussian, Epanechnikov, Biweight and Triweight. After adding this function to the basic estimation equation, the dependency between the original and ideal spectra is described more accurately and meticulously at each wavelength point. Furthermore, two analytical models were established, based on PLS and on a PCA-BP neural network method, which can be used for estimating the accuracy of the corrected spectra. Finally, the optimal correction mode was determined from the analytical results for the different combinations of quasi-linear expression and local weighted function. Spectra of the same coal sample prepared at different particle sizes have different noise ratios. To validate the effectiveness of the method, the experiment analyzed the correction results of three spectral data sets with particle sizes of 0.2, 1 and 3 mm. The results show that the proposed method can eliminate the scattering influence and can also enhance the information in the spectral peaks. This paper provides a more efficient way to significantly enhance the correlation between corrected spectra and coal qualities, and to substantially improve the accuracy and stability of the analytical model.

  11. Linear modeling of steady-state behavioral dynamics.

    PubMed Central

    Palya, William L; Walter, Donald; Kessel, Robert; Lucke, Robert

    2002-01-01

    The observed steady-state behavioral dynamics supported by unsignaled periods of reinforcement within repeating 2,000-s trials were modeled with a linear transfer function. These experiments employed improved schedule forms and analytical methods to improve the precision of the measured transfer function, compared to previous work. The refinements include both the use of multiple reinforcement periods that improve spectral coverage and averaging of independently determined transfer functions. A linear analysis was then used to predict behavior observed for three different test schedules. The fidelity of these predictions was determined. PMID:11831782
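
    The transfer-function estimate itself is a standard cross-spectral calculation, sketched below on synthetic input/output series rather than the experimental records; in the study, precision was further improved by averaging independently determined transfer functions.

        import numpy as np

        rng = np.random.default_rng(0)
        n, dt = 2000, 1.0                              # one 2000-s trial sampled at 1 s
        t = np.arange(n) * dt
        u = (np.sin(2 * np.pi * t / 500) > 0.9).astype(float)  # brief periodic "reinforcement" input
        h_true = np.exp(-np.arange(60) / 15.0)                  # hypothetical behavioral impulse response
        y = np.convolve(u, h_true)[:n] + 0.05 * rng.standard_normal(n)

        U, Y = np.fft.rfft(u), np.fft.rfft(y)
        transfer = (np.conj(U) * Y) / (np.conj(U) * U + 1e-12)  # H(f) = Suy(f) / Suu(f)
        freqs = np.fft.rfftfreq(n, dt)
        idx = [4, 8, 12]                               # harmonics where the input has power
        print("estimated gain at", freqs[idx], "Hz:", np.abs(transfer[idx]).round(2))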

  12. MO-DE-207A-07: Filtered Iterative Reconstruction (FIR) Via Proximal Forward-Backward Splitting: A Synergy of Analytical and Iterative Reconstruction Method for CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, H

Purpose: This work develops a general framework, namely the filtered iterative reconstruction (FIR) method, to incorporate an analytical reconstruction (AR) method into an iterative reconstruction (IR) method for enhanced CT image quality. Methods: FIR is formulated as a combination of filtered data fidelity and sparsity regularization, and is then solved by a proximal forward-backward splitting (PFBS) algorithm. As a result, the image reconstruction decouples data fidelity and image regularization with a two-step iterative scheme, during which an AR-projection step updates the filtered data fidelity term, while a denoising solver updates the sparsity regularization term. During the AR-projection step, the image is projected to the data domain to form the data residual, which is then reconstructed by the chosen AR into a residual image; this is in turn weighted together with the previous image iterate to form the next image iterate. Since the eigenvalues of the AR-projection operator are close to unity, PFBS-based FIR has fast convergence. Results: The proposed FIR method is validated in the setting of circular cone-beam CT with FDK as the AR and total-variation sparsity regularization, and improves image quality over both AR and IR. For example, FIR improved visual assessment and quantitative measurement in terms of both contrast and resolution, and reduced axial and half-fan artifacts. Conclusion: FIR is proposed to incorporate AR into IR, with an efficient image reconstruction algorithm based on PFBS. The CBCT results suggest that FIR synergizes AR and IR with improved image quality and reduced axial and half-fan artifacts. The authors were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
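
    The two-step iteration is easiest to see schematically. The sketch below keeps the PFBS structure but substitutes toy pieces: a small random matrix for the cone-beam projector, its transpose for FDK, and soft thresholding for total-variation denoising, so it illustrates the splitting rather than the actual CBCT implementation.

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(80, 50)) / 8.0            # toy "forward projector"
        x_true = np.zeros(50)
        x_true[[10, 25, 40]] = [1.0, -0.5, 0.8]
        b = A @ x_true + 0.01 * rng.normal(size=80)    # noisy "measurements"

        def ar_reconstruct(residual):
            # stand-in for analytical reconstruction (e.g., FDK) of a data residual
            return A.T @ residual

        def denoise(x, thresh):
            # stand-in for the sparsity step (soft thresholding instead of TV denoising)
            return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

        x = np.zeros(50)
        step = 1.0 / np.linalg.norm(A, 2) ** 2
        for _ in range(200):
            residual = A @ x - b                       # project current image to the data domain
            x = x - step * ar_reconstruct(residual)    # AR-projection (forward) step
            x = denoise(x, step * 0.02)                # regularization (backward/proximal) step

        print("recovered support:", np.nonzero(np.abs(x) > 0.1)[0])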

  13. Reanalysis of a 15-year Archive of IMPROVE Samples

    NASA Astrophysics Data System (ADS)

    Hyslop, N. P.; White, W. H.; Trzepla, K.

    2013-12-01

The IMPROVE (Interagency Monitoring of PROtected Visual Environments) network monitors aerosol concentrations at 170 remote sites throughout the United States. Twenty-four-hour filter samples of particulate matter are collected every third day and analyzed for chemical composition. About 30 of the sites have operated continuously since 1988, and the sustained data record (http://views.cira.colostate.edu/web/) offers a unique window on regional aerosol trends. All elemental analyses have been performed by Crocker Nuclear Laboratory at the University of California in Davis, and sample filters collected since 1995 are archived on campus. The suite of reported elements has remained constant, but the analytical methods employed for their determination have evolved. For example, the elements Na-Mn were determined by PIXE until November 2001, then by XRF analysis in a He-flushed atmosphere through 2004, and by XRF analysis in vacuum since January 2005. In addition to these fundamental changes, incompletely documented operational factors such as detector performance and calibration details have introduced variations in the measurements. Because the past analytical methods were non-destructive, the archived filters can be re-analyzed with the current analytical systems and protocols. The 15-year sample archives from Great Smoky Mountains, Mount Rainier, and Point Reyes National Parks were selected for reanalysis. The agreement between the new analyses and original determinations varies with element and analytical era (Figure 1). Temporal trends for some elements are affected by these changes in measurement technique while others are not (Figure 2). Figure 1. Repeatability of analyses for sulfur and vanadium at Great Smoky Mountains National Park. Each point shows the ratio of mass loadings determined by the original analysis and recent reanalysis. Major method distinctions are indicated at the top. Figure 2. Trends, based on Theil-Sen regression, in lead concentrations based on the original and reanalysis data.
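
    The lead-concentration trends mentioned in Figure 2 rest on Theil-Sen regression, which can be reproduced in a few lines; the sketch below applies SciPy's estimator to synthetic concentrations, not the IMPROVE data.

        import numpy as np
        from scipy.stats import theilslopes

        years = np.arange(1995, 2011)
        rng = np.random.default_rng(0)
        pb = 0.5 * np.exp(-0.08 * (years - 1995)) + 0.02 * rng.normal(size=years.size)
        slope, intercept, lo, hi = theilslopes(pb, years)
        print(f"median slope: {slope:.4f} per year (95% CI {lo:.4f} to {hi:.4f})")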

  14. Electrospun fibrous thin film microextraction coupled with desorption corona beam ionization-mass spectrometry for rapid analysis of antidepressants in human plasma.

    PubMed

    Chen, Di; Hu, Yu-Ning; Hussain, Dilshad; Zhu, Gang-Tian; Huang, Yun-Qing; Feng, Yu-Qi

    2016-05-15

Appropriate sample preparation prior to analysis can significantly enhance the sensitivity of ambient ionization techniques, especially through the enrichment or purification of analytes in the presence of a complex biological matrix. Herein, we developed a rapid analysis method combining thin film microextraction (TFME) and desorption corona beam ionization (DCBI) for the determination of antidepressants in human plasma. The thin films used for extraction consisted of sub-micron sized, highly ordered mesoporous silica-carbon composite fibers (OMSCFs), simply prepared by electrospinning and subsequent carbonization. Typically, an OMSCF thin film was immersed into the diluted plasma for extraction of target analytes and then directly subjected to DCBI-MS for detection. The size-exclusion effect of the mesopores made it possible to avoid a protein precipitation step prior to extraction. Mass transfer benefited from the high surface-to-volume ratio attributed to the macroporous network and ordered mesostructures. Moreover, the OMSCFs provided mixed-mode hydrophobic/ion-exchange interactions towards the target analytes. Thus, the detection sensitivity was greatly improved due to effective enrichment of the target analytes and elimination of matrix interferences. After optimization of several parameters related to extraction performance, the proposed method was applied to the determination of three antidepressants in human plasma. The calibration curves were plotted in the range of 5-1000 ng/mL with acceptable linearity (R^2 > 0.983). The limits of detection (S/N = 3) of the three antidepressants were in the range of 0.3-1 ng/mL. Reproducibility was achieved with RSD less than 17.6%, and the relative recoveries were in the range of 83.6-116.9%. Taken together, the TFME-DCBI-MS method offers a powerful capacity for rapid analysis with much-improved sensitivity. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and incidentally in treated water as well. In fact, complete absence of any trace pollutant in treated drinking water is an illusion as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack toxicity data to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesired because of customer perception. This leads to the question how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action and how effective treatment methods need to be designed to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived by using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals the target values were set at 0.1 μg/L. The target value for the total sum of genotoxic chemicals, the total sum of steroid hormones and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill-principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary extents. Copyright © 2013 Elsevier Ltd. All rights reserved.
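
    The target values quoted above amount to a simple lookup by chemical class; the small helper below merely restates them (the class labels are the only assumed inputs).

        # Dutch Q21 target values for individual contaminants and for class sums (ug/L),
        # as quoted in the abstract above.
        INDIVIDUAL_TARGET_UG_L = {"genotoxic": 0.01, "steroid_endocrine": 0.01, "other_organic": 0.1}
        SUM_TARGET_UG_L = {"genotoxic": 0.01, "steroid_endocrine": 0.01, "other_organic": 1.0}

        def q21_target(chemical_class, as_sum=False):
            """Return the Q21 target value (ug/L) for a contaminant class."""
            table = SUM_TARGET_UG_L if as_sum else INDIVIDUAL_TARGET_UG_L
            return table[chemical_class]

        print(q21_target("other_organic"))               # 0.1 ug/L for an individual organic chemical
        print(q21_target("other_organic", as_sum=True))  # 1.0 ug/L for the sum of all other organics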

  16. New procedure of quantitative mapping of Ti and Al released from dental implant and Mg, Ca, Fe, Zn, Cu, Mn as physiological elements in oral mucosa by LA-ICP-MS.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Koczorowski, Ryszard; Barałkiewicz, Danuta

    2017-12-01

A new procedure for the determination of elements derived from titanium implants and of physiological elements in soft tissues by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is presented. The analytical procedure involved preparation of in-house matrix-matched solid standards with analyte addition, based on the certified reference material (CRM) MODAS-4 Cormorant Tissue. Addition of gelatin, serving as a binding agent, substantially improved the physical properties of the standards. Performance of the analytical method was assessed and validated by calculating parameters such as precision, detection limits, trueness and recovery of the analyte addition, using an additional CRM, ERM-BB184 Bovine Muscle. Analyte addition was additionally confirmed by microwave digestion of the solid standards and analysis by solution nebulization ICP-MS. The detection limits range from 1.8 μg g^-1 (Mn) to 450 μg g^-1 (Ca). The precision values range from 7.3% (Al) to 42% (Zn). The estimated recoveries of the analyte addition lie within the range of 83% (Mn) to 153% (Cu). Oral mucosa samples taken from patients treated with titanium dental implants were examined using the developed analytical method. Standards and tissue samples were cryocut into 30 µm thin sections. LA-ICP-MS yielded two-dimensional maps of the distribution of elements in the tested samples, which revealed a high content of Ti and Al derived from the implants. Photographs from an optical microscope displayed numerous µm-sized particles in the oral mucosa samples, which suggests that they are residues from the implantation procedure. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1989-01-01

    The objectives of the NASA Hot Section Technology (HOST) Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  18. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  19. VOFTools - A software package of calculation tools for volume of fluid methods using general convex grids

    NASA Astrophysics Data System (ADS)

    López, J.; Hernández, J.; Gómez, P.; Faura, F.

    2018-02-01

The VOFTools library includes efficient analytical and geometrical routines for (1) area/volume computation, (2) truncation operations that typically arise in VOF (volume of fluid) methods, (3) area/volume conservation enforcement (VCE) in PLIC (piecewise linear interface calculation) reconstruction, and (4) computation of the distance from a given point to the reconstructed interface. The computation of a polyhedron volume uses an efficient formula based on a quadrilateral decomposition and a 2D projection of each polyhedron face. The analytical VCE method is based on coupling an interpolation procedure to bracket the solution with an improved final calculation step based on the above volume computation formula. Although the library was originally created to help develop highly accurate advection and reconstruction schemes in the context of VOF methods, it may have more general applications. To assess the performance of the supplied routines, different tests, which are provided in FORTRAN and C, were implemented for several 2D and 3D geometries.
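
    For a concrete sense of the area/volume routines, the sketch below computes a convex polyhedron volume by summing signed tetrahedron volumes over fan-triangulated faces. This is a generic divergence-theorem formula, not the quadrilateral-decomposition and 2D-projection formula actually used in VOFTools.

        import numpy as np

        def polyhedron_volume(vertices, faces):
            """vertices: (n, 3) array; faces: lists of vertex indices ordered with outward normals."""
            V = np.asarray(vertices, dtype=float)
            vol = 0.0
            for face in faces:
                v0 = V[face[0]]
                for i in range(1, len(face) - 1):        # fan triangulation of the face
                    v1, v2 = V[face[i]], V[face[i + 1]]
                    vol += np.dot(v0, np.cross(v1, v2))  # 6 x signed tetrahedron volume
            return vol / 6.0

        # Unit cube with outward-ordered faces; the expected volume is 1.0.
        verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
                 (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
        faces = [[0, 3, 2, 1], [4, 5, 6, 7], [0, 1, 5, 4],
                 [1, 2, 6, 5], [2, 3, 7, 6], [3, 0, 4, 7]]
        print(polyhedron_volume(verts, faces))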

  20. Comprehensive Numerical Analysis of Finite Difference Time Domain Methods for Improving Optical Waveguide Sensor Accuracy

    PubMed Central

    Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly

    2016-01-01

    This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time savings in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties, which should be calculated as in the previous method. Generally, a small number of arithmetic processes, which result in a shorter simulation time, are desired. The alternating direction implicit technique can be considered a significant step forward for improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.

  1. Analysis of 40 conventional and emerging disinfection by-products in fresh-cut produce wash water by modified EPA methods.

    PubMed

    Lee, Wan-Ning; Huang, Ching-Hua; Zhu, Guangxuan

    2018-08-01

    Chlorine sanitizers used in washing fresh and fresh-cut produce can lead to generation of disinfection by-products (DBPs) that are harmful to human health. Monitoring of DBPs is necessary to protect food safety but comprehensive analytical methods have been lacking. This study has optimized three U.S. Environmental Protection Agency methods for drinking water DBPs to improve their performance for produce wash water. The method development encompasses 40 conventional and emerging DBPs. Good recoveries (60-130%) were achieved for most DBPs in deionized water and in lettuce, strawberry and cabbage wash water. The method detection limits are in the range of 0.06-0.58 μg/L for most DBPs and 10-24 ng/L for nitrosamines in produce wash water. Preliminary results revealed the formation of many DBPs when produce is washed with chlorine. The optimized analytical methods by this study effectively reduce matrix interference and can serve as useful tools for future research on food DBPs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiedenman, B. J.; White, T. L.; Mahannah, R. N.

Ion chromatography (IC) is the principal analytical method used to support studies of Sludge Receipt and Adjustment Tank (SRAT) chemistry at DWPF. A series of prior analytical "Round Robin" (RR) studies, which included both supernate and sludge samples from SRAT simulant and were previously reported as memos, are tabulated in this report [2, 3]. From these studies it was decided to standardize the IC column size to 4 mm diameter, eliminating the capillary column from use. As a follow-on test, the DWPF laboratory, the PSAL laboratory, and the AD laboratory participated in the current analytical RR to determine a suite of anions in SRAT simulant by IC; these results are also tabulated in this report. The particular goal was to confirm the laboratories' ability to measure and quantitate glycolate ion. The target was +/- 20% inter-laboratory agreement of the analyte averages for the RR. Each of the three laboratories analyzed a batch of 12 samples. For each laboratory, the percent relative standard deviation (%RSD) of the averages for nitrate, glycolate, and oxalate was 10% or less. The three laboratories all met the goal of 20% relative agreement for nitrate and glycolate. For oxalate, the PSAL laboratory reported an average value that was 20% higher than the average values reported by the DWPF laboratory and the AD laboratory. Because of this wider window of agreement, it was concluded to continue the practice of an additional acid digestion for total oxalate measurement. It should also be noted that large amounts of glycolate in the SRAT samples will affect the detection limits of nearby-eluting peaks, namely fluoride and formate. A suite of scoping experiments is presented in the report to identify and isolate other potential inter-laboratory discrepancies. Specific ion chromatography inter-laboratory method conditions and differences are tabulated. Most differences were minor, but there are some significant differences in temperature-control equipment, leading to a recommendation of a heated jacket for analytical columns that are operated remotely in radiohoods. A suggested method improvement would be to implement column temperature control at a temperature slightly above ambient to avoid peak shifting due to temperature fluctuations. Temperature control in this manner would improve short- and longer-term peak retention time stability. An unknown peak was observed during the analysis of glycolic acid and SRAT simulant; it was determined to best match diglycolic acid. The development of a method for acetate is summarized, and no significant amount of acetate was observed in the SRAT products tested. In addition, an alternative gas chromatography (GC) method for glycolate is summarized.
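
    The acceptance criteria described above (per-laboratory %RSD of the replicates and +/- 20% agreement of the laboratory averages) are straightforward to script. The sketch below uses hypothetical glycolate values, not the reported results, and reads "agreement" as each laboratory average falling within 20% of the grand mean, which is one plausible interpretation of the criterion.

        import statistics

        def percent_rsd(values):
            return 100.0 * statistics.stdev(values) / statistics.mean(values)

        def within_20_percent(lab_means):
            grand = statistics.mean(lab_means)
            return all(abs(m - grand) / grand <= 0.20 for m in lab_means)

        glycolate_mg_l = {            # hypothetical replicate results per laboratory
            "DWPF": [512, 498, 505, 520],
            "PSAL": [530, 541, 525, 537],
            "AD":   [495, 488, 502, 491],
        }
        for lab, vals in glycolate_mg_l.items():
            print(lab, "%RSD =", round(percent_rsd(vals), 1))
        print("inter-lab agreement within +/- 20%:",
              within_20_percent([statistics.mean(v) for v in glycolate_mg_l.values()]))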

  3. Validating a faster method for reconstitution of Crotalidae Polyvalent Immune Fab (ovine).

    PubMed

    Gerring, David; King, Thomas R; Branton, Richard

    2013-07-01

Reconstitution of CroFab(®) (Crotalidae Polyvalent Immune Fab [ovine]) lyophilized drug product was previously performed using 10 mL sterile water for injection followed by up to 36 min of gentle swirling of the vial. CroFab has been clinically demonstrated to be most effective when administered within 6 h of snake envenomation, and improved clinical outcomes are correlated with quicker timing of administration. An alternate reconstitution method was devised, using 18 mL 0.9% saline with manual inversion, with the goal of shortening reconstitution time while maintaining a high quality, efficacious product. An analytical study was designed to compare the physicochemical properties of 3 separate batches of CroFab when reconstituted using the standard procedure (10 mL WFI with gentle swirling) and a modified rapid procedure using 18 mL 0.9% saline and manual inversion. The physical and chemical characteristics of the same 3 batches were assessed using various analytic methodologies associated with routine quality control release testing. In addition, further analytical methodologies were applied in order to elucidate possible structural changes that may be induced by the changed reconstitution procedure. Batches A, B, and C required mean reconstitution times of 25 min 51 s using the label method and 3 min 07 s (an 88.0% mean decrease) using the modified method. Physicochemical characteristics (color and clarity, pH, purity, protein content, potency) were found to be highly comparable. Characterization assays (dynamic light scattering, analytical ultracentrifugation, LC-MS, SDS-PAGE and circular dichroism spectroscopy) were also all found to be comparable between methods. When comparing CroFab batches that were reconstituted using the labeled and modified methods, the physicochemical and biological (potency) characteristics of CroFab were not significantly changed when challenged by the various standard analytical methodologies applied in routine quality control analysis. Additionally, no changes in the CroFab molecule regarding degradation, aggregation, purity, structure, or mass were observed. The analyses performed validated the use of the more rapid reconstitution method using 18 mL 0.9% saline in order to allow a significantly reduced time to administration of CroFab to patients in need. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Genome-wide analytical approaches for reverse metabolic engineering of industrially relevant phenotypes in yeast

    PubMed Central

    Oud, Bart; Maris, Antonius J A; Daran, Jean-Marc; Pronk, Jack T

    2012-01-01

    Successful reverse engineering of mutants that have been obtained by nontargeted strain improvement has long presented a major challenge in yeast biotechnology. This paper reviews the use of genome-wide approaches for analysis of Saccharomyces cerevisiae strains originating from evolutionary engineering or random mutagenesis. On the basis of an evaluation of the strengths and weaknesses of different methods, we conclude that for the initial identification of relevant genetic changes, whole genome sequencing is superior to other analytical techniques, such as transcriptome, metabolome, proteome, or array-based genome analysis. Key advantages of this technique over gene expression analysis include the independency of genome sequences on experimental context and the possibility to directly and precisely reproduce the identified changes in naive strains. The predictive value of genome-wide analysis of strains with industrially relevant characteristics can be further improved by classical genetics or simultaneous analysis of strains derived from parallel, independent strain improvement lineages. PMID:22152095

  5. Genome-wide analytical approaches for reverse metabolic engineering of industrially relevant phenotypes in yeast.

    PubMed

    Oud, Bart; van Maris, Antonius J A; Daran, Jean-Marc; Pronk, Jack T

    2012-03-01

    Successful reverse engineering of mutants that have been obtained by nontargeted strain improvement has long presented a major challenge in yeast biotechnology. This paper reviews the use of genome-wide approaches for analysis of Saccharomyces cerevisiae strains originating from evolutionary engineering or random mutagenesis. On the basis of an evaluation of the strengths and weaknesses of different methods, we conclude that for the initial identification of relevant genetic changes, whole genome sequencing is superior to other analytical techniques, such as transcriptome, metabolome, proteome, or array-based genome analysis. Key advantages of this technique over gene expression analysis include the independency of genome sequences on experimental context and the possibility to directly and precisely reproduce the identified changes in naive strains. The predictive value of genome-wide analysis of strains with industrially relevant characteristics can be further improved by classical genetics or simultaneous analysis of strains derived from parallel, independent strain improvement lineages. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  6. Evaluating the Effectiveness of Programs to Improve Educational Attainment of Unwed African American Teen Mothers: A Meta Analysis

    ERIC Educational Resources Information Center

    Baytop, Chanza M.

    2006-01-01

    A study implements meta-analytic methods to synthesize the findings and analyze the effects of interventions, including secondary teen pregnancy prevention programs, on educational achievement among unwed African American teen mothers. Results indicate that secondary teen pregnancy prevention programs and other interventions for adolescent mothers…

  7. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    EPA Science Inventory

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deploy...

  8. Improved HF183 reverse primer and probe for greater analytical sensitivity of human Bacteroides in the environment

    EPA Science Inventory

    Background: Numerous indicators have been used to assess the presence of fecal pollution, many relying on molecular methods such as qPCR. One of the targets frequently used, the human-associated Bacteroides 16S rRNA region, has several assays in current usage. These assays vary...

  9. Toward improved understanding and control in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hieftje, Gary M.

    1989-01-01

    As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.

  10. Electromagnetic Launch Technology Assessment. Scientific Basis and Unified Treatment: Forces and Electromechanical Power Conversion (Analytical and Numerical Methods),

    DTIC Science & Technology

    1990-06-01

    on simple railgun accelerators and homopolar generators. Complex rotating flux compressors would drastically improve the performance of EM launchers...velocities. If this is the direction of improvement, then energies stored in the electric trains built with linear electric motors in Japan and Western ...laboratories which had power supplies already built for other programs (homopolar generators in conjunction with an inductor and an opening switch

  11. Ways to improve your correlation functions

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    1993-01-01

    This paper describes a number of ways to improve on the standard method for measuring the two-point correlation function of large scale structure in the Universe. Issues addressed are: (1) the problem of the mean density, and how to solve it; (2) how to estimate the uncertainty in a measured correlation function; (3) minimum variance pair weighting; (4) unbiased estimation of the selection function when magnitudes are discrete; and (5) analytic computation of angular integrals in background pair counts.
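
    For illustration, the sketch below estimates a two-point correlation function from pair counts using the Landy-Szalay form with a uniform random catalogue; it is a minimal, generic example of the pair-weighting approach rather than the specific refinements proposed in the paper, and the catalogues are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def pair_counts(a, b, r_edges):
    """Count pairs between point sets a and b in the radial bins defined by r_edges."""
    cumulative = cKDTree(a).count_neighbors(cKDTree(b), r_edges)
    return np.diff(cumulative)

def xi_landy_szalay(data, rand, r_edges):
    """Two-point correlation function from a data catalogue and a random catalogue."""
    nd, nr = len(data), len(rand)
    dd = pair_counts(data, data, r_edges) / (nd * (nd - 1))   # ordered-pair normalization
    rr = pair_counts(rand, rand, r_edges) / (nr * (nr - 1))
    dr = pair_counts(data, rand, r_edges) / (nd * nr)
    return (dd - 2.0 * dr + rr) / rr

# Uniform mock catalogues (no clustering), so xi should scatter around zero
rng = np.random.default_rng(0)
data = rng.uniform(0.0, 100.0, size=(2000, 3))
rand = rng.uniform(0.0, 100.0, size=(8000, 3))
r_edges = np.linspace(1.0, 20.0, 11)
print(xi_landy_szalay(data, rand, r_edges))
```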

  12. An Assessment of Operational Energy Capability Improvement Fund (OECIF) Programs 17-S-2544

    DTIC Science & Technology

    2017-09-19

    persistently attack key operational energy problems. OECIF themes are summarized in Table 1, and Appendix A includes more detail on the programs within... problems; FY 2014, analytical methods and tools; FY 2015, improving fuel economy for the current tactical ground fleet; FY 2016, increasing the operational...involve a variety of organizations to solve operational energy problems. In FY 2015, the OECIF program received a one-time $14.1M Congressional plus-up

  13. Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.

    1989-01-01

    The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.

  14. Determination of lead isotopes in a new Greenland deep ice core at the sub-picogram per gram level by thermal ionization mass spectrometry using an improved decontamination method.

    PubMed

    Han, Changhee; Burn-Nunes, Laurie J; Lee, Khanghyun; Chang, Chaewon; Kang, Jung-Ho; Han, Yeongcheol; Hur, Soon Do; Hong, Sungmin

    2015-08-01

    An improved decontamination method and ultraclean analytical procedures have been developed to minimize Pb contamination of processed glacial ice cores and to achieve reliable determination of Pb isotopes in North Greenland Eemian Ice Drilling (NEEM) deep ice core sections with concentrations at the sub-picogram per gram level. A PL-7 (Fuso Chemical) silica-gel activator has replaced the previously used colloidal silica activator produced by Merck and has been shown to provide sufficiently enhanced ion beam intensity for Pb isotope analysis for a few tens of picograms of Pb. Considering the quantities of Pb contained in the NEEM Greenland ice core and a sample weight of 10 g used for the analysis, the blank contribution from the sample treatment was observed to be negligible. The decontamination and analysis of the artificial ice cores and selected NEEM Greenland ice core sections confirmed the cleanliness and effectiveness of the overall analytical process. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. System parameter identification from projection of inverse analysis

    NASA Astrophysics Data System (ADS)

    Liu, K.; Law, S. S.; Zhu, X. Q.

    2017-05-01

    The output of a system due to a change of its parameters is often approximated with the sensitivity matrix from the first-order Taylor series. The system output can be measured in practice, but the perturbation in the system parameters is usually not available. Inverse sensitivity analysis can be adopted to estimate the unknown system parameter perturbation from the difference between the observed output data and the corresponding analytical output data calculated from the original system model. The inverse sensitivity analysis is re-visited in this paper with improvements based on Principal Component Analysis of the analytical data calculated from the known system model. The identification equation is projected into a subspace of principal components of the system output, and the sensitivity of the inverse analysis is improved with an iterative model updating procedure. The proposed method is numerically validated with a planar truss structure and dynamic experiments with a seven-storey planar steel frame. Results show that it is robust to measurement noise, and that the location and extent of stiffness perturbation can be identified with better accuracy compared with the conventional response sensitivity-based method.

  16. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    PubMed Central

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training. PMID:14506506

  17. An improved LC-MS/MS method for the quantification of alverine and para hydroxy alverine in human plasma for a bioequivalence study☆.

    PubMed

    Rathod, Dhiraj M; Patel, Keyur R; Mistri, Hiren N; Jangid, Arvind G; Shrivastav, Pranav S; Sanyal, Mallika

    2017-04-01

    A highly sensitive and selective high performance liquid chromatography-tandem mass spectrometry method was developed and validated for the quantification of alverine (ALV) and its active metabolite, para hydroxy alverine (PHA), in human plasma. For sample preparation, solid phase extraction of analytes was performed on Phenomenex Strata-X cartridges using alverine-d5 as the internal standard. The analytes were separated on a Symmetry Shield RP 18 (150 mm×3.9 mm, 5 µm) column with a mobile phase consisting of acetonitrile and 10 mM ammonium formate (65:35, v/v). Detection and quantitation were performed by electrospray ionization mass spectrometry in the positive mode using multiple reaction monitoring. The assay method was fully validated over the concentration range of 15.0-15,000 pg/mL for ALV and 30.0-15,000 pg/mL for PHA. The intra-day and inter-day accuracy and precision (% CV) ranged from 94.00% to 96.00% and 0.48% to 4.15% for both the analytes. The mean recovery obtained for ALV and PHA was 80.59% and 81.26%, respectively. Matrix effect, expressed as the IS-normalized matrix factor, ranged from 0.982 to 1.009 for both the analytes. The application of the method was demonstrated for the specific analysis of ALV and PHA for a bioequivalence study in 52 healthy subjects using 120 mg ALV capsules. The assay reproducibility was also verified by reanalysis of 175 incurred subject samples.

  18. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs on authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D) and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated by Risk Priority Numbers (RPNs)=O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
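
    As a minimal illustration of the scoring step, the sketch below computes and ranks Risk Priority Numbers (RPN = O × D × S) for a few invented failure modes; the descriptions and scores are hypothetical and do not come from the study.

```python
# Hypothetical FMEA entries: (description, occurrence O, detectability D, severity S),
# each scored on a 1-10 scale as in the abstract.
failure_modes = [
    ("Sample vial mislabelled by operator", 4, 7, 8),
    ("NIR probe positioned incorrectly", 5, 4, 6),
    ("Spectral library entry outdated", 2, 8, 9),
]

def rpn(occurrence, detectability, severity):
    """Risk Priority Number as used in FMEA."""
    return occurrence * detectability * severity

# Rank failure modes so that the highest-risk ones are addressed first
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for description, o, d, s in ranked:
    print(f"RPN {rpn(o, d, s):4d}  {description}")
```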

  19. An improved algorithm for balanced POD through an analytic treatment of impulse response tails

    NASA Astrophysics Data System (ADS)

    Tu, Jonathan H.; Rowley, Clarence W.

    2012-06-01

    We present a modification of the balanced proper orthogonal decomposition (balanced POD) algorithm for systems with simple impulse response tails. In this new method, we use dynamic mode decomposition (DMD) to estimate the slowly decaying eigenvectors that dominate the long-time behavior of the direct and adjoint impulse responses. This is done using a new, low-memory variant of the DMD algorithm, appropriate for large datasets. We then formulate analytic expressions for the contribution of these eigenvectors to the controllability and observability Gramians. These contributions can be accounted for in the balanced POD algorithm by simply appending the impulse response snapshot matrices (direct and adjoint, respectively) with particular linear combinations of the slow eigenvectors. Aside from these additions to the snapshot matrices, the algorithm remains unchanged. By treating the tails analytically, we eliminate the need to run long impulse response simulations, lowering storage requirements and speeding up ensuing computations. To demonstrate its effectiveness, we apply this method to two examples: the linearized, complex Ginzburg-Landau equation, and the two-dimensional fluid flow past a cylinder. As expected, reduced-order models computed using an analytic tail match or exceed the accuracy of those computed using the standard balanced POD procedure, at a fraction of the cost.
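
    The sketch below shows a standard exact-DMD computation on a synthetic snapshot matrix, illustrating the kind of eigenvalue and mode estimation the method relies on; the paper's low-memory DMD variant and the analytic Gramian corrections are not reproduced here, and all data are synthetic.

```python
import numpy as np

def dmd(snapshots, rank):
    """Exact dynamic mode decomposition of equally spaced snapshots (columns)."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, V = U[:, :rank], s[:rank], Vh[:rank].conj().T
    A_tilde = U.conj().T @ Y @ V / s          # low-rank map advancing one snapshot
    eigvals, W = np.linalg.eig(A_tilde)
    modes = (Y @ V / s) @ W                   # exact DMD modes (columns)
    return eigvals, modes

# Toy snapshot matrix: two decaying travelling waves sampled on 64 points, 201 times
t = np.linspace(0.0, 10.0, 201)
x = np.linspace(0.0, 1.0, 64)[:, None]
data = (np.exp(-0.05 * t) * np.sin(2 * np.pi * (5 * x - t))
        + 0.5 * np.exp(-0.2 * t) * np.sin(2 * np.pi * (11 * x - 3 * t)))

eigvals, modes = dmd(data, rank=6)

# Slowly decaying structures correspond to eigenvalues with the largest magnitudes
order = np.argsort(-np.abs(eigvals))
print(np.abs(eigvals[order]))
```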

  20. Development of Standard Reference Materials to support assessment of iodine status for nutritional and public health purposes.

    PubMed

    Long, Stephen E; Catron, Brittany L; Boggs, Ashley Sp; Tai, Susan Sc; Wise, Stephen A

    2016-09-01

    The use of urinary iodine as an indicator of iodine status relies in part on the accuracy of the analytical measurement of iodine in urine. Likewise, the use of dietary iodine intake as an indicator of iodine status relies in part on the accuracy of the analytical measurement of iodine in dietary sources, including foods and dietary supplements. Similarly, the use of specific serum biomarkers of thyroid function to screen for both iodine deficiency and iodine excess relies in part on the accuracy of the analytical measurement of those biomarkers. The National Institute of Standards and Technology has been working with the NIH Office of Dietary Supplements for several years to develop higher-order reference measurement procedures and Standard Reference Materials to support the validation of new routine analytical methods for iodine in foods and dietary supplements, for urinary iodine, and for several serum biomarkers of thyroid function including thyroid-stimulating hormone, thyroglobulin, total and free thyroxine, and total and free triiodothyronine. These materials and methods have the potential to improve the assessment of iodine status and thyroid function in observational studies and clinical trials, thereby promoting public health efforts related to iodine nutrition. © 2016 American Society for Nutrition.

  1. Ultrasonic nebulization atmospheric pressure glow discharge - Preliminary study

    NASA Astrophysics Data System (ADS)

    Greda, Krzysztof; Jamroz, Piotr; Pohl, Pawel

    2016-07-01

    Atmospheric pressure glow microdischarge (μAPGD) generated between a small-sized He nozzle jet anode and a flowing liquid cathode was coupled with ultrasonic nebulization (USN) for analytical optical emission spectrometry (OES). The spatial distributions of the emitted spectra from the novel coupled USN-μAPGD system and the conventional μAPGD system were compared. In the μAPGD, the maxima of the intensity distribution profiles of the atomic emission lines Ca, Cd, In, K, Li, Mg, Mn, Na and Sr were observed in the near cathode region, whereas, in the case of the USN-μAPGD, they were shifted towards the anode. In the novel system, the intensities of the analytical lines of the studied metals were boosted from several to 35 times. As compared to the conventional μAPGD-OES with the introduction of analytes through the sputtering and/or the electrospray-like nebulization of the flowing liquid cathode solution, the proposed method with the USN introduction of analytes in the form of a dry aerosol provides improved detectability of the studied metals. The detection limits of metals achieved with the USN-μAPGD-OES method were in the range from 0.08 μg L⁻¹ for Li to 52 μg L⁻¹ for Mn.

  2. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  3. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  4. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  5. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  6. Improved partition equilibrium model for predicting analyte response in electrospray ionization mass spectrometry.

    PubMed

    Du, Lihong; White, Robert L

    2009-02-01

    A previously proposed partition equilibrium model for quantitative prediction of analyte response in electrospray ionization mass spectrometry is modified to yield an improved linear relationship. Analyte mass spectrometer response is modeled by a competition mechanism between analyte and background electrolytes that is based on partition equilibrium considerations. The correlation between analyte response and solution composition is described by the linear model over a wide concentration range and the improved model is shown to be valid for a wide range of experimental conditions. The behavior of an analyte in a salt solution, which could not be explained by the original model, is correctly predicted. The ion suppression effects of 16:0 lysophosphatidylcholine (LPC) on analyte signals are attributed to a combination of competition for excess charge and reduction of total charge due to surface tension effects. In contrast to the complicated mathematical forms that comprise the original model, the simplified model described here can more easily be employed to predict analyte mass spectrometer responses for solutions containing multiple components. Copyright (c) 2008 John Wiley & Sons, Ltd.

  7. An efficient impedance method for induced field evaluation based on a stabilized Bi-conjugate gradient algorithm.

    PubMed

    Wang, Hua; Liu, Feng; Xia, Ling; Crozier, Stuart

    2008-11-21

    This paper presents a stabilized Bi-conjugate gradient algorithm (BiCGstab) that can significantly improve the performance of the impedance method, which has been widely applied to model low-frequency field induction phenomena in voxel phantoms. The improved impedance method offers remarkable computational advantages in terms of convergence performance and memory consumption over the conventional, successive over-relaxation (SOR)-based algorithm. The scheme has been validated against other numerical/analytical solutions on a lossy, multilayered sphere phantom excited by an ideal coil loop. To demonstrate the computational performance and application capability of the developed algorithm, the induced fields inside a human phantom due to a low-frequency hyperthermia device are evaluated. The simulation results show the numerical accuracy and superior performance of the method.
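
    As a rough illustration of the linear-solver step, the sketch below applies SciPy's BiCGstab (with an optional incomplete-LU preconditioner) to a stand-in sparse, diagonally dominant system; the actual matrices assembled by the impedance method for a voxel phantom are not reproduced here.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, bicgstab, spilu

# Stand-in sparse system: a diagonally dominant tridiagonal matrix plays the role of
# the network matrix assembled by the impedance method (an assumption for illustration).
n = 20000
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Optional incomplete-LU preconditioner; often accelerates BiCGstab convergence further
M = LinearOperator((n, n), matvec=spilu(A).solve)

x, info = bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info = {info}",
      "| residual norm:", np.linalg.norm(A @ x - b))
```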

  8. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    PubMed

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and a quality by design (QbD) approach is presented through the development of an UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impacts and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS), or method operable design region, where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
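
    To illustrate the desirability step used to combine competing CQAs, the sketch below applies simple Derringer-type desirability functions to two invented responses (a resolution to be maximized and a solvent consumption to be minimized); the limits, targets and values are hypothetical, not those of the study.

```python
import numpy as np

def desirability_larger_is_better(y, low, target):
    """Derringer-type desirability: 0 below `low`, 1 at or above `target`, linear between."""
    return np.clip((y - low) / (target - low), 0.0, 1.0)

def desirability_smaller_is_better(y, target, high):
    """Derringer-type desirability: 1 at or below `target`, 0 at or above `high`."""
    return np.clip((high - y) / (high - target), 0.0, 1.0)

# Hypothetical responses for one candidate operating point from the response-surface models
resolution = 2.4        # want >= 2.0, ideally >= 3.0
solvent_use = 7.5       # mL per run, want <= 10, ideally <= 5

d_res = desirability_larger_is_better(resolution, low=2.0, target=3.0)
d_sol = desirability_smaller_is_better(solvent_use, target=5.0, high=10.0)
overall = (d_res * d_sol) ** 0.5   # geometric mean of the individual desirabilities
print(d_res, d_sol, overall)
```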

  9. Quantification of fluorine traces in solid samples using CaF molecular emission bands in atmospheric air Laser-Induced Breakdown Spectroscopy

    NASA Astrophysics Data System (ADS)

    Alvarez-Llamas, C.; Pisonero, J.; Bordel, N.

    2016-09-01

    Direct solid determination of trace amounts of fluorine using Laser-Induced Breakdown Spectroscopy (LIBS) is a challenging task due to the low excitation efficiency of this element. Several strategies have been developed to improve the detection capabilities, including the use of LIBS in a He atmosphere to enhance the signal to background ratios of F atomic emission lines. An alternative method is based on the detection of the molecular compounds that are formed with fluorine in the LIBS plasma. In this work, the detection of CaF molecular emission bands is investigated to improve the analytical capabilities of atmospheric air LIBS for the determination of fluorine traces in solid samples. In particular, Cu matrix samples containing different fluorine concentrations (between 50 and 600 μg/g) and variable amounts of Ca are used to demonstrate the linear relationship between the CaF emission signal and the F concentration. Limits of detection for fluorine are improved by more than one order of magnitude when using CaF emission bands rather than F atomic lines in atmospheric-air LIBS. Furthermore, a toothpaste powder sample is used to validate this analytical method. Good agreement is observed between the nominal and the predicted fluorine mass-content.
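
    A minimal sketch of the calibration step, assuming invented CaF band intensities: a straight line is fitted to intensity versus fluorine content and a detection limit is estimated with the common 3σ/slope criterion; the numbers are illustrative only.

```python
import numpy as np

# Hypothetical calibration data: CaF band intensity (arbitrary units) vs fluorine content (ug/g)
f_conc = np.array([50, 100, 200, 300, 450, 600], dtype=float)
caf_intensity = np.array([120, 231, 455, 700, 1020, 1390], dtype=float)

slope, intercept = np.polyfit(f_conc, caf_intensity, 1)
residual_sd = np.std(caf_intensity - (slope * f_conc + intercept), ddof=2)

# Limit of detection from the calibration, using the common 3*sigma/slope criterion
lod = 3.0 * residual_sd / slope
print(f"slope={slope:.3f} a.u. per ug/g, intercept={intercept:.1f}, LOD ~ {lod:.1f} ug/g")

# Predict the fluorine content of an unknown from its measured CaF band intensity
unknown_intensity = 820.0
print("estimated F content:", (unknown_intensity - intercept) / slope, "ug/g")
```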

  10. Building analytic capacity, facilitating partnerships, and promoting data use in state health agencies: a distance-based workforce development initiative applied to maternal and child health epidemiology.

    PubMed

    Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D

    2012-12-01

    The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.

  11. Nanocoating cellulose paper based microextraction combined with nanospray mass spectrometry for rapid and facile quantitation of ribonucleosides in human urine.

    PubMed

    Wan, Lingzhong; Zhu, Haijing; Guan, Yafeng; Huang, Guangming

    2017-07-01

    A rapid and facile analytical method for quantification of ribonucleosides in human urine was developed by the combination of nanocoating cellulose paper based microextraction and nanoelectrospray ionization-tandem mass spectrometry (nESI-MS/MS). Cellulose paper used for microextraction was modified by nano-precision deposition of a uniform ultrathin zirconia gel film using a sol-gel process. Due to the large surface area of the cellulose paper and the strong affinity between zirconia and cis-diol compounds, the target analytes were selectively extracted from the complex matrix. Thus, the detection sensitivity was greatly improved. Typically, the nanocoating cellulose paper was immersed into the diluted urine for selective extraction of target analytes, then the extracted analytes were subjected to nESI-MS/MS detection. The whole analytical procedure could be completed within 10 min. The method was evaluated by the determination of ribonucleosides (adenosine, cytidine, uridine, guanosine) in urine samples. The signal intensities of the ribonucleosides extracted by the nanocoating cellulose paper were enhanced 136- to 459-fold compared with those obtained with unmodified cellulose paper based microextraction. The limits of detection (LODs) and the limits of quantification (LOQs) of the four ribonucleosides were in the range of 0.0136-1.258 μg L⁻¹ and 0.0454-4.194 μg L⁻¹, respectively. The recoveries of the target nucleosides from spiked human urine were in the range of 75.64-103.49% with relative standard deviations (RSDs) less than 9.36%. The results demonstrate the potential of the proposed method for rapid and facile determination of endogenous ribonucleosides in urine samples. Copyright © 2017. Published by Elsevier B.V.

  12. Three-dimensional photoacoustic tomography based on graphics-processing-unit-accelerated finite element method.

    PubMed

    Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying

    2013-12-01

    Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented in the framework of the FEM-based reconstruction algorithm using a graphic-processing-unit parallel frame named the "compute unified device architecture." A series of simulation experiments is carried out to test the accuracy and accelerating effect of the improved method. The results obtained indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is significantly reduced by a factor of 38.9 with a GTX 580 graphics card using the improved method.

  13. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation

  14. Multiple internal standard normalization for improving HS-SPME-GC-MS quantitation in virgin olive oil volatile organic compounds (VOO-VOCs) profile.

    PubMed

    Fortini, Martina; Migliorini, Marzia; Cherubini, Chiara; Cecchi, Lorenzo; Calamai, Luca

    2017-04-01

    The commercial value of virgin olive oils (VOOs) strongly depends on their classification, also based on the aroma of the oils, usually evaluated by a panel test. Nowadays, a reliable analytical method is still needed to evaluate the volatile organic compounds (VOCs) and support the standard panel test method. To date, the use of HS-SPME sampling coupled to GC-MS is generally accepted for the analysis of VOCs in VOOs. However, VOO is a challenging matrix due to the simultaneous presence of: i) compounds at ppm and ppb concentrations; ii) molecules belonging to different chemical classes and iii) analytes with a wide range of molecular mass. Therefore, HS-SPME-GC-MS quantitation based upon the use of external standard method or of only a single internal standard (ISTD) for data normalization in an internal standard method, may be troublesome. In this work a multiple internal standard normalization is proposed to overcome these problems and improving quantitation of VOO-VOCs. As many as 11 ISTDs were used for quantitation of 71 VOCs. For each of them the most suitable ISTD was selected and a good linearity in a wide range of calibration was obtained. Except for E-2-hexenal, without ISTD or with an unsuitable ISTD, the linear range of calibration was narrower with respect to that obtained by a suitable ISTD, confirming the usefulness of multiple internal standard normalization for the correct quantitation of VOCs profile in VOOs. The method was validated for 71 VOCs, and then applied to a series of lampante virgin olive oils and extra virgin olive oils. In light of our results, we propose the application of this analytical approach for routine quantitative analyses and to support sensorial analysis for the evaluation of positive and negative VOOs attributes. Copyright © 2017 Elsevier B.V. All rights reserved.
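
    The sketch below illustrates internal-standard quantitation in which each analyte is paired with its own, most suitable ISTD; the compounds, response factors and peak areas are hypothetical, and the actual ISTD assignments of the study are not reproduced.

```python
# Hypothetical assignments: analyte -> (assigned ISTD, relative response factor from calibration)
istd_assignment = {
    "hexanal":       ("4-methyl-2-pentanol", 1.12),
    "E-2-hexenal":   ("3-octanone", 0.87),
    "ethyl acetate": ("ethyl acetate-d8", 1.05),
}

# Amount of each ISTD spiked into the vial, in ug/kg of oil (hypothetical values)
istd_added = {
    "4-methyl-2-pentanol": 250.0,
    "3-octanone": 100.0,
    "ethyl acetate-d8": 500.0,
}

def quantify(analyte, analyte_area, istd_areas):
    """Internal-standard quantitation: C = (A_analyte / A_istd) * C_istd / RRF."""
    istd, rrf = istd_assignment[analyte]
    return (analyte_area / istd_areas[istd]) * istd_added[istd] / rrf

# Example peak areas from one chromatogram (hypothetical)
istd_areas = {"4-methyl-2-pentanol": 1.8e6, "3-octanone": 9.2e5, "ethyl acetate-d8": 3.1e6}
print(quantify("hexanal", 2.4e6, istd_areas), "ug/kg")
```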

  15. New algorithms for motion error detection of numerical control machine tool by laser tracking measurement on the basis of GPS principle.

    PubMed

    Wang, Jindong; Chen, Peng; Deng, Yufen; Guo, Junjie

    2018-01-01

    As a three-dimensional measuring instrument, the laser tracker is widely used in industrial measurement. To avoid the influence of angle measurement error on the overall measurement accuracy, multi-station and time-sharing measurement with a laser tracker is introduced on the basis of the global positioning system (GPS) principle in this paper. For the proposed method, how to accurately determine the coordinates of each measuring point from a large amount of measured data is a critical issue. Taking the detection of motion error of a numerical control machine tool as an example, the corresponding measurement algorithms are investigated thoroughly. By establishing the mathematical model of detecting motion error of a machine tool with this method, the analytical algorithm for base station calibration and measuring point determination is deduced without selecting an initial iterative value in the calculation. However, when the motion area of the machine tool is in a 2D plane, the coefficient matrix of the base station calibration is singular, which generates a distorted result. In order to overcome the limitation of the original algorithm, an improved analytical algorithm is also derived. Meanwhile, the calibration accuracy of the base station with the improved algorithm is compared with that obtained with the original analytical algorithm and with some iterative algorithms, such as the Gauss-Newton algorithm and the Levenberg-Marquardt algorithm. The experiment further verifies the feasibility and effectiveness of the improved algorithm. In addition, the different motion areas of the machine tool have a certain influence on the calibration accuracy of the base station, and the influence of measurement error on the calibration result of the base station, which depends on the condition number of the coefficient matrix, is analyzed.
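
    For contrast with the analytical (non-iterative) algorithm derived in the paper, the sketch below solves the measuring-point determination step by ordinary nonlinear least squares from station-to-point ranges, in the spirit of GPS multilateration; the station coordinates and ranges are invented.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical base-station coordinates (mm) and measured ranges to one target point;
# in the multi-station, time-sharing scheme the tracker occupies each station in turn.
stations = np.array([
    [0.0,    0.0,    0.0],
    [2000.0, 0.0,    0.0],
    [0.0,  1500.0,   0.0],
    [500.0, 500.0, 1000.0],
])
true_point = np.array([800.0, 600.0, 250.0])
rng = np.random.default_rng(7)
ranges = np.linalg.norm(stations - true_point, axis=1) + rng.normal(0.0, 0.005, len(stations))

def residuals(p):
    """Modelled minus measured station-to-point distances."""
    return np.linalg.norm(stations - p, axis=1) - ranges

fit = least_squares(residuals, x0=np.zeros(3))
print("estimated point (mm):", np.round(fit.x, 4))
```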

  16. New algorithms for motion error detection of numerical control machine tool by laser tracking measurement on the basis of GPS principle

    NASA Astrophysics Data System (ADS)

    Wang, Jindong; Chen, Peng; Deng, Yufen; Guo, Junjie

    2018-01-01

    As a three-dimensional measuring instrument, the laser tracker is widely used in industrial measurement. To avoid the influence of angle measurement error on the overall measurement accuracy, multi-station and time-sharing measurement with a laser tracker is introduced on the basis of the global positioning system (GPS) principle in this paper. For the proposed method, how to accurately determine the coordinates of each measuring point from a large amount of measured data is a critical issue. Taking the detection of motion error of a numerical control machine tool as an example, the corresponding measurement algorithms are investigated thoroughly. By establishing the mathematical model of detecting motion error of a machine tool with this method, the analytical algorithm for base station calibration and measuring point determination is deduced without selecting an initial iterative value in the calculation. However, when the motion area of the machine tool is in a 2D plane, the coefficient matrix of the base station calibration is singular, which generates a distorted result. In order to overcome the limitation of the original algorithm, an improved analytical algorithm is also derived. Meanwhile, the calibration accuracy of the base station with the improved algorithm is compared with that obtained with the original analytical algorithm and with some iterative algorithms, such as the Gauss-Newton algorithm and the Levenberg-Marquardt algorithm. The experiment further verifies the feasibility and effectiveness of the improved algorithm. In addition, the different motion areas of the machine tool have a certain influence on the calibration accuracy of the base station, and the influence of measurement error on the calibration result of the base station, which depends on the condition number of the coefficient matrix, is analyzed.

  17. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. Finally, the research includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as the framework for the simulation model.

  18. Improving traditional balancing methods for high-speed rotors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ling, J.; Cao, Y.

    1996-01-01

    This paper introduces frequency response functions, analyzes the relationships between the frequency response functions and influence coefficients theoretically, and derives corresponding mathematical equations for high-speed rotor balancing. The relationships between the imbalance masses on the rotor and frequency response functions are also analyzed based upon the modal balancing method, and the equations related to the static and dynamic imbalance masses and the frequency response function are obtained. Experiments on a high-speed rotor balancing rig were performed to verify the theory, and the experimental data agree satisfactorily with the analytical solutions. The improvement on the traditional balancing method proposed in this paper will substantially reduce the number of rotor startups required during the balancing process of rotating machinery.
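
    As a baseline illustration of conventional influence-coefficient balancing (the approach the frequency-response-function analysis builds on), the sketch below solves for correction masses from complex vibration phasors by least squares; all readings and coefficients are invented.

```python
import numpy as np

# Hypothetical single-speed data: vibration readings are complex phasors (amplitude and
# phase) at two probes, and the influence coefficients relate a unit trial mass in each
# balance plane to the resulting change in vibration.
v0 = np.array([3.2 * np.exp(1j * 0.4), 2.1 * np.exp(1j * 1.9)])        # initial vibration
C = np.array([[0.8 * np.exp(1j * 0.3), 0.2 * np.exp(1j * 1.1)],        # influence coefficients
              [0.3 * np.exp(1j * 2.0), 0.9 * np.exp(1j * 0.7)]])

# Least-squares correction masses: minimize |v0 + C w|, i.e. solve C w = -v0
w, *_ = np.linalg.lstsq(C, -v0, rcond=None)
for plane, mass in enumerate(w, start=1):
    print(f"plane {plane}: {abs(mass):.2f} g at {np.degrees(np.angle(mass)):.1f} deg")
print("predicted residual vibration:", np.abs(v0 + C @ w))
```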

  19. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  20. Refinement of Fread's Method for improved tracking of stream discharges during unsteady flows

    DOE PAGES

    Lee, Kyutae; Muste, Marian

    2017-02-07

    There are a plethora of analytical approaches to account for the effect of unsteady flow (a.k.a. hysteretic behavior) on the conventionally-built steady rating curves (RCs) used to continuously estimate discharges in open channel flow. One of the most complete correction methods is Fread's method (Fread, 1975), which is based on the fully dynamic one-dimensional wave equation. Proposed herein is a modified Fread's method which is adjusted to account for the actual geometry of the cross section. This method improves the accuracy associated with the estimation of the conveyance factor and the energy slope, so it is particularly useful for small to mid-size streams/rivers where the original method's assumption does not properly hold. The modified Fread's method is tested for sites in Clear Creek (Iowa, USA) and the Ebro River (Spain) to illustrate the significance of its improvement in discharge estimation. While the degree of improvement is apparent for the conveyance factor because the hydraulic depth is replaced by the hydraulic radius, that for the energy slope term specifically depends on the site and event conditions.

  1. Refinement of Fread's Method for improved tracking of stream discharges during unsteady flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Kyutae; Muste, Marian

    There are a plethora of analytical approaches to account for the effect of unsteady flow (a.k.a. hysteretic behavior) on the conventionally-built steady rating curves (RCs) used to continuously estimate discharges in open channel flow. One of the most complete correction methods is Fread's method (Fread, 1975), which is based on the fully dynamic one-dimensional wave equation. Proposed herein is a modified Fread's method which is adjusted to account for the actual geometry of the cross section. This method improves the accuracy associated with the estimation of the conveyance factor and the energy slope, so it is particularly useful for small to mid-size streams/rivers where the original method's assumption does not properly hold. The modified Fread's method is tested for sites in Clear Creek (Iowa, USA) and the Ebro River (Spain) to illustrate the significance of its improvement in discharge estimation. While the degree of improvement is apparent for the conveyance factor because the hydraulic depth is replaced by the hydraulic radius, that for the energy slope term specifically depends on the site and event conditions.

  2. Contamination in food from packaging material.

    PubMed

    Lau, O W; Wong, S K

    2000-06-16

    Packaging has become an indispensable element in the food manufacturing process, and different types of additives, such as antioxidants, stabilizers, lubricants, anti-static and anti-blocking agents, have also been developed to improve the performance of polymeric packaging materials. Recently the packaging itself has been found to represent a source of contamination through the migration of substances from the packaging into food. Various analytical methods have been developed to analyze the migrants in foodstuffs, and migration evaluation procedures based on theoretical prediction of migration from plastic food contact materials were also introduced recently. In this paper, the regulatory control, analytical methodology, factors affecting migration, and migration evaluation are reviewed.

  3. Propfan experimental data analysis

    NASA Technical Reports Server (NTRS)

    Vernon, David F.; Page, Gregory S.; Welge, H. Robert

    1984-01-01

    A data reduction method, which is consistent with the performance prediction methods used for analysis of new aircraft designs, is defined and compared to the method currently used by NASA using data obtained from an Ames Research Center 11-foot transonic wind tunnel test. Pressure and flow visualization data from the Ames test for both the powered straight underwing nacelle and an unpowered contoured overwing nacelle installation are used to determine the flow phenomena present for a wing-mounted turboprop installation. The test data are compared to analytic methods, showing the analytic methods to be suitable for design and analysis of new configurations. The data analysis indicated that designs with zero interference drag levels are achievable with proper wing and nacelle tailoring. A new overwing contoured nacelle design and a modification to the wing leading edge extension for the current wind tunnel model design are evaluated. Hardware constraints of the current model parts prevent obtaining any significant performance improvement due to a modified nacelle contouring. A new aspect ratio wing design for an up outboard rotation turboprop installation is defined, and an advanced contoured nacelle is provided.

  4. Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.

    PubMed

    Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J

    2017-12-01

    Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
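
    The sketch below contrasts the two conventional frameworks discussed: an inverse-variance (fixed-effect) pooled estimate and a DerSimonian-Laird random-effects estimate; the improved fixed-effect-framework estimators advocated by the authors are not reproduced, and the study data are hypothetical.

```python
import numpy as np

def fixed_effect(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate and its variance."""
    w = 1.0 / variances
    return np.sum(w * effects) / np.sum(w), 1.0 / np.sum(w)

def dersimonian_laird(effects, variances):
    """Random-effects pooling with the DerSimonian-Laird between-study variance tau^2."""
    w = 1.0 / variances
    pooled_fe, _ = fixed_effect(effects, variances)
    q = np.sum(w * (effects - pooled_fe) ** 2)          # Cochran's Q
    k = len(effects)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (variances + tau2)
    return np.sum(w_star * effects) / np.sum(w_star), 1.0 / np.sum(w_star)

effects = np.array([0.30, 0.10, 0.45, 0.25, 0.05])      # hypothetical study effect sizes
variances = np.array([0.02, 0.05, 0.03, 0.01, 0.04])    # hypothetical within-study variances
print("fixed effect:  ", fixed_effect(effects, variances))
print("random effects:", dersimonian_laird(effects, variances))
```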

  5. Assessing the detectability of antioxidants in two-dimensional high-performance liquid chromatography.

    PubMed

    Bassanese, Danielle N; Conlan, Xavier A; Barnett, Neil W; Stevenson, Paul G

    2015-05-01

    This paper explores the analytical figures of merit of two-dimensional high-performance liquid chromatography for the separation of antioxidant standards. The cumulative two-dimensional high-performance liquid chromatography peak area was calculated for 11 antioxidants by two different methods--the areas reported by the control software and by fitting the data with a Gaussian model; these methods were evaluated for precision and sensitivity. Both methods demonstrated excellent precision in regards to retention time in the second dimension (%RSD below 1.16%) and cumulative second dimension peak area (%RSD below 3.73% from the instrument software and 5.87% for the Gaussian method). Combining areas reported by the high-performance liquid chromatographic control software displayed superior limits of detection, in the order of 1 × 10⁻⁶ M, almost an order of magnitude lower than the Gaussian method for some analytes. The introduction of the countergradient eliminated the strong solvent mismatch between dimensions, leading to a much improved peak shape and better detection limits for quantification. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
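
    A minimal sketch of the Gaussian-model peak integration, assuming a synthetic second-dimension chromatogram: a Gaussian parameterized directly by its area is fitted with nonlinear least squares, so the fitted area can be compared with a software-reported one.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, area, centre, sigma):
    """Gaussian chromatographic peak parameterized directly by its area."""
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((t - centre) / sigma) ** 2)

# Hypothetical second-dimension slice: one peak plus baseline noise
t = np.linspace(0.0, 30.0, 600)                    # seconds
rng = np.random.default_rng(1)
signal = gaussian(t, area=12.0, centre=14.5, sigma=1.2) + rng.normal(0.0, 0.05, t.size)

popt, pcov = curve_fit(gaussian, t, signal, p0=[10.0, 15.0, 1.0])
area, centre, sigma = popt
print(f"fitted area {area:.2f}, retention time {centre:.2f} s, sigma {sigma:.2f} s")
```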

  6. Pharmaceutical cocrystals, salts and polymorphs: Advanced characterization techniques.

    PubMed

    Pindelska, Edyta; Sokal, Agnieszka; Kolodziejski, Waclaw

    2017-08-01

    The main goal of novel drug development is to obtain a drug with optimal physicochemical, pharmaceutical and biological properties. Pharmaceutical companies and scientists modify active pharmaceutical ingredients (APIs), which often are cocrystals, salts or carefully selected polymorphs, to improve the properties of a parent drug. To find the best form of a drug, various advanced characterization methods should be used. In this review, we have described such analytical methods, dedicated to solid drug forms. Thus, diffraction, spectroscopic, thermal and also pharmaceutical characterization methods are discussed. They all are necessary to study a solid API in its intrinsic complexity from the bulk down to the molecular level, gain information on its structure, properties, purity and possible transformations, and make the characterization efficient, comprehensive and complete. Furthermore, these methods can be used to monitor and investigate physical processes, involved in drug development, in situ and in real time. The main aim of this paper is to gather information on the current advancements in the analytical methods and highlight their pharmaceutical relevance. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  8. Compliance and stress sensitivity of spur gear teeth

    NASA Technical Reports Server (NTRS)

    Cornell, R. W.

    1983-01-01

    The magnitude and variation of tooth pair compliance with load position affects the dynamics and loading significantly, and the tooth root stressing per load varies significantly with load position. Therefore, the recently developed time history, interactive, closed form solution for the dynamic tooth loads for both low and high contact ratio spur gears was expanded to include improved and simplified methods for calculating the compliance and stress sensitivity for three involute tooth forms as a function of load position. The compliance analysis has an improved fillet/foundation. The stress sensitivity analysis is a modified version of the Heywood method but with an improvement in the magnitude and location of the peak stress in the fillet. These improved compliance and stress sensitivity analyses are presented along with their evaluation using test, finite element, and analytic transformation results, which showed good agreement.

  9. Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory

    NASA Astrophysics Data System (ADS)

    Bozkaya, Uǧur

    2013-09-01

    Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011)], 10.1063/1.3665134 are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD as in case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations, however for OMP3 one does not since λ _{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ _{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ˜4-6 times). Considering the balance of computational cost and accuracy we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.

  10. Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory.

    PubMed

    Bozkaya, Uğur

    2013-09-14

    Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011)] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD as in case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations, however for OMP3 one does not since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.

  11. External quality assurance programs as a tool for verifying standardization of measurement procedures: Pilot collaboration in Europe.

    PubMed

    Perich, C; Ricós, C; Alvarez, V; Biosca, C; Boned, B; Cava, F; Doménech, M V; Fernández-Calle, P; Fernández-Fernández, P; García-Lario, J V; Minchinela, J; Simón, M; Jansen, R

    2014-05-15

    Current external quality assurance schemes have been classified into six categories, according to their ability to verify the degree of standardization of the participating measurement procedures. SKML (Netherlands) is a Category 1 EQA scheme (commutable EQA materials with values assigned by reference methods), whereas SEQC (Spain) is a Category 5 scheme (replicate analyses of non-commutable materials with no values assigned by reference methods). The results obtained by a group of Spanish laboratories participating in a pilot study organized by SKML are examined, with the aim of pointing out the improvements over our current scheme that a Category 1 program could provide. Imprecision and bias are calculated for each analyte and laboratory, and compared with quality specifications derived from biological variation. Of the 26 analytes studied, 9 had results comparable with those from reference methods, and 10 analytes did not have comparable results. The remaining 7 analytes measured did not have available reference method values, and in these cases, comparison with the peer group showed comparable results. The reasons for disagreement in the second group can be summarized as: use of non-standard methods (IFCC without exogenous pyridoxal phosphate for AST and ALT, Jaffé kinetic at low-normal creatinine concentrations and with eGFR); non-commutability of the reference material used to assign values to the routine calibrator (calcium, magnesium and sodium); use of reference materials without established commutability instead of reference methods for AST and GGT, and lack of a systematic effort by manufacturers to harmonize results. Results obtained in this work demonstrate the important role of external quality assurance programs using commutable materials with values assigned by reference methods to correctly monitor the standardization of laboratory tests with consequent minimization of risk to patients. Copyright © 2013 Elsevier B.V. All rights reserved.
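
    As general background on the quality specifications mentioned above, the desirable analytical performance specifications commonly derived from biological variation (Fraser's criteria) take the form below, with CV_I the within-subject and CV_G the between-subject biological variation; these are standard formulas quoted for illustration, not taken from the paper itself.

    ```latex
    % Desirable imprecision, bias and total allowable error derived from
    % biological variation (standard formulas, quoted for illustration):
    CV_{A} \le 0.5\,CV_{I}, \qquad
    B_{A}  \le 0.25\,\sqrt{CV_{I}^{2}+CV_{G}^{2}}, \qquad
    TE_{a} \le 1.65\,(0.5\,CV_{I}) + 0.25\,\sqrt{CV_{I}^{2}+CV_{G}^{2}}
    ```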

  12. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  13. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  14. Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics

    PubMed Central

    Hey, Jody; Nielsen, Rasmus

    2007-01-01

    In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to its probability. Because this likelihood is not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
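
    The sketch below is a deliberately minimal toy of the general strategy described above (sample a latent genealogy by MCMC while integrating a parameter out analytically); it is not the authors' isolation-with-migration model. Here the "genealogy" is collapsed to a single tree length L, the mutation count is Poisson(θL), and a Gamma prior on θ is integrated out in closed form so that Metropolis-Hastings only has to explore L. All names and numbers are invented for illustration.

    ```python
    import numpy as np
    from scipy.special import gammaln

    # Toy: mutations ~ Poisson(theta * L); theta ~ Gamma(a, rate=b) is integrated
    # out analytically, so MCMC only samples the "genealogy" summary L.
    rng = np.random.default_rng(0)
    k_obs = 12          # observed number of mutations (hypothetical)
    a, b = 2.0, 1.0     # Gamma prior on theta

    def log_marginal(L, k=k_obs):
        """log P(k | L) with theta integrated out (negative-binomial form)."""
        if L <= 0:
            return -np.inf
        return (gammaln(k + a) - gammaln(a) - gammaln(k + 1)
                + a * np.log(b) + k * np.log(L) - (k + a) * np.log(b + L))

    def log_prior(L):
        """Exponential(1) prior on tree length, standing in for the coalescent."""
        return -L if L > 0 else -np.inf

    L, samples = 1.0, []
    for _ in range(20000):                      # random-walk Metropolis over L
        L_new = L + rng.normal(scale=0.5)
        if np.log(rng.uniform()) < (log_marginal(L_new) + log_prior(L_new)
                                    - log_marginal(L) - log_prior(L)):
            L = L_new
        samples.append(L)

    print("posterior mean tree length:", round(float(np.mean(samples[5000:])), 3))
    ```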

  15. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  16. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  17. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  18. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work, a new modification of the standard addition method, called the net analyte signal standard addition method (NASSAM), is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept. The method can be applied to the determination of an analyte in the presence of known interferents. In contrast to the H-point standard addition method, the accuracy of the predictions does not depend on the shape of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.
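
    As a point of reference for readers unfamiliar with the underlying idea, the sketch below shows only the classical standard-addition extrapolation that NASSAM builds on, not the net analyte signal step itself; the concentrations and signals are invented.

    ```python
    import numpy as np

    # Classical standard addition: spike the sample with known analyte amounts,
    # fit signal vs. added concentration, and extrapolate to zero signal.
    c_added = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # hypothetical additions (ug/mL)
    signal  = np.array([0.42, 0.63, 0.85, 1.04, 1.27])  # hypothetical responses

    slope, intercept = np.polyfit(c_added, signal, 1)
    c_unknown = intercept / slope   # magnitude of the x-intercept
    print(f"estimated analyte concentration: {c_unknown:.2f} ug/mL")
    ```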

  19. Multi-parameter flow cytometry as a process analytical technology (PAT) approach for the assessment of bacterial ghost production.

    PubMed

    Langemann, Timo; Mayr, Ulrike Beate; Meitz, Andrea; Lubitz, Werner; Herwig, Christoph

    2016-01-01

    Flow cytometry (FCM) is a tool for the analysis of single-cell properties in a cell suspension. In this contribution, we present an improved FCM method for the assessment of E-lysis in Enterobacteriaceae. The result of the E-lysis process is empty bacterial envelopes, called bacterial ghosts (BGs), which constitute potential products in the pharmaceutical field. BGs have reduced light scattering properties compared with intact cells. In combination with viability information obtained by staining samples with the membrane potential-sensitive fluorescent dye bis-(1,3-dibutylbarbituric acid)trimethine oxonol (DiBAC4(3)), the presented method makes it possible to differentiate between populations of viable cells, dead cells, and BGs. Using a second fluorescent dye, RH414, as a membrane marker, non-cellular background was excluded from the data, which greatly improved the quality of the results. Using true volumetric absolute counting, the FCM data correlated well with cell count data obtained from colony-forming units (CFU) for viable populations. Applicability of the method to several Enterobacteriaceae (different Escherichia coli strains, Salmonella typhimurium, Shigella flexneri 2a) could be shown. The method was validated as a resilient process analytical technology (PAT) tool for the assessment of E-lysis and for particle counting during 20-L batch processes for the production of Escherichia coli Nissle 1917 BGs.

  20. Temporal abstraction-based clinical phenotyping with Eureka!

    PubMed

    Post, Andrew R; Kurc, Tahsin; Willard, Richie; Rathod, Himanshu; Mansour, Michel; Pai, Akshatha Kalsanka; Torian, William M; Agravat, Sanjay; Sturm, Suzanne; Saltz, Joel H

    2013-01-01

    Temporal abstraction, a method for specifying and detecting temporal patterns in clinical databases, is very expressive and performs well, but it is difficult for clinical investigators and data analysts to understand. Such patterns are critical in phenotyping patients from their medical records in research and quality improvement. We have previously developed the Analytic Information Warehouse (AIW), which computes such phenotypes using temporal abstraction but requires software engineers to use it. We have extended the AIW's web user interface, Eureka! Clinical Analytics, to support specifying phenotypes using an alternative model that we developed with clinical stakeholders. The software converts phenotypes from this model to that of temporal abstraction prior to data processing. The model can represent all phenotypes in a quality improvement project and a growing set of phenotypes in a multi-site research study. Phenotyping that is accessible to investigators and IT personnel may enable its broader adoption.

  1. Assessment of analytical quality in Nordic clinical chemistry laboratories using data from contemporary national programs.

    PubMed

    Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H

    1984-01-01

    The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.
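
    As an illustration of the imprecision components named above, the sketch below separates within-laboratory, between-laboratory, and total imprecision with a one-way random-effects decomposition of replicate control results; the data are synthetic and the layout is only assumed to resemble an EQA exercise, not the Nordic program's actual design.

    ```python
    import numpy as np

    # Synthetic EQA-style data: rows = laboratories, columns = replicate
    # results on the same control material (arbitrary units).
    results = np.array([
        [5.02, 5.10, 4.98],
        [5.25, 5.31, 5.22],
        [4.88, 4.95, 4.90],
        [5.11, 5.05, 5.14],
    ])
    n_lab, n_rep = results.shape
    lab_means, grand_mean = results.mean(axis=1), results.mean()

    # One-way random-effects ANOVA decomposition
    ms_within = ((results - lab_means[:, None]) ** 2).sum() / (n_lab * (n_rep - 1))
    ms_between = n_rep * ((lab_means - grand_mean) ** 2).sum() / (n_lab - 1)
    var_within = ms_within
    var_between = max((ms_between - ms_within) / n_rep, 0.0)

    cv = lambda v: 100 * np.sqrt(v) / grand_mean
    print(f"within-lab CV:  {cv(var_within):.2f}%")
    print(f"between-lab CV: {cv(var_between):.2f}%")
    print(f"total CV:       {cv(var_within + var_between):.2f}%")
    ```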

  2. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used, principal component analysis and partial least squares regression, including their advantages, common pitfalls and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
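
    To make the two workhorse methods named above concrete, the following is a minimal scikit-learn sketch on synthetic "process data"; the data, variable counts and model settings are placeholders chosen only for illustration.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic process data: 60 batches x 200 variables, plus a quality attribute y.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 200))
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)

    # PCA: unsupervised overview of batch-to-batch variability.
    scores = PCA(n_components=2).fit_transform(X)
    print("PCA scores shape:", scores.shape)

    # PLS: relate the process variables to the quality attribute.
    pls = PLSRegression(n_components=3)
    r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
    print("PLS cross-validated R^2:", round(r2.mean(), 2))
    ```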

  3. Improved analytical techniques of sulfur isotopic composition in nanomole quantities by MC-ICP-MS.

    PubMed

    Yu, Tsai-Luen; Wang, Bo-Shian; Shen, Chuan-Chou; Wang, Pei-Ling; Yang, Tsanyao Frank; Burr, George S; Chen, Yue-Gau

    2017-10-02

    We propose an improved method for precise sulfur isotopic measurements by multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS) in conjunction with a membrane desolvation nebulization system. The problems of sulfur loss through the membrane desolvation apparatus are carefully quantified and resolved. The method overcomes low intrinsic sulfur transmission through the instrument, which was initially 1% when operating at a desolvation temperature of 160 °C. Sulfur loss through the membrane desolvation apparatus was resolved by doping with sodium. A Na/S ratio of 2 mol mol-1 produced sulfur transmissions with 98% recovery. Samples of 3 nmol (100 ng) sulfur achieved an external precision of ±0.18‰ (2 SD) for δ34S and ±0.10‰ (2 SD) for Δ33S (uppercase delta expresses the extent of mass-independent isotopic fractionation). Measurements made on certified reference materials and in-house standards demonstrate analytical accuracy and reproducibility. We applied the method to examine microbial-induced sulfur transformation in marine sediment pore waters from the sulfate-methane transition zone. The technique is quite versatile, and can be applied to a range of materials, including natural waters and minerals. Copyright © 2017 Elsevier B.V. All rights reserved.
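
    For readers outside the isotope community, the δ and Δ notation used above is conventionally defined as follows; these are the standard community definitions (with 0.515 the commonly assumed mass-dependent exponent), not equations taken from the cited study.

    ```latex
    % Conventional sulfur isotope notation (values usually reported in permil):
    \delta^{34}\mathrm{S} =
      \left( \frac{({}^{34}\mathrm{S}/{}^{32}\mathrm{S})_{\mathrm{sample}}}
                  {({}^{34}\mathrm{S}/{}^{32}\mathrm{S})_{\mathrm{reference}}} - 1 \right) \times 1000

    \Delta^{33}\mathrm{S} = \delta^{33}\mathrm{S}
      - 1000\left[ \left(1 + \frac{\delta^{34}\mathrm{S}}{1000}\right)^{0.515} - 1 \right]
    ```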

  4. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    ERIC Educational Resources Information Center

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  5. Toward the use of surface modified activated carbon in speciation: selective preconcentration of selenite and selenate in environmental waters.

    PubMed

    Tsoi, Yeuk-Ki; Leung, Kelvin Sze-Yin

    2011-04-22

    This paper describes a novel application of tetrabutylammonium hydroxide-modified activated carbon (AC-TBAH) to the speciation of ultra-trace Se(IV) and Se(VI) using LC-ICP-DRC-MS. The anion exchange functionality immobilized onto the AC surface enables selective preconcentration of inorganic Se anions over a wide range of working pHs. Simultaneous retention and elution of both analytes, followed by analysis with LC-ICP-DRC-MS, allows speciation analysis in natural samples to be accomplished without complicated redox pre-treatment. The laboratory-made column of immobilized AC (0.4 g of sorbent packed in a 6 mL syringe barrel) achieved analyte enrichment factors of 76 and 93 for Se(IV) and Se(VI), respectively, proving its superior preconcentration efficiency and selectivity over common AC. The considerable enhancement in sensitivity achieved by using the preconcentration column improved the method's detection limits to 1.9-2.2 ng L(-1), a 100-fold improvement compared with direct injection. The analyte recoveries from a heavily polluted river matrix were between 95.3 and 107.7% with less than 5.0% RSD. The robustness of the preconcentration and speciation method was validated by analysis of natural waters collected from rivers and reservoirs in Hong Kong. The modified AC material is hence presented as a low-cost yet robust substitute for conventional anion exchange resins for routine applications. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.

    PubMed

    Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis

    2016-07-01

    Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description of the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scopes of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas for improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.

  7. A shipboard comparison of analytic methods for ballast water compliance monitoring

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed that require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the trip, including open ocean, North Sea, and coastal water, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms according to two size categories (≥ 50 μm in minimum dimension: 7 techniques, ≥ 10 μm and < 50 μm: 9 techniques). Water samples were analyzed in parallel to determine whether results were similar between methods and whether rapid, indicative methods offer comparable results to standard, time- and labor-intensive detailed methods (e.g. microscopy) and high-end scientific approaches (e.g. flow cytometry). Several promising indicative methods were identified that showed high correlation with microscopy, but allow much quicker processing and require less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. Results are useful to identify the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.

  8. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545

  9. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  10. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. To assure the quality of clinical studies, providing high-quality laboratory tests is mandatory. For adequate quality assurance of laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference intervals are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  11. Compensation for matrix effects in the gas chromatography-mass spectrometry analysis of 186 pesticides in tea matrices using analyte protectants.

    PubMed

    Li, Yan; Chen, Xi; Fan, Chunlin; Pang, Guofang

    2012-11-30

    A gas chromatography-mass spectrometry (GC-MS) analytical method was developed for simultaneously determining 186 pesticides in tea matrices using analyte protectants to counteract the matrix-induced effect. The matrix effects were evaluated for green, oolong and black tea, representing unfermented, partially fermented and completely fermented teas, respectively; depending on the type of tea, 72%, 94% and 94% of the pesticides showed a strong response-enhancement effect. Several analyte protectants, as well as certain combinations of these protectants, were evaluated to check their compensation effects. A mixture of triglycerol and d-ribonic acid-γ-lactone (both at 2 mg/mL in the injected samples) was found to be the most effective in improving the chromatographic behavior of the 186 pesticides. More than 96% of the 186 pesticides achieved recoveries within the range of 70-120% when using the selected mixture of analyte protectants. The simple addition of analyte protectants offers a more convenient solution for overcoming matrix effects, leaves fewer active sites than matrix-matched standardization, and can be an effective approach to compensate for matrix effects in the GC-MS analysis of pesticide residues. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Optimization of classification and regression analysis of four monoclonal antibodies from Raman spectra using collaborative machine learning approach.

    PubMed

    Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric

    2018-07-01

    The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. Analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to a conventional linear approach, prediction errors were reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP), which allowed solutions from about 300 data scientists to be combined in collaborative work. Using machine learning, the prediction of the four mAb samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% for the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three spectra were misclassified out of the 429 spectra of the test set. This large improvement obtained with machine learning techniques was uniform for all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined errors (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
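
    The following is a minimal illustration of the general strategy described above, jointly tuning spectral preprocessing and a predictive model by cross-validation with scikit-learn; it is not the RAMP challenge code, and the data, preprocessing and classifier choices are placeholders.

    ```python
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV

    # Synthetic stand-in for Raman spectra of 4 mAbs: 200 spectra x 500 wavenumbers.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 500))
    y = rng.integers(0, 4, size=200)
    X += y[:, None] * 0.05  # weak class-dependent offset so the toy problem is learnable

    # Preprocessing and classifier tuned jointly, mirroring the "data-driven"
    # strategy described in the abstract (choices here are placeholders).
    pipe = Pipeline([
        ("scale", StandardScaler()),
        ("clf", SVC()),
    ])
    grid = GridSearchCV(
        pipe,
        param_grid={"clf__C": [0.1, 1, 10], "clf__gamma": ["scale", 0.001]},
        cv=5,
    )
    grid.fit(X, y)
    print("best params:", grid.best_params_)
    print("cross-validated accuracy:", round(grid.best_score_, 3))
    ```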

  13. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

    In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified against the time-stepping finite element method, and the performance of the PMV machine predicted analytically is quantitatively compared with the finite element results. The analytical results agree well with the finite element method results. Finally, experimental results are given to further demonstrate the validity of the analysis.
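
    For context, analytical subdomain models of this kind typically build on the general separable solution of the 2-D Laplace equation in polar coordinates, shown below in generic notation; this is standard background, not the specific field solution derived in the paper.

    ```latex
    % General solution of Laplace's equation in a 2-D annular subdomain
    % (generic coefficients; boundary and interface conditions fix them):
    \nabla^{2} A(r,\theta) = 0
    \;\Longrightarrow\;
    A(r,\theta) = A_{0} + B_{0}\ln r
      + \sum_{n=1}^{\infty} \left( C_{n} r^{n} + D_{n} r^{-n} \right)
        \left( E_{n}\cos n\theta + F_{n}\sin n\theta \right)
    ```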

  14. Simultaneous determination of sixteen metabolites related to neural tube defects in maternal serum by liquid chromatography coupling with electrospray tandem mass spectrometry.

    PubMed

    Liang, Xiao-Ping; Liang, Qiong-Lin; Xia, Jian-Fei; Wang, Yong; Hu, Ping; Wang, Yi-Ming; Zheng, Xiao-Ying; Zhang, Ting; Luo, Guo-An

    2009-06-15

    Disturbances in maternal folate, homocysteine, and glutathione metabolism have been reported to be associated with neural tube defects (NTDs). However, the role played by specific components in the metabolic pathways leading to NTDs remains unclear. Thus, an analytical method for the simultaneous measurement of sixteen compounds involved in these three metabolic pathways by high performance liquid chromatography-tandem mass spectrometry was developed. The use of a hydrophilic chromatography column improved the separation of polar analytes, and the multiple-reaction monitoring (MRM) detection mode enhanced the specificity and sensitivity, allowing simultaneous determination of three classes of metabolites that differ widely in polarity and concentration. The influence of parameters such as temperature, pH, and flow rate on analyte performance was studied to establish optimal conditions. The method was validated for its linearity, accuracy, and precision, and was used for the analysis of serum samples from NTD-affected pregnancies and normal women. The results showed that the present method is sensitive and reliable for the simultaneous determination of as many as sixteen metabolites of interest, which may provide a new means to study the underlying mechanism of NTDs as well as to discover new potential biomarkers.

  15. Enhancement in the sensitivity of microfluidic enzyme-linked immunosorbent assays through analyte preconcentration.

    PubMed

    Yanagisawa, Naoki; Dutta, Debashis

    2012-08-21

    In this Article, we describe a microfluidic enzyme-linked immunosorbent assay (ELISA) method whose sensitivity can be substantially enhanced through preconcentration of the target analyte around a semipermeable membrane. The reported preconcentration has been accomplished in our current work via electrokinetic means allowing a significant increase in the amount of captured analyte relative to nonspecific binding in the trapping/detection zone. Upon introduction of an enzyme substrate into this region, the rate of generation of the ELISA reaction product (resorufin) was observed to increase by over a factor of 200 for the sample and 2 for the corresponding blank compared to similar assays without analyte trapping. Interestingly, in spite of nonuniformities in the amount of captured analyte along the surface of our analysis channel, the measured fluorescence signal in the preconcentration zone increased linearly with time over an enzyme reaction period of 30 min and at a rate that was proportional to the analyte concentration in the bulk sample. In our current study, the reported technique has been shown to reduce the smallest detectable concentration of the tumor marker CA 19-9 and Blue Tongue Viral antibody by over 2 orders of magnitude compared to immunoassays without analyte preconcentration. When compared to microwell based ELISAs, the reported microfluidic approach not only yielded a similar improvement in the smallest detectable analyte concentration but also reduced the sample consumption in the assay by a factor of 20 (5 μL versus 100 μL).

  16. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  17. Temperature dependence of nuclear fission time in heavy-ion fusion-fission reactions

    NASA Astrophysics Data System (ADS)

    Eccles, Chris; Roy, Sanil; Gray, Thomas H.; Zaccone, Alessio

    2017-11-01

    Accounting for viscous damping within Fokker-Planck equations led to various improvements in the understanding and analysis of nuclear fission of heavy nuclei. Analytical expressions for the fission time are typically provided by Kramers' theory, which improves on the Bohr-Wheeler estimate by including the time scale related to many-particle dissipative processes along the deformation coordinate. However, Kramers' formula breaks down for sufficiently high excitation energies, where Kramers' assumption of a large barrier no longer holds. Focusing on the overdamped regime for energies T > 1 MeV, Kramers' theory should be replaced by a new analytical theory derived from the Ornstein-Uhlenbeck first-passage time method that is proposed here. The theory is applied to fission time data from fusion-fission experiments on 16O+208Pb→224Th. The proposed model provides an internally consistent one-parameter fit of the fission data with a constant nuclear friction as the fitting parameter, whereas Kramers' fitting requires a value of friction that falls outside the allowed range. The theory also provides an analytical formula that in future work can easily be implemented in numerical codes such as CASCADE or JOANNE4.
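
    For reference, Kramers' escape rate in the overdamped (high-friction) limit, to which the abstract refers, has the standard textbook form below; the symbols are generic (barrier height E_b, friction γ, potential curvatures ω_0 and ω_b, temperature T), and the Ornstein-Uhlenbeck replacement proposed in the paper is not reproduced here.

    ```latex
    % Kramers' overdamped escape rate and the corresponding mean fission time
    % (textbook form, valid only for E_b >> T -- the assumption that fails at
    % high excitation energies):
    r_{K} = \frac{\omega_{0}\,\omega_{b}}{2\pi\gamma}\,
            \exp\!\left(-\frac{E_{b}}{T}\right),
    \qquad
    \tau_{\mathrm{fission}} \approx \frac{1}{r_{K}}
    ```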

  18. A sample preparation method for recovering suppressed analyte ions in MALDI TOF MS.

    PubMed

    Lou, Xianwen; de Waal, Bas F M; Milroy, Lech-Gustav; van Dongen, Joost L J

    2015-05-01

    In matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS), analyte signals can be substantially suppressed by other compounds in the sample. In this technical note, we describe a modified thin-layer sample preparation method that significantly reduces the analyte suppression effect (ASE). In our method, analytes are deposited on top of the surface of matrix preloaded on the MALDI plate. To prevent embedding of analyte into the matrix crystals, the sample solutions were prepared without matrix and care was taken not to re-dissolve the preloaded matrix. The results with model mixtures of peptides, synthetic polymers and lipids show that analyte ions that were completely suppressed using the conventional dried-droplet method could be effectively recovered using our method. Our findings suggest that the incorporation of analytes into the matrix crystals has an important contributory effect on ASE. By reducing ASE, our method should be useful for the direct MALDI MS analysis of multicomponent mixtures. Copyright © 2015 John Wiley & Sons, Ltd.

  19. An analysis of temperature-induced errors for an ultrasound distance measuring system. M. S. Thesis

    NASA Technical Reports Server (NTRS)

    Wenger, David Paul

    1991-01-01

    The presentation of research is provided in the following five chapters. Chapter 2 presents the necessary background information and definitions for general work with ultrasound and acoustics. It also discusses the basis for errors in the slant range measurements. Chapter 3 presents a method of problem solution and an analysis of the sensitivity of the equations to slant range measurement errors. It also presents various methods by which the error in the slant range measurements can be reduced to improve overall measurement accuracy. Chapter 4 provides a description of a type of experiment used to test the analytical solution and provides a discussion of its results. Chapter 5 discusses the setup of a prototype collision avoidance system, discusses its accuracy, and demonstrates various methods of improving the accuracy along with the improvements' ramifications. Finally, Chapter 6 provides a summary of the work and a discussion of conclusions drawn from it. Additionally, suggestions for further research are made to improve upon what has been presented here.
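
    As background for the temperature-induced errors discussed above, a commonly used approximation for the speed of sound in dry air and the resulting fractional range error is given below; this is a standard rule of thumb, assumed here for illustration, not the thesis's own error model.

    ```latex
    % Approximate speed of sound in dry air (T in degrees Celsius) and the
    % fractional slant-range error caused by a temperature misestimate dT:
    c(T) \approx 331.3 + 0.606\,T \quad [\mathrm{m\,s^{-1}}]
    \qquad
    \frac{\Delta d}{d} \approx \frac{\Delta c}{c} \approx \frac{0.606\,\Delta T}{c(T)}
    ```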

  20. Detecting a wide range of environmental contaminants in human blood samples--combining QuEChERS with LC-MS and GC-MS methods.

    PubMed

    Plassmann, Merle M; Schmidt, Magdalena; Brack, Werner; Krauss, Martin

    2015-09-01

    Exposure to environmental pollution and consumer products may result in an uptake of chemicals into human tissues. Several studies have reported the presence of diverse environmental contaminants in human blood samples. However, previously developed multi-target methods for the analysis of human blood include a fairly limited number of compounds stemming from one or two related compound groups. Thus, the sample preparation method QuEChERS (quick, easy, cheap, effective, rugged and safe) was tested for the extraction of 64 analytes covering a broad compound domain, followed by detection using liquid and gas chromatography coupled to mass spectrometry (LC- and GC-MS). Forty-seven analytes showed absolute recoveries above 70% in the first QuEChERS step, a simple liquid-liquid extraction (LLE) using acetonitrile and salt. The second QuEChERS step, a dispersive solid phase extraction, did not result in an overall improvement of recoveries or removal of background signals. Using solely the LLE step, eight analytes could subsequently be detected in human blood samples from the German Environmental Specimen Bank. Using an LC-multiple reaction monitoring (MRM) method with a triple quadrupole instrument, better recoveries were achieved than with an older LC-high-resolution (HR) MS full scan orbitrap instrument, which required a higher concentration factor of the extracts. However, HRMS full scan methods could be used for the retrospective detection of additional compounds.

  1. New trends in the analytical determination of emerging contaminants and their transformation products in environmental waters.

    PubMed

    Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R

    2013-06-01

    Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to understanding their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large information gap on the growing number of new potential contaminants that are appearing, and especially on their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows the presence of chemicals in the environment to be established, their concentration levels to be estimated, sources to be identified, and degradation pathways to be determined. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demands the application of wide-scope methods; the low concentrations at which these contaminants are present in the samples require high detection sensitivity; and high levels of confirmation and structural information are needed for the characterisation of unknowns. New developments in analytical instrumentation have been applied to solve these difficulties. No less important has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. The use of such sophisticated software tools has enabled successful screening analyses determining several hundred analytes, and has assisted in the timely structural elucidation of unknown compounds.

  2. Recalibration of blood analytes over 25 years in the Atherosclerosis Risk in Communities Study: The impact of recalibration on chronic kidney disease prevalence and incidence

    PubMed Central

    Parrinello, Christina M.; Grams, Morgan E.; Couper, David; Ballantyne, Christie M.; Hoogeveen, Ron C.; Eckfeldt, John H.; Selvin, Elizabeth; Coresh, Josef

    2016-01-01

    Background Equivalence of laboratory tests over time is important for longitudinal studies. Even a small systematic difference (bias) can result in substantial misclassification. Methods We selected 200 Atherosclerosis Risk in Communities Study participants attending all 5 study visits over 25 years. Eight analytes were re-measured in 2011–13 from stored blood samples from multiple visits: creatinine, uric acid, glucose, total cholesterol, HDL-cholesterol, LDL-cholesterol, triglycerides, and high-sensitivity C-reactive protein. Original values were recalibrated to re-measured values using Deming regression. Differences >10% were considered to reflect substantial bias, and correction equations were applied to affected analytes in the total study population. We examined trends in chronic kidney disease (CKD) pre- and post-recalibration. Results Repeat measures were highly correlated with original values (Pearson’s r>0.85 after removing outliers [median 4.5% of paired measurements]), but 2 of 8 analytes (creatinine and uric acid) had differences >10%. Original values of creatinine and uric acid were recalibrated to current values using correction equations. CKD prevalence differed substantially after recalibration of creatinine (visits 1, 2, 4 and 5 pre-recalibration: 21.7%, 36.1%, 3.5%, 29.4%; post-recalibration: 1.3%, 2.2%, 6.4%, 29.4%). For HDL-cholesterol, the current direct enzymatic method differed substantially from magnesium dextran precipitation used during visits 1–4. Conclusions Analytes re-measured in samples stored for ~25 years were highly correlated with original values, but two of the 8 analytes showed substantial bias at multiple visits. Laboratory recalibration improved reproducibility of test results across visits and resulted in substantial differences in CKD prevalence. We demonstrate the importance of consistent recalibration of laboratory assays in a cohort study. PMID:25952043
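
    The sketch below is a minimal Deming regression of the kind mentioned above, assuming equal error variances in both measurements (lambda = 1); the paired values are invented and this is not the ARIC study's recalibration code.

    ```python
    import numpy as np

    def deming(x, y, lam=1.0):
        """Deming regression of y on x; lam = ratio of the error variances
        (errors in y / errors in x). lam=1 assumes equal measurement errors."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
        sxy = np.cov(x, y, ddof=1)[0, 1]
        slope = ((syy - lam * sxx
                  + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2))
                 / (2 * sxy))
        return slope, y.mean() - slope * x.mean()

    # Hypothetical paired creatinine results (mg/dL): original visit vs. re-measured.
    original   = np.array([0.82, 0.95, 1.10, 1.32, 1.48, 1.75, 2.05])
    remeasured = np.array([0.70, 0.84, 0.99, 1.20, 1.37, 1.62, 1.93])

    slope, intercept = deming(original, remeasured)
    print(f"recalibration: corrected = {slope:.3f} * original + {intercept:+.3f}")
    ```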

  3. Personality, Cognitive Style, Motivation, and Aptitude Predict Systematic Trends in Analytic Forecasting Behavior.

    PubMed

    Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M

    2014-12-01

    The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures-personality, cognitive style, motivated cognition-predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.

  4. Personality, Cognitive Style, Motivation, and Aptitude Predict Systematic Trends in Analytic Forecasting Behavior

    PubMed Central

    Forlines, Clifton L.; Miller, Sarah M.; Regan, John R.; Irvine, John M.

    2014-01-01

    The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures—personality, cognitive style, motivated cognition—predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated “top-down” cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains. PMID:25983670

  5. Nanomaterial-Based Sensing and Biosensing of Phenolic Compounds and Related Antioxidant Capacity in Food.

    PubMed

    Della Pelle, Flavio; Compagnone, Dario

    2018-02-04

    Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value through their well-known health benefits, their technological role, and also their marketing appeal. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenol (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of the applications in real samples, in addition to the NMs themselves. In particular, the discussion focuses on methods and devices presenting, in the opinion of the authors, clear advances in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis, and are mature enough to be integrated into rapid food quality assessment in the lab or directly in the field.

  6. Nanomaterial-Based Sensing and Biosensing of Phenolic Compounds and Related Antioxidant Capacity in Food

    PubMed Central

    2018-01-01

    Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value through their well-known health benefits, their technological role, and also their marketing appeal. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenol (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of the applications in real samples, in addition to the NMs themselves. In particular, the discussion focuses on methods and devices presenting, in the opinion of the authors, clear advances in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis, and are mature enough to be integrated into rapid food quality assessment in the lab or directly in the field. PMID:29401719

  7. Proposal of Classification Method of Time Series Data in International Emissions Trading Market Using Agent-based Simulation

    NASA Astrophysics Data System (ADS)

    Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi

    This paper proposes a classification method based on Bayesian analysis for time series data from an agent-based simulation of the international emissions trading market, and compares it with a Discrete Fourier Transform (DFT) analytical method. The purpose is to demonstrate analytical methods that map time series data such as market prices into a space where they can be compared. These analyses revealed the following results: (1) the classification methods express the mapped time series data as distances, which are easier to understand and reason about than the raw time series; (2) the methods can analyze uncertain time series data produced by the agent-based simulation, including both stationary and non-stationary processes; and (3) the Bayesian analytical method can resolve a 1% difference in the agents' emission reduction targets.
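
    As a rough, hedged illustration of the DFT side of the comparison above (the Bayesian variant and the agent-based simulation itself are not reproduced), the sketch below maps synthetic price-like series to low-order DFT magnitude features and classifies them by distance to class centroids. All series, class names and parameters are invented for the example.

      import numpy as np

      def dft_features(series, n_coeffs=5):
          """Map a time series to the magnitudes of its first few DFT coefficients."""
          spectrum = np.fft.rfft(series - np.mean(series))
          return np.abs(spectrum[1:n_coeffs + 1])

      def nearest_centroid(train_feats, labels, test_feats):
          """Classify test feature vectors by distance to per-class centroids."""
          classes = sorted(set(labels))
          centroids = {c: np.mean([f for f, l in zip(train_feats, labels) if l == c], axis=0)
                       for c in classes}
          return [min(classes, key=lambda c: np.linalg.norm(f - centroids[c])) for f in test_feats]

      rng = np.random.default_rng(0)
      t = np.arange(200)
      # Two synthetic "market price" regimes: a slow oscillation and a faster one, both noisy.
      make_slow = lambda: 100 + 5 * np.sin(2 * np.pi * t / 100) + rng.normal(0, 0.5, t.size)
      make_fast = lambda: 100 + 5 * np.sin(2 * np.pi * t / 40) + rng.normal(0, 0.5, t.size)

      train = [dft_features(make_slow()) for _ in range(10)] + \
              [dft_features(make_fast()) for _ in range(10)]
      labels = ["slow"] * 10 + ["fast"] * 10
      test = [dft_features(make_slow()), dft_features(make_fast())]
      print(nearest_centroid(train, labels, test))   # expected: ['slow', 'fast']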

  8. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  9. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 23 2014-07-01 2014-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  10. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 24 2013-07-01 2013-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  11. 77 FR 41336 - Analytical Methods Used in Periodic Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... Methods Used in Periodic Reporting AGENCY: Postal Regulatory Commission. ACTION: Notice of filing. SUMMARY... proceeding to consider changes in analytical methods used in periodic reporting. This notice addresses... informal rulemaking proceeding to consider changes in the analytical methods approved for use in periodic...

  12. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 23 2011-07-01 2011-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  13. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 24 2012-07-01 2012-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  14. Recently published analytical methods for determining alcohol in body materials: alcohol countermeasures literature review

    DOT National Transportation Integrated Search

    1974-10-01

    The author has brought the review of published analytical methods for determining alcohol in body materials up to date. The review deals with analytical methods for alcohol in blood and other body fluids and tissues; breath alcohol methods; factors ...

  15. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  16. Modification of a successive corrections objective analysis for improved higher order calculations

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.

    1988-01-01

    The use of objectively analyzed fields of meteorological data for the initialization of numerical prediction models and for complex diagnostic studies places the requirement on the objective method that derivatives of the gridded fields be accurate and free from interpolation error. A modification was proposed for an objective analysis developed by Barnes that provides improvements in the analysis of both the field and its derivatives. Theoretical comparisons, comparisons between analyses of analytical monochromatic waves, and comparisons between analyses of actual weather data are used to show the potential of the new method. The new method restores more of the amplitudes of desired wavelengths while simultaneously filtering more of the amplitudes of undesired wavelengths. These results also hold for the first and second derivatives calculated from the gridded fields. The greatest improvements were for the Laplacian of the height field; the new method reduced the variance of undesirable very short wavelengths by 72 percent. Other improvements were found in the divergence of the gridded wind field and near the boundaries of the field of data.
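
    For context, a minimal sketch of the standard two-pass Barnes successive-correction analysis (the baseline that the record modifies; the modification itself is not reproduced here) is given below: a first Gaussian-weighted pass over scattered observations, followed by a correction pass with a narrowed weight function. The grid, station locations and the convergence parameter gamma are invented for the example.

      import numpy as np

      def barnes_pass(targets, obs_pts, obs_vals, kappa):
          """One Gaussian-weighted pass: weighted mean of the observations at each target point."""
          d2 = np.sum((targets[:, None, :] - obs_pts[None, :, :]) ** 2, axis=-1)
          w = np.exp(-d2 / kappa)
          return (w @ obs_vals) / np.sum(w, axis=1)

      def barnes_analysis(grid_pts, obs_pts, obs_vals, kappa=2.0, gamma=0.3):
          """Two-pass Barnes scheme: first guess plus one successive-correction pass."""
          first = barnes_pass(grid_pts, obs_pts, obs_vals, kappa)
          resid = obs_vals - barnes_pass(obs_pts, obs_pts, obs_vals, kappa)
          d2 = np.sum((grid_pts[:, None, :] - obs_pts[None, :, :]) ** 2, axis=-1)
          w = np.exp(-d2 / (gamma * kappa))                # narrower response on the second pass
          return first + (w @ resid) / np.sum(w, axis=1)

      rng = np.random.default_rng(1)
      obs_pts = rng.uniform(0, 10, size=(60, 2))           # scattered "stations"
      truth = lambda p: np.sin(p[:, 0]) + 0.5 * np.cos(p[:, 1])
      obs_vals = truth(obs_pts) + rng.normal(0, 0.05, 60)

      gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
      grid_pts = np.column_stack([gx.ravel(), gy.ravel()])
      analysis = barnes_analysis(grid_pts, obs_pts, obs_vals)
      print("RMS error on the grid:", np.sqrt(np.mean((analysis - truth(grid_pts)) ** 2)))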

  17. Improved Quantification of Free and Ester-Bound Gallic Acid in Foods and Beverages by UHPLC-MS/MS.

    PubMed

    Newsome, Andrew G; Li, Yongchao; van Breemen, Richard B

    2016-02-17

    Hydrolyzable tannins are measured routinely during the characterization of food and beverage samples. Most methods for the determination of hydrolyzable tannins use hydrolysis or methanolysis to convert complex tannins to small molecules (gallic acid, methyl gallate, and ellagic acid) for quantification by HPLC-UV. Analytical limitations and variability inherent in these approaches for the measurement of hydrolyzable tannins, though often unrecognized, include the variable mass fraction (0-0.90) that is released as analyte, contributions of sources other than tannins to hydrolyzable gallate (which can exceed 10% by weight), the measurement of both free and total analyte, and the lack of controls to account for degradation. An accurate, specific, sensitive, and higher-throughput approach for the determination of hydrolyzable gallate based on ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) that overcomes these limitations was developed.

  18. A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features.

    PubMed

    Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang

    2016-12-01

    Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.

  19. A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features

    NASA Astrophysics Data System (ADS)

    Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang

    2016-04-01

    Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.

  20. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    Doping issues have been immensely visible on the international stage over the past 12 months, and the complexity of doping controls has been reiterated on various occasions. Hence, analytical test methods continuously being updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.
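
    The record sizes shell and frame structure to the maximum loads at analysis stations along the inlet, with hammershock pressure reported as the critical case over much of it. Purely as a hedged back-of-the-envelope illustration (not the stiffened-shell analysis of the record), the snippet below sizes a thin cylindrical wall for hoop stress under an internal overpressure; every number is invented.

      # Thin-walled cylinder hoop stress: sigma = p * r / t, so minimum t = SF * p * r / sigma_allow.
      def min_shell_thickness(pressure_pa, radius_m, allowable_stress_pa, safety_factor=1.5):
          """Minimum wall thickness of a thin cylindrical shell under internal pressure."""
          return safety_factor * pressure_pa * radius_m / allowable_stress_pa

      # Invented example: 2.5 atm hammershock overpressure, 0.6 m inlet radius,
      # aluminium-like allowable stress of 280 MPa.
      t = min_shell_thickness(pressure_pa=2.5 * 101325, radius_m=0.6, allowable_stress_pa=280e6)
      print(f"required thickness is about {t * 1000:.2f} mm")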

  2. Numerical solution of the nonlinear Schrodinger equation by feedforward neural networks

    NASA Astrophysics Data System (ADS)

    Shirvany, Yazdan; Hayati, Mohsen; Moradian, Rostam

    2008-12-01

    We present a method to solve boundary value problems using artificial neural networks (ANN). A trial solution of the differential equation is written as a feed-forward neural network containing adjustable parameters (the weights and biases). From the differential equation and its boundary conditions we prepare the energy function, which is used in the back-propagation method with a momentum term to update the network parameters. We improved the energy function of the ANN, which is derived from the Schrodinger equation and the boundary conditions. With this improved energy function we can use an unsupervised training method in the ANN for solving the equation. Unsupervised training aims to minimize a non-negative energy function. We used the ANN method to solve the Schrodinger equation for a few quantum systems; eigenfunctions and energy eigenvalues were calculated. Our numerical results are in agreement with the corresponding analytical solutions and show the efficiency of the ANN method for solving eigenvalue problems.
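
    A minimal sketch of the unsupervised energy-minimization idea, assuming a one-dimensional infinite square well (V = 0 on [0, 1], hbar = m = 1): a tiny feed-forward trial function multiplied by x(1 - x) to enforce the boundary conditions, an energy functional approximated by the Rayleigh quotient on a grid, and a generic optimizer in place of back-propagation with momentum. This is not the authors' network or training scheme, only an illustration of minimizing an energy function to obtain an eigenvalue; the network size and grid are invented.

      import numpy as np
      from scipy.optimize import minimize

      x = np.linspace(0.0, 1.0, 401)
      dx = x[1] - x[0]
      n_hidden = 4                      # tiny feed-forward net: 1 -> n_hidden -> 1

      def trial_psi(params, x):
          """Trial wavefunction: network output times x(1 - x) so that psi(0) = psi(1) = 0."""
          w, b, a = np.split(params, 3)
          hidden = np.tanh(np.outer(x, w) + b)
          return (hidden @ a) * x * (1.0 - x)

      def energy(params):
          """Rayleigh quotient <psi|H|psi> / <psi|psi> for V = 0."""
          psi = trial_psi(params, x)
          dpsi = np.gradient(psi, dx)
          kinetic = 0.5 * np.sum(dpsi ** 2) * dx
          norm = np.sum(psi ** 2) * dx
          return kinetic / norm

      rng = np.random.default_rng(0)
      p0 = rng.normal(0.0, 1.0, 3 * n_hidden)
      result = minimize(energy, p0, method="Nelder-Mead", options={"maxiter": 5000})
      print("estimated ground-state energy:", result.fun)
      print("analytical value pi^2 / 2 =", np.pi ** 2 / 2)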

  3. Comparison of Multi-Criteria Decision Support Methods (AHP, TOPSIS, SAW & PROMENTHEE) for Employee Placement

    NASA Astrophysics Data System (ADS)

    Widianta, M. M. D.; Rizaldi, T.; Setyohadi, D. P. S.; Riskiawan, H. Y.

    2018-01-01

    The right decision in placing employees in an appropriate position in a company will support the quality of management and will have an impact on improving the quality of the company's human resources. Such decision-making can be assisted by a Decision Support System (DSS) to improve accuracy in the employee placement process. The purpose of this paper is to compare four Multi-Criteria Decision Making (MCDM) methods, i.e., Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Simple Additive Weighting (SAW), Analytic Hierarchy Process (AHP) and Preference Ranking Organization Method for Enrichment of Evaluations (PROMETHEE), for employee placement according to predetermined criteria. The ranking results and the accuracy obtained from each method differ, depending on the different scaling and weighting processes in each method.
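
    Of the four MCDM methods compared above, Simple Additive Weighting (SAW) is the most compact to show. The sketch below, with invented criterion weights and candidate scores, normalizes each benefit criterion by its column maximum and ranks candidates by the weighted sum; the other three methods differ mainly in how they normalize, weight and compare the alternatives.

      import numpy as np

      # Rows: candidate employees; columns: benefit criteria (e.g. test score, experience, interview).
      scores = np.array([
          [80.0, 5.0, 7.5],
          [70.0, 8.0, 8.0],
          [90.0, 3.0, 6.0],
      ])
      weights = np.array([0.5, 0.3, 0.2])        # invented criterion weights, summing to 1
      candidates = ["A", "B", "C"]

      normalized = scores / scores.max(axis=0)   # SAW normalization for benefit criteria
      saw_score = normalized @ weights
      ranking = sorted(zip(candidates, saw_score), key=lambda kv: kv[1], reverse=True)
      print(ranking)                             # here candidate B ranks first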

  4. FT-IR-cPAS—New Photoacoustic Measurement Technique for Analysis of Hot Gases: A Case Study on VOCs

    PubMed Central

    Hirschmann, Christian Bernd; Koivikko, Niina Susanna; Raittila, Jussi; Tenhunen, Jussi; Ojala, Satu; Rahkamaa-Tolonen, Katariina; Marbach, Ralf; Hirschmann, Sarah; Keiski, Riitta Liisa

    2011-01-01

    This article describes a new photoacoustic FT-IR system capable of operating at elevated temperatures. The key hardware component is an optical-readout cantilever microphone that can work up to 200 °C. All parts in contact with the sample gas, including the photoacoustic cell, were placed in a heated oven. The sensitivity of the photoacoustic system was tested by measuring 18 different VOCs. At 100 ppm gas concentration, the univariate signal-to-noise ratios (1σ, measurement time 25.5 min, at the highest peak, optical resolution 8 cm−1) of the spectra varied from 19 for o-xylene up to 329 for butyl acetate. The sensitivity can be improved by multivariate analyses over broad wavelength ranges, which effectively co-add the univariate sensitivities achievable at individual wavelengths. The multivariate limit of detection (3σ, 8.5 min, full useful wavelength range), i.e., the best possible inverse analytical sensitivity achievable at optimum calibration, was calculated using the SBC method and varied from 2.60 ppm for dichloromethane to 0.33 ppm for butyl acetate. Depending on the shape of the spectra, which often contain only a few sharp peaks, the multivariate analysis improved the analytical sensitivity by 2.2 to 9.2 times compared to the univariate case. Selectivity and multi-component capability were tested by an SBC calibration including 5 VOCs and water. The average cross-selectivities turned out to be less than 2%, and the resulting inverse analytical sensitivities of the 5 interfering VOCs were increased by a maximum factor of 2.2 compared to the single-component sensitivities. Water subtraction using SBC gave the true analyte concentration with a variation coefficient of 3%, although the sample spectra (methyl ethyl ketone, 200 ppm) contained water from 1,400 to 100k ppm and only one water spectrum (10k ppm) was used for the subtraction. The developed device shows a significant improvement over the current state-of-the-art measurement methods used in industrial VOC measurements. PMID:22163900
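
    The detection limits quoted above are 3-sigma figures tied to a measurement time and wavelength range; the SBC multivariate calibration itself is not reproduced here. A hedged univariate analogue of that figure of merit, with invented blank-noise and calibration-slope values, is sketched below: LOD = 3 x (standard deviation of the blank) / sensitivity.

      import numpy as np

      def limit_of_detection(blank_signals, sensitivity):
          """3-sigma univariate LOD: three times the blank noise divided by the calibration slope."""
          return 3.0 * np.std(blank_signals, ddof=1) / sensitivity

      # Invented numbers: repeated blank photoacoustic signals (a.u.) and a slope of 0.04 a.u. per ppm.
      blanks = np.array([0.012, 0.015, 0.011, 0.014, 0.013, 0.016, 0.012, 0.015])
      print(f"LOD is about {limit_of_detection(blanks, sensitivity=0.04):.2f} ppm")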

  5. Improving Learning Analytics--Combining Observational and Self-Report Data on Student Learning

    ERIC Educational Resources Information Center

    Ellis, Robert A.; Han, Feifei; Pardo, Abelardo

    2017-01-01

    The field of education technology is embracing a use of learning analytics to improve student experiences of learning. Along with exponential growth in this area is an increasing concern of the interpretability of the analytics from the student experience and what they can tell us about learning. This study offers a way to address some of the…

  6. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  7. 40 CFR 141.25 - Analytical methods for radioactivity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 24 2012-07-01 2012-07-01 false Analytical methods for radioactivity... § 141.25 Analytical methods for radioactivity. (a) Analysis for the following contaminants shall be conducted to determine compliance with § 141.66 (radioactivity) in accordance with the methods in the...

  8. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  9. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  10. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  11. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  12. Determination of vitamin C in foods: current state of method validation.

    PubMed

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities, during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and handicaps of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of the precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.
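
    Recovery, the accuracy figure cited above (typically 81-109%), is the amount found in a spiked sample minus the amount in the unspiked sample, divided by the amount added. A small hedged example with invented vitamin C values:

      def recovery_percent(found_spiked, found_unspiked, amount_added):
          """Spike recovery used in accuracy validation, expressed as a percentage."""
          return 100.0 * (found_spiked - found_unspiked) / amount_added

      # Invented example: a juice sample containing 42 mg/100 g vitamin C, spiked with
      # 50 mg/100 g and measured at 89 mg/100 g after spiking.
      print(f"recovery = {recovery_percent(89.0, 42.0, 50.0):.1f}%")   # 94.0%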

  13. Improved vertical streambed flux estimation using multiple diurnal temperature methods in series

    USGS Publications Warehouse

    Irvine, Dylan J.; Briggs, Martin A.; Cartwright, Ian; Scruggs, Courtney; Lautz, Laura K.

    2017-01-01

    Analytical solutions that use diurnal temperature signals to estimate vertical fluxes between groundwater and surface water based on either amplitude ratios (Ar) or phase shifts (Δϕ) produce results that rarely agree. Analytical solutions that simultaneously utilize Ar and Δϕ within a single solution have more recently been derived, decreasing uncertainty in flux estimates in some applications. Benefits of combined (ArΔϕ) methods also include that thermal diffusivity and sensor spacing can be calculated. However, poor identification of either Ar or Δϕ from raw temperature signals can lead to erratic parameter estimates from ArΔϕ methods. An add-on program for VFLUX 2 is presented to address this issue. Using thermal diffusivity selected from an ArΔϕ method during a reliable time period, fluxes are recalculated using an Ar method. This approach maximizes the benefits of the Ar and ArΔϕ methods. Additionally, sensor spacing calculations can be used to identify periods with unreliable flux estimates, or to assess streambed scour. Using synthetic and field examples, the use of these solutions in series was particularly useful for gaining conditions where fluxes exceeded 1 m/d.
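
    For context, a heavily hedged sketch of the amplitude-ratio (Ar) relationship that this family of methods builds on, in the form given by Hatch et al. (2006): the thermal front velocity v satisfies ln(Ar) = (dz / (2 ke)) * (v - sqrt((a + v^2) / 2)) with a = sqrt(v^4 + (8 pi ke / P)^2), and the vertical water flux follows from v scaled by the ratio of bulk to water volumetric heat capacity. This is not the VFLUX add-on described in the record, only the underlying relationship; the parameter values, sign convention and heat-capacity ratio below are assumptions and should be checked against the original references before any real use.

      import numpy as np
      from scipy.optimize import brentq

      def thermal_front_velocity(ar, dz, kappa_e, period=86400.0):
          """Solve the amplitude-ratio equation (after Hatch et al., 2006) for the thermal front velocity (m/s)."""
          alpha = lambda v: np.sqrt(v ** 4 + (8.0 * np.pi * kappa_e / period) ** 2)
          f = lambda v: np.log(ar) - (dz / (2.0 * kappa_e)) * (v - np.sqrt((alpha(v) + v ** 2) / 2.0))
          return brentq(f, -1e-3, 1e-3)        # very wide bracket, roughly +/- 86 m/day

      ar = 0.35          # invented amplitude ratio (deep amplitude / shallow amplitude)
      dz = 0.10          # invented sensor spacing (m)
      kappa_e = 7.0e-7   # invented effective thermal diffusivity (m^2/s)
      cap_ratio = 0.67   # assumed bulk-to-water volumetric heat capacity ratio

      v = thermal_front_velocity(ar, dz, kappa_e)
      q = v * cap_ratio  # Darcy flux; negative here means upward (gaining) flow in this convention
      print(f"thermal front velocity: {v * 86400:.2f} m/day, flux: {q * 86400:.2f} m/day")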

  14. SCI Identification (SCIDNT) program user's guide. [maximum likelihood method for linear rotorcraft models

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The computer program Linear SCIDNT which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.
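
    As a hedged, much-simplified analogue of the identification problem described above (not the SCIDNT code itself), the sketch below fits a discrete-time linear constant-coefficient model x[k+1] = A x[k] + B u[k] to simulated input/output time histories by least squares, which coincides with the maximum-likelihood estimate when the residuals are Gaussian and the states are measured directly. The system matrices, input signal and noise level are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      A_true = np.array([[0.95, 0.10], [-0.08, 0.90]])
      B_true = np.array([[0.0], [0.05]])

      # Simulate measured state and input time histories with small Gaussian noise.
      n_steps, x = 400, np.zeros(2)
      X, U = [], []
      for k in range(n_steps):
          u = np.array([np.sin(0.05 * k)])
          X.append(x)
          U.append(u)
          x = A_true @ x + B_true @ u + rng.normal(0, 1e-3, 2)
      X, U = np.array(X), np.array(U)

      # Regression x[k+1] = [A B] [x[k]; u[k]], solved by least squares.
      Z = np.hstack([X[:-1], U[:-1]])
      theta, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
      A_hat, B_hat = theta[:2].T, theta[2:].T
      print("estimated A:\n", np.round(A_hat, 3))
      print("estimated B:\n", np.round(B_hat, 3))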

  15. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moran, James; Alexander, Thomas; Aalseth, Craig

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  16. Finite-analytic numerical solution of heat transfer in two-dimensional cavity flow

    NASA Technical Reports Server (NTRS)

    Chen, C.-J.; Naseri-Neshat, H.; Ho, K.-S.

    1981-01-01

    Heat transfer in cavity flow is numerically analyzed by a new numerical method called the finite-analytic method. The basic idea of the finite-analytic method is the incorporation of local analytic solutions in the numerical solutions of linear or nonlinear partial differential equations. In the present investigation, the local analytic solutions for temperature, stream function, and vorticity distributions are derived. When the local analytic solution is evaluated at a given nodal point, it gives an algebraic relationship between a nodal value in a subregion and its neighboring nodal points. A system of algebraic equations is solved to provide the numerical solution of the problem. The finite-analytic method is used to solve heat transfer in the cavity flow at high Reynolds number (1000) for Prandtl numbers of 0.1, 1, and 10.

  17. Quantitative on-line preconcentration-liquid chromatography coupled with tandem mass spectrometry method for the determination of pharmaceutical compounds in water.

    PubMed

    Idder, Salima; Ley, Laurent; Mazellier, Patrick; Budzinski, Hélène

    2013-12-17

    One of the current environmental issues concerns the presence and fate of pharmaceuticals in water bodies, as these compounds may represent a potential environmental problem. The characterization of pharmaceutical contamination requires powerful analytical methods able to quantify these pollutants at very low concentrations (a few ng L(-1)). In this work, a multi-residue analytical methodology (on-line solid phase extraction-liquid chromatography-triple quadrupole mass spectrometry using positive and negative electrospray ionization) has been developed and validated for 40 multi-class pharmaceuticals and metabolites in tap and surface waters. This on-line SPE method was very convenient and efficient compared to the classical off-line SPE method because of its shorter total run time, including sample preparation, and its smaller sample volume (1 mL vs up to 1 L). The optimized method covers several therapeutic classes with various physicochemical properties, such as lipid regulators, antibiotics, beta-blockers, non-steroidal anti-inflammatories, antineoplastics, and others. Quantification has been achieved with internal standards. The limits of detection are between 0.7 and 15 ng L(-1) for drinking waters and 2-15 ng L(-1) for surface waters. The inter-day precision values are below 20% for each studied level. The robustness of the analytical method has been verified during a monitoring campaign of these 40 pharmaceuticals in the Isle River, a stream located in south-west France. During this survey, 16 pharmaceutical compounds were detected. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. TSOS and TSOS-FK hybrid methods for modelling the propagation of seismic waves

    NASA Astrophysics Data System (ADS)

    Ma, Jian; Yang, Dinghui; Tong, Ping; Ma, Xiao

    2018-05-01

    We develop a new time-space optimized symplectic (TSOS) method for numerically solving elastic wave equations in heterogeneous isotropic media. We use the phase-preserving symplectic partitioned Runge-Kutta method to evaluate the time derivatives and optimized explicit finite-difference (FD) schemes to discretize the space derivatives. We introduce the averaged medium scheme into the TSOS method to further increase its capability of dealing with heterogeneous media and match the boundary-modified scheme for implementing free-surface boundary conditions and the auxiliary differential equation complex frequency-shifted perfectly matched layer (ADE CFS-PML) non-reflecting boundaries with the TSOS method. A comparison of the TSOS method with analytical solutions and standard FD schemes indicates that the waveform generated by the TSOS method is more similar to the analytic solution and has a smaller error than other FD methods, which illustrates the efficiency and accuracy of the TSOS method. Subsequently, we focus on the calculation of synthetic seismograms for teleseismic P- or S-waves entering and propagating in the local heterogeneous region of interest. To improve the computational efficiency, we successfully combine the TSOS method with the frequency-wavenumber (FK) method and apply the ADE CFS-PML to absorb the scattered waves caused by the regional heterogeneity. The TSOS-FK hybrid method is benchmarked against semi-analytical solutions provided by the FK method for a 1-D layered model. Several numerical experiments, including a vertical cross-section of the Chinese capital area crustal model, illustrate that the TSOS-FK hybrid method works well for modelling waves propagating in complex heterogeneous media and remains stable for long-time computation. These numerical examples also show that the TSOS-FK method can tackle the converted and scattered waves of the teleseismic plane waves caused by local heterogeneity. Thus, the TSOS and TSOS-FK methods proposed in this study present an essential tool for the joint inversion of local, regional, and teleseismic waveform data.
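
    As a much-reduced, hedged analogue of the time-space splitting described above (second-order Stormer-Verlet/leapfrog time stepping, which is symplectic, combined with fourth-order central differences in space, applied to the 1D scalar wave equation rather than the elastic system and optimized operators of the record), the sketch below propagates a Gaussian pulse on a periodic domain and checks that the discrete energy stays nearly constant. Grid size, wave speed and the CFL number are invented.

      import numpy as np

      nx, c = 400, 1.0
      dx = 1.0 / nx
      dt = 0.4 * dx / c                    # invented CFL number, well inside the stability limit
      x = np.arange(nx) * dx

      def u_xx_4th(u, dx):
          """Fourth-order central difference for u_xx on a periodic grid."""
          return (-np.roll(u, 2) + 16 * np.roll(u, 1) - 30 * u
                  + 16 * np.roll(u, -1) - np.roll(u, -2)) / (12 * dx ** 2)

      # u_tt = c^2 u_xx written as the pair u_t = v, v_t = c^2 u_xx and advanced with
      # the symplectic kick-drift-kick (Stormer-Verlet / leapfrog) scheme.
      u = np.exp(-((x - 0.5) ** 2) / (2 * 0.02 ** 2))
      v = np.zeros_like(u)
      for _ in range(2000):
          v += 0.5 * dt * c ** 2 * u_xx_4th(u, dx)
          u += dt * v
          v += 0.5 * dt * c ** 2 * u_xx_4th(u, dx)

      energy = 0.5 * np.sum(v ** 2) * dx + 0.5 * c ** 2 * np.sum(np.gradient(u, dx) ** 2) * dx
      print("discrete energy after 2000 steps:", energy)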

  19. Evaluation of algorithms for point cloud surface reconstruction through the analysis of shape parameters

    NASA Astrophysics Data System (ADS)

    Cao, Lu; Verbeek, Fons J.

    2012-03-01

    In computer graphics and visualization, the reconstruction of a 3D surface from a point cloud is an important research area. Because the surface contains information that can be measured, i.e. expressed as features, surface reconstruction is potentially important for applications in bio-imaging; opportunities in this application area are the motivation for this study. In the past decade, a number of algorithms for surface reconstruction have been proposed. Generally speaking, these methods can be separated into two categories: explicit representation and implicit approximation. Most of the aforementioned methods are firmly based in theory; however, so far, no analytical comparison of these methods has been presented, and evaluation has typically relied on visual inspection. Through evaluation we search for a method that precisely preserves the surface characteristics and is robust in the presence of noise. The outcome will be used to improve reliability in the surface reconstruction of biological models. We therefore use an analytical approach by selecting features as surface descriptors and measuring these features under varying conditions. We selected surface distance, surface area and surface curvature as three major features with which to compare the quality of the surfaces created by the different algorithms. Our starting point has been ground-truth values obtained from analytical shapes such as the sphere and the ellipsoid. In this paper we present four classical surface reconstruction methods from the two categories mentioned above, i.e. the Power Crust, the Robust Cocone, the Fourier-based method and the Poisson reconstruction method. The results obtained from our experiments indicate that the Poisson reconstruction method performs best in the presence of noise.
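
    The evaluation strategy described above, comparing measured surface descriptors against ground-truth values from analytical shapes, can be illustrated with a small hedged sketch: triangulate a unit sphere from a latitude-longitude grid, sum the triangle areas, and compare with the analytical value 4*pi*r^2. The grid resolution is invented and no actual reconstruction algorithm is run.

      import numpy as np

      def sphere_mesh(n_theta=60, n_phi=120, r=1.0):
          """Triangulate a sphere of radius r from a latitude-longitude grid."""
          theta = np.linspace(0, np.pi, n_theta)
          phi = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
          T, P = np.meshgrid(theta, phi, indexing="ij")
          verts = np.stack([r * np.sin(T) * np.cos(P),
                            r * np.sin(T) * np.sin(P),
                            r * np.cos(T)], axis=-1).reshape(-1, 3)
          tris = []
          for i in range(n_theta - 1):
              for j in range(n_phi):
                  a, b = i * n_phi + j, i * n_phi + (j + 1) % n_phi
                  c, d = (i + 1) * n_phi + j, (i + 1) * n_phi + (j + 1) % n_phi
                  tris += [[a, b, c], [b, d, c]]
          return verts, np.array(tris)

      def mesh_area(verts, tris):
          """Total surface area as the sum of triangle areas (half the cross-product norms)."""
          p0, p1, p2 = verts[tris[:, 0]], verts[tris[:, 1]], verts[tris[:, 2]]
          return 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0), axis=1).sum()

      verts, tris = sphere_mesh()
      measured, analytic = mesh_area(verts, tris), 4 * np.pi
      print(f"mesh area {measured:.4f} vs analytic {analytic:.4f}")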

  20. The Acoustic Analogy: A Powerful Tool in Aeroacoustics with Emphasis on Jet Noise Prediction

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Doty, Michael J.; Hunter, Craig A.

    2004-01-01

    The acoustic analogy introduced by Lighthill to study jet noise is now over 50 years old. In the present paper, Lighthill's Acoustic Analogy is revisited, together with a brief evaluation of the state of the art of the subject and an exploration of possible further improvements in jet noise prediction from analytical methods, computational fluid dynamics (CFD) predictions, and measurement techniques. Experimental Particle Image Velocimetry (PIV) data are used both to evaluate turbulent statistics from Reynolds-averaged Navier-Stokes (RANS) CFD and to propose correlation models for the Lighthill stress tensor. The NASA Langley Jet3D code is used to study the effect of these models on jet noise prediction. In the analytical investigation, a retarded time correction is shown to reduce Jet3D's over-prediction of aft-arc jet noise by approximately 8 dB. In the experimental investigation, the PIV data agree well with the CFD mean flow predictions, with room for improvement in the Reynolds stress predictions. Initial modifications to the form of the Jet3D correlation model, suggested by the PIV data, showed no noticeable improvement in jet noise prediction.
