A simulation-based evaluation of methods for inferring linear barriers to gene flow
Christopher Blair; Dana E. Weigel; Matthew Balazik; Annika T. H. Keeley; Faith M. Walker; Erin Landguth; Sam Cushman; Melanie Murphy; Lisette Waits; Niko Balkenhol
2012-01-01
Different analytical techniques applied to the same data set may lead to different conclusions about the existence and strength of genetic structure. Reliable interpretation of the results therefore depends on the efficacy and reliability of the statistical methods used. In this paper, we evaluated the performance of multiple analytical methods to...
Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra
2018-02-01
The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate its efficacy and reliability. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. To date, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may serve as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), matrix effects, and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests
ERIC Educational Resources Information Center
Ebuoh, Casmir N.
2018-01-01
Literature revealed that the patterns/methods of scoring essay tests have been criticized for being unreliable, and this unreliability is likely to be greater in internal examinations than in external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…
Distributed Generation Interconnection Collaborative | NREL
, reduce paperwork, and improve customer service. Analytical Methods for Interconnection Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
One-year test-retest reliability of intrinsic connectivity network fMRI in older adults
Guo, Cong C.; Kurth, Florian; Zhou, Juan; Mayer, Emeran A.; Eickhoff, Simon B; Kramer, Joel H.; Seeley, William W.
2014-01-01
“Resting-state” or task-free fMRI can assess intrinsic connectivity network (ICN) integrity in health and disease, suggesting a potential use of these methods as disease-monitoring biomarkers. Numerous analytical options are available, including model-driven ROI-based correlation analysis and model-free independent component analysis (ICA). High test-retest reliability will be a necessary feature of a successful ICN biomarker, yet available reliability data remain limited. Here, we examined ICN fMRI test-retest reliability in 24 healthy older subjects scanned roughly one year apart. We focused on the salience network, a disease-relevant ICN not previously subjected to reliability analysis. Most ICN analytical methods proved reliable (intraclass correlation coefficients > 0.4) and could be further improved by wavelet analysis. Seed-based ROI correlation analysis showed high map-wise reliability, whereas graph theoretical measures and temporal concatenation group ICA produced the most reliable individual unit-wise outcomes. Including global signal regression in ROI-based correlation analyses reduced reliability. Our study provides a direct comparison between the most commonly used ICN fMRI methods and potential guidelines for measuring intrinsic connectivity in aging control and patient populations over time. PMID:22446491
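Test-retest reliability of the kind reported above is conventionally quantified with an intraclass correlation coefficient. As a minimal sketch (not the authors' pipeline; the function name and ICC(2,1) variant are illustrative choices), a two-way random-effects ICC can be computed from a subjects-by-sessions matrix:

```python
import numpy as np

def icc_2_1(y):
    """ICC(2,1), two-way random effects, from a (subjects x sessions) array."""
    n, k = y.shape
    grand = y.mean()
    row_means = y.mean(axis=1)   # per-subject means
    col_means = y.mean(axis=0)   # per-session means
    ss_total = ((y - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)            # between-subjects mean square
    msc = ss_cols / (k - 1)            # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1)) # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

A value above 0.4, as in the study's threshold, would indicate fair-to-good reliability under common rules of thumb.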
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
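For the simplest case of normally distributed, independent strength and load, the failure probability targeted by such numerical integration has a closed form; the following is a hedged illustration with hypothetical numbers, not the report's composite-specific analysis:

```python
import math

def reliability_index(mu_s, sd_s, mu_l, sd_l):
    """Safety margin M = strength - load; for independent normal variables,
    beta = E[M] / SD[M] and P_f = Phi(-beta)."""
    beta = (mu_s - mu_l) / math.sqrt(sd_s ** 2 + sd_l ** 2)
    p_fail = 0.5 * math.erfc(beta / math.sqrt(2))  # standard normal CDF at -beta
    return beta, p_fail
```

Scatter in material strength and applied load enters through the two standard deviations; non-normal scatter is what forces the numerical integration described in the abstract.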
Reliability of proton NMR spectroscopy for the assessment of frying oil oxidation
USDA-ARS?s Scientific Manuscript database
Although there are many analytical methods developed to assess oxidation of edible oil, it is still common to see a lack of consistency in results from different methods. This inconsistency is expected since there are numerous oxidation products and any analytical method measuring only one kind of o...
The NMR analysis of frying oil: a very reliable method for assessment of lipid oxidation
USDA-ARS?s Scientific Manuscript database
There are many analytical methods developed for the assessment of lipid oxidation. However, one of the most challenging issues in analyzing oil oxidation is that there is a lack of consistency in results obtained from different analytical methods. The major reason for the inconsistency is that most me...
Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai
2013-01-01
This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second-moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
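The FOSM idea mentioned above can be sketched in a few lines: propagate the mean and standard deviation of an RUL function through a first-order Taylor expansion. This is a generic sketch assuming independent inputs and a numerical gradient, not the paper's battery model:

```python
import numpy as np

def fosm(g, mu, sigma, h=1e-6):
    """First-order second-moment approximation of the mean and standard
    deviation of g(X), with X independent and characterized by mu, sigma."""
    mu = np.asarray(mu, dtype=float)
    grad = np.empty_like(mu)
    for i in range(mu.size):
        e = np.zeros_like(mu)
        e[i] = h
        grad[i] = (g(mu + e) - g(mu - e)) / (2 * h)  # central difference
    var = np.sum((grad * np.asarray(sigma, dtype=float)) ** 2)
    return g(mu), np.sqrt(var)
```

For linear g with Gaussian inputs this is exact, matching the special case the abstract notes; for nonlinear models it is the cheap approximation that makes online use attractive.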
2014-01-01
Background As a part of the longitudinal Chronic Obstructive Pulmonary Disease (COPD) study, Subpopulations and Intermediate Outcome Measures in COPD study (SPIROMICS), blood samples are being collected from 3200 subjects with the goal of identifying blood biomarkers for sub-phenotyping patients and predicting disease progression. To determine the most reliable sample type for measuring specific blood analytes in the cohort, a pilot study was performed from a subset of 24 subjects comparing serum, Ethylenediaminetetraacetic acid (EDTA) plasma, and EDTA plasma with proteinase inhibitors (P100™). Methods 105 analytes, chosen for potential relevance to COPD, arranged in 12 multiplex and one simplex platform (Myriad-RBM) were evaluated in duplicate from the three sample types from 24 subjects. The reliability coefficient and the coefficient of variation (CV) were calculated. The performance of each analyte and mean analyte levels were evaluated across sample types. Results 20% of analytes were not consistently detectable in any sample type. Higher reliability and/or smaller CV were determined for 12 analytes in EDTA plasma compared to serum, and for 11 analytes in serum compared to EDTA plasma. While reliability measures were similar for EDTA plasma and P100 plasma for a majority of analytes, CV was modestly increased in P100 plasma for eight analytes. Each analyte within a multiplex produced independent measurement characteristics, complicating selection of sample type for individual multiplexes. Conclusions There were notable detectability and measurability differences between serum and plasma. Multiplexing may not be ideal if large reliability differences exist across analytes measured within the multiplex, especially if values differ based on sample type. For some analytes, the large CV should be considered during experimental design, and the use of duplicate and/or triplicate samples may be necessary. 
These results should prove useful for studies evaluating selection of samples for evaluation of potential blood biomarkers. PMID:24397870
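A coefficient of variation from duplicate measurements, as evaluated in this pilot, is often summarized with a pooled root-mean-square formula. The helper below is a minimal sketch; the name and the specific pooling choice are illustrative, not taken from the study:

```python
import numpy as np

def duplicate_cv(a, b):
    """Pooled %CV from paired duplicate measurements a[i], b[i]."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    sd = np.abs(a - b) / np.sqrt(2)  # SD estimated from each duplicate pair
    mean = (a + b) / 2
    return 100 * np.sqrt(np.mean((sd / mean) ** 2))
```

A large CV by this kind of measure is what would motivate the duplicate/triplicate sampling the authors recommend.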
ERIC Educational Resources Information Center
Helms, LuAnn Sherbeck
This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…
García-Blanco, Ana; Peña-Bautista, Carmen; Oger, Camille; Vigor, Claire; Galano, Jean-Marie; Durand, Thierry; Martín-Ibáñez, Nuria; Baquero, Miguel; Vento, Máximo; Cháfer-Pericás, Consuelo
2018-07-01
Lipid peroxidation plays an important role in Alzheimer Disease, so corresponding metabolites found in urine samples could be potential biomarkers. The aim of this work is to develop a reliable ultra-performance liquid chromatography-tandem mass spectrometry analytical method to determine a new set of lipid peroxidation compounds in urine samples. Excellent sensitivity was achieved, with limits of detection between 0.08 and 17 nmol L⁻¹, which renders this method suitable for monitoring analyte concentrations in real samples. The method's precision was satisfactory, with coefficients of variation around 5-17% (intra-day) and 8-19% (inter-day). The accuracy of the method was assessed by analysis of spiked urine samples, obtaining recoveries between 70% and 120% for most of the analytes. The utility of the described method was tested by analyzing urine samples from patients diagnosed early with mild cognitive impairment or mild dementia due to Alzheimer Disease following standard clinical criteria. As preliminary results, some analytes (17(RS)-10-epi-SC-Δ15-11-dihomo-IsoF, PGE2) and total parameters (neuroprostanes, isoprostanes, isofurans) show differences between the control and clinical groups. These analytes could thus be potential early Alzheimer Disease biomarkers assessing the patients' pro-oxidant condition. Copyright © 2018 Elsevier B.V. All rights reserved.
Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.
The National Shipbuilding Research Program. Environmental Studies and Testing (Phase V)
2000-11-20
development of an analytical procedure for toxic organic compounds, including TBT (tributyltin), whose turnaround time would be in the order of minutes... Cost of the Subtask was $20,000. Subtask #33 - Turnaround Analytical Method for TBT This Subtask performed a preliminary investigation leading to the... "Quick TBT Analytical Method" that will yield reliable results in 15 minutes, a veritable breakthrough in sampling technology. The Subtask was managed by
Binding, N; Schilder, K; Czeschinski, P A; Witting, U
1998-08-01
The 2,4-dinitrophenylhydrazine (2,4-DNPH) derivatization method mainly used for the determination of airborne formaldehyde was extended for acetaldehyde, acetone, 2-butanone, and cyclohexanone, the next four carbonyl compounds of industrial importance. Sampling devices and sampling conditions were adjusted for the respective limit value regulations. Analytical reliability criteria were established and compared to those of other recommended methods. With a minimum analytical range from one tenth to the 3-fold limit value in all cases and with relative standard deviations below 5%, the adjusted method meets all requirements for the reliable quantification of the four compounds in workplace air as well as in ambient air.
Yun, Changhong; Dashwood, Wan-Mohaiza; Kwong, Lawrence N; Gao, Song; Yin, Taijun; Ling, Qinglan; Singh, Rashim; Dashwood, Roderick H; Hu, Ming
2018-01-30
An accurate and reliable UPLC-MS/MS method is reported for the quantification of endogenous prostaglandin E2 (PGE2) in rat colonic mucosa and polyps. This method adopted the "surrogate analyte plus authentic bio-matrix" approach, using two different stable isotope-labeled analogs: PGE2-d9 as the surrogate analyte and PGE2-d4 as the internal standard. A quantitative standard curve was constructed with the surrogate analyte in colonic mucosa homogenate, and the method was successfully validated with the authentic bio-matrix. Concentrations of endogenous PGE2 in both normal and inflammatory tissue homogenates were back-calculated based on the regression equation. Because there is no endogenous interference with determination of the surrogate analyte, specificity was particularly good. By using authentic bio-matrix for validation, the matrix effect and extraction recovery are identical for the quantitative standard curve and the actual samples, which notably increased the assay accuracy. The method is easy, fast, robust and reliable for colon PGE2 determination. This "surrogate analyte" approach was applied to measure PGE2, one of the strongest biomarkers of colorectal cancer, in the mucosa and polyps of the Pirc rat (an Apc-mutant rat kindred that models human FAP). A similar concept could be applied to endogenous biomarkers in other tissues. Copyright © 2017 Elsevier B.V. All rights reserved.
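Back-calculation from a surrogate-analyte calibration curve, as described above, reduces to inverting the fitted regression equation. A minimal sketch with hypothetical standards (the helper name and numbers are illustrative):

```python
import numpy as np

def back_calculate(response, conc_std, resp_std):
    """Fit a line to surrogate-analyte standards (response vs concentration),
    then invert it to back-calculate concentration from a measured response."""
    slope, intercept = np.polyfit(conc_std, resp_std, 1)
    return (response - intercept) / slope
```

In the published approach the standards are the isotope-labeled surrogate spiked into authentic matrix, so the same slope and intercept apply to the endogenous analyte.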
Interim reliability evaluation program, Browns Ferry fault trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, M.E.
1981-01-01
An abbreviated fault tree method is used to evaluate and model Browns Ferry systems in the Interim Reliability Evaluation Program, simplifying the recording and displaying of events while maintaining the scheme for identifying faults. The level of investigation is not changed, and the analytical thought process inherent in the conventional method is not compromised, but the abbreviated method takes less time and makes the fault modes much more visible.
Method Development in Forensic Toxicology.
Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona
2017-01-01
In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high-quality analytical methods is thorough method development. This article provides an overview of the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g., qualitative vs. quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems, and establishing a versatile sample preparation. Method development concludes with an optimization process, after which the new method is subject to method validation. Copyright © Bentham Science Publishers.
Tsikas, Dimitrios
2017-07-15
Tyrosine and tyrosine residues in proteins are attacked by the reactive oxygen and nitrogen species peroxynitrite (O=N-OO⁻) to generate 3-nitrotyrosine (3-NT) and 3-nitrotyrosine proteins (3-NTProt), respectively. 3-NT and 3-NTProt are widely accepted as biomarkers of nitr(os)ative stress. Over the years, many different analytical methods have been reported for 3-NT and 3-NTProt. Reported concentrations often differ by more than three orders of magnitude, indicative of serious analytical problems. Strategies to overcome pre-analytical and analytical shortcomings and pitfalls have been proposed. The present review investigated whether recently published work on the quantitative measurement of biological 3-nitrotyrosine adequately considered the analytical past of this biomolecule. 3-Nitrotyrosine was taken as a representative of biomolecules that occur in biological samples in the pM-to-nM concentration range. This examination revealed that in many cases the main protagonists involved in the publication of scientific work, i.e., authors, reviewers and editors, failed to do so. Learning from the analytical history of 3-nitrotyrosine means advancing analytical and biological science and implies the following key issues. (1) Choosing the most reliable analytical approach in terms of sensitivity and accuracy; presently, this is best achieved by stable-isotope dilution tandem mass spectrometry coupled with gas chromatography (GC-MS/MS) or liquid chromatography (LC-MS/MS). (2) Minimizing artificial formation of 3-nitrotyrosine during sample work-up, a major pitfall in 3-nitrotyrosine analysis. (3) Validating the final method adequately in the intended biological matrix and the established concentration range. (4) Inviting experts in the field for critical evaluation of the novelty and reliability of the proposed analytical method, placing special emphasis on the compliance of the analytical outcome with 3-nitrotyrosine concentrations obtained by validated GC-MS/MS and LC-MS/MS methods. Copyright © 2017 Elsevier B.V. All rights reserved.
Dynamic one-dimensional modeling of secondary settling tanks and system robustness evaluation.
Li, Ben; Stenstrom, M K
2014-01-01
One-dimensional secondary settling tank models are widely used in current engineering practice for design and optimization, and usually can be expressed as a nonlinear hyperbolic or nonlinear strongly degenerate parabolic partial differential equation (PDE). Reliable numerical methods are needed to produce approximate solutions that converge to the exact analytical solutions. In this study, we introduced a reliable numerical technique, the Yee-Roe-Davis (YRD) method as the governing PDE solver, and compared its reliability with the prevalent Stenstrom-Vitasovic-Takács (SVT) method by assessing their simulation results at various operating conditions. The YRD method also produced a similar solution to the previously developed Method G and Enquist-Osher method. The YRD and SVT methods were also used for a time-to-failure evaluation, and the results show that the choice of numerical method can greatly impact the solution. Reliable numerical methods, such as the YRD method, are strongly recommended.
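The schemes compared above are specialized conservative solvers for the settling PDE. As the simplest hedged illustration of what a conservative explicit update for u_t + f(u)_x = 0 looks like, here is a Lax-Friedrichs step (a generic textbook scheme with periodic boundaries for brevity, not the YRD or SVT method itself):

```python
import numpy as np

def lax_friedrichs_step(u, flux, dx, dt):
    """One explicit Lax-Friedrichs step for u_t + f(u)_x = 0 on a periodic grid."""
    up = np.roll(u, -1)  # u[i+1]
    um = np.roll(u, 1)   # u[i-1]
    return 0.5 * (up + um) - dt / (2 * dx) * (flux(up) - flux(um))
```

Schemes of this family converge to the physically correct (entropy) solution for nonlinear fluxes, which is exactly the property the study tests when comparing solvers under stressed operating conditions.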
High Throughput Determination of Tetramine in Drinking ...
The sampling and analytical procedure (SAP) presented herein describes a method for the high-throughput determination of tetramethylene disulfotetramine in drinking water by solid phase extraction and isotope dilution gas chromatography/mass spectrometry. This method, which will be included in the SAM, is expected to provide the Water Laboratory Alliance, as part of EPA's Environmental Response Laboratory Network, with a more reliable and faster means of analyte collection and measurement.
Designing Glass Panels for Economy and Reliability
NASA Technical Reports Server (NTRS)
Moore, D. M.
1983-01-01
Analytical method determines probability of failure of rectangular glass plates subjected to uniformly distributed loads such as those from wind, earthquake, snow, and deadweight. Developed as aid in design of protective glass covers for solar-cell arrays and solar collectors, method is also useful in estimating the reliability of large windows in buildings exposed to high winds and is adapted to nonlinear stress analysis of simply supported plates of any elastic material.
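Probability-of-failure estimates for brittle glass are commonly built on Weibull strength statistics. The two-parameter form below is a minimal sketch with hypothetical parameters, illustrating the kind of quantity such a design method reports rather than the report's exact nonlinear-stress formulation:

```python
import math

def weibull_failure_probability(stress, scale, shape):
    """Two-parameter Weibull model for brittle failure:
    P_f = 1 - exp(-(sigma / sigma_0) ** m)."""
    return 1.0 - math.exp(-((stress / scale) ** shape))
```

Low shape parameters (wide strength scatter) are characteristic of glass and are what make a probabilistic rather than deterministic design criterion necessary.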
Determination of glycerophosphate and other anions in dentifrices by ion chromatography.
Chen, Yongxin; Ye, Mingli; Cui, Hairong; Wu, Feiyan; Zhu, Yan; Fritz, James S
2006-06-16
Simple, reliable and sensitive analytical methods to determine anticaries-related anions such as fluoride, monofluorophosphate and glycerophosphate are necessary for basic investigations of anticaries agents and for quality control of dentifrices. A method for the simultaneous determination of organic acids, organic anions and inorganic anions in samples of commercial toothpaste is proposed. Nine anions (fluoride, chloride, nitrite, nitrate, sulfate, phosphate, monofluorophosphate, glycerophosphate and oxalic acid) were analyzed by means of ion chromatography using gradient elution with KOH as the mobile phase, an IonPac AS18 separation column and suppressed conductivity detection. Optimized analytical conditions were further validated in terms of accuracy, precision and total uncertainty, and the results showed the reliability of the IC method. The relative standard deviations (RSD) of the retention time and peak area of all species were less than 0.170% and 1.800%, respectively. The correlation coefficients for target analytes ranged from 0.9985 to 0.9996. The detection limit (signal-to-noise ratio of 3:1) of this method was at the low-ppb level (<15 ppb). The spiked recoveries for the anions were 96-103%. The method was applied to toothpaste without interferences.
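A detection limit quoted at a signal-to-noise ratio of 3:1, as above, can be estimated from the calibration slope and baseline noise. A minimal sketch with hypothetical numbers (not the paper's data):

```python
def detection_limit(slope, noise_sd, snr=3.0):
    """Concentration at which the expected signal equals snr * baseline noise
    for a linear calibration signal = slope * concentration."""
    return snr * noise_sd / slope
```

With a steeper calibration slope or quieter baseline, the limit drops proportionally, which is how suppressed conductivity detection reaches low-ppb limits.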
Analytical studies on the Benney-Luke equation in mathematical physics
NASA Astrophysics Data System (ADS)
Islam, S. M. Rayhanul; Khan, Kamruzzaman; Woadud, K. M. Abdul Al
2018-04-01
The enhanced (G′/G)-expansion method offers wide applicability for handling nonlinear wave equations. In this article, we find new exact traveling wave solutions of the Benney-Luke equation by using the enhanced (G′/G)-expansion method. This is a useful, reliable, and concise method for solving nonlinear evolution equations (NLEEs). The traveling wave solutions are expressed in terms of hyperbolic and trigonometric functions. We also plot 2D and 3D graphics of some of the analytical solutions obtained in this paper.
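In the basic (G′/G)-expansion method (sketched here in its standard form; the "enhanced" variant modifies this ansatz), a traveling-wave solution is sought as a polynomial in G′/G:

```latex
u(\xi) = \sum_{i=0}^{N} a_i \left( \frac{G'(\xi)}{G(\xi)} \right)^{i},
\qquad \xi = x - ct,
\qquad G'' + \lambda G' + \mu G = 0 .
```

Balancing the highest-order derivative term against the strongest nonlinearity fixes N; substituting the ansatz and solving the resulting algebraic system for the coefficients a_i, λ, μ and the wave speed c yields hyperbolic-function solutions when λ² − 4μ > 0 and trigonometric ones when λ² − 4μ < 0, matching the two solution families reported in the abstract.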
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. The method is illustrated by application to a peristaltic pump system.
López-García, Ester; Mastroianni, Nicola; Postigo, Cristina; Valcárcel, Yolanda; González-Alonso, Silvia; Barceló, Damia; López de Alda, Miren
2018-04-15
This work presents a fast, sensitive and reliable multi-residue methodology based on fat and protein precipitation and liquid chromatography-tandem mass spectrometry for the determination of common legal and illegal psychoactive drugs, and major metabolites, in breast milk. One-fourth of the 40 target analytes is investigated for the first time in this biological matrix. The method was validated in breast milk and also in various types of bovine milk, as tranquilizers are occasionally administered to food-producing animals. Absolute recoveries were satisfactory for 75% of the target analytes. The use of isotopically labeled compounds assisted in correcting analyte losses due to ionization suppression matrix effects (higher in whole milk than in the other investigated milk matrices) and ensured the reliability of the results. Average method limits of quantification ranged between 0.4 and 6.8 ng/mL. Application of the developed method showed the presence of caffeine in breast milk samples (12-179 ng/mL). Copyright © 2017 Elsevier Ltd. All rights reserved.
Pang, Susan; Cowen, Simon
2017-12-13
We describe a novel generic method to derive the unknown endogenous concentration of an analyte within complex biological matrices (e.g. serum or plasma), based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log-transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated using a numerical method. This approach obviates conventional relative quantification against an internal standard curve and the need for calibrant diluent, and takes into account the individual matrix interference on the immunoassay by spiking the test sample itself. The technique is based on standard additions for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.
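The standard-additions principle this method builds on can be sketched in a few lines: spike the sample itself with known amounts of analyte, fit signal against added concentration, and recover the endogenous level from the x-intercept. This is a minimal illustration with invented numbers, not the authors' log-log sigmoid procedure.

```python
def standard_additions(added, signal):
    """Least-squares line through (added, signal); endogenous conc = -(x-intercept)."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signal) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(added, signal)) / \
            sum((x - mx) ** 2 for x in added)
    intercept = my - slope * mx
    return intercept / slope  # endogenous concentration

# Four reaction wells, as in the abstract: sample + 0, 10, 20, 30 units of analyte
added  = [0.0, 10.0, 20.0, 30.0]
signal = [5.0, 10.0, 15.0, 20.0]   # a perfectly linear response, for illustration
print(standard_additions(added, signal))  # -> 10.0 (endogenous concentration)
```

Because the sample itself is spiked, the slope already reflects the sample's own matrix interference, which is the point the abstract makes.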
An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...
Pérez-Lozano, P; García-Montoya, E; Orriols, A; Miñarro, M; Ticó, J R; Suñé-Negre, J M
2005-10-04
A new RP-HPLC method has been developed and validated for the simultaneous determination of benzocaine, two preservatives (propylparaben (nipasol) and benzyl alcohol) and degradation products of benzocaine in a semisolid pharmaceutical dosage form (benzocaine gel). The method uses a Nucleosil 120 C18 column and gradient elution. The mobile phase consisted of a mixture of methanol and glacial acetic acid (10%, v/v) at different proportions according to a time-schedule programme, pumped at a flow rate of 2.0 ml min(-1). The DAD detector was set at 258 nm. The validation study was carried out in accordance with the ICH guidelines to demonstrate that the new analytical method meets the required reliability characteristics and maintains, over time, the fundamental validation criteria: selectivity, linearity, precision, accuracy and sensitivity. The method was applied during the quality control of benzocaine gel to quantify the drug (benzocaine), preservatives and degradation products, and proved to be a rapid and reliable quality control method.
Differential reliability: probabilistic engineering applied to wood members in bending-tension
Stanley K. Suddarth; Frank E. Woeste; William L. Galligan
1978-01-01
Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...
Stark, Peter C [Los Alamos, NM; Zurek, Eduardo [Barranquilla, CO; Wheat, Jeffrey V [Fort Walton Beach, FL; Dunbar, John M [Santa Fe, NM; Olivares, Jose A [Los Alamos, NM; Garcia-Rubio, Luis H [Temple Terrace, FL; Ward, Michael D [Los Alamos, NM
2011-07-26
There is provided a method and device for remote sampling, preparation and optical interrogation of a sample using light scattering and light absorption methods. The portable device is a filtration-based device that removes interfering background particle material from the sample matrix by segregating or filtering the chosen analyte from the sample solution or matrix while allowing the interfering background particles to be pumped out of the device. The segregated analyte is then suspended in a diluent for analysis. The device is capable of calculating an initial concentration of the analyte, as well as diluting the analyte such that reliable optical measurements can be made. Suitable analytes include cells, microorganisms, bioparticles, pathogens and diseases. Sample matrixes include biological fluids such as blood and urine, as well as environmental samples including waste water.
First Order Reliability Application and Verification Methods for Semistatic Structures
NASA Technical Reports Server (NTRS)
Verderaime, Vincent
1994-01-01
Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored by conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments, its stress audits are shown to be arbitrary and incomplete, and it compromises the performance of high-strength materials. A reliability method is proposed which combines first order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety index expression. The application is reduced to solving for a factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and with the pace of semistatic structural designs.
Discharge reliability in ablative pulsed plasma thrusters
NASA Astrophysics Data System (ADS)
Wu, Zhiwen; Sun, Guorui; Yuan, Shiyue; Huang, Tiankun; Liu, Xiangyang; Xie, Kan; Wang, Ningfei
2017-08-01
Discharge reliability is typically neglected in low-ignition-cycle ablative pulsed plasma thrusters (APPTs). In this study, the discharge reliability of an APPT is assessed analytically and experimentally. The goals of this study are to better understand the ignition characteristics and to assess the accuracy of the analytical method. For each of six sets of operating conditions, 500 tests of a parallel-plate APPT with a coaxial semiconductor spark plug are conducted. The discharge voltage and current are measured with a high-voltage probe and a Rogowski coil, respectively, to determine whether the discharge is successful. Generally, the discharge success rate increases as the discharge voltage increases, and it decreases as the electrode gap and the number of ignitions increases. The theoretical analysis and the experimental results are reasonably consistent. This approach provides a reference for designing APPTs and improving their stability.
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. To obtain reliable results, the developed procedure should be based not only on a properly prepared sample and correctly performed calibration; it is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, the establishment of the traceability of measurement results, and the estimation of uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, a comparative method that requires a rigorous approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei
2017-08-15
Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
NASA Astrophysics Data System (ADS)
Hosseini-Hashemi, Shahrokh; Sepahi-Boroujeni, Amin; Sepahi-Boroujeni, Saeid
2018-04-01
The normal impact performance of a system comprising a fullerene molecule and a single-layered graphene sheet is studied in the present paper. First, through a mathematical approach, a new contact law is derived to describe the overall non-bonding interaction forces of the "hollow indenter-target" system. Preliminary verifications show that the derived contact law gives a reliable picture of the force field of the system, in good agreement with the results of molecular dynamics (MD) simulations. Afterwards, the equation of the transverse motion of the graphene sheet is formulated on the basis of both the nonlocal theory of elasticity and the assumptions of classical plate theory. Then, to derive the dynamic behavior of the system, a set comprising the proposed contact law and the equations of motion of both the graphene sheet and the fullerene molecule is solved numerically. To evaluate the outcomes of this method, the problem is also modeled by MD simulation. Despite intrinsic differences between the analytical and MD methods, as well as various errors arising from the transient nature of the problem, acceptable agreement is established between the analytical and MD outcomes. As a result, the proposed analytical method can be reliably used to address similar impact problems. Furthermore, it is found that a single-layered graphene sheet is capable of trapping fullerenes approaching with low velocities. Otherwise, in case of rebound, the sheet effectively absorbs a predominant portion of the fullerene's energy.
A review of analytical methods for the treatment of flows with detached shocks
NASA Technical Reports Server (NTRS)
Busemann, Adolf
1949-01-01
The transonic flow theory has been considerably improved in recent years. The problems at subsonic speeds of a moving body concern chiefly the drag and the problems at supersonic speeds, the detached and attached shock waves. Inasmuch as the literature contains some information that is valuable and some other information that is misleading, the purpose of this paper is to discuss those analytical methods and their applications which are regarded as reliable in the transonic range. After these methods are reviewed, a short discussion without details and proofs follows to round out the picture. (author)
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages
Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440
Yu, Chanki; Lee, Sang Wook
2016-05-20
We present a reliable and accurate global optimization framework for estimating parameters of isotropic analytical bidirectional reflectance distribution function (BRDF) models. This approach is based on a branch and bound strategy with linear programming and interval analysis. Conventional local optimization is often very inefficient for BRDF estimation since its fitting quality is highly dependent on initial guesses due to the nonlinearity of analytical BRDF models. The algorithm presented in this paper employs L1-norm error minimization to estimate BRDF parameters in a globally optimal way and interval arithmetic to derive our feasibility problem and lower bounding function. Our method is developed for the Cook-Torrance model but with several normal distribution functions such as the Beckmann, Berry, and GGX functions. Experiments have been carried out to validate the presented method using 100 isotropic materials from the MERL BRDF database, and our experimental results demonstrate that the L1-norm minimization provides a more accurate and reliable solution than the L2-norm minimization.
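The advantage of L1-norm over L2-norm minimization reported above can be illustrated with a toy fitting problem. The sketch below approximates the L1 solution by iteratively reweighted least squares; this is not the paper's branch-and-bound algorithm, and all data are invented.

```python
def fit_l2(xs, ys, ws=None):
    """Weighted least-squares fit of y = a*x + b; returns (a, b)."""
    ws = ws or [1.0] * len(xs)
    sw = sum(ws)
    sx = sum(w * x for w, x in zip(ws, xs))
    sy = sum(w * y for w, y in zip(ws, ys))
    sxx = sum(w * x * x for w, x in zip(ws, xs))
    sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    a = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    b = (sy - a * sx) / sw
    return a, b

def fit_l1(xs, ys, iters=50, eps=1e-6):
    """Approximate L1-norm line fit via iteratively reweighted least squares."""
    a, b = fit_l2(xs, ys)
    for _ in range(iters):
        # weight each point by 1/|residual|, so large errors count linearly, not quadratically
        ws = [1.0 / max(abs(y - (a * x + b)), eps) for x, y in zip(xs, ys)]
        a, b = fit_l2(xs, ys, ws)
    return a, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 2.0, 3.0, 40.0]   # last point is a gross outlier
print(fit_l2(xs, ys))  # slope pulled far from 1 by the outlier
print(fit_l1(xs, ys))  # slope stays close to 1
```

The same robustness motivates the paper's choice of L1-norm error minimization for BRDF parameters, where a few badly fit measurements should not dominate the solution.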
[Enzymatic analysis of the quality of foodstuffs].
Kolesnov, A Iu
1997-01-01
Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.
Improving the Analysis of Anthocyanidins from Blueberries Using Response Surface Methodology
USDA-ARS?s Scientific Manuscript database
Background: Recent interest in the health promoting potential of anthocyanins points to the need for robust and reliable analytical methods. It is essential to know that the health promoting chemicals are present in juices and other products processed from whole fruit. Many different methods have be...
Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.
Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J
2017-12-01
Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
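For readers unfamiliar with the two frameworks being contrasted, the textbook inverse-variance estimators can be sketched as follows. Effect sizes and variances are invented, and this is the conventional DerSimonian-Laird random-effects estimator, not the improved estimator the authors advocate.

```python
def fixed_effect(effects, variances):
    """Inverse-variance weighted pooled estimate and its variance."""
    w = [1.0 / v for v in variances]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return est, 1.0 / sum(w)

def dersimonian_laird_tau2(effects, variances):
    """Between-study variance estimate from Cochran's Q."""
    w = [1.0 / v for v in variances]
    est, _ = fixed_effect(effects, variances)
    q = sum(wi * (e - est) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    return max(0.0, (q - df) / c)

def random_effects(effects, variances):
    """Random-effects pooling: add tau^2 to every within-study variance."""
    tau2 = dersimonian_laird_tau2(effects, variances)
    return fixed_effect(effects, [v + tau2 for v in variances])

effects   = [0.10, 0.30, 0.60]   # illustrative study effect sizes
variances = [0.01, 0.02, 0.04]   # their within-study variances
fe, fe_var = fixed_effect(effects, variances)
re, re_var = random_effects(effects, variances)
print(round(fe, 3), round(re, 3))
print(fe_var < re_var)  # heterogeneity inflates the error estimate
```

The debate in the abstract is precisely about whether this redistribution of weights and inflation of error under the random-effects model estimates error correctly.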
Reliability computation using fault tree analysis
NASA Technical Reports Server (NTRS)
Chelson, P. O.
1971-01-01
A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
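The basic gate arithmetic underlying such analytical fault-tree evaluation, assuming independent basic events, is simple to sketch. Handling the same basic failure appearing in several fault paths, as the abstract describes, requires the conditional-probability treatment and is not shown here; the event probabilities are invented.

```python
def and_gate(*p):
    """All inputs must fail: product of the probabilities (independence assumed)."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

def or_gate(*p):
    """Any input failing suffices: 1 - product of (1 - p_i)."""
    out = 1.0
    for pi in p:
        out *= (1.0 - pi)
    return 1.0 - out

# Top event = OR(pump fails, AND(valve A fails, valve B fails)), a redundant valve pair
p_top = or_gate(0.01, and_gate(0.05, 0.05))
print(round(p_top, 6))  # -> 0.012475
```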
NASA Technical Reports Server (NTRS)
Ghista, D. N.; Sandler, H.
1974-01-01
An analytical method is presented for determining the oxygen consumption rate of the intact working (as opposed to empty but beating) human left ventricle. Use is made of experimental recordings obtained for the chamber pressure and the associated dimensions of the LV. LV dimensions are determined by cineangiocardiography, and the chamber pressure is obtained by means of fluid-filled catheters during retrograde or transeptal catheterization. An analytical method incorporating these data is then employed for the evaluation of the LV coronary oxygen consumption in five subjects. Oxygen consumption for these subjects was also obtained by the conventional clinical method in order to evaluate the reliability of the proposed method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, A.D.; Ayoub, A.K.; Singh, C.
1982-07-01
Existing methods for generating capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations, including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.
NASA Astrophysics Data System (ADS)
Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi
2018-05-01
Advances in asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This enables evaluation of the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range of 20° ≲ i⋆ ≲ 80° only for stars with a high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimates of v sin i⋆ against spectroscopic measurements indicates that the latter suffer from a large uncertainty, possibly due to the modeling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and the stellar inclination estimated from the combination of spectroscopic and photometric-variation measurements for slowly rotating stars needs to be interpreted with caution.
Nurhuda, M; Rouf, A
2017-09-01
The paper presents a method for simultaneous computation of eigenfunction and eigenvalue of the stationary Schrödinger equation on a grid, without imposing boundary-value condition. The method is based on the filter operator, which selects the eigenfunction from wave packet at the rate comparable to δ function. The efficacy and reliability of the method are demonstrated by comparing the simulation results with analytical or numerical solutions obtained by using other methods for various boundary-value conditions. It is found that the method is robust, accurate, and reliable. Further prospect of filter method for simulation of the Schrödinger equation in higher-dimensional space will also be highlighted.
EPA Method 101A applies to the determination of particulate and gaseous mercury emissions from sewage sludge incinerators and other sources. Concern has been expressed that ammonia or hydrogen chloride (HCl), when present in the emissions, interferes in the analytical processes and p...
Quantitative PCR for Genetic Markers of Human Fecal Pollution
Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for quantification of two recently described human-
HUMAN EXPOSURE ASSESSMENT USING IMMUNOASSAY
The National Exposure Research Laboratory-Las Vegas is developing analytical methods for human exposure assessment studies. Critical exposure studies generate a large number of samples which must be analyzed in a reliable, cost-effective and timely manner. TCP (3,5,6-trichlor...
An analytic model for footprint dispersions and its application to mission design
NASA Technical Reports Server (NTRS)
Rao, J. R. Jagannatha; Chen, Yi-Chao
1992-01-01
This is the final report on our recent research activities that are complementary to those conducted by our colleagues, Professor Farrokh Mistree and students, in the context of the Taguchi method. We have studied the mathematical model that forms the basis of the Simulation and Optimization of Rocket Trajectories (SORT) program and developed an analytic method for determining mission reliability with a reduced number of flight simulations. This method can be incorporated in a design algorithm to mathematically optimize different performance measures of a mission, thus leading to a robust and easy-to-use methodology for mission planning and design.
First-order reliability application and verification methods for semistatic structures
NASA Astrophysics Data System (ADS)
Verderaime, V.
1994-11-01
Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored in conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments; stress audits are shown to be arbitrary and incomplete, and the concept compromises the performance of high-strength materials. A reliability method is proposed that combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety-index expression. The application is reduced to solving for a design factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this design factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the development of semistatic structural designs.
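The classical safety-index expression this method builds on can be sketched as follows, assuming normally distributed strength R and stress S: β = (μ_R − μ_S)/√(σ_R² + σ_S²), with failure probability Φ(−β). The paper's accumulative and propagation uncertainty-error terms are omitted here, and the numbers are illustrative.

```python
import math

def safety_index(mu_r, sd_r, mu_s, sd_s):
    """First-order safety index for normal strength R and stress S."""
    return (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)

def failure_probability(beta):
    """Phi(-beta): standard normal CDF evaluated at -beta, via the error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

beta = safety_index(mu_r=100.0, sd_r=8.0, mu_s=60.0, sd_s=6.0)
print(round(beta, 2), failure_probability(beta))  # beta -> 4.0; failure probability ~3e-5
```

The design task the abstract describes then runs in reverse: choose the factor on stress so that the resulting β meets the specified reliability.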
Zhdanov,; Michael, S [Salt Lake City, UT
2008-01-29
Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system include a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, thereby providing the ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
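As a rough illustration of what an "effective conductivity relaxation model" looks like, the sketch below evaluates a Cole-Cole-type resistivity relaxation, a common parameterization in induced-polarization work. The functional form and parameter values here are illustrative and are not taken from the patent.

```python
def cole_cole_resistivity(omega, rho0, m, tau, c):
    """Complex effective resistivity at angular frequency omega.
    rho0: DC resistivity, m: chargeability, tau: relaxation time, c: exponent."""
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + (1j * omega * tau) ** c)))

low  = cole_cole_resistivity(1e-9, 100.0, 0.5, 0.1, 0.5)  # near-DC limit
high = cole_cole_resistivity(1e9,  100.0, 0.5, 0.1, 0.5)  # high-frequency limit
print(abs(low), abs(high))  # ~rho0 at low frequency, ~rho0*(1 - m) at high frequency
```

Inverting measured spectra for (m, tau, c) is what lets such methods discriminate between rock microstructures, since those parameters respond to mineral texture rather than bulk conductivity alone.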
Quantitative PCR for genetic markers of human fecal pollution
Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...
Psychometrics Matter in Health Behavior: A Long-term Reliability Generalization Study.
Pickett, Andrew C; Valdez, Danny; Barry, Adam E
2017-09-01
Despite numerous calls for increased understanding and reporting of reliability estimates, social science research, including the field of health behavior, has been slow to respond and adopt such practices. Therefore, we offer a brief overview of reliability and common reporting errors; we then perform analyses to examine and demonstrate the variability of reliability estimates by sample and over time. Using meta-analytic reliability generalization, we examined the variability of coefficient alpha scores for a well-designed, consistent, nationwide health study, covering a span of nearly 40 years. For each year and sample, reliability varied. Furthermore, reliability was predicted by a sample characteristic that differed among age groups within each administration. We demonstrated that reliability is influenced by the methods and individuals from which a given sample is drawn. Our work echoes previous calls that psychometric properties, particularly reliability of scores, are important and must be considered and reported before drawing statistical conclusions.
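Coefficient alpha, the reliability estimate whose sample-to-sample variability the study examines, can be computed from scratch for a single sample as follows (the item scores are invented):

```python
def cronbach_alpha(items):
    """Coefficient alpha; items is a list of columns, one list of scores per scale item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # per-respondent total scores across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(col) for col in items) / var(totals))

# Three items answered by four respondents
items = [[3, 4, 3, 5],
         [2, 4, 4, 5],
         [3, 5, 4, 4]]
print(round(cronbach_alpha(items), 3))  # -> 0.821
```

Because both the item variances and the total-score variance depend on who happens to be in the sample, alpha is a property of scores from a sample, not of the instrument itself, which is the study's central point.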
Enhancing reproducibility of SALDI MS detection by concentrating analytes within laser spot.
Teng, Fei; Zhu, Qunyan; Wang, Yalei; Du, Juan; Lu, Nan
2018-03-01
Surface-assisted laser desorption/ionization time-of-flight mass spectrometry (SALDI TOF MS) has become one of the most important analytical methods owing to its low interference in the low-molecular-weight range. However, it is still a challenge to obtain good reproducibility in SALDI TOF MS because of the inhomogeneous distribution of analyte molecules induced by the coffee-ring effect. We propose a universal and reliable method to eliminate the coffee-ring effect by concentrating all the analyte molecules within the laser spot. This method exhibits excellent spot-to-spot and substrate-to-substrate reproducibility, with relative standard deviations (RSDs) for different concentrations lower than 12.6%. It also shows good linearity (R² > 0.98) in the log-log plot over the concentration range of 1 nM to 1 μM, and the limit of detection for R6G is down to 1 fmol. Copyright © 2017 Elsevier B.V. All rights reserved.
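The reported reproducibility figure is a relative standard deviation, which for a set of replicate signal intensities is computed as below (the intensity values are invented):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%): sample stdev divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

intensities = [980, 1050, 1010, 995, 1070]   # five replicate laser spots
print(round(rsd_percent(intensities), 1))    # -> 3.7
```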
Mechanics of the tapered interference fit in dental implants.
Bozkaya, Dinçer; Müftü, Sinan
2003-11-01
In evaluating the long-term success of a dental implant, the reliability and stability of the implant-abutment interface play a great role. Tapered interference fits provide a reliable connection method between the abutment and the implant. In this work, the mechanics of tapered interference fits were analyzed using a closed-form formula and the finite element (FE) method. An analytical solution, which is used to predict the contact pressure in a straight interference fit, was modified to predict the contact pressure in the tapered implant-abutment interface. Elastic-plastic FE analysis was used to simulate the implant and abutment material behavior. The validity and applicability of the analytical solution were investigated by comparison with the FE model for a range of problem parameters. It was shown that the analytical solution can be used to determine the pull-out force and loosening torque with 5-10% error. A detailed analysis of the stress distribution due to the tapered interference fit in a commercially available abutment-implant system was carried out. This analysis shows that plastic deformation in the implant limits the increase in the pull-out force that would otherwise have been predicted by higher interference values.
Preusser, Matthias; Berghoff, Anna S.; Manzl, Claudia; Filipits, Martin; Weinhäusel, Andreas; Pulverer, Walter; Dieckmann, Karin; Widhalm, Georg; Wöhrer, Adelheid; Knosp, Engelbert; Marosi, Christine; Hainfellner, Johannes A.
2014-01-01
Testing of the MGMT promoter methylation status in glioblastoma is relevant for clinical decision making and research applications. Two recent and independent phase III therapy trials confirmed a prognostic and predictive value of the MGMT promoter methylation status in elderly glioblastoma patients. Several methods for MGMT promoter methylation testing have been proposed, but seem to be of limited test reliability. Therefore, and also due to feasibility reasons, translation of MGMT methylation testing into routine use has been protracted so far. Pyrosequencing after prior DNA bisulfite modification has emerged as a reliable, accurate, fast and easy-to-use method for MGMT promoter methylation testing in tumor tissues (including formalin-fixed and paraffin-embedded samples). We performed an intra- and inter-laboratory ring trial which demonstrates a high analytical performance of this technique. Thus, pyrosequencing-based assessment of MGMT promoter methylation status in glioblastoma meets the criteria of high analytical test performance and can be recommended for clinical application, provided that strict quality control is performed. Our article summarizes clinical indications, practical instructions and open issues for MGMT promoter methylation testing in glioblastoma using pyrosequencing. PMID:24359605
Accurate mass measurements and their appropriate use for reliable analyte identification.
Godfrey, A Ruth; Brenton, A Gareth
2012-09-01
Accurate mass instrumentation is becoming increasingly available to non-expert users, and its data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification is described, together with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing, and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and ChemSpider.
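Assigning candidate elemental formulae to an accurate mass is, at its core, a constrained search within a mass tolerance. A minimal illustrative sketch restricted to C/H/N/O (the element ranges and the 5 ppm tolerance are assumptions for the example, not prescriptions from the article; real workflows add valence and isotope-pattern filters):

```python
# Monoisotopic masses (u) of the elements considered
MASSES = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915}

def candidate_formulas(target_mass, tol_ppm=5.0,
                       max_c=30, max_h=60, max_n=5, max_o=10):
    """Brute-force CHNO compositions whose monoisotopic mass lies within
    tol_ppm of target_mass. Returns (C, H, N, O) count tuples."""
    tol = target_mass * tol_ppm * 1e-6
    hits = []
    for c in range(max_c + 1):
        for n in range(max_n + 1):
            for o in range(max_o + 1):
                base = c * MASSES["C"] + n * MASSES["N"] + o * MASSES["O"]
                # hydrogen count closest to the remaining mass
                h = round((target_mass - base) / MASSES["H"])
                if 0 <= h <= max_h and abs(base + h * MASSES["H"] - target_mass) <= tol:
                    hits.append((c, h, n, o))
    return hits
```

The number of candidates grows quickly as mass and tolerance increase, which is exactly why the article stresses chemometric filtering and database searching on top of the raw formula list.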
Reliability-based structural optimization: A proposed analytical-experimental study
NASA Technical Reports Server (NTRS)
Stroud, W. Jefferson; Nikolaidis, Efstratios
1993-01-01
An analytical and experimental study for assessing the potential of reliability-based structural optimization is proposed and described. In the study, competing designs obtained by deterministic and reliability-based optimization are compared. The experimental portion of the study is practical because the structure selected is a modular, actively and passively controlled truss that consists of many identical members, and because the competing designs are compared in terms of their dynamic performance and are not destroyed if failure occurs. The analytical portion of this study is illustrated on a 10-bar truss example. In the illustrative example, it is shown that reliability-based optimization can yield a design that is superior to an alternative design obtained by deterministic optimization. These analytical results provide motivation for the proposed study, which is underway.
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study aims at providing recommendations concerning the validation of analytical protocols using routine samples, and is intended as a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also addresses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach, whereas for the sediment matrices the estimation of proportional/constant bias was also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analyzing routine samples is rarely applied to assess the trueness of novel analytical methods, and up to now this methodology had not been focused on organochlorine compounds in environmental matrices.
ERIC Educational Resources Information Center
Han, Turgay; Huang, Jinyan
2017-01-01
Using generalizability (G-) theory and rater interviews as both quantitative and qualitative approaches, this study examined the impact of scoring methods (i.e., holistic versus analytic scoring) on the scoring variability and reliability of an EFL institutional writing assessment at a Turkish university. Ten raters were invited to rate 36…
O'Neal, Wanda K; Anderson, Wayne; Basta, Patricia V; Carretta, Elizabeth E; Doerschuk, Claire M; Barr, R Graham; Bleecker, Eugene R; Christenson, Stephanie A; Curtis, Jeffrey L; Han, Meilan K; Hansel, Nadia N; Kanner, Richard E; Kleerup, Eric C; Martinez, Fernando J; Miller, Bruce E; Peters, Stephen P; Rennard, Stephen I; Scholand, Mary Beth; Tal-Singer, Ruth; Woodruff, Prescott G; Couper, David J; Davis, Sonia M
2014-01-08
As part of the longitudinal Subpopulations and Intermediate Outcome Measures in COPD Study (SPIROMICS), blood samples are being collected from 3200 subjects with chronic obstructive pulmonary disease (COPD) with the goal of identifying blood biomarkers for sub-phenotyping patients and predicting disease progression. To determine the most reliable sample type for measuring specific blood analytes in the cohort, a pilot study was performed on a subset of 24 subjects comparing serum, ethylenediaminetetraacetic acid (EDTA) plasma, and EDTA plasma with proteinase inhibitors (P100). 105 analytes, chosen for potential relevance to COPD and arranged in 12 multiplex and one simplex platform (Myriad-RBM), were evaluated in duplicate in the three sample types. The reliability coefficient and the coefficient of variation (CV) were calculated, and the performance of each analyte and mean analyte levels were evaluated across sample types. 20% of analytes were not consistently detectable in any sample type. Higher reliability and/or smaller CV were determined for 12 analytes in EDTA plasma compared to serum, and for 11 analytes in serum compared to EDTA plasma. While reliability measures were similar for EDTA plasma and P100 plasma for a majority of analytes, CV was modestly increased in P100 plasma for eight analytes. Each analyte within a multiplex produced independent measurement characteristics, complicating selection of sample type for individual multiplexes. There were notable detectability and measurability differences between serum and plasma. Multiplexing may not be ideal if large reliability differences exist across analytes measured within the multiplex, especially if values differ based on sample type. For some analytes, the large CV should be considered during experimental design, and the use of duplicate and/or triplicate samples may be necessary.
These results should prove useful for studies evaluating selection of samples for evaluation of potential blood biomarkers.
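The two figures of merit used in the pilot, a reliability coefficient and the CV, can be estimated from duplicate measurements roughly as follows. This is a sketch using a one-way random-effects intraclass correlation; the exact estimator used in the study is not specified here, so treat this as one common choice rather than the authors' implementation:

```python
import numpy as np

def duplicate_reliability(rep1, rep2):
    """One-way random-effects ICC and mean within-pair CV (%) computed
    from duplicate measurements of the same analyte on each subject."""
    x = np.column_stack([rep1, rep2]).astype(float)
    n, k = x.shape                          # subjects, replicates (k = 2)
    grand = x.mean()
    # between-subject and within-subject mean squares (one-way ANOVA)
    ms_between = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    cv = 100.0 * np.mean(x.std(axis=1, ddof=1) / x.mean(axis=1))
    return icc, cv
```

High ICC with low CV would favor a sample type for a given analyte, mirroring the per-analyte comparisons reported above.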
Hines, Erin P; Rayner, Jennifer L; Barbee, Randy; Moreland, Rae Ann; Valcour, Andre; Schmid, Judith E; Fenton, Suzanne E
2007-05-01
Breast milk is a primary source of nutrition that contains many endogenous compounds that may affect infant development. The goals of this study were to develop reliable assays for selected endogenous breast milk components and to compare levels of those in milk and serum collected from the same mother twice during lactation (2-7 weeks and 3-4 months). Reliable assays were developed for glucose, secretory IgA, interleukin-6, tumor necrosis factor-α, triglycerides, prolactin, and estradiol from participants in a US EPA study called Methods Advancement in Milk Analysis (MAMA). Fresh and frozen (-20 °C) milk samples were assayed to determine effects of storage on endogenous analytes. The source effect (serum vs milk) seen in all 7 analytes indicates that serum should not be used as a surrogate for milk in children's health studies. The authors propose to use these assays in studies to examine relationships between the levels of milk components and children's health.
Analytical Model of Large Data Transactions in CoAP Networks
Ludovici, Alessandro; Di Marco, Piergiuseppe; Calveras, Anna; Johansson, Karl H.
2014-01-01
We propose a novel analytical model to study fragmentation methods in wireless sensor networks adopting the Constrained Application Protocol (CoAP) and the IEEE 802.15.4 standard for medium access control (MAC). The blockwise transfer technique proposed in CoAP and the 6LoWPAN fragmentation are included in the analysis. The two techniques are compared in terms of reliability and delay, depending on the traffic, the number of nodes and the parameters of the IEEE 802.15.4 MAC. The results are validated through Monte Carlo simulations. To the best of our knowledge this is the first study that evaluates and compares analytically the performance of CoAP blockwise transfer and 6LoWPAN fragmentation. A major contribution is the possibility to understand the behavior of both techniques with different network conditions. Our results show that 6LoWPAN fragmentation is preferable for delay-constrained applications. For highly congested networks, the blockwise transfer slightly outperforms 6LoWPAN fragmentation in terms of reliability. PMID:25153143
An interactive website for analytical method comparison and bias estimation.
Bahar, Burak; Tuncel, Ayse F; Holmes, Earle W; Holmes, Daniel T
2017-12-01
Regulatory standards mandate laboratories to perform studies to ensure accuracy and reliability of their test results. Method comparison and bias estimation are important components of these studies. We developed an interactive website for evaluating the relative performance of two analytical methods using R programming language tools. The website can be accessed at https://bahar.shinyapps.io/method_compare/. The site has an easy-to-use interface that allows both copy-pasting and manual entry of data. It also allows selection of a regression model and creation of regression and difference plots. Available regression models include Ordinary Least Squares, Weighted-Ordinary Least Squares, Deming, Weighted-Deming, Passing-Bablok and Passing-Bablok for large datasets. The server processes the data and generates downloadable reports in PDF or HTML format. Our website provides clinical laboratories a practical way to assess the relative performance of two analytical methods. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
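Of the regression models the site offers, Deming regression has a compact closed form worth sketching. The following is a minimal illustration under the assumption of a known error-variance ratio `lam` (this is not the website's source code, which is written in R):

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept for method comparison.
    lam is the ratio of the error variances of y and x
    (lam = 1.0 gives orthogonal regression)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y)[0, 1]
    slope = (syy - lam * sxx +
             np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept
```

Unlike ordinary least squares, Deming regression allows measurement error in both methods, which is why it (and Passing-Bablok) is preferred for method-comparison studies.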
Research strategies that result in optimal data collection from the patient medical record
Gregory, Katherine E.; Radovinsky, Lucy
2010-01-01
Data obtained from the patient medical record are often a component of clinical research led by nurse investigators. The rigor of the data collection methods correlates to the reliability of the data and, ultimately, the analytical outcome of the study. Research strategies for reliable data collection from the patient medical record include the development of a precise data collection tool, the use of a coding manual, and ongoing communication with research staff. PMID:20974093
NASA Astrophysics Data System (ADS)
Rohandi, M.; Tuloli, M. Y.; Jassin, R. T.
2018-02-01
This research aims to determine the development priority of underwater tourism sites in Gorontalo province using the Analytical Hierarchy Process (AHP), a decision support system (DSS) method that applies Multi-Attribute Decision Making (MADM). The method used 5 criteria and 28 alternatives to determine the best priority for underwater tourism site development in Gorontalo province. Based on the AHP calculation, the best development priority is Pulau Cinta, with a total AHP score of 0.489, or 48.9%. The DSS produced reliable results quickly and at low cost, helping decision makers select the best underwater tourism site to develop.
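AHP priority scores like the 0.489 reported above are derived from pairwise-comparison matrices. A minimal sketch of the standard principal-eigenvector method with Saaty's consistency ratio (the matrix in the test and the function names are illustrative; the study's actual survey matrices are not reproduced here):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio (CR)."""
    A = np.asarray(pairwise, float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)            # principal eigenvalue
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                        # normalize to priority weights
    lam_max = vals[i].real
    ci = (lam_max - n) / (n - 1) if n > 1 else 0.0
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)
    cr = ci / ri if ri else 0.0         # CR < 0.1 is conventionally acceptable
    return w, cr
```

With 5 criteria and 28 alternatives, the study would combine criterion weights with per-criterion alternative weights to obtain each site's total score.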
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert
2015-07-01
Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input in the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.
Statistical evaluation of forecasts
NASA Astrophysics Data System (ADS)
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
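For a random predictor that raises alarms independently with a fixed rate, the chance of matching an observed hit count is binomial, which gives an analytic alternative to bootstrapping. The sketch below is a simplification of the authors' framework (it ignores their handling of alarm durations and event-free periods):

```python
from math import comb

def random_predictor_pvalue(n_events, n_hits, alarm_rate):
    """Upper-tail binomial probability that a random predictor, raising
    alarms independently with probability alarm_rate per event window,
    would correctly anticipate at least n_hits of n_events by chance."""
    return sum(comb(n_events, k) * alarm_rate**k
               * (1 - alarm_rate)**(n_events - k)
               for k in range(n_hits, n_events + 1))
```

A small p-value indicates the forecasting method's sensitivity genuinely exceeds that of a chance-level predictor at the same alarm rate.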
A reliability analysis tool for SpaceWire network
NASA Astrophysics Data System (ADS)
Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou
2017-04-01
SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields a system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
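In its simplest form, the task-based composition described above reduces to series/parallel reliability algebra, which also shows why the dual-redundancy scheme helps. A sketch with illustrative component reliabilities (the 0.99/0.95 values are made up for the example, not taken from the paper):

```python
def series(*r):
    """Reliability of a chain in which every component must work."""
    p = 1.0
    for x in r:
        p *= x
    return p

def parallel(*r):
    """Reliability of redundant components: the group fails only if all fail."""
    q = 1.0
    for x in r:
        q *= (1.0 - x)
    return 1.0 - q

# A task path node -> router -> node, basic vs. dual-redundant router:
basic = series(0.99, 0.95, 0.99)
dual = series(0.99, parallel(0.95, 0.95), 0.99)
```

Duplicating the weakest element (here the router) raises the task reliability, matching the paper's conclusion that redundant architectures outperform basic ones.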
Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M
2008-01-01
Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163
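The reliability score introduced above can be summarized simply: for each county, the fraction of parameterized SaTScan runs in which it fell inside a significant cluster. A sketch of that aggregation step (the authors' actual implementation lives in their Java-based Visual Inquiry Toolkit; this is an independent illustration):

```python
import numpy as np

def reliability_scores(membership):
    """membership: runs x counties boolean matrix, True where a county
    fell inside a statistically significant cluster in that SaTScan run.
    Returns, per county, the fraction of runs in which it was clustered."""
    m = np.asarray(membership, dtype=float)
    return m.mean(axis=0)
```

Counties with scores near 1 belong to stable clusters across analysis scales; intermediate scores flag the unstable, scale-sensitive clusters the paper warns about.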
Milosheska, Daniela; Roškar, Robert
2017-05-10
The aim of the present report was to develop and validate simple, sensitive and reliable LC-MS/MS method for quantification of topiramate (TPM) and its main metabolites: 2,3-desisopropylidene TPM, 4,5-desisopropylidene TPM, 10-OH TPM and 9-OH TPM in human plasma samples. The most abundant metabolite 2,3-desisopropylidene TPM was isolated from patients urine, characterized and afterwards used as an authentic standard for method development and validation. Sample preparation method employs 100μL of plasma sample and liquid-liquid extraction with a mixture of ethyl acetate and diethyl ether as extraction solvent. Chromatographic separation was achieved on a 1290 Infinity UHPLC coupled to 6460 Triple Quad Mass Spectrometer operated in negative MRM mode using Kinetex C18 column (50×2.1mm, 2.6μm) by gradient elution using water and methanol as a mobile phase and stable isotope labeled TPM as internal standard. The method showed to be selective, accurate, precise and linear over the concentration ranges of 0.10-20μg/mL for TPM, 0.01-2.0μg/mL for 2,3-desisopropylidene TPM, and 0.001-0.200μg/mL for 4,5-desisopropylidene TPM, 10-OH TPM and 9-OH TPM. The described method is the first fully validated method capable of simultaneous determination of TPM and its main metabolites in plasma over the selected analytical range. The suitability of the method was successfully demonstrated by the quantification of all analytes in plasma samples of patients with epilepsy and can be considered as reliable analytical tool for future investigations of the TPM metabolism. Copyright © 2017 Elsevier B.V. All rights reserved.
Mabood, Fazle; Khan, Waqar A; Ismail, Ahmad Izani Md
2013-01-01
In this article, an approximate analytical solution of flow and heat transfer for a viscoelastic fluid in an axisymmetric channel with a porous wall is presented. The solution is obtained through the use of a powerful method known as the Optimal Homotopy Asymptotic Method (OHAM). We obtained the approximate analytical solution for dimensionless velocity and temperature for various parameters. The influence of different parameters on dimensionless velocity, temperature, friction factor, and rate of heat transfer is presented graphically. We also compared our solution with those obtained by other methods and found that the OHAM solution is better than those of the other methods considered. This shows that OHAM is reliable for solving strongly nonlinear problems in heat transfer phenomena.
Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel
2017-01-01
Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.
Tran, N L; Bohrer, F I; Trogler, W C; Kummel, A C
2009-05-28
Density functional theory (DFT) simulations were used to determine the binding strength of 12 electron-donating analytes to the zinc metal center of a zinc phthalocyanine molecule (ZnPc monomer). The analyte binding strengths were compared to the analytes' enthalpies of complex formation with boron trifluoride (BF3), which is a direct measure of their electron-donating ability or Lewis basicity. With the exception of the most basic analyte investigated, the ZnPc binding energies were found to correlate linearly with analyte basicities. Based on natural population analysis calculations, analyte complexation to the Zn metal of the ZnPc monomer resulted in limited charge transfer from the analyte to the ZnPc molecule, which increased with analyte-ZnPc binding energy. The experimental analyte sensitivities from chemiresistor ZnPc sensor data were proportional to an exponential of the binding energies from DFT calculations, consistent with sensitivity being proportional to analyte coverage and binding strength. The good correlation observed suggests DFT is a reliable method for the prediction of chemiresistor metallophthalocyanine binding strengths and response sensitivities.
Study on application of aerospace technology to improve surgical implants
NASA Technical Reports Server (NTRS)
Johnson, R. E.; Youngblood, J. L.
1982-01-01
The areas where aerospace technology could be used to improve the reliability and performance of metallic, orthopedic implants was assessed. Specifically, comparisons were made of material controls, design approaches, analytical methods and inspection approaches being used in the implant industry with hardware for the aerospace industries. Several areas for possible improvement were noted such as increased use of finite element stress analysis and fracture control programs on devices where the needs exist for maximum reliability and high structural performance.
Daniels, Vijay John; Harley, Dwight
2017-07-01
Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. The authors analyzed the reliability of the individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For the analytic, holistic, and combined scales, 12, 12, and 11 stations respectively yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given the increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.
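The station counts needed to reach Phi = 0.8 come from a generalizability-theory decision study: Phi is the person variance over the person variance plus the absolute error variance divided by the number of stations. A sketch with hypothetical variance components (the study's actual variance estimates are not reproduced here):

```python
from math import ceil

def phi(var_person, var_abs_error, n_stations):
    """Phi (dependability) coefficient projected for an n-station OSCE:
    person variance / (person variance + absolute error variance / n)."""
    return var_person / (var_person + var_abs_error / n_stations)

def stations_for_phi(var_person, var_abs_error, target=0.8):
    """Smallest number of stations projected to reach the target Phi."""
    n = (target / (1 - target)) * var_abs_error / var_person
    return ceil(n - 1e-9)   # small slack to absorb floating-point error
```

Averaging over more stations shrinks the error term, which is how combining scales (reducing error variance per station) lowered the required station count from 12 to 11 in the PGY-1,2 cohort.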
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
Krishan, Kewal; Chatterjee, Preetika M; Kanchan, Tanuj; Kaur, Sandeep; Baryah, Neha; Singh, R K
2016-04-01
Sex estimation is considered one of the essential parameters in forensic anthropology casework and requires foremost consideration in the examination of skeletal remains. Forensic anthropologists frequently employ morphological and metric methods for sex estimation of human remains. These methods remain important in the identification process despite the advent and success of molecular techniques. A steady increase in the use of imaging techniques in forensic anthropology research has helped derive as well as revise the available population data. These methods are, however, less reliable owing to high variance and indistinct landmark details. The present review discusses the reliability and reproducibility of various analytical approaches (morphological, metric, molecular and radiographic) in sex estimation of skeletal remains. Numerous studies have shown higher reliability and reproducibility for measurements taken directly on the bones; hence, such direct methods of sex estimation are considered more reliable than the others. The geometric morphometric (GM) method and the Diagnose Sexuelle Probabiliste (DSP) method are emerging as valid and widely used techniques in forensic anthropology in terms of accuracy and reliability. Besides, newer 3D methods have been shown to exhibit specific sexual dimorphism patterns not readily revealed by traditional methods. Development of newer and better methodologies for sex estimation, as well as re-evaluation of the existing ones, will continue in the endeavour of forensic researchers for more accurate results. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Analytical aspects of hydrogen exchange mass spectrometry
Engen, John R.; Wales, Thomas E.
2016-01-01
The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
ERIC Educational Resources Information Center
Saxton, Emily; Belanger, Secret; Becker, William
2012-01-01
The purpose of this study was to investigate the intra-rater and inter-rater reliability of the Critical Thinking Analytic Rubric (CTAR). The CTAR is composed of 6 rubric categories: interpretation, analysis, evaluation, inference, explanation, and disposition. To investigate inter-rater reliability, two trained raters scored four sets of…
Hebaz, Salah-Eddine; Benmeddour, Farouk; Moulin, Emmanuel; Assaad, Jamal
2018-01-01
The development of reliable guided-wave inspection systems is conditioned by an accurate knowledge of their dispersive properties. The semi-analytical finite element method has proven very practical for modeling wave propagation in waveguides of arbitrary cross-section. However, when it comes to computations on complex geometries to a given accuracy, it still has a major drawback: high resource consumption. Recently, the discontinuous Galerkin finite element method (DG-FEM) has been found advantageous over the standard finite element method when applied in the frequency domain. In this work, a high-order method for the computation of Lamb mode characteristics in plates is proposed. The problem is discretised using a class of DG-FEM, namely the interior penalty family of methods. The analytical validation is performed on the homogeneous isotropic case with traction-free boundary conditions. Afterwards, functionally graded material plates are analysed and a numerical example is presented. The obtained results are in good agreement with those found in the literature.
Evaluation of two methods to determine glyphosate and AMPA in soils of Argentina
NASA Astrophysics Data System (ADS)
De Geronimo, Eduardo; Lorenzon, Claudio; Iwasita, Barbara; Faggioli, Valeria; Aparicio, Virginia; Costa, Jose Luis
2017-04-01
Argentine agricultural production is fundamentally based on a technological package combining no-tillage and reliance on glyphosate applications to control weeds in transgenic crops (soybean, maize and cotton). Glyphosate is therefore the most widely used herbicide in the country, where 180 to 200 million liters are applied every year. Due to its widespread use, it is important to assess its impact on the environment, and reliable analytical methods are therefore mandatory. The glyphosate molecule exhibits unique physical and chemical characteristics which complicate its quantification, especially in soils with high organic matter content, such as the central-eastern Argentine soils, where strong interferences are normally observed. The objective of this work was to compare two methods for extraction and quantification of glyphosate and AMPA in samples of 8 representative soils of Argentina. The first analytical method (method 1) was based on the use of phosphate buffer as the extracting solution and dichloromethane to minimize matrix organic content. In the second method (method 2), potassium hydroxide was used to extract the analytes, followed by a clean-up step using solid phase extraction (SPE) to minimize strong interferences. Sensitivity, recoveries, matrix effects and robustness were evaluated. Both methodologies involved derivatization with 9-fluorenyl-methyl-chloroformate (FMOC) in borate buffer and detection based on ultra-high-pressure liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). Recoveries obtained from soil samples spiked at 0.1 and 1 mg kg-1 were satisfactory in both methods (70% - 120%). However, there was a remarkable difference regarding the matrix effect: the SPE clean-up step (method 2) was insufficient to remove the interferences, whereas the dilution and the clean-up with dichloromethane (method 1) were more effective at minimizing ionic suppression.
Moreover, method 1 had fewer steps in the protocol of sample processing than method 2. This can be highly valuable in the routine lab work due to the reduction of potential undesired errors such as the loss of analyte or sample contamination. In addition, the substitution of SPE by another alternative involved a considerable reduction of analytical costs in method 1. We conclude that method 1 seemed to be simpler and cheaper than method 2, as well as reliable to quantify glyphosate in Argentinean soils. We hope that this experience can be useful to simplify the protocols of glyphosate quantification and contribute to the understanding of the fate of this herbicide in the environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, S.S.; Attari, A.
1995-01-01
The discovery of arsenic compounds, as alkylarsines, in natural gas prompted this research program to develop reliable measurement techniques needed to assess the efficiency of removal processes for these environmentally sensitive substances. These techniques include sampling, speciation, quantitation and on-line instrumental methods for monitoring the total arsenic concentration. The current program has yielded many products, including calibration standards, arsenic-specific sorbents, sensitive analytical methods and instrumentation. Four laboratory analytical methods have been developed and successfully employed for arsenic determination in natural gas. These methods use GC-AED and GC-MS instruments to speciate alkylarsines, and peroxydisulfate extraction with FIAS, a special carbon sorbent with XRF, and an IGT-developed sorbent with GFAA for total arsenic measurement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joiner, R.L.; Hayes, L.; Rust, W.
1989-05-01
The following report summarizes the development and validation of an analytical method for the analyses of soman (GD), mustard (HD), VX, and tabun (GA) in wastewater. An analytical method that can detect GD, HD, VX, and GA with the necessary sensitivity (< 20 parts per billion (ppb)) and selectivity is essential to Medical Research and Evaluation Facility (MREF) operations. The analytical data were generated using liquid-liquid extraction of the wastewater, with the extract being concentrated and analyzed by gas chromatography (GC) methods. The sample preparation and analysis methods were developed in support of ongoing activities within the MREF. We have documented the precision and accuracy of the analytical method through an expected working calibration range (3.0 to 60 ppb). The analytical method was statistically evaluated over a range of concentrations to establish a detection limit and quantitation limit for the method. Whenever the true concentration is 8.5 ppb or above, the probability is at least 99.9 percent that the measured concentration will be 6 ppb or above. Thus, 6 ppb could be used as a lower reliability limit for detecting concentrations in excess of 8.5 ppb. In summary, the proposed sample extraction and analysis methods are suitable for quantitative analyses to determine the presence of GD, HD, VX, and GA in wastewater samples. Our findings indicate that we can detect any of these chemical surety materiel (CSM) in water at or below the established U.S. Army Surgeon General's safety levels in drinking water.
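The relationship between an 8.5 ppb true concentration and a 6 ppb lower reliability limit can be sketched under a normal measurement-error assumption; the 0.8 ppb standard deviation below is hypothetical, chosen only to produce numbers of the same order as those reported:

```python
from statistics import NormalDist

def lower_reliability_limit(true_conc, sd, confidence=0.999):
    """Concentration that measurements will exceed with the given
    probability when the true concentration equals true_conc,
    assuming normally distributed measurement error."""
    z = NormalDist().inv_cdf(confidence)
    return true_conc - z * sd

# Hypothetical measurement SD of 0.8 ppb at the 8.5 ppb level:
print(round(lower_reliability_limit(8.5, 0.8), 1))  # close to the 6 ppb cited
```

The actual limit in the report derives from the statistical evaluation of the method over its calibration range, not from this assumed standard deviation.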
Droplet digital PCR quantifies host inflammatory transcripts in feces reliably and reproducibly
USDA-ARS?s Scientific Manuscript database
The gut is the most extensive, interactive, and complex interface between the human host and the environment and therefore a critical site of immunological activity. Non-invasive methods to assess the host response in this organ are currently lacking. Feces are the available analyte which have been ...
A Performance-Based Method of Student Evaluation
ERIC Educational Resources Information Center
Nelson, G. E.; And Others
1976-01-01
The Problem Oriented Medical Record (which allows practical definition of the behavioral terms thoroughness, reliability, sound analytical sense, and efficiency as they apply to the identification and management of patient problems) provides a vehicle to use in performance based type evaluation. A test-run use of the record is reported. (JT)
USDA-ARS?s Scientific Manuscript database
Market demands for cotton varieties with improved fiber properties also call for the development of fast, reliable analytical methods for monitoring fiber development and measuring their properties. Currently, cotton breeders rely on instrumentation that can require significant amounts of sample, w...
Wu, Xiaobin; Chao, Yan; Wan, Zemin; Wang, Yunxiu; Ma, Yan; Ke, Peifeng; Wu, Xinzhong; Xu, Jianhua; Zhuang, Junhua; Huang, Xianzhang
2016-10-15
Haemoglobin A1c (HbA1c) is widely used in the management of diabetes. Therefore, the reliability and comparability of different analytical methods for its detection have become very important. A comparative evaluation of analytical performance (precision, linearity, accuracy, method comparison, and interferences including bilirubin, triglyceride, cholesterol, labile HbA1c (LA1c), vitamin C, aspirin, fetal haemoglobin (HbF), and haemoglobin E (HbE)) was performed on the Capillarys 2 Flex Piercing (Capillarys 2FP) (Sebia, France), Tosoh HLC-723 G8 (Tosoh G8) (Tosoh, Japan), Premier Hb9210 (Trinity Biotech, Ireland) and Roche Cobas c501 (Roche c501) (Roche Diagnostics, Germany). Good precision was shown at both low and high HbA1c levels on all four systems, with all individual CVs below 2% (IFCC units) or 1.5% (NGSP units). Linearity analysis for each analyzer achieved a good correlation coefficient (R² > 0.99) over the entire range tested. The analytical bias of the four systems against the IFCC targets was less than ± 6% (NGSP units), indicating good accuracy. Method comparison showed strong correlation and agreement between methods. Very high levels of triglycerides and cholesterol (≥ 15.28 and ≥ 8.72 mmol/L, respectively) led to falsely low HbA1c concentrations on the Roche c501. Elevated HbF induced false HbA1c detection on the Capillarys 2FP (> 10%), Tosoh G8 (> 30%), Premier Hb9210 (> 15%), and Roche c501 (> 5%). On the Tosoh G8, HbE induced an extra peak on the chromatogram, and significantly lower results were reported. The four HbA1c methods commonly used with commercial analyzers showed good reliability and comparability, although some interferences may falsely alter the result.
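The dual reporting of CVs in IFCC and NGSP units reflects the master equation linking the two HbA1c scales; a small sketch (the conversion is standard, but treat the constants as quoted rather than authoritative):

```python
def ifcc_to_ngsp(ifcc_mmol_mol):
    """NGSP master equation: NGSP(%) = 0.09148 * IFCC(mmol/mol) + 2.152."""
    return 0.09148 * ifcc_mmol_mol + 2.152

def ngsp_to_ifcc(ngsp_percent):
    """Inverse of the master equation."""
    return (ngsp_percent - 2.152) / 0.09148

print(round(ifcc_to_ngsp(48.0), 2))  # 48 mmol/mol, about 6.5 % NGSP
```

Because the NGSP scale carries the additive 2.152 term, the same absolute scatter yields a smaller relative CV in NGSP units, which is why the abstract quotes 2% (IFCC units) alongside 1.5% (NGSP units).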
Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.
DOT National Transportation Integrated Search
2015-01-01
The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...
Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas
2014-11-01
In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM) with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW, however CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except for MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. 
Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
NASA Astrophysics Data System (ADS)
Yu, Bo; Ning, Chao-lie; Li, Bing
2017-03-01
A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, taking into account uncertainties in environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard normal space. After that, the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparison with the second-order reliability method and Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
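A minimal sketch of the kind of chloride-ingress model described above: a Fick's-law profile with an age-factor diffusivity, wrapped in a crude Monte Carlo loop in place of the FORM analysis. All parameter values and distributions are hypothetical placeholders, not those of the paper:

```python
import math
import random

def chloride(x_mm, t_years, c_s, d_ref, t_ref=28/365, m=0.4):
    """Fick's-law surface-chloride profile with age factor m:
    C(x,t) = c_s * (1 - erf(x / (2*sqrt(D(t)*t)))), D(t) = d_ref*(t_ref/t)**m.
    Units: x in mm, t in years, D in mm^2/year, C in % binder."""
    d_t = d_ref * (t_ref / t_years) ** m
    return c_s * (1.0 - math.erf(x_mm / (2.0 * math.sqrt(d_t * t_years))))

def failure_probability(cover_mm, t_years, c_crit, n=20000, seed=1):
    """Crude Monte Carlo estimate of P(C(cover, t) > c_crit) under
    hypothetical lognormal scatter on surface chloride and diffusivity."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        c_s = rng.lognormvariate(math.log(3.0), 0.3)      # % binder, assumed
        d_ref = rng.lognormvariate(math.log(200.0), 0.4)  # mm^2/yr, assumed
        if chloride(cover_mm, t_years, c_s, d_ref) > c_crit:
            hits += 1
    return hits / n

print(failure_probability(50.0, 50.0, 0.6))
```

The paper replaces this sampling loop with the Nataf transformation and a first-order reliability analysis of the same limit state, which is far cheaper and also yields parameter sensitivities.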
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing requires redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
Wardak, Cecylia; Grabarczyk, Malgorzata
2016-08-02
A simple, fast and cheap method for monitoring copper and nitrate in drinking water and food products using newly developed solid-contact ion-selective electrodes is proposed. Determination of copper and nitrate was performed by application of the multiple standard additions technique. The reliability of the obtained results was assessed by comparing them with results obtained by anodic stripping voltammetry or spectrophotometry for the same samples. In each case, satisfactory agreement of the results was obtained, which confirms the analytical usefulness of the constructed electrodes.
Low thermal flux glass-fiber tubing for cryogenic service
NASA Technical Reports Server (NTRS)
Hall, C. A.; Spond, D. E.
1977-01-01
This paper describes analytical techniques, fabrication development, and test results for composite tubing that has many applications in aerospace and commercial cryogenic installations. Metal liner fabrication is discussed in detail with attention given to resistance-welded liners, fusion-welded liners, chem-milled tubing liners, joining tube liners and end fittings, heat treatment and leak checks. Composite overwrapping, a second method of tubing fabrication, is also discussed. Test programs and analytical correlation are considered along with composite tubing advantages such as minimum weight, thermal efficiency and safety and reliability.
Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia
2013-10-02
An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.
Low level vapor verification of monomethyl hydrazine
NASA Technical Reports Server (NTRS)
Mehta, Narinder
1990-01-01
The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.
Ullah, Hakeem; Islam, Saeed; Khan, Ilyas; Shafie, Sharidan; Fiza, Mehreen
2015-01-01
In this paper we applied a new analytic approximation technique, the Optimal Homotopy Asymptotic Method (OHAM), to the treatment of coupled differential-difference equations (DDEs). To assess the efficiency and reliability of the method, we consider the Relativistic Toda coupled nonlinear differential-difference equation. The method provides a convenient way to control the convergence of approximate solutions when compared with other methods of solution found in the literature. The obtained solutions show that OHAM is effective, simpler, easier and explicit.
Ghanbari, Behzad
2014-01-01
We aim to study the convergence of the homotopy analysis method (HAM in short) for solving special nonlinear Volterra-Fredholm integrodifferential equations. The sufficient condition for the convergence of the method is briefly addressed. Some illustrative examples are also presented to demonstrate the validity and applicability of the technique. Comparison of the results obtained by HAM with the exact solution shows that the method is reliable and capable of providing analytic treatment for solving such equations.
Sarma-based key-group method for rock slope reliability analyses
NASA Astrophysics Data System (ADS)
Yarahmadi Bafghi, A. R.; Verdel, T.
2005-08-01
The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for purposes of stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsable blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We will discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to take account of the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We will then show how such reliability analyses can be introduced into SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.
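As an illustration of the FOSM ingredient mentioned above, a factor-of-safety reliability index can be sketched as follows (the mean and standard deviation are hypothetical, standing in for values propagated from density, cohesion and friction-angle uncertainty):

```python
from statistics import NormalDist

def fosm_beta(mean_fs, sd_fs):
    """First-order second-moment reliability index for a factor of
    safety FS: beta = (E[FS] - 1) / SD[FS]."""
    return (mean_fs - 1.0) / sd_fs

def failure_probability(beta):
    """P(FS < 1) under a normal approximation of FS."""
    return NormalDist().cdf(-beta)

# Hypothetical slope: mean FS 1.35, SD 0.15 from parameter uncertainty.
beta = fosm_beta(1.35, 0.15)
print(round(beta, 2), f"{failure_probability(beta):.4f}")
```

In the PSKGM, the means and standard deviations come from the mechanical data, and ASM refines this first-order estimate at the design point.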
Fallback options for airgap sensor fault of an electromagnetic suspension system
NASA Astrophysics Data System (ADS)
Michail, Konstantinos; Zolotas, Argyrios C.; Goodall, Roger M.
2013-06-01
The paper presents a method to recover the performance of an electromagnetic suspension under a faulty airgap sensor. The proposed control scheme is a combination of classical control loops, a Kalman estimator and analytical redundancy (for the airgap signal). In this way, redundant airgap sensors are not essential for reliable operation of the system. When the airgap sensor fails, the required signal is recovered using a combination of a Kalman estimator and analytical redundancy. The performance of the suspension is optimised using genetic algorithms, and some preliminary robustness issues relating to load and operating airgap variations are discussed. Simulations on a realistic model of this type of suspension illustrate the efficacy of the proposed sensor-tolerant control method.
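The estimator-based fallback idea can be illustrated with a generic scalar Kalman filter smoothing a redundant (analytically reconstructed) airgap signal; this is a sketch of the principle only, not the authors' optimized scheme, and all noise parameters and the 5 mm operating airgap are assumed:

```python
import random

def kalman_estimate(measurements, q=1e-4, r=0.25):
    """Minimal scalar Kalman filter tracking a slowly varying signal
    (here, an airgap reconstructed from a redundant source).
    q: process noise variance, r: measurement noise variance (assumed)."""
    x, p = measurements[0], 1.0
    for z in measurements[1:]:
        p += q                # predict (random-walk model of the airgap)
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # update with the redundant signal
        p *= (1.0 - k)
    return x

rng = random.Random(0)
true_gap = 5.0  # mm, hypothetical operating airgap
noisy = [true_gap + rng.gauss(0.0, 0.5) for _ in range(500)]
print(round(kalman_estimate(noisy), 2))
```

The actual scheme estimates the airgap from the remaining healthy sensors through the suspension model; the scalar filter above only shows why a noisy reconstructed signal can still support stable feedback.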
Mass Spectrometry for Paper-Based Immunoassays: Toward On-Demand Diagnosis.
Chen, Suming; Wan, Qiongqiong; Badu-Tawiah, Abraham K
2016-05-25
Current analytical methods, either point-of-care or centralized detection, are not able to meet recent demands of patient-friendly testing and increased reliability of results. Here, we describe a two-point separation on-demand diagnostic strategy based on a paper-based mass spectrometry immunoassay platform that adopts stable and cleavable ionic probes as mass reporter; these probes make possible sensitive, interruptible, storable, and restorable on-demand detection. In addition, a new touch paper spray method was developed for on-chip, sensitive, and cost-effective analyte detection. This concept is successfully demonstrated via (i) the detection of Plasmodium falciparum histidine-rich protein 2 antigen and (ii) multiplexed and simultaneous detection of cancer antigen 125 and carcinoembryonic antigen.
Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...
Westgard, Sten A
2016-06-01
To assess the analytical performance of instruments and methods through external quality assessment and proficiency testing data on the Sigma scale. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal. Then Sigma-metrics were calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of the chemistry analytes that are expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers in three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities. Quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself. It is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult this data as part of their decision-making process. To confirm that these assessments are stable and reliable, a longer term study should be conducted that examines more results over a longer time period. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Ku-band signal design study [Space Shuttle Orbiter data processing network]
NASA Technical Reports Server (NTRS)
Rubin, I.
1978-01-01
Analytical tools, methods and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described, as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.
Analytical performances of the Diazyme ADA assay on the Cobas® 6000 system.
Delacour, Hervé; Sauvanet, Christophe; Ceppa, Franck; Burnat, Pascal
2010-12-01
To evaluate the analytical performance of the Diazyme ADA assay on the Cobas® 6000 system for pleural fluid sample analysis. Imprecision, linearity, calibration curve stability, interference, and correlation studies were completed. The Diazyme ADA assay demonstrated excellent precision (CV<4%) over the analytical measurement range (0.5-117 U/L). Bilirubin above 50 μmol/L and haemoglobin above 177 μmol/L interfered with the test, inducing a negative and a positive interference, respectively. The Diazyme ADA assay correlated well with the Giusti method (r(2)=0.93) but exhibited a negative bias (~ -30%). The Diazyme ADA assay on the Cobas® 6000 system represents a rapid, accurate, precise and reliable method for the determination of ADA activity in pleural fluid samples. Copyright © 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Thin-Layer Chromatography: The "Eyes" of the Organic Chemist
ERIC Educational Resources Information Center
Dickson, Hamilton; Kittredge, Kevin W.; Sarquis, Arlyne
2004-01-01
Thin-layer chromatography (TLC) methods are successfully used in many areas of research and development such as clinical medicine, forensic chemistry, biochemistry, and pharmaceutical analysis as TLC is relatively inexpensive and has found widespread application as an easy to use, reliable, and quick analytic tool. The usefulness of TLC in organic…
Tornadoes and transmission reliability planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teles, J.E.; Anderson, S.W.; Landgren, G.L.
1980-01-01
The objective of this paper is to introduce an analytical approach for predicting overhead transmission line outages that are caused by tornadoes. The method is presently being used to determine the effects of tornadoes on various right-of-way configurations associated with a generating station project or the supply to a major substation. 2 refs.
Developments in Cylindrical Shell Stability Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Starnes, James H., Jr.
1998-01-01
Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.
USDA-ARS?s Scientific Manuscript database
Immunoassay for low molecular weight food contaminants, such as pesticides, veterinary drugs, and mycotoxins is now a well-established technique which meets the demands for a rapid, reliable, and cost-effective analytical method. However, due to limited understanding of the fundamental aspects of i...
Matuszak, Małgorzata; Minorczyk, Maria; Góralczyk, Katarzyna; Hernik, Agnieszka; Struciński, Paweł; Liszewska, Monika; Czaja, Katarzyna; Korcz, Wojciech; Łyczewska, Monika; Ludwicki, Jan K
2016-01-01
Polybrominated diphenyl ethers (PBDEs), like other persistent organic pollutants such as polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs), pose a significant hazard to human health, mainly due to interference with the endocrine system and carcinogenic effects. Humans are exposed to these substances mainly through food of animal origin. These pollutants are detected globally in human matrices, which calls for a reliable and simple analytical method that would enable further studies assessing the exposure of specific human populations to these compounds. The purpose of this study was to modify and validate an analytical procedure for the simultaneous determination of selected PBDEs, PCBs and OCPs in human blood serum samples. The analytical measurement was performed by GC-µECD following preparation of serum samples (denaturation, multiple extraction, lipid removal). The identity of the compounds was confirmed by GC-MS. The method was characterised by appropriate linearity and good repeatability (CV below 20%). The recoveries ranged from 52.9 to 125.0%, depending on the compound and level of fortification. The limit of quantification was set at 0.03 ng mL(-1) of serum. The modified analytical method proved to be suitable for the simultaneous determination of selected PBDEs, PCBs and OCPs in human blood serum by GC-µECD with good precision.
NASA Astrophysics Data System (ADS)
Lyubimov, V. V.; Kurkina, E. V.
2018-05-01
The authors consider the problem of a dynamic system passing through a low-order resonance, describing an uncontrolled atmospheric descent of an asymmetric nanosatellite in the Earth's atmosphere. The authors perform mathematical and numerical modeling of the motion of the nanosatellite with a small mass-aerodynamic asymmetry relative to the center of mass. The aim of the study is to obtain new reliable approximate analytical estimates of perturbations of the angle of attack of a nanosatellite passing through resonance at angles of attack of not more than 0.5π. By using the stationary phase method, the authors were able to investigate a discontinuous perturbation in the angle of attack of a nanosatellite passing through a resonance with two different nanosatellite designs. Comparison of the results of the numerical modeling and new approximate analytical estimates of the perturbation of the angle of attack confirms the reliability of the said estimates.
NASA Astrophysics Data System (ADS)
Doletskaya, L. I.; Solopov, R. V.; Kavchenkov, V. P.; Andreenkov, E. S.
2017-12-01
The physical features of damage to 10 kV overhead lines under ice and wind loads are examined. Mathematical models for estimating the reliability of the mechanical part of overhead lines are described, using analytical theoretical methods together with corresponding mathematical models that account for the probabilistic nature of ice and wind loads. Calculation results on reliability, specific damage, and average restoration time after emergency outages of 10 kV high-voltage overhead transmission lines with uninsulated and protected wires are presented.
Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A
2017-04-28
There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples: one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin that also contained oligomeric procyanidins (Extract A); the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved using a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5-100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to dp3, trueness of the method was >95% but decreased with increasing dp. Within-laboratory precision showed RSD values <5% and <10% for monomers and oligomers, respectively. Between-laboratory precision was 4-15% (Extract A) and 7-30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Kenneth Paul
Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards; requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
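The 70% rule above can be read as a simple screening check before quantification. In the sketch below, the function names and the suppression definition (fractional loss of the matrix ion signal relative to a blank) are assumptions for illustration, not the authors' exact procedure:

```python
SUPPRESSION_LIMIT = 0.70  # the "70% rule": above this, quantification is unreliable

def matrix_suppression(matrix_ion_blank, matrix_ion_with_analyte):
    """Fractional suppression of the matrix ion signal caused by the sample."""
    return 1.0 - matrix_ion_with_analyte / matrix_ion_blank

def calibration_response(peptide_ion, matrix_ion):
    """Peptide-to-matrix ion abundance ratio, the calibration response
    that accounts for the 'normal' part of matrix suppression."""
    return peptide_ion / matrix_ion

def quantifiable(matrix_ion_blank, matrix_ion_with_analyte):
    """True if suppression is below the limit, so quantification may proceed."""
    return matrix_suppression(matrix_ion_blank, matrix_ion_with_analyte) < SUPPRESSION_LIMIT
```

For example, a sample that halves the matrix ion signal (50% suppression) passes the check, while one that removes 80% of it does not.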
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Zhang, Baohong; Pan, Xiaoping; Venne, Louise; Dunnum, Suzy; McMurry, Scott T; Cobb, George P; Anderson, Todd A
2008-05-30
A reliable, sensitive, and reproducible method was developed for the quantitative determination of nine new-generation pesticides currently used in cotton agriculture. Injector temperature significantly affected analyte response, as indicated by electron capture detector (ECD) chromatograms. A majority of the analytes had an enhanced response at injector temperatures between 240 and 260 degrees C, especially analytes such as acephate that otherwise had a poor response on the ECD. The method detection limits (MDLs) were 0.13, 0.05, 0.29, 0.35, 0.08, 0.10, 0.32, 0.05, and 0.59 ng/mL for acephate, trifluralin, malathion, thiamethoxam, pendimethalin, DEF6, acetamiprid, bifenthrin, and lambda-cyhalothrin, respectively. The method provides good precision (0.17-13.1%), accuracy (recoveries = 88-107%) and reproducibility for the analytes of interest. At relatively high concentrations, only lambda-cyhalothrin was unstable at room temperature (20-25 degrees C) and at 4 degrees C over 10 days. At relatively low concentrations, acephate and acetamiprid were also unstable regardless of temperature. After 10 days of storage at room temperature, 30-40% degradation of lambda-cyhalothrin was observed. It is recommended that acephate, acetamiprid, and lambda-cyhalothrin be stored at -20 degrees C or analyzed immediately after extraction.
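The abstract does not state how the MDLs were derived; a common convention (the US EPA procedure) computes MDL = t × s from replicate low-level spikes, where s is the standard deviation of the replicates and t is the one-tailed 99% Student's t for n−1 degrees of freedom. A hedged sketch with hypothetical replicate data:

```python
from statistics import stdev

# One-tailed 99% Student's t for 6 degrees of freedom (n = 7 replicates)
T_99_DF6 = 3.143

def method_detection_limit(replicate_results, t_value=T_99_DF6):
    """MDL = t * s, where s is the standard deviation of replicate
    low-level spike results analyzed through the whole method."""
    return t_value * stdev(replicate_results)

# Seven hypothetical replicate spike results near the expected limit (ng/mL)
spikes = [0.11, 0.14, 0.12, 0.15, 0.13, 0.12, 0.14]
print(round(method_detection_limit(spikes), 3))
```

If more or fewer replicates are run, the t-value must be changed to match the new degrees of freedom.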
NASA Astrophysics Data System (ADS)
Lin, Xiangyue; Peng, Minli; Lei, Fengming; Tan, Jiangxian; Shi, Huacheng
2017-12-01
Based on the assumptions of uniform corrosion and linear elastic expansion, an analytical model of concrete cracking due to rebar corrosion expansion was established that accounts for the structural internal forces. Then, by means of Muskhelishvili's complex variable function theory and series expansion techniques, the corresponding stress component functions of the concrete around the reinforcement were obtained. A comparative analysis was also conducted between a numerical simulation model and the present model. The results show that the two methods were consistent with each other, with a numerical deviation of less than 10%, indicating that the analytical model established in this paper is reliable.
NASA Astrophysics Data System (ADS)
Ravi, J. T.; Nidhan, S.; Muthu, N.; Maiti, S. K.
2018-02-01
An analytical method for the determination of the dimensions of a longitudinal crack in monolithic beams, based on frequency measurements, has been extended to model L and inverted-T cracks. Such cracks, including longitudinal cracks, arise in beams made of layered isotropic or composite materials. A new formulation for modelling cracks in bi-material beams is presented. Longitudinal crack segment sizes, for L and inverted-T cracks, varying from 2.7% to 13.6% of the length of Euler-Bernoulli beams are considered. Both forward and inverse problems have been examined. In the forward problems, the analytical results are compared with finite element (FE) solutions. In the inverse problems, the accuracy of prediction of crack dimensions is verified using FE results as input for virtual testing. The analytical results show good agreement with the actual crack dimensions. Further, experimental studies have been done to verify the accuracy of the analytical method for prediction of the dimensions of the three types of crack in isotropic and bi-material beams. The results show that the proposed formulation is reliable and can be employed for crack detection in slender beam-like structures in practice.
Bassuoni, M M
2014-03-01
The dehumidifier is a key component in liquid desiccant air-conditioning systems. Analytical solutions have advantages over numerical solutions in studying dehumidifier performance parameters. This paper presents the exit-parameter performance results from an analytical model of an adiabatic cross-flow liquid desiccant air dehumidifier. Calcium chloride is used as the desiccant material in this investigation. A program performing the analytical solution was developed using the Engineering Equation Solver software. Good agreement was found between the analytical solution and reliable experimental results, with maximum deviations of +6.63% and -5.65% in the moisture removal rate. The method developed here can be used for quick prediction of dehumidifier performance. The exit parameters from the dehumidifier are evaluated under the effects of variables such as air temperature and humidity, desiccant temperature and concentration, and air-to-desiccant flow rates. The results show that hot humid air and desiccant concentration have the greatest impact on the performance of the dehumidifier. The moisture removal rate decreases with increasing air inlet temperature and desiccant temperature, while it increases with increasing air-to-solution mass ratio, inlet desiccant concentration, and inlet air humidity ratio.
DOT National Transportation Integrated Search
2012-11-30
The objective of this project was to develop technical relationships between reliability improvement strategies and reliability performance metrics. This project defined reliability, explained the importance of travel time distributions for measuring...
El-Yazbi, Amira F
2017-07-01
Sofosbuvir (SOFO) was approved by the U.S. Food and Drug Administration in 2013 for the treatment of hepatitis C virus infection with enhanced antiviral potency compared with earlier analogs. Notwithstanding, all current editions of the pharmacopeias still do not present any analytical methods for the quantification of SOFO. Thus, rapid, simple, and ecofriendly methods for the routine analysis of commercial formulations of SOFO are desirable. In this study, five accurate methods for the determination of SOFO in pharmaceutical tablets were developed and validated. These methods include HPLC, capillary zone electrophoresis, HPTLC, and UV spectrophotometric and derivative spectrometry methods. The proposed methods proved to be rapid, simple, sensitive, selective, and accurate analytical procedures that were suitable for the reliable determination of SOFO in pharmaceutical tablets. An analysis of variance test with P-value > 0.05 confirmed that there were no significant differences between the proposed assays. Thus, any of these methods can be used for the routine analysis of SOFO in commercial tablets.
RSE-40: An Alternate Scoring System for the Rosenberg Self-Esteem Scale (RSE).
ERIC Educational Resources Information Center
Wallace, Gaylen R.
The Rosenberg Self-Esteem Inventory (RSE) is a 10-item scale purporting to measure self-esteem using self-acceptance and self-worth statements. This analysis covers concerns about the degree to which the RSE items represent a particular content universe, the RSE's applicability, factor analytic methods used, and the RSE's reliability and validity.…
Reviews of Single Subject Research Designs: Applications to Special Education and School Psychology
ERIC Educational Resources Information Center
Nevin, Ann I., Ed.
2004-01-01
The authors of this collection of research reviews studied how single subject research designs might be a useful method to apply as part of being accountable to clients. The single subject research studies were evaluated in accordance with the following criteria: Was the study applied, behavioral, reliable, analytic, effective, and generalizable?…
Reliability Generalization (RG) Analysis: The Test Is Not Reliable
ERIC Educational Resources Information Center
Warne, Russell
2008-01-01
Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…
Kim, Hyeryun; Kosinski, Penelope; Kung, Charles; Dang, Lenny; Chen, Yue; Yang, Hua; Chen, Yuan-Shek; Kramer, Jordyn; Liu, Guowen
2017-09-01
Many hemolytic anemias result in major metabolic abnormalities; two common metabolite abnormalities are increased levels of 2,3-diphosphoglycerate (2,3-DPG) and decreased levels of adenosine triphosphate (ATP). To better monitor the concentration changes of these metabolites, the development of a reliable LC-MS/MS method to quantitatively profile the concentrations of 2,3-DPG and ATP in whole blood is essential to understand the effects of investigational therapeutics. Accurate quantification of both compounds poses great challenges to bioanalytical scientists due to their polar, ionic and endogenous nature. Here we present an LC-MS/MS method for the reliable simultaneous quantification of 2,3-DPG and ATP from K2EDTA human whole blood (WB). Whole blood samples were spiked with stable isotope-labeled internal standards, processed by protein precipitation extraction, and analyzed using zwitterionic ion chromatography-hydrophilic interaction chromatography (ZIC-HILIC) coupled with tandem mass spectrometry. The linear analytical range of the assay was 50-3000 μg/mL. The fit-for-purpose method demonstrated excellent accuracy and precision. The overall accuracy was within ±10.5% (%RE) for both analytes, and the intra- and inter-assay precision (%CV) was less than 6.7% and 6.2%, respectively, for both analytes. ATP and 2,3-DPG were found to be stable in human K2EDTA blood for at least 8 h at 4°C, for 96 days when stored at -70°C, and after three freeze/thaw cycles. The assay has been successfully applied to K2EDTA human whole blood samples to support clinical studies. Copyright © 2017 Elsevier B.V. All rights reserved.
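The %CV and %RE figures quoted above follow standard definitions: the coefficient of variation relative to the replicate mean, and the relative error of that mean versus the nominal concentration. A minimal sketch with hypothetical QC replicates (the values are illustrative, not from the study):

```python
from statistics import mean, stdev

def percent_cv(values):
    """Precision: coefficient of variation in percent."""
    return 100.0 * stdev(values) / mean(values)

def percent_re(values, nominal):
    """Accuracy: relative error (%RE) of the replicate mean vs. nominal."""
    return 100.0 * (mean(values) - nominal) / nominal

# Hypothetical QC replicates at a 500 ug/mL nominal level
qc = [510, 495, 505, 490, 500]
print(round(percent_cv(qc), 2), round(percent_re(qc, 500), 2))
```

Acceptance criteria like those in the abstract (precision below ~6.7% CV, accuracy within ±10.5% RE) would be checked against these two numbers for each QC level.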
Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín
2017-01-01
Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed among others is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.
Chebrolu, Kranthi K; Yousef, Gad G; Park, Ryan; Tanimura, Yoshinori; Brown, Allan F
2015-09-15
A high-throughput, robust and reliable method for the simultaneous analysis of five carotenoids, four chlorophylls and one tocopherol was developed for rapidly screening large sample populations to facilitate molecular biology and plant breeding. Separation was achieved for 10 known analytes and four unknown carotenoids in a significantly reduced run time of 10 min. The identity of the 10 analytes was confirmed by their UV-Vis absorption spectra. Quantification of tocopherol, carotenoids and chlorophylls was performed at 290 nm, 460 nm and 650 nm, respectively. In this report, two sub-2-μm particle core-shell columns, Kinetex from Phenomenex (1.7 μm particle size, 12% carbon load) and Cortecs from Waters (1.6 μm particle size, 6.6% carbon load), were investigated and their separation efficiencies were evaluated. The peak resolutions were >1.5 for all analytes except chlorophyll-a' with the Cortecs column. The ruggedness of the method was evaluated on two identical but separate instruments, which produced CVs below 2 in peak retention times for nine out of 10 analytes separated. Copyright © 2015 Elsevier B.V. All rights reserved.
Suitability of analytical methods to measure solubility for the purpose of nanoregulation.
Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike
2016-01-01
Solubility is an important physicochemical parameter in nanoregulation. If nanomaterial is completely soluble, then from a risk assessment point of view, its disposal can be treated much in the same way as "ordinary" chemicals, which will simplify testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates the performance against a set of analytical criteria (based on satisfying the requirements as governed by the cosmetic regulation as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, to comprise an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a huge research gap remains, related with the need to ensure data reliability.
Targeted methods for quantitative analysis of protein glycosylation
Goldman, Radoslav; Sanda, Miloslav
2018-01-01
Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218
A Lean Six Sigma approach to the improvement of the selenium analysis method.
Cloete, Bronwyn C; Bester, André
2012-11-02
Reliable results represent the pinnacle of quality assessment in an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of this trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of the scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features.
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline, and a standardised approach to problem solving and process optimisation.
New analytical exact solutions of time fractional KdV-KZK equation by Kudryashov methods
NASA Astrophysics Data System (ADS)
Saha Ray, S.
2016-04-01
In this paper, new exact solutions of the time fractional KdV-Khokhlov-Zabolotskaya-Kuznetsov (KdV-KZK) equation are obtained by the classical Kudryashov method and the modified Kudryashov method. For this purpose, the modified Riemann-Liouville derivative is used to convert the nonlinear time fractional KdV-KZK equation into a nonlinear ordinary differential equation. In the present analysis, the classical and modified Kudryashov methods are applied successively to compute the analytical solutions of the time fractional KdV-KZK equation. As a result, new exact solutions involving the symmetrical Fibonacci function, hyperbolic function and exponential function are obtained for the first time. The methods under consideration are reliable and efficient, and can be used as an alternative for establishing new exact solutions of different types of fractional differential equations arising in mathematical physics. The obtained results are exhibited graphically in order to demonstrate the efficiency and applicability of these proposed methods for solving the nonlinear time fractional KdV-KZK equation.
Cao, Xiaoqin; Li, Xiaofei; Li, Jian; Niu, Yunhui; Shi, Lu; Fang, Zhenfeng; Zhang, Tao; Ding, Hong
2018-01-15
A sensitive and reliable multi-mycotoxin-based method was developed to identify and quantify several carcinogenic mycotoxins in human blood and urine, as well as in edible animal tissues, including muscle and liver tissue from swine and chickens, using liquid chromatography-tandem mass spectrometry (LC-MS/MS). For the toxicokinetic studies with individual mycotoxins, highly sensitive analyte-specific LC-MS/MS methods were developed for rat plasma and urine. Sample purification consisted of a rapid 'dilute and shoot' approach for urine samples, a simple 'dilute, evaporate and shoot' approach for plasma samples and a 'QuEChERS' procedure for edible animal tissues. The multi-mycotoxin and analyte-specific methods were validated in-house: the limits of detection (LOD) ranged from 0.02 to 0.41 μg/kg (μg/L) and from 0.01 to 0.19 μg/L, respectively, and the limits of quantification (LOQ) from 0.10 to 1.02 μg/kg (μg/L) and from 0.09 to 0.47 μg/L, respectively. Apparent recoveries of samples spiked with 0.25 to 4 μg/kg (μg/L) ranged from 60.1% to 109.8%, with relative standard deviations below 15%. The methods were successfully applied to real samples. To the best of our knowledge, this is the first study to assess exposure to carcinogenic mycotoxins using biomarkers in a small group of patients with hepatocellular carcinoma from the Chinese population. Finally, the multi-mycotoxin method is a useful analytical tool for assessing exposure to mycotoxins in edible animal tissues. The analyte-specific methods could be useful during toxicokinetic and toxicological studies. Copyright © 2017. Published by Elsevier B.V.
Effects of viscosity on shock-induced damping of an initial sinusoidal disturbance
NASA Astrophysics Data System (ADS)
Ma, Xiaojuan; Liu, Fusheng; Jing, Fuqian
2010-05-01
For several decades, the lack of a reliable data treatment method has been the bottleneck of viscosity measurement by the disturbance-amplitude damping method of shock waves. In this work the finite difference method is first applied to obtain numerical solutions for the disturbance-amplitude damping behavior of a sinusoidal shock front in inviscid and viscous flow. When water shocked to 15 GPa is taken as an example, the main results are as follows: (1) For inviscid and lower-viscosity flows the numerical method gives results in good agreement with the analytic solutions under the condition of small disturbance (a0/λ = 0.02); (2) For flows with viscosity beyond 200 Pa s (η = κ) the analytic solution is found to clearly overestimate the effects of viscosity, which is attributed to the unrealistic pre-conditions of the analytic solution by Miller and Ahrens; (3) The present numerical method provides an effective tool, with more confidence, to overcome the bottleneck of data treatment when the effects of higher viscosity in the experiments of Sakharov and flyer impact are to be analyzed, because it can in principle simulate the development of shock waves in flows with larger disturbance amplitude, higher viscosity, and complicated initial flow.
Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna
2018-06-05
Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for the quantitation of the matrix effect. The CVs(%) of internal-standard-normalized matrix factors recommended by the European Medicines Agency were evaluated against internal-standard-normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences between the results of the two calculation methods. After normalization with the internal standard, the CV(%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative in the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
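The matrix factor comparison described in this record is easy to reproduce numerically. Below is a minimal Python sketch of the EMA-style calculation: per-lot matrix factors, internal-standard normalization, and the CV(%) used as the acceptance metric. All peak areas in the usage example are invented illustrative numbers, not data from the study.

```python
import numpy as np

def is_normalized_matrix_factors(analyte_matrix, analyte_neat, is_matrix, is_neat):
    """IS-normalized matrix factors, one per post-extraction spiked matrix lot.

    MF = peak area in post-extraction spiked matrix / mean peak area in neat
    solution; normalization divides the analyte MF by the internal-standard
    MF obtained for the same lot.
    """
    mf_analyte = np.asarray(analyte_matrix, float) / np.mean(analyte_neat)
    mf_is = np.asarray(is_matrix, float) / np.mean(is_neat)
    return mf_analyte / mf_is

def cv_percent(values):
    """Coefficient of variation in %, the acceptance metric (EMA: <= 15%)."""
    v = np.asarray(values, float)
    return 100.0 * v.std(ddof=1) / v.mean()
```

For six hypothetical matrix lots, the CV(%) of the six normalized matrix factors is what gets compared against the 15% acceptance limit.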
NASA Astrophysics Data System (ADS)
Gao, Chen; Ding, Zhongan; Deng, Bofa; Yan, Shengteng
2017-10-01
Based on the characteristics of the electric energy data acquisition system (EEDAS), and considering the availability of each index and the connections among the indices, a performance evaluation index system for the electric energy data acquisition system is established from three aspects: master station system, communication channel, and terminal equipment. The comprehensive weight of each index is determined by a triangular fuzzy number analytic hierarchy process combined with the entropy weight method, so that both subjective preference and objective attributes are taken into consideration, making the comprehensive performance evaluation more reasonable and reliable. An example analysis shows that the comprehensive index evaluation system, established by combining the analytic hierarchy process (AHP) and triangular fuzzy numbers (TFN) with the entropy method, yields evaluation results that are not only convenient and practical but also more objective and accurate.
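The entropy-weight half of the scheme described above is a standard, fully specified computation. The sketch below derives objective entropy weights from an index matrix and blends them with a subjective weight vector; the index values, the subjective weights, and the 50/50 blending factor are assumptions for illustration, and the TFN-AHP side is represented only by its output vector.

```python
import numpy as np

def entropy_weights(X):
    """Objective index weights via the entropy weight method.

    X: (m alternatives) x (n indices) matrix of benefit-type index values.
    Indices whose values vary more across alternatives carry more
    information (lower entropy) and therefore receive higher weight.
    """
    X = np.asarray(X, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0)                 # column-wise probability distribution
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)  # treat 0*log(0) as 0
    e = -(P * logs).sum(axis=0) / np.log(m)     # normalized entropy per index
    d = 1.0 - e                                  # degree of divergence
    return d / d.sum()

def combine_weights(subjective, objective, alpha=0.5):
    """Blend AHP-style subjective weights with entropy-based objective weights."""
    w = alpha * np.asarray(subjective, float) + (1 - alpha) * np.asarray(objective, float)
    return w / w.sum()
```

The more an index's values spread across alternatives, the lower its entropy and the larger its objective weight, which is exactly the "objective attribute" the abstract refers to.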
Determination of tocopherols and sitosterols in seeds and nuts by QuEChERS-liquid chromatography.
Delgado-Zamarreño, M Milagros; Fernández-Prieto, Cristina; Bustamante-Rangel, Myriam; Pérez-Martín, Lara
2016-02-01
In the present work a simple, reliable and affordable sample treatment method for the simultaneous analysis of tocopherols and free phytosterols in nuts was developed. Analyte extraction was carried out using the QuEChERS methodology, and analyte separation and detection were accomplished using HPLC-DAD. The use of this methodology for the extraction of naturally occurring substances provides advantages such as speed, simplicity and ease of use. The parameters evaluated for validation of the developed method included the linearity of the calibration plots, the detection and quantification limits, repeatability, reproducibility and recovery. The proposed method was successfully applied to the analysis of tocopherols and free phytosterols in samples of almonds, cashew nuts, hazelnuts, peanuts, tiger nuts, sunflower seeds and pistachios. Copyright © 2015 Elsevier Ltd. All rights reserved.
Graphite nanocomposites sensor for multiplex detection of antioxidants in food.
Ng, Khan Loon; Tan, Guan Huat; Khor, Sook Mei
2017-12-15
Butylated hydroxyanisole (BHA), butylated hydroxytoluene (BHT), and tert-butylhydroquinone (TBHQ) are synthetic antioxidants used in the food industry. Herein, we describe the development of a novel graphite nanocomposite-based electrochemical sensor for the multiplex detection and measurement of BHA, BHT, and TBHQ levels in complex food samples using a linear sweep voltammetry (LSV) technique. Moreover, our newly established analytical method exhibited good sensitivity, limit of detection, limit of quantitation, and selectivity. The accuracy and reliability of the analytical results were verified by method validation and by comparison with the results of a liquid chromatography method, with which a linear correlation of more than 0.99 was achieved. The addition of sodium dodecyl sulfate as a supporting additive further enhanced the LSV response (anodic peak current, Ipa) of BHA and BHT by 2- and 20-fold, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl
2018-08-15
The work here presented aimed at developing an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng/L in sewage and 10 ng/g in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng/L and 900 ng/g, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulations requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.
openECA Platform and Analytics Alpha Test Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
openECA Platform and Analytics Beta Demonstration Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
van de Water, A T M; Benjamin, D R
2016-02-01
Systematic literature review. Diastasis of the rectus abdominis muscle (DRAM) has been linked with low back pain, abdominal and pelvic dysfunction. Measurement is used either to screen for or to monitor DRAM width. Determining which methods are suitable for screening and monitoring DRAM is of clinical value. To identify the best methods to screen for DRAM presence and monitor DRAM width. AMED, Embase, Medline, PubMed and CINAHL databases were searched for measurement property studies of DRAM measurement methods. Population characteristics, measurement methods/procedures and measurement information were extracted from included studies. Quality of all studies was evaluated using 'quality rating criteria'. When possible, reliability generalisation was conducted to provide combined reliability estimations. Thirteen studies evaluated measurement properties of the 'finger width'-method, tape measure, calipers, ultrasound, CT and MRI. Ultrasound was most evaluated. Methodological quality of these studies varied widely. Pearson's correlations of r = 0.66-0.79 were found between calipers and ultrasound measurements. Calipers and ultrasound had Intraclass Correlation Coefficients (ICC) of 0.78-0.97 for test-retest, inter- and intra-rater reliability. The 'finger width'-method had weighted Kappa's of 0.73-0.77 for test-retest reliability, but moderate agreement (63%; weighted Kappa = 0.53) between raters. Comparing calipers and ultrasound, low measurement error was found (above the umbilicus), and the methods had good agreement (83%; weighted Kappa = 0.66) for discriminative purposes. The available information supports ultrasound and calipers as adequate methods to assess DRAM. For other methods limited measurement information of low to moderate quality is available, and further evaluation of their measurement properties is required. Copyright © 2015 Elsevier Ltd. All rights reserved.
Evaluation of Analytical Modeling Functions for the Phonation Onset Process.
Petermann, Simon; Kniesburges, Stefan; Ziethe, Anke; Schützenberger, Anne; Döllinger, Michael
2016-01-01
The human voice originates from oscillations of the vocal folds in the larynx. The duration of the voice onset (VO), called the voice onset time (VOT), is currently under investigation as a clinical indicator for correct laryngeal functionality. Different analytical approaches for computing the VOT based on endoscopic imaging were compared to determine the most reliable method to quantify automatically the transient vocal fold oscillations during VO. Transnasal endoscopic imaging in combination with a high-speed camera (8000 fps) was applied to visualize the phonation onset process. Two different definitions of VO interval were investigated. Six analytical functions were tested that approximate the envelope of the filtered or unfiltered glottal area waveform (GAW) during phonation onset. A total of 126 recordings from nine healthy males and 210 recordings from 15 healthy females were evaluated. Three criteria were analyzed to determine the most appropriate computation approach: (1) reliability of the fit function for a correct approximation of VO; (2) consistency represented by the standard deviation of VOT; and (3) accuracy of the approximation of VO. The results suggest the computation of VOT by a fourth-order polynomial approximation in the interval between 32.2 and 67.8% of the saturation amplitude of the filtered GAW.
NASA Astrophysics Data System (ADS)
Pietropolli Charmet, Andrea; Cornaton, Yann
2018-05-01
This work presents an investigation of the theoretical predictions yielded by anharmonic force fields in which the cubic and quartic force constants are computed analytically by means of density functional theory (DFT), using the recursive scheme developed by M. Ringholm et al. (J. Comput. Chem. 35 (2014) 622). Different functionals (namely B3LYP, PBE, PBE0 and PW86x) and basis sets were used for calculating the anharmonic vibrational spectra of two halomethanes. The benchmark analysis carried out demonstrates the reliability and overall good performance offered by hybrid approaches, in which harmonic data obtained at the coupled-cluster singles and doubles level of theory augmented by a perturbative estimate of the effects of connected triple excitations, CCSD(T), are combined with the fully analytic higher-order force constants yielded by DFT functionals. These methods lead to reliable and computationally affordable calculations of anharmonic vibrational spectra with an accuracy comparable to that of hybrid force fields whose anharmonic parts are computed at the second-order Møller-Plesset perturbation theory (MP2) level using numerical differentiation, but without the corresponding issues related to computational cost and numerical error.
Krasnoshchekov, Sergey V; Isayeva, Elena V; Stepanov, Nikolay F
2012-04-12
Anharmonic vibrational states of semirigid polyatomic molecules are often studied using the second-order vibrational perturbation theory (VPT2). For efficient higher-order analysis, an approach based on the canonical Van Vleck perturbation theory (CVPT), the Watson Hamiltonian and operators of creation and annihilation of vibrational quanta is employed. This method allows analysis of the convergence of perturbation theory and solves a number of theoretical problems of VPT2, e.g., it yields anharmonic constants y(ijk), z(ijkl), and allows the reliable evaluation of vibrational IR and Raman anharmonic intensities in the presence of resonances. Darling-Dennison and higher-order resonance coupling coefficients can be reliably evaluated as well. The method is illustrated on classic molecules: water and formaldehyde. A number of theoretical conclusions result, including the necessity of using a sextic force field in the fourth order (CVPT4) and the nearly vanishing CVPT4 contributions for bending and wagging modes. The coefficients of perturbative Dunham-type Hamiltonians in high orders of CVPT are found to conform to the rules of equality at different orders, as earlier proven analytically for diatomic molecules. The method can serve as a good substitute for the more traditional VPT2.
NASA Astrophysics Data System (ADS)
Liu, Boshi; Huang, Renliang; Yu, Yanjun; Su, Rongxin; Qi, Wei; He, Zhimin
2018-04-01
Ochratoxin A (OTA) is a mycotoxin produced by the metabolism of Aspergillus and Penicillium, and is extremely toxic to humans, livestock, and poultry. However, traditional assays for the detection of OTA are expensive and complicated. Besides the OTA aptamer, OTA itself at high concentrations can also adsorb on the surface of gold nanoparticles (AuNPs) and further inhibit AuNP salt-induced aggregation. We herein report a new OTA assay that applies the localized surface plasmon resonance effect of AuNPs and their aggregates. Because the result obtained from a single linear calibration curve is not reliable, we developed a “double calibration curve” method to address this issue and widen the OTA detection range. A number of other analytes were also examined, and the structural properties of analytes that bind to the AuNPs were further discussed. We found that various considerations must be taken into account in the detection of these analytes when applying AuNP aggregation-based methods, owing to their different binding strengths.
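The "double calibration curve" idea, two linear fits covering the low and high response ranges with the measured response deciding which fit applies, can be sketched as follows. All concentrations, responses, and the switch point are invented illustrative numbers, not the OTA data from the study.

```python
import numpy as np

def fit_line(conc, resp):
    """Least-squares slope and intercept for one linear calibration segment."""
    slope, intercept = np.polyfit(conc, resp, 1)
    return slope, intercept

def double_calibration(response, low_cal, high_cal, switch_response):
    """Concentration from a 'double calibration curve'.

    low_cal / high_cal are (concentrations, responses) pairs fitted to two
    separate linear ranges; the measured response decides which curve applies.
    """
    cal = low_cal if response <= switch_response else high_cal
    slope, intercept = fit_line(*cal)
    return (response - intercept) / slope
```

Splitting the range this way keeps each segment within its own linear regime, which is how the approach widens the overall detection range without relying on a single fit.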
Zhang, Yan-zhen; Zhou, Yan-chun; Liu, Li; Zhu, Yan
2007-01-01
Simple, reliable and sensitive analytical methods to determine the anticariogenic agents, preservatives, and artificial sweeteners contained in commercial gargles are necessary for evaluating their effectiveness, safety, and quality. An ion chromatography (IC) method is described for the simultaneous analysis of eight anions, including fluoride, chloride, sulfate, phosphate, monofluorophosphate, glycerophosphate (anticariogenic agents), sorbate (a preservative), and saccharin (an artificial sweetener), in gargles. In this IC system, we applied gradient elution with a KOH mobile phase, separation on IonPac AS18 columns, and suppressed conductivity detection. The optimized analytical conditions were further evaluated for accuracy. The relative standard deviations (RSDs) of intra-day retention time and peak area for all species were less than 0.938% and 8.731%, respectively, while the RSDs of 5-day retention time and peak area were less than 1.265% and 8.934%, respectively. The correlation coefficients for the targeted analytes ranged from 0.999 7 to 1.000 0. Spiked recoveries for the anions were 90%~102.5%. We concluded that the method can be applied for comprehensive evaluation of commercial gargles. PMID:17610331
Adhikari, Puspa L; Wong, Roberto L; Overton, Edward B
2017-10-01
Accurate characterization of petroleum hydrocarbons in complex and weathered oil residues is analytically challenging. This is primarily due to the chemical compositional complexity of both the oil residues and environmental matrices, and the lack of instrumental selectivity caused by co-elution of interferences with the target analytes. To overcome these analytical selectivity issues, we used enhanced resolution gas chromatography coupled with triple quadrupole mass spectrometry in Multiple Reaction Monitoring (MRM) mode (GC/MS/MS-MRM) to eliminate interferences within the ion chromatograms of target analytes found in environmental samples. This new GC/MS/MS-MRM method was developed and used for forensic fingerprinting of deep-water and marsh sediment samples containing oily residues from the Deepwater Horizon oil spill. The results showed that the GC/MS/MS-MRM method increases selectivity, eliminates interferences, and provides more accurate quantitation and characterization of trace levels of alkyl-PAHs and biomarker compounds from weathered oil residues in complex sample matrices. The higher selectivity of the new method, even at low detection limits, provides greater insights into isomer and homolog compositional patterns and the extent of oil weathering under various environmental conditions. The method also provides flat chromatographic baselines for accurate and unambiguous calculation of petroleum forensic biomarker compound ratios. Thus, this GC/MS/MS-MRM method can be a reliable analytical strategy for more accurate and selective trace-level analyses in petroleum forensic studies, and for tracking the continuous weathering of oil residues. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reliability and maintainability assessment factors for reliable fault-tolerant systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1984-01-01
A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Presented are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, which are peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.
Toward automatic finite element analysis
NASA Technical Reports Server (NTRS)
Kela, Ajay; Perucchio, Renato; Voelcker, Herbert
1987-01-01
Two problems must be solved if the finite element method is to become a reliable and affordable blackbox engineering tool. Finite element meshes must be generated automatically from computer aided design databases and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.
ERIC Educational Resources Information Center
Alqahtani, Abdulmuhsen Ayedh; Almutairi, Yousef B.
2013-01-01
The purpose of the current study is to examine, in retrospect, trainees' perceptions of the reasons some of their peers dropped out of the vocational education at the Industrial Institute-Shuwaikh (IIS), Kuwait. Using the descriptive-analytical method, a reliable questionnaire was developed to achieve this purpose. Results show that: (a) the…
Budnik, Lygia T; Fahrenholtz, Svea; Kloth, Stefan; Baur, Xaver
2010-04-01
Protection against infestation of a container cargo by alien species is achieved by mandatory fumigation with pesticides. Most of the effective fumigants are methyl and ethyl halide gases that are highly toxic and are a risk to both human health and the environment. There is a worldwide need for a reliable and robust analytical screening procedure for these volatile chemicals in a multitude of health and environmental scenarios. We have established a highly sensitive broad spectrum mass spectrometry method combined with thermal desorption gas chromatography to detect, identify and quantify volatile pesticide residues. Using this method, 1201 random ambient air samples taken from freight containers arriving at the biggest European ports of Hamburg and Rotterdam were analyzed over a period of two and a half years. This analytical procedure is a valuable strategy to measure air pollution from these hazardous chemicals, to help in the identification of pesticides in the new mixtures/formulations that are being adopted globally and to analyze expired breath samples after suspected intoxication in biomonitoring.
Control Chart on Semi Analytical Weighting
NASA Astrophysics Data System (ADS)
Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.
2018-03-01
Semi-analytical balance verification aims to assess balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, 2 weight standards were monitored before any balance operation. This work intended to evaluate whether any significant difference or bias was present in the weighing procedure over time, to check the reliability of the generated data. This work also exemplifies how control intervals are established.
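A control chart of this kind is straightforward to reproduce. The sketch below computes individuals-chart control limits from check-weight readings, estimating sigma from the average moving range (MR-bar / d2, with d2 = 1.128 for subgroups of two), and flags out-of-control weighings. The readings used in the example are invented, not the WCPVL balance data.

```python
import numpy as np

def shewhart_limits(check_weights):
    """Individuals control chart limits for balance check-weight readings.

    Center line is the mean; control limits are mean +/- 3 sigma, with sigma
    estimated from the average moving range (the usual individuals-chart
    estimator, MR-bar / d2 where d2 = 1.128 for subgroup size 2).
    """
    x = np.asarray(check_weights, dtype=float)
    center = x.mean()
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range
    sigma = mr_bar / 1.128               # d2 constant for n = 2
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(check_weights):
    """Return (index, value) pairs of readings outside the 3-sigma limits."""
    lcl, _, ucl = shewhart_limits(check_weights)
    x = np.asarray(check_weights, dtype=float)
    return [(i, v) for i, v in enumerate(x) if v < lcl or v > ucl]
```

Plotting the readings against these limits over time gives exactly the kind of dispersion-through-time graph the record describes for verifying the weighing procedure.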
Zhang, Jiawen; He, Shaohui; Wang, Dahai; Liu, Yangpeng; Yao, Wenbo; Liu, Xiabing
2018-01-01
Based on the operating Chegongzhuang heat-supplying tunnel in Beijing, the reliability of its lining structure under the action of large thrust and thermal effects is studied. According to the service characteristics of a heat-supplying tunnel, a three-dimensional numerical analysis model was established based on mechanical tests of in-situ specimens. The stress and strain of the tunnel structure were obtained before and after the operation. Comparison with field monitoring data verified the rationality of the model. After extracting the internal forces of the lining structure, an improved subset simulation method was proposed, with the internal force as the performance function, to calculate the reliability of the main control sections of the tunnel. In contrast to the traditional calculation method, the analytic relationship between the sample numbers of the subset simulation method and the Monte Carlo method was given. The results indicate that the lining structure is greatly influenced by coupling in the range of six meters from the fixed brackets, especially the tunnel floor. The improved subset simulation method can greatly save computation time and improve computational efficiency while ensuring the accuracy of calculation. It is suitable for reliability calculation in tunnel engineering, because “the lower the probability, the more efficient the calculation.” PMID:29401691
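Subset simulation itself is a generic rare-event estimator, and its efficiency claim ("the lower the probability, the more efficient") is easy to see in a minimal standard-normal-space sketch. The performance function, sample sizes, conditional probability p0, and proposal step below are illustrative choices, not the tunnel model from the study.

```python
import numpy as np

def subset_simulation(g, dim, n_samples=1000, p0=0.1, max_levels=10, seed=0):
    """Estimate a small failure probability P[g(X) <= 0] for X ~ N(0, I).

    Each level keeps the p0*n_samples samples with the lowest g-values as
    seeds and regrows the population with a component-wise Metropolis walk
    conditioned on the current intermediate threshold b, so the estimate is
    a product of conditional probabilities p0 * ... * P[g <= 0 | g <= b].
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, dim))
    gx = np.array([g(xi) for xi in x])
    n_keep = int(p0 * n_samples)
    p_f = 1.0
    for _ in range(max_levels):
        order = np.argsort(gx)
        x, gx = x[order], gx[order]
        b = gx[n_keep - 1]                    # intermediate threshold
        if b <= 0:                            # failure domain reached
            return p_f * np.mean(gx <= 0)
        p_f *= p0
        chains = n_samples // n_keep          # new samples per seed
        new_x, new_g = [], []
        for xi, gi in zip(x[:n_keep], gx[:n_keep]):
            cx, cg = xi.copy(), gi
            for _ in range(chains):
                cand = cx + 0.5 * rng.standard_normal(dim)
                # Metropolis acceptance for a standard-normal target
                if rng.random() < np.exp(0.5 * (cx @ cx - cand @ cand)):
                    gc = g(cand)
                    if gc <= b:               # stay inside current domain
                        cx, cg = cand, gc
                new_x.append(cx.copy())
                new_g.append(cg)
        x, gx = np.array(new_x), np.array(new_g)
    return p_f * np.mean(gx <= 0)
```

For a linear limit state g(x) = 2.5 - x1 the true failure probability is about 6.2e-3; plain Monte Carlo would need orders of magnitude more g-evaluations for the same accuracy at much smaller probabilities, which is the trade-off the abstract's closing remark refers to.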
Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders
Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu
2014-01-01
Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and can be used for quantitative assessment of symptoms due to various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders, including motor and nonmotor disorders such as Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) in vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess further neurological or psychiatric disorders using actigraphy records. PMID:25214709
Plate and butt-weld stresses beyond elastic limit, material and structural modeling
NASA Technical Reports Server (NTRS)
Verderaime, V.
1991-01-01
Ultimate safety factors of high-performance structures depend on stress behavior beyond the elastic limit, a region not well understood. An analytical modeling approach was developed to gain fundamental insights into the inelastic responses of simple structural elements. Nonlinear material properties were expressed in engineering stress and strain variables and combined with strength-of-materials stress and strain equations, similar to the numerical piece-wise linear method. Integrations are continuous, which allows for more detailed solutions. Interesting results include the classical combined axial tension and bending load model and the conversion of strain gauge readings to stress beyond the elastic limit. Material discontinuity stress factors in butt-welds were derived. This is a working-type document with analytical methods and results applicable to all industries requiring high-reliability structures.
Hu, J-Y; Deng, Z-B; Qin, D-M
2009-12-01
JS-118 is a diacylhydrazine-type insect growth regulator now used extensively in China. An analytical method for the determination of JS-118 residues in cabbage and soil samples by high-performance liquid chromatography with DAD detection was established and optimized. A primary secondary amine (PSA) solid-phase extraction cartridge was used for sample preparation. Mean recoveries for the analyte ranged from 96.6% to 107.0%, with CV values less than 4.7%. The limit of quantification was 0.01 mg/kg. Direct confirmation of JS-118 residues in samples was achieved by high-performance liquid chromatography-mass spectrometry. The proposed method is simple, rapid, and reliable to perform and could be used for monitoring pesticide residues.
Perez-Rea, Daysi; Zielke, Claudia; Nilsson, Lars
2017-07-14
Starch, and hence amylopectin, is an important biomacromolecule both in the human diet and in technical applications. Therefore, accurate and reliable analytical methods for its characterization are needed. A suitable method for analyzing macromolecules with ultra-high molar mass, branched structure, and high polydispersity is asymmetric flow field-flow fractionation (AF4) in combination with multiangle light scattering (MALS) detection. In this paper we illustrate how co-elution of low quantities of very large analytes in AF4 may cause disturbances in the MALS data which, in turn, cause an overestimation of the size. Furthermore, it is shown how pre-injection filtering of the sample can improve the results. Copyright © 2017 Elsevier B.V. All rights reserved.
Analysis of multiple mycotoxins in food.
Hajslova, Jana; Zachariasova, Milena; Cajka, Tomas
2011-01-01
Mycotoxins are secondary metabolites of microscopic filamentous fungi. Given the widespread distribution of fungi in the environment, mycotoxins are considered to be among the most important natural contaminants in foods and feeds. To protect consumers' health and reduce economic losses, surveillance and control of mycotoxins in food and feed have become a major objective for producers, regulatory authorities, and researchers worldwide. In this context, the availability of reliable analytical methods applicable for this purpose is essential. Since the variety of chemical structures of mycotoxins makes it impossible to use one single technique for their analysis, a vast number of analytical methods have been developed and validated. Both the large variability of food matrices and growing demands for fast, cost-saving, and accurate determination of multiple mycotoxins by a single method pose new challenges for analytical research. This effort is facilitated by technical developments in mass spectrometry that reduce the influence of matrix effects even when the sample clean-up step is omitted. The current state of the art, together with future trends, is presented in this chapter. Attention is focused mainly on instrumental methods; advances in biosensors and other bioanalytical screening approaches enabling analysis of multiple mycotoxins are not discussed in detail.
Farajzadeh, Mir Ali; Bamorowat, Mahdi; Mogaddam, Mohammad Reza Afshar
2016-11-01
An efficient, reliable, sensitive, rapid, and green analytical method for the extraction and determination of neonicotinoid insecticides in aqueous samples has been developed using ionic liquid phase microextraction coupled with high-performance liquid chromatography with diode array detection. In this method, a few microliters of 1-hexyl-3-methylimidazolium hexafluorophosphate (as an extractant) are added onto a ringer tablet, which is transferred into a conical test tube containing the aqueous phase of the analytes. On manual shaking, the tablet dissolves and the extractant is released into the aqueous phase as very tiny droplets, producing a cloudy solution. After centrifugation, the analytes extracted into the ionic liquid are collected at the bottom of the conical test tube. Under the optimum extraction conditions, the method showed low limits of detection and quantification, between 0.12 and 0.33 ng mL(-1) and between 0.41 and 1.11 ng mL(-1), respectively. Extraction recoveries ranged from 66% to 84% and enrichment factors from 655 to 843, respectively. Finally, different aqueous samples were successfully analyzed using the proposed method. Copyright © 2016 Elsevier B.V. All rights reserved.
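The relation between the enrichment factor and the extraction recovery quoted above is the standard one for dispersive microextraction. A minimal sketch in Python, with hypothetical phase volumes (the abstract does not state them):

```python
def enrichment_factor(c_sed, c_0):
    """Enrichment factor: analyte concentration in the settled
    (ionic-liquid) phase divided by its initial aqueous concentration."""
    return c_sed / c_0

def extraction_recovery(ef, v_sed, v_aq):
    """Extraction recovery (%) relates EF to the phase-volume ratio."""
    return ef * (v_sed / v_aq) * 100.0

# Illustrative (hypothetical) numbers: a 10 uL settled phase collected
# from 10 mL of sample and an enrichment factor of 700 give
er = extraction_recovery(700, v_sed=0.010, v_aq=10.0)
print(round(er, 1))  # -> 70.0
```

With a settled-to-aqueous volume ratio of about 1:1000, enrichment factors of 655-843 correspond to recoveries of roughly 66-84%, consistent with the figures reported in the abstract.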
Gao, Kai; Chung, Eric T.; Gibson, Richard L.; ...
2015-06-05
The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method uses multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters are then computed using these basis functions, and the approach applies a numerical discretization similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity, where the average size of the heterogeneities ranged from several centimeters to several meters and the ratio between the dominant wavelength and the average size of the heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations showed that the numerical homogenization was equally accurate for these complex cases.
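The finely layered benchmark mentioned above is classically handled by the Backus average, one of the effective medium theories such homogenization is compared against. A minimal sketch, assuming isotropic layers described by hypothetical Lamé parameters (this is the textbook formula, not the authors' multiscale FEM code):

```python
import numpy as np

def backus_average(h, lam, mu):
    """Backus effective VTI stiffnesses for a stack of isotropic layers
    with thicknesses h and Lame parameters lam, mu.
    avg(x) is the thickness-weighted average of x."""
    w = np.asarray(h, float)
    w = w / w.sum()
    lam = np.asarray(lam, float)
    mu = np.asarray(mu, float)
    avg = lambda x: float(np.sum(w * x))
    m = lam + 2.0 * mu  # P-wave modulus of each layer
    c33 = 1.0 / avg(1.0 / m)
    c44 = 1.0 / avg(1.0 / mu)
    c13 = avg(lam / m) * c33
    c11 = avg(4.0 * mu * (lam + mu) / m) + avg(lam / m) ** 2 * c33
    c66 = avg(mu)
    return {"C11": c11, "C13": c13, "C33": c33, "C44": c44, "C66": c66}

# Sanity check: identical layers must reproduce the isotropic medium,
# C33 = C11 = lam + 2*mu and C44 = C66 = mu.
c = backus_average([1, 1], [2e9, 2e9], [1e9, 1e9])
print(round(c["C33"] / 1e9, 3), round(c["C44"] / 1e9, 3))  # -> 4.0 1.0
```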
Experimental and Analytical Determinations of Spiral Bevel Gear-Tooth Bending Stress Compared
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.
2000-01-01
Spiral bevel gears are currently used in all main-rotor drive systems for rotorcraft produced in the United States. Applications such as these need spiral bevel gears to turn the corner from the horizontal gas turbine engine to the vertical rotor shaft. These gears must typically operate at extremely high rotational speeds and carry high power levels. With these difficult operating conditions, an improved analytical capability is paramount to increasing aircraft safety and reliability. Also, literature on the analysis and testing of spiral bevel gears has been very sparse in comparison to that for parallel axis gears. This is due to the complex geometry of this type of gear and to the specialized test equipment necessary to test these components. To develop an analytical model of spiral bevel gears, researchers use differential geometry methods to model the manufacturing kinematics. A three-dimensional spiral bevel gear modeling method was developed that uses finite elements for the structural analysis. This method was used to analyze the three-dimensional contact pattern between the test pinion and gear used in the Spiral Bevel Gear Test Facility at the NASA Glenn Research Center at Lewis Field. Results of this analysis are illustrated in the preceding figure. The development of the analytical method was a joint endeavor between NASA Glenn, the U.S. Army Research Laboratory, and the University of North Dakota.
Chemmalil, Letha; Suravajjala, Sreekanth; See, Kate; Jordan, Eric; Furtado, Marsha; Sun, Chong; Hosselet, Stephen
2015-01-01
This paper describes a novel approach for the quantitation of nonderivatized sialic acid in glycoproteins, separated by hydrophilic interaction chromatography and detected by a Nano Quantity Analyte Detector (NQAD). NQAD detection is based on measuring the change in size of a dry aerosol and converting the particle count rate into a chromatographic output signal. The NQAD is suitable for detecting sialic acid, which lacks a sufficiently active chromophore or fluorophore. The water condensation particle counting technology allows the analyte particles to be enlarged using water vapor to provide the highest sensitivity. Derivatization-free analysis of glycoproteins by the HPLC/NQAD method with a PolyGLYCOPLEX™ amide column correlates well with an HPLC method using precolumn derivatization with 1,2-diamino-4,5-methylenedioxybenzene (DMB), as well as with Dionex-based high-pH anion-exchange chromatography (ion chromatography) with pulsed amperometric detection (HPAEC-PAD). With the elimination of the derivatization step, the HPLC/NQAD method is more efficient than the HPLC/DMB method. It is also more reproducible than the HPAEC-PAD method, which suffers high variability because of electrode fouling during analysis. Overall, the HPLC/NQAD method offers a broad linear dynamic range as well as excellent precision, accuracy, repeatability, reliability, and ease of use, with acceptable comparability to the commonly used HPAEC-PAD and HPLC/DMB methods. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
NASA Technical Reports Server (NTRS)
Kleinhammer, Roger K.; Graber, Robert R.; DeMott, D. L.
2016-01-01
Reliability practitioners advocate getting reliability involved early in a product development process. However, when assigned to estimate or assess the (potential) reliability of a product or system early in the design and development phase, they face a lack of reasonable models or methods for useful reliability estimation. Developing specific data is costly and time consuming, so analysts instead rely on available data to assess reliability. Finding data relevant to the specific use and environment of any project is difficult, if not impossible; analysts therefore attempt to develop the "best" or composite analog data to support the assessments. Industries, consortia, and vendors across many areas have spent decades collecting, analyzing, and tabulating fielded item and component reliability performance in terms of observed failures and operational use. This data resource provides a huge compendium of information for potential use, but it can also be compartmented by industry and difficult to find out about, access, or manipulate. One method incorporates processes for reviewing these existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality, and even failure modes affect the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a better representation of the reliability of the equipment or component. It can be used to support early risk or reliability trade studies, or analytical models to establish predicted reliability data points. It also establishes a baseline prior that may be updated based on test data or observed operational constraints and failures, i.e., using Bayesian techniques.
This tutorial presents a descriptive compilation of historical data sources across numerous industries and disciplines, along with examples of contents and data characteristics. It then presents methods for combining failure information from different sources and mathematical use of this data in early reliability estimation and analyses.
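The Bayesian updating of an analog-composite prior mentioned above can be sketched with a standard conjugate model; the Gamma-Poisson pairing below and all numbers are illustrative assumptions, not taken from the tutorial:

```python
def gamma_poisson_update(alpha, beta, failures, exposure_hours):
    """Conjugate Bayesian update of a constant failure rate.
    Prior: lambda ~ Gamma(alpha, beta), with beta in hours.
    Data: `failures` events observed over `exposure_hours`.
    Returns the posterior (alpha, beta) and the posterior mean rate."""
    a_post = alpha + failures
    b_post = beta + exposure_hours
    return a_post, b_post, a_post / b_post

# Hypothetical analog composite: 2 failures per 1e5 h (prior mean 2e-5/h);
# a test campaign then observes 1 failure in 5e4 h.
a, b, mean_rate = gamma_poisson_update(2.0, 1e5, 1, 5e4)
print(f"{mean_rate:.2e}")  # -> 2.00e-05
```

Here the test evidence happens to match the analog prior, so the posterior mean rate is unchanged while its uncertainty shrinks; discordant test data would pull the estimate away from the composite.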
Analytic thinking reduces belief in conspiracy theories.
Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian
2014-12-01
Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but the actual relationships have only infrequently been studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causal role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, relating to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of promoting analytic thinking as a means of countering the widespread acceptance of conspiracy theories. Copyright © 2014 Elsevier B.V. All rights reserved.
Maier, Barbara; Vogeser, Michael
2013-04-01
Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly from the ratio of the target analyte peak area to the peak area of a corresponding stable-isotope-labelled internal standard compound [direct isotope dilution analysis (DIDA)] may not be inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed by LC-MS/MS in a comparative validation protocol with cortisol as an exemplary analyte. Accuracy and reproducibility were compared between quantification involving a six-point external calibration function and result calculation based merely on the peak area ratios of unlabelled and labelled analyte. Both quantification approaches yielded similar accuracy and reproducibility. For specified analytes, reliable quantification directly derived from the ratio of the peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series, is therefore possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory, where short turnaround times often have high priority.
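The DIDA calculation reduces to simple arithmetic on peak areas. A minimal sketch, with hypothetical areas and internal-standard concentration; the `response_factor` argument is an assumption for the general case, and equal response of labelled and unlabelled forms corresponds to a factor of 1:

```python
def dida_concentration(area_analyte, area_is, conc_is, response_factor=1.0):
    """Direct isotope dilution: target concentration from the peak-area
    ratio of the analyte to its stable-isotope-labelled internal standard.
    `response_factor` corrects for unequal MS response of the labelled
    and unlabelled forms (1.0 if they respond identically)."""
    return (area_analyte / area_is) * conc_is / response_factor

# Hypothetical run: IS spiked at 100 nmol/L, analyte/IS area ratio 1.85
print(round(dida_concentration(185000, 100000, 100.0), 2))  # -> 185.0
```

The multi-point external calibration it replaces would instead regress a series of calibrator area ratios against known concentrations before each run.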
Bassuoni, M.M.
2013-01-01
The dehumidifier is a key component in liquid desiccant air-conditioning systems. Analytical solutions have advantages over numerical solutions in studying dehumidifier performance parameters. This paper presents the exit-parameter performance results of an analytical model of an adiabatic cross-flow liquid desiccant air dehumidifier. Calcium chloride is used as the desiccant material in this investigation. A program performing the analytical solution was developed using the Engineering Equation Solver software. Good agreement has been found between the analytical solution and reliable experimental results, with a maximum deviation of +6.63% and −5.65% in the moisture removal rate. The method developed here can be used for quick prediction of dehumidifier performance. The exit parameters from the dehumidifier are evaluated under the effects of variables such as air temperature and humidity, desiccant temperature and concentration, and air-to-desiccant flow rates. The results show that hot humid air and desiccant concentration have the greatest impact on the performance of the dehumidifier. The moisture removal rate decreases with increasing air inlet temperature and desiccant temperature, while it increases with increasing air-to-solution mass ratio, inlet desiccant concentration, and inlet air humidity ratio. PMID:25685485
On fatigue crack growth under random loading
NASA Astrophysics Data System (ADS)
Zhu, W. Q.; Lin, Y. K.; Lei, Y.
1992-09-01
A probabilistic analysis of the fatigue crack growth, fatigue life and reliability of a structural or mechanical component is presented on the basis of fracture mechanics and theory of random processes. The material resistance to fatigue crack growth and the time-history of the stress are assumed to be random. Analytical expressions are obtained for the special case in which the random stress is a stationary narrow-band Gaussian random process, and a randomized Paris-Erdogan law is applicable. As an example, the analytical method is applied to a plate with a central crack, and the results are compared with those obtained from digital Monte Carlo simulations.
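For the special case of constant-amplitude loading and a deterministic Paris-Erdogan law, the cycle count between two crack lengths has a closed form; a minimal sketch with hypothetical material constants follows (the paper's random-process treatment, with random resistance and stress history, is more general):

```python
import math

def paris_life(a0, af, C, m, dsigma, Y=1.0):
    """Cycles to grow a crack from a0 to af under the Paris-Erdogan law
    da/dN = C*(dK)^m with dK = Y*dsigma*sqrt(pi*a), valid for m != 2."""
    k = C * (Y * dsigma * math.sqrt(math.pi)) ** m
    e = 1.0 - m / 2.0
    return (af ** e - a0 ** e) / (k * e)

def paris_life_numeric(a0, af, C, m, dsigma, Y=1.0, n=20000):
    """Trapezoidal-rule check of the same life integral."""
    k = C * (Y * dsigma * math.sqrt(math.pi)) ** m
    f = lambda a: 1.0 / (k * a ** (m / 2.0))
    da = (af - a0) / n
    s = 0.5 * (f(a0) + f(af)) + sum(f(a0 + i * da) for i in range(1, n))
    return s * da

# Hypothetical steel-like values (SI units: a in m, dsigma in MPa,
# C consistent with dK in MPa*sqrt(m)):
N = paris_life(1e-3, 1e-2, C=1e-11, m=3.0, dsigma=100.0)
```

Randomizing C (or the stress history) and repeating this calculation over many draws is the essence of the Monte Carlo comparison mentioned in the abstract.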
Soliton and periodic solutions for time-dependent coefficient non-linear equation
NASA Astrophysics Data System (ADS)
Guner, Ozkan
2016-01-01
In this article, we establish exact solutions for the generalized (3+1)-dimensional variable-coefficient Kadomtsev-Petviashvili (GVCKP) equation. Using a solitary wave ansatz in terms of ? functions and the modified sine-cosine method, we find exact analytical bright soliton solutions and exact periodic solutions for the considered model. The physical parameters in the soliton solutions are obtained as functions of the dependent model coefficients. The effectiveness and reliability of the method are shown by its application to the GVCKP equation.
The application of the statistical theory of extreme values to gust-load problems
NASA Technical Reports Server (NTRS)
Press, Harry
1950-01-01
An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities for both specific test conditions as well as commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
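Fitting the Type I (Gumbel) extreme-value distribution to observed maxima can be done with method-of-moments estimates; a minimal sketch with hypothetical gust-load numbers (not the report's data):

```python
import math

EULER_GAMMA = 0.5772156649015329

def gumbel_fit_moments(mean, std):
    """Method-of-moments estimates of the Gumbel (Type I extreme value)
    location mu and scale beta from the sample mean and std."""
    beta = std * math.sqrt(6.0) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def gumbel_quantile(p, mu, beta):
    """Inverse CDF of F(x) = exp(-exp(-(x - mu)/beta)): the value not
    exceeded with probability p in one observation period."""
    return mu - beta * math.log(-math.log(p))

# Hypothetical per-flight peak load factor with mean 1.8 and std 0.3;
# the level exceeded on only 1% of flights:
mu, beta = gumbel_fit_moments(1.8, 0.3)
x99 = gumbel_quantile(0.99, mu, beta)
```

Extrapolating such return levels to rarer exceedance probabilities is the kind of prediction whose reliability the report assesses.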
NASA Technical Reports Server (NTRS)
White, Allan L.; Palumbo, Daniel L.
1991-01-01
Semi-Markov processes have proved to be an effective and convenient tool for constructing models of systems that achieve reliability through redundancy and reconfiguration. These models can depict complex system architectures and capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem, so techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. Trimming is easy to implement, and its error bound is easy to compute; hence, the method lends itself to inclusion in an automatic model generator.
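A toy version of such a redundancy-and-reconfiguration model, here a discrete-time duplex system with a coverage parameter, illustrates the transient analysis that these semi-Markov models generalize; all transition probabilities below are hypothetical:

```python
import numpy as np

def duplex_failure_prob(p_fail, coverage, steps):
    """Transient analysis of a small discrete-time Markov reliability
    model with states (duplex ok, simplex ok, system failed).
    Each step, a unit fails with probability p_fail; a fault in the
    duplex state is successfully reconfigured with probability
    `coverage`, otherwise the system fails immediately."""
    p, c = p_fail, coverage
    # From duplex: either of two units fails (approx. 2p for small p);
    # covered faults drop to simplex, uncovered faults are fatal.
    P = np.array([
        [1 - 2 * p, 2 * p * c, 2 * p * (1 - c)],
        [0.0,       1 - p,     p],
        [0.0,       0.0,       1.0],
    ])
    state = np.array([1.0, 0.0, 0.0])  # start with both units healthy
    for _ in range(steps):
        state = state @ P
    return state[2]  # probability mass absorbed in the failed state

# Imperfect coverage opens a direct path to failure, so it must yield a
# higher failure probability than perfect coverage:
print(duplex_failure_prob(1e-4, 1.0, 1000)
      < duplex_failure_prob(1e-4, 0.99, 1000))  # -> True
```

Trimming, as described in the abstract, would delete low-probability paths from a much larger state space of this kind while bounding the resulting error analytically.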
El-Yazbi, Amira F
2017-01-20
Sofosbuvir (SOFO) was approved by the U.S. Food and Drug Administration in 2013 for the treatment of hepatitis C virus infection, with enhanced antiviral potency compared with earlier analogs. Notwithstanding, current editions of the pharmacopeias still do not provide any analytical methods for the quantification of SOFO. Thus, rapid, simple, and ecofriendly methods for the routine analysis of commercial formulations of SOFO are desirable. In this study, five accurate methods for the determination of SOFO in pharmaceutical tablets were developed and validated. These methods include HPLC, capillary zone electrophoresis, HPTLC, and UV spectrophotometric and derivative spectrometry methods. The proposed methods proved to be rapid, simple, sensitive, selective, and accurate analytical procedures suitable for the reliable determination of SOFO in pharmaceutical tablets. An analysis of variance test with P-value > 0.05 confirmed that there were no significant differences between the proposed assays. Thus, any of these methods can be used for the routine analysis of SOFO in commercial tablets.
Pilot testing of SHRP 2 reliability data and analytical products: Washington. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
The Washington site used the reliability guide from Project L02, analysis tools for forecasting reliability and estimating impacts from Project L07, Project L08, and Project C11 as well as the guide on reliability performance measures from the Projec...
Poitevin, Eric
2016-01-01
The minerals and trace elements that account for about 4% of total human body mass serve as materials and regulators in numerous biological activities in body structure building. Infant formula and milk products are important sources of endogenous and added minerals and trace elements and hence must comply with regulatory as well as nutritional and safety requirements. In addition, reliable analytical data are necessary to support product content and innovation, health claims or declarations, and specific safety issues. Adequate analytical platforms and methods must be implemented to demonstrate both the compliance and the safety of all declared and regulated minerals and trace elements, especially for trace-element contaminant surveillance. The first part of this paper presents general information on the mineral composition of infant formula and milk products and their regulatory status. The second part surveys the main techniques and the related current official methods for determining minerals and trace elements in infant formula and milk products applied by various international organizations (AOAC INTERNATIONAL, the International Organization for Standardization, the International Dairy Federation, and the European Committee for Standardization). The third part summarizes method officialization activities by the Stakeholder Panel on Infant Formula and Adult Nutritionals and the Stakeholder Panel on Strategic Food Analytical Methods. The final part covers a general discussion focusing on analytical gaps and future trends in the inorganic analysis of infant formula and milk-based products.
Busatto, Zenaís; da Silva, Agnaldo Fernando Baldo; de Freitas, Osvaldo; Paschoal, Jonas Augusto Rizzato
2017-04-01
This paper describes the development of analytical methods for the quantification of albendazole (ABZ) in fish feed, and of ABZ and its main known metabolites (albendazole sulfoxide, albendazole sulfone, and albendazole aminosulfone) in fish fillet, employing LC-MS/MS. To assess the reliability of the analytical methods, evaluation was undertaken as recommended by the related guides proposed by the Brazilian Ministry of Agriculture for analytical method validation. The calibration curve for ABZ quantification in feed showed adequate linearity (r > 0.99), precision (CV < 1.03%), and trueness ranging from 99% to 101%. The method for ABZ residues in fish fillet, involving the QuEChERS technique for sample extraction, had adequate linearity (r > 0.99) for all analytes, precision (CV < 13%), and trueness around 100%, with CCα < 122 ng g(-1) and CCβ < 145 ng g(-1). In addition, to avoid the risk of ABZ leaching from feed into the aquatic environment during fish medication via the oral route, a promising procedure for drug incorporation involving coating feed pellets with ethyl cellulose polymer containing ABZ was also evaluated. The medicated feed had good homogeneity (CV < 3%) and a low release of ABZ (< 0.2%) from feed to water when the medicated feed remained in the water for up to 15 min.
NASA Astrophysics Data System (ADS)
Ding, Anxin; Li, Shuxin; Wang, Jihui; Ni, Aiqing; Sun, Liangliang; Chang, Lei
2016-10-01
In this paper, the corner spring-in angles of AS4/8552 L-shaped composite profiles with different thicknesses are predicted using a path-dependent constitutive law that accounts for the variation of material properties due to phase change during curing. The prediction accuracy mainly depends on the properties in the rubbery and glassy states, which are obtained by homogenization rather than experimental measurement. Both analytical and finite element (FE) homogenization methods are applied to predict the overall properties of the AS4/8552 composite. The effect of fiber volume fraction on the properties is investigated for both the rubbery and glassy states using both methods, and the predicted results are compared with experimental measurements for the glassy state. Good agreement is achieved between the predicted results and the available experimental data, showing the reliability of the homogenization method. Furthermore, the corner spring-in angles of the L-shaped composite profiles are measured experimentally, validating the path-dependent constitutive law as well as the properties predicted by the FE homogenization method.
Huertas Pérez, J F; Sejerøe-Olsen, B; Fernández Alba, A R; Schimmel, H; Dabrio, M
2015-05-01
A sensitive, accurate, and simple liquid chromatography coupled with mass spectrometry method for the determination of 10 selected pesticides in soya beans has been developed and validated. The method is intended for use during the characterization of selected pesticides in a reference material. In this process, high accuracy and appropriate uncertainty levels associated with the analytical measurements are of utmost importance. The analytical procedure is based on sample extraction by a modified QuEChERS (quick, easy, cheap, effective, rugged, safe) method and subsequent clean-up of the extract with C18, PSA, and Florisil. Analytes were separated on a C18 column using gradient elution with a water-methanol/2.5 mM ammonium acetate mobile phase, and finally identified and quantified by triple quadrupole mass spectrometry in multiple reaction monitoring (MRM) mode. Reliable and accurate quantification of the analytes was achieved by means of stable isotope-labelled analogues employed as internal standards (IS) and calibration with pure substance solutions containing both the isotopically labelled and native compounds. Exceptions were made for thiodicarb and malaoxon, for which isotopically labelled congeners were not commercially available at the time of analysis; for the quantification of those compounds, methomyl-(13)C2(15)N and malathion-D10 were used, respectively. The method was validated according to the general principles covered by the DG SANCO guidelines, with validation criteria set more stringently. Mean recoveries were in the range of 86-103%, with RSDs lower than 8.1%. Repeatability and intermediate precision were in the ranges of 3.9-7.6% and 1.9-8.7%, respectively. LODs were theoretically estimated and experimentally confirmed to be in the range 0.001-0.005 mg kg(-1) in the matrix, while LOQs, established as the lowest spiked mass fraction level, were in the range 0.01-0.05 mg kg(-1).
The method reliably identifies and quantifies the selected pesticides in soya beans at appropriate uncertainty levels, making it suitable for the characterization of candidate reference materials. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Probabilistic structural mechanics research for parallel processing computers
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.
1991-01-01
Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods has been hampered by their computationally intensive nature: solution of PSM problems requires repeated analyses of structures that are often large and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make the solution of large-scale PSM problems practical.
A third-order approximation method for three-dimensional wheel-rail contact
NASA Astrophysics Data System (ADS)
Negretti, Daniele
2012-03-01
Multibody train analysis is used increasingly by railway operators whenever a reliable and time-efficient method to evaluate the contact between wheel and rail is needed; indeed, wheel-rail contact is one of the most important aspects affecting a reliable and time-efficient vehicle dynamics computation. The focus of the approach proposed here is to carry out such tasks by means of online wheel-rail elastic contact detection. To improve efficiency and save time, an analytical approach is used for the definition of the wheel and rail surfaces as well as for contact detection, and a final numerical evaluation is used to locate the contact. This final numerical procedure consists of finding the zeros of a nonlinear function in a single variable. The overall method is based on an approximation of the wheel surface which, as shown in the paper, does not influence the contact location significantly.
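The final numerical step described above, finding the zero of a nonlinear scalar function, can be sketched with plain bisection; the paper does not specify its root-finder, and the cubic below is a stand-in for the actual wheel-rail separation function:

```python
def bisect_zero(f, lo, hi, tol=1e-12, max_iter=200):
    """Locate a zero of a scalar function f on [lo, hi] by bisection,
    assuming f(lo) and f(hi) have opposite signs."""
    flo = f(lo)
    if flo == 0.0:
        return lo
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if fmid == 0.0 or hi - lo < tol:
            return mid
        if (flo < 0) != (fmid < 0):
            hi = mid            # sign change in [lo, mid]
        else:
            lo, flo = mid, fmid  # sign change in [mid, hi]
    return 0.5 * (lo + hi)

# Toy stand-in for the separation function: a cubic crossing zero
root = bisect_zero(lambda s: s**3 - 2.0, 1.0, 2.0)
print(round(root, 6))  # -> 1.259921
```

In practice a faster one-dimensional method (e.g. Newton or Brent-style iteration) would typically be used, but the bracketing idea is the same.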
Advanced flight control system study
NASA Technical Reports Server (NTRS)
Hartmann, G. L.; Wall, J. E., Jr.; Rang, E. R.; Lee, H. P.; Schulte, R. W.; Ng, W. K.
1982-01-01
A fly by wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture does achieve the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.
Toxicologic evaluation of analytes from Tank 241-C-103
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahlum, D.D.; Young, J.Y.; Weller, R.E.
1994-11-01
Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives were to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propanenitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability, as a supplement to internal control procedures, is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
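The monthly-median check described above can be sketched in a few lines; the month labels, target value, and allowable-bias threshold below are illustrative assumptions, not figures from the study:

```python
import statistics

def monthly_median_flags(monthly_results, target, allowable_bias_pct):
    """Compare each month's median patient result against an allowable bias.

    monthly_results    -- dict: month label -> list of patient results
    target             -- reference value for the analyte (e.g. long-term median)
    allowable_bias_pct -- allowable analytical bias in percent, e.g. derived
                          from biological-variation specifications
    Returns a dict: month label -> True if within the bias specification.
    """
    flags = {}
    for month, values in monthly_results.items():
        median = statistics.median(values)
        bias_pct = 100.0 * (median - target) / target
        flags[month] = abs(bias_pct) <= allowable_bias_pct
    return flags
```

In practice the target would itself come from a long baseline of patient medians, and the threshold from published biological-variation data.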
Zong, Shi-Yu; Han, Han; Wang, Bing; Li, Ning; Dong, Tina Ting-Xia; Zhang, Tong; Tsim, Karl W K
2015-12-04
A reliable ultra-high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (UHPLC-ESI-MS/MS) method for the fast simultaneous determination of 13 nucleosides and nucleobases in Cordyceps sinensis (C. sinensis), with 2-chloroadenosine as internal standard, was developed and validated. Samples were ultrasonically extracted three times in an ice bath, and the optimum analyte separation was performed on an ACQUITY UPLC(TM) HSS C18 column (100 mm × 2.1 mm, 1.8 μm) with gradient elution. All targeted analytes were separated in 5.5 min. Furthermore, all calibration curves showed good linear regression (r > 0.9970) within the test ranges, and the limits of quantitation and detection of the 13 analytes were less than 150 and 75 ng/mL, respectively. The relative standard deviations (RSDs) of intra- and inter-day precisions were <6.23%. Recoveries of the quantified analytes ranged from 85.3% to 117.3%, with RSD < 6.18%. The developed UHPLC-ESI-MS/MS method was successfully applied to determine nucleosides and nucleobases in 11 batches of C. sinensis samples from different regions of China. The total content in the analyzed samples ranged from 1329 to 2057 µg/g.
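Calibration figures of merit like those reported (linearity r, limits of detection and quantitation) are computed the same way for any curve. A hedged sketch with invented concentration/peak-area pairs, using the common 3.3σ/S and 10σ/S conventions for the analytical limits (not necessarily this paper's exact procedure):

```python
import numpy as np

# Hypothetical calibration data: concentration (ng/mL) vs peak area
conc = np.array([10.0, 25.0, 50.0, 100.0, 250.0, 500.0])
area = np.array([102.0, 255.0, 498.0, 1010.0, 2490.0, 5020.0])

slope, intercept = np.polyfit(conc, area, 1)       # least-squares line
r = np.corrcoef(conc, area)[0, 1]                  # linearity check
resid = area - (slope * conc + intercept)
sd = np.sqrt(np.sum(resid**2) / (len(conc) - 2))   # residual std deviation
lod = 3.3 * sd / slope                             # detection limit estimate
loq = 10.0 * sd / slope                            # quantitation limit estimate
```

A curve passing a criterion such as r > 0.997 would be accepted for quantitation between the LOQ and the highest calibrator.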
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by the chemometrics community but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Besides the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data-type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects such as assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets is briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
USSR and Eastern Europe Scientific Abstracts Geophysics, Astronomy and Space No. 404
1977-09-01
atmospheric circulation. A reliable linear correlation was established between the monthly fallout activity of ¹⁰⁶Ru + ¹⁰⁶Rh and monthly precipitation and...therefore the washing out of this radionuclide from tropospheric air by precipitation is more important for its fallout. [153] ANALYTICAL...development of some methods for predicting definite weather phenomena (such as precipitation), taking into account the evolution of the
Mission Reliability Estimation for Repairable Robot Teams
NASA Technical Reports Server (NTRS)
Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen
2010-01-01
A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created, including a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for generating legitimate module combinations based on mission specifications and selecting the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or whether mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares.
This suggests that the current design paradigm of building a minimal number of highly robust robots may not be the best way to design robots for extended missions.
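The redundancy-versus-repairability trade can be illustrated with a toy k-out-of-n binomial model; this is a generic sketch under an independent-failure assumption, not the paper's actual reliability-cost model:

```python
from math import comb

def reliability_with_spares(p_module, n_required, n_spares):
    """Probability that at least n_required of (n_required + n_spares)
    identical, independent modules survive the mission, i.e. that no
    more than n_spares of them fail (each survives with prob p_module)."""
    n = n_required + n_spares
    q = 1.0 - p_module
    return sum(comb(n, k) * q**k * p_module**(n - k)
               for k in range(n_spares + 1))
```

For example, a single 0.90-reliable module carried with one spare yields mission reliability 1 − 0.10² = 0.99, a figure that can then be weighed against the cost of one higher-grade module.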
Zhao, Jianxing
2015-03-01
A high-performance liquid chromatography with ultraviolet detection method has been developed for the simultaneous determination of a set of reliable markers of renal function, including creatinine, uric acid, kynurenine and tryptophan, in plasma. Separation was achieved on an Agilent HC-C18(2) analytical column. Gradient elution and programmed wavelength detection allowed these compounds to be analyzed in a single injection. The total run time was 25 min, with all peaks of interest eluting within 13 min. Good linear responses were found, with correlation coefficients >0.999 for all analytes within the concentration range of the relevant levels. The recovery was: creatinine, 101 ± 1%; uric acid, 94.9 ± 3.7%; kynurenine, 100 ± 2%; and tryptophan, 92.6 ± 2.9%. Within-run and between-run coefficients of variation for all analytes were ≤2.4%. The limit of detection of the method was: creatinine, 0.1 µmol/L; uric acid, 0.05 µmol/L; kynurenine, 0.02 µmol/L; and tryptophan, 1 µmol/L. The developed method could be employed as a useful tool for the detection of chronic kidney disease, even at an early stage. Copyright © 2014 John Wiley & Sons, Ltd.
Dickson, Leslie C; O'Byrne, Collin; Chan, Wayne
2012-01-01
An LC/MS/MS-based multiresidue quantitative method was developed for the macrolides erythromycin A, neospiramycin I, oleandomycin, spiramycin I, tilmicosin, and tylosin A in porcine kidney tissues. The Canadian Food Inspection Agency (CFIA) had as part of its analytical scope an LC/UV method for quantification of residues of two macrolide antibiotics, tilmicosin and tylosin A, in the kidney, liver, and muscle of cattle, swine, and poultry. That method could not reliably detect concentrations below 10 microg/kg. To increase the scope of the CFIA's analytical capabilities, a sensitive multiresidue quantitative method for macrolide residues in food animal tissues was required. Porcine kidney samples were extracted with acetonitrile and alkaline buffer and cleaned up using silica-based C18 SPE cartridges. Sample extracts were analyzed using LC/MS/MS with positive electrospray ionization. Fitness for purpose was verified in a single-laboratory validation study using a second analyst. The working analytical range was 5 to 50 microg/kg. The LOD and LOQ were 0.5 to 0.6 microg/kg and 1.5 to 3.0 microg/kg, respectively. Limits of identification were 0.5 to 2.0 microg/kg. Relative intermediate precisions were 8 to 17%. Average absolute recoveries were 68 to 76%.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok
The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I&C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
Pilot testing of SHRP 2 reliability data and analytical products: Washington.
DOT National Transportation Integrated Search
2014-07-30
The second Strategic Highway Research Program (SHRP 2) addresses the challenges of moving people and goods efficiently and safely on the nation's highways. In its Reliability focus area, the research emphasizes improving the reliability of highway ...
Pascale, Raffaella; Caivano, Marianna; Buchicchio, Alessandro; Mancini, Ignazio M; Bianco, Giuliana; Caniani, Donatella
2017-01-13
Wastewater treatment plants (WWTPs) emit CO₂ and N₂O, which may contribute to climate change and global warming. Over the last few years, awareness of greenhouse gas (GHG) emissions from WWTPs has increased. Moreover, the development of valid, reliable, and high-throughput analytical methods for simultaneous gas analysis is an essential requirement for environmental applications. In the present study, an analytical method based on a gas chromatograph (GC) equipped with a barrier ionization discharge (BID) detector was developed for the first time. This new method simultaneously analyses CO₂ and N₂O and has a precision, measured in terms of relative standard deviation (RSD%), equal to or less than 6.6% and 5.1%, respectively. The method's detection limits are 5.3 ppm(v) for CO₂ and 62.0 ppb(v) for N₂O. The method's selectivity, linearity, accuracy, repeatability, intermediate precision, limit of detection and limit of quantification were good at trace concentration levels. After validation, the method was applied to a real case of N₂O and CO₂ emissions from a WWTP, confirming its suitability as a standard procedure for simultaneous GHG analysis in environmental samples containing CO₂ levels less than 12,000 mg/L. Copyright © 2016 Elsevier B.V. All rights reserved.
Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo
2013-10-24
In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have allowed detection limits within the pM and nM ranges to be reached. Most of these developments proved their suitability for detecting and quantifying mercury(II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Camerini, Serena; Montepeloso, Emanuela; Casella, Marialuisa; Crescenzi, Marco; Marianella, Rosa Maria; Fuselli, Fabio
2016-04-15
Ricotta cheese is a typical Italian product, made with whey from various species, including cow, buffalo, sheep, and goat. Ricotta cheese nominally manufactured from the last three species may be fraudulently produced using the comparatively cheaper cow whey. Exposing such food frauds requires a reliable analytical method. Despite the extensive similarities shared by whey proteins of the four species, a mass spectrometry-based analytical method was developed that exploits three species-specific peptides derived from β-lactoglobulin and α-lactalbumin. This method can detect as little as 0.5% bovine whey in ricotta cheese from the other three species. Furthermore, a tight correlation was found (R² > 0.99) between cow whey percentages and mass spectrometry measurements throughout the 1-50% range. Thus, this method can be used for forensic detection of ricotta cheese adulteration and, if properly validated, to provide quantitative evaluations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Galfi, Istvan; Virtanen, Jorma; Gasik, Michael M.
2017-01-01
A new, faster and more reliable analytical methodology for the analysis of S(IV) species in low-pH solutions by bichromatometry is proposed. For decades the state-of-the-art methodology has been iodometry, which remains a well-justified method for neutral solutions but is subject to various side reactions in low-pH media that increase inaccuracy. In contrast, the new methodology has no side reactions in low-pH media, requires only one titration step and provides a clear color change if S(IV) species are present in the solution. The method is validated using model solutions with known concentrations and applied to analyses of gaseous SO₂ purged from solution in low-pH media samples. The results indicate that bichromatometry can accurately analyze SO₂ from liquid samples having pH even below 0, relevant to metallurgical industrial processes. PMID:29145479
Monakhova, Yulia B; Mushtakova, Svetlana P
2017-05-01
A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
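The approach relies on Beer-Lambert additivity of component spectra. Below is a minimal numpy sketch of that mixture model, with invented Gaussian band shapes and concentrations; the ICA step that resolves the pure-component profiles without reference solutions is not reproduced here, so known profiles stand in for the ICA output:

```python
import numpy as np

# Simulate Beer-Lambert mixtures: absorbance(lambda) = sum_i c_i * s_i(lambda)
wavelengths = np.linspace(250.0, 400.0, 151)
# Two hypothetical pure-component spectra (Gaussian absorption bands)
s1 = np.exp(-((wavelengths - 280.0) / 15.0) ** 2)
s2 = np.exp(-((wavelengths - 330.0) / 20.0) ** 2)
S = np.vstack([s1, s2])              # components x wavelengths
C_true = np.array([[0.3, 0.7],
                   [0.5, 0.2],
                   [0.9, 0.4]])      # mixtures x components (arbitrary units)
A = C_true @ S                       # mixture spectra, Beer-Lambert additivity
# Once component profiles are resolved (here: assumed known), concentrations
# follow from ordinary least squares without any reference solutions.
C_est = np.linalg.lstsq(S.T, A.T, rcond=None)[0].T   # mixtures x components
```

The linear-algebra step is only valid for data obeying the Beer-Lambert-Bouguer law, which is exactly the applicability condition the authors state.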
Species-specific detection of processed animal proteins in feed by Raman spectroscopy.
Mandrile, Luisa; Amato, Giuseppina; Marchis, Daniela; Martra, Gianmario; Rossi, Andrea Mario
2017-08-15
The existing European Regulation (EC n° 51/2013) prohibits the use of animal meals in feedstuffs in order to prevent Bovine Spongiform Encephalopathy infection and diffusion; however, the legislation is rapidly moving towards a partial lifting of the "feed ban", and the competent control bodies are urged to develop suitable analytical methods able to avoid food safety incidents related to animal-origin products. The limitations of the official methods (i.e., light microscopy and Polymerase Chain Reaction) suggest exploring new analytical approaches to obtain reliable results in a short time. The combination of spectroscopic techniques with optical microscopy allows the development of an individual-particle method able to meet both selectivity and sensitivity requirements (0.1% w/w). A spectroscopic method based on Fourier transform micro-Raman spectroscopy coupled with Discriminant Analysis is presented here. This approach could be very useful for in-situ applications, such as customs inspections, since it drastically reduces the time and cost of analysis. Copyright © 2017. Published by Elsevier Ltd.
Maximum likelihood solution for inclination-only data in paleomagnetism
NASA Astrophysics Data System (ADS)
Arason, P.; Levi, S.
2010-08-01
We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function towards systematically shallower inclination. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling the exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with the desired accuracy and locate the maximum likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and its mean inclination estimates are the least biased towards shallow values.
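The shallowing bias of the arithmetic mean is easy to reproduce by simulation. A generic sketch using standard inverse-CDF sampling of the Fisher distribution (the mean inclination of 75° and κ = 10 are arbitrary choices; this does not implement the Arason-Levi estimator itself):

```python
import numpy as np

def sample_inclinations(mean_inc_deg, kappa, n, rng):
    """Draw n Fisher-distributed unit vectors about the direction
    (declination 0, inclination mean_inc_deg) and return only the
    inclinations of the samples, in degrees."""
    u = rng.random(n)
    # Inverse-CDF sample of cos(theta), theta = angle from the mean direction
    cos_t = 1.0 + np.log(u + (1.0 - u) * np.exp(-2.0 * kappa)) / kappa
    sin_t = np.sqrt(np.clip(1.0 - cos_t**2, 0.0, 1.0))
    phi = 2.0 * np.pi * rng.random(n)  # uniform azimuth about the mean axis
    i0 = np.radians(mean_inc_deg)
    # Down-component after rotating the mean axis to inclination i0
    down = cos_t * np.sin(i0) - sin_t * np.cos(phi) * np.cos(i0)
    return np.degrees(np.arcsin(np.clip(down, -1.0, 1.0)))

incs = sample_inclinations(75.0, 10.0, 20000, np.random.default_rng(0))
mean_inc = float(np.mean(incs))  # noticeably shallower than the true 75 deg
```

With these settings the arithmetic mean comes out several degrees shallower than the true 75°, which is the bias the maximum likelihood method is designed to remove.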
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS formats (on-line and off-line), sorbents and experimental protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Quifer-Rada, Paola; Martínez-Huélamo, Miriam; Lamuela-Raventos, Rosa M
2017-07-19
Phenolic compounds are present in human fluids (plasma and urine) mainly as glucuronidated and sulfated metabolites. Up to now, due to the unavailability of standards, enzymatic hydrolysis has been the method of choice in analytical chemistry to quantify these phase II phenolic metabolites. Enzymatic hydrolysis procedures vary in enzyme concentration, pH and temperature; however, there is a lack of knowledge about the stability of polyphenols in their free form during the process. In this study, we evaluated the stability of 7 phenolic acids, 2 flavonoids and 3 prenylflavonoids in urine during enzymatic hydrolysis to assess the suitability of this analytical procedure, using three different concentrations of β-glucuronidase/sulfatase enzymes from Helix pomatia. The results indicate that enzymatic hydrolysis negatively affected the recovery of the precursor and free-form polyphenols present in the sample. Thus, enzymatic hydrolysis does not seem to be an ideal analytical strategy for quantifying glucuronidated and sulfated polyphenol metabolites.
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has recently been made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg⁻¹ for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
Viidanoja, Jyrki
2015-02-27
A new method for the quantification of short-chain C1-C6 carboxylic acids in vegetable oils and fats employing Liquid Chromatography Mass Spectrometry (LC-MS) has been developed. The method requires minor sample preparation and applies non-conventional Electrospray Ionization (ESI) liquid-phase chemistry. Samples are first dissolved in chloroform and then extracted using water that has been spiked with stable-isotope-labeled internal standards, which are used for signal normalization and absolute quantification of selected acids. The analytes are separated using Ion Exclusion Chromatography (IEC) and detected with Electrospray Ionization Mass Spectrometry (ESI-MS) as deprotonated molecules. Prior to ionization, the eluent, which contains hydrochloric acid, is modified post-column to ensure good ionization efficiency of the analytes. The averaged within-run and between-run precisions were generally lower than 8%. The accuracy was between 85 and 115% for most of the analytes. The Lower Limit of Quantification (LLOQ) ranged from 0.006 to 7 mg/kg. It is shown that this method offers good selectivity in cases where UV detection fails to produce reliable results. Copyright © 2015 Elsevier B.V. All rights reserved.
Green analytical chemistry--theory and practice.
Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek
2010-08-01
This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.
Analytical models for coupling reliability in identical two-magnet systems during slow reversals
NASA Astrophysics Data System (ADS)
Kani, Nickvash; Naeemi, Azad
2017-12-01
This paper follows previous works which investigated the strength of dipolar coupling in two-magnet systems. While those works focused on qualitative analyses, this manuscript elucidates reversal through dipolar coupling culminating in analytical expressions for reversal reliability in identical two-magnet systems. The dipolar field generated by a mono-domain magnetic body can be represented by a tensor containing both longitudinal and perpendicular field components; this field changes orientation and magnitude based on the magnetization of neighboring nanomagnets. While the dipolar field does reduce to its longitudinal component at short time-scales, for slow magnetization reversals, the simple longitudinal field representation greatly underestimates the scope of parameters that ensure reliable coupling. For the first time, analytical models that map the geometric and material parameters required for reliable coupling in two-magnet systems are developed. It is shown that in biaxial nanomagnets, the x̂ and ŷ components of the dipolar field contribute to the coupling, while all three dimensions contribute to the coupling between a pair of uniaxial magnets. Additionally, the ratio of the longitudinal and perpendicular components of the dipolar field is also very important. If the perpendicular components in the dipolar tensor are too large, the nanomagnet pair may come to rest in an undesirable meta-stable state away from the free axis. The analytical models formulated in this manuscript map the minimum and maximum parameters for reliable coupling. Using these models, it is shown that there is a very small range of material parameters which can facilitate reliable coupling between perpendicular-magnetic-anisotropy nanomagnets; hence, in-plane nanomagnets are more suitable for coupled systems.
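For reference, the field entering such dipolar-coupling models is, in the standard point-dipole approximation (the general textbook form, not the specific tensor derived in the manuscript):

```latex
\mathbf{B}_{\mathrm{dip}}(\mathbf{r}) \;=\; \frac{\mu_0}{4\pi r^{3}}
\left[\, 3\,(\mathbf{m}\cdot\hat{\mathbf{r}})\,\hat{\mathbf{r}} \;-\; \mathbf{m} \,\right]
```

Decomposing this field along and perpendicular to the free axis of the neighboring magnet gives the longitudinal and perpendicular components whose ratio the paper identifies as critical for reliable coupling.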
ERIC Educational Resources Information Center
Choi, Namok; Fuqua, Dale R.; Newman, Jody L.
2009-01-01
The short form of the Bem Sex Role Inventory (BSRI) contains half as many items as the long form and yet has often demonstrated better reliability and validity. This study uses exploratory and confirmatory factor analytic methods to examine the structure of the short form of the BSRI. A structure noted elsewhere also emerged here, consisting of…
Valente, Inês Maria; Rodrigues, José António
2014-01-01
Cardamonin, as shown by the increasing number of publications, has received growing attention from the scientific community due to the expectations toward its benefits to human health. In this study, research on cardamonin is reviewed, including its natural sources, health-promoting aspects, and analytical methods for its determination. Therefore, this article hopes to aid current and future researchers in the search for reliable answers concerning cardamonin's value in medicine. PMID:24433078
Modelling the aggregation process of cellular slime mold by the chemical attraction.
Atangana, Abdon; Vermeulen, P D
2014-01-01
We apply a relatively new analytical technique, the homotopy decomposition method (HDM), to solve a system of nonlinear partial differential equations arising in an attractor one-dimensional Keller-Segel dynamics system. Numerical solutions are given, and some properties show biologically realistic dependence on the parameter values. The reliability of HDM and the reduction in computations give HDM a wider applicability.
Ballistic Puncture Self-Healing Polymeric Materials
NASA Technical Reports Server (NTRS)
Gordon, Keith L.; Siochi, Emilie J.; Yost, William T.; Bogert, Phil B.; Howell, Patricia A.; Cramer, K. Elliott; Burke, Eric R.
2017-01-01
Space exploration launch costs on the order of $10,000 per pound provide an incentive to seek ways to reduce structural mass while maintaining structural function to assure safety and reliability. Damage-tolerant structural systems provide a route to avoiding weight penalty while enhancing vehicle safety and reliability. Self-healing polymers capable of spontaneous puncture repair show promise to mitigate potentially catastrophic damage from events such as micrometeoroid penetration. Effective self-repair requires these materials to quickly heal following projectile penetration while retaining some structural function during the healing processes. Although there are materials known to possess this capability, they are typically not considered for structural applications. Current efforts use inexpensive experimental methods to inflict damage, after which analytical procedures are identified to verify that function is restored. Two candidate self-healing polymer materials for structural engineering systems are used to test these experimental methods.
Wang, Xiaoyang; Wang, Mi; Zhang, Keyu; Hou, Ting; Zhang, Lifang; Fei, Chenzong; Xue, Feiqun; Hang, Taijun
2018-06-01
A reliable UPLC-MS/MS method with high sensitivity was developed and validated for the determination of virginiamycin M1 in muscle, fat, liver, and kidney samples of chicken and swine. Analytes were extracted using acetonitrile, and the extracts were defatted with n-hexane. Chromatographic separation was performed on a BEH C18 liquid chromatography column. The analytes were then detected by triple-quadrupole mass spectrometry in positive electrospray ionization and multiple reaction monitoring mode. Calibration plots were constructed using standard working solutions and showed good linearity. Limits of quantification ranged from 2 to 60 ng/mL. Copyright © 2018 Elsevier Ltd. All rights reserved.
Guo, Lin; Wang, Meng-meng; He, Min; Qiu, Fu-rong; Jiang, Jian
2015-04-01
A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed to simultaneously quantify ezetimibe (EZM) and its major glucuronide (ezetimibe glucuronide, EZM-G) in human plasma. The analytes were purified by solid-phase extraction (SPE) without hydrolysis. Separation of the analytes was achieved using acetonitrile-water (0.08% formic acid) (70:30, v/v) as the mobile phase at a flow rate of 0.8 mL/min on an Agilent Extend C18 column. The analytes were detected by LC-MS/MS using negative ionization in multiple reaction monitoring (MRM) mode. The mass transition pairs of m/z 408.4→271.0 and m/z 584.5→271.0 were used to detect EZM and EZM-G, respectively. The analytical method was linear over the concentration range of 0.1-20 ng/mL for EZM and 0.5-200 ng/mL for EZM-G. Within- and between-run precision were no more than 8.6% and 12.8% for EZM, and no more than 9.0% and 8.7% for EZM-G, respectively. This method was reproducible and reliable, and was successfully used to analyze human plasma samples in a bioequivalence study. Copyright © 2015 Elsevier B.V. All rights reserved.
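Within- and between-run precision figures like those quoted above are typically reported as coefficients of variation (%CV) computed from replicate measurements in each run. A minimal sketch of that calculation; the function name and data layout are illustrative, not taken from the paper:

```python
from statistics import mean, stdev

def precision_cv(runs):
    """Within-run and between-run precision (%CV) at one concentration level.

    `runs` is a list of runs, each a list of replicate measurements
    (hypothetical layout, for illustration only).
    """
    # Within-run: average of the per-run coefficients of variation.
    within = mean(100.0 * stdev(r) / mean(r) for r in runs)
    # Between-run: CV of the run means.
    run_means = [mean(r) for r in runs]
    between = 100.0 * stdev(run_means) / mean(run_means)
    return within, between
```

In practice a validation protocol computes these per concentration level and checks them against acceptance criteria (e.g. the 15% limits common in bioanalytical guidance).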
Analytical procedures for determining the impacts of reliability mitigation strategies.
DOT National Transportation Integrated Search
2013-01-01
Reliability of transport, especially the ability to reach a destination within a certain amount of time, is a regular concern of travelers and shippers. The definition of reliability used in this research is how travel time varies over time. The vari...
Foundations of measurement and instrumentation
NASA Technical Reports Server (NTRS)
Warshawsky, Isidore
1990-01-01
The user of instrumentation is provided with an understanding of the factors that influence instrument performance, selection, and application, and of the methods of interpreting and presenting the results of measurements. Such understanding is prerequisite to successfully attaining the best compromise among reliability, accuracy, speed, cost, and importance of the measurement operation in achieving the ultimate goal of a project. Subjects covered include dimensions; units; sources of measurement error; methods of describing and estimating accuracy; deduction and presentation of results through empirical equations, including the method of least squares; and experimental and analytical methods of determining the static and dynamic behavior of instrumentation systems, including the use of analogs.
Panetta, Robert J; Jahren, A Hope
2011-05-30
Gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS) is increasingly applied to food and metabolic studies for stable isotope analysis (δ¹³C), with the quantification of analyte concentration often obtained via a second, alternative method. We describe a rapid direct transesterification of triacylglycerides (TAGs) for fatty acid methyl ester (FAME) analysis by GC-C-IRMS, demonstrating robust simultaneous quantification of the amount of analyte (mean r² = 0.99, accuracy ±2% for 37 FAMEs) and δ¹³C (±0.13‰) in a single analytical run. The maximum FAME yield and optimal δ¹³C values are obtained by derivatizing with 10% (v/v) acetyl chloride in methanol for 1 h, while lower levels of acetyl chloride and shorter reaction times skewed the δ¹³C values by as much as 0.80‰. A Bland-Altman evaluation of the GC-C-IRMS measurements resulted in excellent agreement for pure oils (±0.08‰) and oils extracted from French fries (±0.49‰), demonstrating reliable simultaneous quantification of FAME concentration and δ¹³C values. Thus, we conclude that for studies requiring both the quantification of analyte and δ¹³C data, such as authentication or metabolic flux studies, GC-C-IRMS can be used as the sole analytical method. Copyright © 2011 John Wiley & Sons, Ltd.
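A Bland-Altman evaluation, as mentioned above, reduces to the bias (mean of the paired differences) and the 95% limits of agreement between two measurement methods. A minimal sketch of that computation; the data and function name are hypothetical, not the authors' code:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements
    from two methods (lists `a` and `b` of equal length)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Agreement is judged by whether the limits of agreement are narrow enough for the intended analytical purpose, not by the bias alone.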
Magnusson, R; Nordlander, T; Östin, A
2016-01-15
Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction, and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry-weight basis, and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer on the inside walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after collection near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.
Elsohaby, Ibrahim; McClure, J Trenton; Riley, Christopher B; Bryanton, Janet; Bigsby, Kathryn; Shaw, R Anthony
2018-02-20
Attenuated total reflectance infrared (ATR-IR) spectroscopy is a simple, rapid, and cost-effective method for the analysis of serum. However, the complex nature of serum remains a limiting factor for the reliability of this method. We investigated the benefits of coupling centrifugal ultrafiltration with ATR-IR spectroscopy for quantification of human serum IgA concentration. Human serum samples (n = 196) were analyzed for IgA using an immunoturbidimetric assay. ATR-IR spectra were acquired for whole serum samples and for the retentate (residue) reconstituted with saline following 300 kDa centrifugal ultrafiltration. IR-based analytical methods were developed for each of the two spectroscopic datasets, and the accuracy of the two methods was compared. Analytical methods were based upon partial least squares regression (PLSR) calibration models, one with 5 PLS factors (for whole serum) and the second with 9 PLS factors (for the reconstituted retentate). Comparison of the two sets of IR-based analytical results against reference IgA values revealed improvements in the Pearson correlation coefficient (from 0.66 to 0.76) and in the root mean squared error of prediction of IR-based IgA concentrations (from 102 to 79 mg/dL) for the ultrafiltration retentate-based method as compared to the method built upon whole serum spectra. Depleting human serum of low-molecular-weight proteins using a 300 kDa centrifugal filter thus enhances the accuracy of IgA quantification by ATR-IR spectroscopy. Further evaluation and optimization of this general approach may ultimately lead to routine analysis of a range of high-molecular-weight analytical targets that are otherwise unsuitable for IR-based analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
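The two figures of merit used above to compare the whole-serum and retentate calibrations, the Pearson correlation coefficient and the root mean squared error of prediction (RMSEP), can be computed directly from predicted and reference concentrations. A minimal illustration (not the authors' code; inputs are hypothetical concentration lists):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rmsep(predicted, reference):
    """Root mean squared error of prediction against reference values."""
    return sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(reference))
```

A calibration that improves both metrics at once, as reported for the retentate-based model, is stronger evidence than an improvement in either metric alone.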
NASA Astrophysics Data System (ADS)
Kamiński, M.; Supeł, Ł.
2016-02-01
It is widely known that lateral-torsional buckling of a member under bending, together with the warping restraints of its cross-sections, is crucial for estimating the safety and durability of steel structures. Although engineering codes for steel and aluminum structures support the designer with additional analytical expressions depending on the boundary conditions and internal force diagrams, one may alternatively apply the traditional Finite Element or Finite Difference Methods (FEM, FDM) to determine the so-called critical moment representing this phenomenon. The principal purpose of this work is to compare three different ways of determining the critical moment, also in the context of structural sensitivity analysis with respect to the structural element length. Sensitivity gradients are determined using both the analytical approach and the central finite difference scheme, and are contrasted for the analytical, FEM, and FDM approaches. A computational study is provided for the entire family of steel I- and H-beams available to practitioners in this area, and forms a basis for further stochastic reliability analysis as well as durability prediction, including possible corrosion progress.
On the Local Convergence of Pattern Search
NASA Technical Reports Server (NTRS)
Dolan, Elizabeth D.; Lewis, Robert Michael; Torczon, Virginia; Bushnell, Dennis M. (Technical Monitor)
2000-01-01
We examine the local convergence properties of pattern search methods, complementing the previously established global convergence properties for this class of algorithms. We show that the step-length control parameter which appears in the definition of pattern search algorithms provides a reliable asymptotic measure of first-order stationarity. This gives an analytical justification for a traditional stopping criterion for pattern search methods. Using this measure of first-order stationarity, we analyze the behavior of pattern search in the neighborhood of an isolated local minimizer. We show that a recognizable subsequence converges r-linearly to the minimizer.
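The role of the step-length control parameter described above can be seen in a minimal compass (coordinate) pattern search: whenever no poll direction improves the objective, the step is halved, so a small final step length serves as a practical measure of approximate first-order stationarity. This is a simplified 2-D sketch under those assumptions, not the paper's algorithm:

```python
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=10_000):
    """Compass pattern search: poll +/- each coordinate direction;
    halve the step when no direction improves the objective.
    The final step length is the stationarity measure."""
    x = list(x)
    for _ in range(max_iter):
        if step < tol:
            break  # step length small: accept x as approximately stationary
        fx = f(x)
        improved = False
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                if f(trial) < fx:
                    x, fx = trial, f(trial)
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5  # unsuccessful poll: shrink the pattern
    return x, step

# Toy usage: minimize a separable quadratic with minimizer (1, -2).
x_min, final_step = compass_search(
    lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2, [0.0, 0.0]
)
```

Stopping when the step falls below a tolerance is exactly the traditional criterion whose asymptotic justification the paper establishes.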
Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam
2017-01-18
While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, the non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally resolved fluorescence microscopy, via which both the relative abundance of the sensor and its relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to the sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analyte in heterogeneous systems.
The proposed method would be especially relevant for fluorescent probes that undergo relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in cellular media due to strong cross-talk between energetically separated detection channels.
Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.
Wieczerzak, M; Namieśnik, J; Kudłak, B
2016-09-01
For centuries, mankind has contributed to irreversible environmental changes, but thanks to the science of recent decades, researchers can now assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad application of Green Chemistry principles should extend to all techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analysis do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry but could also complement, and sometimes provide an alternative to, conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.
Do placebo based validation standards mimic real batch products behaviour? Case studies.
Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E
2011-06-01
Analytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability that comes from the manufacturing process. The question that remains at the end of the validation step concerns the transferability of quantitative performance from validation standards to real, authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels, as well as using samples coming from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were various degrees of difference in the accuracy of results obtained with the two types of samples. Nonetheless, the use of spiked placebo validation standards was shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples to the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.
The kappa statistic in rehabilitation research: an examination.
Tooth, Leigh R; Ottenbacher, Kenneth J
2004-08-01
The number and sophistication of statistical procedures reported in medical rehabilitation research is increasing. Application of the principles and methods associated with evidence-based practice has contributed to the need for rehabilitation practitioners to understand quantitative methods in published articles. Outcomes measurement and determination of reliability are areas that have experienced rapid change during the past decade. In this study, distinctions between reliability and agreement are examined. Information is presented on analytical approaches for addressing reliability and agreement with the focus on the application of the kappa statistic. The following assumptions are discussed: (1) kappa should be used with data measured on a categorical scale, (2) the patients or objects categorized should be independent, and (3) the observers or raters must make their measurement decisions and judgments independently. Several issues related to using kappa in measurement studies are described, including use of weighted kappa, methods of reporting kappa, the effect of bias and prevalence on kappa, and sample size and power requirements for kappa. The kappa statistic is useful for assessing agreement among raters, and it is being used more frequently in rehabilitation research. Correct interpretation of the kappa statistic depends on meeting the required assumptions and accurate reporting.
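For two raters classifying the same items independently, the unweighted kappa discussed above compares the observed diagonal agreement of a confusion matrix with the agreement expected by chance from the marginal totals. A minimal sketch with hypothetical counts (not data from the study):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: rater A's categories, columns: rater B's)."""
    total = sum(sum(row) for row in confusion)
    # Observed agreement: proportion of items on the diagonal.
    p_o = sum(confusion[i][i] for i in range(len(confusion))) / total
    # Chance agreement expected from the marginal totals.
    row_totals = [sum(row) for row in confusion]
    col_totals = [sum(col) for col in zip(*confusion)]
    p_e = sum(r * c for r, c in zip(row_totals, col_totals)) / total ** 2
    return (p_o - p_e) / (1 - p_e)
```

Because p_e depends on the marginals, kappa can be low despite high raw agreement when category prevalence is skewed, which is one of the bias and prevalence effects the article examines.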
NASA Astrophysics Data System (ADS)
Iskandar, I.
2018-03-01
The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lengths of life in many cases and has a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull distribution. In this paper, our aim is to introduce the basic notions that constitute an exponential competing-risks model in reliability analysis using a Bayesian approach and to present the corresponding analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior function and estimates of the point, interval, hazard function, and reliability. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
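With independent exponential causes of failure, the quantities named above have closed forms: the reliability function is R(t) = exp(-Λt) with Λ the sum of the cause-specific rates, and the crude probability of failing from cause j by time t in the presence of the other causes is (λ_j/Λ)(1 - e^(-Λt)). A sketch of these standard identities (function names are illustrative, not from the paper):

```python
from math import exp

def reliability(t, rates):
    """Survival probability at time t with independent
    exponential failure causes (constant hazard rates)."""
    return exp(-sum(rates) * t)

def crude_failure_prob(t, rates, cause):
    """Crude probability of failing from `cause` by time t
    in the presence of the other independent exponential causes."""
    lam, total = rates[cause], sum(rates)
    return (lam / total) * (1.0 - exp(-total * t))
```

A useful sanity check is that the reliability plus the crude failure probabilities over all causes must sum to one at any time t.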
Techniques for sensing methanol concentration in aqueous environments
NASA Technical Reports Server (NTRS)
Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)
2001-01-01
An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.
Electrochemical genosensors in food safety assessment.
Martín-Fernández, Begoña; Manzanares-Palenzuela, C Lorena; Sánchez-Paniagua López, Marta; de-Los-Santos-Álvarez, Noemí; López-Ruiz, Beatriz
2017-09-02
The main goal of food safety assessment is to provide reliable information on the identity and composition of food and to reduce the presence of harmful components. Nowadays, there are many countries where, rather than the presence of pathogens, common public concerns are focused on the presence of hidden allergens, fraudulent practices, and genetic modifications in food. Accordingly, food regulations attempt to offer a high level of protection and to guarantee transparent information to consumers. The availability of analytical methods is essential to comply with these requirements. Protein-based strategies are usually employed for this purpose, but present some limitations. Because DNA is a more stable molecule, present in most tissues, and can be amplified, there has been increasing interest in developing DNA-based approaches (polymerase chain reaction, microarrays, and genosensors). In this regard, electrochemical genosensors may play a major role in fulfilling the needs of the food industry for reliable, portable, and affordable devices. This work reviews the achievements of this technology applied to allergen detection, species identification, and genetically modified organism testing. We summarize the legislative framework, current design strategies in sensor development, their analytical characteristics, and future prospects.
Big data analytics for the Future Circular Collider reliability and availability studies
NASA Astrophysics Data System (ADS)
Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter
2017-10-01
Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
Score Reliability of Adolescent Alcohol Screening Measures: A Meta-Analytic Inquiry
ERIC Educational Resources Information Center
Shields, Alan L.; Campfield, Delia C.; Miller, Christopher S.; Howell, Ryan T.; Wallace, Kimberly; Weiss, Roger D.
2008-01-01
This study describes the reliability reporting practices in empirical studies using eight adolescent alcohol screening tools and characterizes and explores variability in internal consistency estimates across samples. Of 119 observed administrations of these instruments, 40 (34%) reported usable reliability information. The Personal Experience…
Aptamer-Based Analysis: A Promising Alternative for Food Safety Control
Amaya-González, Sonia; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; Lobo-Castañón, Maria Jesús
2013-01-01
Ensuring food safety is nowadays a top priority of authorities and professional players in the food supply chain. One of the key challenges to determine the safety of food and guarantee a high level of consumer protection is the availability of fast, sensitive and reliable analytical methods to identify specific hazards associated to food before they become a health problem. The limitations of existing methods have encouraged the development of new technologies, among them biosensors. Success in biosensor design depends largely on the development of novel receptors with enhanced affinity to the target, while being stable and economical. Aptamers fulfill these characteristics, and thus have surfaced as promising alternatives to natural receptors. This Review describes analytical strategies developed so far using aptamers for the control of pathogens, allergens, adulterants, toxins and other forbidden contaminants to ensure food safety. The main progresses to date are presented, highlighting potential prospects for the future. PMID:24287543
Review of levoglucosan in glacier snow and ice studies: Recent progress and future perspectives.
You, Chao; Xu, Chao
2018-03-01
Levoglucosan (LEV) in glacier snow and ice layers provides a fingerprint of fire activity, ranging from modern air pollution to ancient fire emissions. In this study, we review recent progress in our understanding and application of LEV in glaciers, including analytical methods, transport and post-depositional processes, and historical records. We firstly summarize progress in analytical methods for determination of LEV in glacier snow and ice. Then, we discuss the processes influencing the records of LEV in snow and ice layers. Finally, we make some recommendations for future work, such as assessing the stability of LEV and obtaining continuous records, to increase reliability of the reconstructed ancient fire activity. This review provides an update for researchers working with LEV and will facilitate the further use of LEV as a biomarker in paleo-fire studies based on ice core records. Copyright © 2017 Elsevier B.V. All rights reserved.
Nonylphenol: Properties, legislation, toxicity and determination.
Araujo, Frederico G DE; Bauerfeldt, Glauco F; Cid, Yara Peluso
2017-08-07
This paper aims to gather and discuss important information about nonylphenol, such as its physicochemical properties, toxicity, and analytical methods in various matrices. As a degradation product of ethoxylated alkylphenols, nonylphenol presents a higher degree of reactivity than its precursor. Due to its harmful effects on the environment, the use and production of nonylphenol, alongside its precursors, has been banned in European Union countries. The USEPA guide on drinking water quality recommends a maximum concentration of 28 µg/L for fresh water. In Brazil, there is no clear legislation specifying maximum concentrations of nonylphenol. Given this lack of regulation, continuous monitoring of this pollutant in environmental samples is necessary. This paper also aims to encourage further studies on nonylphenol, seen as a critical environmental pollutant. Proper monitoring requires reliable analytical methods that are easy to perform in routine analysis.
Green's functions in equilibrium and nonequilibrium from real-time bold-line Monte Carlo
NASA Astrophysics Data System (ADS)
Cohen, Guy; Gull, Emanuel; Reichman, David R.; Millis, Andrew J.
2014-03-01
Green's functions for the Anderson impurity model are obtained within a numerically exact formalism. We investigate the limits of analytical continuation for equilibrium systems, and show that with real-time methods even sharp high-energy features can be reliably resolved. Continuing to an Anderson impurity in a junction, we evaluate two-time correlation functions, spectral properties, and transport properties, showing how the correspondence between the spectral function and the differential conductance breaks down when nonequilibrium effects are taken into account. Finally, a long-standing dispute regarding this model has involved the voltage splitting of the Kondo peak, an effect which was predicted over a decade ago by approximate analytical methods but never successfully confirmed by numerics. We settle the issue by demonstrating in an unbiased manner that this splitting indeed occurs.
Probabilistic Analysis of a Composite Crew Module
NASA Technical Reports Server (NTRS)
Mason, Brian H.; Krishnamurthy, Thiagarajan
2011-01-01
An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10⁻¹¹ due to the conservative nature of the factors of safety on the deterministic loads.
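Of the three estimation methods named above, direct Monte Carlo is the simplest to sketch: draw random parameter values from their uncertainty distributions and count how often a limit-state function goes negative (failure). The toy resistance-minus-load example below is purely illustrative and has nothing to do with the NESC finite element model:

```python
import random

def monte_carlo_failure_prob(limit_state, sample, n=100_000, seed=0):
    """Estimate P(failure) = P(limit_state(x) < 0) by direct Monte Carlo.

    `sample(rng)` draws one random parameter vector;
    `limit_state(x)` is negative exactly when x is a failure."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0)
    return failures / n

# Hypothetical R-S problem: resistance R ~ N(10, 1), load S ~ N(5, 1);
# failure when R - S < 0.
p = monte_carlo_failure_prob(
    lambda x: x[0] - x[1],
    lambda rng: (rng.gauss(10.0, 1.0), rng.gauss(5.0, 1.0)),
)
```

Direct sampling becomes impractical at probabilities as low as the 10⁻¹¹ reported for the CCM, which is why FORM and conditional (importance-type) sampling are used alongside it.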
Analysis of short-chain fatty acids in human feces: A scoping review.
Primec, Maša; Mičetić-Turk, Dušanka; Langerholc, Tomaž
2017-06-01
Short-chain fatty acids (SCFAs) play a crucial role in maintaining homeostasis in humans; therefore, the importance of accurate and reliable analytical detection of SCFAs has grown considerably in recent years. The aim of this scoping review is to show the trends in the development of different methods of SCFAs analysis in feces, based on the literature published in the last eleven years in all major indexing databases. The search criteria included analytical quantification techniques of SCFAs in different human clinical and in vivo studies. SCFAs analysis is still predominantly performed using gas chromatography (GC), followed by high performance liquid chromatography (HPLC), nuclear magnetic resonance (NMR) and capillary electrophoresis (CE). Performances, drawbacks and advantages of these methods are discussed, especially in the light of choosing a proper pretreatment, as feces is a complex biological material. Further optimization to develop a simple, cost effective and robust method for routine use is needed. Copyright © 2017 Elsevier Inc. All rights reserved.
Application of Dynamic Analysis in Semi-Analytical Finite Element Method.
Liu, Pengfei; Xing, Qinyan; Wang, Dawei; Oeser, Markus
2017-08-30
Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm for applying dynamic analysis in SAFEM is introduced in detail. Asphalt pavement models under moving loads were built in SAFEM and in the commercial finite element software ABAQUS to verify the accuracy and efficiency of SAFEM. The verification shows that the computational accuracy of SAFEM is sufficiently high and that its computational time is much shorter than that of ABAQUS. Moreover, experimental verification was carried out, and the prediction derived from SAFEM is consistent with the measurement. Therefore, SAFEM can reliably predict the dynamic response of asphalt pavement under moving loads, proving beneficial to road administrations in assessing pavement condition.
Simple and clean determination of tetracyclines by flow injection analysis
NASA Astrophysics Data System (ADS)
Rodríguez, Michael Pérez; Pezza, Helena Redigolo; Pezza, Leonardo
2016-01-01
An environmentally reliable analytical methodology was developed for direct quantification of tetracycline (TC) and oxytetracycline (OTC) using continuous flow injection analysis with spectrophotometric detection. The method is based on the diazo coupling reaction between the tetracyclines and diazotized sulfanilic acid in a basic medium, resulting in the formation of an intense orange azo compound that presents maximum absorption at 434 nm. Experimental design was used to optimize the analytical conditions. The proposed technique was validated over the concentration range of 1 to 40 μg mL⁻¹, and was successfully applied to samples of commercial veterinary pharmaceuticals. The detection (LOD) and quantification (LOQ) limits were 0.40 and 1.35 μg mL⁻¹, respectively. The samples were also analyzed by an HPLC method, and the results showed agreement with the proposed technique. The new flow injection method can be immediately used for quality control purposes in the pharmaceutical industry, facilitating monitoring in real time during the production processes of tetracycline formulations for veterinary use.
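LOD and LOQ figures like those quoted above are commonly derived from the calibration fit. A minimal sketch with made-up calibration points (the real data are not given in the abstract), using the widespread LOD = 3.3·s/slope and LOQ = 10·s/slope convention, where s is the residual standard deviation of the regression:

```python
# hypothetical calibration data: concentration (ug/mL) vs. absorbance
conc = [1.0, 5.0, 10.0, 20.0, 30.0, 40.0]
absb = [0.021, 0.104, 0.210, 0.415, 0.622, 0.829]

n = len(conc)
mx, my = sum(conc) / n, sum(absb) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, absb))
slope = sxy / sxx                 # calibration sensitivity
intercept = my - slope * mx

# residual standard deviation about the regression line (n - 2 dof)
resid = [y - (slope * x + intercept) for x, y in zip(conc, absb)]
s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.3 * s_res / slope   # limit of detection
loq = 10.0 * s_res / slope  # limit of quantification
```

By construction LOQ is always 10/3.3 ≈ 3 times the LOD under this convention, consistent with the ratio of the limits reported above.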
Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.
Grdinić, Vladimir; Vuković, Jadranka
2004-05-28
A complete prevalidation scheme, serving as a basic strategy for quality control and standardization of analytical procedures, is presented. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24) was elaborated, and guidelines as well as algorithms are given in detail. This strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which very often occurs in practice, is the most appropriate fit to the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process includes characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values, and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step is suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from a great number of analytical procedures. The favourable metrological characteristics of this procedure, taken as prevalidation figures of merit, confirm prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
Fair, Justin D.; Bailey, William F.; Felty, Robert A.; Gifford, Amy E.; Shultes, Benjamin; Volles, Leslie H.
2010-01-01
Development of a robust, reliable technique that permits rapid quantitation of volatile organic chemicals is an important first step in remediation associated with vapor intrusion. This paper describes the development of an analytical method that allows rapid and precise identification and quantitation of halogenated and nonhalogenated contaminants commonly found at the ppbv level at sites where vapor intrusion is a concern. PMID:20885969
Stochastic modeling of experimental chaotic time series.
Stemler, Thomas; Werner, Johannes P; Benner, Hartmut; Just, Wolfram
2007-01-26
Methods developed recently to obtain stochastic models of low-dimensional chaotic systems are tested in electronic circuit experiments. We demonstrate that reliable drift and diffusion coefficients can be obtained even when no excessive time scale separation occurs. Crisis induced intermittent motion can be described in terms of a stochastic model showing tunneling which is dominated by state space dependent diffusion. Analytical solutions of the corresponding Fokker-Planck equation are in excellent agreement with experimental data.
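The drift and diffusion coefficients the authors refer to are typically estimated from conditional moments of the increments (a Kramers-Moyal analysis). A toy sketch on a simulated Ornstein-Uhlenbeck process, not the circuit data; the parameters and binning are illustrative assumptions:

```python
import math
import random

random.seed(1)

# simulate an Ornstein-Uhlenbeck process: dx = -a*x dt + sqrt(2*D) dW
a, D, dt = 1.0, 0.5, 0.01
xs = [0.0]
for _ in range(200_000):
    xs.append(xs[-1] - a * xs[-1] * dt + math.sqrt(2 * D * dt) * random.gauss(0.0, 1.0))

# bin conditional moments of the increments to estimate
# drift D1(x) = <dx|x>/dt and diffusion D2(x) = <dx^2|x>/(2*dt)
nbins, lo, hi = 20, -2.0, 2.0
acc = [[0.0, 0.0, 0] for _ in range(nbins)]
for x0, x1 in zip(xs[:-1], xs[1:]):
    if lo <= x0 < hi:
        b = int((x0 - lo) / (hi - lo) * nbins)
        d = x1 - x0
        acc[b][0] += d
        acc[b][1] += d * d
        acc[b][2] += 1

centers, drift, diffusion = [], [], []
for b, (s1, s2, c) in enumerate(acc):
    if c > 200:  # keep only well-populated bins
        centers.append(lo + (b + 0.5) * (hi - lo) / nbins)
        drift.append(s1 / c / dt)
        diffusion.append(s2 / c / (2 * dt))

# the least-squares slope of the estimated drift should recover -a
mx = sum(centers) / len(centers)
my = sum(drift) / len(drift)
slope = (sum((x - mx) * (y - my) for x, y in zip(centers, drift))
         / sum((x - mx) ** 2 for x in centers))
mean_diffusion = sum(diffusion) / len(diffusion)
```

The recovered linear drift (slope ≈ -a) and nearly constant diffusion reflect the simulated model; for the crisis-induced intermittency in the paper, the interesting outcome is precisely that the estimated diffusion is state-dependent.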
Anderson, David M. G.; Floyd, Kyle A.; Barnes, Stephen; Clark, Judy M.; Clark, John I.; Mchaourab, Hassane; Schey, Kevin L.
2015-01-01
MALDI imaging requires careful sample preparation to obtain reliable, high quality images of small molecules, peptides, lipids, and proteins across tissue sections. Poor crystal formation, delocalization of analytes, and inadequate tissue adherence can affect the quality, reliability, and spatial resolution of MALDI images. We report a comparison of tissue mounting and washing methods that resulted in an optimized method using conductive carbon substrates that avoids thaw mounting or washing steps, minimizes protein delocalization, and prevents tissue detachment from the target surface. Application of this method to image ocular lens proteins of small vertebrate eyes demonstrates the improved methodology for imaging abundant crystallin protein products. This method was demonstrated for tissue sections from rat, mouse, and zebrafish lenses resulting in good quality MALDI images with little to no delocalization. The images indicate, for the first time in mouse and zebrafish, discrete localization of crystallin protein degradation products resulting in concentric rings of distinct protein contents that may be responsible for the refractive index gradient of vertebrate lenses. PMID:25665708
Mechanical and Electronic Approaches to Improve the Sensitivity of Microcantilever Sensors
Mutyala, Madhu Santosh Ku; Bandhanadham, Deepika; Pan, Liu; Pendyala, Vijaya Rohini; Ji, Hai-Feng
2010-01-01
Advances in the field of Micro Electro Mechanical Systems (MEMS) and their uses now offer unique opportunities in the design of ultrasensitive analytical tools. The analytical community continues to search for cost-effective, reliable, and even portable analytical techniques that can give fast, reliable results for a variety of chemicals and biomolecules. Microcantilevers (MCLs) have emerged as a unique platform for label-free biosensors and bioassays. Several electronic designs, including piezoresistive, piezoelectric, and capacitive approaches, have been applied to measure the bending or frequency change of the MCLs upon exposure to chemicals. This review summarizes mechanical, fabrication, and electronic approaches to increasing the sensitivity of microcantilever (MCL) sensors. PMID:20975987
Dai, Sheng-Yun; Xu, Bing; Zhang, Yi; Li, Jian-Yu; Sun, Fei; Shi, Xin-Yuan; Qiao, Yan-Jiang
2016-09-01
Coptis chinensis (Huanglian) is a commonly used traditional Chinese medicine (TCM) herb and alkaloids are the most important chemical constituents in it. In the present study, an isocratic reverse phase high performance liquid chromatography (RP-HPLC) method allowing the separation of six alkaloids in Huanglian was for the first time developed under the quality by design (QbD) principles. First, five chromatographic parameters were identified to construct a Plackett-Burman experimental design. The critical resolution, analysis time, and peak width were responses modeled by multivariate linear regression. The results showed that the percentage of acetonitrile, concentration of sodium dodecyl sulfate, and concentration of potassium phosphate monobasic were statistically significant parameters (P < 0.05). Then, the Box-Behnken experimental design was applied to further evaluate the interactions between the three parameters on selected responses. Full quadratic models were built and used to establish the analytical design space. Moreover, the reliability of design space was estimated by the Bayesian posterior predictive distribution. The optimal separation was predicted at 40% acetonitrile, 1.7 g·mL⁻¹ of sodium dodecyl sulfate and 0.03 mol·mL⁻¹ of potassium phosphate monobasic. Finally, the accuracy profile methodology was used to validate the established HPLC method. The results demonstrated that the QbD concept could be efficiently used to develop a robust RP-HPLC analytical method for Huanglian. Copyright © 2016 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
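The Plackett-Burman screening step can be sketched generically. The 8-run design construction and the simulated responses below are illustrative, not the Huanglian data; a factor's main effect is the difference between the mean response at its high and low levels, which the orthogonal design lets us compute as a simple inner product:

```python
import random

random.seed(3)

# 8-run Plackett-Burman design: cyclic shifts of the generator row,
# plus a final row of all -1 (supports up to 7 two-level factors)
gen = [1, 1, 1, -1, 1, -1, -1]
design = [gen[-i:] + gen[:-i] for i in range(7)] + [[-1] * 7]

# simulated responses: factors 0 and 2 are active (coefficients 3 and -2)
y = [10.0 + 3 * row[0] - 2 * row[2] + random.gauss(0.0, 0.1) for row in design]

# main effect of factor j: (2/N) * sum_i x_ij * y_i
N = len(design)
effects = [2.0 / N * sum(row[j] * yi for row, yi in zip(design, y))
           for j in range(7)]
```

Because the columns are orthogonal, the two active factors stand out (effects near 6 and -4, i.e. twice the regression coefficients) while the inert ones stay near zero; only the factors surviving such a screen go on to a response-surface design like Box-Behnken.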
Symmetry-plane model of 3D Euler flows: Mapping to regular systems and numerical solutions of blowup
NASA Astrophysics Data System (ADS)
Mulungye, Rachel M.; Lucas, Dan; Bustamante, Miguel D.
2014-11-01
We introduce a family of 2D models describing the dynamics on the so-called symmetry plane of the full 3D Euler fluid equations. These models depend on a free real parameter and can be solved analytically. For selected representative values of the free parameter, we apply the method introduced in [M.D. Bustamante, Physica D: Nonlinear Phenom. 240, 1092 (2011)] to map the fluid equations bijectively to globally regular systems. By comparing the analytical solutions with the results of numerical simulations, we establish that the numerical simulations of the mapped regular systems are far more accurate than the numerical simulations of the original systems, at the same spatial resolution and CPU time. In particular, the numerical integrations of the mapped regular systems produce robust estimates for the growth exponent and singularity time of the main blowup quantity (vorticity stretching rate), converging well to the analytically-predicted values even beyond the time at which the flow becomes under-resolved (i.e. the reliability time). In contrast, direct numerical integrations of the original systems develop unstable oscillations near the reliability time. We discuss the reasons for this improvement in accuracy, and explain how to extend the analysis to the full 3D case. Supported under the programme for Research in Third Level Institutions (PRTLI) Cycle 5 and co-funded by the European Regional Development Fund.
Emerson, Rachel M.
2015-01-01
Inorganic compounds in biomass, often referred to as ash, are known to be problematic in the thermochemical conversion of biomass to bio-oil or syngas and, ultimately, hydrocarbon fuels because they negatively influence reaction pathways, contribute to fouling and corrosion, poison catalysts, and impact waste streams. The most common ash-analysis methods, such as inductively coupled plasma-optical emission spectrometry/mass spectrometry (ICP-OES/MS), require considerable time and expensive reagents. Laser-induced breakdown spectroscopy (LIBS) is emerging as a technique for rapid analysis of the inorganic constituents in a wide range of biomass materials. This study compares analytical results using LIBS data to results obtained from three separate ICP-OES/MS methods for 12 samples, including six standard reference materials. Analyzed elements include aluminum, calcium, iron, magnesium, manganese, phosphorus, potassium, sodium, and silicon, and results show that concentrations can be measured with an uncertainty of approximately 100 parts per million using univariate calibration models and relatively few calibration samples. These results indicate that the accuracy of LIBS is comparable to that of ICP-OES methods and indicate that some acid-digestion methods for ICP-OES may not be reliable for Na and Al. These results also demonstrate that germanium can be used as an internal standard to improve the reliability and accuracy of measuring many elements of interest, and that LIBS can be used for rapid determination of total ash in biomass samples. Key benefits of LIBS include little sample preparation, no reagent consumption, and the generation of meaningful analytical data instantaneously. PMID:26733765
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and resulted in the introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL hazard analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2016-01-01
Analysis of user interactions in online communities could improve our understanding of health-related behaviors and inform the design of technological solutions that support behavior change. However, to achieve this we would need methods that provide granular perspective, yet are scalable. In this paper, we present a methodology for high-throughput semantic and network analysis of large social media datasets, combining semi-automated text categorization with social network analytics. We apply this method to derive content-specific network visualizations of 16,492 user interactions in an online community for smoking cessation. Performance of the categorization system was reasonable (average F-measure of 0.74, with system-rater reliability approaching rater-rater reliability). The resulting semantically specific network analysis of user interactions reveals content- and behavior-specific network topologies. Implications for socio-behavioral health and wellness platforms are also discussed.
Creep-rupture reliability analysis
NASA Technical Reports Server (NTRS)
Peralta-Duran, A.; Wirsching, P. H.
1984-01-01
A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time temperature parameters (TTP) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTP's. A method is developed for evaluating material dependent constants for TTP's. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long term behavior. Uncertainty in predicting long term behavior from short term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state of the art reliability methods to the design of components under creep.
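As a worked illustration of a time-temperature parameter, the Larson-Miller form is P = T(C + log10 t): a short-term rupture test at one temperature fixes P for a given stress, and long-term life at another temperature follows by inverting the relation. The constant C = 20 and the test point below are conventional, illustrative values, not data from the paper:

```python
import math

C = 20.0  # commonly assumed Larson-Miller constant

def lmp(T_rankine, t_hours):
    """Larson-Miller parameter P = T * (C + log10 t)."""
    return T_rankine * (C + math.log10(t_hours))

def rupture_time(T_rankine, P):
    """Invert the Larson-Miller relation: t = 10**(P/T - C)."""
    return 10 ** (P / T_rankine - C)

# hypothetical short-term test: rupture after 100 h at 1500 R
P = lmp(1500.0, 100.0)

# extrapolated life at 1400 R at the same stress (same P)
t_long = rupture_time(1400.0, P)
```

Here the 100 h test extrapolates to roughly 3.7×10³ h at the lower temperature; the probabilistic treatment in the paper addresses exactly the uncertainty that such long extrapolations from short-term tests carry.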
Visual analytics of brain networks.
Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming
2012-05-15
Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.
Reliable fusion of control and sensing in intelligent machines. Thesis
NASA Technical Reports Server (NTRS)
Mcinroy, John E.
1991-01-01
Although robotics research has produced a wealth of sophisticated control and sensing algorithms, very little research has been aimed at reliably combining these control and sensing strategies so that a specific task can be executed. To improve the reliability of robotic systems, analytic techniques are developed for calculating the probability that a particular combination of control and sensing algorithms will satisfy the required specifications. The probability can then be used to assess the reliability of the design. An entropy formulation is first used to quickly eliminate designs not capable of meeting the specifications. Next, a framework for analyzing reliability based on the first order second moment methods of structural engineering is proposed. To ensure performance over an interval of time, lower bounds on the reliability of meeting a set of quadratic specifications with a Gaussian discrete time invariant control system are derived. A case study analyzing visual positioning in a robotic system is considered. The reliability of meeting timing and positioning specifications in the presence of camera pixel truncation, forward and inverse kinematic errors, and Gaussian joint measurement noise is determined. This information is used to select a visual sensing strategy, a kinematic algorithm, and a discrete compensator capable of accomplishing the desired task. Simulation results using PUMA 560 kinematic and dynamic characteristics are presented.
Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M
2004-09-01
The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 μg for each of 12 compounds analyzed by LC/MS and 0.3-30 μg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m³ in general for LC/MS analytes and 0.005-0.5 mg/m³ for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m³ for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m³). 
Total fluorine results may be used to determine if the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery as well as some additional testing would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.
Aerothermodynamic shape optimization of hypersonic blunt bodies
NASA Astrophysics Data System (ADS)
Eyi, Sinan; Yumuşak, Mine
2015-07-01
The aim of this study is to develop a reliable and efficient design tool that can be used in hypersonic flows. The flow analysis is based on the axisymmetric Euler/Navier-Stokes and finite-rate chemical reaction equations. The equations are coupled simultaneously and solved implicitly using Newton's method. The Jacobian matrix is evaluated analytically. A gradient-based numerical optimization is used. The adjoint method is utilized for sensitivity calculations. The objective of the design is to generate a hypersonic blunt geometry that produces the minimum drag with low aerodynamic heating. Bezier curves are used for geometry parameterization. The performances of the design optimization method are demonstrated for different hypersonic flow conditions.
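The coupled-implicit solution strategy can be sketched on a toy problem: Newton's method with a hand-derived (analytical) Jacobian, here for two algebraic equations rather than the Euler/Navier-Stokes system of the paper:

```python
def residual(v):
    x, y = v
    return [x * x + y * y - 4.0, x - y]   # toy coupled nonlinear system

def jacobian(v):
    x, y = v
    return [[2 * x, 2 * y], [1.0, -1.0]]  # partial derivatives, derived analytically

def newton(v, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        f = residual(v)
        if abs(f[0]) + abs(f[1]) < tol:
            break
        J = jacobian(v)
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        # solve J * d = -f for the Newton update d (Cramer's rule for 2x2)
        d0 = (-f[0] * J[1][1] + f[1] * J[0][1]) / det
        d1 = (-J[0][0] * f[1] + J[1][0] * f[0]) / det
        v = [v[0] + d0, v[1] + d1]
    return v

root = newton([1.0, 2.0])  # converges to (sqrt(2), sqrt(2))
```

The analytic Jacobian gives the quadratic convergence that makes the fully coupled implicit solve practical; in adjoint-based design the same Jacobian is reused for the sensitivity system, which is why evaluating it analytically rather than by finite differences pays off twice.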
High-performance liquid chromatographic method for guanylhydrazone compounds.
Cerami, C; Zhang, X; Ulrich, P; Bianchi, M; Tracey, K J; Berger, B J
1996-01-12
A high-performance liquid chromatographic method has been developed for a series of aromatic guanylhydrazones that have demonstrated therapeutic potential as anti-inflammatory agents. The compounds were separated using octadecyl or diisopropyloctyl reversed-phase columns, with an acetonitrile gradient in water containing heptane sulfonate, tetramethylammonium chloride, and phosphoric acid. The method was used to reliably quantify levels of analyte as low as 785 ng/ml, and the detector response was linear to at least 50 micrograms/ml using a 100 microliters injection volume. The assay system was used to determine the basic pharmacokinetics of a lead compound, CNI-1493, from serum concentrations following a single intravenous injection in rats.
Trace metal speciation in natural waters: Computational vs. analytical
Nordstrom, D. Kirk
1996-01-01
Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. 
Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various chemical models for their range of applicability. Until a comparative approach such as this is taken, trace metal speciation will remain highly uncertain and controversial.
Akutsu, Kazuhiko; Kitagawa, Yoko; Yoshimitsu, Masato; Takatori, Satoshi; Fukui, Naoki; Osakada, Masakazu; Uchida, Kotaro; Azuma, Emiko; Kajimura, Keiji
2018-05-01
Polyethylene glycol 300 is commonly used as a base material for "analyte protection" in multiresidue pesticide analysis via gas chromatography-mass spectrometry. However, the disadvantage of the co-injection method using polyethylene glycol 300 is that it causes peak instability in α-cyano pyrethroids (type II pyrethroids) such as fluvalinate. In this study, we confirmed the instability phenomenon in type II pyrethroids and developed novel analyte protectants for acetone/n-hexane mixture solution to suppress the phenomenon. Our findings revealed that among the examined additive compounds, three lipophilic ascorbic acid derivatives, 3-O-ethyl-L-ascorbic acid, 6-O-palmitoyl-L-ascorbic acid, and 6-O-stearoyl-L-ascorbic acid, could effectively stabilize the type II pyrethroids in the presence of polyethylene glycol 300. A mixture of the three ascorbic acid derivatives and polyethylene glycol 300 proved to be an effective analyte protectant for multiresidue pesticide analysis. Further, we designed and evaluated a new combination of analyte protectant compounds without using polyethylene glycol or the troublesome hydrophilic compounds. Consequently, we obtained a set of 10 medium- and long-chain saturated fatty acids as an effective analyte protectant suitable for acetone/n-hexane solution that did not cause peak instability in type II pyrethroids. These analyte protectants will be useful in multiresidue pesticide analysis by gas chromatography-mass spectrometry in terms of ruggedness and reliable quantitation. Graphical abstract: Comparison of the effectiveness of adding lipophilic derivatives of ascorbic acid in controlling the instability of fluvalinate with polyethylene glycol 300.
Elkhoudary, Mahmoud M; Abdel Salam, Randa A; Hadad, Ghada M
2014-09-15
Metronidazole (MNZ) is a widely used antibacterial and amoebicide drug. Therefore, it is important to develop a rapid and specific analytical method for the determination of MNZ in mixture with Spiramycin (SPY), Diloxanide (DIX) and Cliquinol (CLQ) in pharmaceutical preparations. This work describes six simple, sensitive and reliable multivariate calibration methods, namely linear and nonlinear artificial neural networks preceded by genetic algorithm (GA-ANN) and principal component analysis (PCA-ANN) as well as partial least squares (PLS) either alone or preceded by genetic algorithm (GA-PLS), for UV spectrophotometric determination of MNZ, SPY, DIX and CLQ in pharmaceutical preparations with no interference of pharmaceutical additives. The results illustrate the problem of nonlinearity and how models like ANN can handle it. Analytical performance of these methods was statistically validated with respect to linearity, accuracy, precision and specificity. The developed methods demonstrate the ability of the multivariate calibration models to handle and resolve the UV spectra of the four-component mixtures using an easy and widely available UV spectrophotometer. Copyright © 2014 Elsevier B.V. All rights reserved.
Walorczyk, Stanisław; Drożdżyński, Dariusz; Kowalska, Jolanta; Remlein-Starosta, Dorota; Ziółkowski, Andrzej; Przewoźniak, Monika; Gnusowski, Bogusław
2013-08-15
A sensitive, accurate and reliable multiresidue method based on the application of gas chromatography-tandem quadrupole mass spectrometry (GC-QqQ-MS/MS) has been established for screening, identification and quantification of a large number of pesticide residues in produce. The method was accredited in compliance with PN-EN ISO/IEC 17025:2005 standard and it was operated under flexible scope as PB-11 method. The flexible scope of accreditation allowed for minor modifications and extension of the analytical scope while using the same analytical technique. During the years 2007-2010, the method was used for the purpose of verification of organic crop production by multiresidue analysis for the presence of pesticides. A total of 528 samples of differing matrices such as fruits, vegetables, cereals, plant leaves and other green parts were analysed, of which 4.4% samples contained pesticide residues above the threshold value of 0.01 mg/kg. A total of 20 different pesticide residues were determined in the samples. Copyright © 2013 Elsevier Ltd. All rights reserved.
Moliner-Martínez, Y; Herráez-Hernández, R; Campíns-Falcó, P
2007-09-14
A new microscale method is presented for the determination of ammonium and primary short-chain aliphatic amines (methylamine, ethylamine, propylamine, n-butylamine and n-pentylamine) in water. The assay uses precolumn derivatization with the reagent o-phthaldialdehyde (OPA) in combination with the thiol N-acetyl-L-cysteine (NAC), and capillary liquid chromatography with UV detection at 330 nm. The described method is very simple and rapid, as no preconcentration of the analytes is necessary and the volume of sample required is only 0.1 mL. Under the proposed conditions, good linearity was obtained up to an analyte concentration of 10.0 mg L⁻¹, with limits of detection of 8-50 μg L⁻¹. No matrix effect was found, and recoveries between 97 and 110% were obtained. The precision of the method was good, with variation coefficients below 12%. The reliability of the proposed approach has been tested by analyzing a microsample of fogwater collected from leaf surfaces.
NASA Astrophysics Data System (ADS)
Elkhoudary, Mahmoud M.; Abdel Salam, Randa A.; Hadad, Ghada M.
2014-09-01
Metronidazole (MNZ) is a widely used antibacterial and amoebicide drug. Therefore, it is important to develop a rapid and specific analytical method for the determination of MNZ in mixtures with Spiramycin (SPY), Diloxanide (DIX) and Clioquinol (CLQ) in pharmaceutical preparations. This work describes six simple, sensitive and reliable multivariate calibration methods, namely linear and nonlinear artificial neural networks preceded by genetic algorithm (GA-ANN) and principal component analysis (PCA-ANN), as well as partial least squares (PLS) either alone or preceded by genetic algorithm (GA-PLS), for UV spectrophotometric determination of MNZ, SPY, DIX and CLQ in pharmaceutical preparations with no interference from pharmaceutical additives. The results illustrate the problem of nonlinearity and how models such as ANN can handle it. Analytical performance of these methods was statistically validated with respect to linearity, accuracy, precision and specificity. The developed methods demonstrate the ability of these multivariate calibration models to resolve the UV spectra of the four-component mixtures using a simple and widely used UV spectrophotometer.
Foster, Scott D; Feutry, Pierre; Grewe, Peter M; Berry, Oliver; Hui, Francis K C; Davies, Campbell R
2018-06-26
Delineating naturally occurring and self-sustaining sub-populations (stocks) of a species is an important task, especially for species harvested from the wild. Despite its central importance to natural resource management, analytical methods used to delineate stocks are often, and increasingly, borrowed from superficially similar analytical tasks in human genetics, even though models specifically for stock identification have been previously developed. Unfortunately, the analytical tasks in resource management and human genetics are not identical: questions about humans are typically aimed at inferring ancestry (often referred to as 'admixture') rather than breeding stocks. In this article, we argue, and show through simulation experiments and an analysis of yellowfin tuna data, that ancestral analysis methods are not always appropriate for stock delineation. In this work, we advocate a variant of a previously introduced and simpler model that identifies stocks directly. We also highlight that the computational aspects of the analysis, irrespective of the model, are difficult. We introduce some alternative computational methods and quantitatively compare these methods to each other and to established methods. We also present a method for quantifying uncertainty in model parameters and in assignment probabilities. In doing so, we demonstrate that point estimates can be misleading. One of the computational strategies presented here, based on an expectation-maximisation algorithm with judiciously chosen starting values, is robust and has a modest computational cost. This article is protected by copyright. All rights reserved.
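The expectation-maximisation strategy highlighted above can be illustrated with a minimal sketch. This is not the authors' implementation: it fits a toy K-stock mixture of binomial allele-frequency models to biallelic genotypes (minor-allele counts 0/1/2), with multiple random starts standing in for the "judiciously chosen" starting values; all function and parameter names are hypothetical.

```python
import math
import random

def em_stocks(genotypes, K, n_starts=5, iters=100, seed=1):
    """EM for a K-stock mixture of per-locus binomial allele-frequency
    models. genotypes: rows of minor-allele counts (0/1/2) per locus.
    Runs several random starts and keeps the best log-likelihood fit.
    The binomial coefficient is omitted (constant in the parameters)."""
    rng = random.Random(seed)
    n, L = len(genotypes), len(genotypes[0])
    best = None
    for _ in range(n_starts):
        p = [[rng.uniform(0.05, 0.95) for _ in range(L)] for _ in range(K)]
        w = [1.0 / K] * K
        for _ in range(iters):
            # E-step: responsibility of each stock for each individual
            resp, ll = [], 0.0
            for g in genotypes:
                logs = []
                for k in range(K):
                    lp = math.log(w[k])
                    for l, c in enumerate(g):
                        pk = min(max(p[k][l], 1e-9), 1 - 1e-9)
                        lp += c * math.log(pk) + (2 - c) * math.log(1 - pk)
                    logs.append(lp)
                m = max(logs)
                tot = sum(math.exp(x - m) for x in logs)
                ll += m + math.log(tot)
                resp.append([math.exp(x - m) / tot for x in logs])
            # M-step: update mixing weights and allele frequencies
            w = [max(sum(r[k] for r in resp) / n, 1e-12) for k in range(K)]
            for k in range(K):
                nk = max(sum(r[k] for r in resp), 1e-12)
                for l in range(L):
                    p[k][l] = sum(r[k] * g[l]
                                  for r, g in zip(resp, genotypes)) / (2 * nk)
        if best is None or ll > best[0]:
            best = (ll, w, p, resp)
    return best  # (log-likelihood, weights, allele freqs, responsibilities)
```

With strongly differentiated stocks, the responsibilities approach hard assignments, which is the sense in which such a model "identifies stocks directly" rather than estimating admixture proportions.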
Pandey, Renu; Chandra, Preeti; Srivastava, Mukesh; Mishra, D K; Kumar, Brijesh
2015-01-01
Ocimum sanctum L., with phenolic acids, flavonoids, propenyl phenols and terpenoids as active pharmacological constituents, is a popular medicinal herb and is present as an ingredient in many herbal formulations. Therefore, development of a reliable analytical method for simultaneous determination of the pharmacologically active constituents of O. sanctum is of high importance. The aim was to develop and validate a new, rapid, sensitive and selective UPLC-ESI/MS/MS method for simultaneous determination of 23 bioactive markers, including phenolic acids, flavonoids, propenyl phenols and terpenoids, in the leaf extract and marketed herbal formulations of O. sanctum. A UPLC-ESI/MS/MS method using negative electrospray ionisation (ESI) in multiple-reaction-monitoring (MRM) mode was used for simultaneous determination. Chromatographic separation was achieved on an Acquity UPLC BEH C18 column using gradient elution with 0.1% formic acid in water and 0.1% formic acid in acetonitrile. Principal component analysis (PCA) was applied to correlate and discriminate eight geographical collections of O. sanctum based on quantitative data of the analytes. The developed method was validated as per International Conference on Harmonization guidelines and found to be accurate, with overall recovery in the range 95.09-104.84% (RSD ≤ 1.85%), precise (RSD ≤ 1.98%) and linear (r(2) ≥ 0.9971) over the concentration range of 0.5-1000 ng/mL. Ursolic acid was found to be the most abundant marker in all the samples investigated, except for the marketed tablet. The method established is simple, rapid and sensitive, hence it can be reliably utilised for the quality control of O. sanctum and derived herbal formulations. Copyright © 2015 John Wiley & Sons, Ltd.
Kim, Min-A; Sim, Hye-Min; Lee, Hye-Seong
2016-11-01
As reformulations and processing changes are increasingly needed in the food industry to produce healthier, more sustainable, and cost-effective products while maintaining superior quality, reliable measurements of consumers' sensory perception and discrimination are becoming more critical. Consumer discrimination methods using a preferred-reference duo-trio test design have been shown to be effective in improving discrimination performance by customizing sample presentation sequences. However, this design can add complexity to the discrimination task for some consumers, resulting in more errors in sensory discrimination. The objective of the present study was to investigate the effects of different types of test instructions using the preferred-reference duo-trio test design, in which a paired-preference test is followed by 6 repeated preferred-reference duo-trio tests, in comparison to the analytical method using the balanced-reference duo-trio. Analyses of d' estimates (product-related measure) and probabilistic sensory discriminators in momentary numbers of subjects showing statistical significance (subject-related measure) revealed that only the preferred-reference duo-trio test using affective reference-framing, either by providing no information about the reference or information on a previously preferred sample, improved sensory discrimination more than the analytical method. No decrease in discrimination performance was observed with any type of instruction, confirming that consumers could handle the test methods. These results suggest that when repeated tests are feasible, using the affective discrimination method would be operationally more efficient as well as ecologically more reliable for measuring consumers' sensory discrimination ability. Copyright © 2016 Elsevier Ltd. All rights reserved.
Methods of Measurement in epidemiology: Sedentary Behaviour
Atkin, Andrew J; Gorely, Trish; Clemes, Stacy A; Yates, Thomas; Edwardson, Charlotte; Brage, Soren; Salmon, Jo; Marshall, Simon J; Biddle, Stuart JH
2012-01-01
Background: Research examining sedentary behaviour as a potentially independent risk factor for chronic disease morbidity and mortality has expanded rapidly in recent years. Methods: We present a narrative overview of the sedentary behaviour measurement literature. Subjective and objective methods of measuring sedentary behaviour suitable for use in population-based research with children and adults are examined. The validity and reliability of each method is considered, gaps in the literature specific to each method identified and potential future directions discussed. Results: To date, subjective approaches to sedentary behaviour measurement, e.g. questionnaires, have focused predominantly on TV viewing or other screen-based behaviours. Typically, such measures demonstrate moderate reliability but slight to moderate validity. Accelerometry is increasingly being used for sedentary behaviour assessments; this approach overcomes some of the limitations of subjective methods, but detection of specific postures and postural changes by this method is somewhat limited. Instruments developed specifically for the assessment of body posture have demonstrated good reliability and validity in the limited research conducted to date. Miniaturization of monitoring devices, interoperability between measurement and communication technologies and advanced analytical approaches are potential avenues for future developments in this field. Conclusions: High-quality measurement is essential in all elements of sedentary behaviour epidemiology, from determining associations with health outcomes to the development and evaluation of behaviour change interventions. Sedentary behaviour measurement remains relatively under-developed, although new instruments, both objective and subjective, show considerable promise and warrant further testing. PMID:23045206
Arantes de Carvalho, Gabriel G; Kondaveeti, Stalin; Petri, Denise F S; Fioroto, Alexandre M; Albuquerque, Luiza G R; Oliveira, Pedro V
2016-12-01
Analytical methods for the determination of rare earth elements (REE) in natural waters by plasma spectrochemical techniques often require sample preparation procedures for analyte preconcentration as well as for removing matrix constituents that may interfere with the analytical measurements. In the present work, calcium alginate (CA) beads were used for the first time for Ce, La and Nd preconcentration from groundwater samples prior to determination by inductively coupled plasma optical emission spectrometry (ICP OES). Test samples were analyzed in batch mode by transferring a 40 mL test portion (pH=5±0.2) into a 50 mL polyethylene flask containing 125 mg CA beads. After 15 min contact, the analytes were quantitatively extracted from the loaded CA beads with 2.0 mL of 1.0 mol L(-1) HCl solution for determination by ICP OES, using the Ce (II) 456.236, La (II) 379.478 and Nd (II) 430.358 nm emission lines. The proposed approach is a reliable alternative for REE single-stage preconcentration from aqueous samples, as it provided accurate results based on addition and recovery analysis of groundwater. The results obtained by the proposed method were also compared with those from a reference method based on inductively coupled plasma mass spectrometry (ICP-MS), and no significant differences were observed after applying the Student's t-test at the 95% confidence level. Copyright © 2016 Elsevier B.V. All rights reserved.
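The method-comparison step described above, checking a proposed method against a reference method with a Student's t-test at the 95% confidence level, can be sketched as a paired t statistic. A minimal sketch with illustrative numbers, not the paper's data; `paired_t` is a hypothetical helper:

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired Student's t statistic for comparing two analytical methods
    applied to the same samples. Returns (t, degrees of freedom).
    stdev uses the n-1 denominator, as required for the sample SD."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    t = mean(d) / (stdev(d) / math.sqrt(n))
    return t, n - 1
```

In practice |t| is compared with the two-sided tabulated critical value t(0.975, df); |t| below the critical value means no significant difference between the two methods at 95% confidence.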
Liang, Xiao-Ping; Liang, Qiong-Lin; Xia, Jian-Fei; Wang, Yong; Hu, Ping; Wang, Yi-Ming; Zheng, Xiao-Ying; Zhang, Ting; Luo, Guo-An
2009-06-15
Disturbances in maternal folate, homocysteine, and glutathione metabolism have been reported to be associated with neural tube defects (NTDs). However, the role played by specific components in the metabolic pathways leading to NTDs remains unclear. Thus, an analytical method for simultaneous measurement of sixteen compounds involved in these three metabolic pathways by high performance liquid chromatography-tandem mass spectrometry was developed. The use of a hydrophilic chromatography column improved the separation of polar analytes, and the multiple-reaction monitoring (MRM) detection mode enhanced the specificity and sensitivity, allowing simultaneous determination of three classes of metabolites that vary widely in polarity and abundance. The influence of parameters such as temperature, pH and flow rate on the performance of the analytes was studied to obtain optimal conditions. The method was validated for its linearity, accuracy, and precision, and applied to the analysis of serum samples from NTD-affected pregnancies and normal women. The results showed that the present method is sensitive and reliable for simultaneous determination of as many as sixteen metabolites of interest, which may provide a new means to study the underlying mechanism of NTDs as well as to discover new potential biomarkers.
He, Xiaoqin; Xi, Cunxian; Tang, Bobin; Wang, Guomin; Chen, Dongdong; Peng, Tao; Mu, Zhaode
2014-01-01
A novel analytical method employing solid-phase extraction (SPE) coupled with ultra-high-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) was developed for the simultaneous determination of 30 hormones in anti-ageing functional foods (capsules, powders and tablets). The analytes were extracted with acetic acid-acetonitrile (1:99, v/v), methanol and acetone, respectively. The extract was purified using a combined column, followed by analyte detection with electrospray ionisation in positive- or negative-ion modes. The results indicated that the 30 compounds had good linear correlations in the range of 1-1000 μg kg⁻¹, with correlation coefficients above 0.99. The limits of detection (LOD) and limits of quantification (LOQ) were 0.03-2 and 0.1-5 μg kg⁻¹, respectively. The average recovery of the 30 compounds at the three spiked levels varied from 74.7% to 124.1%, and the relative standard deviation (RSD) was 2.4-15.0%. The method was applied to the analysis of hormones in 14 real samples; seven hormones (such as estrone and dienestrol) were detected in four samples, while the remaining hormones were not detected. The developed method is sensitive, efficient, reliable and applicable to real samples.
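The linearity and limit figures reported in validations like the one above (correlation coefficient, LOD, LOQ) are typically derived from an ordinary least-squares calibration line. A minimal sketch assuming the common ICH-style limit definitions (LOD = 3.3·s/slope, LOQ = 10·s/slope, with s the residual standard deviation), not the authors' exact procedure:

```python
import math

def calibration_stats(conc, signal):
    """Ordinary least-squares calibration line: returns slope, intercept,
    r^2, and ICH-style LOD/LOQ computed from the residual standard
    deviation s_res (n-2 degrees of freedom)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2
                 for x, y in zip(conc, signal))
    ss_tot = sum((y - my) ** 2 for y in signal)
    r2 = 1 - ss_res / ss_tot
    s_res = math.sqrt(ss_res / (n - 2))
    return slope, intercept, r2, 3.3 * s_res / slope, 10 * s_res / slope
```

The 3.3 and 10 multipliers are the conventional signal-to-noise criteria; laboratories also estimate LOD/LOQ from spiked blanks, so this is one of several accepted routes.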
Nakazawa, Hiroyuki; Iwasaki, Yusuke; Ito, Rie
2014-01-01
Our modern society has created a large number of chemicals that are used for the production of everyday commodities including toys, food packaging, cosmetic products, and building materials. We enjoy a comfortable and convenient lifestyle with access to these items. In addition, in specialized areas such as experimental science and various medical fields, laboratory equipment and devices manufactured using a wide range of chemical substances are also extensively employed. The association between human exposure to trace hazardous chemicals and an increased incidence of endocrine disease has been recognized. The evaluation of human exposure to such endocrine-disrupting chemicals is therefore imperative, and the determination of exposure levels requires the analysis of human biological materials, such as blood and urine. To obtain as much information as possible from limited sample sizes, highly sensitive and reliable analytical methods are required for exposure assessments. The present review focuses on effective analytical methods for the quantification of bisphenol A (BPA), alkylphenols (APs), phthalate esters (PEs), and perfluorinated chemicals (PFCs), which are chemicals used in the production of everyday commodities. Using data obtained from liquid chromatography/mass spectrometry (LC/MS) and LC/MS/MS analyses, assessments of the risks to humans are also presented based on the estimated levels of exposure to PFCs.
Covaci, Adrian; Voorspoels, Stefan; Abdallah, Mohamed Abou-Elwafa; Geens, Tinne; Harrad, Stuart; Law, Robin J
2009-01-16
The present article reviews the available literature on the analytical and environmental aspects of tetrabromobisphenol-A (TBBP-A), a currently intensively used brominated flame retardant (BFR). Analytical methods, including sample preparation, chromatographic separation, detection techniques, and quality control are discussed. An important recent development in the analysis of TBBP-A is the growing tendency for liquid chromatographic techniques. At the detection stage, mass-spectrometry is a well-established and reliable technology in the identification and quantification of TBBP-A. Although interlaboratory exercises for BFRs have grown in popularity in the last 10 years, only a few participating laboratories report concentrations for TBBP-A. Environmental levels of TBBP-A in abiotic and biotic matrices are low, probably due to the major use of TBBP-A as reactive FR. As a consequence, the expected human exposure is low. This is in agreement with the EU risk assessment that concluded that there is no risk for humans concerning TBBP-A exposure. Much less analytical and environmental information exists for the various groups of TBBP-A derivatives which are largely used as additive flame retardants.
Dunand, Marielle; Donzelli, Massimiliano; Rickli, Anna; Hysek, Cédric M; Liechti, Matthias E; Grouzmann, Eric
2014-08-01
The diagnosis of pheochromocytoma relies on the measurement of plasma free metanephrines, an assay whose reliability has been considerably improved by ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Here we report an analytical interference occurring between 4-hydroxy-3-methoxymethamphetamine (HMMA), a metabolite of 3,4-methylenedioxymethamphetamine (MDMA, "Ecstasy"), and normetanephrine (NMN), since they share a common pharmacophore resulting in the same product ion after fragmentation. Synthetic HMMA was spiked into plasma samples containing various concentrations of NMN, and the intensity of the interference was determined by UHPLC-MS/MS before and after improvement of the analytical method. Through careful adjustment of the chromatographic conditions, including a change of the UHPLC analytical column, we were able to distinguish both compounds. HMMA interference in NMN determination should be seriously considered, since MDMA activates the sympathetic nervous system and, if confounded with NMN, may lead to false-positive results during the differential diagnosis of pheochromocytoma. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Determination of vitamin C in foods: current state of method validation.
Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C
2014-11-21
Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and drawbacks of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of the precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.
Improved vertical streambed flux estimation using multiple diurnal temperature methods in series
Irvine, Dylan J.; Briggs, Martin A.; Cartwright, Ian; Scruggs, Courtney; Lautz, Laura K.
2017-01-01
Analytical solutions that use diurnal temperature signals to estimate vertical fluxes between groundwater and surface water based on either amplitude ratios (Ar) or phase shifts (Δϕ) produce results that rarely agree. Analytical solutions that simultaneously utilize Ar and Δϕ within a single solution have more recently been derived, decreasing uncertainty in flux estimates in some applications. Combined (ArΔϕ) methods offer the additional benefit that thermal diffusivity and sensor spacing can be calculated. However, poor identification of either Ar or Δϕ from raw temperature signals can lead to erratic parameter estimates from ArΔϕ methods. An add-on program for VFLUX 2 is presented to address this issue. Using thermal diffusivity selected from an ArΔϕ method during a reliable time period, fluxes are recalculated using an Ar method. This approach maximizes the benefits of the Ar and ArΔϕ methods. Additionally, sensor spacing calculations can be used to identify periods with unreliable flux estimates, or to assess streambed scour. Using synthetic and field examples, the use of these solutions in series was particularly useful for gaining conditions where fluxes exceeded 1 m/d.
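The amplitude-ratio (Ar) family of solutions mentioned above can be sketched as follows. This is a generic illustration of the implicit Ar solution of Hatch et al. (2006) solved by fixed-point iteration, not the VFLUX 2 code; the parameter values in the usage are hypothetical:

```python
import math

def ar_front_velocity(Ar, dz, kappa, P=86400.0, tol=1e-10, max_iter=500):
    """Thermal front velocity (m/s, positive downward) from the diurnal
    amplitude ratio Ar = A_deep/A_shallow between sensors dz metres
    apart, following the implicit amplitude-ratio relation
        v = (2*kappa/dz)*ln(Ar) + sqrt((alpha + v^2)/2),
        alpha = sqrt(v^4 + (8*pi*kappa/P)^2),
    with kappa the effective thermal diffusivity (m^2/s) and P the
    signal period (s). Solved here by simple fixed-point iteration."""
    v = 0.0
    for _ in range(max_iter):
        alpha = math.sqrt(v ** 4 + (8 * math.pi * kappa / P) ** 2)
        v_new = (2 * kappa / dz) * math.log(Ar) + math.sqrt((alpha + v ** 2) / 2)
        if abs(v_new - v) < tol:
            return v_new
        v = v_new
    return v
```

As a sanity check, when Ar equals the pure-conduction damping exp(-dz·sqrt(π/(κP))) the solver returns a velocity of zero; larger Ar (less damping) gives a positive, downward front velocity. Converting the front velocity to a Darcy flux additionally requires the ratio of the volumetric heat capacities of water and the saturated sediment.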
Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients
ERIC Educational Resources Information Center
Andersson, Björn; Xin, Tao
2018-01-01
In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Qiao; Liang, WanZhen, E-mail: liangwz@xmu.edu.cn; Liu, Jie
2014-05-14
This work extends our previous works [J. Liu and W. Z. Liang, J. Chem. Phys. 135, 014113 (2011); J. Liu and W. Z. Liang, J. Chem. Phys. 135, 184111 (2011)] on the analytical excited-state energy Hessian within the framework of time-dependent density functional theory (TDDFT) to couple with molecular mechanics (MM). The formalism, implementation, and applications of analytical first and second energy derivatives of the TDDFT/MM excited state with respect to nuclear and electric perturbations are presented. Their performances are demonstrated by calculations of adiabatic excitation energies, excited-state geometries, harmonic vibrational frequencies, and infrared intensities for a number of benchmark systems. The consistent results with the full quantum mechanical method and other hybrid theoretical methods indicate the reliability of the current numerical implementation of the developed algorithms. The computational accuracy and efficiency of the current analytical approach are also checked, and computationally efficient strategies are suggested to speed up the calculations of complex systems with many MM degrees of freedom. Finally, we apply the current analytical approach in TDDFT/MM to a realistic system, a red fluorescent protein chromophore together with part of its nearby protein matrix. The calculated results indicate that the rearrangement of the hydrogen bond interactions between the chromophore and the protein matrix is responsible for the large Stokes shift.
Utility of the Rosenberg self-esteem scale.
Davis, Clare; Kellett, Stephen; Beail, Nigel
2009-05-01
The Rosenberg Self-Esteem Scale (RSES) continues to be used to purportedly measure self-esteem of people with intellectual disabilities, despite the lack of sound evidence concerning its validity and reliability when employed with this population. The psychometric foundations of the RSES were analyzed here with a sample of 219 participants with intellectual disabilities. The factor analytic methods employed revealed two factors (Self-Worth and Self-Criticism) and more specific problems with RSES Items 5 and 8. Overall, this scale showed only moderate temporal and moderate internal reliability and poor aspects of criterion validity. Results are discussed with reference to either developing a new measure of self-esteem or redesigning and simplifying the RSES in order to increase its initial face validity in intellectual disability samples.
The Wartegg Zeichen Test: a literature overview and a meta-analysis of reliability and validity.
Soilevuo Grønnerød, Jarna; Grønnerød, Cato
2012-06-01
All available studies on the Wartegg Zeichen Test (WZT; Wartegg, 1939) were collected and evaluated through a literature overview and a meta-analysis. The literature overview shows that the history of the WZT reflects the geographical and language-based processes of marginalization where relatively isolated traditions have lived and vanished in different parts of the world. The meta-analytic review indicates a high average interscorer reliability of rw = .74 and high validity effect size for studies with clear hypotheses of rw = .33. Although the results were strong, we conclude that the WZT research has not been able to establish cumulative knowledge of the method because of the isolation of research traditions. (c) 2012 APA, all rights reserved
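Meta-analytic averages of correlation-type coefficients, such as the interscorer reliability rw reported above, are commonly obtained by pooling on Fisher's z scale, where sampling distributions are approximately normal. A generic sketch with the standard n-3 inverse-variance weights, not the authors' exact weighting scheme; the example values are illustrative:

```python
import math

def mean_correlation(rs, ns):
    """Meta-analytic mean of correlation coefficients: transform each r
    with Fisher's z = atanh(r), take a weighted mean with weights n - 3
    (the inverse variance of z), and back-transform with tanh."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(zbar)
```

Pooling on the z scale avoids the bias that comes from averaging bounded r values directly, which matters most when the correlations are high, as with interscorer reliabilities near .74.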
Electrochemical hydrogen sulfide biosensors.
Xu, Tailin; Scafa, Nikki; Xu, Li-Ping; Zhou, Shufeng; Abdullah Al-Ghanem, Khalid; Mahboob, Shahid; Fugetsu, Bunshi; Zhang, Xueji
2016-02-21
The measurement of sulfide, especially hydrogen sulfide, has held the attention of the analytical community due to its unique physiological and pathophysiological roles in biological systems. Electrochemical detection offers a rapid, highly sensitive, affordable, simple, and real-time technique to measure hydrogen sulfide concentration, which has been a well-documented and reliable method. This review details up-to-date research on the electrochemical detection of hydrogen sulfide (ion selective electrodes, polarographic hydrogen sulfide sensors, etc.) in biological samples for potential therapeutic use.
Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz
2018-01-18
In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for novel methodologies for the analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. Nicotine, N,N-diethyl-meta-toluamide, and diclofenac, analytes with sufficiently diverse physicochemical features, were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin-film SPME was investigated. Results revealed substantial depletion and consequent disruption of the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast, reliable free and total concentration determinations without disruption of the system equilibrium.
NASA Technical Reports Server (NTRS)
Johnson, Theodore F.; Waters, W. Allen; Singer, Thomas N.; Haftka, Raphael T.
2004-01-01
A next generation reusable launch vehicle (RLV) will require thermally efficient and lightweight cryogenic propellant tank structures. Since these tanks will be weight-critical, analytical tools must be developed to aid in sizing the thickness of insulation layers and the structural geometry for optimal performance. Finite element method (FEM) models of the tank and insulation layers were created to analyze the thermal performance of the cryogenic insulation layer and thermal protection system (TPS) of the tanks. The thermal conditions of ground-hold and re-entry/soak-through for a typical RLV mission were used in the thermal sizing study. A general-purpose nonlinear FEM analysis code, capable of using temperature- and pressure-dependent material properties, was used as the thermal analysis code. Mechanical loads from ground handling and proof-pressure testing were used to size the structural geometry of an aluminum cryogenic tank wall. Nonlinear deterministic optimization and reliability optimization techniques were the analytical tools used to size the geometry of the isogrid stiffeners and the thickness of the skin. The results from the sizing study indicate that a commercial FEM code can be used for thermal analyses to size the insulation thicknesses where the temperature and pressure were varied. The results from the structural sizing study show that combining deterministic and reliability optimization techniques can yield alternative, lighter designs than deterministic optimization methods alone.
Application of advanced control techniques to aircraft propulsion systems
NASA Technical Reports Server (NTRS)
Lehtinen, B.
1984-01-01
Two programs are described which involve the application of advanced control techniques to the design of engine control algorithms. Multivariable control theory is used in the F100 MVCS (multivariable control synthesis) program to design controls which coordinate the control inputs for improved engine performance. A systematic method for handling a complex control design task is given. Methods of analytical redundancy are aimed at increasing control system reliability. The F100 DIA (detection, isolation, and accommodation) program, which investigates the use of software to replace or augment hardware redundancy for certain critical engine sensors, is described.
NASA Astrophysics Data System (ADS)
Sheikholeslami, Mohsen; Azimi, Mohammadreza; Domiri Ganji, Davood
2015-07-01
In this study, we propose a reliable algorithm to develop an analytical solution for the problem of laminar steady magnetohydrodynamic (MHD) nanofluid flow in a semi-permeable channel using the differential transformation method (DTM). The working fluid is water with copper nanoparticles. The effects of the Hartmann number and Reynolds number on velocity profiles have also been considered for various numerical cases. The effective thermal conductivity and viscosity of the nanofluid are calculated by the Maxwell and Brinkman models, respectively. A close agreement between the obtained solution and some well-known results has been established.
Chericoni, Silvio; Stefanelli, Fabio; Da Valle, Ylenia; Giusiani, Mario
2015-09-01
A sensitive and reliable method for the extraction and quantification of benzoylecgonine (BZE) and cocaine (COC) in urine is presented. Propyl chloroformate was used as the derivatizing agent and was added directly to the urine sample; the propyl derivative and COC were then recovered by a liquid-liquid extraction procedure. Gas chromatography-mass spectrometry was used to detect the analytes in selected ion monitoring mode. The method proved to be precise for BZE and COC in terms of both intraday and interday analysis, with a coefficient of variation (CV)<6%. Limits of detection (LOD) were 2.7 ng/mL for BZE and 1.4 ng/mL for COC. The calibration curve showed a linear relationship for BZE and COC (r2>0.999 and >0.997, respectively) within the range investigated. The method, applied to thirty authentic samples, proved to be very simple, fast, and reliable, so it can easily be applied in routine analysis for the quantification of BZE and COC in urine samples. © 2015 American Academy of Forensic Sciences.
Wu, Yun; Wang, Fenrong; Ai, Yu; Ma, Wen; Bian, Qiaoxia; Lee, David Y-W; Dai, Ronghua
2015-06-01
A simple, sensitive and reliable ultra performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method has been developed and validated for simultaneous quantitation of seven coumarins, the bio-active ingredients of Huo Luo Xiao Ling Dan (HLXLD), in rat plasma. A liquid-liquid extraction method with ether-dichloromethane (2:1, v/v) was used to prepare the plasma samples. Analytes and the internal standard (IS) bifendate were separated on a Shim-pack XR-ODS column (75mm×3.0mm, 2.2μm particles) using gradient elution with a mobile phase consisting of methanol and 0.05% formic acid in water at a flow rate of 0.4mL/min. Detection was performed on a triple quadrupole (TQ) tandem mass spectrometer equipped with an electrospray ionization source in positive ionization and multiple reaction monitoring (MRM) mode. The lower limits of quantitation (LLOQ) were 0.03-0.25ng/mL for all the analytes. Intra- and inter-day precision and accuracy of the seven analytes were well within acceptance criteria (15%). The matrix effect and the mean extraction recoveries of the analytes and IS from rat plasma were all satisfactory. The validated method has been successfully applied to compare pharmacokinetic profiles of the seven active ingredients in rat plasma between normal and arthritic rats after oral administration of HLXLD, Angelica pubescens extract and Notopterygium incisum extract, respectively. Results showed that there were remarkable differences in pharmacokinetic properties of the analytes among the different groups. Copyright © 2015. Published by Elsevier B.V.
NASA Technical Reports Server (NTRS)
Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola
2005-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.
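As a toy illustration of the run-time assurance idea (not any specific tool from the survey), a monitor might declare learning converged once the tracked error stops changing; the window length and tolerance below are arbitrary assumptions:

```python
def learning_converged(errors, window=5, tol=1e-3):
    """Crude run-time monitor: convergence is declared when successive error
    values over the last `window` steps all change by less than `tol`."""
    if len(errors) < window + 1:
        return False
    recent = errors[-(window + 1):]
    return all(abs(a - b) < tol for a, b in zip(recent, recent[1:]))

# Hypothetical tracking-error history of a learning controller
history = [1.0, 0.5, 0.2, 0.1, 0.1, 0.1001, 0.1002, 0.1001, 0.1, 0.1]
print(learning_converged(history))  # True
```

A real verification tool would of course need a stability argument (e.g. a Lyapunov analysis), not just an empirical plateau check.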
Azari, Nadia; Soleimani, Farin; Vameghi, Roshanak; Sajedi, Firoozeh; Shahshahani, Soheila; Karimi, Hossein; Kraskian, Adis; Shahrokhi, Amin; Teymouri, Robab; Gharib, Masoud
2017-01-01
The Bayley Scales of Infant and Toddler Development is a well-known diagnostic developmental assessment tool for children aged 1-42 months. Our aim was to investigate the validity and reliability of this scale in Persian-speaking children. The method was descriptive-analytic. Translation, back-translation and cultural adaptation were carried out. Content and face validity of the translated scale were determined by experts' opinions. Overall, 403 children aged 1 to 42 months were recruited from health centers of Tehran during 2013-2014 for developmental assessment in the cognitive, communicative (receptive and expressive) and motor (fine and gross) domains. Reliability of the scale was calculated through three methods: internal consistency using Cronbach's alpha coefficient, test-retest and inter-rater methods. Construct validity was assessed using factor analysis and comparison of mean scores. Cultural and linguistic changes were made in items of all domains, especially on the communication subscale. Content and face validity of the test were approved by experts' opinions. Cronbach's alpha coefficient was above 0.74 in all domains. Pearson correlation coefficients in the various domains were ≥0.982 for the test-retest method and ≥0.993 for the inter-rater method. Construct validity of the test was supported by factor analysis. Moreover, the mean scores for the different age groups were compared, and statistically significant differences were observed between the mean scores of different age groups, which confirms the validity of the test. The Bayley Scales of Infant and Toddler Development is a valid and reliable tool for child developmental assessment in Persian-language children.
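Cronbach's alpha, used above for internal consistency, can be sketched directly from its definition; the item scores below are hypothetical, not the study's data:

```python
import statistics

def cronbach_alpha(items):
    """items: one list of scores per item, with respondents in the same order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(statistics.variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# Hypothetical 3-item scale answered by 5 respondents
items = [[2, 4, 3, 5, 4], [3, 4, 3, 5, 5], [2, 5, 4, 4, 4]]
print(cronbach_alpha(items) > 0.7)  # True
```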
NASA Astrophysics Data System (ADS)
Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.
2017-06-01
Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and they are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats such as substituted phenethylamines.
Post-standardization of routine creatinine assays: are they suitable for clinical applications.
Jassam, Nuthar; Weykamp, Cas; Thomas, Annette; Secchiero, Sandra; Sciacovelli, Laura; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Perich, Carmen; Ricós, Carmen; Paula, Faria A; Barth, Julian H
2017-05-01
Introduction: Reliable serum creatinine measurements are of vital importance for the correct classification of chronic kidney disease and early identification of kidney injury. The National Kidney Disease Education Programme working group and other groups have defined clinically acceptable analytical limits for creatinine methods. The aim of this study was to re-evaluate the performance of routine creatinine methods in the light of these defined limits so as to assess their suitability for clinical practice. Method: In collaboration with the Dutch External Quality Assurance scheme, six frozen commutable samples, with creatinine concentrations ranging from 80 to 239 μmol/L and traceable to isotope dilution mass spectrometry, were circulated to 91 laboratories in four European countries for creatinine measurement and estimated glomerular filtration rate calculation. Two of the six samples were spiked with glucose to give high and low final glucose concentrations. Results: Results from 89 laboratories were analysed for bias and imprecision (%CV) for each creatinine assay and for total error of the estimated glomerular filtration rate. The participating laboratories used analytical instruments from four manufacturers: Abbott, Beckman, Roche and Siemens. All enzymatic methods in this study complied with the National Kidney Disease Education Programme working group's recommended bias limit of 5% above a creatinine concentration of 100 μmol/L, and they showed no evidence of interference from glucose. They also complied with the clinically recommended %CV of ≤4% across the analytical range. In contrast, the Jaffe methods showed variable performance with regard to glucose interference, and unsatisfactory bias and precision. Conclusion: Jaffe-based creatinine methods still exhibit considerable analytical variability in terms of bias, imprecision and lack of specificity, and this variability brings into question their clinical utility. We believe that clinical laboratories and manufacturers should work together to phase out the use of relatively non-specific Jaffe methods and replace them with more specific, enzyme-based methods.
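The bias and imprecision checks described above amount to comparing replicate results against the IDMS-traceable target; a minimal sketch with hypothetical replicate values (acceptance per the text: bias ≤ 5% above 100 μmol/L, CV ≤ 4%):

```python
def bias_percent(measured, target):
    """Mean percentage deviation from the IDMS-traceable target value."""
    mean = sum(measured) / len(measured)
    return 100.0 * (mean - target) / target

def cv_percent(measured):
    """Coefficient of variation (%) of replicate results."""
    mean = sum(measured) / len(measured)
    sd = (sum((x - mean) ** 2 for x in measured) / (len(measured) - 1)) ** 0.5
    return 100.0 * sd / mean

# Hypothetical replicate results (μmol/L) for a sample with an IDMS target of 100 μmol/L
results = [103.0, 104.5, 102.2, 103.8]
print(abs(bias_percent(results, 100.0)) <= 5.0, cv_percent(results) <= 4.0)  # True True
```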
Sorio, Daniela; De Palo, Elio Franco; Bertaso, Anna; Bortolotti, Federica; Tagliaro, Franco
2017-02-01
This paper puts forward a new method for transferrin (Tf) glycoform analysis in body fluids based on the formation of a transferrin-terbium fluorescent adduct (TfFluo). The key aim is to validate the analytical procedure for carbohydrate-deficient transferrin (CDT), a traditional biochemical serum marker of chronic alcohol abuse. Terbium added to a human body-fluid sample produces TfFluo. An anion-exchange HPLC technique with fluorescence detection (λexc 298 nm, λem 550 nm) permitted clear separation and identification of Tf glycoform peaks without interfering signals, allowing selective analysis of Tf sialoforms in human serum and in body fluids (cadaveric blood, cerebrospinal fluid, and dried blood spots) that are unsuitable for routine testing. Serum samples (n = 78) were analyzed by both the traditional absorbance (Abs) and the fluorescence (Fl) HPLC methods, and CDT% levels demonstrated a significant correlation (p < 0.001, Pearson). Intra- and inter-run CVs were 3.1% and 4.6%, respectively. The 1.9 CDT% cut-off of the HPLC Abs reference method corresponded, by interpolation on the correlation curve, to a 1.3 CDT% cut-off for the present method. Method comparison by Passing-Bablok and Bland-Altman tests demonstrated agreement between Fl and Abs. In conclusion, the novel method is a reliable test for CDT% analysis and provides a substantial analytical improvement, offering important advantages in the types of body fluid that can be analyzed. Its sensitivity and freedom from interferences extend clinical applications, making the CDT assay reliable for body fluids usually not suitable for routine testing. Graphical Abstract: The formation of a transferrin-terbium fluorescent adduct can be used to analyze transferrin glycoforms. The HPLC method for carbohydrate-deficient transferrin (CDT%) measurement was validated and employed to determine levels in different body fluids.
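The Bland-Altman comparison used above reduces to the mean difference between paired results and its 95% limits of agreement; a minimal sketch with hypothetical paired CDT% values, not the study's data:

```python
def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between two methods."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd = (sum((d - mean_d) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean_d, mean_d - 1.96 * sd, mean_d + 1.96 * sd

# Hypothetical paired CDT% results from the Abs and Fl HPLC methods
abs_cdt = [1.2, 1.8, 2.5, 3.1, 1.5]
fl_cdt = [0.8, 1.3, 1.7, 2.2, 1.0]
bias, low, high = bland_altman(abs_cdt, fl_cdt)
print(low < bias < high)  # True
```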
Barbosa, Marta O; Ribeiro, Ana R; Pereira, Manuel F R; Silva, Adrián M T
2016-11-01
Organic micropollutants present in drinking water (DW) may cause adverse effects for public health, so reliable analytical methods are required to detect these pollutants at trace levels in DW. This work describes the first green analytical methodology for multi-class determination of 21 pollutants in DW: seven pesticides, an industrial compound, 12 pharmaceuticals, and a metabolite (some included in Directive 2013/39/EU or Decision 2015/495/EU). A solid-phase extraction procedure followed by ultra-high-performance liquid chromatography coupled to tandem mass spectrometry (offline SPE-UHPLC-MS/MS) was optimized using eco-friendly solvents, achieving detection limits below 0.20 ng L⁻¹. The validated analytical method was successfully applied to DW samples from different sources (tap, fountain, and well waters) at different locations in the north of Portugal, as well as before and after bench-scale UV and ozonation experiments on spiked tap water samples. Thirteen compounds were detected, many of them not yet regulated, in the following order of frequency: diclofenac > norfluoxetine > atrazine > simazine > warfarin > metoprolol > alachlor > chlorfenvinphos > trimethoprim > clarithromycin ≈ carbamazepine ≈ PFOS > citalopram. Hazard quotients were also estimated for the quantified substances and suggested no adverse effects to humans. Graphical Abstract: Occurrence and removal of multi-class micropollutants in drinking water, analyzed by an eco-friendly LC-MS/MS method.
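Hazard quotients of the kind estimated above are conventionally computed as the ratio of the measured concentration to a health-based guideline value; the concentrations below are assumptions for illustration, not values from the study:

```python
def hazard_quotient(measured_ng_per_l, guideline_ng_per_l):
    """HQ = measured concentration / health-based guideline value;
    HQ < 1 is conventionally read as no expected adverse effect."""
    return measured_ng_per_l / guideline_ng_per_l

# Hypothetical: 5 ng/L of a pharmaceutical vs. a 10,000 ng/L guideline value
hq = hazard_quotient(5.0, 10_000.0)
print(hq < 1)  # True
```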
NASA Astrophysics Data System (ADS)
Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei
2003-09-01
As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.
UV Spectrophotometric Method for Estimation of Polypeptide-K in Bulk and Tablet Dosage Forms
NASA Astrophysics Data System (ADS)
Kaur, P.; Singh, S. Kumar; Gulati, M.; Vaidya, Y.
2016-01-01
An analytical method for estimation of polypeptide-k using UV spectrophotometry has been developed and validated for bulk as well as tablet dosage form. The developed method was validated for linearity, precision, accuracy, specificity, robustness, detection, and quantitation limits. The method has shown good linearity over the range from 100.0 to 300.0 μg/ml with a correlation coefficient of 0.9943. The percentage recovery of 99.88% showed that the method was highly accurate. The precision demonstrated relative standard deviation of less than 2.0%. The LOD and LOQ of the method were found to be 4.4 and 13.33, respectively. The study established that the proposed method is reliable, specific, reproducible, and cost-effective for the determination of polypeptide-k.
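Detection and quantitation limits in such validations are commonly estimated from the residual standard deviation (σ) of the calibration and its slope (S) via the ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, which is consistent with the 4.4 : 13.33 ratio reported above; the σ and S values below are hypothetical:

```python
def lod_loq(sigma, slope):
    """ICH Q2-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical residual standard deviation and calibration slope
lod, loq = lod_loq(sigma=0.4, slope=0.3)
print(round(lod, 2), round(loq, 2))  # 4.4 13.33
```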
A Note on the Score Reliability for the Satisfaction with Life Scale: An RG Study
ERIC Educational Resources Information Center
Vassar, Matt
2008-01-01
The purpose of the present study was to meta-analytically investigate the score reliability for the Satisfaction With Life Scale. Four-hundred and sixteen articles using the measure were located through electronic database searches and then separated to identify studies which had calculated reliability estimates from their own data. Sixty-two…
Reliability Generalization: An Examination of the Positive Affect and Negative Affect Schedule
ERIC Educational Resources Information Center
Leue, Anja; Lange, Sebastian
2011-01-01
The assessment of positive affect (PA) and negative affect (NA) by means of the Positive Affect and Negative Affect Schedule has received a remarkable popularity in the social sciences. Using a meta-analytic tool--namely, reliability generalization (RG)--population reliability scores of both scales have been investigated on the basis of a random…
González, Oskar; van Vliet, Michael; Damen, Carola W N; van der Kloet, Frans M; Vreeken, Rob J; Hankemeier, Thomas
2015-06-16
The possible presence of matrix effects is one of the main concerns in liquid chromatography-mass spectrometry (LC-MS)-driven bioanalysis due to their impact on the reliability of the obtained quantitative results. Here we propose an approach to correct for the matrix effect in LC-MS with electrospray ionization using postcolumn infusion of eight internal standards (PCI-IS). We applied this approach to a generic ultra-high-performance liquid chromatography-time-of-flight (UHPLC-TOF) platform developed for small-molecule profiling with a main focus on drugs. Different urine samples were spiked with 19 drugs with different physicochemical properties and analyzed in order to study the matrix effect (in absolute and relative terms). Furthermore, calibration curves for each analyte were constructed and quality control samples at different concentration levels were analyzed to check the applicability of this approach in quantitative analysis. The matrix effect profiles of the PCI-ISs differed: this confirms that the matrix effect is compound-dependent, and therefore the most suitable PCI-IS has to be chosen for each analyte. Chromatograms were reconstructed using analyte and PCI-IS responses, which were used to develop an optimized method that compensates for variation in ionization efficiency. The approach presented here dramatically improved the results in terms of matrix effect. Furthermore, calibration curves of higher quality are obtained, the dynamic range is enhanced, and the accuracy and precision of QC samples are increased. The use of PCI-ISs is a very promising step toward an analytical platform free of matrix effects, which can make LC-MS analysis even more successful, adding higher reliability in quantification to its intrinsic high sensitivity and selectivity.
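The core of the PCI-IS correction is scaling each analyte's response by the suppression or enhancement the infused standard experiences at that retention time; a deliberately simplified sketch with made-up numbers, not the authors' exact algorithm:

```python
def pci_is_correct(analyte_area, is_response_at_rt, is_reference_response):
    """Scale the analyte response by the suppression/enhancement factor the
    postcolumn-infused internal standard shows at the analyte's retention time."""
    factor = is_response_at_rt / is_reference_response
    return analyte_area / factor

# Hypothetical: the PCI-IS signal drops to 60% of its reference level at the
# analyte's retention time (ion suppression), so the analyte area is scaled up.
corrected = pci_is_correct(analyte_area=1200.0, is_response_at_rt=600.0,
                           is_reference_response=1000.0)
print(round(corrected))  # 2000
```

In practice the most suitable of the eight PCI-ISs would be chosen per analyte, as the abstract notes.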
Reliability of IGBT in a STATCOM for Harmonic Compensation and Power Factor Correction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopi Reddy, Lakshmi Reddy; Tolbert, Leon M; Ozpineci, Burak
With smart grid integration, there is a need to characterize the reliability of a power system by including the reliability of power semiconductors in grid-related applications. In this paper, the reliability of IGBTs in a STATCOM is presented for two different applications: power factor correction and harmonic elimination. The STATCOM model is developed in EMTP, and analytical equations for the average conduction losses in an IGBT and a diode are derived and compared with experimental data. A commonly used reliability model is then applied to predict the reliability of the IGBTs.
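Average IGBT conduction losses of the kind derived in the paper are commonly approximated with the textbook on-state model P = V_CE0·I_avg + R_CE·I_rms²; the device parameters below are hypothetical, and this is a generic approximation rather than the paper's exact equations:

```python
def igbt_conduction_loss(v_ce0, r_ce, i_avg, i_rms):
    """Average conduction loss (W): P = Vce0*Iavg + Rce*Irms**2,
    i.e. threshold-voltage term plus on-resistance term."""
    return v_ce0 * i_avg + r_ce * i_rms ** 2

# Hypothetical on-state parameters and load currents
p = igbt_conduction_loss(v_ce0=1.0, r_ce=0.01, i_avg=50.0, i_rms=80.0)
print(round(p, 1))  # 114.0
```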
NASA Astrophysics Data System (ADS)
Stefanović, S.; Đorđevic, V.; Jelušić, V.
2017-09-01
The aim of this paper is to verify the performance characteristics and fitness for purpose of a rapid and simple QuEChERS-based LC-MS/MS method for the determination of acrylamide in potato chips and coffee. LC-MS/MS is by far the most suitable analytical technique for acrylamide measurements given its inherent sensitivity and selectivity, as well as its capability of analyzing the underivatized molecule. Acrylamide in roasted coffee and potato chips was extracted with a water:acetonitrile mixture using NaCl and MgSO4. Cleanup was carried out with MgSO4 and PSA. The obtained results were satisfactory: recoveries were in the range of 85-112%, interlaboratory reproducibility (CV) was 5.8-7.6%, and linearity (R2) was in the range of 0.995-0.999. The LOQ was 35 μg kg⁻¹ for coffee and 20 μg kg⁻¹ for potato chips. The performance characteristics of the method comply with the criteria for validation of analytical methods. The presented method for quantitative determination of acrylamide in roasted coffee and potato chips is fit for the purposes of self-control in the food industry as well as regulatory controls carried out by governmental agencies.
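Recovery figures like the 85-112% range above come from comparing measured against spiked (nominal) concentrations; a one-line sketch with hypothetical values:

```python
def recovery_percent(measured, spiked):
    """Recovery (%) = 100 * measured / spiked (nominal) concentration."""
    return 100.0 * measured / spiked

# Hypothetical acrylamide spike of 100 μg/kg recovered at 92 μg/kg
r = recovery_percent(92.0, 100.0)
print(85.0 <= r <= 112.0)  # True: within the reported acceptance range
```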
Quantitation of permethylated N-glycans through multiple-reaction monitoring (MRM) LC-MS/MS.
Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L; Mechref, Yehia
2015-04-01
The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple-reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for N-glycans that are both sialylated and fucosylated was determined to be 30%, whereas it was found to be 35% for N-glycans that are either fucosylated or sialylated. The optimum CE for high-mannose and complex-type N-glycans was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan compositions in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these glycans was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans down to the hundredth-of-a-microliter level can be reliably quantified in pooled human blood serum, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycans enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples.
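Quantitation from several MRM transitions can be sketched under the simple assumption of averaging the transition areas and applying a calibration response factor (the numbers and the averaging scheme are illustrative, not the paper's exact procedure):

```python
def quantify_mrm(transition_areas, response_factor):
    """Average the areas of the monitored transitions and convert to
    concentration via a calibration response factor (area per unit conc.)."""
    mean_area = sum(transition_areas) / len(transition_areas)
    return mean_area / response_factor

# Hypothetical areas for three MRM transitions of one permethylated glycan
conc = quantify_mrm([9800.0, 10100.0, 10100.0], response_factor=1000.0)
print(conc)  # 10.0
```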
ERIC Educational Resources Information Center
Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert
This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…
Sensor failure detection for jet engines
NASA Technical Reports Server (NTRS)
Merrill, Walter C.
1988-01-01
The use of analytical redundancy to improve gas turbine engine control system reliability through sensor failure detection, isolation, and accommodation is surveyed. Both the theoretical and application papers that form the technology base of turbine engine analytical redundancy research are discussed. Also, several important application efforts are reviewed. An assessment of the state-of-the-art in analytical redundancy technology is given.
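The simplest form of analytical redundancy compares each sensor reading with a model-based estimate and flags the sensor when the residual exceeds a threshold; the readings and threshold below are hypothetical:

```python
def sensor_failed(measured, estimated, threshold):
    """Flag a sensor when the residual |measured - model estimate|
    exceeds a detection threshold."""
    return abs(measured - estimated) > threshold

# Hypothetical turbine-speed sensor readings (rpm) vs. an analytical estimate
print(sensor_failed(10250.0, estimated=10100.0, threshold=300.0))  # False
print(sensor_failed(9100.0, estimated=10100.0, threshold=300.0))   # True
```

Real schemes of the kind surveyed add filtering and isolation logic so that a single noisy sample does not trigger accommodation.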
Field reliability of competency and sanity opinions: A systematic review and meta-analysis.
Guarnera, Lucy A; Murrie, Daniel C
2017-06-01
We know surprisingly little about the interrater reliability of forensic psychological opinions, even though courts and other authorities have long called for known error rates for scientific procedures admitted as courtroom testimony. This is particularly true for opinions produced during routine practice in the field, even for some of the most common types of forensic evaluations: evaluations of adjudicative competency and legal sanity. To address this gap, we used meta-analytic procedures and study space methodology to systematically review studies that examined the interrater reliability (particularly the field reliability) of competency and sanity opinions. Of 59 identified studies, 9 addressed the field reliability of competency opinions and 8 addressed the field reliability of sanity opinions. These studies presented a wide range of reliability estimates; pairwise percentage agreements ranged from 57% to 100% and kappas ranged from .28 to 1.0. Meta-analytic combinations of reliability estimates obtained by independent evaluators returned estimates of κ = .49 (95% CI: .40-.58) for competency opinions and κ = .41 (95% CI: .29-.53) for sanity opinions. This wide range of reliability estimates underscores the extent to which different evaluation contexts tend to produce different reliability rates. Unfortunately, our study space analysis illustrates that available field reliability studies typically provide little information about contextual variables crucial to understanding their findings. Given these concerns, we offer suggestions for improving research on the field reliability of competency and sanity opinions, as well as suggestions for improving reliability rates themselves. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
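The meta-analytic combination of kappas reported above can be sketched as fixed-effect inverse-variance pooling (the study may have used a different weighting scheme; the per-study values below are made up):

```python
def pool_fixed_effect(estimates, standard_errors):
    """Fixed-effect inverse-variance pooling with a 95% confidence interval."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical per-study kappas and their standard errors
kappa, low, high = pool_fixed_effect([0.35, 0.50, 0.45], [0.10, 0.08, 0.12])
print(low < kappa < high)  # True
```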
Dervisevic, Muamer; Senel, Mehmet; Sagir, Tugba; Isik, Sevim
2017-04-15
The detection of cancer cells through important molecular recognition targets such as sialic acid is significant for clinical diagnosis and treatment. Many electrochemical cytosensors have been developed for cancer cell detection, but most of them have complicated fabrication processes, which results in poor reproducibility and reliability. In this study, a simple, low-cost, and highly sensitive electrochemical cytosensor was designed based on boronic acid-functionalized polythiophene. Fabrication used a simple single-step procedure: coating a pencil graphite electrode (PGE) by electro-polymerization of 3-thienylboronic acid and thiophene. Electrochemical impedance spectroscopy and cyclic voltammetry were used as analytical methods to optimize and measure the analytical performance of the PGE/P(TBA0.5Th0.5)-based electrode. The cytosensor showed extremely good analytical performance in the detection of cancer cells, with a linear range of 1×10¹ to 1×10⁶ cells mL⁻¹, a low detection limit of 10 cells mL⁻¹, and an incubation time of 10 min. Besides its excellent analytical performance, it showed high selectivity towards AGS cancer cells when compared to HEK 293 normal cells and bone marrow mesenchymal stem cells (BM-hMSCs). This method is promising for future applications in early-stage cancer diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.
The reliability of Raman micro-spectroscopy in measuring the density of CO2 mantle fluids
NASA Astrophysics Data System (ADS)
Remigi, S.; Frezzotti, M. L.; Ferrando, S.; Villa, I. M.; Maffeis, A.
2017-12-01
Recent evaluations of carbon fluxes into and out of the Earth's interior recognize that a significant part of the total outgassing of deep Earth carbon occurs in tectonically active areas (Kelemen and Manning, 2015). Potential tracers of carbon fluxes at mantle depths include CO2 fluid inclusions in peridotites. Raman micro-spectroscopy allows calculating the density of CO2 fluids based on the distance of the CO2 Fermi doublet, Δ, in cm⁻¹ (Rosso and Bodnar, 1995). The aim of this work is to check the reliability of Raman densimeter equations (cf. Lamadrid et al., 2016) for high-density CO2 fluids originating at mantle depths. Forty pure CO2 inclusions in peridotites (El Hierro, Canary Islands) of known density (microthermometry) have been analyzed by Raman micro-spectroscopy. In order to evaluate the influence of contaminants on the reliability of the equations, 22 CO2-rich inclusions containing subordinate amounts of N2, CO and SO2 have also been studied. Raman spectrometer analytical conditions were: 532 nm laser, 80 mW emission power, T = 18 °C, 1800 and 600 gratings, 1 accumulation × 80 s. Daily calibration included diamond and atmospheric N2. Results suggest that the "Raman densimeter" represents an accurate method to calculate the density of CO2 mantle fluids. The equations, however, must be applied only to pure CO2 fluids, since contaminants, even in trace amounts (0.39 mol%), affect Δ, resulting in density overestimation. The present study further highlights how analytical conditions and data processing, such as spectral resolution (i.e., grating), calibration linearity, and statistical treatment of spectra, influence the accuracy and precision of Δ measurements. As a consequence, specific analytical protocols for individual Raman spectrometers should be set up in order to obtain reliable CO2 density data. Kelemen, P. B., & Manning, C. E. PNAS, 112(30) (2015): E3997-E4006. Lamadrid, H. M., Moore, L. R., Moncada, D., Rimstidt, J. D., Burruss, R. C., & Bodnar, R. J. Chem. Geol. (2016). Rosso, K. M., & Bodnar, R. J. Geochim. et Cosmochim. Acta, 59(19), 3961-3975 (1995).
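A Raman densimeter of the kind evaluated here is essentially a calibration polynomial mapping the Fermi diad split Δ to density; the coefficients below are placeholders, not a published equation, since, as the abstract stresses, calibrations are instrument-specific:

```python
def co2_density(delta_cm1, coeffs):
    """Evaluate a polynomial densimeter rho = sum(c_i * delta**i);
    coefficients must come from a calibration on the specific spectrometer."""
    return sum(c * delta_cm1 ** i for i, c in enumerate(coeffs))

# Placeholder linear calibration (NOT a published equation): rho = -36.42 + 0.36*delta
rho = co2_density(103.0, coeffs=[-36.42, 0.36])
print(round(rho, 2))  # 0.66 (g/cm3, for these made-up coefficients)
```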
El-Awady, Mohamed; Belal, Fathalla; Pyell, Ute
2013-09-27
The analysis of hydrophobic basic analytes by micellar electrokinetic chromatography (MEKC) is usually challenging because of the tendency of these analytes to adsorb onto the inner capillary wall, in addition to the difficulty of separating these compounds given their extremely high retention factors. A robust and reliable method for the simultaneous determination of loratadine (LOR) and its major metabolite desloratadine (DSL) is developed based on cyclodextrin-modified micellar electrokinetic chromatography (CD-MEKC) with an acidic sample matrix and a basic background electrolyte (BGE). The influence of the sample matrix on the reachable focusing efficiency is studied. It is shown that the application of a low-pH sample solution mitigates problems associated with the low solubility of the hydrophobic basic analytes in aqueous solution while having advantages with regard to on-line focusing. Moreover, the use of a basic BGE reduces the adsorption of these analytes in the separation compartment. The separation of the studied analytes is achieved in less than 7 min using a BGE consisting of 10 mmol L⁻¹ disodium tetraborate buffer, pH 9.30, containing 40 mmol L⁻¹ SDS and 20 mmol L⁻¹ hydroxypropyl-β-CD, while the sample solution is composed of 10 mmol L⁻¹ phosphoric acid, pH 2.15. A full validation study of the developed method based on the pharmacopeial guidelines is performed. The method is successfully applied to the analysis of the studied drugs in tablets without interference from tablet additives, as well as to the analysis of spiked human urine without any sample pretreatment. Furthermore, DSL can be detected as an impurity in LOR bulk powder at the stated pharmacopeial limit (0.1%, w/w). The selectivity of the developed method allows the analysis of LOR and DSL in combination with the co-formulated drug pseudoephedrine. It is shown that in CD-MEKC with a basic BGE, solute-wall interactions are effectively suppressed, allowing the development of efficient and precise methods for the determination of hydrophobic basic analytes, whereas the use of a low-pH sample solution has a positive impact on the attainable sweeping efficiency without compromising peak shape and resolution. Copyright © 2013 Elsevier B.V. All rights reserved.
Maas, Alexandra; Maier, Christoph; Iwersen-Bergmann, Stefanie; Madea, Burkhard; Hess, Cornelius
2017-11-30
Besides its clinical application, the anaesthetic agent propofol is being increasingly misused, mostly by healthcare professionals, and its abuse potential gained worldwide attention after the tragic death of Michael Jackson in 2009. Due to the short duration of its narcotic effects, propofol abuse is especially easy to hide compared with the use of other recreational drugs. However, propofol possesses a very narrow therapeutic window between the desired effect and potentially fatal toxicity, making abuse of the drug extremely dangerous even for experienced physicians. Consequently, it is important that forensic laboratories possess a sensitive and specific method for the detection of chronic propofol abuse. We present a simple, fast and reliable method to simultaneously extract propofol and its main metabolite propofol glucuronide from hair, followed by sensitive LC-MS/MS analyses, allowing chronic propofol abuse to be determined. Difficulties regarding the detection of propofol using LC-MS/MS were solved by using a derivatization reaction with 2-fluoro-1-methylpyridinium-p-toluenesulfonate and triethylamine. The reliability of the extraction method and subsequent LC-MS/MS analyses was confirmed by evaluating the validation parameters selectivity, linearity, accuracy and precision, analytical limits, processed sample stability, matrix effects and recovery. Appropriate quantification (LLOQ = 10 pg/mg hair) and detection limits (3.6 pg/mg hair for propofol and 7.8 pg/mg hair for propofol glucuronide) could be achieved, enabling detection of even small amounts of both analytes. The applicability of the method was confirmed by analysis of three hair samples from deceased individuals suspected of chronic propofol abuse. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Hess, R. A.
1976-01-01
Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.
Template based rotation: A method for functional connectivity analysis with a priori templates
Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.
2014-01-01
Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network-level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection), are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed-based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual regression demonstrated larger effect sizes and comparable reliability relative to seed-based correlation analysis, though all three methods yielded similar patterns of network differences.
When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates than dual regression and seed-based analyses. This flexibility stems from the reduced spatial and temporal orthogonality constraints of template based rotation compared with dual regression. These results suggest that template based rotation can provide a useful alternative to existing fcMRI analytic methods, particularly in clinical trial settings where predefined outcome measures and conserved network descriptions across groups are at a premium. PMID:25150630
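The spatial regression step shared by this family of methods (it is the first step of dual regression; template based rotation modifies its orthogonality constraints) can be sketched in a few lines. All dimensions and data below are synthetic assumptions, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: V voxels, T time points, K a priori network templates.
V, T, K = 500, 120, 4

templates = rng.normal(size=(V, K))        # a priori spatial maps (assumed known)
net_ts = rng.normal(size=(K, T))           # latent network time courses
data = templates @ net_ts + 0.5 * rng.normal(size=(V, T))  # noisy voxel data

# Spatial regression of the data onto the templates: recover one time course
# per network by least squares.
beta, *_ = np.linalg.lstsq(templates, data, rcond=None)  # shape (K, T)

# Network-by-network connectivity matrix from the recovered time courses.
conn = np.corrcoef(beta)
print(conn.shape)  # (4, 4)
```

With well-separated templates and moderate noise, the recovered time courses track the latent ones closely, which is what makes downstream inter-network connectivity estimates usable.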
Stochastic modelling of the hydrologic operation of rainwater harvesting systems
NASA Astrophysics Data System (ADS)
Guo, Rui; Guo, Yiping
2018-07-01
Rainwater harvesting (RWH) systems are an effective low impact development practice that provides both water supply and runoff reduction benefits. A stochastic modelling approach is proposed in this paper to quantify the water supply reliability and stormwater capture efficiency of RWH systems. The input rainfall series is represented as a marked Poisson process and two typical water use patterns are analytically described. The stochastic mass balance equation is solved analytically, and based on this, explicit expressions relating system performance to system characteristics are derived. The performances of a wide variety of RWH systems located in five representative climatic regions of the United States are examined using the newly derived analytical equations. Close agreements between analytical and continuous simulation results are shown for all the compared cases. In addition, an analytical equation is obtained expressing the required storage size as a function of the desired water supply reliability, average water use rate, as well as rainfall and catchment characteristics. The equations developed herein constitute a convenient and effective tool for sizing RWH systems and evaluating their performances.
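The stochastic setup described above can be approximated by a simple Monte Carlo counterpart to the analytical solution: a marked Poisson rainfall process (exponential interarrival times, exponential event depths) driving a daily tank mass balance. All parameter values below are illustrative assumptions, not taken from the paper:

```python
import random

random.seed(1)

# Illustrative parameters: rainfall as a marked Poisson process.
arrival_rate = 0.25      # storms per day
mean_depth = 12.0        # mm per storm event
catchment = 100.0        # m^2 roof area
storage = 5.0            # m^3 tank capacity
demand = 0.15            # m^3 per day, constant water use pattern

days = 200_000
level, supplied, needed = 0.0, 0.0, 0.0
next_storm = random.expovariate(arrival_rate)

for day in range(days):
    while day >= next_storm:                   # all storms arriving by this day
        inflow = random.expovariate(1.0 / mean_depth) / 1000.0 * catchment
        level = min(storage, level + inflow)   # excess spills as overflow
        next_storm += random.expovariate(arrival_rate)
    draw = min(level, demand)                  # partial supply when tank is low
    level -= draw
    supplied += draw
    needed += demand

reliability = supplied / needed                # volumetric water supply reliability
print(f"water supply reliability: {reliability:.2f}")
```

The derived analytical expressions give this quantity in closed form; a simulation like this is the kind of continuous benchmark the paper compares against.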
Analytical techniques and instrumentation, a compilation
NASA Technical Reports Server (NTRS)
1974-01-01
Procedures for conducting materials tests and structural analyses of aerospace components are presented as a part of the NASA technology utilization program. Some of the subjects discussed are as follows: (1) failures in cryogenic tank insulation, (2) friction characteristics of graphite and graphite-metal combinations, (3) evaluation of polymeric products in thermal-vacuum environment, (4) erosion of metals by multiple impacts with water, (5) mass loading effects on vibrated ring and shell structures, (6) nonlinear damping in structures, and (7) method for estimating reliability of randomly excited structures.
Prediction of true test scores from observed item scores and ancillary data.
Haberman, Shelby J; Yao, Lili; Sinharay, Sandip
2015-05-01
In many educational tests that involve constructed responses, a traditional test score is obtained by adding together item scores obtained through holistic scoring by trained human raters. For example, this practice was used until 2008 in the case of GRE® General Analytical Writing and until 2009 in the case of TOEFL® iBT Writing. With the use of natural language processing, it is possible to obtain additional information concerning item responses from computer programs such as e-rater®. In addition, available information relevant to examinee performance may include scores on related tests. We suggest application of standard results from classical test theory to the available data to obtain best linear predictors of true traditional test scores. In performing such analysis, we require estimation of variances and covariances of measurement errors, a task that can be quite difficult in the case of tests with limited numbers of items and with multiple measurements per item. As a consequence, a new estimation method is suggested based on samples of examinees who have taken an assessment more than once. Such samples are typically not random samples of the general population of examinees, so we apply statistical adjustment methods to obtain the needed estimated variances and covariances of measurement errors. To examine practical implications of the suggested methods of analysis, applications are made to GRE General Analytical Writing and TOEFL iBT Writing. The results obtained indicate that substantial improvements are possible both in terms of reliability of scoring and in terms of assessment reliability. © 2015 The British Psychological Society.
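The best-linear-predictor idea can be made concrete with a toy numeric example under classical test theory (observed score X = true score T + error E, with E independent of T and of an ancillary score Z). The variance components below are invented for illustration, not the operational GRE or TOEFL estimates:

```python
import numpy as np

# Illustrative variance components (assumptions, not estimates from the paper):
var_T = 9.0      # true-score variance of the writing test
var_E = 4.0      # measurement-error variance, so reliability = 9/13
cov_TZ = 5.0     # covariance of the true score with an ancillary score Z
var_Z = 16.0
mu_T, mu_X, mu_Z = 50.0, 50.0, 40.0

# Since X = T + E with E independent of T and Z:
cov_t = np.array([var_T, cov_TZ])                  # Cov(T, [X, Z])
Sigma = np.array([[var_T + var_E, cov_TZ],
                  [cov_TZ,        var_Z]])         # Var([X, Z])
w = np.linalg.solve(Sigma, cov_t)                  # best linear predictor weights

def predict_true(x, z):
    """Best linear predictor of the true score given X = x, Z = z."""
    return mu_T + w @ np.array([x - mu_X, z - mu_Z])

print(predict_true(55.0, 44.0))
```

The practical difficulty the paper addresses is precisely the estimation of the entries of `Sigma` and `cov_t` when measurement-error variances are hard to pin down, hence the repeat-taker samples with statistical adjustment.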
Becerra-Herrera, Mercedes; Honda, Luis; Richter, Pablo
2015-12-04
A novel analytical approach involving an improved rotating-disk sorptive extraction (RDSE) procedure and ultra-high-performance liquid chromatography (UHPLC) coupled to an ultraspray electrospray ionization source (UESI) and time-of-flight mass spectrometry (TOF/MS), in trap mode, was developed to identify and quantify four non-steroidal anti-inflammatory drugs (NSAIDs) (naproxen, ibuprofen, ketoprofen and diclofenac) and two anti-cholesterol drugs (ACDs) (clofibric acid and gemfibrozil) that are widely used and typically found in water samples. The method reduced the amount of both sample and reagents used and also the time required for the whole analysis, resulting in a reliable and green analytical strategy. The analytical eco-scale was calculated, showing that this methodology is an excellent green analysis, increasing its ecological worth. The detection limits (LOD) and precision (%RSD) were lower than 90 ng/L and 10%, respectively. Matrix effects and recoveries were studied using samples from the influent of a wastewater treatment plant (WWTP). All the compounds exhibited suppression of their signals due to matrix effects, and the recoveries were approximately 100%. The applicability and reliability of this methodology were confirmed through the analysis of influent and effluent samples from a WWTP in Santiago, Chile, obtaining concentrations ranging from 1.1 to 20.5 μg/L and from 0.5 to 8.6 μg/L, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
Historical milestones in measurement of HDL-cholesterol: impact on clinical and laboratory practice.
Langlois, Michel R; Blaton, Victor H
2006-07-23
High-density lipoprotein cholesterol (HDL-C) comprises a family of particles with differing physicochemical characteristics. Continuing progress in improving HDL-C analysis has originated from two separate fields-one clinical, reflecting increased attention to HDL-C in estimating risk for coronary heart disease (CHD), and the other analytical, reflecting increased emphasis on finding more reliable and cost-effective HDL-C assays. Epidemiologic and prospective studies established the inverse association of HDL-C with CHD risk, a relationship that is consistent with protective mechanisms demonstrated in basic research and animal studies. Atheroprotective and less atheroprotective HDL subpopulations have been described. Guidelines on primary and secondary CHD prevention, which increased the workload in clinical laboratories, have led to a revolution in HDL-C assay technology. Many analytical techniques including ultracentrifugation, electrophoresis, chromatography, and polyanion precipitation methods have been developed to separate and quantify HDL-C and HDL subclasses. More recently developed homogeneous assays enable direct measurement of HDL-C on an automated analyzer, without the need for manual pretreatment to separate non-HDL. Although homogeneous assays show improved accuracy and precision in normal serum, discrepant results exist in samples with atypical lipoprotein characteristics. Hypertriglyceridemia and monoclonal paraproteins are important interfering factors. A novel approach is nuclear magnetic resonance spectroscopy that allows rapid and reliable analysis of lipoprotein subclasses, which may improve the identification of individuals at increased CHD risk. Apolipoprotein A-I, the major protein of HDL, has been proposed as an alternative cardioprotective marker avoiding the analytical limitations of HDL-C.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroniger, K; Herzog, M; Landry, G
2015-06-15
Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms along with patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used, and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
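The core operation of the described tool is a convolution of the depth dose profile with element-specific filter functions. A sketch with a synthetic dose curve and an arbitrary Gaussian placeholder filter (in the actual tool, one filter per chemical element is calibrated against Monte Carlo data):

```python
import numpy as np

# Synthetic depth-dose profile with a Bragg-peak-like shape (illustrative only).
z = np.linspace(0.0, 200.0, 401)                  # depth in mm
dose = np.exp(-((z - 150.0) ** 2) / (2 * 15.0 ** 2)) + 0.3 * (z < 150.0)

# Placeholder filter function standing in for one element's calibrated filter.
kernel_z = np.linspace(-30.0, 30.0, 121)
kernel = np.exp(-kernel_z ** 2 / (2 * 8.0 ** 2))
kernel /= kernel.sum()                            # normalize the filter

# Predicted emission profile: depth dose convolved with the filter function.
prompt_gamma = np.convolve(dose, kernel, mode="same")
print(prompt_gamma.shape)  # (401,)
```

Because the prediction is a single 1D convolution per element rather than a particle transport simulation, it runs in negligible time compared with a full MC, which is the point of the approach.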
Ai, Yu; Wu, Yun; Wang, Fenrong; Ma, Wen; Bian, Qiaoxia; Lee, David Y-W; Dai, Ronghua
2015-03-01
The objective of this study was to develop a sensitive and reliable ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method for the simultaneous quantitation of three monoterpene glycosides (paeoniflorin, albiflorin and oxypaeoniflorin) and four alkaloids (tetrahydropalmatine, corydaline, dehydrocorydaline and berberine), the main active ingredients of Radix Paeoniae Rubra extract (RPE) and Corydalis yanhusuo extract (CYE) in Huo Luo Xiao Ling Dan (HLXLD), and to compare the pharmacokinetics of these active ingredients in normal and arthritic rats orally administered HLXLD or RPE/CYE alone. The analytes and the internal standard (IS, geniposide) were separated on an XBridge C18 column (150 × 4.6 mm, 3.5 µm) using gradient elution with a mobile phase consisting of methanol and 0.01% formic acid in water at a flow rate of 0.6 ml/min. Detection of the analytes was performed on an Acquity UPLC-MS/MS system with electrospray ionization in multiple reaction monitoring mode, with polarity switching between negative (for monoterpene glycosides) and positive (for alkaloids) ionization. The lower limits of quantification were 2.5, 1, 0.5, 0.2, 0.2, 0.02 and 0.01 ng/ml for paeoniflorin, albiflorin, oxypaeoniflorin, tetrahydropalmatine, corydaline, dehydrocorydaline and berberine, respectively. Intra-day and inter-day precision and accuracy of the analytes were well within the acceptance criteria (15%). The mean extraction recoveries of the analytes and IS from rat plasma were all greater than 83.1%. The validated method was successfully applied to the determination of the analytes. The results showed remarkable differences in the pharmacokinetic properties of the analytes between the herbal formula and single-herb groups, and between the normal and arthritic groups. Copyright © 2015 John Wiley & Sons, Ltd.
Microemulsification: an approach for analytical determinations.
Lima, Renato S; Shiroma, Leandro Y; Teixeira, Alvaro V N C; de Toledo, José R; do Couto, Bruno C; de Carvalho, Rogério M; Carrilho, Emanuel; Kubota, Lauro T; Gobbi, Angelo L
2014-09-16
We address a novel method for analytical determinations that combines simplicity, rapidity, low consumption of chemicals, and portability with high analytical performance, taking into account parameters such as precision, linearity, robustness, and accuracy. This approach relies on the effect of the analyte content on the Gibbs free energy of dispersions, affecting the thermodynamic stabilization of emulsions or Winsor systems to form microemulsions (MEs). This phenomenon was expressed by the minimum volume fraction of amphiphile required to form a microemulsion (Φ(ME)), which was the analytical signal of the method. Thus, the measurements can be taken by visually monitoring the transition of the dispersions from cloudy to transparent during microemulsification, like a titration, bypassing the use of electric energy. The studies performed were: phase behavior, droplet dimension by dynamic light scattering, analytical curve, and robustness tests. The reliability of the method was evaluated by determining water in ethanol fuels and monoethylene glycol in complex samples of liquefied natural gas. The dispersions were composed of water-chlorobenzene (water analysis) and water-oleic acid (monoethylene glycol analysis) with ethanol as the hydrotrope phase. The mean hydrodynamic diameter values for the nanostructures in the droplet-based water-chlorobenzene MEs were in the range of 1 to 11 nm. The microemulsification procedures were conducted by adding ethanol to water-oleic acid (W-O) mixtures with the aid of a micropipette and shaking. The Φ(ME) measurements were performed in a thermostatic water bath at 23 °C by direct observation, that is, by visual analysis of the media. The experiments to determine water demonstrated that the analytical performance depends on the composition of the ME, showing the flexibility of the developed method. The linear range was fairly broad, with limits of linearity up to 70.00% water in ethanol.
For monoethylene glycol in water, in turn, the linear range was observed throughout the volume fraction of analyte. The best limits of detection were 0.32% v/v water in ethanol and 0.30% v/v monoethylene glycol in water. Furthermore, the accuracy was highly satisfactory. The natural gas samples provided by Petrobras exhibited color, particulate material, high ionic strength, and diverse compounds such as metals, carboxylic acids, and anions. These samples had a conductivity of up to 2630 μS cm⁻¹; the conductivity of pure monoethylene glycol was only 0.30 μS cm⁻¹. Despite these challenges, the method allowed accurate measurements while bypassing steps such as extraction, preconcentration, and dilution of the sample. In addition, the levels of robustness were promising. This parameter was evaluated by investigating the effect of (i) deviations in the volumetric preparation of the dispersions and (ii) changes in temperature on the analyte contents recorded by the method.
Pilot testing of SHRP 2 reliability data and analytical products: Florida.
DOT National Transportation Integrated Search
2015-01-01
Transportation agencies have realized the importance of performance estimation, measurement, and management. The Moving Ahead for Progress in the 21st Century Act legislation identifies travel time reliability as one of the goals of the federal highw...
Validation of urban freeway models. [supporting datasets
DOT National Transportation Integrated Search
2015-01-01
The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...
Néri-Quiroz, José; Canto, Fabrice; Guillerme, Laurent; Couston, Laurent; Magnaldo, Alastair; Dugas, Vincent
2016-10-01
A miniaturized and automated approach for the determination of free acidity in solutions containing uranium (VI) is presented. The measurement technique is based on the concept of sequential injection analysis with on-line spectroscopic detection. The proposed methodology relies on the complexation and alkalimetric titration of nitric acid using a pH 5.6 sodium oxalate solution. The titration process is followed by UV/VIS detection at 650 nm thanks to the addition of Congo red as a universal pH indicator. The mixing sequence as well as the method's validity was investigated by numerical simulation. This new analytical design allows fast (2.3 min), reliable and accurate free acidity determination of low-volume samples (10 µL) containing a uranium/[H+] mole ratio of 1:3, with a relative standard deviation of <7.0% (n=11). The linearity of the free nitric acid measurement is excellent up to 2.77 mol L⁻¹, with a correlation coefficient (R²) of 0.995. The method is specific: the presence of actinide ions up to 0.54 mol L⁻¹ does not interfere with the determination of free nitric acid. In addition to automation, the developed sequential injection analysis method greatly improves on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousandfold, the nuclear waste per analysis fortyfold, and the analysis time eightfold. These analytical parameters are especially important in nuclear-related applications to improve laboratory safety, reduce personnel exposure to radioactive samples, and drastically reduce environmental impacts and analytical radioactive waste. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kozubal, Janusz; Tomanovic, Zvonko; Zivaljevic, Slobodan
2016-09-01
In the present study, a numerical model of a pile embedded in marl, described by a time-dependent material model based on laboratory tests, is proposed. The solutions complement the state of knowledge on monopiles loaded at the head by a horizontal force whose value varies randomly in time. The investigated reliability problem is defined by the union of failure events given by excessive maximal horizontal displacement of the pile head in each load period. Abaqus was used for modelling the presented task, with a two-layered viscoplastic model for the marl. The mechanical parameters of both parts of the model, plastic and rheological, were calibrated against the laboratory creep test results. An important aspect of the problem is the reliability analysis of a monopile in a complex environment under random sequences of loads, which helps in understanding the role of viscosity in the behaviour of structures founded on rock. Due to the lack of analytical solutions, the computations were performed with the response surface method in conjunction with a wavelet neural network, an approach recommended for time-sequence processes and the description of nonlinear phenomena.
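The response-surface idea can be sketched as: fit a cheap surrogate to a handful of expensive finite element runs, then estimate the probability of excessive pile-head displacement by Monte Carlo on the surrogate. The stand-in "FE model", load distribution, and displacement limit below are all invented for illustration, and a simple quadratic replaces the paper's wavelet neural network:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for the expensive FE model: pile-head displacement (mm) as a
# function of horizontal load H (kN). Purely illustrative.
def fe_model(H):
    return 0.05 * H + 2e-5 * H ** 2

# Fit a quadratic response surface to a few design points ("FE runs").
H_design = np.linspace(100.0, 900.0, 7)
u_design = fe_model(H_design)
coeffs = np.polyfit(H_design, u_design, 2)        # cheap surrogate

# Monte Carlo reliability: probability that the displacement exceeds the
# serviceability limit under a random horizontal load (lognormal, assumed).
H_samples = rng.lognormal(mean=np.log(400.0), sigma=0.35, size=200_000)
u_samples = np.polyval(coeffs, H_samples)
limit = 40.0                                      # mm, assumed displacement limit
p_fail = np.mean(u_samples > limit)
print(f"P(failure) ~ {p_fail:.4f}")
```

The surrogate makes the 200,000 evaluations trivial, whereas the same Monte Carlo loop over the full viscoplastic Abaqus model would be computationally prohibitive.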
Wang, Yuanyuan; Li, Xiaowei; Zhang, Zhiwen; Ding, Shuangyang; Jiang, Haiyang; Li, Jiancheng; Shen, Jianzhong; Xia, Xi
2016-02-01
A sensitive, confirmatory ultra-high performance liquid chromatography-tandem mass spectrometric method was developed and validated to detect 23 veterinary drugs and metabolites (nitroimidazoles, benzimidazoles, and chloramphenicol components) in bovine milk. Compounds of interest were sequentially extracted from milk with acetonitrile and basified acetonitrile using sodium chloride to induce liquid-liquid partition. The extract was purified on a mixed mode solid-phase extraction cartridge. Using rapid polarity switching in electrospray ionization, a single injection was capable of detecting both positively and negatively charged analytes in a 9 min chromatography run time. Recoveries based on matrix-matched calibrations and isotope labeled internal standards for milk ranged from 51.7% to 101.8%. The detection limits and quantitation limits of the analytical method were found to be within the range of 2-20 ng/kg and 5-50 ng/kg, respectively. The recommended method is simple, specific, and reliable for the routine monitoring of nitroimidazoles, benzimidazoles, and chloramphenicol components in bovine milk samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano
2018-08-01
The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance with Commission Decision 2002/657/EC, and the method's applicability was checked in several samples of meat products. A simple procedure, with low-temperature-partitioning solid-liquid extraction, was developed. Nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as the stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg kg⁻¹, respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.
Liu, Xiaolu; Yang, Tao; Hu, Jiye
2013-01-01
A method has been developed and established for residue determination of benazolin-ethyl in soil and rape seed samples by gas chromatography with electron capture detection (GC-ECD). The limits of quantification of the method are 0.005 mg/kg for both soil and rape seed, sufficiently below the maximum residue limit, and the limit of detection is 0.0023 ng. The average recoveries of the analyte range from 85.89 to 105.84%, with relative standard deviations (coefficients of variation) of less than 5.53% at the three spiking levels (0.005, 0.1 and 0.5 mg/kg). The half-life of benazolin-ethyl in soil from the experimental field is 4.62 days. The final residues of benazolin-ethyl in soil and rape seed samples are lower than 0.005 mg/kg at harvest time. Direct confirmation of the analyte in real samples is achieved by GC-mass spectrometry. The proposed method is demonstrated to be simple, rapid, efficient, and reliable for detecting benazolin-ethyl residues in soil and rape seed samples.
Application of Dynamic Analysis in Semi-Analytical Finite Element Method
Oeser, Markus
2017-01-01
Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and requires only a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm applying dynamic analysis to SAFEM is introduced in detail. Asphalt pavement models under moving loads were built in SAFEM and in the commercial finite element software ABAQUS to verify the accuracy and efficiency of SAFEM. The verification shows that the computational accuracy of SAFEM is sufficiently high and its computational time is much shorter than that of ABAQUS. Moreover, experimental verification was carried out, and the predictions derived from SAFEM are consistent with the measurements. Therefore, SAFEM can reliably predict the dynamic response of asphalt pavement under moving loads, thus proving beneficial to road administrations in assessing pavement condition. PMID:28867813
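The semi-analytical principle (exact trigonometric expansion in one direction, numerical discretization in the others) can be shown on a toy problem: a 2D Poisson equation solved with an FFT along y and finite differences along x, giving one independent 1D system per Fourier mode. The discretization and load below are illustrative stand-ins, not SAFEM's actual pavement formulation:

```python
import numpy as np

# Solve -u_xx - u_yy = f on a strip, periodic in y: expand in Fourier modes
# along y (the "analytical" direction), finite differences along x.
Nx, Ny, Lx = 64, 32, 1.0
dx = Lx / (Nx + 1)
x = np.linspace(dx, Lx - dx, Nx)                  # interior nodes, Dirichlet ends
ky = 2 * np.pi * np.fft.fftfreq(Ny, d=1.0 / Ny)   # wavenumbers along y

# Synthetic distributed load.
f = np.outer(np.sin(np.pi * x), np.cos(2 * np.pi * np.arange(Ny) / Ny))
f_hat = np.fft.fft(f, axis=1)                     # one 1D problem per mode

# Per mode m: (-d2/dx2 + ky[m]^2) U_m = F_m, tridiagonal 1D systems.
main, off = 2.0 / dx ** 2, -1.0 / dx ** 2
u_hat = np.zeros_like(f_hat)
for m in range(Ny):
    A = (np.diag(np.full(Nx, main + ky[m] ** 2))
         + np.diag(np.full(Nx - 1, off), 1)
         + np.diag(np.full(Nx - 1, off), -1))
    u_hat[:, m] = np.linalg.solve(A, f_hat[:, m])

u = np.fft.ifft(u_hat, axis=1).real               # superpose the modes
print(u.shape)  # (64, 32)
```

Each mode's system is small and independent, which is why reducing a 3D problem to a series of 2D (here, 1D) problems cuts the computational cost so sharply relative to a full discretization.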
Criado-García, Laura; Garrido-Delgado, Rocío; Arce, Lourdes; Valcárcel, Miguel
2013-07-15
A UV-ion mobility spectrometer is a simple, rapid, inexpensive instrument widely used in environmental analysis, among other fields. The advantageous features of its underlying technology can be of great help in developing reliable, economical methods for determining gaseous compounds from gaseous, liquid and solid samples. Developing an effective method using UV-ion mobility spectrometry (UV-IMS) to determine volatile analytes entails using appropriate gaseous standards for calibrating the spectrometer. In this work, two home-made sample introduction systems (SISs) and a commercial gas generator were used to obtain such gaseous standards. The first home-made SIS was a static head-space system for measuring compounds present in liquid samples, and the other was an exponential dilution set-up for measuring compounds present in gaseous samples. Gaseous compounds generated by each method were determined on-line by UV-IMS. The target analytes chosen for this comparative study were ethanol, acetone, benzene, toluene, ethylbenzene and the xylene isomers. The different alternatives were acceptable in terms of sensitivity, precision and selectivity. Copyright © 2013 Elsevier B.V. All rights reserved.
Falasca, Sara; Petruzziello, Filomena; Kretz, Robert; Rainer, Gregor; Zhang, Xiaozhe
2012-06-08
Endogenous quaternary ammonium compounds are involved in various physiological processes in the central nervous system. In the present study, eleven quaternary ammonium compounds, including acetylcholine, choline, carnitine, acetylcarnitine and seven other acylcarnitines of low polarity, were analyzed from brain extracts using a two-dimensional capillary liquid chromatography-Fourier transform mass spectrometry method. To deal with their large differences in hydrophobicity, tandem coupling of reversed-phase and hydrophilic interaction chromatography columns was used to separate all the targeted quaternary ammonium compounds. Using high-accuracy mass spectrometry in selected ion monitoring mode, all the compounds could be detected in each brain sample with high selectivity. The developed method was applied to the relative quantification of these quaternary ammonium compounds in three different brain regions of tree shrews: prefrontal cortex, striatum, and hippocampus. The comparative analysis showed that quaternary ammonium compounds were differentially distributed across the three brain areas. The analytical method proved to be highly sensitive and reliable for the simultaneous determination of all the targeted analytes in brain samples. Copyright © 2012 Elsevier B.V. All rights reserved.
RP-HPLC determination of water-soluble vitamins in honey.
Ciulu, Marco; Solinas, Silvia; Floris, Ignazio; Panzanelli, Angelo; Pilo, Maria I; Piu, Paola C; Spano, Nadia; Sanna, Gavino
2011-01-15
The assessment and validation of reliable analytical methods for the determination of vitamins in sugar-based matrices (e.g. honey) are still scarcely explored fields of research. This study proposes and fully validates a simple and fast RP-HPLC method for the simultaneous determination of five water-soluble vitamins (vitamin B2, riboflavin; vitamin B3, nicotinic acid; vitamin B5, pantothenic acid; vitamin B9, folic acid; and vitamin C, ascorbic acid) in honey. The method provides low detection and quantification limits, very good linearity over a large concentration interval, very good precision, and the absence of any bias. It has been successfully applied to 28 honey samples (mainly from Sardinia, Italy) of 12 different botanical origins. While the overall amount of the analytes in the samples is quite low (always below 40 mg kg⁻¹), we have observed a marked dependence of some of their concentrations (i.e. vitamin B3 and vitamin B5) on the botanical origin of the honey. This insight might lead to important characterization features for this food item. Copyright © 2010 Elsevier B.V. All rights reserved.
Yang, Xing-Xin; Zhang, Xiao-Xia; Chang, Rui-Miao; Wang, Yan-Wei; Li, Xiao-Ni
2011-01-01
A simple and reliable high performance liquid chromatography (HPLC) method has been developed for the simultaneous quantification of five major bioactive components in ‘Shu-Jin-Zhi-Tong’ capsules (SJZTC), for the purposes of quality control of this commonly prescribed traditional Chinese medicine. Under the optimum conditions, excellent separation was achieved, and the assay was fully validated in terms of linearity, precision, repeatability, stability and accuracy. The validated method was applied successfully to the determination of the five compounds in SJZTC samples from different production batches. The HPLC method can be used as a valid analytical method to evaluate the intrinsic quality of SJZTC. PMID:29403711
Lattice Boltzmann method for rain-induced overland flow
NASA Astrophysics Data System (ADS)
Ding, Yu; Liu, Haifei; Peng, Yong; Xing, Liming
2018-07-01
Complex rainfall situations can generate overland flow with complex hydrodynamic characteristics, affecting the surface configuration (i.e. sheet erosion) and environment to varying degrees. Reliable numerical simulations can provide a scientific method for the optimization of environmental management. A mesoscopic numerical method, the lattice Boltzmann method, was employed to simulate overland flows. To deal with complex rainfall, two schemes were introduced to improve the lattice Boltzmann equation and the local equilibrium function, respectively. Four typical cases with differences in rainfall, bed roughness, and slopes were selected to test the accuracy and applicability of the proposed schemes. It was found that the simulated results were in good agreement with the experimental data, analytical values, and the results produced by other models.
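The collide-and-stream machinery referred to in the abstract above can be illustrated with a minimal sketch. This is a generic D1Q3 lattice Boltzmann relaxation for pure diffusion, not the paper's shallow-water scheme with rainfall source terms; the lattice size, relaxation time, and initial pulse are invented.

```python
import numpy as np

# Generic D1Q3 lattice Boltzmann sketch for pure diffusion (collide-and-stream).
# Illustrates the method's machinery only -- NOT the paper's shallow-water
# scheme with rainfall source terms. All parameters are invented.
N, tau, steps = 100, 0.8, 500
w = np.array([2/3, 1/6, 1/6])           # lattice weights for c = 0, +1, -1
rho = np.zeros(N)
rho[N // 2] = 1.0                       # initial concentration pulse
f = w[:, None] * rho[None, :]           # start at local equilibrium

for _ in range(steps):
    rho = f.sum(axis=0)                 # macroscopic density
    feq = w[:, None] * rho[None, :]     # equilibrium distribution
    f += (feq - f) / tau                # BGK collision
    f[1] = np.roll(f[1], 1)             # stream right-movers (c = +1)
    f[2] = np.roll(f[2], -1)            # stream left-movers  (c = -1)

rho = f.sum(axis=0)
```

With BGK relaxation the diffusivity is cs²(tau − 0.5) in lattice units (cs² = 1/3 here), and total mass is conserved exactly by both the collision and the periodic streaming steps.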
Reference materials for cellular therapeutics.
Bravery, Christopher A; French, Anna
2014-09-01
The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
AZARI, Nadia; SOLEIMANI, Farin; VAMEGHI, Roshanak; SAJEDI, Firoozeh; SHAHSHAHANI, Soheila; KARIMI, Hossein; KRASKIAN, Adis; SHAHROKHI, Amin; TEYMOURI, Robab; GHARIB, Masoud
2017-01-01
Objective Bayley Scales of Infant & Toddler Development is a well-known diagnostic developmental assessment tool for children aged 1–42 months. Our aim was to investigate the validity and reliability of this scale in Persian-speaking children. Materials & Methods The method was descriptive-analytic. Translation, back-translation, and cultural adaptation were performed. Content and face validity of the translated scale were determined by experts’ opinions. Overall, 403 children aged 1 to 42 months were recruited from health centers of Tehran during 2013-2014 for developmental assessment in cognitive, communicative (receptive & expressive) and motor (fine & gross) domains. Reliability of the scale was calculated through three methods: internal consistency using Cronbach’s alpha coefficient, test-retest, and inter-rater methods. Construct validity was calculated using factor analysis and comparison of mean scores. Results Cultural and linguistic changes were made in items of all domains, especially on the communication subscale. Content and face validity of the test were approved by experts’ opinions. Cronbach’s alpha coefficient was above 0.74 in all domains. Pearson correlation coefficients in the various domains were ≥ 0.982 for the test-retest method and ≥ 0.993 for the inter-rater method. Construct validity of the test was approved by factor analysis. Moreover, the mean scores for the different age groups were compared, and statistically significant differences were observed between the mean scores of different age groups, which confirms the validity of the test. Conclusion The Bayley Scales of Infant and Toddler Development is a valid and reliable tool for child developmental assessment in Persian-language children. PMID:28277556
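The internal-consistency statistic reported above (Cronbach's alpha) is straightforward to compute from item scores; a minimal sketch with invented data:

```python
# Cronbach's alpha from item-level scores. The item scores below are
# invented for illustration (four respondents, three items).
def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents."""
    k, n = len(items), len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(col) for col in items) / var(totals))

items = [[2, 4, 3, 5], [3, 4, 2, 5], [2, 5, 3, 4]]
alpha = cronbach_alpha(items)
```

Alpha rises when items covary strongly relative to their individual variances, which is why values above roughly 0.7 (as in the study's domains) are read as acceptable internal consistency.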
Identification of complex stiffness tensor from waveform reconstruction
NASA Astrophysics Data System (ADS)
Leymarie, N.; Aristégui, C.; Audoin, B.; Baste, S.
2002-03-01
An inverse method is proposed to determine the viscoelastic properties of composite-material plates from the plane-wave transmitted acoustic field. Analytical formulations of the plate transmission coefficient and of its first and second derivatives are established and included in a two-step inversion scheme. Two objective functions to be minimized are then designed by considering the well-known maximum-likelihood principle and by using an analytic signal formulation. Through these innovative objective functions, the robustness of the inversion process against high levels of noise in the waveforms is improved, and the method can be applied to very thin specimens. The suitability of the inversion process for viscoelastic property identification is demonstrated using simulated data for composite materials with different degrees of anisotropy and damping. A study of the effect of the choice of rheologic model on the elastic property identification emphasizes the relevance of using a phenomenological description that accounts for viscosity. Experimental characterizations then confirm the good reliability of the proposed approach, although difficulties arise experimentally for particular anisotropic media.
Calculation of ground vibration spectra from heavy military vehicles
NASA Astrophysics Data System (ADS)
Krylov, V. V.; Pickup, S.; McNuff, J.
2010-07-01
The demand for reliable autonomous systems capable of detecting and identifying heavy military vehicles has become an important issue for UN peacekeeping forces in the current delicate political climate. A promising method of detection and identification uses the information extracted from the ground vibration spectra generated by heavy military vehicles, often termed their seismic signatures. This paper presents the results of a theoretical investigation of ground vibration spectra generated by heavy military vehicles, such as tanks and armored personnel carriers. A simple quarter-car model is considered to identify the resulting dynamic forces applied from a vehicle to the ground. The obtained analytical expressions for vehicle dynamic forces are then used to calculate the generated ground vibrations, predominantly Rayleigh surface waves, using the Green's function method. A comparison of the obtained theoretical results with published experimental data shows that analytical techniques based on the simplified quarter-car vehicle model are capable of producing ground vibration spectra of heavy military vehicles that reproduce the basic properties of experimental spectra.
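The quarter-car idea above can be sketched numerically: a sprung mass on a suspension above an unsprung mass on a tire spring, driven by a road profile, with the force transmitted to the ground taken as the tire-spring force. All masses, stiffnesses and the excitation below are invented placeholders, not the paper's vehicle parameters.

```python
import math

# Quarter-car sketch: sprung mass m_s on a suspension (k_s, c_s) above an
# unsprung mass m_u on tire/track stiffness k_t. Parameters are invented.
m_s, m_u = 2000.0, 500.0        # masses, kg
k_s, c_s = 2.0e5, 1.5e4         # suspension stiffness (N/m), damping (N s/m)
k_t = 1.0e6                     # tire/track stiffness, N/m

def simulate(road, dt=1e-3, steps=2000):
    """Semi-implicit Euler integration; returns the ground-force history."""
    z_s = v_s = z_u = v_u = 0.0
    forces = []
    for i in range(steps):
        y = road(i * dt)                               # road profile input
        f_susp = k_s * (z_u - z_s) + c_s * (v_u - v_s) # suspension force
        f_tire = k_t * (y - z_u)                       # tire deflection force
        v_s += (f_susp / m_s) * dt
        v_u += ((f_tire - f_susp) / m_u) * dt
        z_s += v_s * dt
        z_u += v_u * dt
        forces.append(k_t * (z_u - y))                 # dynamic force on ground
    return forces

bump = lambda t: 0.02 * math.sin(2 * math.pi * 5 * t)  # 2 cm, 5 Hz excitation
forces = simulate(bump)
```

The spectrum of such a force history (e.g. via an FFT) is what would then feed a Green's-function calculation of the radiated Rayleigh waves.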
Kuscu, Murat; Akan, Ozgur B.
2018-01-01
We consider a microfluidic molecular communication (MC) system, where the concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for the information and communication technology (ICT), as they enable an optimisation framework to develop advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow resulting in parabolic velocity profile, and finite number of ligand receptors leading to receiver saturation. The model also captures the effects of reactive surface depletion layer resulting from the mass transport limitations and moving reaction boundary originated from the passage of finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to the numerical results obtained by solving the exact system model with COMSOL Multiphysics. PMID:29415019
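The receiver-saturation nonlinearity described above can be illustrated in a reduced setting. The sketch below drops transport entirely (a well-mixed approximation, unlike the paper's convection-diffusion-reaction system) and keeps only the finite-receptor binding kinetics; all rate constants and the receptor count are invented.

```python
import math

# Well-mixed ligand-receptor occupancy: retains only the saturation effect of
# a finite receptor pool, ignoring convection and diffusion. Values invented.
k_on, k_off, R_total = 1e-2, 5e-3, 100.0

def bound(ligand, t):
    """Analytic solution of dB/dt = k_on*L*(R_total - B) - k_off*B, B(0) = 0."""
    rate = k_on * ligand + k_off
    b_ss = R_total * k_on * ligand / rate      # steady-state occupancy
    return b_ss * (1.0 - math.exp(-rate * t))
```

For a constant ligand concentration, occupancy rises exponentially toward the saturating steady state b_ss; a finite-duration concentration pulse, as in the paper, would be handled piecewise, with a decay phase after the pulse passes the receiver.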
Wille, Klaas; Claessens, Michiel; Rappé, Karen; Monteyne, Els; Janssen, Colin R; De Brabander, Hubert F; Vanhaecke, Lynn
2011-12-23
The presence of both pharmaceuticals and pesticides in the aquatic environment has become a well-known environmental issue during the last decade. However, an increasing demand still exists for sensitive and reliable monitoring tools for these rather polar contaminants in the marine environment. In recent years, the great potential of passive samplers, or equilibrium-based sampling techniques, for evaluating the fate of these contaminants has been shown in the literature. Therefore, we developed a new analytical method for the quantification of a large number of pharmaceuticals and pesticides in passive sampling devices. The analytical procedure consisted of extraction using 1:1 methanol/acetonitrile followed by detection with ultra-high performance liquid chromatography coupled to high-resolution, high-mass-accuracy Orbitrap mass spectrometry. Validation of the analytical method resulted in limits of quantification and recoveries ranging between 0.2 and 20 ng per sampler sheet and between 87.9 and 105.2%, respectively. Determination of the sampler-water partition coefficients of all compounds demonstrated that several pharmaceuticals and most pesticides exhibit a high affinity for the polydimethylsiloxane passive samplers. Finally, the developed analytical methods were used to measure the time-weighted average (TWA) concentrations of the targeted pollutants in passive samplers deployed at eight stations in the Belgian coastal zone. Propranolol, carbamazepine and seven pesticides were found to be very abundant in the passive samplers. The obtained long-term and large-scale TWA concentrations will contribute to assessing the environmental and human health risks of these emerging pollutants. Copyright © 2011 Elsevier B.V. All rights reserved.
Lin, Monica; Lin, Kham; Lin, Amanda; Gras, Ronda; Luong, Jim
2016-07-01
A novel approach for the determination of parts-per-billion levels of 5-hydroxymethyl-2-furaldehyde, furfuryl alcohol, furfural, 2-furyl methyl ketone, and 5-methylfurfural in transformer or rectifier oils has been developed and implemented. Various extraction methods, including solid-phase extraction and liquid-liquid extraction using methanol, acetonitrile, and water, were studied. Water was by far the most efficient solvent for use as an extraction medium. Separation of the analytes was conducted using a 4.6 mm × 250 mm × 3.5 μm Agilent Zorbax column, while detection and quantitation were conducted with a variable-wavelength UV detector. Detection limits for all furans were 1 ppb v/v, with linear ranges from 5 to 1000 ppb v/v and correlation coefficients of 0.997 or better. A relative standard deviation of at most 2.4% at 1000 ppb v/v and 7.3% at 5 ppb v/v, and recoveries from 43% to 90% depending on the analyte monitored, were obtained. The method was purposely designed to be environmentally friendly, with water as the extraction medium. Also, the method uses 80% water and 20% acetonitrile, with a mere 0.2 mL/min of acetonitrile in the acetonitrile/water mobile phase. The analytical technique has been demonstrated to be highly reliable with a low cost of ownership, suitable for deployment in quality control labs or in regions where analytical resources and solvents are difficult to procure. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
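The precision (RSD) and recovery figures quoted above follow standard definitions; a small sketch with invented replicate data:

```python
from statistics import mean, stdev

# Standard definitions behind relative standard deviation and recovery.
# The replicate values below are invented, not the paper's data.
def rsd_percent(values):
    """Relative standard deviation, %: 100 * s / mean."""
    return 100.0 * stdev(values) / mean(values)

def recovery_percent(measured, spiked):
    """Recovery, %: measured amount relative to the spiked amount."""
    return 100.0 * measured / spiked

replicates = [4.8, 5.1, 4.9, 5.2, 5.0]   # five replicate results, ppb (invented)
rsd = rsd_percent(replicates)
rec = recovery_percent(mean(replicates), 5.0)
```

RSD typically worsens near the quantification limit (7.3% at 5 ppb versus 2.4% at 1000 ppb in the study), which is why precision is reported at both ends of the linear range.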
Regulatory observations in bioanalytical determinations.
Viswanathan, C T
2010-07-01
The concept of measuring analytes in biological media is a long-established area of the quantitative sciences that is employed in many sectors. While academic research and R&D units of private firms have been in the forefront of developing complex methodologies, it is the regulatory environment that has brought the focus and rigor to the quality control of the quantitative determination of drug concentration in biological samples. In this article, the author examines the regulatory findings discovered during the course of several years of auditing bioanalytical work. The outcomes of these findings underscore the importance of quality method validation to ensure the reliability of the data generated. The failure to ensure the reliability of these data can lead to potential risks in the health management of millions of people in the USA.
Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?
Ershadi, Saba; Shayanfar, Ali
2018-03-22
The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
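One of the calculation conventions compared in such studies derives both limits from a dispersion estimate divided by the calibration slope. The sketch below uses invented calibration data and an assumed blank standard deviation:

```python
# LOD/LOQ via the dispersion-over-slope convention. Calibration data and the
# blank standard deviation are invented for illustration.
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def lod_loq(sigma, slope):
    """LOD = 3.3*sigma/S, LOQ = 10*sigma/S (ICH-style factors)."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

conc   = [10, 20, 40, 80, 160]            # ng/mL (invented standards)
signal = [105, 198, 405, 797, 1602]       # instrument response (invented)
slope, intercept = linear_fit(conc, signal)
sigma_blank = 4.0                          # SD of blank responses (assumed)
lod, loq = lod_loq(sigma_blank, slope)
```

Replacing the blank standard deviation with the residual standard deviation of the calibration line is the other widely used variant; differing choices of the dispersion estimate are one source of the discrepancies the study reports against the FDA-style LLOQ.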
Michael, Costas; Bayona, Josep Maria; Lambropoulou, Dimitra; Agüera, Ana; Fatta-Kassinos, Despo
2017-06-01
Occurrence and effects of contaminants of emerging concern pose a special challenge to environmental scientists. The investigation of these effects requires reliable, valid, and comparable analytical data. To this effect, two critical aspects concerning the limitations of the produced analytical data are raised herein. The first relates to the inherent difficulty of analyzing environmental samples when the form(s) in which the contaminant is present in the sample are unknown, as is often the case. The produced analytical data can then only refer to the amount of the free contaminant, ignoring the amount present in other forms, e.g., chelated or conjugated forms. The second aspect refers to the way the spiking procedure is generally performed to determine the recovery of the analytical method. Spiking environmental samples, in particular solid samples, with a standard solution followed by immediate extraction, as is common practice, can lead to an overestimation of the recovery. This is because no time is given for the system to establish possible equilibria between the solid matter (inorganic and/or organic) and the contaminant. Therefore, the spiking procedure needs to be reconsidered by including a study of the extractable amount of the contaminant versus the time elapsed between spiking and extraction of the sample. Such a study can become an element of the validation package of the method.
NASA Astrophysics Data System (ADS)
Cao, Lu; Verbeek, Fons J.
2012-03-01
In computer graphics and visualization, reconstruction of a 3D surface from a point cloud is an important research area. As the surface contains information that can be measured, i.e. expressed in features, surface reconstruction is potentially important for applications in bio-imaging. Opportunities in this application area are the motivation for this study. In the past decade, a number of algorithms for surface reconstruction have been proposed. Generally speaking, these methods fall into two categories: explicit representation and implicit approximation. Most of the aforementioned methods are firmly based in theory; however, so far, no analytical evaluation between these methods has been presented, and evaluation has typically relied on visual inspection. Through evaluation we search for a method that precisely preserves surface characteristics and is robust in the presence of noise. The outcome will be used to improve reliability in surface reconstruction of biological models. We therefore take an analytical approach, selecting features as surface descriptors and measuring these features under varying conditions. We selected surface distance, surface area and surface curvature as three major features for comparing the quality of the surfaces created by the different algorithms. Our starting point has been ground-truth values obtained from analytical shapes such as the sphere and the ellipsoid. In this paper we evaluate four classical surface reconstruction methods from the two categories mentioned above, i.e. the Power Crust, the Robust Cocone, the Fourier-based method and the Poisson reconstruction method. The results obtained from our experiments indicate that the Poisson reconstruction method performs best in the presence of noise.
Pilot testing of SHRP 2 reliability data and analytical products: Southern California.
DOT National Transportation Integrated Search
2015-01-01
The second Strategic Highway Research Program (SHRP 2) has been investigating the critical subject of travel time reliability for several years. As part of this research, SHRP 2 supported multiple efforts to develop products to evaluate travel time r...
Assessment of Semi-Structured Clinical Interview for Mobile Phone Addiction Disorder
Alavi, Seyyed Salman; Jannatifard, Fereshteh; Mohammadi Kalhori, Soroush; Sepahbodi, Ghazal; BabaReisi, Mohammad; Sajedi, Sahar; Farshchi, Mojtaba; KhodaKarami, Rasul; Hatami Kasvaee, Vahid
2016-01-01
Objective: The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) classified mobile phone addiction disorder under “impulse control disorder not elsewhere classified”. This study surveyed the diagnostic criteria of DSM-IV-TR for the diagnosis of mobile phone addiction in correspondence with Iranian society and culture. Method: Two hundred fifty students of Tehran universities were entered into this descriptive-analytical, cross-sectional study. A quota sampling method was used. First, a semi-structured clinical interview (based on DSM-IV-TR) was performed for all cases, and another specialist re-evaluated the interviews. Data were analyzed using content validity, inter-scorer reliability (kappa coefficient) and test-retest reliability via SPSS 18 software. Results: The content validity of the semi-structured clinical interview matched the DSM-IV-TR criteria for behavioral addiction. Moreover, its content was appropriate, and two items, “SMS pathological use” and “high monthly cost of using the mobile phone”, were added to improve its validity. Internal reliability (kappa) and test-retest reliability were 0.55 and r = 0.4 (p < 0.01), respectively. Conclusion: The results of this study revealed that the semi-structured diagnostic criteria of DSM-IV-TR are valid and reliable for diagnosing mobile phone addiction, and this instrument is an effective tool to diagnose this disorder. PMID:27437008
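The inter-scorer statistic used above (Cohen's kappa) corrects raw agreement for the agreement expected by chance; a sketch with invented diagnostic labels:

```python
# Cohen's kappa for two raters over the same cases. The labels below are
# invented (1 = disorder present, 0 = absent), not the study's ratings.
def cohens_kappa(a, b):
    n = len(a)
    cats = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n                   # observed
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # by chance
    return (po - pe) / (1.0 - pe)

rater_a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
kappa = cohens_kappa(rater_a, rater_b)
```

Here raw agreement is 80%, but kappa is lower because both raters diagnose the disorder in most cases, so much of that agreement could occur by chance.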
NASA Astrophysics Data System (ADS)
Daly, Aoife; Streeton, Noëlle L. W.
2017-06-01
A technique for non-invasive dendrochronological analysis of oak was developed for archaeological material, using an industrial CT scanner. Since 2013, this experience has been extended within the scope of the research project 'After the Black Death: Painting and Polychrome Sculpture in Norway'. The source material for the project is a collection of late-medieval winged altarpieces, shrines, polychrome sculpture, and fragments from Norwegian churches, which are owned by the Museum of Cultural History, University of Oslo. The majority cannot be sampled, and many are too large to fit into the CT scanner. For these reasons, a combined approach was adopted, utilizing CT scanning where possible, but preceded by an 'exposed-wood' imaging technique. Both non-invasive techniques have yielded reliable results, and CT scanning has confirmed the reliability of the imaging technique alone. This paper presents the analytical methods, along with results from two of the 13 objects under investigation. Results for reliable dates and provenances provide new foundations for historical interpretations.
Tang, Wenfu; Wan, Meihua; Zhu, Zhengyan; Chen, Guanyuan; Huang, Xi
2008-04-29
Dachengqi Tang (DT) is a common traditional Chinese medicine formula for expelling neire ('internal heat') in the stomach and intestines. No reliable analytical method was previously available for the quality control of DT. A high-performance liquid chromatography (HPLC) method with a reversed-phase C18 column (150 x 4.6 mm) was developed. The mobile phase was methanol with 0.2% acetic acid. Eight markers, including naringin, hesperidin, aloe emodin, rhein, honokiol, magnolol, emodin and chrysophanol, were determined. Regression analysis revealed a linear relationship between the concentrations of the markers and the peak area ratio of the standards to the internal standard. The limit of detection (S/N = 3) and the limit of quantification (RSD < 20%) ranged from 0.21 to 0.43 ng/microl and 0.76 to 1.74 ng/microl, respectively. The recovery was between 95.6% and 103.4%. Tests on samples from three batches of DT showed that the profiles of the markers did not vary significantly among batches. A reliable HPLC method for simultaneous determination of the eight markers in DT was thus developed.
Analytical method for promoting process capability of shock absorption steel.
Sung, Wen-Pei; Shih, Ming-Hsiang; Chen, Kuen-Suan
2003-01-01
Mechanical properties and low-cycle fatigue are two factors that must be considered in developing new types of steel for shock absorption. Process capability and process control are significant factors in achieving the purpose of research and development programs. Often-used evaluation methods fail to measure process yield and process centering, so this paper uses the Taguchi loss function as a basis to establish an evaluation method, and the steps for assessing the quality of mechanical properties and process control, for an iron and steel manufacturer. The established method can serve research and development and the manufacturing industry, and lays a foundation for enhancing process control and for selecting manufacturing processes that are more reliable than those chosen by other commonly used decision-making methods.
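The Taguchi loss function on which the evaluation method above is based penalizes deviation from the target quadratically, so a process can be off-target (poor centering) even when its yield within specification looks acceptable. A minimal sketch, with invented constants rather than the paper's steel data:

```python
# Taguchi quadratic loss: L(y) = k*(y - m)^2 around target m. For a whole
# process, the average loss is k*(variance + (mean - m)^2), which penalizes
# both spread and off-centering. All numbers below are invented.
def taguchi_loss(y, target, k):
    return k * (y - target) ** 2

def expected_loss(proc_mean, proc_var, target, k):
    return k * (proc_var + (proc_mean - target) ** 2)

# Hypothetical process: mean 10.2 against target 10.0, variance 0.04,
# loss coefficient k chosen from the cost at the specification limit.
loss = expected_loss(10.2, 0.04, 10.0, 50.0)
```

In this invented example, the off-center term (mean − target)² contributes as much loss as the variance term, which a pure yield measure would not reveal.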
Santelmann, Hanno; Franklin, Jeremy; Bußhoff, Jana; Baethge, Christopher
2016-10-01
Schizoaffective disorder is a common diagnosis in clinical practice but its nosological status has been subject to debate ever since it was conceptualized. Although it is key that diagnostic reliability is sufficient, schizoaffective disorder has been reported to have low interrater reliability. Evidence based on systematic review and meta-analysis methods, however, is lacking. Using a highly sensitive literature search in Medline, Embase, and PsycInfo we identified studies measuring the interrater reliability of schizoaffective disorder in comparison to schizophrenia, bipolar disorder, and unipolar disorder. Out of 4126 records screened we included 25 studies reporting on 7912 patients diagnosed by different raters. The interrater reliability of schizoaffective disorder was moderate (meta-analytic estimate of Cohen's kappa 0.57 [95% CI: 0.41-0.73]), and substantially lower than that of its main differential diagnoses (difference in kappa between 0.22 and 0.19). Although there was considerable heterogeneity, analyses revealed that the interrater reliability of schizoaffective disorder was consistently lower in the overwhelming majority of studies. The results remained robust in subgroup and sensitivity analyses (e.g., diagnostic manual used) as well as in meta-regressions (e.g., publication year) and analyses of publication bias. Clinically, the results highlight the particular importance of diagnostic re-evaluation in patients diagnosed with schizoaffective disorder. They also quantify a widely held clinical impression of lower interrater reliability and agree with earlier meta-analysis reporting low test-retest reliability. Copyright © 2016. Published by Elsevier B.V.
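A meta-analytic estimate like the pooled kappa above is typically an inverse-variance weighted average of study-level estimates. The sketch below shows the fixed-effect version only; given the considerable heterogeneity the study reports, a random-effects model with a between-study variance term would be the appropriate refinement. The study-level kappas and variances are invented:

```python
import math

# Fixed-effect (inverse-variance) pooling with a normal-approximation 95% CI.
# Study-level estimates and variances are invented for illustration.
def pool_fixed(estimates, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

kappas    = [0.45, 0.60, 0.55]     # per-study Cohen's kappa (invented)
variances = [0.01, 0.02, 0.005]    # per-study sampling variances (invented)
pooled, ci_lo, ci_hi = pool_fixed(kappas, variances)
```

Precise studies (small variance) dominate the pooled value; here the third study carries the largest weight.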
NASA Astrophysics Data System (ADS)
Sidhom, H.; Amadou, T.; Sahlaoui, H.; Braham, C.
2007-06-01
The evaluation of the degree of sensitization (DOS) to intergranular corrosion (IGC) of a commercial AISI 316L austenitic stainless steel aged at temperatures ranging from 550 °C to 800 °C during 100 to 80,000 hours was carried out using three different assessment methods. (1) The microstructural method coupled with the Strauss standard test (ASTM A262). This method establishes the kinetics of the precipitation phenomenon under different aging conditions, by transmission electronic microscope (TEM) examination of thin foils and electron diffraction. The subsequent chromium-depleted zones are characterized by X-ray microanalysis using scanning transmission electronic microscope (STEM). The superimposition of microstructural time-temperature-precipitation (TTP) and ASTM A262 time-temperature-sensitization (TTS) diagrams provides the relationship between aged microstructure and IGC. Moreover, by considering the chromium-depleted zone characteristics, sensitization and desensitization criteria could be established. (2) The electrochemical method involving the double loop-electrochemical potentiokinetic reactivation (DL-EPR) test. The operating conditions of this test were initially optimized using the experimental design method on the bases of the reliability, the selectivity, and the reproducibility of test responses for both annealed and sensitized steels. The TTS diagram of the AISI 316L stainless steel was established using this method. This diagram offers a quantitative assessment of the DOS and a possibility to appreciate the time-temperature equivalence of the IGC sensitization and desensitization. (3) The analytical method based on the chromium diffusion models. Using the IGC sensitization and desensitization criteria established by the microstructural method, numerical solving of the chromium diffusion equations leads to a calculated AISI 316L TTS diagram. 
Comparison of these three methods gives a clear advantage to the nondestructive DL-EPR test when it is used with its optimized operating conditions. This quantitative method is simple to perform; it is fast, reliable, economical, and presents the best ability to detect the lowest DOS to IGC. For these reasons, this method can be considered as a serious candidate for IGC checking of stainless steel components of industrial plants.
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.
2010-01-01
The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and a lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite-element-based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon-level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help characterize the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a representative braided composite are conducted. Overall, the developed method shows promise, but improvements needed in test and analysis methods for better predictive capability are examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meinke, Rainer B.; Goodzeit, Carl L.; Ball, Millicent J.
This research project advanced the development of reliable, cost-effective arrays of superconducting quadrupole magnets for use in multi-beam inertial fusion accelerators. The field in each array cell must be identical and meet stringent requirements for field quality and strength. An optimized compact array design using flat double-layer pancake coils was developed. Analytical studies of edge termination methods showed that it is feasible to meet the requirements for field uniformity in all cells and elimination of stray external field in several ways: active methods that involve placement of field-compensating coils on the periphery of the array, or a passive method that involves the use of iron shielding.
Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki
2010-09-15
Neutron activation analysis (NAA) coupled with an internal standard method was applied for the determination of As in the certified reference material (CRM) of arsenobetaine (AB) standard solutions to verify their certified values. Gold was used as an internal standard to compensate for differences in neutron exposure within an irradiation capsule and to improve the sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 µg of As. The analytical reliability of the proposed method was evaluated by k(0)-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499 ± 55) mg kg⁻¹ (k = 2) and (10.16 ± 0.15) mg kg⁻¹ (k = 2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, although that of NMIJ CRM 7901-a was almost negligible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
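The internal-standard principle used here, normalizing the analyte signal by a co-measured reference so that per-sample factors such as uneven neutron exposure cancel in the ratio, can be sketched as a simple ratio calibration. The function and numbers below are illustrative assumptions, not data from the study.

```python
import numpy as np

def internal_standard_calibration(conc, analyte_signal, is_signal):
    """Linear calibration on the analyte / internal-standard response ratio.

    Dividing by the internal-standard signal cancels per-sample factors
    (e.g. uneven exposure) that scale both signals by the same amount."""
    ratio = np.asarray(analyte_signal, float) / np.asarray(is_signal, float)
    slope, intercept = np.polyfit(conc, ratio, 1)
    return slope, intercept

# synthetic data: each sample sees a different 'flux' factor that the
# internal standard compensates for
conc = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
flux = np.array([0.9, 1.1, 0.95, 1.05, 1.0])
analyte = flux * (2.0 * conc)   # true sensitivity: 2.0 per concentration unit
istd = flux * 1.0               # internal standard sees the same flux factor
slope, intercept = internal_standard_calibration(conc, analyte, istd)
# slope recovers the true sensitivity of 2.0 despite the varying flux
```

Without the ratio step, a fit of the raw analyte signal against concentration would be biased by the per-sample flux variation.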
Ying, Xixiang; Meng, Xiansheng; Wang, Siyuan; Wang, Dong; Li, Haibo; Wang, Bing; Du, Yang; Liu, Xun; Zhang, Wenjie; Kang, Tingguo
2012-01-01
A simple and sensitive HPLC method was developed to simultaneously determine three active compounds, vitexin-4″-O-glucoside (VG), vitexin-2″-O-rhamnoside (VR) and hyperoside (HP), in rat plasma after administering the hawthorn leaves extract (HLE). An HPLC assay with baicalin as the internal standard was carried out using a Phenomsil C₁₈ analytical column with UV detection at 332 nm. The mobile phase consisted of methanol-acetonitrile-tetrahydrofuran-1% glacial acetic acid (6 : 1.5 : 18.5 : 74, v/v/v/v). The calibration curves were linear over the range of 2.5-500, 0.2-25 and 0.25-12.5 µg mL⁻¹ for VG, VR and HP, respectively. The method was reproducible and reliable, with relative standard deviations of the intra- and inter-day precision between 1.2% and 13.2% for the analysis of the three analytes. The validated HPLC method herein described was successfully applied to the pharmacokinetic study of VG, VR and HP after oral administration of HLE to rats over the dose range of 2.5-10 mL kg⁻¹.
Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.
Rubert, Josep; Zachariasova, Milena; Hajslova, Jana
2015-01-01
Food authenticity has become a necessity for global food policies, since food placed on the market must be authentic. Authentication has always been a challenge: in the past, minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods allow food fingerprints to be acquired. These methods have also been combined with chemometrics, which uses statistical tools to verify food and to extract maximum information from chemical data. Such sophisticated approaches, based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has seen significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and therefore a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent metabolomics approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of HRMS analytical platforms combined with chemometrics.
Sang, Jun; Sang, Jie; Ma, Qun; Hou, Xiao-Fang; Li, Cui-Qin
2017-03-01
This study aimed to extract and identify anthocyanins from Nitraria tangutorun Bobr. seed meal and to establish a green analytical method for anthocyanins. Ultrasound-assisted extraction of anthocyanins from N. tangutorun seed meal was optimized using response surface methodology. Extraction at 70°C for 32.73 min using 51.15% ethanol rendered an extract with 65.04 mg/100 g of anthocyanins and 947.39 mg/100 g of polyphenols. An in vitro antioxidant assay showed that the extract exhibited a potent DPPH radical-scavenging capacity. Eight anthocyanins in N. tangutorun seed meal were identified by HPLC-MS, and the main anthocyanin was cyanidin-3-O-(trans-p-coumaroyl)-diglucoside (18.17 mg/100 g). A green HPLC-DAD method was developed to analyse the anthocyanins. A mixture of ethanol and a 5% (v/v) formic acid aqueous solution at a 20:80 (v/v) ratio was used as the optimized mobile phase. The method was accurate, stable and reliable and could be used to investigate anthocyanins from N. tangutorun seed meal. Copyright © 2016 Elsevier Ltd. All rights reserved.
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, J.; Tolson, B.
2017-12-01
The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques that determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive for large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter case enables the checking of already processed sensitivity indexes. To demonstrate the independence of the convergence test from the SA method, we applied it to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA.
The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that no further model evaluations are required, which enables the checking of already processed sensitivity results. This is one step towards reliable, transferable, published sensitivity results.
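As a minimal illustration of one of the screening methods cited above (the Morris Elementary Effects method; this sketch is not the MVA convergence test itself), the screening measure mu*, the mean absolute elementary effect per parameter, can be computed with one-at-a-time perturbations. The toy model and sampling settings are illustrative assumptions.

```python
import numpy as np

def morris_mu_star(f, n_params, n_traj=50, delta=0.1, seed=0):
    """One-at-a-time elementary effects (Morris screening), minimal sketch.

    Returns mu*, the mean absolute elementary effect, per parameter;
    larger mu* indicates a more influential parameter."""
    rng = np.random.default_rng(seed)
    ee = np.zeros((n_traj, n_params))
    for t in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, n_params)  # base point in unit cube
        fx = f(x)
        for i in range(n_params):
            xp = x.copy()
            xp[i] += delta                       # perturb one input at a time
            ee[t, i] = (f(xp) - fx) / delta      # elementary effect
    return np.abs(ee).mean(axis=0)

# toy model with known importance ranking: x1 > x0 > x2
f = lambda x: x[0] + 2.0 * x[1] + 0.0 * x[2]
mu = morris_mu_star(f, 3)
# for this linear model the elementary effects equal the coefficients,
# so mu is approximately [1, 2, 0]
```

For a linear test function the elementary effects are exact, which is why such analytical benchmarks are convenient for checking SA implementations.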
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, and that rosmarinic acid, with about 79.8% of its priority, is the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to comprehensively consider the benefits and risks of the alternatives. It is an effective and practical tool for the optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
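The core AHP computation, deriving priority weights as the principal eigenvector of a pairwise-comparison matrix and checking its consistency, can be sketched as follows. The 3x3 comparison matrix is a hypothetical example, not the six-criterion matrix used in the study.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix.

    Weights are the principal right eigenvector, normalized to sum to 1;
    the consistency index CI = (lambda_max - n) / (n - 1) flags
    inconsistent judgments (small CI means consistent)."""
    A = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)              # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)     # consistency index
    return w, ci

# hypothetical 3-criterion matrix on Saaty's 1-9 scale:
# criterion 0 moderately outranks 1, strongly outranks 2
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w, ci = ahp_priorities(A)
# w sums to 1 and the first criterion receives the largest weight
```

The same eigenvector step, applied to each criterion's comparison matrix and aggregated by the criterion weights, yields the overall priorities of the candidate reference standards.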
Humbert, H; Machinal, C; Labaye, Ivan; Schrotter, J C
2011-01-01
The determination of the virus retention capabilities of UF units during operation is essential for the operators of drinking water treatment facilities in order to guarantee efficient and stable removal of viruses over time. In previous studies, an effective method (MS2-phage challenge tests) was developed by the Water Research Center of Veolia Environnement for measuring the virus retention rates (Log Removal Value, LRV) of commercially available hollow fiber membranes at lab scale. In the present work, the protocol for monitoring membrane performance was transferred from lab scale to pilot scale. Membrane performance was evaluated during a pilot trial and compared to the results obtained at lab scale with fibers taken from the pilot plant modules. The PFU culture method was compared to the RT-PCR method for the calculation of LRV in both cases. Preliminary tests at lab scale showed that both methods can be used interchangeably. For tests conducted on virgin membrane, good consistency was observed between lab- and pilot-scale results with the two analytical methods used. This work intends to show that a reliable determination of membrane performance based on the RT-PCR analytical method can be achieved during the operation of UF units.
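The LRV reported in such challenge tests is conventionally the base-10 logarithm of the feed-to-permeate concentration ratio; a minimal sketch (the PFU counts below are illustrative, not from the study):

```python
import math

def log_removal_value(feed_conc, permeate_conc):
    """Virus log removal value: LRV = log10(feed / permeate).

    Concentrations must share the same units (e.g. PFU/mL or
    genome copies/mL from RT-PCR)."""
    return math.log10(feed_conc / permeate_conc)

# e.g. a challenge feed of 1e6 PFU/mL reduced to 10 PFU/mL in the
# permeate corresponds to LRV = 5
lrv = log_removal_value(1e6, 10)
```

Because the formula only needs a concentration ratio, it applies unchanged whether the concentrations come from PFU culture counts or from RT-PCR quantification.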
Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.
Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F
2016-01-01
Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process: it is a major determinant of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, or inappropriate mixing of a sample. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. These errors can also have clinical consequences and a significant impact on patient care, especially for specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability.
The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence factors. This review is a summary of the most important recommendations regarding the importance of pre-analytical factors for coagulation testing and should be a tool to increase awareness about the importance of pre-analytical factors for coagulation testing.
Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W
2016-01-01
A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
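One simple way to rebalance imbalanced cohorts, as step (i) of the protocol above describes, is to upsample the minority class with replacement before classification. This sketch uses a synthetic cohort and is only one of several possible rebalancing strategies, not necessarily the procedure the authors used on the PPMI data.

```python
import numpy as np

def rebalance_upsample(X, y, seed=0):
    """Equalize cohort sizes by resampling every class, with replacement,
    up to the size of the largest class (a simple rebalancing strategy)."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    Xs, ys = [], []
    for c in classes:
        idx = np.flatnonzero(y == c)
        take = rng.choice(idx, size=n_max, replace=True)
        Xs.append(X[take])
        ys.append(y[take])
    return np.vstack(Xs), np.concatenate(ys)

# imbalanced toy cohort: 90 controls (label 0), 10 cases (label 1)
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
y = np.array([0] * 90 + [1] * 10)
Xb, yb = rebalance_upsample(X, y)
# both classes now contribute 90 samples each
```

After rebalancing, the equalized cohort can be fed to any classifier (e.g. boosting or support vector machines, as in the study) and evaluated with n-fold cross-validation.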
Mousavi, Seyed Mohammad Hadi; Dargahi, Hossein; Mohammadi, Sara
2016-10-01
Creating a safe health care system requires the establishment of High Reliability Organizations (HROs), which reduce errors and increase the level of safety in hospitals. This model focuses on improving reliability through better process design, building a culture of accreditation, and leveraging human factors. The present study intends to determine the readiness of the hospitals of Tehran University of Medical Sciences for the establishment of the HROs model, from the viewpoint of the managers of these hospitals. This is a descriptive-analytical study carried out in 2013-2014. The research population consists of 105 senior and middle managers of 15 hospitals of Tehran University of Medical Sciences. The data collection tool was a 55-question researcher-made questionnaire covering six elements of HROs to assess the level of readiness for establishing the HROs model from the managers' point of view. The validity of the questionnaire was established through the content validity method using 10 experts in the area of hospital accreditation, and its reliability was assessed through the test-retest method, with a correlation coefficient of 0.90. The response rate was 90 percent. A Likert scale was used for the questions, and data analysis was conducted with SPSS version 21. Descriptive statistics were presented via tables, normal distributions of data, and means. Analytical methods, including the t-test, Mann-Whitney, Spearman, and Kruskal-Wallis tests, were used for inferential statistics. The study showed that, from the viewpoint of their senior and middle managers, the hospitals considered in this study are indeed ready for acceptance and establishment of the HROs model. A significant relationship was shown between the HROs model and its elements and the demographic details of managers, such as their age, work experience, management experience, and level of management.
Although the studied hospitals, as viewed by their managers, are capable of attaining the goals of HROs, there appear to be many challenges along the way. It is therefore suggested that a detailed audit of the hospitals' current status regarding the different characteristics of HROs be conducted, and that workshops be held for medical and non-medical employees and managers of hospitals as an influencing factor; a subsequent re-assessment process can then help move the hospitals from their current position towards an HROs culture.
Kylander, M E; Weiss, D J; Jeffries, T E; Kober, B; Dolgopolova, A; Garcia-Sanchez, R; Coles, B J
2007-01-16
An analytical protocol for rapid and reliable laser ablation-quadrupole (LA-Q)- and multi-collector (MC-) inductively coupled plasma-mass spectrometry (ICP-MS) analysis of Pb isotope ratios (²⁰⁷Pb/²⁰⁶Pb and ²⁰⁸Pb/²⁰⁶Pb) in peats and lichens is developed. This technique is applicable to source tracing atmospheric Pb deposition in biomonitoring studies and sample screening. Reference materials and environmental samples were dry ashed and pressed into pellets for introduction by laser ablation. No binder was used, to reduce contamination. LA-MC-ICP-MS internal and external precisions were <1.1% and <0.3%, respectively, on both ²⁰⁷Pb/²⁰⁶Pb and ²⁰⁸Pb/²⁰⁶Pb ratios. LA-Q-ICP-MS internal precisions on ²⁰⁷Pb/²⁰⁶Pb and ²⁰⁸Pb/²⁰⁶Pb ratios were lower, with values for the different sample sets <14.3%, while external precisions were <2.9%. The level of external precision acquired in this study is high enough to distinguish between most modern Pb sources. LA-MC-ICP-MS measurements differed from thermal ionisation mass spectrometry (TIMS) values by 1% or less, while the accuracy obtained using LA-Q-ICP-MS compared to solution MC-ICP-MS was 3.1% or better using a run bracketing (RB) mass bias correction method. Sample heterogeneity and detector switching when measuring ²⁰⁸Pb by Q-ICP-MS are identified as sources of reduced analytical performance.
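A run-bracketing (RB) mass-bias correction of the kind referred to above typically scales each sample ratio by the bias observed on bracketing measurements of a standard with a known ratio. A minimal sketch with hypothetical numbers follows; the study's exact correction scheme may differ in detail.

```python
def bracket_correct(sample_ratio, std_before, std_after, std_true):
    """Standard-sample (run) bracketing mass-bias correction.

    The instrumental bias is estimated as the mean measured ratio of the
    bracketing standard divided by its known (true) ratio; the sample
    ratio is then divided by that bias."""
    bias = ((std_before + std_after) / 2.0) / std_true
    return sample_ratio / bias

# hypothetical numbers: the standard reads about 1% high, so the
# measured sample ratio is scaled down accordingly
corrected = bracket_correct(0.8600, 0.8683, 0.8683, 0.8597)
```

Bracketing before and after each sample also absorbs slow instrumental drift, since the bias estimate is local in time to the sample measurement.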
Pilot testing of SHRP 2 reliability data and analytical products: Minnesota. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
The objective of this project was to develop system designs for programs to monitor travel time reliability and to prepare a guidebook that practitioners and others can use to design, build, operate, and maintain such systems. Generally, such travel ...
Meta-Analysis of Coefficient Alpha
ERIC Educational Resources Information Center
Rodriguez, Michael C.; Maeda, Yukiko
2006-01-01
The meta-analysis of coefficient alpha across many studies is becoming more common in psychology by a methodology labeled reliability generalization. Existing reliability generalization studies have not used the sampling distribution of coefficient alpha for precision weighting and other common meta-analytic procedures. A framework is provided for…
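Coefficient alpha itself, the quantity being meta-analyzed above, is straightforward to compute from an item-score matrix; a minimal sketch with synthetic data (the variable names and example scores are illustrative):

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    X = np.asarray(scores, float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# three perfectly parallel items: every item is the same score vector,
# so internal consistency is maximal and alpha = 1
z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
perfect = np.column_stack([z, z, z])
alpha = cronbach_alpha(perfect)
```

A reliability-generalization meta-analysis would pool such alpha estimates across studies, which is where a sampling distribution for alpha becomes necessary for precision weighting.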
Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka
2018-01-01
Background: Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases. These markers play important roles in redox regulation throughout the body. Despite the recognition of these markers in various pathophysiologic conditions, measurements of human mercaptalbumin and non-mercaptalbumin have not been popular because of the technical complexity and long measurement time of conventional methods. Methods: Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated using performance tests as well as measurements of various patients' serum samples. Results: We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions: A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool enabling the expansion of clinical usefulness and the elucidation of the roles of albumin in redox reactions throughout the human body.
UPLC-MS/MS determination of ptaquiloside and pterosin B in preserved natural water.
Clauson-Kaas, Frederik; Hansen, Hans Christian Bruun; Strobel, Bjarne W
2016-11-01
The naturally occurring carcinogen ptaquiloside (PTA) and its degradation product pterosin B are found in water leaching from bracken stands. The objective of this work is to present a new sample preservation method and a fast UPLC-MS/MS method for quantification of ptaquiloside and pterosin B in environmental water samples, employing a novel internal standard. A faster, reliable, and efficient method was developed for isolation of high-purity ptaquiloside and pterosin B from plant material for use as analytical standards, with purity verified by ¹H-NMR. The chemical analysis was performed by cleanup and preconcentration of samples with solid phase extraction, before analyte quantification with UPLC-MS/MS. By including gradient elution and optimizing the liquid chromatography mobile phase buffer system, a total run cycle of 5 min was achieved, with method detection limits, including preconcentration, of 8 and 4 ng/L for ptaquiloside and pterosin B, respectively. The use of loganin as internal standard improved the repeatability of the determination of both analytes, though it could not be employed for sample preparation. Buffering raw water samples in situ with ammonium acetate to pH ∼5.5 decisively increased sample integrity under realistic transportation and storage conditions prior to extraction. Groundwater samples collected in November 2015 at the shallow water table below a Danish bracken stand were preserved and analyzed using the above methods, and PTA concentrations of 3.8 ± 0.24 μg/L (±sd, n = 3) were found, much higher than previously reported. Graphical abstract: Workflow overview of ptaquiloside determination.
Tang, Wenfu; Wan, Meihua; Zhu, Zhengyan; Chen, Guanyuan; Huang, Xi
2008-01-01
Background: Dachengqi Tang (DT) is a common traditional Chinese medicine formula for expelling neire ('internal heat') in the stomach and intestines. No reliable analytical method was previously available for the quality control of DT. Methods: A high-performance liquid chromatography (HPLC) method with a reverse phase C18 column (150 × 4.6 mm) was developed. The mobile phase was methanol with 0.2% acetic acid. Eight markers including naringin, hesperidin, aloe emodin, rhein, honokiol, magnolol, emodin and chrysophanol were determined. Results: Regression analysis revealed a linear relationship between the concentrations of the markers and the peak area ratio of the standards and internal standard. The limit of detection (S/N = 3) and the limit of quantification (RSD < 20%) ranged from 0.21 to 0.43 ng/μl and 0.76 to 1.74 ng/μl, respectively. The recovery was between 95.6% and 103.4%. Tests on samples from three batches of DT showed that the profiles of the markers did not vary significantly among batches. Conclusion: A reliable HPLC method for simultaneous determination of the eight markers in DT was developed. PMID:18445276
Advanced DNA- and Protein-based Methods for the Detection and Investigation of Food Allergens.
Prado, M; Ortea, I; Vial, S; Rivas, J; Calo-Mata, P; Barros-Velázquez, J
2016-11-17
Currently, food allergies are an important health concern worldwide. The presence of undeclared allergenic ingredients or the presence of traces of allergens due to contamination during food processing poses a great health risk to sensitized individuals. Therefore, reliable analytical methods are required to detect and identify allergenic ingredients in food products. The present review addresses the recent developments regarding the application of DNA- and protein-based methods for the detection of allergenic ingredients in foods. The fitness-for-purpose of reviewed methodology will be discussed, and future trends will be highlighted. Special attention will be given to the evaluation of the potential of newly developed and promising technologies that can improve the detection and identification of allergenic ingredients in foods, such as the use of biosensors and/or nanomaterials to improve detection limits, specificity, ease of use, or to reduce the time of analysis. Such rapid food allergen test methods are required to facilitate the reliable detection of allergenic ingredients by control laboratories, to give the food industry the means to easily determine whether its product has been subjected to cross-contamination and, simultaneously, to identify how and when this cross-contamination occurred.
Quantitation of Permethylated N-Glycans through Multiple-Reaction Monitoring (MRM) LC-MS/MS
Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L.; Mechref, Yehia
2015-01-01
The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for N-glycan structures that are both sialylated and fucosylated was determined to be 30%, while it was found to be 35% for structures that are either fucosylated or sialylated. The optimum CE for mannose and complex type N-glycan structures was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan structures in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these structures was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans down to the hundredth-of-a-μL level can be reliably quantified in pooled human blood serum, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycan structures enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples. PMID:25698222
Lein, Sabine; Van Boven, Maurits; Holser, Ron; Decuypere, Eddy; Flo, Gerda; Lievens, Sylvia; Cokelaere, Marnix
2002-11-22
Separate methods for the analysis of soluble carbohydrates in different plants and of simmondsins in jojoba seed meal are described. A reliable gas chromatographic procedure has been developed for the simultaneous quantification of D-pinitol, myo-inositol, sucrose, 5-alpha-D-galactopyranosyl-D-pinitol, 2-alpha-D-galactopyranosyl-D-pinitol, simmondsin, 4-demethylsimmondsin, 5-demethylsimmondsin and 4,5-didemethylsimmondsin as trimethylsilyl derivatives in jojoba seed meal. The study of different extraction mixtures allowed for the quantitative recovery of the 9 analytes by a mixture of methanol-water (80:20, v/v) in the concentration range between 0.1 and 4%. Comparison of the separation parameters on three different capillary stationary phases with MS detection allowed for the choice of the optimal gas chromatographic conditions for baseline separation of the analytes.
A hybrid approach to near-optimal launch vehicle guidance
NASA Technical Reports Server (NTRS)
Leung, Martin S. K.; Calise, Anthony J.
1992-01-01
This paper evaluates a proposed hybrid analytical/numerical approach to launch-vehicle guidance for ascent to orbit injection. The feedback-guidance approach is based on a piecewise nearly analytic zero-order solution evaluated using a collocation method. The zero-order solution is then improved through a regular perturbation analysis, wherein the neglected dynamics are corrected in the first-order term. For real-time implementation, the guidance approach requires solving a set of small-dimension nonlinear algebraic equations and performing quadrature. Assessment of performance and reliability is carried out through closed-loop simulation for a vertically launched two-stage heavy-lift vehicle ascending to a low Earth orbit. The solutions are compared with optimal solutions generated from a multiple shooting code. In the example, the guidance approach delivers over 99.9 percent of optimal performance and terminal constraint accuracy.
Relational, Structural, and Semantic Analysis of Graphical Representations and Concept Maps
ERIC Educational Resources Information Center
Ifenthaler, Dirk
2010-01-01
The demand for good instructional environments presupposes valid and reliable analytical instruments for educational research. This paper introduces the "SMD Technology" (Surface, Matching, Deep Structure), which measures relational, structural, and semantic levels of graphical representations and concept maps. The reliability and validity of the…
[Construction of a psychological aging scale for healthy people].
Lin, Fei; Long, Yao; Zeng, Ni; Wu, Lei; Huang, Helang
2017-04-28
To construct a psychological aging scale and to provide a tool and indexes for the scientific evaluation of aging. Methods: Age-related psychological items were collected through literature screening and expert interviews. The importance, feasibility and degree of authority of the psychological index system were graded in two rounds of the Delphi method. Using the analytic hierarchy process, the weights of dimensions and items were determined. Internal consistency reliability, correlation and exploratory factor analyses were performed to evaluate the reliability and validity of the scale. Results: In two rounds of the Delphi method, 17 experts offered the following results: the coefficient of expert authority was 0.88±0.06, and the coordination coefficients for importance and feasibility in the second round were 0.456 (P<0.01) and 0.666 (P<0.01), respectively; consistency was good. The psychological aging scale for healthy people included 4 dimensions: cognitive function, emotion, personality and motivation. The weight coefficients for the 4 dimensions were 0.338, 0.250, 0.166 and 0.258, respectively. The Cronbach's α coefficient for the scale was 0.822, the reliability was 0.817, the content validity index (CVI) was 0.847, and the cumulative contribution rate of the 5 factors was 51.42%. Conclusion: The psychological aging scale is satisfactory and can provide a reference for the evaluation of aging. The indicators were representative and well recognized.
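The internal-consistency figure reported above (Cronbach's α = 0.822) follows from a standard formula over the item-score matrix. A minimal sketch, using illustrative data rather than the study's actual scores:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items (identical columns) give alpha = 1.0
scores = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
print(round(cronbach_alpha(scores), 3))  # 1.0
```

Real scale data would of course give intermediate values; α rises as the items co-vary more strongly relative to their individual variances.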
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.
Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment. As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 μmol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GC×GC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.
Learning style preferences of nursing students at two universities in Iran and Malaysia
Abdollahimohammad, Abdolghani; Ja’afar, Rogayah
2014-01-01
Purpose: Learning style preferences vary within the nursing field and there is no consensus on a predominant learning style preference among nursing students. The current study compared the learning style preferences of nursing students at two universities in Iran and Malaysia. Methods: A purposive sampling method was used to collect data from the two study populations. Data were collected using the Learning Style Scale (LSS), which is a valid and reliable inventory. The LSS consists of 22 items with five subscales: perceptive, solitary, analytic, imaginative, and competitive. The questionnaires were distributed at the end of the academic year during regular class time for optimum response. The Mann-Whitney U-test was used to compare the learning style preferences between the two study populations. Results: A significant difference was found in perceptive, solitary, and analytic learning styles between the two groups of nursing students. However, there was no significant difference in imaginative and competitive learning styles between the two groups. Most of the students were in the middle range of the learning styles. Conclusion: There were similarities and differences in learning style preferences between Zabol Medical Sciences University (ZBMU) and Universiti Sains Malaysia (USM) nursing students. The USM nursing students were more sociable and analytic learners, whereas the ZBMU nursing students were more solitary and perceptive learners. PMID:25417864
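The Mann-Whitney U-test used for the between-group comparisons reduces to counting pairwise "wins" between the two samples. A minimal pure-Python sketch of the U statistic (ties counted as half; illustrative data only, not the study's scores):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y.

    U counts the (xi, yj) pairs with xi > yj, with ties contributing 1/2.
    Significance would then be read from the U distribution (or a normal
    approximation for larger samples).
    """
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Clearly shifted samples: every x beats every y, so U = 3 * 3 = 9
print(mann_whitney_u([8, 9, 10], [1, 2, 3]))  # 9.0
```

Library implementations (e.g. a SciPy-style `mannwhitneyu`) add tie corrections and p-values, but the core statistic is exactly this pair count.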
Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.; ...
2016-09-27
[Reliability theory based on quality risk network analysis for Chinese medicine injection].
Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui
2014-08-01
A new risk analysis method based on reliability theory is introduced in this paper for quality risk management in Chinese medicine injection manufacturing plants. Risk events, both causes and effects, are represented as nodes in a Bayesian network analysis framework, which transforms the risk analysis results of failure mode and effect analysis (FMEA) onto a Bayesian network platform. Once its structure and parameters are determined, the network can be used to evaluate system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that most strongly influence system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce these influences and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform, and quality assurance actions were further defined to reduce the risk and improve product quality.
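The node-importance idea, ranking each risk node by its effect on overall system reliability, can be illustrated with a deliberately simplified series-system model. The paper's Bayesian network computation is much richer; the node names and probabilities below are hypothetical:

```python
def system_reliability(node_reliabilities):
    """Series-system reliability: the process succeeds only if every node does."""
    r = 1.0
    for p in node_reliabilities:
        r *= p
    return r

def node_importance(nodes):
    """Gain in system reliability if each node were made perfectly reliable."""
    base = system_reliability(nodes.values())
    return {name: system_reliability({**nodes, name: 1.0}.values()) - base
            for name in nodes}

# Hypothetical process steps and reliabilities (illustrative numbers only)
nodes = {"sterilization": 0.95, "filling": 0.99, "inspection": 0.90}
imp = node_importance(nodes)
print(max(imp, key=imp.get))  # the least reliable node matters most: inspection
```

In a real Bayesian network the nodes are not independent and importance is computed by probabilistic inference, but the "perturb one node, re-evaluate the system" logic is the same.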
On the analytic and numeric optimisation of airplane trajectories under real atmospheric conditions
NASA Astrophysics Data System (ADS)
Gonzalo, J.; Domínguez, D.; López, D.
2014-12-01
From the beginning of the aviation era, economic constraints have forced operators to continuously improve flight planning. Revenue depends on the cost per flight and on airspace occupancy. Many methods, the first dating from the middle of the last century, have explored analytical, numerical and artificial intelligence resources to reach an optimal flight plan. In parallel, advances in meteorology and communications allow almost real-time knowledge of atmospheric conditions and a reliable, error-bounded forecast for the near future. Thus, apart from avoiding weather risks, airplanes can dynamically adapt their trajectories to minimise their costs. International regulators are aware of these capabilities, so it is reasonable to envisage changes that would soon make such dynamic planning negotiation operational. Moreover, current unmanned airplanes, very popular and often small, suffer the impact of winds and other weather conditions in the form of dramatic changes in their performance. The present paper reviews analytic and numeric solutions for typical trajectory planning problems. Analytic methods attempt to solve the problem using the Pontryagin principle, where influence parameters are added to the state variables to form a split boundary-condition differential equation problem; the system can then be solved numerically (indirect optimisation) or using parameterised functions (direct optimisation). Numerical methods, on the other hand, are based on Bellman's dynamic programming (or Dijkstra's algorithm), which exploits the fact that two optimal trajectories can be concatenated into a new optimal trajectory if their joint point belongs to the final optimal solution. There are no a priori conditions for the best method: traditionally, analytic methods have been employed more for continuous problems and numeric methods for discrete ones.
In the current problem, airplane behaviour is defined by continuous equations, while wind fields are given on a discrete grid at certain time intervals. The research demonstrates the advantages and disadvantages of each method, as well as performance figures for the solutions found under typical flight conditions in static and dynamic atmospheres. This provides significant parameters for selecting solvers for optimal trajectories.
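The dynamic-programming family the authors describe (Bellman/Dijkstra) can be sketched on a discrete grid where each cell's traversal cost stands in for wind-adjusted fuel burn. The grid values below are illustrative, not from the paper:

```python
import heapq

def dijkstra_grid(cost, start, goal):
    """Least-cost path on a grid; cost[r][c] is the cost of entering cell (r, c),
    e.g. fuel burn adjusted for the local wind field."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# A strong headwind (high cost) in the first column forces the path around it
grid = [[1, 1, 1],
        [9, 9, 1],
        [1, 1, 1]]
print(dijkstra_grid(grid, (0, 0), (2, 2)))  # 4.0
```

The concatenation property in the abstract is exactly why this works: once a cell's least cost is settled, any optimal route through it reuses that prefix.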
Optimization of the Determination Method for Dissolved Cyanobacterial Toxin BMAA in Natural Water.
Yan, Boyin; Liu, Zhiquan; Huang, Rui; Xu, Yongpeng; Liu, Dongmei; Lin, Tsair-Fuh; Cui, Fuyi
2017-10-17
There is serious dispute about the existence of β-N-methylamino-L-alanine (BMAA) in water, a neurotoxin that may cause amyotrophic lateral sclerosis/parkinsonism-dementia complex (ALS/PDC) and Alzheimer's disease. It is believed that a reliable and sensitive analytical method for the determination of BMAA is urgently required to resolve this dispute. In the present study, the solid phase extraction (SPE) procedure and the analytical method for dissolved BMAA in water were investigated and optimized. The results showed that both derivatized and underivatized methods were suitable for the measurement of BMAA and its isomer in natural water, and the limits of detection and the precision of the two methods were comparable. Cartridge characteristics and SPE conditions could greatly affect SPE performance, and competition from natural organic matter is the primary factor causing the low recovery of BMAA, which was reduced from approximately 90% in pure water to 38.11% in natural water. The optimized SPE method for BMAA combined rinsed SPE cartridges, controlled loading/elution rates and elution solution, evaporation at 55 °C, reconstitution in a solution mixture, and filtration through a polyvinylidene fluoride membrane. This optimized method achieved >88% recovery of BMAA in both algal solution and river water. The developed method provides an efficient way to evaluate the actual concentration levels of BMAA in real water environments and drinking water systems.
Gao, Hui; Yang, Minli; Wang, Minglin; Zhao, Yansheng; Cao, Ya; Chu, Xiaogang
2013-01-01
A method combining SPE with HPLC/electrospray ionization-MS/MS was developed for simultaneous determination of 30 synthetic food additives, including synthetic colorants, preservatives, and sweeteners in soft drinks. All targets were efficiently separated using the optimized chromatographic and MS conditions and parameters in a single run within 18 min. The LOD of the analytes ranged from 0.01 to 20 microg/kg, and the method was validated with recoveries in the 80.8 to 106.4% range. This multisynthetic additive method was found to be accurate and reliable and will be useful to ensure the safety of food products, such as the labeling and proper use of synthetic food additives in soft drinks.
Parkinson, I S; Ward, M K; Kerr, D N
1982-10-27
A simple but reliable method for the routine determination of aluminium in serum and water by flameless atomic absorption spectrometry is described. No preparatory procedures are required for water samples, although serum is mixed with a wetting agent (Triton X-100) to allow complete combustion of the samples and to improve analytical precision. Precautions to prevent contamination during sample handling are discussed and instrumental parameters are defined. The method has a sensitivity of 35.5 pg and detection limits of 2.3 micrograms Al/l for serum and 1.3 micrograms Al/l for water. The method was used to determine the aluminium concentration in the serum of 46 normal subjects. The mean aluminium content was 7.3 micrograms/l (range 2-15 micrograms/l).
NASA Astrophysics Data System (ADS)
Şenol, Mehmet; Alquran, Marwan; Kasmaei, Hamed Daei
2018-06-01
In this paper, we present analytic-approximate solution of time-fractional Zakharov-Kuznetsov equation. This model demonstrates the behavior of weakly nonlinear ion acoustic waves in a plasma bearing cold ions and hot isothermal electrons in the presence of a uniform magnetic field. Basic definitions of fractional derivatives are described in the Caputo sense. Perturbation-iteration algorithm (PIA) and residual power series method (RPSM) are applied to solve this equation with success. The convergence analysis is also presented for both methods. Numerical results are given and then they are compared with the exact solutions. Comparison of the results reveal that both methods are competitive, powerful, reliable, simple to use and ready to apply to wide range of fractional partial differential equations.
Gu, Jifeng; Wu, Weijun; Huang, Mengwei; Long, Fen; Liu, Xinhua; Zhu, Yizhun
2018-04-11
A method for high-performance liquid chromatography coupled with linear ion trap quadrupole Orbitrap high-resolution mass spectrometry (HPLC-LTQ-Orbitrap MS) was developed and validated for the qualitative and quantitative assessment of Shejin-liyan Granule. According to the fragmentation mechanisms and high-resolution MS data, 54 compounds, including fourteen isoflavones, eleven lignans, eight flavonoids, six physalins, six organic acids, four triterpenoid saponins, two xanthones, two alkaloids, and one licorice coumarin, were identified or tentatively characterized. In addition, ten of the representative compounds (matrine, galuteolin, tectoridin, iridin, arctiin, tectorigenin, glycyrrhizic acid, irigenin, arctigenin, and irisflorentin) were quantified using the validated HPLC-LTQ-Orbitrap MS method. The method validation showed good linearity, with coefficients of determination (r²) above 0.9914 for all analytes. The accuracy of the intra- and inter-day variation of the investigated compounds was 95.0-105.0%, and the precision values were less than 4.89%. The mean recoveries and reproducibilities of each analyte were 95.1-104.8%, with relative standard deviations below 4.91%. The method successfully quantified the ten compounds in Shejin-liyan Granule, and the results show that the method is accurate, sensitive, and reliable.
Grandin, Flore; Picard-Hagen, Nicole; Gayrard, Véronique; Puel, Sylvie; Viguié, Catherine; Toutain, Pierre-Louis; Debrauwer, Laurent; Lacroix, Marlène Z
2017-12-01
Regulatory measures and public concerns regarding bisphenol A (BPA) have led to its replacement by structural analogues, such as bisphenol S (BPS), in consumer products. At present, no toxicokinetic investigations have been conducted to assess the factors determining human internal exposure to BPS for subsequent risk assessment. Toxicokinetic studies require reliable analytical methods to measure the plasma concentrations of BPS and its main conjugated metabolite, BPS-glucuronide (BPS-G). An efficient on-line SPE-UPLC-MS/MS method for the simultaneous quantification of BPS and BPS-G in ovine plasma was therefore developed and validated in accordance with the European Medicines Agency guidelines for bioanalytical method validation. This method has a limit of quantification of 3 ng mL⁻¹ for BPS and 10 ng mL⁻¹ for BPS-G, an analytical capacity of 200 samples per day, and is particularly well suited to toxicokinetic studies. Use of this method in toxicokinetic studies in sheep showed that BPS, like BPA, is efficiently metabolized into its glucuronide form. However, the clearances and distributions of BPS and BPS-G were lower than those of the corresponding unconjugated and glucuroconjugated forms of BPA. Copyright © 2017 Elsevier B.V. All rights reserved.
Application of multiplex arrays for cytokine and chemokine profiling of bile.
Kemp, Troy J; Castro, Felipe A; Gao, Yu-Tang; Hildesheim, Allan; Nogueira, Leticia; Wang, Bing-Sheng; Sun, Lu; Shelton, Gloriana; Pfeiffer, Ruth M; Hsing, Ann W; Pinto, Ligia A; Koshiol, Jill
2015-05-01
Gallbladder disease is highly related to inflammation, but the inflammatory processes are not well understood. Bile provides a direct substrate in assessing the local inflammatory response that develops in the gallbladder. To assess the reproducibility of measuring inflammatory markers in bile, we designed a methods study of 69 multiplexed immune-related markers measured in bile obtained from gallstone patients. To evaluate assay performance, a total of 18 bile samples were tested twice within the same plate for each analyte, and the 18 bile samples were tested on two different days for each analyte. We used the following performance parameters: detectability, coefficient of variation (CV), intraclass correlation coefficient (ICC), and percent agreement (concordance among replicate measures above and below detection limit). Furthermore, we examined the association of analyte levels with gallstone characteristics such as type, numbers, and size. All but 3 analytes (Stem Cell Factor, SCF; Thrombopoietin, TPO; sIL-1RI) were detectable in bile. 52 of 69 (75.4%) analytes had detectable levels for at least 50% of the subjects tested. The within-plate CVs were ⩽25% for 53 of 66 (80.3%) detectable analytes, and across-plate CVs were ⩽25% for 32 of 66 (48.5%) detectable analytes. Moreover, 64 of 66 (97.0%) analytes had ICC values of at least 0.8. Lastly, the percent agreement was high between replicates for all of the analytes (median; within plate, 97.2%; across plate, 97.2%). In exploratory analyses, we assessed analyte levels by gallstone characteristics and found that levels for several analytes decreased with increasing size of the largest gallstone per patient. Our data suggest that multiplex assays can be used to reliably measure cytokines and chemokines in bile. In addition, gallstone size was inversely related to the levels of select analytes, which may aid in identifying critical pathways and mechanisms associated with the pathogenesis of gallbladder diseases. 
Copyright © 2015 Elsevier Ltd. All rights reserved.
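Two of the performance parameters above, per-sample CV across duplicate wells and percent agreement relative to the detection limit, are straightforward to compute from replicate measurements. A sketch with hypothetical replicate values (not the study's data):

```python
import numpy as np

def replicate_cv_median(rep1, rep2):
    """Median per-sample coefficient of variation (%) across duplicate wells."""
    reps = np.stack([np.asarray(rep1, float), np.asarray(rep2, float)])
    cv = 100.0 * reps.std(axis=0, ddof=1) / reps.mean(axis=0)
    return float(np.median(cv))

def percent_agreement(rep1, rep2, detection_limit):
    """Share of samples whose replicates fall on the same side of the limit."""
    a = np.asarray(rep1) >= detection_limit
    b = np.asarray(rep2) >= detection_limit
    return 100.0 * float(np.mean(a == b))

# Hypothetical duplicate measurements of one analyte in three bile samples
rep1, rep2 = [10.0, 20.0, 0.5], [11.0, 19.0, 0.4]
print(percent_agreement(rep1, rep2, detection_limit=1.0))  # 100.0
```

ICC adds a between-sample variance component on top of this, but CV and agreement alone already flag analytes whose replicates scatter too widely to be trusted.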
Visvanathan, Rizliya; Jayathilake, Chathuni; Liyanage, Ruvini
2016-11-15
For the first time, a reliable, simple, rapid and high-throughput analytical method for the detection and quantification of α-amylase inhibitory activity using the glucose assay kit was developed. The new method facilitates rapid screening of large numbers of samples, reduces labor, time and reagents, and is also suitable for kinetic studies. The method is based on the reaction of maltose with glucose oxidase (GOD) and the development of a red quinone. The test is done in microtitre plates with a total volume of 260 μL and an assay time of 40 min, including the pre-incubation steps. The new method was tested for linearity, sensitivity, precision, reproducibility and applicability, and compared with the most commonly used 3,5-dinitrosalicylic acid (DNSA) method for determining α-amylase activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.
Bergeron, Dominic; Tremblay, A-M S
2016-08-01
Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.
2016-01-01
Background: A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings: Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data.
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson’s disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Conclusions: Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson’s disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer’s, Huntington’s, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications. PMID:27494614
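The cohort-rebalancing step and the model-free classification can be illustrated with a toy stand-in: minority-class upsampling followed by a nearest-centroid classifier. The PPMI pipeline used far richer models (adaptive boosting, SVMs), and the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

def upsample_minority(X, y):
    """Rebalance a binary cohort by resampling the minority class with replacement."""
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[np.argmin(counts)]
    n_extra = counts.max() - counts.min()
    idx = np.flatnonzero(y == minority)
    extra = rng.choice(idx, size=n_extra, replace=True)
    return np.vstack([X, X[extra]]), np.concatenate([y, y[extra]])

def nearest_centroid_predict(X_train, y_train, X_test):
    """Toy stand-in for the paper's classifiers: assign the closest class centroid."""
    centroids = {c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}
    return np.array([min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
                     for x in X_test])

# Synthetic imbalanced cohort: three controls, one case
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [5.0, 5.0]])
y = np.array([0, 0, 0, 1])
Xb, yb = upsample_minority(X, y)
print(np.bincount(yb))  # [3 3]
print(nearest_centroid_predict(Xb, yb, np.array([[0.2, 0.2], [4.8, 4.9]])))  # [0 1]
```

Without rebalancing, the centroid of the rare class is estimated from very few points and majority-vote baselines look deceptively accurate, which is the failure mode the rebalancing step addresses.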
He, Xiao-Mei; Ding, Jun; Yu, Lei; Hussain, Dilshad; Feng, Yu-Qi
2016-09-01
Quantitative analysis of small molecules by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has been a challenging task due to matrix-derived interferences in the low m/z region and poor reproducibility of the MS signal response. In this study, we developed an approach applying black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix for the quantitative analysis of small molecules for the first time. Black phosphorus-assisted laser desorption/ionization mass spectrometry (BP/ALDI-MS) showed a clear background and exhibited superior detection sensitivity toward quaternary ammonium compounds compared to carbon-based materials. By combining a stable isotope labeling (SIL) strategy with BP/ALDI-MS (SIL-BP/ALDI-MS), a variety of analytes labeled with quaternary ammonium groups were sensitively detected. Moreover, the isotope-labeled forms of the analytes also served as internal standards, which broadened the analyte coverage of BP/ALDI-MS and improved the reproducibility of MS signals. Based on these advantages, a reliable method for quantitative analysis of aldehydes from complex biological samples (saliva, urine, and serum) was successfully established. Good linearities were obtained for five aldehydes in the range of 0.1-20.0 μM with correlation coefficients (R²) larger than 0.9928. The LODs were found to be 20 to 100 nM. Reproducibility of the method was confirmed with intra-day and inter-day relative standard deviations (RSDs) less than 10.4%, and the recoveries in saliva samples ranged from 91.4 to 117.1%. Taken together, the proposed SIL-BP/ALDI-MS strategy has proved to be a reliable tool for quantitative analysis of aldehydes from complex samples. Graphical Abstract: An approach for the determination of small molecules was developed by using black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix.
Zeleny, Reinhard; Harbeck, Stefan; Schimmel, Heinz
2009-01-09
A liquid chromatography-electrospray ionisation tandem mass spectrometry method for the simultaneous detection and quantitation of 5-nitroimidazole veterinary drugs in lyophilised pork meat, the chosen format of a candidate certified reference material, has been developed and validated. Six analytes were included in the scope of validation: dimetridazole (DMZ), metronidazole (MNZ), ronidazole (RNZ), hydroxymetronidazole (MNZOH), hydroxyipronidazole (IPZOH), and 2-hydroxymethyl-1-methyl-5-nitroimidazole (HMMNI). The analytes were extracted from the sample with ethyl acetate, chromatographically separated on a C18 column, and finally identified and quantified by tandem mass spectrometry in multiple reaction monitoring (MRM) mode using matrix-matched calibration and ²H₃-labelled analogues of the analytes (except for MNZOH, where [²H₃]MNZ was used). The method was validated in accordance with Commission Decision 2002/657/EC by determining selectivity, linearity, matrix effect, apparent recovery, repeatability and intermediate precision, decision limits and detection capabilities, robustness of the sample preparation method, and stability of extracts. Recovery at the 1 microg/kg level was around 100% (estimates in the range of 101-107%) for all analytes; repeatabilities and intermediate precisions at this level were in the range of 4-12% and 2-9%, respectively. Linearity of calibration curves in the working range 0.5-10 microg/kg was confirmed, with r values typically >0.99. Decision limits (CCα) and detection capabilities (CCβ) according to ISO 11843-2 (calibration curve approach) were 0.29-0.44 and 0.36-0.54 microg/kg, respectively.
The method reliably identifies and quantifies the selected nitroimidazoles in the reconstituted pork meat in the low and sub-microg/kg range and will be applied in an interlaboratory comparison for determining the mass fraction of the selected nitroimidazoles in the candidate reference material currently developed at IRMM.
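The ISO 11843-2 "calibration curve" approach to decision limits cited above can be sketched numerically. The snippet below is a minimal illustration of the Commission Decision 2002/657/EC convention for banned substances (CCα from the calibration line plus 2.33 residual standard deviations, CCβ adding a further 1.64); it is not the authors' validated implementation, and the function name is ours:

```python
import numpy as np

def decision_limits(conc, signal):
    """Sketch of the ISO 11843-2 calibration-curve approach:
    CCalpha from 2.33 residual standard deviations over the slope
    (alpha = 1 %), CCbeta adding a further 1.64 (beta = 5 %)."""
    b, a = np.polyfit(conc, signal, 1)               # slope, intercept
    resid = signal - (a + b * conc)
    s = np.sqrt(np.sum(resid**2) / (len(conc) - 2))  # residual std dev
    cc_alpha = 2.33 * s / b
    cc_beta = cc_alpha + 1.64 * s / b
    return cc_alpha, cc_beta
```

With a well-behaved calibration set, CCβ always exceeds CCα by the fixed factor 1.64/2.33 of CCα, mirroring the ordering of the values reported above.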
Kim, Chansik; Ryu, Hong-Duck; Chung, Eu Gene; Kim, Yongseok
2018-05-01
The use of antibiotics and their occurrence in the environment have received significant attention in recent years owing to the generation of antibiotic-resistant bacteria. Antibiotic residues in water near livestock farming areas should be monitored to establish effective strategies for reducing the use of veterinary antibiotics. However, environmental water contamination resulting from veterinary antibiotics has not been studied extensively. In this work, we developed an analytical method for the simultaneous determination of multiple classes of veterinary antibiotic residues in environmental water using on-line solid-phase extraction (SPE)-high performance liquid chromatography (HPLC)-high resolution mass spectrometry (HRMS). Eighteen popular antibiotics (eight classes) were selected as target analytes based on veterinary antibiotics sales in South Korea in 2015. The developed method was validated by calibration-curve linearities, precisions, relative recoveries, and method detection limits (MDLs)/limits of quantification (LOQs) of the selected antibiotics, and applied to the analysis of environmental water samples (groundwater, river water, and wastewater-treatment-plant effluent). All calibration curves exhibited r² > 0.995 with MDLs ranging from 0.2 to 11.9 ng/L. Relative recoveries were between 50 and 150% with coefficients of variation below 20% for all analytes (spiked at 500 ng/L) in groundwater and river water samples. Relative standard deviations (RSDs) of standard-spiked samples were lower than 7% for all antibiotics. The on-line SPE system eliminates human-based SPE errors and affords excellent method reproducibility. Amoxicillin, ampicillin, clopidol, fenbendazole, flumequine, lincomycin, sulfadiazine, and trimethoprim were detected in environmental water samples at concentrations ranging from 1.26 to 127.49 ng/L. The developed method is a reliable analytical technique for the potential routine monitoring of veterinary antibiotics. 
Copyright © 2018 Elsevier B.V. All rights reserved.
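As a concrete illustration of one of the validation quantities above: a method detection limit is commonly computed from replicate low-level spikes as a one-sided Student-t quantile times their standard deviation (the US EPA-style convention; the authors' exact procedure may differ, and the function name here is ours):

```python
import numpy as np
from scipy import stats

def method_detection_limit(replicates, alpha=0.01):
    """MDL as the one-sided Student-t quantile (99 % confidence by
    default) times the standard deviation of n replicate low-level
    spiked samples -- a common convention, e.g. the US EPA procedure."""
    r = np.asarray(replicates, dtype=float)
    t = stats.t.ppf(1 - alpha, df=len(r) - 1)
    return t * r.std(ddof=1)
```

For seven replicates, t(0.99, 6) ≈ 3.14, so the MDL is roughly three times the replicate standard deviation.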
Santelmann, Hanno; Franklin, Jeremy; Bußhoff, Jana; Baethge, Christopher
2015-11-01
Schizoaffective disorder is a frequent diagnosis, and its reliability is subject to ongoing discussion. We compared the diagnostic reliability of schizoaffective disorder with its main differential diagnoses. We systematically searched Medline, Embase, and PsycInfo for all studies on the test-retest reliability of the diagnosis of schizoaffective disorder as compared with schizophrenia, bipolar disorder, and unipolar depression. We used meta-analytic methods to describe and compare Cohen's kappa as well as positive and negative agreement. In addition, multiple pre-specified and post hoc subgroup and sensitivity analyses were carried out. Out of 4,415 studies screened, 49 studies were included. Test-retest reliability of schizoaffective disorder was consistently lower than that of schizophrenia (in 39 out of 42 studies), bipolar disorder (27/33), and unipolar depression (29/35). The mean difference in kappa between schizoaffective disorder and the other diagnoses was approximately 0.2, and mean Cohen's kappa for schizoaffective disorder was 0.50 (95% confidence interval: 0.40-0.59). While findings were unequivocal and homogeneous for schizoaffective disorder's diagnostic reliability relative to its three main differential diagnoses (dichotomous: smaller versus larger), heterogeneity was substantial for continuous measures, even after subgroup and sensitivity analyses. In clinical practice and research, schizoaffective disorder's comparatively low diagnostic reliability should lead to increased efforts to correctly diagnose the disorder. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
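The agreement statistics pooled in the meta-analysis above are computed per study from a 2x2 test-retest table. A minimal sketch of Cohen's kappa together with positive and negative agreement (the function name and table layout are illustrative, not taken from the paper):

```python
def kappa_and_agreement(tp, fn, fp, tn):
    """Cohen's kappa plus positive/negative agreement for a 2x2
    test-retest table (rows: rating 1 yes/no, cols: rating 2 yes/no)."""
    n = tp + fn + fp + tn
    po = (tp + tn) / n                                        # observed agreement
    pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    ppa = 2 * tp / (2 * tp + fn + fp)                         # positive agreement
    npa = 2 * tn / (2 * tn + fn + fp)                         # negative agreement
    return kappa, ppa, npa
```

A table with 80 % raw agreement and balanced marginals, for example, yields kappa = 0.6, i.e. roughly the 0.2 gap above the pooled schizoaffective value of 0.50 that the review reports for the differential diagnoses.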
Ma, Emily; Vetter, Joel; Bliss, Laura; Lai, H. Henry; Mysorekar, Indira U.
2016-01-01
Overactive bladder (OAB) is a common debilitating bladder condition with unknown etiology and limited diagnostic modalities. Here, we explored a novel high-throughput and unbiased multiplex approach with cellular and molecular components in a well-characterized patient cohort to identify biomarkers that could be reliably used to distinguish OAB from controls or provide insights into the underlying etiology. As a secondary analysis, we determined whether this method could discriminate between OAB and other chronic bladder conditions. We analyzed plasma samples from healthy volunteers (n = 19) and patients diagnosed with OAB, interstitial cystitis/bladder pain syndrome (IC/BPS), or urinary tract infections (UTI; n = 51) for proinflammatory, chemokine, cytokine, angiogenesis, and vascular injury factors using Meso Scale Discovery (MSD) analysis and urinary cytological analysis. Wilcoxon rank-sum tests were used to perform univariate and multivariate comparisons between patient groups (controls, OAB, IC/BPS, and UTI). Multivariate logistic regression models were fit for each MSD analyte on 1) OAB patients and controls, 2) OAB and IC/BPS patients, and 3) OAB and UTI patients. Age, race, and sex were included as independent variables in all multivariate analyses. Receiver operating characteristic (ROC) curves were generated to determine the diagnostic potential of a given analyte. Our findings demonstrate that five analytes, i.e., interleukin 4, TNF-α, macrophage inflammatory protein-1β, serum amyloid A, and Tie2, can reliably differentiate OAB from controls and can be used to distinguish OAB from the other conditions. Together, our pilot study suggests a molecular imbalance in inflammatory proteins may contribute to OAB pathogenesis. PMID:27029431
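The diagnostic potential summarized by an ROC curve above reduces to its area, which equals the Mann-Whitney probability that a randomly chosen case outranks a randomly chosen control on the analyte. A minimal sketch of that computation (illustrative only; the study would have used standard statistical software on the MSD analyte levels):

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen
    case scores higher than a randomly chosen control; ties count 1/2."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)
```

An AUC of 0.5 means the analyte carries no diagnostic information; 1.0 means the case and control distributions are perfectly separated.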
An Empirical Evaluation of Factor Reliability.
ERIC Educational Resources Information Center
Jackson, Douglas N.; Morf, Martin E.
The psychometric reliability of a factor, defined as its generalizability across samples drawn from the same population of tests, is considered as a necessary precondition for the scientific meaningfulness of factor analytic results. A solution to the problem of generalizability is illustrated empirically on data from a set of tests designed to…
Investigating Reliabilities of Intraindividual Variability Indicators
ERIC Educational Resources Information Center
Wang, Lijuan; Grimm, Kevin J.
2012-01-01
Reliabilities of the two most widely used intraindividual variability indicators, "ISD[superscript 2]" and "ISD", are derived analytically. Both are functions of the sizes of the first and second moments of true intraindividual variability, the size of the measurement error variance, and the number of assessments within a burst. For comparison,…
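For concreteness, the two indicators named above are computed per person from a burst of repeated assessments: ISD² is the within-person variance across the burst and ISD its square root. A minimal sketch of the quantities themselves (not the authors' analytic derivation of their reliabilities):

```python
import numpy as np

def isd_indicators(burst):
    """ISD^2 (within-person variance across one measurement burst)
    and ISD (its square root) for a single individual."""
    x = np.asarray(burst, float)
    isd2 = x.var(ddof=1)      # sample variance over the assessments
    return isd2, np.sqrt(isd2)
```

As the abstract notes, how reliably these statistics estimate true intraindividual variability depends on the error variance and the number of assessments per burst, neither of which appears in the point estimate itself.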
Effects of Simple Leaching of Crushed and Powdered Materials on High-precision Pb Isotope Analyses
NASA Astrophysics Data System (ADS)
Todd, E.; Stracke, A.
2013-12-01
We present new results of simple leaching experiments on the Pb isotope composition of USGS standard reference material powders and on ocean island basalt whole rock splits and powders. Rock samples were leached with 6N HCl in two steps, first hot and then in an ultrasonic bath, and washed with ultrapure H2O before conventional sample digestion and chromatographic purification of Pb. Pb isotope ratios were determined by Tl-doped MC-ICP-MS. Intra- and inter-session analytical reproducibility of repeated analyses of both synthetic Pb solutions and Pb from single digests of chemically processed natural samples was generally < 100 ppm (2 S.D.). The comparison of leached and unleached samples shows that leaching reliably removes variable amounts of different contaminants for different starting materials. For repeated digests of a single sample, the leached samples reproduce better than the unleached ones, showing that leaching effectively removes heterogeneously distributed extraneous Pb. However, the reproducibility of repeated digests of variably contaminated natural samples is up to an order of magnitude worse than the analytical reproducibility of ca. 100 ppm. More complex leaching methods (e.g., Nobre Silva et al., 2009) yield Pb isotope ratios within error of, and with similar reproducibility to, our method, showing that the simple leaching method is reliable. The remaining Pb isotope heterogeneity of natural samples, which typically exceeds 100 ppm, is thus attributed to inherent isotopic sample heterogeneity. Tl-doped MC-ICP-MS Pb ratio determination is therefore a sufficiently precise method for Pb isotope analyses in natural rocks. More precise Pb double- or triple-spike methods (e.g., Galer, 1999; Thirlwall, 2000) may exploit their full potential only in cases where natural isotopic sample heterogeneity is demonstrably negligible. References: Galer, S., 1999, Chem. Geol. 157, 255-274. Nobre Silva et al., 2009, Geochemistry Geophysics Geosystems 10, Q08012. Thirlwall, M.F., 2000, Chem. Geol. 163, 299-322.
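The "< 100 ppm (2 S.D.)" reproducibility quoted above is simply twice the relative standard deviation of repeated ratio measurements, expressed in parts per million of the mean. A minimal sketch of the convention (the function name is ours, not the authors' data-reduction code):

```python
import numpy as np

def two_sd_ppm(ratios):
    """External reproducibility of repeated isotope-ratio measurements,
    expressed as 2 standard deviations in ppm of the mean ratio."""
    r = np.asarray(ratios, float)
    return 2 * r.std(ddof=1) / r.mean() * 1e6
```

Three measurements of a ratio near 2.0 scattering by one part in 20,000, for instance, correspond to a 2 S.D. reproducibility of about 100 ppm, the threshold discussed in the abstract.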
Reliability Demonstration Approach for Advanced Stirling Radioisotope Generator
NASA Technical Reports Server (NTRS)
Ha, Chuong; Zampino, Edward; Penswick, Barry; Spronz, Michael
2010-01-01
Developed for future space missions as a high-efficiency power system, the Advanced Stirling Radioisotope Generator (ASRG) has a design life requirement of 14 yr in space following a potential storage of 3 yr after fueling. In general, the demonstration of long-life dynamic systems remains difficult, in part due to the perception that the wearout of moving parts cannot be minimized and associated failures are unpredictable. This paper shows that a combination of systematic analytical methods, extensive experience gained from technology development, and well-planned tests can be used to ensure a high level of reliability for the ASRG. With this approach, all potential risks from each life phase of the system are evaluated and their mitigation adequately addressed. This paper also provides a summary of important test results obtained to date for the ASRG and the planned effort for system-level extended operation.
Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica
2018-06-01
Humulus lupulus L. (hop) represents one of the most cultivated crops, as it is a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, making this plant of increasing interest in the field of pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this context, the present study was aimed at the development of a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for hop samples by means of the two aforementioned techniques highlighted that the amount of bioactive compounds was slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: Extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.
Transfer coefficients in ultracold strongly coupled plasma
NASA Astrophysics Data System (ADS)
Bobrov, A. A.; Vorob'ev, V. S.; Zelener, B. V.
2018-03-01
We use both analytical and molecular dynamics methods to determine electron transfer coefficients in an ultracold plasma when its temperature is small and the coupling parameter characterizing the interaction of electrons and ions exceeds unity. For these conditions, we use the nearest-neighbor approach to determine the average electron (ion) diffusion coefficient and to calculate other electron transfer coefficients (viscosity and electrical and thermal conductivities). Molecular dynamics simulations produce electronic and ionic diffusion coefficients, confirming the reliability of these results. The results compare favorably with experimental and numerical data from earlier studies.
TWT transmitter fault prediction based on ANFIS
NASA Astrophysics Data System (ADS)
Li, Mengyan; Li, Junshan; Li, Shuangshuang; Wang, Wenqing; Li, Fen
2017-11-01
Fault prediction is an important component of health management and plays an important role in assuring the reliability of complex electronic equipment. The transmitter is a unit with a high failure rate, and degradation of TWT cathode performance is a common transmitter fault. In this paper, a prediction model based on a set of key parameters of the TWT is proposed. By choosing proper parameters and applying an adaptive neural network training model, this method, combined with the analytic hierarchy process (AHP), provides a useful reference for the overall health assessment of TWT transmitters.
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
Multiversion or N-version programming was proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. Specific topics addressed are: failure probabilities in N-version systems, consistent comparison in N-version systems, descriptions of the faults found in the Knight and Leveson experiment, analytic models of comparison testing, characteristics of the input regions that trigger faults, fault tolerance through data diversity, and the relationship between failures caused by automatically seeded faults.
Requirements for Successful Adoption of a Glucose Measurement System Into a Hospital POC Program.
Füzéry, Anna K; Cembrowski, George S
2016-07-01
Widespread and successful implementation of any glucose measurement system in a hospital point-of-care (POC) program requires a number of features in addition to accurate and reliable analytical performance. Such features include, but are not limited to, a system's glucose-hematocrit dependence, durability, information technology capabilities, and battery capacity and battery life. While the study of Ottiger et al in this issue supports the analytical accuracy and reliability of Bayer's CONTOUR XT® blood glucose monitoring system, the suitability of other features of this system for a hospital POC program remains to be established. © 2016 Diabetes Technology Society.
Development of Equivalent Material Properties of Microbump for Simulating Chip Stacking Packaging
Lee, Chang-Chun; Tzeng, Tzai-Liang; Huang, Pei-Chen
2015-01-01
A three-dimensional integrated circuit (3D-IC) structure with a significant scale mismatch causes difficulty in analytic model construction. This paper proposes a simulation technique that introduces an equivalent material composed of microbumps and their surrounding wafer-level underfill (WLUF). The mechanical properties of this equivalent material, including Young’s modulus (E), Poisson’s ratio, shear modulus, and coefficient of thermal expansion (CTE), are obtained directly by applying either a tensile load or a constant displacement, and by increasing the temperature during simulations, respectively. Analytic results indicate that at least eight microbumps at the outermost region of the chip stacking structure need to be considered to obtain an accurate stress/strain contour in the region of concern. In addition, a factorial experimental design with analysis of variance is proposed to optimize chip stacking structure reliability with four factors: chip thickness, substrate thickness, CTE, and E-value. Analytic results show that the most significant factor is the CTE of the WLUF. This factor affects microbump reliability and structural warpage under temperature cycling loads and the high-temperature bonding process. A WLUF with low CTE and high E-value is recommended to enhance the assembly reliability of the 3D-IC architecture. PMID:28793495
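The factorial screening described above can be sketched with coded -1/+1 factor levels, where each main effect is the mean response difference between a factor's high and low settings. The toy response below is a hypothetical linear model in which CTE dominates, invented purely to illustrate how the dominant factor emerges from a 2^4 design:

```python
import numpy as np
from itertools import product

# 2^4 full factorial in coded units: chip thickness, substrate
# thickness, CTE and E-value (column order is illustrative)
runs = np.array(list(product([-1, 1], repeat=4)), dtype=float)

# hypothetical warpage response: CTE (col 2) dominates,
# E-value (col 3) contributes weakly, the thicknesses not at all
response = 10.0 + 3.0 * runs[:, 2] + 1.0 * runs[:, 3]

# main effect of factor i = mean(response at +1) - mean(response at -1)
effects = 2.0 * (runs.T @ response) / len(response)
```

Ranking the absolute effects (here 6 for CTE versus 2 for E-value and 0 for the thicknesses) is the screening step that an ANOVA then tests formally.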
Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.
Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C
2016-09-01
Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L⁻¹, enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
Some Comments on Mapping from Disease-Specific to Generic Health-Related Quality-of-Life Scales
Palta, Mari
2013-01-01
An article by Lu et al. in this issue of Value in Health addresses the mapping of treatment or group differences in disease-specific measures (DSMs) of health-related quality of life onto differences in generic health-related quality-of-life scores, with special emphasis on how the mapping is affected by the reliability of the DSM. In the proposed mapping, a factor analytic model defines a conversion factor between the scores as the ratio of factor loadings. Hence, the mapping applies to convert true underlying scales and has desirable properties facilitating the alignment of instruments and understanding their relationship in a coherent manner. It is important to note, however, that when DSM means or differences in mean DSMs are estimated, their mapping is still of a measurement error–prone predictor, and the correct conversion coefficient is the true mapping multiplied by the reliability of the DSM in the relevant sample. In addition, the proposed strategy for estimating the factor analytic mapping in practice requires assumptions that may not hold. We discuss these assumptions and how they may be the reason we obtain disparate estimates of the mapping factor in an application of the proposed methods to groups of patients. PMID:23337233
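The attenuation point made above can be checked with a small simulation: regressing a generic score on an error-prone DSM score recovers the true conversion factor multiplied by the DSM's reliability, not the true factor itself. All numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
true_map = 0.5                          # hypothetical true conversion (loading ratio)
var_t, var_e = 4.0, 1.0                 # true-score and error variance
reliability = var_t / (var_t + var_e)   # = 0.8

t = rng.normal(0.0, np.sqrt(var_t), n)        # true DSM scores
x = t + rng.normal(0.0, np.sqrt(var_e), n)    # observed, error-prone DSM scores
y = true_map * t                              # generic score driven by the true DSM

slope = np.cov(x, y)[0, 1] / x.var(ddof=1)    # regression of generic on observed
# slope approaches true_map * reliability = 0.4, not true_map = 0.5
```

This is the classical measurement-error attenuation: the usable conversion coefficient for estimated DSM means is the factor-loading-ratio mapping scaled down by the reliability in the relevant sample.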
Doses for post-Chernobyl epidemiological studies: are they reliable?
Drozdovitch, Vladimir; Chumak, Vadim; Kesminiene, Ausrele; Ostroumova, Evgenia; Bouville, André
2016-09-01
On 26 April 2016, thirty years will have elapsed since the occurrence of the Chernobyl accident, which has so far been the most severe in the history of the nuclear reactor industry. Numerous epidemiological studies have been conducted to evaluate the possible health consequences of the accident. Since the credibility of the association between radiation exposure and health outcome is highly dependent on the adequacy of the dosimetric quantities used in these studies, this paper reviews the methods used to estimate individual doses and the associated uncertainties in the main analytical epidemiological studies (i.e. cohort or case-control) related to the Chernobyl accident. Based on thorough analysis and comparison with other radiation studies, the authors conclude that individual doses for the Chernobyl analytical epidemiological studies have been calculated with a relatively high degree of reliability and well-characterized uncertainties, and that they compare favorably with many other non-Chernobyl studies. The major strengths of the Chernobyl studies are: (1) they are grounded on a large number of measurements, either performed on humans or made in the environment; and (2) extensive effort has been invested to evaluate the uncertainties associated with the dose estimates. Nevertheless, gaps in the methodology are identified and suggestions for the possible improvement of the current dose estimates are made.
Kwon, Yong-Kook; Bong, Yeon-Sik; Lee, Kwang-Sik; Hwang, Geum-Sook
2014-10-15
ICP-MS and ¹H NMR are commonly used to determine the geographical origin of food and crops. In this study, data from multielemental analysis performed by ICP-AES/ICP-MS and metabolomic data obtained from ¹H NMR were integrated to improve the reliability of determining the geographical origin of medicinal herbs. Astragalus membranaceus and Paeonia albiflora with different origins in Korea and China were analysed by ¹H NMR and ICP-AES/ICP-MS, and an integrated multivariate analysis was performed to characterise the differences between their origins. Four classification methods were applied: linear discriminant analysis (LDA), k-nearest neighbour classification (KNN), support vector machines (SVM), and partial least squares-discriminant analysis (PLS-DA). Results were compared using leave-one-out cross-validation and external validation. The integration of multielemental and metabolomic data was more suitable for determining geographical origin than the use of each individual data set alone. The integration of the two analytical techniques allowed diverse environmental factors, such as climate and geology, to be considered. Our study suggests that an appropriate integration of different types of analytical data is useful for determining the geographical origin of food and crops with a high degree of reliability. Copyright © 2014 Elsevier Ltd. All rights reserved.
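A minimal sketch of the data-integration-plus-classification workflow described above, using scikit-learn's LDA with leave-one-out cross-validation on synthetic stand-ins for the elemental and metabolomic blocks. The sample sizes, feature counts, effect strengths, and the choice of block-wise autoscaling are all assumptions for illustration, not details from the paper:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 40
origin = np.repeat([0, 1], n // 2)            # two geographical origins

# synthetic stand-ins: 6 "element" features and 10 "metabolite" bins,
# with a class-mean shift so the origins are separable
elements = rng.normal(origin[:, None] * 1.5, 1.0, (n, 6))
metabolites = rng.normal(origin[:, None] * 1.0, 1.0, (n, 10))

# integration = column-wise concatenation after autoscaling each block
X = np.hstack([StandardScaler().fit_transform(elements),
               StandardScaler().fit_transform(metabolites)])

# leave-one-out cross-validated classification accuracy
acc = cross_val_score(LinearDiscriminantAnalysis(), X, origin,
                      cv=LeaveOneOut()).mean()
```

The same pipeline run on either block alone gives the single-technique baseline that the integrated matrix is compared against in the study.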
Enzinger, Ewald; Morrison, Geoffrey Stewart
2017-08-01
In a 2012 case in New South Wales, Australia, the identity of a speaker on several audio recordings was in question. Forensic voice comparison testimony was presented based on an auditory-acoustic-phonetic-spectrographic analysis. No empirical demonstration of the validity and reliability of the analytical methodology was presented. Unlike the admissibility standards in some other jurisdictions (e.g., US Federal Rule of Evidence 702 and the Daubert criteria, or England & Wales Criminal Practice Directions 19A), Australia's Unified Evidence Acts do not require demonstration of the validity and reliability of analytical methods and their implementation before testimony based upon them is presented in court. The present paper reports on empirical tests of the performance of an acoustic-phonetic-statistical forensic voice comparison system which exploited the same features as were the focus of the auditory-acoustic-phonetic-spectrographic analysis in the case, i.e., second-formant (F2) trajectories in /o/ tokens and mean fundamental frequency (f0). The tests were conducted under conditions similar to those in the case. The performance of the acoustic-phonetic-statistical system was very poor compared to that of an automatic system. Copyright © 2017 Elsevier B.V. All rights reserved.
Yang, Heejung; Kim, Hyun Woo; Kwon, Yong Soo; Kim, Ho Kyong; Sung, Sang Hyun
2017-09-01
Anthocyanins are potent antioxidant agents that protect against many degenerative diseases; however, they are unstable because they are vulnerable to external stimuli including temperature, pH and light. This vulnerability hinders the quality control of anthocyanin-containing berries using classical high-performance liquid chromatography (HPLC) analytical methodologies based on UV or MS chromatograms. To develop an alternative approach for the quality assessment and discrimination of anthocyanin-containing berries, we used MS spectral data acquired in a short analytical time rather than UV or MS chromatograms. Mixtures of anthocyanins were separated from other components in a short gradient time (5 min) due to their higher polarity, and the representative MS spectrum was acquired from the MS chromatogram corresponding to the mixture of anthocyanins. The chemometric data from the representative MS spectra contained reliable information for the identification and relative quantification of anthocyanins in berries with good precision and accuracy. This fast and simple methodology, which consists of a simple sample preparation method and short gradient analysis, could be applied to reliably discriminate the species and geographical origins of different anthocyanin-containing berries. These features make the technique useful for the food industry. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.
A Novel IEEE 802.15.4e DSME MAC for Wireless Sensor Networks
Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin
2017-01-01
The IEEE 802.15.4e standard proposes the Deterministic and Synchronous Multichannel Extension (DSME) mode for wireless sensor networks (WSNs) to support industrial, commercial and health care applications. In this paper, a new channel access scheme and beacon scheduling schemes are designed for IEEE 802.15.4e enabled WSNs in star topology to reduce the network discovery time and energy consumption. In addition, a new dynamic guaranteed retransmission slot allocation scheme is designed for devices with failed Guaranteed Time Slot (GTS) transmissions to reduce the retransmission delay. To evaluate our schemes, analytical models are designed to analyze the performance of WSNs in terms of reliability, delay, throughput and energy consumption. Our schemes are validated with simulation and analytical results, and it is observed that the simulation results match the analytical ones well. The evaluated results show that our designed schemes can improve reliability, throughput, delay, and energy consumption significantly. PMID:28275216
Delatour, Vincent; Lalere, Beatrice; Saint-Albin, Karène; Peignaux, Maryline; Hattchouel, Jean-Marc; Dumont, Gilles; De Graeve, Jacques; Vaslin-Reimann, Sophie; Gillery, Philippe
2012-11-20
The reliability of biological tests is a major issue for patient care and public health, with high economic stakes. Reference methods, as well as regular external quality assessment schemes (EQAS), are needed to monitor the analytical performance of field methods. However, control material commutability is a major concern when assessing method accuracy. To overcome material non-commutability, we investigated the possibility of using lyophilized serum samples together with a limited number of frozen serum samples to assign matrix-corrected target values, taking the example of glucose assays. Trueness of the current glucose assays was first measured against a primary reference method by using human frozen sera. Methods using hexokinase and glucose oxidase with spectroreflectometric detection proved very accurate, with bias ranging between -2.2% and +2.3%. Bias of methods using glucose oxidase with spectrophotometric detection was +4.5%. Matrix-related bias of the lyophilized materials was then determined and ranged from +2.5% to -14.4%. Matrix-corrected target values were assigned and used to assess the trueness of 22 sub-peer groups. We demonstrated that matrix-corrected target values can be a valuable tool for assessing field method accuracy in large-scale surveys where commutable materials are not available in sufficient amounts at acceptable cost. Copyright © 2012 Elsevier B.V. All rights reserved.
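One plausible formalization of the matrix correction described above, assuming the matrix-related bias is multiplicative; the authors' exact value-assignment protocol may differ, and the function name is ours:

```python
def matrix_corrected_target(lyophilized_value, matrix_bias):
    """Adjust a value assigned on a lyophilized (non-commutable)
    material by the fractional matrix-related bias estimated against
    commutable frozen sera, e.g. matrix_bias = -0.144 for -14.4 %.
    Assumes the bias acts multiplicatively on the measured value."""
    return lyophilized_value / (1.0 + matrix_bias)
```

With the bias determined per method sub-peer group on frozen sera, each group's results on the lyophilized EQAS material can then be judged against a target freed of the matrix artifact.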
NASA Technical Reports Server (NTRS)
Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.
1973-01-01
Detailed drawings of the five-year tape transport are presented. The analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, the tape pack stress program, the response (computer) program, and the control system electronics description.
Analytical model of diffuse reflectance spectrum of skin tissue
NASA Astrophysics Data System (ADS)
Lisenko, S. A.; Kugeiko, M. M.; Firago, V. A.; Sobchuk, A. N.
2014-01-01
We have derived simple analytical expressions that enable highly accurate calculation of diffusely reflected light signals of skin in the spectral range from 450 to 800 nm at a distance from the region of delivery of exciting radiation. The expressions, taking into account the dependence of the detected signals on the refractive index, transport scattering coefficient, absorption coefficient and anisotropy factor of the medium, have been obtained in the approximation of a two-layer medium model (epidermis and dermis) for the same parameters of light scattering but different absorption coefficients of layers. Numerical experiments on the retrieval of the skin biophysical parameters from the diffuse reflectance spectra simulated by the Monte Carlo method show that commercially available fibre-optic spectrophotometers with a fixed distance between the radiation source and detector can reliably determine the concentration of bilirubin, oxy- and deoxyhaemoglobin in the dermis tissues and the tissue structure parameter characterising the size of its effective scatterers. We present the examples of quantitative analysis of the experimental data, confirming the correctness of estimates of biophysical parameters of skin using the obtained analytical expressions.
Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz
2017-01-15
Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several biological properties. Functional food research indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high performance liquid chromatography with ultraviolet detection (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,2-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function presented an optimal value for 600μL of chloroform as extraction solvent using acetonitrile as dispersant. The method proved to be reliable, precise and accurate. It was successfully applied to determine OSCs in cooked garlic samples as well as blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.
Effect of birefringence of lens material on polarization status and optical imaging characteristics
NASA Astrophysics Data System (ADS)
Kim, Wan-Chin; Park, No-Cheol
2018-04-01
In most cases of molding with glass or optical polymers, it is expected that there will be birefringence caused by the internal mechanical stresses remaining in the molding material. The distribution of the residual stress can be annealed by slow cooling, but this approach is disadvantageous with respect to the shape accuracy and manufacturing time. In this study, we propose an analytical model to calculate the diffracted field near the focal plane by considering two primary parameters, the orientation angle of the fast axis and the path difference. In order to verify the reliability of the analytical model, we compared the measured beam spot of the F-theta lens of the laser scanning unit (LSU) with the analytical result. In addition, we analyzed the calculated result from the perspective of the polarization status in the exit pupil. The proposed analysis method can be applied to enhance the image quality for cases in which birefringence occurs in a lens material by suitably modeling the amplitude and phase of the incident light flux.
Differential Privacy Preserving in Big Data Analytics for Connected Health.
Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei
2016-04-01
In Body Area Networks (BANs), big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still add enough interference to sensitive big data to preserve privacy.
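The abstract does not give the scheme's internals; as background, here is a minimal sketch of the standard Laplace mechanism that differential-privacy schemes of this kind build on (illustrative only; the paper's dynamic noise thresholds are not reproduced here):

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_release(true_value, sensitivity, epsilon, rng=random):
    """Epsilon-differentially-private release of a scalar query whose
    L1 sensitivity is known: add Laplace(sensitivity / epsilon) noise."""
    return true_value + laplace_noise(sensitivity / epsilon, rng)
```

Smaller epsilon means stronger privacy and larger noise; a dynamic threshold scheme would, roughly speaking, adapt the noise budget to the data stream rather than fix it in advance.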
Experimental and analytical research on the aerodynamics of wind driven turbines. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrbach, C.; Wainauski, H.; Worobel, R.
1977-12-01
This aerodynamic research program was aimed at providing a reliable, comprehensive data base on a series of wind turbine models covering a broad range of the prime aerodynamic and geometric variables. Such data obtained under controlled laboratory conditions on turbines designed by the same method, of the same size, and tested in the same wind tunnel had not been available in the literature. Moreover, this research program was further aimed at providing a basis for evaluating the adequacy of existing wind turbine aerodynamic design and performance methodology, for assessing the potential of recent advanced theories and for providing a basis for further method development and refinement.
NASA Technical Reports Server (NTRS)
Israelsson, Ulf E. (Inventor); Strayer, Donald M. (Inventor)
1992-01-01
A contact-less method for determining transport critical current density and flux penetration depth in bulk superconductor material. A compressor having a hollow interior and a plunger for selectively reducing the free space area for distribution of the magnetic flux therein are formed of superconductor material. Analytical relationships, based upon the critical state model, Maxwell's equations and geometrical relationships define transport critical current density and flux penetration depth in terms of the initial trapped magnetic flux density and the ratio between initial and final magnetic flux densities whereby data may be reliably determined by means of the simple test apparatus for evaluating the current density and flux penetration depth.
Sensitivity and systematics of calorimetric neutrino mass experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nucciotti, A.; Cremonesi, O.; Ferri, E.
2009-12-16
A large calorimetric neutrino mass experiment using thermal detectors is expected to play a crucial role in the challenge of directly assessing the neutrino mass. We discuss and compare here two approaches for estimating the experimental sensitivity of such an experiment. The first method uses an analytic formulation and readily yields a close estimate over a wide range of experimental configurations. The second method is based on a Monte Carlo technique and is more precise and reliable. The Monte Carlo approach is then exploited to study some sources of systematic uncertainty peculiar to calorimetric experiments. Finally, the tools are applied to investigate the optimal experimental configuration of the MARE project.
An Analytical Methodology for Predicting Repair Time Distributions of Advanced Technology Aircraft.
1985-12-01
1984. 3. Barlow, Richard E. "Mathematical Theory of Reliability: A Historical Perspective." IEEE Transactions on Reliability, 33: 16-19 (April 1984...Technology (AU), Wright-Patterson AFB OH, March 1971. 11. Coppola, Anthony. "Reliability Engineering of Electronic Equipment," IEEE Transactions on...1982. 64. Woodruff, Brian W. et al. "Modified Goodness-of-Fit Tests for Gamma Distributions with Unknown Location and Scale Parameters," IEEE
The focus on sample quality: Influence of colon tissue collection on reliability of qPCR data
Korenkova, Vlasta; Slyskova, Jana; Novosadova, Vendula; Pizzamiglio, Sara; Langerova, Lucie; Bjorkman, Jens; Vycital, Ondrej; Liska, Vaclav; Levy, Miroslav; Veskrna, Karel; Vodicka, Pavel; Vodickova, Ludmila; Kubista, Mikael; Verderio, Paolo
2016-01-01
Successful molecular analyses of human solid tissues require intact biological material with well-preserved nucleic acids, proteins, and other cell structures. Pre-analytical handling, comprising the collection of material at the operating theatre, is among the first critical steps that influence sample quality. The aim of this study was to compare the experimental outcomes obtained from samples collected and stored by the conventional means of snap freezing and by the PAXgene Tissue System (Qiagen). These approaches were evaluated by measuring rRNA and mRNA integrity of the samples (RNA Quality Indicator and Differential Amplification Method) and by gene expression profiling. The collection procedures of the biological material were implemented in two hospitals during colon cancer surgery in order to identify the impact of the collection method on the experimental outcome. Our study shows that pre-analytical sample handling has a significant effect on the quality of RNA and on the variability of qPCR data. The PAXgene collection mode proved easier to implement in the operating room, and moreover the quality of RNA obtained from human colon tissues by this method is superior to that obtained by snap freezing. PMID:27383461
NASA Astrophysics Data System (ADS)
Sævik, P. N.; Nixon, C. W.
2017-11-01
We demonstrate how topology-based measures of connectivity can be used to improve analytical estimates of effective permeability in 2-D fracture networks, which is one of the key parameters necessary for fluid flow simulations at the reservoir scale. Existing methods in this field usually compute fracture connectivity using the average fracture length. This approach is valid for ideally shaped, randomly distributed fractures, but is not immediately applicable to natural fracture networks. In particular, natural networks tend to be more connected than randomly positioned fractures of comparable lengths, since natural fractures often terminate in each other. The proposed topological connectivity measure is based on the number of intersections and fracture terminations per sampling area, which for statistically stationary networks can be obtained directly from limited outcrop exposures. To evaluate the method, numerical permeability upscaling was performed on a large number of synthetic and natural fracture networks, with varying topology and geometry. The proposed method was seen to provide much more reliable permeability estimates than the length-based approach, across a wide range of fracture patterns. We summarize our results in a single, explicit formula for the effective permeability.
Cheng, Zhongzhe; Zhou, Xing; Li, Wenyi; Hu, Bingying; Zhang, Yang; Xu, Yong; Zhang, Lin; Jiang, Hongliang
2016-11-30
Capilliposide B, a novel oleanane triterpenoid saponin isolated from Lysimachia capillipes Hemsl, showed significant anti-tumor activities in recent studies. To characterize the excretion of Capilliposide B, a reliable liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for simultaneous determination of Capilliposide B and its active metabolite, Capilliposide A, in rat urine and feces. Sample preparation using a solid-phase extraction procedure was optimized by acidification of samples to various degrees, providing extensive sample clean-up with a high extraction recovery. In addition, rat urinary samples were pretreated with CHAPS, an anti-adsorptive agent, to overcome nonspecific analyte adsorption during sample storage and processing. The method validation was conducted over the curve range of 10.0-5000ng/ml for both analytes. The intra- and inter-day precision and accuracy of the QC samples showed ≤11.0% RSD and -10.4 to 12.8% relative error. The method was successfully applied to an excretion study of Capilliposide B following intravenous administration. Copyright © 2016 Elsevier B.V. All rights reserved.
Li, Man; Liu, Xiao; Cai, Hao; Shen, Zhichun; Xu, Liu; Li, Weidong; Wu, Li; Duan, Jinao; Chen, Zhipeng
2016-12-16
Yuanhuacine was found to have significant inhibitory activity against A-549 human lung cancer cells. However, systemic administration of yuanhuacine, such as by the oral or intravenous route, can cause serious adverse toxic effects. In order to achieve a better curative effect and to alleviate these adverse effects, we tried to deliver yuanhuacine directly into the lungs. Ultra high-performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) was used to detect the analyte and IS. After extraction (ether:dichloromethane = 8:1), the analyte and IS were separated on a Waters BEH-C 18 column (100 mm × 2.1 mm, 1.7 μm) under a 5 min gradient elution using a mixture of acetonitrile and 0.1% formic acid aqueous solution as mobile phase at a flow rate of 0.3 mL/min. ESI positive mode was chosen for detection. The method was fully validated for its selectivity, accuracy, precision, stability, matrix effect, and extraction recovery. This new method for yuanhuacine concentration determination in rat plasma was reliable and could be applied for preclinical and clinical monitoring purposes.
Kocadağlı, Tolgahan; Yılmaz, Cemile; Gökmen, Vural
2014-06-15
This study aimed to develop a reliable analytical method for the determination of melatonin and its isomers in various food products. The method entails ethanol extraction of solid samples (or dilution of liquid samples) prior to liquid chromatography coupled to triple quadrupole mass spectrometry (LC-MS/MS) analysis of target analytes. The method was in-house validated and successfully applied to various food matrices. Recovery of melatonin from different matrices was found to be 86.0 ± 3.6%, 76.9 ± 5.4%, 98.6 ± 6.4%, and 67.0 ± 4.5% for beer, walnut, tomato and sour cherry samples, respectively. No melatonin could be detected in black and green tea, sour cherry, sour cherry concentrate, kefir (a fermented milk drink) and red wine, while the highest amount of melatonin (341.7 ± 29.3 pg/g) was detected in bread crumb. The highest amounts of melatonin isomer were detected in yeast-fermented foods: 170.7 ± 29.9 ng/ml in red wine, 14.3 ± 0.48 ng/ml in beer, and 15.7 ± 1.4 ng/g in bread crumb. Copyright © 2013 Elsevier Ltd. All rights reserved.
Optical bandgap of semiconductor nanostructures: Methods for experimental data analysis
NASA Astrophysics Data System (ADS)
Raciti, R.; Bahariqushchi, R.; Summonte, C.; Aydinli, A.; Terrasi, A.; Mirabella, S.
2017-06-01
Determination of the optical bandgap (Eg) in semiconductor nanostructures is a key issue in understanding the extent of quantum confinement effects (QCE) on electronic properties, and it usually involves some analytical approximation in experimental data reduction and in modeling of the light absorption processes. Here, we compare some of the analytical procedures frequently used to evaluate the optical bandgap from reflectance (R) and transmittance (T) spectra. Ge quantum wells and quantum dots embedded in SiO2 were produced by plasma enhanced chemical vapor deposition, and light absorption was characterized by UV-Vis/NIR spectrophotometry. R&T elaboration to extract the absorption spectra was conducted by two approximated methods (single pass analysis and double pass analysis, SPA and DPA, respectively), followed by Eg evaluation through linear fit of Tauc or Cody plots. Direct fitting of R&T spectra through a Tauc-Lorentz oscillator model is used as comparison. Methods and data are discussed also in terms of the light absorption process in the presence of QCE. The reported data show that, despite the approximation, the DPA approach joined with the Tauc plot gives reliable results, with clear advantages in terms of computational effort and understanding of QCE.
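Eg extraction from a Tauc plot amounts to a linear fit and an axis intercept; a minimal sketch (the choice of fitting window here is an illustrative assumption, not the authors' procedure):

```python
import numpy as np

def tauc_bandgap(energy_eV, alpha, exponent=0.5):
    """Estimate Eg from the energy-axis intercept of a linear fit to
    (alpha * E)**exponent vs E. exponent=0.5 suits indirect-allowed
    transitions; exponent=2 suits direct-allowed ones. The 'linear
    region' is crudely taken as the upper half of the transformed
    signal, a stand-in for a judgment the analyst normally makes."""
    y = (alpha * energy_eV) ** exponent
    mask = y > 0.5 * y.max()
    slope, intercept = np.polyfit(energy_eV[mask], y[mask], 1)
    return -intercept / slope
```

On synthetic data obeying the Tauc relation exactly, the intercept recovers the bandgap; on real R&T-derived absorption spectra, the result depends on the fitting window and on the data-reduction method (SPA vs DPA), which is the comparison the paper makes.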
Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya
2015-01-07
To expand the application scope of nuclear magnetic resonance (NMR) technology in quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) has been used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, impacting the accuracy and precision of spectral data are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
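The quantitative step behind internal-standard qNMR of this kind reduces to a ratio of signal integrals; a minimal sketch of the generic formula (the paper's acquisition parameters and actual workup are not modeled):

```python
def qnmr_mass(i_analyte, i_std, n_analyte, n_std, mw_analyte, mw_std, m_std):
    """Mass of analyte in the tube from internal-standard qNMR:
    i_*  = integrated signal areas for analyte and internal standard,
    n_*  = number of equivalent nuclei giving rise to each signal,
    mw_* = molar masses, m_std = weighed mass of internal standard."""
    return (i_analyte / i_std) * (n_std / n_analyte) * (mw_analyte / mw_std) * m_std
```

With equal nuclei counts and molar masses, a 2:1 integral ratio simply doubles the internal-standard mass, which is the sanity check below.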
Update of Standard Practices for New Method Validation in Forensic Toxicology.
Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T
2017-01-01
International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines into practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and that need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright © Bentham Science Publishers.
Jiang, Hongliang; Wang, Yurong; Shet, Manjunath S; Zhang, Yang; Zenke, Duane; Fast, Douglas M
2011-09-01
A rapid, specific, and reliable LC-MS/MS based bioanalytical method was developed and validated for the simultaneous determination of naloxone (NLX) and its two metabolites, 6β-naloxol (NLL) and naloxone-3β-D-glucuronide (NLG) in mouse plasma. The optimal chromatographic behavior of these analytes was achieved on an Aquasil C18 column (50 mm × 2.1 mm, 5 μm) using reversed phase chromatography. The total LC analysis time per injection was 2.5 min with a flow rate of 1.0 mL/min with gradient elution. Sample preparation via protein precipitation with acetonitrile in a 96-well format was applied for analyses of these analytes. The analytes were monitored by electrospray ionization in positive ion multiple reaction monitoring (MRM) mode. Modification of collision energy besides chromatographic separation was applied to further eliminate interference peaks for NLL and NLG. The method validation was conducted over the curve range of 0.200/0.400/0.500 to 100/200/250 ng/mL for NLX/NLL/NLG, respectively, using 0.0250 mL of plasma sample. The intra- and inter-day precision and accuracy of the quality control samples at low, medium, and high concentration levels showed ≤ 6.5% relative standard deviation (RSD) and -8.3 to -2.5% relative error (RE). The method was successfully applied to determine the concentrations of NLX, NLL, and NLG in incurred mouse plasma samples. Copyright © 2011 Elsevier B.V. All rights reserved.
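The precision and accuracy figures quoted above (%RSD and relative error of QC replicates) follow conventional definitions; a minimal sketch (generic formulas, not the authors' code):

```python
import statistics

def rsd_percent(replicates):
    """Precision as percent relative standard deviation (sample SD
    of QC replicates divided by their mean)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def relative_error_percent(replicates, nominal):
    """Accuracy as percent relative error of the replicate mean
    versus the nominal QC concentration."""
    return 100.0 * (statistics.mean(replicates) - nominal) / nominal
```

Acceptance criteria in bioanalytical validation typically require both figures to stay within ±15% (±20% at the lower limit of quantification).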
The study on the near infrared spectrum technology of sauce component analysis
NASA Astrophysics Data System (ADS)
Li, Shangyu; Zhang, Jun; Chen, Xingdan; Liang, Jingqiu; Wang, Ce
2006-01-01
The first author, Shangyu Li, works in supervising and inspecting the quality of products. In soy sauce manufacturing, quality control of intermediate and final products by way of components such as total nitrogen, saltless soluble solids, nitrogen of amino acids and total acid is demanded. Wet chemistry analytical methods need much labor and time for these analyses. To compensate for this problem, we used near infrared spectroscopy technology to measure the chemical composition of soy sauce. In the course of the work, a certain amount of soy sauce was collected and analyzed by wet chemistry analytical methods. The soy sauce was scanned by two kinds of spectrometer, the Fourier Transform near infrared spectrometer (FT-NIR spectrometer) and the filter near infrared spectroscopy analyzer. The near infrared spectra of soy sauce were calibrated against the components determined by wet chemistry methods using partial least squares regression and stepwise multiple linear regression. The contents of saltless soluble solids, total nitrogen, total acid and nitrogen of amino acids were predicted by cross validation. The results were compared with the wet chemistry analytical methods. The correlation coefficient and root-mean-square error of prediction (RMSEP) in the better prediction run were found to be 0.961 and 0.206 for total nitrogen, 0.913 and 1.215 for saltless soluble solids, 0.855 and 0.199 for nitrogen of amino acids, and 0.966 and 0.231 for total acid, respectively. The results presented here demonstrate that NIR spectroscopy technology is promising for fast and reliable determination of the major components of soy sauce.
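The RMSEP and correlation figures quoted above follow standard definitions for a validation set; a minimal sketch (generic formulas, not the authors' calibration code):

```python
import math

def rmsep(predicted, reference):
    """Root-mean-square error of prediction over a validation set."""
    n = len(reference)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

def correlation(x, y):
    """Pearson correlation coefficient between predicted and reference values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```

In cross validation each RMSEP is computed on samples held out of the calibration fit, which is why it is a fairer measure of predictive ability than the calibration residuals.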
NASA reliability preferred practices for design and test
NASA Technical Reports Server (NTRS)
1991-01-01
Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.
Bending of an Infinite beam on a base with two parameters in the absence of a part of the base
NASA Astrophysics Data System (ADS)
Aleksandrovskiy, Maxim; Zaharova, Lidiya
2018-03-01
Currently, in connection with the rapid development of high-rise construction and the improvement of models of the joint operation of high-rise structures and their bases, questions connected with the use of various calculation methods have become topical. The rigor of analytical methods is capable of characterizing the behavior of structures in more detail and more accurately, which affects the reliability of objects and can lead to a reduction in their cost. In this article, a model with two parameters is used as the computational model of the base; it can effectively take into account the distributive properties of the base by varying the coefficient reflecting the shear parameter. The paper constructs an effective analytical solution of the problem of a beam of infinite length interacting with a two-parameter voided base. Using Fourier integral transforms, the original differential equation is reduced to a Fredholm integral equation of the second kind with a degenerate kernel, and all the integrals are solved analytically and explicitly, which leads to an increase in the accuracy of the computations in comparison with approximate methods. The paper considers the problem of a beam loaded with a concentrated force applied at the origin, for a fixed length of the voided section, and analyses the results obtained for various values of the coefficient that takes into account the cohesion of the ground.
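As background, a two-parameter base of the kind described above is commonly written as a Pasternak-type beam equation (shown here as a standard form; the paper's exact formulation may differ):

```latex
EI \, \frac{d^{4}w}{dx^{4}} \;-\; k_{2}\,\frac{d^{2}w}{dx^{2}} \;+\; k_{1}\,w \;=\; q(x),
```

where \(w(x)\) is the beam deflection, \(EI\) the flexural rigidity, \(k_{1}\) the subgrade-reaction (Winkler) coefficient, \(k_{2}\) the shear parameter that gives the base its distributive properties, and \(q(x)\) the applied load. Setting \(k_{2}=0\) recovers the classical one-parameter Winkler model, and over the voided section both base terms drop out.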
Wang, Jianshuang; Gao, Yang; Dorshorst, Drew W; Cai, Fang; Bremer, Meire; Milanowski, Dennis; Staton, Tracy L; Cape, Stephanie S; Dean, Brian; Ding, Xiao
2017-01-30
In human respiratory disease studies, liquid samples such as nasal secretion (NS), lung epithelial lining fluid (ELF), or upper airway mucosal lining fluid (MLF) are frequently collected, but their volumes often remain unknown. The lack of volume information makes it hard to estimate the actual concentration of recovered active pharmaceutical ingredient or biomarkers. Urea has been proposed to serve as a sample volume marker because it can freely diffuse through most body compartments and is less affected by disease states. Here, we report an easy and reliable LC-MS/MS method for cross-matrix measurement of urea in serum, plasma, universal transfer medium (UTM), synthetic absorptive matrix elution buffer 1 (SAMe1) and synthetic absorptive matrix elution buffer 2 (SAMe2), which are commonly sampled in human respiratory disease studies. The method uses two stable-isotope-labeled urea isotopologues, [15N2]-urea and [13C,15N2]-urea, as the surrogate analyte and the internal standard, respectively. This approach provides the best measurement consistency across different matrices. The analyte extraction was individually optimized in each matrix. Specifically, in UTM, SAMe1 and SAMe2, the unique salting-out assisted liquid-liquid extraction (SALLE) not only dramatically reduces the matrix interferences but also improves the assay recovery. The use of an HILIC column largely increases the analyte retention. The typical run time is 3.6 min, which allows for high throughput analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
Rodriguez, Estrella Sanz; Poynter, Sam; Curran, Mark; Haddad, Paul R; Shellie, Robert A; Nesterenko, Pavel N; Paull, Brett
2015-08-28
Preservation of ionic species within Antarctic ice yields a unique proxy record of the Earth's climate history. Until now, studies have focused on two proxies: the ionic components of sea salt aerosol and methanesulfonic acid. Measurement of all of the major ionic species in ice core samples is typically carried out by ion chromatography. Former methods, whilst providing suitable detection limits, have been based upon off-column preconcentration techniques, requiring larger sample volumes, with potential for sample contamination and/or carryover. Here, a new capillary ion chromatography based analytical method has been developed for quantitative analysis of limited volume Antarctic ice core samples. The developed analytical protocol applies capillary ion chromatography (with suppressed conductivity detection) and direct on-column sample injection and focusing, thus eliminating the requirement for off-column sample preconcentration. This limits the total sample volume needed to 300μL per analysis, allowing for triplicate sample analysis with <1mL of sample. This new approach provides a reliable and robust analytical method for the simultaneous determination of organic and inorganic anions, including fluoride, methanesulfonate, chloride, sulfate and nitrate anions. Application to composite ice-core samples is demonstrated, with coupling of the capillary ion chromatograph to high resolution mass spectrometry used to confirm the presence and purity of the observed methanesulfonate peak. Copyright © 2015 Elsevier B.V. All rights reserved.
Advances in analytical technologies for environmental protection and public safety.
Sadik, O A; Wanekaya, A K; Andreescu, S
2004-06-01
Due to the increased threat of chemical and biological agents being used by terrorist organizations to cause injury, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in prevention, early detection and an efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs of first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using the United States Environmental Protection Agency (US-EPA) methodologies.
NASA Astrophysics Data System (ADS)
Woldegiorgis, Befekadu Taddesse; van Griensven, Ann; Pereira, Fernando; Bauwens, Willy
2017-06-01
Most common numerical solutions used in CSTR-based in-stream water quality simulators are susceptible to instabilities and/or solution inconsistencies. Usually, they cope with instability problems by adopting computationally expensive small time steps. However, some simulators use fixed computation time steps and hence do not have the flexibility to do so. This paper presents a novel quasi-analytical solution for CSTR-based water quality simulators of an unsteady system. The robustness of the new method is compared with the commonly used fourth-order Runge-Kutta methods, the Euler method and three versions of the SWAT model (SWAT2012, SWAT-TCEQ, and ESWAT). The performance of each method is tested for different hypothetical experiments. Besides the hypothetical data, a real case study is used for comparison. The growth factors we derived as stability measures for the different methods and the R-factor—considered as a consistency measure—turned out to be very useful for determining the most robust method. The new method outperformed all the numerical methods used in the hypothetical comparisons. The application for the Zenne River (Belgium) shows that the new method provides stable and consistent BOD simulations whereas the SWAT2012 model is shown to be unstable for the standard daily computation time step. The new method unconditionally simulates robust solutions. Therefore, it is a reliable scheme for CSTR-based water quality simulators that use first-order reaction formulations.
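The stability contrast described above can be illustrated with a single first-order CSTR mass balance, which admits a closed-form update of the kind a quasi-analytical scheme can exploit (a sketch of the general idea, not the paper's scheme):

```python
import math

def cstr_step(c0, c_in, q_over_v, k, dt):
    """Exact solution of dC/dt = (Q/V)*(C_in - C) - k*C over one time step,
    holding inflow and rate constant within the step. Unconditionally
    stable for any dt, since it is the analytical exponential decay
    toward the steady-state concentration."""
    lam = q_over_v + k
    c_inf = q_over_v * c_in / lam          # steady-state concentration
    return c_inf + (c0 - c_inf) * math.exp(-lam * dt)

def cstr_euler(c0, c_in, q_over_v, k, dt):
    """Explicit Euler update of the same balance; oscillates or diverges
    when dt exceeds 2 / (Q/V + k)."""
    return c0 + dt * (q_over_v * (c_in - c0) - k * c0)
```

With a fixed daily time step, a stiff reach (large Q/V + k) can push the Euler update past its stability limit, while the exponential update above stays bounded, which is the failure mode the paper reports for SWAT2012 and the property the quasi-analytical solution restores.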
Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri
2018-04-01
Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing, leads to inaccurate diagnoses, and delayed or inappropriate medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers translate what type of enzymatic assay testing can be done in real time to determine if there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-Stat® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace-gas chromatography-mass spectrometry. When creatinine was determined using i-Stat® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-Stat®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace-gas chromatography-mass spectrometry.
Factor Analytic Validation of the Ford, Wolvin, and Chung Listening Competence Scale
ERIC Educational Resources Information Center
Mickelson, William T.; Welch, S. A.
2012-01-01
This research begins to independently and quantitatively validate the Ford, Wolvin, and Chung (2000) Listening Competency Scale. Reliability and Confirmatory Factor analyses were conducted on two independent samples. The reliability estimates were found to be below those reported by Ford, Wolvin, and Chung (2000) and below acceptable levels for…
Chen, Fangfang; Gong, Zhiyuan; Kelly, Barry C
2015-02-27
A sensitive analytical method based on liquid-liquid extraction (LLE) and liquid chromatography tandem mass spectrometry (LC-MS/MS) was developed for rapid analysis of 11 pharmaceuticals and personal care products (PPCPs) in fish plasma micro-aliquots (∼20μL). Target PPCPs included bisphenol A, carbamazepine, diclofenac, fluoxetine, gemfibrozil, ibuprofen, naproxen, risperidone, sertraline, simvastatin and triclosan. The LLE procedure was quicker and cheaper than solid-phase extraction while exhibiting comparable analyte recoveries. Rapid separation and analysis of target compounds in fish plasma extracts was achieved by employing a high-efficiency C-18 HPLC column (Agilent Poroshell 120 SB-C18, 2.1mm×50mm, 2.7μm) and fast polarity switching, enabling effective monitoring of positive and negative ions in a single 9min run. With the exception of bisphenol A, which exhibited relatively high background contamination, method detection limits of individual PPCPs ranged between 0.15 and 0.69pg/μL, while method quantification limits were between 0.05 and 2.3pg/μL. Mean matrix effect (ME) values ranged between 65 and 156% for the various target analytes. Isotope dilution quantification using isotopically labelled internal surrogates was utilized to correct for signal suppression or enhancement and analyte losses during sample preparation. The method was evaluated by analyzing 20μL plasma micro-aliquots collected from zebrafish (Danio rerio) in a laboratory bioaccumulation study, which included control group fish (no exposure) as well as fish exposed to environmentally relevant concentrations of PPCPs. Using the developed LC-MS/MS based method, concentrations of the studied PPCPs were consistently detected in the low pg/μL (ppb) range. The method may be useful for investigations requiring fast, reliable concentration measurements of PPCPs in fish plasma.
In particular, the method may be applicable for in situ contaminant biomonitoring, as well as bioaccumulation and toxicology studies employing small fishes with low blood compartment volumes. Copyright © 2015 Elsevier B.V. All rights reserved.
Xiao, Jie; Wang, Tianyang; Li, Pei; Liu, Ran; Li, Qing; Bi, Kaishun
2016-08-15
A sensitive, reliable and accurate UHPLC-MS/MS method has been established and validated for the first time for the simultaneous quantification of ginkgo flavonoids, terpene lactones and nimodipine in rat plasma after oral administration of Ginkgo biloba dispersible tablets, Nimodipine tablets, and the combination of both, respectively. The plasma samples were extracted by two-step liquid-liquid extraction: nimodipine was extracted with hexane-ether (3:1, v/v) in the first step, after which ginkgo flavonoids and terpene lactones were extracted with ethyl acetate. The analytes were then separated by gradient elution with a mobile phase consisting of 0.1% formic acid in water and methanol at a flow rate of 0.6mL/min. Detection of the analytes was performed on a UHPLC-MS/MS system with a turbo ion spray source in the negative ion and multiple reaction monitoring (MRM) mode. The calibration curves for all the analytes showed good linearity (R(2)>0.99), and the lower limits of quantification were 0.50-4.00ng/mL. Intra-day and inter-day precisions were in the range of 3.6%-9.2% and 3.2%-13.1% for all the analytes. The mean extraction recoveries of the analytes were within 69.82%-103.5%, and the matrix effects were within 82.8%-110.0%. The validated method was successfully applied to compare the pharmacokinetic parameters of ginkgo flavonoids, terpene lactones and nimodipine in rat plasma after oral administration of Ginkgo biloba dispersible tablets, Nimodipine tablets, and the combination of both. There were no statistically significant differences in the pharmacokinetic behaviors of any of the analytes between the combined and single administration groups. The results indicate that combining the two agents may avoid dosage adjustments in the clinic, and that the combination is more convenient as well as effective against the different pathogeneses of cerebral ischemia. Copyright © 2016 Elsevier B.V. All rights reserved.
Hypothesis Testing Using Factor Score Regression
Devlieger, Ines; Mayer, Axel; Rosseel, Yves
2015-01-01
In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and with structural equation modeling (SEM) by using analytic calculations and two Monte Carlo simulation studies to examine their finite sample characteristics. Several performance criteria are used, such as the bias using the unstandardized and standardized parameterization, efficiency, mean square error, standard error bias, type I error rate, and power. The results show that the bias correcting method, with the newly developed standard error, is the only suitable alternative for SEM. While it has a higher standard error bias than SEM, it has a comparable bias, efficiency, mean square error, power, and type I error rate. PMID:29795886
Application of the variational-asymptotical method to composite plates
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Lee, Bok W.; Atilgan, Ali R.
1992-01-01
A method is developed for the 3D analysis of laminated plate deformation which is an extension of a variational-asymptotical method by Atilgan and Hodges (1991). Both methods are based on the treatment of plate deformation by splitting the 3D analysis into linear through-the-thickness analysis and 2D plate analysis. Whereas the first technique tackles transverse shear deformation in the second asymptotical approximation, the present method simplifies its treatment and restricts it to the first approximation. Both analytical techniques are applied to the linear cylindrical bending problem, and the strain and stress distributions are derived and compared with those of the exact solution. The present theory provides more accurate results than those of the classical laminated-plate theory for the transverse displacement of 2-, 3-, and 4-layer cross-ply laminated plates. The method can give reliable estimates of the in-plane strain and displacement distributions.
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, Juliane; Tolson, Bryan
2017-04-01
The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. When it is checked at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independency of the convergence test, we applied it to three widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al., 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010) and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indexes of the aforementioned three methods are known.
This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on the model-independency by testing the frugal method using the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an efficient way. The appealing feature of this new technique is the necessity of no further model evaluation and therefore enables checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable, published sensitivity results.
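The Morris (Elementary Effects) screening referenced above can be sketched on a toy analytical function; this is a minimal illustration of the screening method itself, not of the MVA convergence test:

```python
import random

def elementary_effects(f, n_params, n_traj=50, delta=0.1):
    """Crude Morris screening: mean absolute elementary effect (mu*)
    per parameter. A larger mu* flags a more influential parameter."""
    mu_star = [0.0] * n_params
    for _ in range(n_traj):
        x = [random.uniform(0, 1 - delta) for _ in range(n_params)]
        base = f(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta                      # perturb one parameter
            mu_star[i] += abs((f(xp) - base) / delta)
    return [m / n_traj for m in mu_star]

random.seed(0)
# Toy benchmark: output depends strongly on x0, weakly on x1, not at all on x2
def toy(x):
    return 10 * x[0] + 0.1 * x[1] + 0.0 * x[2]

mu = elementary_effects(toy, 3)
print(mu)  # roughly [10, 0.1, 0.0]
```

Because the toy function is linear, the elementary effects are exact here; for nonlinear models mu* varies across trajectories, which is exactly why convergence of the estimated indexes needs checking.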
Constraints on the [Formula: see text] form factor from analyticity and unitarity.
Ananthanarayan, B; Caprini, I; Kubis, B
Motivated by the discrepancies noted recently between the theoretical calculations of the electromagnetic [Formula: see text] form factor and certain experimental data, we investigate this form factor using analyticity and unitarity in a framework known as the method of unitarity bounds. We use a QCD correlator computed on the spacelike axis by operator product expansion and perturbative QCD as input, and exploit unitarity and the positivity of its spectral function, including the two-pion contribution that can be reliably calculated using high-precision data on the pion form factor. From this information, we derive upper and lower bounds on the modulus of the [Formula: see text] form factor in the elastic region. The results provide a significant check on those obtained with standard dispersion relations, confirming the existence of a disagreement with experimental data in the region around [Formula: see text].
Tung, Thanh Tran
2017-01-01
The early diagnosis of diseases, e.g., Parkinson’s and Alzheimer’s disease, diabetes, and various types of cancer, and monitoring the response of patients to the therapy plays a critical role in clinical treatment; therefore, there is an intensive research for the determination of many clinical analytes. In order to achieve point-of-care sensing in clinical practice, sensitive, selective, cost-effective, simple, reliable, and rapid analytical methods are required. Biosensors have become essential tools in biomarker sensing, in which electrode material and architecture play critical roles in achieving sensitive and stable detection. Carbon nanomaterials in the form of particle/dots, tube/wires, and sheets have recently become indispensable elements of biosensor platforms due to their excellent mechanical, electronic, and optical properties. This review summarizes developments in this lucrative field by presenting major biosensor types and variability of sensor platforms in biomedical applications. PMID:28825646
Neutron Activation Analysis of the Rare Earth Elements (REE) - With Emphasis on Geological Materials
NASA Astrophysics Data System (ADS)
Stosch, Heinz-Günter
2016-08-01
Neutron activation analysis (NAA) has been the analytical method of choice for rare earth element (REE) analysis from the early 1960s through the 1980s. At that time, irradiation facilities were widely available and fairly easily accessible. The development of high-resolution gamma-ray detectors in the mid-1960s eliminated, for many applications, the need for chemical separation of the REE from the matrix material, making NAA a reliable and effective analytical tool. While not as precise as isotope dilution mass spectrometry, NAA was competitive by being sensitive for the analysis of about half of the rare earths (La, Ce, Nd, Sm, Eu, Tb, Yb, Lu). The development of inductively coupled plasma mass spectrometry since the 1980s, together with the decommissioning of research reactors and the lack of installation of new ones in Europe and North America, has led to the rapid decline of NAA.
Photoacoustic spectroscopy for chemical detection
NASA Astrophysics Data System (ADS)
Holthoff, Ellen L.; Pellegrino, Paul M.
2012-06-01
The Global War on Terror has made rapid detection and identification of chemical and biological agents a priority for Military and Homeland Defense applications. Reliable real-time detection of these threats is complicated by our enemy's use of a diverse range of materials. Therefore, an adaptable platform is necessary. Photoacoustic spectroscopy (PAS) is a useful monitoring technique that is well suited for trace detection of gaseous media. This method routinely exhibits detection limits at the parts-per-billion (ppb) or sub-ppb range. The versatility of PAS also allows for the investigation of solid and liquid analytes. Current research utilizes quantum cascade lasers (QCLs) in combination with an air-coupled solid-phase photoacoustic cell design for the detection of condensed phase material films deposited on a surface. Furthermore, variation of the QCL pulse repetition rate allows for identification and molecular discrimination of analytes based solely on photoacoustic spectra collected at different film depths.
Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.
Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin
2016-05-01
Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most frequently prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and bring more detailed assignment of bands through the parallel study of erythritol [15N4]tetranitrate. In the case of powder diffraction, recently published data were verified, and 1H, 13C, and 15N NMR spectra are discussed in detail. The technique of electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile, and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.
Polymerase chain reaction technology as analytical tool in agricultural biotechnology.
Lipp, Markus; Shillito, Raymond; Giroux, Randal; Spiegelhalter, Frank; Charlton, Stacy; Pinero, David; Song, Ping
2005-01-01
The agricultural biotechnology industry applies polymerase chain reaction (PCR) technology at numerous points in product development. Commodity and food companies as well as third-party diagnostic testing companies also rely on PCR technology for a number of purposes. The primary use of the technology is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of PCR analysis and its application to the testing of grains. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effect they may have on the accuracy of the PCR analytical results.
A Review on Microfluidic Paper-Based Analytical Devices for Glucose Detection
Liu, Shuopeng; Su, Wenqiong; Ding, Xianting
2016-01-01
Glucose, as an essential substance directly involved in metabolic processes, is closely related to the occurrence of various diseases such as glucose metabolism disorders and islet cell carcinoma. Therefore, it is crucial to develop sensitive, accurate, rapid, and cost-effective methods for frequent and convenient detection of glucose. Microfluidic Paper-based Analytical Devices (μPADs), which not only satisfy the above requirements but also offer the advantages of portability and minimal sample consumption, have exhibited great potential in the field of glucose detection. This article reviews and summarizes the most recent improvements in glucose detection with colorimetric and electrochemical μPADs. The progressive techniques for fabricating channels on μPADs are also emphasized in this article. With the growth of diabetes and other glucose-related diseases in underdeveloped and developing countries, low-cost and reliable commercial μPADs for glucose detection will be in unprecedented demand. PMID:27941634
NASA Astrophysics Data System (ADS)
Takeuchi, Toshie; Nakagawa, Takafumi; Tsukima, Mitsuru; Koyama, Kenichi; Tohya, Nobumoto; Yano, Tomotaka
A new electromagnetically actuated vacuum circuit breaker (VCB) has been designed and developed on the basis of the transient electromagnetic analysis coupled with motion. The VCB has three advanced bi-stable electromagnetic actuators, which control each phase independently. The VCB serves as a synchronous circuit breaker as well as a standard circuit breaker. In this work, the flux delay due to the eddy current is analytically formulated using the delay time constant of the actuator coil current, thereby leading to accurate driving behavior. With this analytical method, the electromagnetic mechanism for a 24kV rated VCB has been optimized; and as a result, the driving energy is reduced to one fifth of that of a conventional VCB employing spring mechanism, and the number of parts is significantly decreased. Therefore, the developed VCB becomes compact, highly reliable and highly durable.
Pitfalls in the detection of cholesterol in Huntington's disease models.
Marullo, Manuela; Valenza, Marta; Leoni, Valerio; Caccia, Claudio; Scarlatti, Chiara; De Mario, Agnese; Zuccato, Chiara; Di Donato, Stefano; Carafoli, Ernesto; Cattaneo, Elena
2012-10-11
Background: Abnormalities in brain cholesterol homeostasis have been reported in Huntington's disease (HD), an adult-onset neurodegenerative disorder caused by an expansion in the number of CAG repeats in the huntingtin (HTT) gene. However, the results have been contradictory with respect to whether cholesterol levels increase or decrease in HD models. Biochemical and mass spectrometry methods show reduced levels of cholesterol precursors and cholesterol in HD cells and in the brains of several HD animal models. Abnormal brain cholesterol homeostasis was also inferred from studies in HD patients. In contrast, colorimetric and enzymatic methods indicate cholesterol accumulation in HD cells and tissues. Here we used several methods to investigate cholesterol levels in cultured cells in the presence or absence of mutant HTT protein. Results: Colorimetric and enzymatic methods with low sensitivity gave variable results, whereas results from a sensitive analytical method, gas chromatography-mass spectrometry, were more reliable. Sample preparation, high cell density and cell clonality also influenced the detection of intracellular cholesterol. Conclusions: Detection of cholesterol in HD samples by colorimetric and enzymatic assays should be supplemented by detection using more sensitive analytical methods. Care must be taken to prepare the sample appropriately. By evaluating lathosterol levels using isotopic dilution mass spectrometry, we confirmed reduced cholesterol biosynthesis in knock-in cells expressing the polyQ mutation in a constitutive or inducible manner.
Wu, Wenjie; Zhang, Yuan; Wu, Hanqiu; Zhou, Weie; Cheng, Yan; Li, Hongna; Zhang, Chuanbin; Li, Lulu; Huang, Ying; Zhang, Feng
2017-07-01
Isoflavones are natural substances that exhibit hormone-like pharmacological activities. The separation of isoflavones remains an analytical challenge because of their similar structures. We show that ultra-high performance supercritical fluid chromatography can be an appropriate tool to achieve the fast separation of 12 common dietary isoflavones. Among the five tested columns the Torus DEA column was found to be the most effective column for the separation of these isoflavones. The impact of individual parameters on the retention time and separation factor was evaluated. These parameters were optimized to develop a simple, rapid, and green method for the separation of the 12 target analytes. It only took 12.91 min using gradient elution with methanol as an organic modifier and formic acid as an additive. These isoflavones were determined with limit of quantitation ranging from 0.10 to 0.50 μg/mL, which was sufficient for reliable determination of various matrixes. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fully 3D-Printed Preconcentrator for Selective Extraction of Trace Elements in Seawater.
Su, Cheng-Kuan; Peng, Pei-Jin; Sun, Yuh-Chang
2015-07-07
In this study, we used a stereolithographic 3D printing technique and polyacrylate polymers to manufacture a solid phase extraction preconcentrator for the selective extraction of trace elements and the removal of unwanted salt matrices, enabling accurate and rapid analyses of trace elements in seawater samples when combined with a quadrupole-based inductively coupled plasma mass spectrometer. To maximize the extraction efficiency, we evaluated the effect of filling the extraction channel with ordered cuboids to improve liquid mixing. Upon automation of the system and optimization of the method, the device allowed highly sensitive and interference-free determination of Mn, Ni, Zn, Cu, Cd, and Pb, with detection limits comparable with those of most conventional methods. The system's analytical reliability was further confirmed through analyses of reference materials and spike analyses of real seawater samples. This study suggests that 3D printing can be a powerful tool for building multilayer fluidic manipulation devices, simplifying the construction of complex experimental components, and facilitating the operation of sophisticated analytical procedures for most sample pretreatment applications.
UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.
Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun
2013-12-01
Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of its formulation and algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
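The nonnegative matrix factorization at the heart of such systems can be sketched with plain Lee-Seung multiplicative updates on a tiny term-document matrix (UTOPIAN's actual semi-supervised formulation adds user-supplied constraints on top of this basic decomposition):

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Factor V (terms x docs) into W (terms x topics) @ H (topics x docs)
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)  # update doc-topic weights
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)  # update term-topic weights
    return W, H

# Tiny corpus with two obvious "topics": terms 0-1 vs terms 2-3
V = np.array([[3., 2., 0., 0.],
              [3., 2., 0., 0.],
              [0., 0., 2., 3.],
              [0., 0., 2., 3.]])
W, H = nmf(V, k=2)
print(np.round(W @ H, 2))  # should closely reconstruct V
```

Each column of W can be read as a topic (its dominant terms) and each column of H as a document's topic mixture; because the factors stay nonnegative, repeated runs with the same seed are deterministic, addressing the run-to-run consistency issue raised above.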
Han, Changhee; Burn-Nunes, Laurie J; Lee, Khanghyun; Chang, Chaewon; Kang, Jung-Ho; Han, Yeongcheol; Hur, Soon Do; Hong, Sungmin
2015-08-01
An improved decontamination method and ultraclean analytical procedures have been developed to minimize Pb contamination of processed glacial ice cores and to achieve reliable determination of Pb isotopes in North Greenland Eemian Ice Drilling (NEEM) deep ice core sections with concentrations at the sub-picogram per gram level. A PL-7 (Fuso Chemical) silica-gel activator has replaced the previously used colloidal silica activator produced by Merck and has been shown to provide sufficiently enhanced ion beam intensity for Pb isotope analysis for a few tens of picograms of Pb. Considering the quantities of Pb contained in the NEEM Greenland ice core and a sample weight of 10 g used for the analysis, the blank contribution from the sample treatment was observed to be negligible. The decontamination and analysis of the artificial ice cores and selected NEEM Greenland ice core sections confirmed the cleanliness and effectiveness of the overall analytical process. Copyright © 2015 Elsevier B.V. All rights reserved.
Mathematical model to estimate risk of calcium-containing renal stones
NASA Technical Reports Server (NTRS)
Pietrzyk, R. A.; Feiveson, A. H.; Whitson, P. A.
1999-01-01
BACKGROUND/AIMS: Astronauts exposed to microgravity during the course of spaceflight undergo physiologic changes that alter the urinary environment so as to increase the risk of renal stone formation. This study was undertaken to identify a simple method with which to evaluate the potential risk of renal stone development during spaceflight. METHOD: We used a large database of urinary risk factors obtained from 323 astronauts before and after spaceflight to generate a mathematical model with which to predict the urinary supersaturation of calcium stone forming salts. RESULT: This model, which involves the fewest possible analytical variables (urinary calcium, citrate, oxalate, phosphorus, and total volume), reliably and accurately predicted the urinary supersaturation of the calcium stone forming salts when compared to results obtained from a group of 6 astronauts who collected urine during flight. CONCLUSIONS: The use of this model will simplify both routine medical monitoring during spaceflight as well as the evaluation of countermeasures designed to minimize renal stone development. This model also can be used for Earth-based applications in which access to analytical resources is limited.
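A model of this general shape, predicting a supersaturation index from the five urinary variables, can be sketched as an ordinary least-squares fit on synthetic data (the coefficients, noise level, and linear model form here are illustrative assumptions, not the authors' actual model):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Synthetic urinary measurements: calcium, citrate, oxalate, phosphorus, volume
X = rng.uniform(0.5, 2.0, size=(n, 5))
# Hypothetical ground-truth relation for the demo
true_beta = np.array([1.5, -0.8, 2.0, 0.4, -1.2])
y = X @ true_beta + 0.7 + rng.normal(0, 0.05, n)  # "supersaturation index"

A = np.column_stack([np.ones(n), X])        # prepend an intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta, 2))  # ≈ [0.7, 1.5, -0.8, 2.0, 0.4, -1.2]
```

Once fitted, such a model lets routine monitoring rely on five inexpensive urine analytes instead of a full supersaturation workup, which is the practical point the abstract makes.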
Martignac, Marion; Balayssac, Stéphane; Gilard, Véronique; Benoit-Marquié, Florence
2015-06-18
We have investigated the removal of bortezomib, an anticancer drug prescribed in multiple myeloma, using the photochemical advanced oxidation process V-UV/UV (185/254 nm). We used two complementary analytical techniques to follow the removal rate of bortezomib. Nuclear magnetic resonance (NMR) is a nonselective method requiring no prior knowledge of the structures of the byproducts and allows us to provide a spectral signature (fingerprinting approach). This untargeted method provides clues to molecular structure changes and information on the degradation of the parent drug during the irradiation process; the holistic NMR approach can also be used to monitor the evolution of aromaticity. We used liquid chromatography coupled with high-resolution mass spectrometry (LC-MS) to corroborate the results obtained by 1H NMR and to identify the byproducts accurately, in order to understand the mechanistic degradation pathways of bortezomib. The results show that the primary byproducts arise from photoassisted deboronation of bortezomib at 254 nm. A secondary byproduct, pyrazinecarboxamide, was also identified. We obtained a reliable correlation between the two analytical techniques.
Hou, Xiao-Lin; Chen, Guo; Zhu, Li; Yang, Ting; Zhao, Jian; Wang, Lei; Wu, Yin-Liang
2014-07-01
A simple, sensitive and reliable analytical method was developed for the simultaneous determination of 38 veterinary drugs (18 sulfonamides, 11 quinolones and 9 benzimidazoles) and 8 metabolites of benzimidazoles in bovine milk by ultra high performance liquid chromatography-positive electrospray ionization tandem mass spectrometry (UHPLC-ESI-MS/MS). Samples were extracted with acidified acetonitrile, cleaned up with Oasis® MCX cartridges, and analyzed by LC-MS/MS on an Acquity UPLC® BEH C18 column with gradient elution. The method allows such multi-analyte measurements within a 13min runtime, while specificity is ensured through the MRM acquisition mode. The method was validated according to European Commission Decision 2002/657/EC, determining specificity, decision limit (CCα), detection capability (CCβ), recovery, precision, linearity and stability. For compounds with MRLs in bovine milk, the CCα values fall within a range of 11-115μg/kg, and the CCβ values within a range of 12-125μg/kg. For compounds without MRLs in bovine milk, the CCα values fall within a range of 0.01-0.08μg/kg, and the CCβ values within a range of 0.02-0.11μg/kg. The mean recoveries of the 46 analytes were between 87 and 119%. The RSD values of the repeatability and within-laboratory reproducibility experiments were below 11% and 15%, respectively, for the 46 compounds. The method was demonstrated to be suitable for the simultaneous determination of sulfonamides, quinolones and benzimidazoles in bovine milk. Copyright © 2014 Elsevier B.V. All rights reserved.
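The CCα/CCβ values quoted follow the standard rules of Decision 2002/657/EC, which can be sketched as follows (illustrative numbers; the 1.64 and 2.33 factors are the one-sided 95% and 99% normal quantiles the Decision prescribes):

```python
def cc_alpha_beta_mrl(mrl, sd_at_mrl):
    """Substances with a permitted limit (MRL): CCalpha = MRL + 1.64*SD of
    within-lab reproducibility at the MRL; CCbeta = CCalpha + 1.64*SD."""
    cc_alpha = mrl + 1.64 * sd_at_mrl
    cc_beta = cc_alpha + 1.64 * sd_at_mrl
    return cc_alpha, cc_beta

def cc_alpha_beta_no_mrl(blank_mean, blank_sd):
    """No permitted limit: CCalpha = blank mean + 2.33*SD of blanks;
    CCbeta = CCalpha + 1.64*SD of samples fortified at CCalpha
    (approximated here with the blank SD)."""
    cc_alpha = blank_mean + 2.33 * blank_sd
    cc_beta = cc_alpha + 1.64 * blank_sd
    return cc_alpha, cc_beta

# Example: analyte with a 100 ug/kg MRL and 6 ug/kg reproducibility SD
print(cc_alpha_beta_mrl(100, 6))  # ≈ (109.84, 119.68)
```

This is why each CCβ in the abstract sits slightly above its CCα: the detection capability adds a second normal-quantile margin on top of the decision limit.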
Dimier, Natalie; Todd, Susan
2017-09-01
Clinical trials of experimental treatments must be designed with primary endpoints that directly measure clinical benefit for patients. In many disease areas, the recognised gold standard primary endpoint can take many years to mature, leading to challenges in the conduct and quality of clinical studies. There is increasing interest in using shorter-term surrogate endpoints as substitutes for costly long-term clinical trial endpoints; such surrogates need to be selected according to biological plausibility, as well as the ability to reliably predict the unobserved treatment effect on the long-term endpoint. A number of statistical methods to evaluate this prediction have been proposed; this paper uses a simulation study to explore one such method in the context of time-to-event surrogates for a time-to-event true endpoint. This two-stage meta-analytic copula method has been extensively studied for time-to-event surrogate endpoints with one event of interest, but thus far has not been explored for the assessment of surrogates which have multiple events of interest, such as those incorporating information directly from the true clinical endpoint. We assess the sensitivity of the method to various factors including strength of association between endpoints, the quantity of data available, and the effect of censoring. In particular, we consider scenarios where there exist very little data on which to assess surrogacy. Results show that the two-stage meta-analytic copula method performs well under certain circumstances and could be considered useful in practice, but demonstrates limitations that may prevent universal use. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Fu, Liang; Xie, Hualin; Shi, Shuyun; Chen, Xiaoqing
2018-06-01
The content of non-metallic impurities in high-purity tetramethylammonium hydroxide (HPTMAH) aqueous solution has an important influence on the yield, electrical properties and reliability of integrated circuits during chip etching and cleaning. Therefore, an efficient analytical method to directly quantify non-metallic impurities in HPTMAH aqueous solutions is necessary. The present study aimed to develop a novel method that can accurately determine seven non-metallic impurities (B, Si, P, S, Cl, As, and Se) in an aqueous solution of HPTMAH by inductively coupled plasma tandem mass spectrometry (ICP-MS/MS). The samples were measured using a direct injection method. In the MS/MS mode, oxygen and hydrogen were used as reaction gases in the octopole reaction system (ORS) to eliminate mass spectral interferences during the analytical process. The detection limits of B, Si, P, S, Cl, As, and Se were 0.31, 0.48, 0.051, 0.27, 3.10, 0.008, and 0.005 μg L-1, respectively. The samples were analyzed by the developed method, and sector field inductively coupled plasma mass spectrometry (SF-ICP-MS) was used for contrastive analysis. The values of the seven elements measured using ICP-MS/MS were consistent with those measured by SF-ICP-MS. The proposed method can be utilized to analyze non-metallic impurities in HPTMAH aqueous solution.
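Instrumental detection limits like those reported are commonly estimated as three standard deviations of replicate blank measurements divided by the calibration slope (a generic 3σ sketch with made-up numbers; the paper's exact procedure is not specified here):

```python
import statistics

def detection_limit(blank_signals, slope):
    """LOD = 3 * SD(blank signals) / calibration slope,
    giving a limit in concentration units."""
    return 3 * statistics.stdev(blank_signals) / slope

# Ten replicate blank readings (counts) and a calibration slope in counts per ug/L
blanks = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.2, 10.0]
dl = detection_limit(blanks, slope=2000.0)
print(round(dl, 5))  # LOD in ug/L
```

The sub-μg/L limits in the abstract imply either very low blank scatter or a steep calibration slope; both levers are visible in this formula.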
Zhang, Daping; Wu, Lei; Chow, Diana S-L; Tam, Vincent H; Rios, Danielle R
2016-01-05
The determination of dopamine facilitates better understanding of the complex brain disorders in the central nervous system and the regulation of the endocrine system, cardiovascular functions and renal functions in the periphery. The purpose of this study was to develop a highly sensitive and reliable assay for the quantification of dopamine in human neonate plasma. Dopamine was extracted from human plasma by strong cation exchange (SCX) solid phase extraction (SPE), and subsequently derivatized with propionic anhydride. The derivatized analyte was separated on a Waters Acquity UPLC BEH C18 column using gradient elution at 0.4 ml/min with mobile phases A (0.2% formic acid in water [v/v]) and B (MeOH-ACN [v/v, 30:70]). Analysis was performed by positive electrospray ionization tandem mass spectrometry (ESI-MS/MS) in the multiple reaction monitoring (MRM) mode. The stable and relatively non-polar nature of the derivatized analyte enables reliable quantification of dopamine in the range of 10-1000 pg/ml using 200 μl of plasma sample. The method was validated with intra-day and inter-day precision of less than 7%, and intra-day and inter-day accuracy of 91.9-101.9% and 92.3-102.6%, respectively. The validated assay was applied to quantify dopamine levels in two preterm neonate plasma samples. In conclusion, a sensitive and selective LC-MS/MS method has been developed and validated, and successfully used for the determination of plasma dopamine levels in preterm neonates. Copyright © 2015 Elsevier B.V. All rights reserved.
Liquid chromatographic determination of sennosides in Cassia angustifolia leaves.
Srivastava, Alpuna; Pandey, Richa; Verma, Ram K; Gupta, Madan M
2006-01-01
A simple liquid chromatographic method was developed for the determination of sennosides B and A in leaves of Cassia angustifolia. These compounds were extracted from leaves with a mixture of methanol-water (70 + 30, v/v) after defatting with hexane. Analyte separation and quantitation were achieved by gradient reversed-phase liquid chromatography and UV absorbance at 270 nm using a photodiode array detector. The method involves the use of an RP-18 Lichrocart reversed-phase column (5 microm, 125 x 4.0 mm id) and a binary gradient mobile-phase profile. The various other aspects of analysis, namely, peak purity, similarity, recovery, repeatability, and robustness, were validated. Average recoveries of 98.5 and 98.6%, with a coefficient of variation of 0.8 and 0.3%, were obtained by spiking sample solution with 3 different concentration solutions of standards (60, 100, and 200 microg/mL). Detection limits were 10 microg/mL for sennoside B and 35 microg/mL for sennoside A, present in the sample solution. The quantitation limits were 28 and 100 microg/mL. The analytical method was applied to a large number of senna leaf samples. The new method provides a reliable tool for rapid screening of C. angustifolia samples in large numbers, which is needed in breeding/genetic engineering and genetic mapping experiments.
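Detection and quantitation limits such as those reported above are commonly estimated from the calibration regression. The following is a minimal sketch, assuming the ICH residual-standard-deviation approach (LOD = 3.3·s/m, LOQ = 10·s/m); the sennoside authors' exact procedure and the numbers below are not taken from the paper:

```python
import numpy as np

def calibration_limits(conc, signal):
    """Fit a straight-line calibration and estimate detection and
    quantitation limits via the ICH formulas LOD = 3.3*s/m and
    LOQ = 10*s/m, where m is the slope and s the standard deviation
    of the regression residuals."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s = residuals.std(ddof=2)  # two fitted parameters (slope, intercept)
    return 3.3 * s / slope, 10.0 * s / slope

# Hypothetical calibration data (concentration in ug/mL, peak area):
lod, loq = calibration_limits([0, 20, 60, 100, 200],
                              [1.0, 40.0, 121.0, 198.0, 401.0])
```

Because both limits scale with the same residual SD, the LOQ/LOD ratio is fixed at 10/3.3 ≈ 3, which is roughly the ratio seen between the quantitation and detection limits quoted in the abstract.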
Monitoring occupational exposure to cancer chemotherapy drugs
NASA Technical Reports Server (NTRS)
Baker, E. S.; Connor, T. H.
1996-01-01
Reports of the health effects of handling cytotoxic drugs and compliance with guidelines for handling these agents are briefly reviewed, and studies using analytical and biological methods of detecting exposure are evaluated. There is little conclusive evidence of detrimental health effects from occupational exposure to cytotoxic drugs. Work practices have improved since the issuance of guidelines for handling these drugs, but compliance with the recommended practices is still inadequate. Of 64 reports published since 1979 on studies of workers' exposure to these drugs, 53 involved studies of changes in cellular or molecular endpoints (biological markers) and 12 described chemical analyses of drugs or their metabolites in urine (2 involved both, and 2 reported the same study). The primary biological markers used were urine mutagenicity, sister chromatid exchange, and chromosomal aberrations; other studies involved formation of micronuclei and measurements of urinary thioethers. The studies had small sample sizes, and the methods were qualitative, nonspecific, subject to many confounders, and possibly not sensitive enough to detect most occupational exposures. Since none of the currently available biological and analytical methods is sufficiently reliable or reproducible for routine monitoring of exposure in the workplace, further studies using these methods are not recommended; efforts should focus instead on widespread implementation of improved practices for handling cytotoxic drugs.
Khorosheva, Eugenia M.; Karymov, Mikhail A.; Selck, David A.; Ismagilov, Rustem F.
2016-01-01
In this paper, we asked if it is possible to identify the best primers and reaction conditions based on improvements in reaction speed when optimizing isothermal reactions. We used digital single-molecule, real-time analyses of both speed and efficiency of isothermal amplification reactions, which revealed that improvements in the speed of isothermal amplification reactions did not always correlate with improvements in digital efficiency (the fraction of molecules that amplify) or with analytical sensitivity. However, we observed that the speeds of amplification for single-molecule (in a digital device) and multi-molecule (e.g. in a PCR well plate) formats always correlated for the same conditions. Also, digital efficiency correlated with the analytical sensitivity of the same reaction performed in a multi-molecule format. Our finding was supported experimentally with examples of primer design, the use or exclusion of loop primers in different combinations, and the use of different enzyme mixtures in one-step reverse-transcription loop-mediated amplification (RT-LAMP). Our results show that measuring the digital efficiency of amplification of single-template molecules allows quick, reliable comparisons of the analytical sensitivity of reactions under any two tested conditions, independent of the speeds of the isothermal amplification reactions. PMID:26358811
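The digital efficiency discussed above rests on standard partition statistics. The sketch below assumes Poisson loading of template molecules into partitions (the usual digital-assay model), not the authors' specific device or software:

```python
import math

def mean_copies_per_partition(fraction_positive):
    """Infer the mean template copies per partition from the observed
    fraction of positive partitions, via Poisson statistics:
    P(positive) = 1 - exp(-lambda)  =>  lambda = -ln(1 - f)."""
    if not 0.0 <= fraction_positive < 1.0:
        raise ValueError("fraction of positive partitions must be in [0, 1)")
    return -math.log(1.0 - fraction_positive)

def digital_efficiency(fraction_positive, loaded_copies_per_partition):
    """Fraction of loaded molecules that actually amplified, comparing
    the Poisson-inferred concentration with the known loading."""
    return (mean_copies_per_partition(fraction_positive)
            / loaded_copies_per_partition)
```

For example, if one template molecule per partition was loaded on average but only 39.3% of partitions turn positive (corresponding to λ ≈ 0.5), the inferred digital efficiency is about 50%.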
Umari, A.M.; Gorelick, S.M.
1986-01-01
It is possible to obtain analytic solutions to the groundwater flow and solute transport equations if space variables are discretized but time is left continuous. From these solutions, hydraulic head and concentration fields for any future time can be obtained without 'marching' through intermediate time steps. This analytical approach involves matrix exponentiation and is referred to as the Matrix Exponential Time Advancement (META) method. Two algorithms are presented for the META method, one for symmetric and the other for non-symmetric exponent matrices. A numerical accuracy indicator, referred to as the matrix condition number, was defined and used to determine the maximum number of significant figures that may be lost in the META method computations. The relative computational and storage requirements of the META method with respect to the time-marching method increase with the number of nodes in the discretized problem. The potentially greater accuracy of the META method, and the associated greater reliability through use of the matrix condition number, must be weighed against these increased computational and storage requirements as the number of nodes becomes large. For a particular number of nodes, the META method may be computationally more efficient than the time-marching method, depending on the size of the time steps used in the latter. A numerical example illustrates application of the META method to a sample groundwater-flow problem. (Author's abstract)
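The core of the META idea can be sketched for the symmetric case. This toy version assumes a homogeneous discretized system dh/dt = A·h (no source term) and uses an eigendecomposition to form the matrix exponential; it is an illustration of the principle, not the paper's algorithms:

```python
import numpy as np

def meta_advance(A, h0, t):
    """Evaluate h(t) = exp(A*t) @ h0 in a single step, with no
    intermediate time marching (the META principle). A must be
    symmetric so that A = V diag(w) V^T with orthonormal V."""
    w, V = np.linalg.eigh(A)
    return V @ (np.exp(w * t) * (V.T @ h0))

# Two decoupled decaying nodes (a diagonal, hence symmetric, matrix):
A = np.diag([-1.0, -2.0])
h = meta_advance(A, np.array([1.0, 1.0]), t=1.0)
```

Any future time t is reached directly; a time-marching scheme would instead accumulate truncation error over many intermediate steps.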
Gowd, Snigdha; Shankar, T; Dash, Samarendra; Sahoo, Nivedita; Chatterjee, Suravi; Mohanty, Pritam
2017-01-01
Aims and Objective: The aim of the study was to evaluate the reliability of cone beam computed tomography (CBCT)-derived images versus plaster models for mixed dentition analysis. Materials and Methods: Thirty CBCT-derived images and thirty plaster models were retrieved from the dental archives, and Moyer's and Tanaka-Johnston analyses were performed. The data obtained were interpreted and analyzed statistically using SPSS 10.0/PC (SPSS Inc., Chicago, IL, USA). Descriptive and analytical analysis along with Student's t-test was performed to qualitatively evaluate the data, and P < 0.05 was considered statistically significant. Results: Statistically significant results were obtained on comparison between CBCT-derived images and plaster models; the mean for Moyer's analysis in the left and right lower arch for CBCT and plaster model was 21.2 mm, 21.1 mm and 22.5 mm, 22.5 mm, respectively. Conclusion: CBCT-derived images were less reliable than data obtained directly from plaster models for mixed dentition analysis. PMID:28852639
Analytical method for establishing indentation rolling resistance
NASA Astrophysics Data System (ADS)
Gładysiewicz, Lech; Konieczna, Martyna
2018-01-01
Belt conveyors are highly reliable machines able to work in special operating conditions. A harsh environment, long transport distances and the great mass of transported materials cause high energy usage. That is why research in the field of belt conveyor transportation nowadays focuses on reducing power consumption without lowering efficiency. In this paper, previous methods for testing rolling resistance are described, and a new method designed by the authors is presented. The new method of testing rolling resistance is quite simple and inexpensive. Moreover, it allows experimental tests of the impact of different parameters on the value of indentation rolling resistance, such as core design, cover thickness, ambient temperature, idler travel frequency, or load value. Finally, results of tests of the relationship between rolling resistance and idler travel frequency, and between rolling resistance and idler travel speed, are presented.
Reliable use of determinants to solve nonlinear structural eigenvalue problems efficiently
NASA Technical Reports Server (NTRS)
Williams, F. W.; Kennedy, D.
1988-01-01
The analytical derivation, numerical implementation, and performance of a multiple-determinant parabolic interpolation method (MDPIM) for use in solving transcendental eigenvalue (critical buckling or undamped free vibration) problems in structural mechanics are presented. The overall bounding, eigenvalue-separation, qualified parabolic interpolation, accuracy-confirmation, and convergence-recovery stages of the MDPIM are described in detail, and the numbers of iterations required to solve sample plane-frame problems using the MDPIM are compared with those for a conventional bisection method and for the Newtonian method of Simpson (1984) in extensive tables. The MDPIM is shown to use 31 percent less computation time than bisection when accuracy of 0.0001 is required, but 62 percent less when accuracy of 10^-8 is required; the time savings over the Newtonian method are about 10 percent.
Determination of cyclic volatile methylsiloxanes in personal care products by gas chromatography.
Brothers, H M; Boehmer, T; Campbell, R A; Dorn, S; Kerbleski, J J; Lewis, S; Mund, C; Pero, D; Saito, K; Wieser, M; Zoller, W
2017-12-01
Organosiloxanes are prevalent in personal care products (PCPs) due to the desired properties they impart in the usage and application of such products. However, the European Chemicals Agency (ECHA) has recently published restriction proposals on the amount of two cyclic siloxanes, octamethylcyclotetrasiloxane (D4) and decamethylcyclopentasiloxane (D5), allowed in wash-off products such as shampoos and conditioners which are discharged down the drain during consumer use. This legislation will require that reliable analytical methods be available for manufacturers and government agencies to use in documenting compliance with the restrictions. This article proposes a simple analytical method to enable accurate measurement of these compounds down to the circa 0.1 weight per cent level in PCPs. Although gas chromatography methods are reported in the literature for quantitation of D4 and D5 in several matrices including PCPs, the potential for generation of false positives due to contamination, co-elution and in situ generation of cyclic volatile methylsiloxanes (cVMS) is always present and needs to be controlled. This report demonstrates the applicability of using a combination of emulsion break, liquid-liquid extraction and silylation sample preparation followed by GC-FID analysis as a suitable means of analysing PCPs for specific cVMS. The reliability and limitations of such methodology were demonstrated through several round-robin studies conducted in the laboratories of a consortium of silicone manufacturers. In addition, this report presents examples of false positives encountered during development of the method and presents a comparative analysis between this method and a published QuEChERS sample preparation procedure to illustrate the potential for generation of false positives when an inappropriate approach is applied to determination of cVMS in personal care products.
This report demonstrates that an approach to determine cVMS levels in personal care products is to perform an emulsion break on the sample, isolate the non-polar phase from the emulsion break and treat with a silylation reagent to abate potential in situ formation of cyclics during the course of GC-FID analysis. Round-robin studies conducted in laboratories representing multiple siloxane manufacturers demonstrated the reliability of the GC-FID method when measuring cVMS in PCPs down to circa 0.1%. © 2017 CES - Silicones Europe. International Journal of Cosmetic Science published by John Wiley & Sons Ltd on behalf of Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Predictive analytics and child protection: constraints and opportunities.
Russell, Jesse
2015-08-01
This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics represents recent increases in data quantity and data diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, its use in child protection is experiencing growth, and efforts to leverage predictive analytics for better decision-making in child protection are increasing. Past experiences, constraints and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wannet, W J; Hermans, J H; van Der Drift, C; Op Den Camp, H J
2000-02-01
A convenient and sensitive method was developed to separate and detect various types of carbohydrates (polyols, mono- and disaccharides, and phosphorylated sugars) simultaneously using high-performance liquid chromatography (HPLC). The method consists of a chromatographic separation on a CarboPac PA1 anion-exchange analytical column followed by pulsed amperometric detection. In a single run (43 min) 13 carbohydrates were readily resolved. Calibration plots were linear over ranges from 5-25 microM to 1.0-1.5 mM. The reliable and fast analysis technique, avoiding derivatization steps and long run times, was used to determine the levels of carbohydrates involved in mannitol and trehalose metabolism in the edible mushroom Agaricus bisporus. Moreover, the method was used to study the trehalose phosphorylase reaction.
Hsieh, Han-Yun; Shyu, Ching-Lin; Liao, Chen-Wei; Lee, Ren-Jye; Lee, Maw-Rong; Vickroy, Thomas W; Chou, Chi-Chung
2012-04-01
Zeranol (Z) is a semi-synthetic mycotoxin that is used in some countries as a growth-promoting agent in livestock. In view of the known oestrogenic actions by Z and certain Z analogues, significant concerns exist with regard to the presence of Z residues in human foods and the potential for untoward effects, including carcinogenicity within the reproductive system. In order to confirm that foods are free from harmful Z residues, regulators need a quick and reliable analytical method that can be used for routine confirmation of Z-positive samples identified by enzyme-linked immunosorbent assay (ELISA) screening. In this study the authors have developed and validated a simple and rapid high-performance liquid chromatography method incorporating ultraviolet (UV) absorbance (wavelength 274 nm) and electrochemical (EC) dual-mode detection for simultaneous determination of Z-related mycotoxins produced from mouldy grain matrices, including rice, soybean and corn flakes. Recoveries for all analytes were around 80% and the limits of detection ranged from 10 to 25 ng mL(-1) for UV and from 50 to 90 ng mL(-1) for EC detection with good accuracy and reproducibility. Differential profiles and occurrence rates of Z, β-zearalenol, β-zearalanol and α-zearalenol in naturally moulded grain matrices were observed, indicating different metabolite patterns and possibly grain-specific effects of mycotoxin exposure for humans and animals. The strength of this dual detection method lies in its selectivity characterised by a carbon screen-printed electrode such that aflatoxin interference is precluded. The combined dual detection technique affords quick and reliable semi-confirmative and quantitative information on multiple types of Z analogues in mouldy grains without the necessity of using expensive mass spectrometry. The method is considered a superior supplement to ELISA, which only screens total Z immunoreactivity. Copyright © 2011 Society of Chemical Industry.
Intersession reliability of fMRI activation for heat pain and motor tasks
Quiton, Raimi L.; Keaser, Michael L.; Zhuo, Jiachen; Gullapalli, Rao P.; Greenspan, Joel D.
2014-01-01
As the practice of conducting longitudinal fMRI studies to assess mechanisms of pain-reducing interventions becomes more common, there is a great need to assess the test–retest reliability of the pain-related BOLD fMRI signal across repeated sessions. This study quantitatively evaluated the reliability of heat pain-related BOLD fMRI brain responses in healthy volunteers across 3 sessions conducted on separate days using two measures: (1) intraclass correlation coefficients (ICC) calculated based on signal amplitude and (2) spatial overlap. The ICC analysis of pain-related BOLD fMRI responses showed fair-to-moderate intersession reliability in brain areas regarded as part of the cortical pain network. Areas with the highest intersession reliability based on the ICC analysis included the anterior midcingulate cortex, anterior insula, and second somatosensory cortex. Areas with the lowest intersession reliability based on the ICC analysis also showed low spatial reliability; these regions included pregenual anterior cingulate cortex, primary somatosensory cortex, and posterior insula. Thus, this study found regional differences in pain-related BOLD fMRI response reliability, which may provide useful information to guide longitudinal pain studies. A simple motor task (finger-thumb opposition) was performed by the same subjects in the same sessions as the painful heat stimuli were delivered. Intersession reliability of fMRI activation in cortical motor areas was comparable to previously published findings for both spatial overlap and ICC measures, providing support for the validity of the analytical approach used to assess intersession reliability of pain-related fMRI activation. 
A secondary finding of this study is that the use of standard ICC alone as a measure of reliability may not be sufficient, as the underlying variance structure of an fMRI dataset can result in inappropriately high ICC values; a method to eliminate these false positive results was used in this study and is recommended for future studies of test–retest reliability. PMID:25161897
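The amplitude-based intersession reliability above can be sketched in code. This assumes the common two-way random-effects, absolute-agreement form ICC(2,1); the authors' exact ICC variant and variance-structure correction may differ:

```python
import numpy as np

def icc_2_1(Y):
    """Two-way random-effects, absolute-agreement ICC(2,1).
    Y: (n_subjects, k_sessions) array of response amplitudes."""
    n, k = Y.shape
    grand = Y.mean()
    subj_means = Y.mean(axis=1)
    sess_means = Y.mean(axis=0)
    ss_subj = k * ((subj_means - grand) ** 2).sum()   # between subjects
    ss_sess = n * ((sess_means - grand) ** 2).sum()   # between sessions
    ss_err = ((Y - grand) ** 2).sum() - ss_subj - ss_sess
    msr = ss_subj / (n - 1)                # subject mean square
    msc = ss_sess / (k - 1)                # session mean square
    mse = ss_err / ((n - 1) * (k - 1))     # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfectly reproduced amplitudes across sessions give an ICC of 1; session-to-session inconsistency drives the value toward (or below) zero, which is the sense in which the regional ICCs in the study rank reliability.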
Taheriyoun, Masoud; Moradinejad, Saber
2015-01-01
The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are the variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the most popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the problem of reliability was studied at the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator's mistakes, physical damage, and design problems. The analytical methods used are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves the insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
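The two evaluation routes named above can be sketched for a toy fault tree. The cut sets and basic-event probabilities below are hypothetical (not the Tehran plant's model); the exact inclusion-exclusion result over minimal cut sets is cross-checked by Monte Carlo, mirroring the study's pairing of a numerical method with simulation:

```python
import itertools
import random

def top_event_prob_exact(cut_sets, p):
    """Exact top-event probability by inclusion-exclusion over minimal
    cut sets, assuming independent basic events with probabilities p."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        for combo in itertools.combinations(cut_sets, r):
            union = set().union(*combo)
            term = 1.0
            for event in union:
                term *= p[event]
            total += (-1) ** (r + 1) * term
    return total

def top_event_prob_mc(cut_sets, p, n=50_000, seed=7):
    """Monte Carlo estimate: the top event occurs in a trial when every
    basic event of at least one cut set occurs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        state = {e: rng.random() < pe for e, pe in p.items()}
        if any(all(state[e] for e in cs) for cs in cut_sets):
            hits += 1
    return hits / n

# Hypothetical basic events: operator error (OP) and pump failure (PUMP).
p = {"OP": 0.1, "PUMP": 0.1}
cut_sets = [{"OP"}, {"PUMP"}]
exact = top_event_prob_exact(cut_sets, p)   # 0.1 + 0.1 - 0.01 = 0.19
estimate = top_event_prob_mc(cut_sets, p)
```

Inclusion-exclusion is feasible only for small trees; with many cut sets one typically falls back on the rare-event upper bound (sum of cut-set probabilities) or on simulation, as the study does.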
Constructing the "Best" Reliability Data for the Job
NASA Technical Reports Server (NTRS)
DeMott, D. L.; Kleinhammer, R. K.
2014-01-01
Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide more accurate analysis, and hopefully a better final decision.
Constructing the Best Reliability Data for the Job
NASA Technical Reports Server (NTRS)
Kleinhammer, R. K.; Kahn, J. C.
2014-01-01
Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide more accurate analysis, and hopefully a better final decision.
Sun, Yichun; Xue, Juan; Li, Baimei; Lin, Xiaoting; Wang, Zhibin; Jiang, Hai; Zhang, Hongwei; Wang, Qiuhong; Kuang, Haixue
2016-11-01
A rapid, sensitive, and reliable ultra performance liquid chromatography with tandem mass spectrometry method was developed for the simultaneous determination of Aralia-saponin IV, 3-O-β-d-glucopyranosyl-(1→3)-β-d-glucopyranosyl-(1→3)-β-d-glucopyranosyl oleanolic acid 28-O-β-d-glucopyranoside, Aralia-saponin A and Aralia-saponin B in rat plasma after the oral administration of total saponin of Aralia elata leaves. Plasma samples were pretreated by protein precipitation with methanol. The analysis was performed on an ACQUITY UPLC HSS T3 column. The detection was performed on a triple quadrupole tandem mass spectrometer in multiple reaction monitoring mode using an electrospray ionization source in negative ionization mode. Under the experimental conditions, the calibration curves of the four analytes showed good linearity (r > 0.991). The intra- and inter-day precision values of the four analytes were ≤ 11.6%, and the accuracy was between -6.2 and 4.2%. The extraction recoveries of the four triterpenoid saponins were in the range of 84.06-91.66% (RSD < 10.5%), and all values of the matrix effect were more than 90.30%. The developed analytical method was successfully applied to a pharmacokinetic study on the simultaneous determination of the four triterpenoid saponins in rat plasma after oral administration of total saponin of Aralia elata leaves, which helps guide clinical usage of Aralia elata leaves. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín
2012-10-16
Collection of biological fluids on clinical filter papers shows important advantages from a logistic point of view, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is the aim, several problems arise, such as lack of sensitivity or uneven distribution of the analytes on the filter paper, making it difficult to obtain reliable quantitative results. In this paper, a novel approach for urine collection is proposed, which circumvents many of these problems. This methodology consists of the use of precut filter paper discs on which large amounts of sample can be retained upon a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and allows addition of an adequate internal standard at the clinical lab prior to analysis, thereby making it suitable for a strategy based on unsupervised sample collection and subsequent analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method developed provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit-for-purpose for clinical control analysis.
Sandhu, Sundeep Kaur; Kellett, Stephen; Hardy, Gillian
2017-11-01
"Exits" in cognitive analytic therapy (CAT) are methods that change unhelpful patterns or roles during the final "revision" phase of the therapy. How exits are conceived and achieved is currently poorly understood. This study focussed on the revision stage to explore and define how change is accomplished in CAT. Qualitative content analysis was applied to transcripts of sessions 6 and 7 of a protocol-delivered, 8-session CAT treatment for depression. Eight participants met the study inclusion criteria, and therefore, 16 sessions were analysed. The exit model developed contained 3 distinct (but interacting) phases: (a) developing an observing self via therapist input or client self-reflection, (b) breaking out of old patterns by creating new roles and procedures, and (c) utilisation of a range of methods to support and maintain change. Levels of interrater reliability for the exit categories that formed the model were good. The revision stage of CAT emerged as a complex and dynamic process involving 3 interacting stages. Further research is recommended to understand how exits relate to durability of change and whether change processes differ according to presenting problem. Exit work in cognitive analytic therapy is a dynamic process that requires progression through stages of insight, active change, and consolidation. Development of an "observing self" is an important foundation stone for change, and cognitive analytic therapists need to work within the client's zone of proximal development. A number of aspects appear important in facilitating change, such as attending to the process and feelings generated by change talk. Copyright © 2017 John Wiley & Sons, Ltd.
Probabilistic finite elements for fatigue and fracture analysis
NASA Astrophysics Data System (ADS)
Belytschko, Ted; Liu, Wing Kam
1993-04-01
An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1993-01-01
An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
Kandhro, Aftab A; Laghari, Abdul Hafeez; Mahesar, Sarfaraz A; Saleem, Rubina; Nelofar, Aisha; Khan, Salman Tariq; Sherazi, S T H
2013-11-01
A quick and reliable analytical method for the quantitative assessment of cefixime in orally administered pharmaceutical formulations is developed using diamond cell attenuated total reflectance (ATR) Fourier transform infrared (FT-IR) spectroscopy as a straightforward procedure for quality control laboratories. The calibration standards were prepared in aqueous medium over the range 350 to 6000 mg/kg. The calibration model was developed using partial least squares (PLS) regression on the fingerprint region of the FT-IR spectrum from 1485 to 887 cm(-1). An excellent coefficient of determination (R(2)) of 0.99976 was achieved, with a root mean square error of calibration of 44.8. The application of the diamond cell (smart accessory) ATR FT-IR method provides reliable determination of cefixime in pharmaceutical formulations to assess the quality of the final product. Copyright © 2013 Elsevier B.V. All rights reserved.
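The PLS calibration workflow described above (fit on a spectral fingerprint region, then report R² and RMSEC) can be sketched with a minimal PLS1 (NIPALS) implementation on synthetic spectra. The spectra, concentrations, and noise level below are simulated stand-ins, not the paper's data; only the wavenumber range and concentration range follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
wn = np.linspace(1485, 887, 300)                       # fingerprint region, cm^-1
# synthetic pure-component spectrum: two Gaussian absorption bands (invented)
pure = np.exp(-((wn - 1250) / 40)**2) + 0.6 * np.exp(-((wn - 1020) / 25)**2)
conc = np.linspace(350, 6000, 24)                      # calibration range, mg/kg
X = conc[:, None] * pure[None, :] * 1e-4 + rng.normal(0, 1e-3, (24, 300))

def pls1_fit(X, y, n_comp=2):
    """PLS1 via NIPALS; returns the regression vector and centering terms."""
    Xm, ym = X.mean(0), y.mean()
    Xc, yc = X - Xm, y - ym
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        p = Xc.T @ t / (t @ t)
        W.append(w); P.append(p); q.append((yc @ t) / (t @ t))
        Xc = Xc - np.outer(t, p)          # deflate X and y
        yc = yc - q[-1] * t
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # regression vector in original X space
    return B, Xm, ym

B, Xm, ym = pls1_fit(X, conc)
pred = (X - Xm) @ B + ym
rmsec = np.sqrt(np.mean((pred - conc)**2))                       # RMSE of calibration
r2 = 1 - np.sum((pred - conc)**2) / np.sum((conc - conc.mean())**2)
```

In practice a library implementation (e.g. scikit-learn's PLSRegression) would replace the hand-rolled NIPALS loop; the point here is only the structure of the calibration and its figures of merit.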
Profiling defect depth in composite materials using thermal imaging NDE
NASA Astrophysics Data System (ADS)
Obeidat, Omar; Yu, Qiuye; Han, Xiaoyan
2018-04-01
Sonic Infrared (SIR) NDE is a relatively new NDE technology; it has been demonstrated as a reliable and sensitive method to detect defects. SIR uses ultrasonic excitation combined with IR imaging to detect defects and flaws in the structures being inspected. An IR camera captures infrared radiation from the target for a period of time covering the ultrasound pulse; this period may be much longer than the pulse, depending on the defect depth and the thermal properties of the material. With the increasing deployment of composites in modern aerospace and automobile structures, fast, wide-area and reliable NDE methods are necessary. Impact damage is one of the major concerns in modern composites: damage can occur at a certain depth without any visual indication on the surface, and defect depth information can influence maintenance decisions. Depth profiling relies on the time delays in the captured image sequence. We present our work on defect depth profiling using the temporal information of IR images. An analytical model is introduced to describe heat diffusion from subsurface defects in composite materials, and depth profiling using peak time is introduced as well.
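The "peak time" idea can be sketched with the textbook 1-D diffusion response: heat released at depth z produces a surface signal proportional to t^(-1/2) * exp(-z^2/(4*alpha*t)), which peaks at t_peak = z^2/(2*alpha), so the depth can be inverted from the observed peak time. This is a generic heat-diffusion sketch, not the authors' analytical model; the diffusivity and depth values are assumed for illustration.

```python
import numpy as np

alpha = 4e-7     # assumed through-thickness diffusivity of the composite, m^2/s
depth = 1.5e-3   # true defect depth for the simulation, m

t = np.linspace(1e-3, 60.0, 60000)                       # time axis, s
# 1-D diffusion response of a buried heat source observed at the surface
signal = t**-0.5 * np.exp(-depth**2 / (4 * alpha * t))

t_peak = t[np.argmax(signal)]          # time of peak surface signal
depth_est = np.sqrt(2 * alpha * t_peak)  # invert t_peak = z^2 / (2*alpha)
```

For these values t_peak lands near 2.8 s and the inversion recovers the 1.5 mm depth; in a real measurement the peak time would be extracted per pixel from the IR image sequence.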
Structural health monitoring in composite materials using frequency response methods
NASA Astrophysics Data System (ADS)
Kessler, Seth S.; Spearing, S. Mark; Atalla, Mauro J.; Cesnik, Carlos E. S.; Soutis, Constantinos
2001-08-01
Cost effective and reliable damage detection is critical for the utilization of composite materials in structural applications. Non-destructive evaluation techniques (e.g. ultrasound, radiography, infra-red imaging) are available for use during standard repair and maintenance cycles; however, by comparison to the techniques used for metals, these are relatively expensive and time consuming. This paper presents part of an experimental and analytical survey of candidate methods for the detection of damage in composite materials. The experimental results are presented for the application of modal analysis techniques applied to rectangular laminated graphite/epoxy specimens containing representative damage modes, including delamination, transverse ply cracks and through-holes. Changes in natural frequencies and modes were then found using a scanning laser vibrometer, and 2-D finite element models were created for comparison with the experimental results. The models accurately predicted the response of the specimens at low frequencies, but the local excitation and coalescence of higher frequency modes make mode-dependent damage detection difficult and most likely impractical for structural applications. The frequency response method was found to be reliable for detecting even small amounts of damage in a simple composite structure; however, the potentially important information about damage type, size, location and orientation was lost using this method, since several combinations of these variables can yield identical response signatures.
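The core mechanism, damage lowering a structure's natural frequencies, can be sketched with a much simpler model than the paper's 2-D finite elements: a 1-D Euler-Bernoulli cantilever assembled from beam elements, where a delamination is crudely represented as a local bending-stiffness reduction. All dimensions and properties below are invented for the illustration.

```python
import numpy as np

def beam_frequencies(EI_per_elem, rhoA=0.5, L_total=0.3):
    """Natural frequencies [Hz] of a cantilever built from Euler-Bernoulli beam elements."""
    n = len(EI_per_elem)
    l = L_total / n
    ndof = 2 * (n + 1)                     # deflection + rotation per node
    K = np.zeros((ndof, ndof))
    M = np.zeros((ndof, ndof))
    for e, EI in enumerate(EI_per_elem):
        Ke = EI / l**3 * np.array([[12, 6*l, -12, 6*l],
                                   [6*l, 4*l*l, -6*l, 2*l*l],
                                   [-12, -6*l, 12, -6*l],
                                   [6*l, 2*l*l, -6*l, 4*l*l]])
        Me = rhoA * l / 420 * np.array([[156, 22*l, 54, -13*l],
                                        [22*l, 4*l*l, 13*l, -3*l*l],
                                        [54, 13*l, 156, -22*l],
                                        [-13*l, -3*l*l, -22*l, 4*l*l]])
        idx = slice(2 * e, 2 * e + 4)
        K[idx, idx] += Ke
        M[idx, idx] += Me
    K, M = K[2:, 2:], M[2:, 2:]            # clamp the root node (w = theta = 0)
    # generalized eigenproblem K v = w^2 M v via Cholesky reduction
    Lc = np.linalg.cholesky(M)
    A = np.linalg.solve(Lc, np.linalg.solve(Lc, K).T).T
    w2 = np.sort(np.linalg.eigvalsh(A))
    return np.sqrt(np.abs(w2)) / (2 * np.pi)

intact = beam_frequencies([20.0] * 10)
# 30% bending-stiffness loss in one mid-span element as a crude damage proxy
damaged = beam_frequencies([20.0] * 4 + [20.0 * 0.7] + [20.0] * 5)
```

The damaged beam's fundamental frequency drops relative to the intact one, which is the signature the frequency response method detects; the abstract's caveat also shows up here, since many different stiffness-loss patterns can produce the same frequency shift.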
Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G
2015-01-01
The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking of a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethylsilyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised; the cost and time of the analysis were reduced by using 10 times less sample, solvents and reagents than in previously described methods. Overall the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) of the aqueous extract was 0.010 mg kg(-1) (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and an RSD of 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
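The quantitation step behind a deuterated internal standard like 3-MCPD-d₅ can be sketched briefly: the analyte's peak area is ratioed against the internal-standard area, the ratio is calibrated against spiked levels, and unknowns are read off the calibration line. The calibration points and areas below are illustrative numbers, not data from the paper.

```python
import numpy as np

# assumed calibration: spiked 3-MCPD levels (mg/kg) vs. measured
# analyte / 3-MCPD-d5 peak-area ratios (invented, roughly linear)
levels = np.array([0.01, 0.05, 0.10, 0.50, 1.00])
ratios = np.array([0.021, 0.103, 0.208, 1.010, 2.030])

slope, intercept = np.polyfit(levels, ratios, 1)  # ratio = slope*conc + intercept

def quantify(area_analyte, area_istd):
    """Concentration (mg/kg) from the internal-standard area ratio."""
    return (area_analyte / area_istd - intercept) / slope
```

Because the deuterated standard co-elutes and fragments like the analyte, losses during extraction, clean-up and derivatisation largely cancel in the ratio, which is what makes this calibration robust against matrix effects.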
Volatile organic compounds: sampling methods and their worldwide profile in ambient air.
Kumar, Anuj; Víden, Ivan
2007-08-01
The atmosphere is a particularly difficult analytical system because of the very low levels of the substances to be analysed, sharp variations in pollutant levels with time and location, and differences in wind, temperature and humidity. This makes the selection of an efficient sampling technique a key step toward reliable results in air analysis. Generally, methods for sampling volatile organic compounds involve either collection of whole air or preconcentration of samples on adsorbents. The methods differ from each other in sampling technique, type of sorbent, method of extraction and identification technique. In this review paper we discuss important aspects of sampling volatile organic compounds by the widely used and more advanced sampling methods. Characteristics of various adsorbents used for VOC sampling are also described. Furthermore, this paper comprehensively reviews the concentration levels of volatile organic compounds, along with the methodology used for analysis, in major cities of the world.
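When comparing VOC concentration levels reported across cities, one recurring practical detail is that some studies report mixing ratios (ppbv) and others mass concentrations (µg/m³). A minimal conversion sketch via the ideal gas law, with temperature and pressure as explicit assumptions:

```python
def ppb_to_ugm3(ppb, molar_mass, t_celsius=25.0, p_kpa=101.325):
    """Convert a mixing ratio (ppbv) to mass concentration (ug/m^3), ideal gas law."""
    R = 8.314462                                              # J/(mol K)
    molar_volume = R * (t_celsius + 273.15) / (p_kpa * 1000) * 1000  # L/mol
    return ppb * molar_mass / molar_volume

# e.g. 1 ppbv of benzene (M = 78.11 g/mol) at 25 C and 1 atm is about 3.19 ug/m^3
benzene = ppb_to_ugm3(1.0, 78.11)
```

Reported levels from different campaigns can only be compared meaningfully after normalising to one unit convention and one reference temperature and pressure.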