Tutorial in Biostatistics: Instrumental Variable Methods for Causal Inference*
Baiocchi, Michael; Cheng, Jing; Small, Dylan S.
2014-01-01
A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment and instead an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. PMID:24599889
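The simplest instrumental variables estimator, the Wald (ratio) estimator for a binary instrument, can be sketched on synthetic data. The data-generating model and all coefficients below are illustrative assumptions, not from the tutorial: U is an unmeasured confounder satisfying condition (i), Z affects treatment T (condition (ii)) and touches Y only through T (condition (iii)).

```python
import random

random.seed(1)

# Synthetic example (illustrative only): U is an unmeasured confounder,
# Z is a valid binary instrument, T is treatment, Y is outcome.
n = 100_000
beta = 2.0  # true causal effect of T on Y
data = []
for _ in range(n):
    u = random.gauss(0, 1)                # unmeasured confounder
    z = random.random() < 0.5             # instrument, independent of U
    t = 1 if (0.8 * z + 0.8 * u + random.gauss(0, 1)) > 0.5 else 0
    y = beta * t + 1.5 * u + random.gauss(0, 1)
    data.append((z, t, y))

def mean(xs):
    return sum(xs) / len(xs)

# The naive treated-vs-untreated comparison is confounded by U ...
naive = mean([y for z, t, y in data if t]) - mean([y for z, t, y in data if not t])

# ... while the Wald estimator divides the instrument's effect on Y
# by its effect on T, removing the contribution of U.
dy = mean([y for z, t, y in data if z]) - mean([y for z, t, y in data if not z])
dt = mean([t for z, t, y in data if z]) - mean([t for z, t, y in data if not z])
wald = dy / dt

print(f"naive: {naive:.2f}, wald: {wald:.2f}, true: {beta}")
```

With a valid instrument the Wald estimate recovers the true effect, while the naive contrast is biased upward by the confounder.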
Operational modal analysis applied to the concert harp
NASA Astrophysics Data System (ADS)
Chomette, B.; Le Carrou, J.-L.
2015-05-01
Operational modal analysis (OMA) methods are useful for extracting the modal parameters of operating systems. These methods are particularly attractive for investigating the modal basis of string instruments during playing, since they avoid certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA, owing to the presence of damped harmonic components and the low noise level in the disturbance signal. The present study therefore investigates the least-square complex exponential (LSCE) method and a modified least-square complex exponential method for identifying the modal parameters of a string instrument while it is played. The efficiency of the approach is demonstrated experimentally on a concert harp excited by some of its strings, and the two methods are compared with a conventional modal analysis. The results show that, with the modified LSCE method, OMA identifies the modes that are strongly present in the instrument's response with good accuracy, especially when they are close to the excitation frequencies.
Seixas, Fábio Heredia; Estrela, Carlos; Bueno, Mike Reis; Sousa-Neto, Manoel Damião; Pécora, Jesus Djalma
2015-06-01
The aim of this study was to determine the root canal area before and after instrumentation 1 mm short of the apical foramen by clinical and cone beam computed tomography (CBCT) methods, and to evaluate the cleanliness of the apical region in mesiodistally flattened teeth by optical microscopy. Forty-two human single-canal mandibular incisors were instrumented using the Free Tip Preparation technique with up to three, four or five instruments beyond the initial one. Cone beam computed tomography scans of the samples were acquired before and after root canal preparation (RCP). Irrigation was performed by conventional or hydrodynamic means, using 2.5% sodium hypochlorite. The samples were prepared for observation under an optical microscope. Images were digitally obtained and analyzed, and the results were submitted to statistical analysis (two-way ANOVA complemented by Bonferroni's post-test). There was no significant difference between the studied anatomical areas with either the CBCT or the clinical method, and no differences between irrigation methods. Differences between instrumentation techniques were verified: instrumentation with four instruments beyond the initial instrument produced a significant increase in contact area compared with preparation with three instruments, but RCP with five instruments did not result in better cleanliness. The CBCT analysis was not capable of determining the precise shape of the surgical apical area compared with the clinical method. Neither the conventional nor the hydrodynamic irrigation technique was able to render the root canals debris-free. The action of the instruments on the root canal walls was proportional to the number of instruments used beyond the initial apical instrument.
Comparison of variance estimators for meta-analysis of instrumental variable estimates
Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F
2016-01-01
Abstract Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared, a two-stage meta-analysis pooling results, and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50 and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis all methods performed equally well and better than two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
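As a minimal illustration of two of the compared variance estimators, the sketch below computes a delta-method and a jackknife standard error for a simple ratio of means (a stand-in for an IV ratio estimate). The data and estimand are hypothetical, not from the study:

```python
import math
import random

random.seed(7)

def mean(v):
    return sum(v) / len(v)

# Toy paired data with an IV-style ratio estimand theta = E[Y] / E[X]
n = 500
xs = [random.gauss(2.0, 0.4) for _ in range(n)]
ys = [random.gauss(3.0, 0.8) for _ in range(n)]

mx, my = mean(xs), mean(ys)
theta = my / mx

# Delta-method variance of a ratio of means
sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
syy = sum((y - my) ** 2 for y in ys) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
var_delta = (syy / mx**2 + my**2 * sxx / mx**4 - 2 * my * sxy / mx**3) / n
se_delta = math.sqrt(var_delta)

# Jackknife: recompute the ratio leaving each observation out;
# note ((sy - y)/(n-1)) / ((sx - x)/(n-1)) simplifies to (sy - y)/(sx - x)
sx, sy = sum(xs), sum(ys)
thetas = [(sy - y) / (sx - x) for x, y in zip(xs, ys)]
tbar = mean(thetas)
var_jack = (n - 1) / n * sum((t - tbar) ** 2 for t in thetas)
se_jack = math.sqrt(var_jack)

print(f"theta={theta:.3f}  SE(delta)={se_delta:.4f}  SE(jackknife)={se_jack:.4f}")
```

For a smooth statistic such as a ratio, the two estimators target the same asymptotic variance and agree closely at this sample size.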
Instruments for Water Quality Monitoring
ERIC Educational Resources Information Center
Ballinger, Dwight G.
1972-01-01
Presents information regarding available instruments for industries and agencies who must monitor numerous aquatic parameters. Charts denote examples of parameters sampled, testing methods, range and accuracy of test methods, cost analysis, and reliability of instruments. (BL)
Industrial Instrument Mechanic. Occupational Analyses Series.
ERIC Educational Resources Information Center
Dean, Ann; Zagorac, Mike; Bumbaka, Nick
This analysis covers tasks performed by an industrial instrument mechanic, an occupational title some provinces and territories of Canada have also identified as industrial instrumentation and instrument mechanic. A guide to analysis discusses development, structure, and validation method; scope of the occupation; trends; and safety. To facilitate…
Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S
2013-01-01
Aim: To develop analytical models and analyse the stress distribution and flexibility of nickel–titanium (NiTi) instruments subjected to bending forces. Methodology: An analytical method was used to analyse the behaviour of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived from Euler–Bernoulli nonlinear differential equations that took into account the screw-pitch variation of these NiTi instruments. In addition, nonlinear deformation analyses based on the analytical model and on a nonlinear finite element model were carried out, the latter providing the numerical results. Results: According to the analytical results, the maximum curvature of the instrument occurs near the instrument tip. The finite element analysis likewise placed the position of maximum von Mises stress near the instrument tip. The proposed analytical model can therefore be used to predict the position of maximum curvature in the instrument, where fracture may occur, and the analytical and numerical results were compatible. Conclusion: The proposed analytical model, validated by the numerical results for the bending deformation of NiTi instruments, is useful in the design and analysis of instruments and effective in studying their flexibility. Compared with the finite element method, the analytical model deals conveniently and effectively with the bending behaviour of rotary NiTi endodontic instruments. PMID:23173762
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor)
2009-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Reid, Ray D. (Inventor); Hug, William F. (Inventor)
2010-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.
Cost Analysis at the Local Level: Applications and Attitudes. Paper and Report Series No. 103.
ERIC Educational Resources Information Center
Smith, Jana Kay
This study reports the results of a survey sent to 67 metropolitan school district evaluators. The survey assessed past and anticipated use of cost analysis methods, as well as attitudes toward those methods. The instrument contained many items taken from a survey instrument used in a previous study of cost analysis methods at…
Binocular optical axis parallelism detection precision analysis based on Monte Carlo method
NASA Astrophysics Data System (ADS)
Ying, Jiaju; Liu, Bingqi
2018-02-01
Starting from the working principle of the digital calibration instrument for the optical-axis parallelism of binocular photoelectric instruments, the factors affecting system precision are analyzed for each component of the instrument, and a precision analysis model is established. Based on the error distributions, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the variation of the center coordinate of the circle-target image. The method can further guide error allocation, tighten control of the factors that most influence the comprehensive error, and improve the measurement accuracy of the optical-axis parallelism digital calibration instrument.
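A Monte Carlo error budget of this kind can be illustrated with a toy model; the component error distributions and magnitudes below are invented for the sketch and are not the paper's model:

```python
import math
import random

random.seed(0)

# Hypothetical error budget (illustrative values only): each component
# error is drawn from its assumed distribution and the sum is taken as
# the comprehensive error in the target-image center coordinate.
N = 200_000
errors = []
for _ in range(N):
    e_align = random.gauss(0, 2.0)       # alignment error, pixels
    e_detect = random.gauss(0, 1.0)      # center-detection error, pixels
    e_quant = random.uniform(-0.5, 0.5)  # quantisation error, pixels
    errors.append(e_align + e_detect + e_quant)

mu = sum(errors) / N
sigma = math.sqrt(sum((e - mu) ** 2 for e in errors) / (N - 1))
print(f"combined error: mean={mu:.3f} px, std={sigma:.3f} px")
```

The simulated spread matches the analytic combination sqrt(2.0² + 1.0² + 1/12) ≈ 2.25 px, and re-running with a tightened component distribution shows directly how much each factor contributes to the comprehensive error.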
Low-Dimensional Feature Representation for Instrument Identification
NASA Astrophysics Data System (ADS)
Ihara, Mizuki; Maeda, Shin-Ichi; Ikeda, Kazushi; Ishii, Shin
For monophonic music instrument identification, various feature extraction and selection methods have been proposed. One issue in instrument identification is that the same spectrum is not always observed even for the same instrument, owing to differences in recording conditions. It is therefore important to find non-redundant, instrument-specific features that retain the information essential for high-quality instrument identification, so that they can be applied to various instrumental music analyses. As such a dimensionality reduction method, the authors propose the use of linear projection methods: local Fisher discriminant analysis (LFDA) and LFDA combined with principal component analysis (PCA). After experimentally confirming that raw power spectra are in fact good features for instrument classification, the authors reduced the feature dimensionality by LFDA or by PCA followed by LFDA (PCA-LFDA). The reduced features achieved identification performance comparable to or higher than that of the raw power spectra and that reported in other existing studies. These results demonstrate that LFDA and PCA-LFDA can successfully extract low-dimensional instrument features that retain the characteristic information of the instruments.
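LFDA itself is supervised; the unsupervised PCA stage of a PCA-LFDA pipeline can be sketched as follows. The toy "spectra", sample counts and dimensions are assumptions for illustration; a supervised LFDA projection using instrument labels would follow on the reduced features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "spectra": 300 samples x 40 frequency bins, where two latent
# directions carry most of the variance (stand-ins for instrument timbre)
latent = rng.normal(size=(300, 2))
mixing = rng.normal(size=(2, 40))
X = latent @ mixing + 0.1 * rng.normal(size=(300, 40))

# PCA: eigendecomposition of the covariance of mean-centred data
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(vals)[::-1]
k = 2
W = vecs[:, order[:k]]             # top-k principal directions
Z = Xc @ W                         # reduced features for the next stage

explained = vals[order[:k]].sum() / vals.sum()
print(f"reduced shape: {Z.shape}, variance explained: {explained:.2%}")
```

Because the toy data really are two-dimensional plus noise, two components capture nearly all of the variance; on real spectra the retained dimensionality would be chosen from the eigenvalue spectrum.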
Power calculator for instrumental variable analysis in pharmacoepidemiology
Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M
2017-01-01
Abstract Background Instrumental variable analysis, for example with physicians’ prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
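The authors' derived formula is not reproduced here; the sketch below instead uses a generic textbook asymptotic power approximation for a single binary instrument, with the variance expression and all example parameter values as stated assumptions:

```python
import math
from statistics import NormalDist

def iv_power(n, beta, delta, p_z, sigma_y, alpha=0.05):
    """Approximate two-sided power of an IV study with one binary instrument.

    Assumes the textbook asymptotic variance
        Var(beta_hat) ~ sigma_y^2 / (n * p_z * (1 - p_z) * delta^2),
    where delta is the instrument-exposure effect and p_z = P(Z = 1).
    This is a generic sketch, not the formula derived in the paper.
    """
    nd = NormalDist()
    se = sigma_y / math.sqrt(n * p_z * (1 - p_z) * delta ** 2)
    z_crit = nd.inv_cdf(1 - alpha / 2)
    return nd.cdf(beta / se - z_crit) + nd.cdf(-beta / se - z_crit)

# e.g. 5000 patients, true effect 0.2, instrument shifts exposure by 0.3
print(f"power = {iv_power(5000, 0.2, 0.3, 0.5, sigma_y=1.0):.2f}")
```

The stronger instruments typical of prescribing-preference designs enter through delta: increasing it (or n) raises power, which is the trade-off such calculators make explicit at the design stage.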
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)
2013-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Pressure-Assisted Chelating Extraction as a Teaching Tool in Instrumental Analysis
ERIC Educational Resources Information Center
Sadik, Omowunmi A.; Wanekaya, Adam K.; Yevgeny, Gelfand
2004-01-01
A novel instrumental-digestion technique using pressure-assisted chelating extraction (PACE) for the undergraduate laboratory is reported. The procedure exposes students to safe sample-preparation techniques, correlates wet-chemical methods with modern instrumental analysis, and compares the performance of PACE with conventional…
ERIC Educational Resources Information Center
Dickson-Karn, Nicole M.
2017-01-01
A multi-instrument approach has been applied to the efficient identification of polymers in an upper-division undergraduate instrumental analysis laboratory course. Attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) is used in conjunction with differential scanning calorimetry (DSC) to identify 18 polymer samples and…
ERIC Educational Resources Information Center
Nivens, Delana A.; Padgett, Clifford W.; Chase, Jeffery M.; Verges, Katie J.; Jamieson, Deborah S.
2010-01-01
Case studies and current literature are combined with spectroscopic analysis to provide a unique chemistry experience for art history students and to provide a unique inquiry-based laboratory experiment for analytical chemistry students. The XRF analysis method was used to demonstrate to nonscience majors (art history students) a powerful…
Soft X-ray astronomy using grazing incidence optics
NASA Technical Reports Server (NTRS)
Davis, John M.
1989-01-01
The instrumental background of X-ray astronomy is outlined, with an emphasis on high-resolution imagery. Optical and system performance are compared in terms of resolution, and methods for improving the latter in finite-length instruments are described. The method of analysis of broadband images to obtain diagnostic information is described and applied to the analysis of coronal structures.
Neutron activation analysis: trends in developments and applications
NASA Astrophysics Data System (ADS)
de Goeij, J. J.; Bode, P.
1995-03-01
New developments in instrumentation for, and methodology of, Instrumental Neutron Activation Analysis (INAA) may lead to new niches for this method of elemental analysis. This paper describes the possibilities of advanced detectors, automated irradiation and counting stations, and very large sample analysis. An overview is given of some typical new fields of application.
Development of TPS flight test and operational instrumentation
NASA Technical Reports Server (NTRS)
Carnahan, K. R.; Hartman, G. J.; Neuner, G. J.
1975-01-01
Thermal and flow sensor instrumentation was developed for use as an integral part of the space shuttle orbiter's reusable thermal protection system. The effort was performed in three tasks: a study to determine the optimum instruments and instrument installations for the space shuttle orbiter RSI and RCC TPS; tests and/or analysis to determine the instrument installations that minimize measurement errors; and analysis using data from the test program for comparison with analytical methods. A detailed review of existing state-of-the-art instrumentation in industry was performed to establish the baseline from which the research effort departed. From this information, detailed criteria for thermal protection system instrumentation were developed.
2012-01-01
Background Continuous quality improvement (CQI) methods are widely used in healthcare; however, the effectiveness of the methods is variable, and evidence about the extent to which contextual and other factors modify effects is limited. Investigating the relationship between these factors and CQI outcomes poses challenges for those evaluating CQI, among the most complex of which relate to the measurement of modifying factors. We aimed to provide guidance to support the selection of measurement instruments by systematically collating, categorising, and reviewing quantitative self-report instruments. Methods Data sources: We searched MEDLINE, PsycINFO, and Health and Psychosocial Instruments, reference lists of systematic reviews, and citations and references of the main report of instruments. Study selection: The scope of the review was determined by a conceptual framework developed to capture factors relevant to evaluating CQI in primary care (the InQuIRe framework). Papers reporting development or use of an instrument measuring a construct encompassed by the framework were included. Data extracted included instrument purpose; theoretical basis, constructs measured and definitions; development methods and assessment of measurement properties. Analysis and synthesis: We used qualitative analysis of instrument content and our initial framework to develop a taxonomy for summarising and comparing instruments. Instrument content was categorised using the taxonomy, illustrating coverage of the InQuIRe framework. Methods of development and evidence of measurement properties were reviewed for instruments with potential for use in primary care. Results We identified 186 potentially relevant instruments, 152 of which were analysed to develop the taxonomy. Eighty-four instruments measured constructs relevant to primary care, with content measuring CQI implementation and use (19 instruments), organizational context (51 instruments), and individual factors (21 instruments). 
Forty-one instruments were included for full review. Development methods were often pragmatic, rather than systematic and theory-based, and evidence supporting measurement properties was limited. Conclusions Many instruments are available for evaluating CQI, but most require further use and testing to establish their measurement properties. Further development and use of these measures in evaluations should increase the contribution made by individual studies to our understanding of CQI and enhance our ability to synthesise evidence for informing policy and practice. PMID:23241168
Atkins, Rahshida
2014-01-01
Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis. PMID:25626225
ERIC Educational Resources Information Center
Ji, Chang; Boisvert, Susanne M.; Arida, Ann-Marie C.; Day, Shannon E.
2008-01-01
An internal standard method applicable to undergraduate instrumental analysis or environmental chemistry laboratory has been designed and tested to determine the Henry's law constants for a series of alkyl nitriles. In this method, a mixture of the analytes and an internal standard is prepared and used to make a standard solution (organic solvent)…
Developments in Sampling and Analysis Instrumentation for Stationary Sources
ERIC Educational Resources Information Center
Nader, John S.
1973-01-01
Instrumentation for the measurement of pollutant emissions is considered including sample-site selection, sample transport, sample treatment, sample analysis, and data reduction, display, and interpretation. Measurement approaches discussed involve sample extraction from within the stack and electro-optical methods. (BL)
ERIC Educational Resources Information Center
Economou, A.; Tzanavaras, P. D.; Themelis, D. G.
2005-01-01
Sequential-injection analysis (SIA) is an approach to sample handling that enables the automation of manual wet-chemistry procedures in a rapid, precise and efficient manner. Experiments using SIA fit well into the course of Instrumental Chemical Analysis, especially in the section on Automatic Methods of analysis provided by chemistry…
NASA Astrophysics Data System (ADS)
Wetzel, Angela Payne
Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Reviews of factor analysis in psychology and general education likewise indicate gaps between current and best practices; yet a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. The purpose of this study was therefore a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extracted from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicated significant errors in the translation of exploratory factor analysis best practices into current practice. Further, techniques for establishing validity evidence tended to derive from a limited range of methods, chiefly reliability statistics to support internal structure and support for test content. The instruments reviewed lacked supporting evidence based on relationships with other variables and on response process, and evidence based on the consequences of testing was absent. The findings suggest a need for further professional development within the medical education research community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from an instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and should carefully review the available evidence.
Finally, editors and reviewers are encouraged to recognize this gap in best practices and subsequently to promote instrument development research that is more consistent through the peer-review process.
Applying mixed methods to pretest the Pressure Ulcer Quality of Life (PU-QOL) instrument.
Gorecki, C; Lamping, D L; Nixon, J; Brown, J M; Cano, S
2012-04-01
Pretesting is key in the development of patient-reported outcome (PRO) instruments. We describe a mixed-methods approach based on interviews and Rasch measurement methods in the pretesting of the Pressure Ulcer Quality of Life (PU-QOL) instrument. We used cognitive interviews to pretest the PU-QOL in 35 patients with pressure ulcers with the view to identifying problematic items, followed by Rasch analysis to examine response options, appropriateness of the item series and biases due to question ordering (item fit). We then compared findings in an interactive and iterative process to identify potential strengths and weaknesses of PU-QOL items, and guide decision-making about further revisions to items and design/layout. Although cognitive interviews largely supported items, they highlighted problems with layout, response options and comprehension. Findings from the Rasch analysis identified problems with response options through reversed thresholds. The use of a mixed-methods approach in pretesting the PU-QOL instrument proved beneficial for identifying problems with scale layout, response options and framing/wording of items. Rasch measurement methods are a useful addition to standard qualitative pretesting for evaluating strengths and weaknesses of early stage PRO instruments.
Root canal centering ability of rotary cutting nickel titanium instruments: A meta-analysis
Gundappa, Mohan; Bansal, Rashmi; Khoriya, Sarvesh; Mohan, Ranjana
2014-01-01
Aim: To systematically review articles on the canal centering ability of endodontic rotary cutting nickel-titanium (Ni-Ti) instruments and subject the results to meta-analysis. Materials and Methods: A comprehensive search on the canal centering ability of different rotary cutting Ni-Ti files, such as ProTaper, Hero Shaper, K3, Mtwo, RaCe and WaveOne, was conducted for articles published in peer-reviewed journals during 1991-2013 using the PubMed database. Inclusion and exclusion criteria were established. A dataset was created by tabulating author name, publication year, sample size, number of experimental groups, methods to evaluate canal centering ability, instrument cross-section, taper, tip design, rake angle, mean and standard deviation, and was subjected to meta-analysis. Results: Most studies were conducted on the mesiobuccal canal of the mandibular first molar, with curvature ranging from 15-60°. The difference in canal centering ability between the rotary cutting Ni-Ti instruments was not statistically significant. Conclusion: All endodontic rotary cutting Ni-Ti instruments are capable of producing centered preparations; ProTaper depicted the best centering ability. Computed tomography is an effective method of evaluating canal centering ability. PMID:25506134
Guild, Georgia E.; Stangoulis, James C. R.
2016-01-01
Within the HarvestPlus program there are many collaborators currently using X-Ray Fluorescence (XRF) spectroscopy to measure Fe and Zn in their target crops. In India, five HarvestPlus wheat collaborators have laboratories that conduct this analysis and their throughput has increased significantly. The benefits of using XRF are its ease of use, minimal sample preparation and high throughput analysis. The lack of commercially available calibration standards has led to a need for alternative calibration arrangements for many of the instruments. Consequently, the majority of instruments have either been installed with an electronic transfer of an original grain calibration set developed by a preferred lab, or a locally supplied calibration. Unfortunately, neither of these methods has been entirely successful. The electronic transfer is unable to account for small variations between the instruments, whereas the use of a locally provided calibration set is heavily reliant on the accuracy of the reference analysis method, which is particularly difficult to achieve when analyzing low levels of micronutrient. Consequently, we have developed a calibration method that uses non-matrix matched glass disks. Here we present the validation of this method and show this calibration approach can improve the reproducibility and accuracy of whole grain wheat analysis on 5 different XRF instruments across the HarvestPlus breeding program. PMID:27375644
Development, history, and future of automated cell counters.
Green, Ralph; Wachsmann-Hogiu, Sebastian
2015-03-01
Modern automated hematology instruments use either optical methods (light scatter), impedance-based methods based on the Coulter principle (changes in electrical current induced by blood cells flowing through an electrically charged opening), or a combination of both optical and impedance-based methods. Progressive improvement in these instruments has allowed the enumeration and evaluation of blood cells with great accuracy, precision, and speed at very low cost. Future directions of hematology instrumentation include the addition of new parameters and the development of point-of-care instrumentation. In the future, in-vivo analysis of blood cells may allow noninvasive and near-continuous measurements. Copyright © 2015 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Wetzel, Angela Payne
2011-01-01
Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…
ERIC Educational Resources Information Center
Pence, Brian Wells; Miller, William C.; Gaynes, Bradley N.
2009-01-01
Prevalence and validation studies rely on imperfect reference standard (RS) diagnostic instruments that can bias prevalence and test characteristic estimates. The authors illustrate 2 methods to account for RS misclassification. Latent class analysis (LCA) combines information from multiple imperfect measures of an unmeasurable latent condition to…
A new method for the assessment of the surface topography of NiTi rotary instruments.
Ferreira, F; Barbosa, I; Scelza, P; Russano, D; Neff, J; Montagnana, M; Zaccaro Scelza, M
2017-09-01
To describe a new method for the assessment of nanoscale alterations in the surface topography of nickel-titanium endodontic instruments using a high-resolution optical method and to verify the accuracy of the technique. Noncontact three-dimensional optical profilometry was used to evaluate defects on a size 25, .08 taper reciprocating instrument (WaveOne ® ), which was subjected to a cyclic fatigue test in a simulated root canal in a clear resin block. For the investigation, an original procedure was established for the analysis of similar areas located 3 mm from the tip of the instrument before and after canal preparation to enable precise, repeatable and reproducible measurements. All observations and analyses were made in areas measuring 210 × 210 μm provided by the equipment's software. The three-dimensional high-resolution image analysis showed clear alterations in the surface topography of the examined cutting blade and flute of the instrument, before and after use, with the presence of surface irregularities such as deformations, debris, grooves, cracks, steps and microcavities. Optical profilometry provided accurate qualitative nanoscale evaluation of similar surfaces before and after the fatigue test. The stability and repeatability of the technique enables a more comprehensive understanding of the effects of wear on the surface of endodontic instruments. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
[The Application of the Fault Tree Analysis Method in Medical Equipment Maintenance].
Liu, Hongbin
2015-11-01
In this paper, the traditional fault tree analysis method is presented, and its application to medical equipment maintenance is described in detail. Significant changes are made when the traditional fault tree analysis method is introduced into medical equipment maintenance: the logic symbols, the logic analysis and calculation, and the complicated procedures are given up, and only the intuitive and practical fault tree diagram is kept. The fault tree diagram itself also differs: the tree is no longer a logic tree but a thinking tree for troubleshooting, the definitions of the tree's nodes are different, and the composition of its branches is also different.
Defect propagation in NiTi rotary instruments: a noncontact optical profilometry analysis.
Barbosa, I; Ferreira, F; Scelza, P; Neff, J; Russano, D; Montagnana, M; Zaccaro Scelza, M
2018-04-10
To evaluate the presence and propagation of defects and their effects on surfaces of nickel-titanium (NiTi) instruments using noncontact, three-dimensional optical profilometry, and to assess the accuracy of this method of investigation. The flute surface areas of instruments from two commercial instrumentation systems, namely Reciproc R25 (n = 5) and WaveOne Primary (n = 5), were assessed and compared before and after performing two instrumentation cycles in simulated root canals in clear resin blocks. All the analyses were conducted on areas measuring 211 × 211 μm, located 3 mm from the tips of the instruments. A quantitative analysis was conducted before and after the first and second instrumentation cycles, using the Sa (average roughness over the measurement field), Sq (root mean square roughness) and Sz (average height over the measurement field) amplitude parameters. All the data were submitted to statistical analysis at a 5% level of significance. There was a significant increase (P = 0.007) in wear in both groups, especially between baseline and the second instrumentation cycle, with significantly higher wear values being observed on WaveOne instruments (Sz median values = 33.68 and 2.89 μm, respectively, for WO and RP groups). A significant increase in surface roughness (P = 0.016 and P = 0.008, respectively, for Sa and Sq) was observed in both groups from the first to the second instrumentation cycle, mostly in WaveOne specimens. Qualitative analysis revealed a greater number of defects on the flute topography of all the instruments after use. More defects were identified in WaveOne Primary instruments compared to Reciproc R25, irrespective of the evaluation stage. The investigation method provided an accurate, repeatable and reproducible assessment of NiTi instruments at different time-points. © 2018 International Endodontic Journal. Published by John Wiley & Sons Ltd.
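The Sa, Sq and Sz amplitude parameters reported above have standard definitions over a surface height map. A minimal sketch (Sa and Sq are computed relative to the mean plane; Sz is computed here as the maximum peak-to-valley height, one common convention that may differ from the definition used by the instrument software in the study):

```python
import numpy as np

def areal_roughness(z):
    """Amplitude parameters for a 2-D surface height map `z` (heights in
    micrometres), relative to the mean plane.

    Sa: arithmetic mean deviation; Sq: root-mean-square deviation;
    Sz: maximum peak-to-valley height.
    """
    z = np.asarray(z, dtype=float)
    dev = z - z.mean()
    sa = np.abs(dev).mean()
    sq = np.sqrt((dev ** 2).mean())
    sz = z.max() - z.min()
    return sa, sq, sz
```

Increases in Sa and Sq between instrumentation cycles, as reported above, indicate progressive surface wear over the measurement field.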
Review of analytical methods for the quantification of iodine in complex matrices.
Shelor, C Phillip; Dasgupta, Purnendu K
2011-09-19
Iodine is an essential element of human nutrition. Nearly a third of the global population has insufficient iodine intake and is at risk of developing Iodine Deficiency Disorders (IDD). Most countries have iodine supplementation and monitoring programs. Urinary iodide (UI) is the biomarker used for epidemiological studies; only a few methods are currently used routinely for analysis. These methods either require expensive instrumentation with qualified personnel (inductively coupled plasma-mass spectrometry, instrumental neutron activation analysis) or oxidative sample digestion to remove potential interferences prior to analysis by a kinetic colorimetric method originally introduced by Sandell and Kolthoff ~75 years ago. The Sandell-Kolthoff (S-K) method is based on the catalytic effect of iodide on the reaction between Ce(4+) and As(3+). No available technique fully fits the needs of developing countries; research into inexpensive, reliable methods and instrumentation is needed. There have been multiple reviews of methods used for epidemiological studies and specific techniques. However, a general review of iodine determination on a wide-ranging set of complex matrices is not available. While this review is not comprehensive, we cover the principal developments since the original development of the S-K method. Copyright © 2011 Elsevier B.V. All rights reserved.
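The S-K method is quantitative because the observed decay rate of Ce(4+) absorbance increases roughly linearly with iodide concentration, so a calibration line fitted to standards can be inverted for unknowns. A hedged sketch of that calibration step (the function names and the simple linear model are illustrative assumptions):

```python
import numpy as np

def fit_sk_calibration(iodide_std, rate_const):
    """Least-squares line relating the observed pseudo-first-order decay
    rate of Ce(4+) absorbance to iodide concentration in the standards."""
    slope, intercept = np.polyfit(iodide_std, rate_const, 1)
    return slope, intercept

def iodide_from_rate(rate, slope, intercept):
    """Invert the calibration line for an unknown sample's observed rate."""
    return (rate - intercept) / slope
```

For example, with standards at 0, 25, 50 and 100 µg/L whose rates fall on a line, an unknown's rate maps back to its concentration directly.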
Reuse of disposable laparoscopic instruments: cost analysis*
DesCôteaux, Jean-Gaston; Tye, Lucille; Poulin, Eric C.
1996-01-01
Objective To evaluate the cost benefits of reusing disposable laparoscopic instruments. Design A cost-analysis study based on a review of laparoscopic and thoracoscopic procedures performed between August 1990 and January 1994, including analysis of disposable instrument use, purchase records, and reprocessing costs for each instrument. Setting The general surgery department of a 461-bed teaching hospital where disposable laparoscopic instruments are routinely reused according to internally validated reprocessing protocols. Methods Laparoscopic and thoracoscopic interventions performed between August 1990 and January 1994 for which the number and types of disposable laparoscopic instruments were standardized. Main Outcome Measures Reprocessing cost per instrument, the savings realized by reusing disposable laparoscopic instruments and the cost-efficient number of reuses per instrument. Results The cost of reprocessing instruments varied from $2.64 (Can) to $4.66 for each disposable laparoscopic instrument. Purchases of 10 commonly reused disposable laparoscopic instruments totalled $183 279, and the total reprocessing cost was estimated at $35 665 for the study period. Not reusing disposable instruments would have cost $527 575 in instrument purchases for the same period. Disposable laparoscopic instruments were reused 1.7 to 68 times each. Conclusions Under carefully monitored conditions and strict guidelines, reuse of disposable laparoscopic and thoracoscopic instruments can be cost-effective. PMID:8769924
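The break-even logic underlying such a cost analysis is simple: each reuse replaces a purchase with a much cheaper reprocessing cycle. A sketch of that arithmetic (all names and the example figures are hypothetical, not the study's data):

```python
import math

def reuse_savings(procedures, unit_price, reprocess_cost, uses_per_item):
    """Savings from reusing a disposable instrument `uses_per_item` times
    versus buying a new one for every procedure.

    Each instrument is purchased once and reprocessed for each subsequent
    use; without reuse, every procedure requires a new purchase.
    """
    purchases = math.ceil(procedures / uses_per_item)
    cost_with_reuse = (purchases * unit_price
                       + (procedures - purchases) * reprocess_cost)
    cost_without_reuse = procedures * unit_price
    return cost_without_reuse - cost_with_reuse
```

Because the reported reprocessing cost ($2.64 to $4.66) is far below the purchase price of a disposable instrument, savings accrue from the very first reuse and grow with each additional one.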
Application of the pulsed fast/thermal neutron method for soil elemental analysis
USDA-ARS?s Scientific Manuscript database
Soil science is a research field where physics concepts and experimental methods are widely used, particularly in agro-chemistry and soil elemental analysis. Different methods of analysis are currently available. The evolution of nuclear physics (methodology and instrumentation) combined with the ava...
Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi
2017-01-01
Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
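A noise-based analytical threshold of the general kind described here is often set some multiple of the standard deviation above the mean background signal. A minimal sketch (the mean-plus-k-SD form and the multiplier are assumptions for illustration; the paper's actual noise definition may differ):

```python
import statistics

def analytical_threshold(noise_counts, k=3.0):
    """Analytical threshold set k standard deviations above the mean
    background noise level (a generic limit-of-detection-style criterion).

    `noise_counts` are read counts observed at positions/alleles known to
    contain no true signal.
    """
    mu = statistics.mean(noise_counts)
    sd = statistics.stdev(noise_counts)
    return mu + k * sd
```

Calls below the threshold would be treated as indistinguishable from background noise rather than reported as alleles.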
Monitoring Instrument Performance in Regional Broadband Seismic Network Using Ambient Seismic Noise
NASA Astrophysics Data System (ADS)
Ye, F.; Lyu, S.; Lin, J.
2017-12-01
In the past ten years, the number of seismic stations has increased significantly, and regional seismic networks with advanced technology have been gradually developed all over the world. The resulting broadband data help to improve seismological research. It is important to monitor the performance of broadband instruments in a new network over a long period of time to ensure the accuracy of seismic records. Here, we propose a method that uses ambient noise data in the period range 5-25 s to monitor instrument performance and check data quality in situ. The method is based on an analysis of amplitude and phase index parameters calculated from pairwise cross-correlations among three stations, which provides multiple references for reliable error estimates. Index parameters calculated daily during a two-year observation period are evaluated to identify stations with instrument response errors in near real time. During data processing, initial instrument responses are used in place of the available instrument responses to simulate instrument response errors, which are then used to verify our results. We also examine the feasibility of the method using noise data from USArray stations at different locations and analyze possible instrumental errors that result in time shifts, which are used to verify the method. Additionally, we show in an application that instrument response errors involving pole-zero variations, when monitoring temporal variations in crustal properties, produce apparently statistically significant velocity perturbations larger than the standard deviation. The results indicate that monitoring seismic instrument performance helps eliminate data pollution before analysis begins.
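One building block of such monitoring is the lag of the peak of the noise cross-correlation between a station pair; a drift in this lag over time can flag a timing or response problem. A minimal sketch (illustrative only; the paper's amplitude and phase index parameters are more elaborate):

```python
import numpy as np

def cc_time_shift(trace_a, trace_b):
    """Lag (in samples) of the peak of the cross-correlation between two
    station records. If `trace_a` is `trace_b` delayed by d samples, the
    returned lag is d; tracking this lag day by day against a reference
    reveals instrumental time shifts.
    """
    cc = np.correlate(trace_a, trace_b, mode="full")
    return int(np.argmax(cc)) - (len(trace_b) - 1)
```

In practice the traces would be band-passed to the 5-25 s period range and the daily lags compared across the three-station combinations described above.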
Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.
2014-01-01
Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is ongoing. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and in the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods - combinations of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods for performing cholecystectomy, with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. 
The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811
NASA Technical Reports Server (NTRS)
Mueller, J. L. (Editor); Fargion, Giuletta S. (Editor); McClain, Charles R. (Editor); Pegau, Scott; Zaneveld, J. Ronald V.; Mitchell, B. Gregg; Kahru, Mati; Wieland, John; Stramska, Malgorzat
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.
Instrumental variables and Mendelian randomization with invalid instruments
NASA Astrophysics Data System (ADS)
Kang, Hyunseung
Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or an intervention on an outcome of interest. The IV method relies on having a valid instrument, a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially those that satisfy (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are only identified as long as less than 50% of instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present both in simulation studies as well as in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. 
We propose an estimator along with inferential results that are robust to mis-specifications of the covariate-outcome model. We also provide a sensitivity analysis should the instrument turn out to be invalid, specifically violate (A3). Fourth, in application work, we study the causal effect of malaria on stunting among children in Ghana. Previous studies on the effect of malaria and stunting were observational and contained various unobserved confounders, most notably nutritional deficiencies. To infer causality, we use the sickle cell genotype, a trait that confers some protection against malaria and was randomly assigned at birth, as an IV and apply our nonparametric IV method. We find that the risk of stunting increases by 0.22 (95% CI: 0.044,1) for every malaria episode and is sensitive to unmeasured confounders.
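With a single valid instrument, the canonical IV estimate is the Wald ratio, equivalently two-stage least squares. A minimal sketch (illustrative background only; sisVIVE and the matching-based estimator described in the dissertation are considerably more involved):

```python
import numpy as np

def two_stage_least_squares(z, d, y):
    """Classic IV (Wald/2SLS) estimate of the effect of treatment `d` on
    outcome `y` using a single instrument `z`.

    Stage 1 projects the treatment on the instrument; stage 2 regresses the
    outcome on the fitted treatment. With one instrument this reduces to
    cov(z, y) / cov(z, d). Valid only under assumptions (A1)-(A3) above.
    """
    z, d, y = (np.asarray(v, dtype=float) for v in (z, d, y))
    return np.cov(z, y)[0, 1] / np.cov(z, d)[0, 1]
```

When (A2) or (A3) fails, this ratio is biased, which is exactly the setting the sisVIVE penalized estimator and the robust confidence intervals above are designed to handle.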
Van Berkel, Gary J.; Kertesz, Vilmos
2011-08-09
A system and method utilizes an image analysis approach for controlling the collection instrument-to-surface distance in a sampling system for use, for example, with mass spectrometric detection. Such an approach involves the capturing of an image of the collection instrument or the shadow thereof cast across the surface and the utilization of line average brightness (LAB) techniques to determine the actual distance between the collection instrument and the surface. The actual distance is subsequently compared to a target distance for re-optimization, as necessary, of the collection instrument-to-surface during an automated surface sampling operation.
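The line-average-brightness idea can be illustrated by averaging pixel intensities per image row and locating where the brightness profile crosses a threshold, as a proxy for the shadow edge of the collection instrument. A toy sketch (the patent's actual LAB-to-distance mapping is not reproduced here; function name and threshold are assumptions):

```python
import numpy as np

def shadow_edge_row(image, threshold):
    """Row index where line-average brightness (LAB) first falls below
    `threshold` -- a simple way to locate a dark shadow edge in a grayscale
    image. Returns None if no row is that dark.
    """
    lab = np.asarray(image, dtype=float).mean(axis=1)  # brightness per row
    dark = np.flatnonzero(lab < threshold)
    return int(dark[0]) if dark.size else None
```

In a closed-loop system, the located edge position would be converted to a distance estimate and compared against the target distance to drive re-optimization.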
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby
2017-01-01
Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
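The sub-model scheme can be sketched as follows: a full-range model makes a first-pass composition estimate, which routes each spectrum to a low- or high-range sub-model, with a linear cross-fade where the training ranges overlap. A toy illustration using ordinary least squares in place of PLS (the range boundaries and all names are assumptions):

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with intercept (stand-in for PLS)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

def blended_predict(X, full, low, high, low_range, high_range):
    """Sub-model blending: the full-range model gives a first-pass value;
    final predictions come from the low or high sub-model, linearly
    cross-faded where their training ranges overlap."""
    first = predict(full, X)
    lo_top, hi_bot = low_range[1], high_range[0]
    out = np.empty_like(first)
    for i, c in enumerate(first):
        if c <= hi_bot:
            out[i] = predict(low, X[i:i + 1])[0]
        elif c >= lo_top:
            out[i] = predict(high, X[i:i + 1])[0]
        else:  # overlap region: weight by position between the bounds
            w = (c - hi_bot) / (lo_top - hi_bot)
            out[i] = ((1 - w) * predict(low, X[i:i + 1])[0]
                      + w * predict(high, X[i:i + 1])[0])
    return out
```

Because each sub-model only has to fit a restricted composition range, matrix effects from co-varying elements are easier to absorb than in a single full-range model.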
Instrumentation for motor-current signature analysis using synchronous sampling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castleberry, K.N.
1996-07-01
Personnel in the Instrumentation and Controls Division at Oak Ridge National Laboratory, in association with the United States Enrichment Corporation, the U.S. Navy, and various Department of Energy sponsors, have been involved in the development and application of motor-current signature analysis for several years. In that time, innovation in the field has resulted in major improvements in signal processing, analysis, and system performance and capabilities. Recent work has concentrated on industrial implementation of one of the most promising new techniques. This report describes the developed method and the instrumentation package that is being used to investigate and develop potential applications.
Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.
2014-01-01
The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416
Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.
Pizer, Steven D
2016-04-01
To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.
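A typical falsification test checks that the instrument is unassociated with the outcome in a subsample where it cannot plausibly act through treatment; a large association falsifies the exclusion restriction. A minimal sketch of that check (illustrative only, not the paper's STATA code; the subsample construction is assumed):

```python
import numpy as np

def falsification_z(instrument, outcome):
    """z statistic for the slope of outcome on instrument (OLS with
    intercept) in a subsample where the instrument cannot affect treatment.

    |z| well above ~2 suggests a direct instrument-outcome path, i.e. an
    invalid instrument. The subsample must have residual variation.
    """
    z, y = np.asarray(instrument, float), np.asarray(outcome, float)
    A = np.column_stack([np.ones_like(z), z])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    s2 = (resid @ resid) / (len(y) - 2)          # residual variance
    var_slope = s2 / ((z - z.mean()) ** 2).sum()  # OLS slope variance
    return coef[1] / np.sqrt(var_slope)
```

In the paper's application, prescribing-pattern instruments would pass this test only if they show no residual association with mortality or hospitalization where no treatment pathway exists.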
Spectroscopic Chemical Analysis Methods and Apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Bhartia, Rohit (Inventor); Reid, Ray D. (Inventor)
2017-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Spectroscopic Chemical Analysis Methods and Apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)
2018-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Westgard, Sten A
2016-06-01
To assess the analytical performance of instruments and methods through external quality assessment and proficiency testing data on the Sigma scale. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal. Then Sigma-metrics were calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of the chemistry analytes that are expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers in three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities. Quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself. It is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult this data as part of their decision-making process. To confirm that these assessments are stable and reliable, a longer term study should be conducted that examines more results over a longer time period. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
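The Sigma-metric computation behind these rankings is not spelled out in the abstract, but the standard formula combines the allowable total error (TEa), bias, and imprecision (CV), all on the percent scale. A minimal sketch, with illustrative glucose numbers rather than values from the study:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric on the percent scale: how many SDs of imprecision
    fit between the observed bias and the allowable total error (TEa)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative glucose example: CLIA TEa = 10%, assumed bias 1%, group CV 1.5%
sigma = sigma_metric(10.0, 1.0, 1.5)
print(round(sigma, 1))  # 6.0 -> "Six Sigma" performance
```

A method at or above Five Sigma is where, as the abstract notes, optimized (less intensive) QC designs can be justified.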
Pre-Steady-State Kinetic Analysis of Single-Nucleotide Incorporation by DNA Polymerases
Su, Yan; Guengerich, F. Peter
2016-01-01
Pre-steady-state kinetic analysis is a powerful and widely used method to obtain multiple kinetic parameters. This protocol provides a step-by-step procedure for pre-steady-state kinetic analysis of single-nucleotide incorporation by a DNA polymerase. It describes the experimental details of DNA substrate annealing, reaction mixture preparation, handling of the RQF-3 rapid quench-flow instrument, denaturing polyacrylamide DNA gel preparation, electrophoresis, quantitation, and data analysis. The core and unique part of this protocol is the rationale for preparation of the reaction mixture (the ratio of the polymerase to the DNA substrate) and methods for conducting pre-steady-state assays on an RQF-3 rapid quench-flow instrument, as well as data interpretation after analysis. In addition, the methods for the DNA substrate annealing and DNA polyacrylamide gel preparation, electrophoresis, quantitation and analysis are suitable for use in other studies. PMID:27248785
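For readers new to pre-steady-state kinetics, the quantities extracted from a rapid quench-flow time course are typically the burst amplitude and the observed single-turnover rate, with the dNTP dependence of k_obs yielding k_pol and K_d. A hedged sketch of the two standard equations (parameter values are illustrative, not taken from the protocol):

```python
import math

def burst(t, amplitude, k_obs, k_ss):
    """Burst equation for pre-steady-state polymerase assays:
    product(t) = A*(1 - exp(-k_obs*t)) + k_ss*t, where A is the burst
    amplitude and k_ss the slower steady-state rate."""
    return amplitude * (1.0 - math.exp(-k_obs * t)) + k_ss * t

def k_obs(k_pol, kd, dntp):
    """Observed single-turnover rate at a given dNTP concentration:
    k_obs = k_pol*[dNTP] / (Kd + [dNTP])."""
    return k_pol * dntp / (kd + dntp)

# illustrative parameters: k_pol = 100 /s, Kd = 20 uM, [dNTP] = 20 uM
print(round(k_obs(100.0, 20.0, 20.0), 1))  # 50.0 (half-maximal at [dNTP] = Kd)
```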
Analysis of a spacecraft instrument ball bearing assembly lubricated by a perfluoroalkylether
NASA Technical Reports Server (NTRS)
Morales, W.; Jones, W. R., Jr.; Buckley, D. H.
1986-01-01
An analysis of a spacecraft instrument ball bearing assembly, subjected to a scanning life test, was performed to determine the possible cause of rotational problems involving these units aboard several satellites. The analysis indicated an ineffective transfer of a fluorinated liquid lubricant from a phenolic retainer to the bearing balls. Part of the analysis led to a novel HPLC separation method employing a fluorinated mobile phase in conjunction with silica-based size-exclusion columns.
Rôças, I N; Lima, K C; Siqueira, J F
2013-07-01
To compare the antibacterial efficacy of two instrumentation techniques, one using hand nickel-titanium (NiTi) instruments and the other using rotary NiTi instruments, in root canals of teeth with apical periodontitis. Root canals from single-rooted teeth were instrumented using either hand NiTi instruments in the alternated rotation motion technique or rotary BioRaCe instruments. The irrigant used in both groups was 2.5% NaOCl. DNA extracts from samples taken before and after instrumentation were subjected to quantitative analysis by real-time polymerase chain reaction (qPCR). Qualitative analysis was also performed using presence/absence data from culture and qPCR assays. Bacteria were detected in all S1 samples by both methods. In culture analysis, 45% and 35% of the canals were still positive for bacterial presence after hand and rotary NiTi instrumentation, respectively (P > 0.05). Rotary NiTi instrumentation resulted in significantly fewer qPCR-positive cases (60%) than hand NiTi instrumentation (95%) (P = 0.01). Intergroup comparison of quantitative data showed no significant difference between the two techniques. There was no significant difference in bacterial reduction in infected canals after instrumentation using hand or rotary NiTi instruments. In terms of incidence of positive results for bacteria, culture also showed no significant differences between the groups, but the rotary NiTi instrumentation resulted in more negative results in the more sensitive qPCR analysis. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Protocol Analysis as a Tool in Function and Task Analysis
1999-10-01
Autocontingency. The use of log-linear and logistic regression methods to analyse sequential data seems appealing, and is strongly advocated by ... collection and analysis of observational data. Behavior Research Methods, Instruments, and Computers, 23(3), 415-429. Patrick, J. D. (1991). Snob: A
Contribution to interplay between a delamination test and a sensory analysis of mid-range lipsticks.
Richard, C; Tillé-Salmon, B; Mofid, Y
2016-02-01
Lipstick is currently one of the best-selling products of the cosmetics industry, and the competition between the various manufacturers is significant. Customers mainly seek products with high spreadability, especially long-lasting or long-wear on the lips. Evaluation tests of cosmetics are usually performed by sensory analysis, which can represent a considerable cost. The object of this study was to develop a fast and simple delamination test (an objective method with calibrated instruments) and to compare the obtained results with those of a discriminative sensory analysis (a subjective method) in order to show the relevance of the instrumental test. Three mid-range lipsticks were randomly chosen and tested. Their compositions were as described by the International Nomenclature of Cosmetic Ingredients (INCI). Instrumental characterization was performed by texture profile analysis and by a special delamination test. The sensory analysis was deliberately conducted with an untrained panel, as a blind test, to confirm or refute the possible interplay. The two approaches gave the same type of classification. The high-fat lipstick had the worst behaviour in the delamination test and the worst rating of descriptor intensity in the sensory analysis. There is a high correlation between the sensory analysis and the instrumental measurements in this study. The delamination test carried out should make it possible to quickly determine lasting properties (a screening test) and consequently to optimize the basic formula of lipsticks. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
2013-01-01
Background Measuring team factors in evaluations of Continuous Quality Improvement (CQI) may provide important information for enhancing CQI processes and outcomes; however, the large number of potentially relevant factors and associated measurement instruments makes inclusion of such measures challenging. This review aims to provide guidance on the selection of instruments for measuring team-level factors by systematically collating, categorizing, and reviewing quantitative self-report instruments. Methods Data sources: We searched MEDLINE, PsycINFO, and Health and Psychosocial Instruments; reference lists of systematic reviews; and citations and references of the main report of instruments. Study selection: To determine the scope of the review, we developed and used a conceptual framework designed to capture factors relevant to evaluating CQI in primary care (the InQuIRe framework). We included papers reporting development or use of an instrument measuring factors relevant to teamwork. Data extracted included instrument purpose; theoretical basis, constructs measured and definitions; development methods and assessment of measurement properties. Analysis and synthesis: We used qualitative analysis of instrument content and our initial framework to develop a taxonomy for summarizing and comparing instruments. Instrument content was categorized using the taxonomy, illustrating coverage of the InQuIRe framework. Methods of development and evidence of measurement properties were reviewed for instruments with potential for use in primary care. Results We identified 192 potentially relevant instruments, 170 of which were analyzed to develop the taxonomy. 
Eighty-one instruments measured constructs relevant to CQI teams in primary care, with content covering teamwork context (45 instruments measured enabling conditions or attitudes to teamwork), team process (57 instruments measured teamwork behaviors), and team outcomes (59 instruments measured perceptions of the team or its effectiveness). Forty instruments were included for full review, many with a strong theoretical basis. Evidence supporting measurement properties was limited. Conclusions Existing instruments cover many of the factors hypothesized to contribute to QI success. With further testing, use of these instruments measuring team factors in evaluations could aid our understanding of the influence of teamwork on CQI outcomes. Greater consistency in the factors measured and choice of measurement instruments is required to enable synthesis of findings for informing policy and practice. PMID:23410500
Ion mobility analysis of lipoproteins
Benner, W Henry [Danville, CA; Krauss, Ronald M [Berkeley, CA; Blanche, Patricia J [Berkeley, CA
2007-08-21
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
NASA Technical Reports Server (NTRS)
Chipera, S. J.; Vaniman, D. T.; Bish, D. L.; Sarrazin, P.; Feldman, S.; Blake, D. F.; Bearman, G.; Bar-Cohen, Y.
2004-01-01
A miniature XRD/XRF (X-ray diffraction / X-ray fluorescence) instrument, CHEMIN, is currently being developed for definitive mineralogic analysis of soils and rocks on Mars. One of the technical issues that must be addressed to enable remote XRD analysis is how best to obtain a representative sample powder for analysis. For powder XRD analyses, it is beneficial to have a fine-grained sample to reduce preferred orientation effects and to provide a statistically significant number of crystallites to the X-ray beam. Although a two-dimensional detector as used in the CHEMIN instrument will produce good results even with poorly prepared powder, the quality of the data will improve and the time required for data collection will be reduced if the sample is fine-grained and randomly oriented. A variety of methods have been proposed for XRD sample preparation. Chipera et al. presented grain size distributions and XRD results from powders generated with an Ultrasonic/Sonic Driller/Corer (USDC) currently being developed at JPL. The USDC was shown to be an effective instrument for sampling rock to produce powder suitable for XRD. In this paper, we compare powder prepared using the USDC with powder obtained with a miniaturized rock crusher developed at JPL and with powder obtained with a rotary tungsten carbide bit to powders obtained from a laboratory bench-scale Retsch mill (provides benchmark mineralogical data). These comparisons will allow assessment of the suitability of these methods for analysis by an XRD/XRF instrument such as CHEMIN.
Cho, Yunju; Ahmed, Arif; Islam, Annana; Kim, Sunghwan
2015-01-01
Because of the increasing importance of heavy and unconventional crude oil as an energy source, there is a growing need for petroleomics: the pursuit of more complete and detailed knowledge of the chemical compositions of crude oil. Crude oil has an extremely complex nature; hence, techniques with ultra-high resolving capabilities, such as Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), are necessary. FT-ICR MS has been successfully applied to the study of heavy and unconventional crude oils such as bitumen and shale oil. However, the analysis of crude oil with FT-ICR MS is not trivial, and it has pushed analysis to the limits of instrumental and methodological capabilities. For example, high-resolution mass spectra of crude oils may contain over 100,000 peaks that require interpretation. To visualize large data sets more effectively, data processing methods such as Kendrick mass defect analysis and statistical analyses have been developed. The successful application of FT-ICR MS to the study of crude oil has been critically dependent on key developments in FT-ICR MS instrumentation and data processing methods. This review offers an introduction to the basic principles, FT-ICR MS instrumentation development, ionization techniques, and data interpretation methods for petroleomics and is intended for readers having no prior experience in this field of study. © 2014 Wiley Periodicals, Inc.
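Kendrick mass defect analysis, mentioned above as a data-reduction aid for spectra with over 100,000 peaks, is a short computation: masses are rescaled so that a CH2 repeat unit has exactly integer mass, after which members of a homologous series (differing only in CH2 count) share the same defect. A minimal sketch; the example mass is illustrative, and some conventions use a ceiling rather than rounding for the nominal Kendrick mass:

```python
CH2 = 14.01565  # exact monoisotopic mass of a CH2 repeat unit

def kendrick_mass_defect(iupac_mass):
    """CH2-based Kendrick mass defect: rescale so CH2 has mass exactly 14,
    then take the distance to the nearest integer (nominal) mass."""
    kendrick_mass = iupac_mass * (14.0 / CH2)
    nominal = round(kendrick_mass)
    return nominal - kendrick_mass

# Two members of an alkyl homologous series differ by an exact CH2 mass
# and therefore share (essentially) the same Kendrick mass defect.
m1 = 310.2661  # illustrative monoisotopic mass
m2 = m1 + CH2
print(abs(kendrick_mass_defect(m1) - kendrick_mass_defect(m2)) < 1e-9)  # True
```

Plotting KMD against nominal Kendrick mass turns each homologous series into a horizontal row of points, which is what makes the method effective for visualizing crude-oil spectra.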
Causal instrument corrections for short-period and broadband seismometers
Haney, Matthew M.; Power, John; West, Michael; Michaels, Paul
2012-01-01
Of all the filters applied to recordings of seismic waves, which include source, path, and site effects, the one we know most precisely is the instrument filter. Therefore, it behooves seismologists to accurately remove the effect of the instrument from raw seismograms. Applying instrument corrections allows analysis of the seismogram in terms of physical units (e.g., displacement or particle velocity of the Earth’s surface) instead of the output of the instrument (e.g., digital counts). The instrument correction can be considered the most fundamental processing step in seismology since it relates the raw data to an observable quantity of interest to seismologists. Complicating matters is the fact that, in practice, the term “instrument correction” refers to more than simply the seismometer. The instrument correction compensates for the complete recording system including the seismometer, telemetry, digitizer, and any anti‐alias filters. Knowledge of all these components is necessary to perform an accurate instrument correction. The subject of instrument corrections has been covered extensively in the literature (Seidl, 1980; Scherbaum, 1996). However, the prospect of applying instrument corrections still evokes angst among many seismologists—the authors of this paper included. There may be several reasons for this. For instance, the seminal paper by Seidl (1980) exists in a journal that is not currently available in electronic format and cannot be accessed online. Also, a standard method for applying instrument corrections involves the programs TRANSFER and EVALRESP in the Seismic Analysis Code (SAC) package (Goldstein et al., 2003). The exact mathematical methods implemented in these codes are not thoroughly described in the documentation accompanying SAC.
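The correction itself is typically a frequency-domain deconvolution: divide the data spectrum by the complex response of the full recording chain, regularized (e.g., with a water level) where the response is small. The sketch below is a generic illustration of that idea under stated assumptions, not the exact algorithm of SAC's TRANSFER program:

```python
import numpy as np

def remove_response(raw, resp_fft, water_level=0.01):
    """Deconvolve an instrument response in the frequency domain.
    resp_fft: complex response of the full recording chain sampled at the
    same rfft frequencies as the data; water_level clips small |response|
    values to avoid dividing by ~0 (spectral holes)."""
    spec = np.fft.rfft(raw)
    gamma = water_level * np.max(np.abs(resp_fft))
    # replace near-zero response bins with a small value of the same phase
    safe = np.where(np.abs(resp_fft) < gamma,
                    gamma * np.exp(1j * np.angle(resp_fft)), resp_fft)
    return np.fft.irfft(spec / safe, n=len(raw))

# sanity check: deconvolving a flat (all-ones) response returns the input
x = np.sin(np.linspace(0, 10, 256))
resp = np.ones(129, dtype=complex)   # rfft bin count for n=256
y = remove_response(x, resp)
print(np.allclose(x, y))  # True
```

In practice the response would be evaluated from the poles, zeros, and gains of the seismometer, digitizer, and anti-alias filters, which is exactly the bookkeeping the article says causes seismologists angst.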
Kasparaviciene, Giedre; Savickas, Arunas; Kalveniene, Zenona; Velziene, Saule; Kubiliene, Loreta; Bernatoniene, Jurga
2016-01-01
The aim of this study was to optimize a lipstick formulation according to its physical properties and sensory attributes, to investigate the relationship between instrumental and sensory analyses, and to evaluate the influence of the main ingredients, beeswax and oil, on lipstick properties. A central composite design was used to optimize the mixture of oils, beeswax, and cocoa butter for the formulation of lipsticks. Antioxidant activity was evaluated spectrophotometrically by the DPPH free radical scavenging method. The melting point of the lipsticks was determined in a glass tube; hardness was investigated with a texture analyzer. Sensory analysis was performed with untrained volunteers. The optimized mixture of sea buckthorn oil and grapeseed oil at a ratio of 13.96 : 6.18 showed the highest antioxidative activity (70 ± 0.84%) and was chosen for the lipstick formulation. According to the sensory and instrumental analysis results, optimal ingredient amounts for the lipstick were calculated: 57.67% mixture of oils, 19.58% beeswax, and 22.75% cocoa butter. The experimentally designed and optimized lipstick formulation had good physical properties and a highly scored sensory evaluation. Correlation analysis showed a significant relationship between sensory and instrumental evaluations.
Vo, Evanly; Zhuang, Ziqing; Birch, Eileen; Birch, Quinn
2016-01-01
The aim of this study was to apply a direct-reading aerosol instrument method and an elemental carbon (EC) analysis method to measure the mass-based penetration of single-walled carbon nanotubes (SWCNTs) and multi-walled carbon nanotubes (MWCNTs) through elastomeric half-mask respirators (EHRs) and filtering facepiece respirators (FFRs). For the direct-reading aerosol instrument method, two scanning mobility particle sizer/aerodynamic particle sizer systems were used to simultaneously determine the upstream (outside respirator) and downstream (inside respirator) test aerosols. For the EC analysis method, upstream and downstream CNTs were collected on filter cassettes and then analyzed using a thermal-optical technique. CNT mass penetrations were found in both methods to be within the associated efficiency requirements for each type and class of the respirator models that were tested. Generally, the penetrations of SWCNTs and MWCNTs had a similar trend with penetration being the highest for the N95 EHRs, followed by N95 FFRs, P100 EHRs, and P100 FFRs. This trend held true for both methods; however, the CNT penetration determined by the direct-reading aerosol instrument method (0.009-1.09% for SWCNTs and 0.005-0.21% for MWCNTs) was greater relative to the penetration values found through EC analysis method (0.007-0.69% for SWCNTs and 0.004-0.13% for MWCNTs). The results of this study illustrate considerations for how the methods can be used to evaluate penetration of morphologically complex materials through FFRs and EHRs.
Extending the frontiers of mass spectrometric instrumentation and methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schieffer, Gregg Martin
2010-01-01
The focus of this dissertation is two-fold: developing novel analysis methods using mass spectrometry, and the implementation and characterization of novel ion mobility mass spectrometry instrumentation. The novel instrumentation couples an ion trap for ion/ion reactions to an ion mobility cell. The long-term goal of this instrumentation is to use ion/ion reactions to probe the structure of gas-phase biomolecule ions. The three ion source - ion trap - ion mobility - qTOF mass spectrometer (IT-IM-TOF MS) instrument is described. The analysis of the degradation products in coal (Chapter 2) and the imaging of plant metabolites (Appendix III) fall under the methods-development category. These projects use existing commercial instrumentation (JEOL AccuTOF MS and Thermo Finnigan LCQ IT, respectively) for the mass analysis of the degraded coal products and the plant metabolites. The coal degradation paper discusses the use of the DART ion source for fast and easy sample analysis. The sample preparation consisted of a simple 50-fold dilution of the soluble coal products in water and placing the liquid in front of the heated gas stream. This is the first time the DART ion source has been used for analysis of coal. Steven Raders, under the guidance of John Verkade, came up with the coal degradation projects. Raders performed the coal degradation reactions, worked up the products, and sent them to me. Gregg Schieffer developed the method and wrote the paper demonstrating the use of the DART ion source for fast and easy sample analysis. The plant metabolite imaging project extends the use of colloidal graphite as a sample coating for atmospheric-pressure LDI. DC Perdian and I worked closely together to make this project work. Perdian focused on building the LDI setup, whereas Schieffer focused on the MSn analysis of the metabolites. Both Perdian and I took the data featured in the paper.
Perdian was the primary writer of the paper and used it as a chapter in his dissertation. Perdian and Schieffer worked together to address the revisions and publish it in Rapid Communications in Mass Spectrometry.
Interpreting findings from Mendelian randomization using the MR-Egger method.
Burgess, Stephen; Thompson, Simon G
2017-05-01
Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
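Concretely, MR-Egger is a weighted linear regression of the variant-outcome associations on the variant-exposure associations that keeps the intercept: the slope is the causal-effect estimate under InSIDE, and a nonzero intercept signals directional pleiotropy. A minimal sketch on noiseless toy data (real analyses would also report standard errors and orient variants to positive exposure effects, both omitted here):

```python
import numpy as np

def mr_egger(beta_exp, beta_out, se_out):
    """MR-Egger: regress variant-outcome on variant-exposure associations,
    weighted by 1/se_out**2, WITH an intercept. Returns (intercept, slope);
    the slope estimates the causal effect under the InSIDE assumption."""
    w = 1.0 / np.asarray(se_out) ** 2
    X = np.column_stack([np.ones_like(beta_exp), beta_exp])
    XtW = X.T * w                       # weighted design matrix, transposed
    return np.linalg.solve(XtW @ X, XtW @ beta_out)

# toy data: 20 variants, directional pleiotropy 0.1, true causal effect 0.5
bx = np.linspace(0.05, 0.2, 20)         # variant-exposure associations
se = np.full(20, 0.01)                  # outcome-association standard errors
by = 0.1 + 0.5 * bx                     # noiseless variant-outcome associations
a, b = mr_egger(bx, by, se)
print(round(a, 3), round(b, 3))  # 0.1 0.5
```

Conventional inverse-variance-weighted Mendelian randomization is the same regression with the intercept forced to zero, which is why the two methods diverge exactly when directional pleiotropy is present.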
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens
We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “submodel” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
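The sub-model idea can be sketched with ordinary least squares standing in for the PLS models the paper actually uses; the linear blending zone below is an illustrative choice under stated assumptions, not ChemCam's published blending scheme:

```python
import numpy as np

def fit(X, y):
    """Least-squares regression with intercept (stand-in for PLS)."""
    A = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

def submodel_predict(X, full, low, high, split=50.0, width=10.0):
    """Blend range-restricted sub-models using the full-range estimate:
    pure 'low' model below the transition zone, pure 'high' above it,
    and a linear mix inside the zone centered on `split` wt%."""
    ref = predict(full, X)                       # full-model first guess
    w = np.clip((ref - (split - width / 2)) / width, 0.0, 1.0)
    return (1 - w) * predict(low, X) + w * predict(high, X)

# toy problem: 3-channel "spectra" linear in composition (0-100 wt%)
comp = np.linspace(0.0, 100.0, 40)
X = np.outer(comp, [0.01, 0.02, 0.005])
full = fit(X, comp)
low = fit(X[comp < 50], comp[comp < 50])         # low-composition sub-model
high = fit(X[comp >= 50], comp[comp >= 50])      # high-composition sub-model
pred = submodel_predict(X, full, low, high)
print(np.allclose(pred, comp))  # True (the toy problem is exactly linear)
```

The benefit appears on real, nonlinear data, where each sub-model only has to be locally accurate within its composition range.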
Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins
Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.
2010-12-14
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
The Analysis of Organizational Diagnosis Based on the Six Box Model in Universities
ERIC Educational Resources Information Center
Hamid, Rahimi; Siadat, Sayyed Ali; Reza, Hoveida; Arash, Shahin; Ali, Nasrabadi Hasan; Azizollah, Arbabisarjou
2011-01-01
Purpose: The analysis of organizational diagnosis based on the six box model at universities. Research method: The research method was a descriptive survey. The statistical population consisted of 1544 faculty members of universities, from which 218 persons were chosen as the sample through stratified random sampling. The research instrument was an organizational…
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale for texture is proposed, based on multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for a texture attribute (hardness) with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R^2 = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
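The reference to Stevens's theory can be made concrete: Stevens's power law S = k * I^n becomes linear in log-log space, so k and n fall out of a simple regression of sensory scores on instrumental values. A sketch on noiseless toy hardness data (the k and n values are illustrative, not from the study):

```python
import numpy as np

def stevens_fit(instrumental, sensory):
    """Fit Stevens's power law S = k * I**n by linear regression in
    log-log space; returns (k, n)."""
    logI, logS = np.log(instrumental), np.log(sensory)
    n, logk = np.polyfit(logI, logS, 1)   # slope = exponent n
    return np.exp(logk), n

# toy hardness data generated with k = 2, n = 0.6
I = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # instrumental hardness values
S = 2.0 * I ** 0.6                          # corresponding sensory scores
k, n = stevens_fit(I, S)
print(round(k, 2), round(n, 2))  # 2.0 0.6
```

On real panel data the fit would of course be noisy, and the R^2 of this log-log regression is the kind of statistic the abstract reports (0.9765).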
The instruments of higher order thinking skills
NASA Astrophysics Data System (ADS)
Ahmad, S.; Prahmana, R. C. I.; Kenedi, A. K.; Helsa, Y.; Arianil, Y.; Zainil, M.
2017-12-01
This research developed a standard instrument for measuring the Higher Order Thinking Skills (HOTS) of PGSD students. The research method used is developmental research with eight steps: theoretical study, operational definition, construct designation, dimensions and indicators, preparation of the instrument blueprint, item writing, analysis of legibility and social desirability, field trials, and data analysis. In accordance with the type of data to be obtained in this study, the research instruments were a validation sheet, implementation observations, and a questionnaire. The results show that the instruments are valid and feasible for use according to experts and have been tested on PGSD students, with 60% of the PGSD students falling into the low category.
Tunable lasers and their application in analytical chemistry
NASA Technical Reports Server (NTRS)
Steinfeld, J. I.
1975-01-01
The impact that laser techniques might have in chemical analysis is examined. Absorption, scattering, and heterodyne detection are considered. Particular emphasis is placed on the advantages of using frequency-tunable sources, and dye solution lasers are regarded as the outstanding example of this type of laser. Types of spectroscopy that can be carried out with lasers are discussed along with the ultimate sensitivity or minimum detectable concentration of molecules that can be achieved with each method. Analytical applications include laser microprobe analysis, remote sensing, and instrumental methods such as laser-Raman spectroscopy, atomic absorption/fluorescence spectrometry, fluorescence assay techniques, optoacoustic spectroscopy, and polarization measurements. The application of lasers to spectroscopic methods of analysis would seem to be a rewarding field both for research in analytical chemistry and for investments in instrument manufacturing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types, from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental, hazardous waste, and coal samples.
A review of promising new immunoassay technology for monitoring forest herbicides
Charles K. McMahon
1993-01-01
Rising costs of classical instrumental methods of chemical analysis, coupled with an increasing need for environmental monitoring, have led to the development of highly sensitive, low-cost immunochemical methods of analysis for the detection of environmental contaminants. These methods, known simply as immunoassays, are chemical assays that use antibodies as reagents. A...
Method Development for Analysis of Aspirin Tablets.
ERIC Educational Resources Information Center
Street, Kenneth W., Jr.
1988-01-01
Develops a laboratory experiment for introductory instrumental analysis that requires interference studies and optimization of conditions. Notes that the aspirin is analyzed by a visible spectrophotometric assay. Gives experimental details and discussion. (MVL)
Novel spinal instrumentation to enhance osteogenesis and fusion: a preliminary study.
MacEwan, Matthew R; Talcott, Michael R; Moran, Daniel W; Leuthardt, Eric C
2016-09-01
OBJECTIVE Instrumented spinal fusion continues to exhibit high failure rates in patients undergoing multilevel lumbar fusion or pseudarthrosis revision; with Grade II or higher spondylolisthesis; or in those possessing risk factors such as obesity, tobacco use, or metabolic disorders. Direct current (DC) electrical stimulation of bone growth represents a unique surgical adjunct in vertebral fusion procedures, yet existing spinal fusion stimulators are not optimized to enhance interbody fusion. To develop an advanced method of applying DC electrical stimulation to promote interbody fusion, a novel osteogenic spinal system capable of routing DC through rigid instrumentation and into the vertebral bodies was fabricated. A pilot study was designed to assess the feasibility of osteogenic instrumentation and compare the ability of osteogenic instrumentation to promote successful interbody fusion in vivo to standard spinal instrumentation with autograft. METHODS Instrumented, single-level, posterior lumbar interbody fusion (PLIF) with autologous graft was performed at L4-5 in adult Toggenburg/Alpine goats, using both osteogenic spinal instrumentation (plus electrical stimulation) and standard spinal instrumentation (no electrical stimulation). At terminal time points (3 months, 6 months), animals were killed and lumbar spines were explanted for radiographic analysis using a SOMATOM Dual Source Definition CT Scanner and high-resolution Microcat II CT Scanner. Trabecular continuity, radiodensity within the fusion mass, and regional bone formation were examined to determine successful spinal fusion. RESULTS Quantitative analysis of average bone density in pedicle screw beds confirmed that electroactive pedicle screws used in the osteogenic spinal system focally enhanced bone density in instrumented vertebral bodies. 
Qualitative and quantitative analysis of high-resolution CT scans of explanted lumbar spines further demonstrated that the osteogenic spinal system induced solid bony fusion across the L4-5 disc space as early as 6 weeks postoperatively. In comparison, inactive spinal instrumentation with autograft was unable to promote successful interbody fusion by 6 months postoperatively. CONCLUSIONS Results of this study demonstrate that novel osteogenic spinal instrumentation supports interbody fusion through the focal delivery of DC electrical stimulation. With further technical development and scientific/clinical validation, osteogenic spinal instrumentation may offer a unique alternative to biological scaffolds and pharmaceutical adjuncts used in spinal fusion procedures.
NASA Astrophysics Data System (ADS)
Platonov, I. A.; Kolesnichenko, I. N.; Lange, P. K.
2018-05-01
In this paper, a chromatography desorption method is considered for obtaining gas mixtures of known composition that remain stable for a time sufficient to calibrate analytical instruments. Comparative results on the preparation accuracy of gas mixtures of volatile organic compounds using diffusion, polyabarbotage, and chromatography desorption methods are presented. It is shown that chromatography desorption devices allow one to obtain gas mixtures that are stable for 10-60 hours under dynamic conditions. These gas mixtures contain volatile aliphatic and aromatic hydrocarbons with a concentration error of no more than 7%. It is shown that such gas mixtures are well suited for the calibration of analytical instruments (chromatographs, spectrophotometers, etc.).
Nanopore sequencing in microgravity
McIntyre, Alexa B R; Rizzardi, Lindsay; Yu, Angela M; Alexander, Noah; Rosen, Gail L; Botkin, Douglas J; Stahl, Sarah E; John, Kristen K; Castro-Wallace, Sarah L; McGrath, Ken; Burton, Aaron S; Feinberg, Andrew P; Mason, Christopher E
2016-01-01
Rapid DNA sequencing and analysis has been a long-sought goal in remote research and point-of-care medicine. In microgravity, DNA sequencing can facilitate novel astrobiological research and close monitoring of crew health, but spaceflight places stringent restrictions on the mass and volume of instruments, crew operation time, and instrument functionality. The recent emergence of portable, nanopore-based tools with streamlined sample preparation protocols finally enables DNA sequencing on missions in microgravity. As a first step toward sequencing in space and aboard the International Space Station (ISS), we tested the Oxford Nanopore Technologies MinION during a parabolic flight to understand the effects of variable gravity on the instrument and data. In a successful proof-of-principle experiment, we found that the instrument generated DNA reads over the course of the flight, including the first ever sequenced in microgravity, and additional reads measured after the flight concluded its parabolas. Here we detail modifications to the sample-loading procedures to facilitate nanopore sequencing aboard the ISS and in other microgravity environments. We also evaluate existing analysis methods and outline two new approaches, the first based on a wave-fingerprint method and the second on entropy signal mapping. Computationally light analysis methods offer the potential for in situ species identification, but are limited by the error profiles (stays, skips, and mismatches) of older nanopore data. Higher accuracies attainable with modified sample processing methods and the latest version of flow cells will further enable the use of nanopore sequencers for diagnostics and research in space. PMID:28725742
Stringheta, Carolina Pessoa; Pelegrine, Rina Andréa; Kato, Augusto Shoji; Freire, Laila Gonzales; Iglecias, Elaine Faga; Gavini, Giulio; Bueno, Carlos Eduardo da Silveira
2017-12-01
The objective of this study was to compare the methods of micro-computed tomography (micro-CT) and cross-sectioning followed by stereomicroscopy in assessing dentinal defects after instrumentation with different mechanized systems. Forty mesial roots of mandibular molars were scanned and divided into 4 groups (n = 10): Group R, Reciproc; Group PTN, ProTaper Next; Group WOG, WaveOne Gold; Group PDL, ProDesign Logic. After instrumentation, the roots were once again submitted to a micro-CT scan, and then sectioned at 3, 6, and 9 mm from the apex, and assessed for the presence of complete and incomplete dentinal defects under a stereomicroscope. The nonparametric Kruskal-Wallis, Friedman, and Wilcoxon tests were used in the statistical analysis. The study used a significance level of 5%. The total number of defects observed by cross-sectioning followed by stereomicroscopy was significantly higher than that observed by micro-CT, in all of the experimental groups (P ≤ .05). All of the defects identified in the postoperative period were already present in the corresponding preoperative period. There was no significant difference among the instrumentation systems as to the median numbers of defects, for either cross-sectioning followed by stereomicroscopy or micro-CT, at all the root levels (P > .05). In the micro-CT analysis, no significant difference was found between the median numbers of pre- and postinstrumentation defects, regardless of the instrumentation system (P > .05). None of the evaluated instrumentation systems led to the formation of new dentin defects. All of the defects identified in the stereomicroscopic analysis were already present before instrumentation, or were absent at both time points in the micro-CT analysis, indicating that the formation of new defects resulted from the sectioning procedure performed before stereomicroscopy and not from instrumentation. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. 
Chen, Lixun; Jiang, Ling; Shen, Aizong; Wei, Wei
2016-09-01
The frequently low quality of submitted spontaneous reports is of increasing concern; to our knowledge, no validated instrument exists for assessing the quality of case reports comprehensively enough. This work was conducted to develop such a quality instrument for assessing spontaneous reports of adverse drug reactions (ADR)/adverse drug events (ADE) in China. Initial evaluation indicators were generated using systematic and literature data analysis. Final indicators and their weights were identified using the Delphi method. The final quality instrument was developed by adopting the synthetic scoring method. A consensus was reached after four rounds of Delphi survey. The developed quality instrument consists of 6 first-rank indicators, 18 second-rank indicators, and 115 third-rank indicators, and each indicator has been weighted. It evaluates the quality of spontaneous reports of ADR/ADE comprehensively and quantitatively on six parameters: authenticity, duplication, regulatory, completeness, vigilance level, and reporting time frame. The developed instrument was tested with good reliability and validity and can be used to comprehensively and quantitatively assess submitted spontaneous reports of ADR/ADE in China.
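The synthetic scoring step can be sketched as a weighted sum over the six parameters; the weights and per-parameter scores below are hypothetical stand-ins, since the actual Delphi-derived weights for the 6/18/115-indicator hierarchy are not given in the abstract.

```python
# Hypothetical weights and scores; the actual Delphi-derived weights
# for the full indicator hierarchy are not reproduced here.
weights = {"authenticity": 0.25, "duplication": 0.10, "regulatory": 0.15,
           "completeness": 0.25, "vigilance": 0.15, "timeliness": 0.10}

def quality_score(scores, weights):
    """Weighted sum of per-parameter scores (each on a 0-100 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * scores[k] for k in weights)

report = {"authenticity": 100, "duplication": 100, "regulatory": 80,
          "completeness": 60, "vigilance": 40, "timeliness": 100}
print(quality_score(report, weights))  # overall quality on a 0-100 scale
```

The real instrument scores each rank of indicators and aggregates upward; this sketch shows only the top-level combination.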
A Comparison of seismic instrument noise coherence analysis techniques
Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.
2011-01-01
The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
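A minimal synthetic illustration of coherence-based self-noise estimation, assuming two identical colocated sensors recording a common ground signal plus independent noise (a Holcomb-style two-channel variant); all signal levels here are invented for the demonstration.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, n = 1.0, 2**17
ground = 1.5 * rng.standard_normal(n)        # coherent "ground motion"
noise_psd_true = 2 * 0.5**2 / fs             # one-sided PSD of each sensor's noise
x1 = ground + 0.5 * rng.standard_normal(n)   # sensor 1 record
x2 = ground + 0.5 * rng.standard_normal(n)   # sensor 2 record

f, p11 = signal.welch(x1, fs=fs, nperseg=1024)
_, c12 = signal.coherence(x1, x2, fs=fs, nperseg=1024)   # gamma^2
# For two identical colocated sensors, the incoherent-noise PSD of channel 1
# is approximately P11 * (1 - gamma), gamma being the magnitude coherence.
self_noise = p11 * (1.0 - np.sqrt(c12))
band = (f > 0.05) & (f < 0.4)                # avoid DC and Nyquist edges
print(float(np.median(self_noise[band])), noise_psd_true)
```

The estimate depends on segment averaging and signal-to-noise ratio, which is exactly the sensitivity to test conditions the paper examines.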
Analytical Chemistry Laboratory Progress Report for FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.
[Medical Equipment Maintenance Methods].
Liu, Hongbin
2015-09-01
The high technological content and complexity of medical equipment, together with its safety and effectiveness requirements, place high demands on medical equipment maintenance work. This paper introduces some basic methods of medical instrument maintenance, including fault tree analysis, the node method, and the exclusion method, three important methods in medical equipment maintenance; using them, hardware breakdown maintenance can be carried out easily for instruments that have circuit drawings. The paper also introduces methods for handling some special fault conditions, in order to avoid detours when the same problems are encountered. Continued learning is very important for staff newly engaged in this area.
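Fault tree analysis, the first method mentioned, can be sketched as the evaluation of AND/OR gates over independent basic-event failure probabilities; the tree and probability values below are invented examples, not drawn from the paper.

```python
# Minimal fault-tree evaluator: a tree of AND/OR gates over independent
# basic events with known failure probabilities (values are illustrative).
def prob(node):
    if isinstance(node, float):
        return node                    # leaf: basic-event probability
    gate, children = node[0], node[1:]
    ps = [prob(c) for c in children]
    if gate == "AND":                  # all children must fail
        out = 1.0
        for p in ps:
            out *= p
        return out
    if gate == "OR":                   # at least one child fails
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(gate)

# Top event: power-supply failure OR both redundant sensor boards failing.
tree = ("OR", 0.01, ("AND", 0.1, 0.1))
print(prob(tree))
```

The AND branch shows why redundancy helps: two 0.1-probability failures combine to 0.01, so the top event is dominated by the single-point power-supply fault.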
ERIC Educational Resources Information Center
Owens, Janel E.; Zimmerman, Laura B.; Gardner, Michael A.; Lowe, Luis E.
2016-01-01
Analysis of whiskey samples prepared by a green microextraction technique, dispersive liquid-liquid microextraction (DLLME), before analysis by a qualitative gas chromatography-mass spectrometry (GC/MS) method, is described as a laboratory experiment for an upper division instrumental methods of analysis laboratory course. Here, aroma compounds in…
Apical extrusion of debris using two hand and two rotary instrumentation techniques.
Reddy, S A; Hicks, M L
1998-03-01
The purpose of this study was to investigate the quantity of apical debris produced in vitro using two hand and two rotary instrumentation techniques. Sixty minimally curved, mature human mandibular premolars with single canals were divided into 4 groups of 15 teeth each and prepared using step-back instrumentation with K-files, balanced force with Flex-R files, Lightspeed nickel-titanium instruments, or .04 taper ProFile Series 29 rotary nickel-titanium files. Debris extruded through the apical foramen during instrumentation was collected on preweighed filters. The mean weight of extruded debris for each group was statistically analyzed using a Kruskal-Wallis one-way analysis of variance and a Mann-Whitney U rank sum test. Although all instrumentation techniques produced apically extruded debris, step-back instrumentation produced significantly more debris than the other methods (p < 0.0001). There was no difference between balanced force hand instrumentation and the two rotary nickel-titanium instrumentation methods (p > 0.05). Hand or engine-driven instrumentation that uses rotation seems to reduce significantly the amount of debris extruded apically when compared with a push-pull (filing) technique. Decreased apical extrusion of debris has strong implications for a decreased incidence of postoperative inflammation and pain.
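The nonparametric testing pipeline described above can be sketched with SciPy; the debris weights below are invented for illustration and do not reproduce the study's measurements.

```python
from scipy import stats

# Illustrative extruded-debris weights (mg) per technique; not the study's data.
stepback   = [1.8, 2.1, 2.4, 1.9, 2.2, 2.0]
balanced   = [0.6, 0.8, 0.5, 0.7, 0.9, 0.6]
lightspeed = [0.5, 0.7, 0.6, 0.8, 0.5, 0.7]
profile    = [0.7, 0.6, 0.8, 0.5, 0.6, 0.7]

# Omnibus test across all four techniques.
h, p_kw = stats.kruskal(stepback, balanced, lightspeed, profile)
# Pairwise follow-up, as in the study: step-back vs. balanced force.
u, p_mw = stats.mannwhitneyu(stepback, balanced, alternative="two-sided")
print(p_kw < 0.05, p_mw < 0.05)
```

With the step-back group clearly above the others, both the omnibus and pairwise tests flag the difference, mirroring the pattern of results reported.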
Construction Of Critical Thinking Skills Test Instrument Related The Concept On Sound Wave
NASA Astrophysics Data System (ADS)
Mabruroh, F.; Suhandi, A.
2017-02-01
This study aimed to construct a test instrument for the critical thinking skills of high school students related to the concept of sound waves. The research used a mixed-methods approach with a sequential exploratory design, consisting of: 1) a preliminary study; and 2) design and review of the test instrument. The test instrument takes the form of essay questions: 18 questions divided among 5 indicators and 8 sub-indicators of the critical thinking skills described by Ennis, with questions that are qualitative and contextual. The phases of the preliminary study include: a) policy studies; b) a school survey; and c) literature studies. The design and review of the test instrument consists of two steps. The draft design includes: a) analysis of the depth of the teaching materials; b) selection of indicators and sub-indicators of critical thinking skills; c) analysis of those indicators and sub-indicators; d) implementation of the indicators and sub-indicators; and e) writing descriptions of the test instrument. The review of the test instrument consists of: a) writing the test items; b) a validity test by experts; and c) revision of the test instrument based on the validators' input.
Kullmann, Lajos; Paulik, Edit
2011-02-01
Quality of health and social care is being assessed by largely different methods. Obtaining comparable and valuable data is difficult. Thus, internationally developed instruments have special value. A set of instruments has been developed simultaneously using World Health Organization's instrument development method. One of these is the instrument "Quality of Care and Support for People with Disabilities". Response scales contain five options for physically and three for intellectually disabled persons. Psychometric analysis of the Hungarian instrument version was based on interviews with 151 physically and 166 intellectually disabled persons. Answering rate was high, above 95% with the exception of one item. Internal consistency of the two instrument versions by Cronbach's alpha is 0.845 and 0.745 respectively. Lowest satisfaction was found in the domain "information" in both groups that correlates significantly with health conditions at p < 0.01 and p < 0.05 level respectively. The field trial confirms validity and reliability of the instrument. Its wider use may help the evaluation of satisfaction concerning different components of quality of care, consequently better tailoring of services to needs.
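The internal-consistency measure quoted above, Cronbach's alpha, can be computed directly from an item-by-respondent matrix; the ratings below are hypothetical, not field-trial data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: items is 2-D, rows = respondents, cols = scale items."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Illustrative 5-point satisfaction ratings (rows = respondents, cols = items).
ratings = np.array([[5, 4, 5], [3, 3, 4], [2, 2, 2], [4, 4, 5], [1, 2, 1]])
print(round(cronbach_alpha(ratings), 3))
```

Values such as the 0.845 and 0.745 reported in the abstract come from exactly this calculation applied to each instrument version's item set.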
Gere, Attila; Losó, Viktor; Györey, Annamária; Kovács, Sándor; Huzsvai, László; Nábrádi, András; Kókai, Zoltán; Sipos, László
2014-12-01
Traditional internal and external preference mapping methods are based on principal component analysis (PCA). However, parallel factor analysis (PARAFAC) and Tucker-3 methods could be a better choice. To evaluate the methods, preference maps of sweet corn varieties will be introduced. A preference map of eight sweet corn varieties was established using PARAFAC and Tucker-3 methods. Instrumental data were also integrated into the maps. The triplot created by the PARAFAC model explains better how odour is separated from texture or appearance, and how some varieties are separated from others. Internal and external preference maps were created using parallel factor analysis (PARAFAC) and Tucker-3 models employing both sensory (trained panel and consumers) and instrumental parameters simultaneously. Triplots of the applied three-way models have a competitive advantage compared to the traditional biplots of the PCA-based external preference maps. The solution of PARAFAC and Tucker-3 is very similar regarding the interpretation of the first and third factors. The main difference is due to the second factor as it differentiated the attributes better. Consumers who prefer 'super sweet' varieties (they place great emphasis especially on taste) are much younger and have significantly higher incomes, and buy sweet corn products rarely (once a month). Consumers who consume sweet corn products mainly because of their texture and appearance are significantly older and include a higher ratio of men. © 2014 Society of Chemical Industry.
Analysis of cigarette purchase task instrument data with a left-censored mixed effects model.
Liao, Wenjie; Luo, Xianghua; Le, Chap T; Chu, Haitao; Epstein, Leonard H; Yu, Jihnhee; Ahluwalia, Jasjit S; Thomas, Janet L
2013-04-01
The drug purchase task is a frequently used instrument for measuring the relative reinforcing efficacy (RRE) of a substance, a central concept in psychopharmacological research. Although a purchase task instrument, such as the cigarette purchase task (CPT), provides a comprehensive and inexpensive way to assess various aspects of a drug's RRE, the application of conventional statistical methods to data generated from such an instrument may not be adequate by simply ignoring or replacing the extra zeros or missing values in the data with arbitrary small consumption values, for example, 0.001. We applied the left-censored mixed effects model to CPT data from a smoking cessation study of college students and demonstrated its superiority over the existing methods with simulation studies. Theoretical implications of the findings, limitations of the proposed method, and future directions of research are also discussed.
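A minimal sketch of a left-censored (Tobit-type) regression fitted by maximum likelihood, in the spirit of the censored modeling described above but simplified to fixed effects only; the simulated demand data, coefficients, and variable names are assumptions for illustration.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 2000
price = rng.uniform(0.0, 4.0, n)                      # e.g. price per cigarette
latent = 3.0 - 1.2 * price + rng.normal(0.0, 1.0, n)  # latent demand
y = np.maximum(latent, 0.0)                           # consumption, censored at 0
censored = y == 0.0

def negloglik(theta):
    """Tobit log-likelihood: Normal density for observed points,
    Normal CDF mass below the limit for left-censored points."""
    a, b, log_s = theta
    s = np.exp(log_s)
    mu = a + b * price
    ll_obs = stats.norm.logpdf(y[~censored], mu[~censored], s)
    ll_cen = stats.norm.logcdf((0.0 - mu[censored]) / s)
    return -(ll_obs.sum() + ll_cen.sum())

fit = optimize.minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead",
                        options={"maxiter": 5000})
a_hat, b_hat, _ = fit.x
print(a_hat, b_hat)  # should be near the true (3.0, -1.2)
```

Ignoring the zeros, or replacing them with a tiny constant, biases the slope; modeling the censoring mass explicitly is the point of the left-censored approach.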
ERIC Educational Resources Information Center
Zabzdyr, Jennifer L.; Lillard, Sheri J.
2001-01-01
Introduces a laboratory experiment for determining blood alcohol content using a combination of instrumental analysis and forensic science. Teaches the importance of careful laboratory technique and that experiments are conducted for a reason. Includes the procedure of the experiment. (Contains 17 references.) (YDS)
Investigating the Structure of the Pediatric Symptoms Checklist in the Preschool Setting
ERIC Educational Resources Information Center
DiStefano, Christine; Liu, Jin; Burgess, Yin
2017-01-01
When using educational/psychological instruments, psychometric investigations should be conducted before adopting to new environments to ensure that an instrument measures the same constructs. Exploratory structural equation modeling and confirmatory factor analysis methods were used to examine the utility of the short form of the Pediatric…
Hybrid architecture active wavefront sensing and control system, and method
NASA Technical Reports Server (NTRS)
Feinberg, Lee D. (Inventor); Dean, Bruce H. (Inventor); Hyde, Tristram T. (Inventor)
2011-01-01
According to various embodiments, provided herein is an optical system and method that can be configured to perform image analysis. The optical system can comprise a telescope assembly and one or more hybrid instruments. The one or more hybrid instruments can be configured to receive image data from the telescope assembly and perform a fine guidance operation and a wavefront sensing operation, simultaneously, on the image data received from the telescope assembly.
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
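The multiple-imputation idea can be sketched as repeating a simple flux estimate over plausible calibration (effective-area) draws and combining the results with Rubin's rules; the counts, exposure, and 5% area uncertainty below are invented numbers, not Chandra calibration values.

```python
import numpy as np

rng = np.random.default_rng(2)
counts = 5000.0                        # detected counts
exposure = 1000.0                      # exposure time, s
area_nominal = 400.0                   # effective area, cm^2, ~5% uncertain

# Step 1: repeat the analysis with M plausible calibration (area) draws.
m = 20
areas = area_nominal * (1.0 + 0.05 * rng.standard_normal(m))
flux_draws = counts / (exposure * areas)       # one estimate per draw
stat_var = counts / (exposure * areas) ** 2    # Poisson variance of each fit

# Step 2: combine with Rubin's rules.
q_bar = flux_draws.mean()                      # combined estimate
w_bar = stat_var.mean()                        # within-imputation variance
b = flux_draws.var(ddof=1)                     # between-imputation variance
total_var = w_bar + (1.0 + 1.0 / m) * b        # total variance
print(q_bar, np.sqrt(total_var))
```

The between-draw term is what a single-calibration analysis omits, which is why ignoring calibration uncertainty underestimates the error bars.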
Instrumental Variable Analysis with a Nonlinear Exposure–Outcome Relationship
Davies, Neil M.; Thompson, Simon G.
2014-01-01
Background: Instrumental variable methods can estimate the causal effect of an exposure on an outcome using observational data. Many instrumental variable methods assume that the exposure–outcome relation is linear, but in practice this assumption is often in doubt, or perhaps the shape of the relation is a target for investigation. We investigate this issue in the context of Mendelian randomization, the use of genetic variants as instrumental variables. Methods: Using simulations, we demonstrate the performance of a simple linear instrumental variable method when the true shape of the exposure–outcome relation is not linear. We also present a novel method for estimating the effect of the exposure on the outcome within strata of the exposure distribution. This enables the estimation of localized average causal effects within quantile groups of the exposure or as a continuous function of the exposure using a sliding window approach. Results: Our simulations suggest that linear instrumental variable estimates approximate a population-averaged causal effect. This is the average difference in the outcome if the exposure for every individual in the population is increased by a fixed amount. Estimates of localized average causal effects reveal the shape of the exposure–outcome relation for a variety of models. These methods are used to investigate the relations between body mass index and a range of cardiovascular risk factors. Conclusions: Nonlinear exposure–outcome relations should not be a barrier to instrumental variable analyses. When the exposure–outcome relation is not linear, either a population-averaged causal effect or the shape of the exposure–outcome relation can be estimated. PMID:25166881
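With a single instrument, the linear instrumental variable estimate reduces to the Wald ratio; the simulation below, with invented coefficients, shows it recovering a true causal effect that naive regression misses because of confounding.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
z = rng.binomial(2, 0.3, n).astype(float)   # genetic variant (0/1/2 alleles)
u = rng.standard_normal(n)                  # unmeasured confounder
x = 0.5 * z + u + rng.standard_normal(n)    # exposure (e.g. BMI, standardized)
y = 0.8 * x + u + rng.standard_normal(n)    # outcome; true causal effect = 0.8

# Two-stage least squares with one instrument reduces to the Wald ratio:
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]
# Naive regression of y on x is biased upward by the confounder u:
beta_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
print(beta_iv, beta_ols)
```

Under the linear model this ratio targets the population-averaged effect the paper discusses; its stratified estimator repeats this logic within quantile groups of the exposure.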
Zaki, Rafdzah; Bulgiba, Awang; Nordin, Noorhaire; Azina Ismail, Noor
2013-06-01
Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify the statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analysis and their implications for medical practice. In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and 42 finally fitted the inclusion criteria. The Intra-class Correlation Coefficient (ICC) was the most popular method, used in 25 (60%) studies, followed by comparison of means (8, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. This study finds that the Intra-class Correlation Coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to be able to correctly perform analysis in reliability studies.
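One common ICC variant, ICC(2,1) (two-way random effects, absolute agreement, single measurement), can be computed from the two-way ANOVA mean squares; the device readings below are hypothetical, and reporting which variant was used is exactly the detail the review found missing in most studies.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    x: 2-D array, rows = subjects, columns = raters/devices."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))              # residual
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Two hypothetical devices measuring the same 6 patients (illustrative values).
readings = np.array([[120, 122], [135, 134], [110, 111],
                     [150, 148], [128, 130], [140, 141]], dtype=float)
print(round(icc_2_1(readings), 3))
```

Other ICC forms (consistency vs. agreement, average vs. single measurement) use different denominators, so an unqualified "ICC" is ambiguous.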
Aerosol preparation of intact lipoproteins
Benner, W Henry [Danville, CA; Krauss, Ronald M [Berkeley, CA; Blanche, Patricia J [Berkeley, CA
2012-01-17
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
2012-01-01
Background: Multi-attribute utility (MAU) instruments are used to include health-related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest that different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D MAU instrument. Methods: The AQoL program introduced the use of psychometric methods in the construction of health-related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed exploratory factor analysis and structural equation modelling (SEM) to meet these dual requirements. Results and Discussion: The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. Conclusions: The AQoL-6D descriptive system has good psychometric properties. These imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means it may be used with confidence for measuring health-related quality of life and is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs. PMID:22507254
2014-01-01
This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482
Analysis of small-angle X-ray scattering data in the presence of significant instrumental smearing
Bergenholtz, Johan; Ulama, Jeanette; Zackrisson Oskolkova, Malin
2016-01-01
A laboratory-scale small-angle X-ray scattering instrument with pinhole collimation has been used to assess smearing effects due to instrumental resolution. A new, numerically efficient method to smear ideal model intensities is developed and presented. It allows for directly using measured profiles of isotropic but otherwise arbitrary beams in smearing calculations. Samples of low-polydispersity polymer spheres have been used to show that scattering data can in this way be quantitatively modeled even when there is substantial distortion due to instrumental resolution. PMID:26937235
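The paper's smearing algorithm operates on measured beam profiles; as a rough 1-D illustration of the idea (not the authors' numerical method), convolving an ideal sphere form factor with an assumed Gaussian beam profile can be sketched as follows. The radius, grid, and beam shape are all made-up values:

```python
import numpy as np

def smear(ideal, beam_profile):
    """Smear an ideal scattering curve I(q) with a measured 1-D beam
    profile by discrete convolution (illustrative stand-in for the
    paper's resolution correction; assumes a uniform q grid)."""
    w = np.asarray(beam_profile, dtype=float)
    w = w / w.sum()                       # normalize the beam weights
    return np.convolve(ideal, w, mode="same")

def sphere_intensity(q, R=50.0):
    """Standard closed-form sphere form factor, radius R."""
    x = q * R
    return (3 * (np.sin(x) - x * np.cos(x)) / x**3) ** 2

q = np.linspace(1e-3, 0.3, 600)
ideal = sphere_intensity(q)
beam = np.exp(-0.5 * np.linspace(-3, 3, 31) ** 2)   # assumed Gaussian beam
smeared = smear(ideal, beam)
```

The characteristic effect — sharp form-factor minima being filled in by instrumental resolution — is visible by comparing `ideal` and `smeared` near the first minimum.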
Davis, Stephen; Gluskin, Alan H; Livingood, Philip M; Chambers, David W
2010-11-01
This study was designed to calculate probabilities for tissue injury and to measure effectiveness of various coolant strategies in countering heat buildup produced by dry ultrasonic vibration during post removal. A simulated biological model was used to evaluate the cooling efficacy of a common refrigerant spray, water spray, and air spray in the recovery of post temperatures deep within the root canal space. The data set consisted of cervical and apical measures of temperature increase at 1-second intervals from baseline during continuous ultrasonic instrumentation until a 10 °C increase in temperature at the cervical site was registered, wherein instrumentation ceased, and the teeth were allowed to cool under ambient conditions or with the assistance of 4 coolant methods. Data were analyzed with analysis of variance by using the independent variables of time of ultrasonic application (10, 15, 20 seconds) and cooling method. In addition to the customary means, standard deviations, and analysis of variance tests, analyses were conducted to determine probabilities that procedures would reach or exceed the 10 °C threshold. Both instrumentation time and cooling agent effects were significant at P <.0001. Under the conditions of this study, it was shown that injurious heat transfer occurs in less than 1 minute during dry ultrasonic instrumentation of metallic posts. Cycles of short instrumentation times with active coolants were effective in reducing the probability of tissue damage when teeth were instrumented dry. With as little as 20 seconds of continuous dry ultrasonic instrumentation, the consequences of thermal buildup to an individual tooth might contribute to an injurious clinical outcome. Copyright © 2010 American Association of Endodontists. All rights reserved.
Iorgulescu, E; Voicu, V A; Sârbu, C; Tache, F; Albu, F; Medvedovici, A
2016-08-01
The influence of experimental variability (instrumental repeatability, instrumental intermediate precision and sample preparation variability) and of data pre-processing (normalization, peak alignment, background subtraction) on the discrimination power of multivariate data analysis methods (principal component analysis, PCA, and cluster analysis, CA), as well as of a new algorithm based on linear regression, was studied. Data used in the study were obtained through positive or negative ion monitoring electrospray mass spectrometry (+/-ESI/MS) and reversed-phase liquid chromatography with UV spectrometric detection (RPLC/UV) applied to green tea extracts. Extraction in ethanol and heated water infusion were used as sample preparation procedures. The multivariate methods were applied directly to mass spectra and chromatograms, involving strictly a holistic comparison of shapes, without assigning any structural identity to compounds. An alternative data interpretation based on linear regression analysis mutually applied to data series is also discussed. Slopes, intercepts and correlation coefficients produced by linear regression applied to pairs of very large experimental data series successfully retain information resulting from high-frequency instrumental acquisition rates, better defining the profiles being compared. Consequently, each type of sample, or comparison between samples, produces in Cartesian space an ellipsoidal volume defined by the normal variation intervals of the slope, intercept and correlation coefficient. Distances between volumes graphically illustrate (dis)similarities between the compared data. Instrumental intermediate precision had the major effect on the discrimination power of the multivariate data analysis methods.
Mass spectra produced through ionization from the liquid state under atmospheric pressure conditions of bulk complex mixtures resulting from extracted materials of natural origin provided an excellent data basis for multivariate analysis methods, equivalent to data resulting from chromatographic separations. The alternative evaluation of very large data series based on linear regression analysis produced information equivalent to results obtained through application of PCA and CA. Copyright © 2016 Elsevier B.V. All rights reserved.
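The regression-based comparison the abstract describes — slope, intercept, and correlation coefficient computed between paired data series — can be sketched as follows. The synthetic "chromatogram" and the noise level are assumptions for illustration only:

```python
import numpy as np

def compare_series(a, b):
    """Regress series b on series a; near-identical profiles give
    slope ~ 1, intercept ~ 0, and correlation ~ 1 (the abstract's
    similarity criterion, sketched for a single pair of traces)."""
    slope, intercept = np.polyfit(a, b, 1)
    r = np.corrcoef(a, b)[0, 1]
    return slope, intercept, r

rng = np.random.default_rng(0)
trace = rng.gamma(2.0, 1.0, size=5000)           # stand-in chromatogram
replicate = trace + rng.normal(0, 0.01, 5000)    # small instrumental noise
slope, intercept, r = compare_series(trace, replicate)
```

Repeating this over many pairs yields the normal variation intervals of the three statistics, i.e. the ellipsoidal volumes the abstract uses to visualize (dis)similarity.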
Vasli, Parvaneh; Dehghan-Nayeri, Nahid; Khosravi, Laleh
2018-01-01
Despite the emphasis placed on the implementation of continuing professional education programs in Iran, researchers and practitioners have not developed an instrument for assessing the factors that affect the transfer of knowledge from such programs to clinical practice. The aim of this study was to design and validate such an instrument for the Iranian context. The research used a three-stage mixed-methods design. In the first stage, in-depth interviews with nurses and content analysis were conducted, after which themes were extracted from the data. In the second stage, the findings of the content analysis and a literature review were examined, and a preliminary set of instrument items was developed. In the third stage, qualitative content validity, face validity, the content validity ratio, the content validity index, and construct validity (using exploratory factor analysis) were assessed. The reliability of the instrument was measured before and after the determination of construct validity. The preliminary instrument comprised 53 items, and its content validity index was 0.86. In the multi-stage factor analysis, eight questions were excluded, thereby reducing the 11 factors to five and, finally, to four. The final 43-item instrument consists of the following dimensions: structure and organizational climate, personal characteristics, nature and status of professionals, and nature of educational programs. Managers can use this instrument to identify factors affecting the transfer of knowledge from continuing professional education to clinical practice. Copyright © 2017. Published by Elsevier Ltd.
Method for improving instrument response
Hahn, David W.; Hencken, Kenneth R.; Johnsen, Howard A.; Flower, William L.
2000-01-01
This invention pertains generally to a method for improving the accuracy of particle analysis under conditions of discrete particle loading, and particularly to a method for improving the signal-to-noise ratio and instrument response in laser spark spectroscopic analysis of particulate emissions. Under conditions of low particle density loading (particles/m³) resulting from low overall metal concentrations and/or large particle size, uniform sampling cannot be guaranteed. The present invention discloses a technique for separating laser sparks that arise from sample particles from those that do not; that is, a process for systematically "gating" the instrument responses arising from sampled particles apart from those that are not is disclosed as a solution to this problem. The disclosed approach is based on random sampling combined with a conditional analysis of each pulse. A threshold value is determined for the ratio of the intensity of a spectral line for a given element to a baseline region. If the threshold value is exceeded, the pulse is classified as a "hit"; that data is collected, and an average spectrum is generated from an arithmetic average of the "hits". The true metal concentration is determined from the averaged spectrum.
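The gating step described above — thresholding the line-to-baseline intensity ratio and averaging only the "hit" pulses — can be sketched as follows; the two-channel pulse data and the threshold value are illustrative, not taken from the patent:

```python
import numpy as np

def gate_spectra(spectra, line_idx, base_idx, threshold=2.0):
    """Conditional-analysis sketch of the disclosed gating idea:
    a pulse is a 'hit' when its spectral-line/baseline intensity
    ratio exceeds the threshold; only hits are averaged."""
    spectra = np.asarray(spectra, dtype=float)
    ratios = spectra[:, line_idx] / spectra[:, base_idx]
    hits = spectra[ratios > threshold]
    if hits.size == 0:
        return None                  # no particle was sampled
    return hits.mean(axis=0)         # average spectrum over hits only

# Illustrative pulses: channel 0 = analyte line, channel 1 = baseline
pulses = np.array([
    [10.0, 1.0],   # particle sampled -> hit
    [ 1.1, 1.0],   # no particle
    [ 8.0, 1.0],   # hit
    [ 0.9, 1.0],   # no particle
])
avg = gate_spectra(pulses, line_idx=0, base_idx=1, threshold=2.0)
# avg is the mean of the two hit rows: [9.0, 1.0]
```

Averaging only the hits removes the dilution of the averaged spectrum by particle-free sparks, which is the stated point of the invention.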
Sparse feature learning for instrument identification: Effects of sampling and pooling methods.
Han, Yoonchang; Lee, Subin; Nam, Juhan; Lee, Kyogu
2016-05-01
Feature learning for music applications has recently received considerable attention from many researchers. This paper reports on a sparse feature learning algorithm for musical instrument identification and, in particular, focuses on the effects of frame sampling techniques for dictionary learning and of pooling methods for feature aggregation. To this end, two frame sampling techniques are examined: fixed sampling and proportional random sampling. Furthermore, the effect of using onset frames was analyzed for both proposed sampling methods. For summarization of the feature activations, a standard deviation pooling method is used and compared with the commonly used max- and average-pooling techniques. Using more than 47 000 recordings of 24 instruments from various performers, playing styles, and dynamics, a number of tuning parameters are examined, including the analysis frame size, the dictionary size, and the type of frequency scaling, as well as the different sampling and pooling methods. The results show that the combination of proportional sampling and standard deviation pooling achieves the best overall performance of 95.62%, while the optimal parameter set varies among the instrument classes.
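The three pooling strategies the paper compares each reduce a frames-by-atoms activation matrix to one vector per recording. A minimal sketch with made-up activations:

```python
import numpy as np

def pool(activations, method="std"):
    """Aggregate per-frame feature activations over time.
    'std' pooling (the paper's best performer, per the abstract)
    is shown alongside the common max- and average-pooling."""
    A = np.asarray(activations, dtype=float)
    if method == "max":
        return A.max(axis=0)
    if method == "avg":
        return A.mean(axis=0)
    if method == "std":
        return A.std(axis=0)
    raise ValueError(f"unknown pooling method: {method}")

# Illustrative activation matrix: 4 frames x 3 dictionary atoms
acts = np.array([[0.0, 1.0, 2.0],
                 [0.0, 1.0, 4.0],
                 [0.0, 1.0, 2.0],
                 [0.0, 1.0, 4.0]])
```

Note how std pooling ignores atoms with constant activation (columns 0 and 1) and responds only to temporal variation (column 2) — a plausible reason it discriminates playing dynamics well, though the paper itself should be consulted for its analysis.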
Quality of Work: Validation of a New Instrument in Three Languages
Steffgen, Georges; Kohl, Diane; Reese, Gerhard; Happ, Christian; Sischka, Philipp
2015-01-01
Introduction and objective: A new instrument to measure quality of work was developed in three languages (German, French and Luxembourgish) and validated in a study of employees working in Luxembourg. Methods and results: A representative sample (n = 1529) was taken and exploratory factor analysis revealed a six-factor solution for the 21-item instrument (satisfaction and respect, mobbing, mental strain at work, cooperation, communication and feedback, and appraisal). Reliability analysis showed satisfying reliability for all six factors and the total questionnaire. In order to examine the construct validity of the new instrument, regression analyses were conducted to test whether the instrument predicted work characteristics’ influence on three components of well-being—burnout, psychological stress and maladaptive coping behaviors. Conclusion: The present validation offers a trilingual inventory for measuring quality of work that may be used, for example, as an assessment tool or for testing the effectiveness of interventions. PMID:26703634
Instrumental variable methods in comparative safety and effectiveness research.
Brookhart, M Alan; Rassen, Jeremy A; Schneeweiss, Sebastian
2010-06-01
Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective on the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will often be underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial.
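As a toy illustration of the IV logic outlined above (not an example from the article), two-stage least squares recovers a causal effect that ordinary regression overstates when an unmeasured confounder is present. All data below are simulated under assumed structural equations with a known true effect of 2:

```python
import numpy as np

def two_stage_least_squares(z, x, y):
    """Textbook 2SLS with a single instrument z: stage 1 predicts
    treatment x from z; stage 2 regresses outcome y on the predicted
    treatment. With one instrument this equals the Wald estimator
    cov(z, y) / cov(z, x)."""
    g1, c1 = np.polyfit(z, x, 1)       # stage 1: x ~ z
    x_hat = g1 * z + c1
    beta, _ = np.polyfit(x_hat, y, 1)  # stage 2: y ~ x_hat
    return beta

rng = np.random.default_rng(1)
n = 20000
u = rng.normal(size=n)                        # unmeasured confounder
z = rng.normal(size=n)                        # instrument: independent of u
x = 0.8 * z + u + rng.normal(size=n)          # treatment affected by z and u
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # true causal effect = 2

beta_iv = two_stage_least_squares(z, x, y)    # ~ 2 (unconfounded)
beta_ols = np.polyfit(x, y, 1)[0]             # ~ 3.1 (confounded upward)
```

The gap between `beta_ols` and `beta_iv` is exactly the uncontrolled-confounding bias the article discusses; the IV estimate is valid only under the stated assumptions (z independent of u, z affecting y only through x).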
A new method for the measurement of tremor at rest.
Comby, B; Chevalier, G; Bouchoucha, M
1992-01-01
This paper establishes a standard method for measuring human tremor. The electronic instrument described is an application of this method. It meets the need for an effective and simple tremor-measuring instrument fit for wide distribution. The instrument consists of a piezoelectric accelerometer connected to an electronic circuit and to an LCD display. The signal was also analysed by computer, after analog-to-digital conversion of the accelerometer output, in order to test the method. The tremor of 1079 healthy subjects was studied. Spectral analysis showed frequency peaks between 5.85 and 8.80 Hz. Chronic cigarette smoking and coffee drinking did not modify tremor as compared with controls. A relaxation session significantly decreased tremor in healthy subjects (P < 0.01). This new tremor-measuring method opens new horizons in the understanding of physiological and pathological tremor, stress, and anxiety, and in the means to avoid or compensate for them.
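The spectral-analysis step — locating the dominant frequency peak of an accelerometer trace — can be sketched with synthetic data. The 100 Hz sampling rate and the 7 Hz synthetic tremor are assumptions for illustration, not values from the paper (which only reports peaks between 5.85 and 8.80 Hz):

```python
import numpy as np

def dominant_tremor_frequency(signal, fs):
    """Return the frequency (Hz) of the largest spectral peak of a
    mean-removed accelerometer trace, via a one-sided FFT."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

fs = 100.0                          # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)        # 10 s record -> 0.1 Hz resolution
rng = np.random.default_rng(2)
tremor = np.sin(2 * np.pi * 7.0 * t) + 0.3 * rng.normal(size=t.size)
peak = dominant_tremor_frequency(tremor, fs)   # ~ 7.0 Hz
```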
Estes, Jacob M; Kirby, Tyler O; Huh, Warner K
2007-01-01
To determine whether autoclave sterilization eradicates human papillomavirus (HPV) DNA on specula and instruments used to treat women with cervical neoplasia. Specula and instruments used in two referral colposcopy clinics were evaluated to determine the PGMY9/11 primer system's ability to amplify residual HPV DNA. Each speculum and instrument was sampled with a Dacron swab and stored in PreservCyt solution (Cytyc Corporation, Marlborough, MA) at 4 °C. DNA amplification was performed under standard conditions with appropriate controls, followed by HPV typing using the reverse line blot test (Roche Molecular Systems, Alameda, CA). Once validated, the same polymerase chain reaction method was used on autoclave-sterilized specula and biopsy instruments, and on instruments sterilized with heated glass beads and a Cidex bath (Johnson & Johnson, New Brunswick, NJ). All results, with appropriate positive and negative controls, were confirmed in triplicate. A total of 140 instruments (70 used and 70 autoclaved) were sampled for residual HPV DNA. Five samples in the contaminated specula arm were excluded from analysis secondary to insufficient sampling. Of the remaining samples, 52.3% (34/65) of contaminated instruments—both specula and biopsy instruments—had detectable HPV DNA. Fifty-five percent of contaminated biopsy instruments (11/20) were positive and 51.1% of contaminated specula (23/45) were positive. All 70 autoclaved samples (50 specula and 20 biopsy instruments) were negative for residual HPV DNA or beta-globin. One instrument in the glass bead and Cidex group that was presumed sterile was positive for HPV 16 DNA. The PGMY9/11 primer system is an effective method to detect residual HPV DNA. Autoclave sterilization appears to eradicate HPV DNA to levels undetectable with this sensitive assay, whereas heated glass beads followed by a Cidex bath appear to be an inadequate method.
These results suggest that autoclave sterilization is effective when using nondisposable instruments and should be the method of choice in studies using polymerase chain reaction-based amplification of HPV DNA.
Text mining a self-report back-translation.
Blanch, Angel; Aluja, Anton
2016-06-01
There are several recommendations about the routine to follow when back-translating self-report instruments in cross-cultural research. However, text mining methods have generally been ignored within this field. This work describes an innovative text mining application used to adapt a personality questionnaire to 12 different languages. The method is divided into 3 stages: a descriptive analysis of the available back-translated instrument versions, a dissimilarity assessment between the source-language instrument and the 12 back-translations, and an assessment of item meaning equivalence. The suggested method contributes to improving the back-translation process of self-report instruments for cross-cultural research in 2 significant, intertwined ways. First, it defines a systematic approach to the back-translation issue, allowing for a more orderly and informed evaluation concerning the equivalence of different versions of the same instrument in different languages. Second, it provides more accurate instrument back-translations, which has direct implications for the reliability and validity of the instrument's test scores when used in different cultures/languages. In addition, this procedure can be extended to the back-translation of self-reports measuring psychological constructs in clinical assessment. Future research could refine the suggested methodology and use additional available text mining tools. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
A Novel Method to Decontaminate Surgical Instruments for Operational and Austere Environments.
Knox, Randy W; Demons, Samandra T; Cunningham, Cord W
2015-12-01
The purpose of this investigation was to test a field-expedient, cost-effective method to decontaminate, sterilize, and package surgical instruments in an operational (combat) or austere environment using chlorhexidine sponges, ultraviolet C (UVC) light, and commercially available vacuum sealing. This was a bench study of 4 experimental groups and 1 control group of 120 surgical instruments. Experimental groups were inoculated with a 10⁶ concentration of common wound bacteria. The control group was vacuum sealed without inoculum. Groups 1, 2, and 3 were first scrubbed with a chlorhexidine sponge, rinsed, and dried. Group 1 was then packaged; group 2 was irradiated with UVC light, then packaged; group 3 was packaged, then irradiated with UVC light through the bag; and group 4 was packaged without chlorhexidine scrubbing or UVC irradiation. The UVC was not tested by itself, as it does not grossly clean. The instruments were stored overnight and tested for remaining colony forming units (CFU). Data analysis was conducted using analysis of variance, with group comparisons using the Tukey method. Group 4 CFU was statistically greater (P < .001) than in the control group and groups 1 through 3. There was no statistically significant difference between the control group and groups 1 through 3. Vacuum sealing of chlorhexidine-scrubbed contaminated instruments, with and without handheld UVC irradiation, appears to be an acceptable method of field decontamination. Chlorhexidine scrubbing alone achieved a 99.9% reduction in CFU, whereas adding UVC before packaging achieved sterilization (100% reduction in CFU), and UVC through the bag achieved disinfection. Published by Elsevier Inc.
Social Desirability Bias in Self-Reporting of Hearing Protector Use among Farm Operators
McCullagh, Marjorie C.; Rosemberg, Marie-Anne
2015-01-01
Objective: The purposes of this study were (i) to examine the relationship between reported hearing protector use and social desirability bias, and (ii) to compare results of the Marlowe-Crowne social desirability instrument when administered using two different methods (i.e. online and by telephone). Methods: A shortened version of the Marlowe-Crowne social desirability instrument, as well as a self-administered instrument measuring use of hearing protectors, was administered to 497 participants in a study of hearing protector use. The relationship between hearing protector use and social desirability bias was examined using regression analysis. The results of two methods of administration of the Marlowe-Crowne social desirability instrument were compared using t-tests and regression analysis. Results: Reliability (using Cronbach’s alpha) for the shortened seven-item scale for this sample was 0.58. There was no evidence of a relationship between reported hearing protector use and social desirability reporting bias, as measured by the shortened Marlowe-Crowne. The difference in results by method of administration (i.e. online, telephone) was very small. Conclusions: This is the first published study to measure social desirability bias in reporting of hearing protector use among farmers. Findings of this study do not support the presence of social desirability bias in farmers’ reporting of hearing protector use, lending support for the validity of self-report in hearing protector use in this population. PMID:26209595
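Cronbach's alpha, the reliability coefficient reported above (0.58 for the shortened seven-item scale), can be computed from a respondents-by-items score matrix as follows; the simulated responses and their reliability level are illustrative only:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
trait = rng.normal(size=200)
# Seven items that each partly reflect the same underlying trait
responses = trait[:, None] + rng.normal(0.0, 1.0, size=(200, 7))
alpha = cronbach_alpha(responses)
```

A value of 0.58, as in the study, would sit below the conventional 0.7 benchmark for internal consistency, which is worth bearing in mind when interpreting the scale's null findings.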
HAWC+/SOFIA Instrumental Polarization Calibration
NASA Astrophysics Data System (ADS)
Michail, Joseph M.; Chuss, David; Dowell, Charles D.; Santos, Fabio; Siah, Javad; Vaillancourt, John; HAWC+ Instrument Team
2018-01-01
HAWC+ is a new far-infrared polarimeter for the NASA/DLR SOFIA (Stratospheric Observatory for Infrared Astronomy) telescope. HAWC+ has the capability to measure the polarization of astronomical sources with unprecedented sensitivity and angular resolution in four bands from 50-250 microns. Using data obtained during commissioning flights, we implemented a calibration strategy that separates the astronomical polarization signal from the induced instrumental polarization. The result of this analysis is a map of the instrumental polarization as a function of position in the instrument's focal plane in each band. The results show consistency between bands, as well as with other methods used to determine preliminary instrumental polarization values.
Stukel, Thérèse A.; Fisher, Elliott S; Wennberg, David E.; Alter, David A.; Gottlieb, Daniel J.; Vermeulen, Marian J.
2007-01-01
Context Comparisons of outcomes between patients treated and untreated in observational studies may be biased due to differences in patient prognosis between groups, often because of unobserved treatment selection biases. Objective To compare 4 analytic methods for removing the effects of selection bias in observational studies: multivariable model risk adjustment, propensity score risk adjustment, propensity-based matching, and instrumental variable analysis. Design, Setting, and Patients A national cohort of 122 124 patients who were elderly (aged 65–84 years), receiving Medicare, and hospitalized with acute myocardial infarction (AMI) in 1994–1995, and who were eligible for cardiac catheterization. Baseline chart reviews were taken from the Cooperative Cardiovascular Project and linked to Medicare health administrative data to provide a rich set of prognostic variables. Patients were followed up for 7 years through December 31, 2001, to assess the association between long-term survival and cardiac catheterization within 30 days of hospital admission. Main Outcome Measure Risk-adjusted relative mortality rate using each of the analytic methods. Results Patients who received cardiac catheterization (n=73 238) were younger and had lower AMI severity than those who did not. After adjustment for prognostic factors by using standard statistical risk-adjustment methods, cardiac catheterization was associated with a 50% relative decrease in mortality (for multivariable model risk adjustment: adjusted relative risk [RR], 0.51; 95% confidence interval [CI], 0.50–0.52; for propensity score risk adjustment: adjusted RR, 0.54; 95% CI, 0.53–0.55; and for propensity-based matching: adjusted RR, 0.54; 95% CI, 0.52–0.56). Using regional catheterization rate as an instrument, instrumental variable analysis showed a 16% relative decrease in mortality (adjusted RR, 0.84; 95% CI, 0.79–0.90). 
The survival benefits of routine invasive care from randomized clinical trials are between 8% and 21%. Conclusions Estimates of the observational association of cardiac catheterization with long-term AMI mortality are highly sensitive to analytic method. All standard risk-adjustment methods have the same limitations regarding removal of unmeasured treatment selection biases. Compared with standard modeling, instrumental variable analysis may produce less biased estimates of treatment effects, but is more suited to answering policy questions than specific clinical questions. PMID:17227979
NASA Astrophysics Data System (ADS)
Osawa, Takahito; Hatsukawa, Yuichi; Appel, Peter W. U.; Matsue, Hideaki
2011-04-01
The authors have established a method of determining mercury and gold in severely polluted environmental samples using prompt gamma-ray analysis (PGA) and instrumental neutron activation analysis (INAA). Since large amounts of mercury are constantly being released into the environment by small-scale gold mining in many developing countries, the mercury concentrations in tailings and water have to be determined to mitigate environmental pollution. Cold-vapor atomic absorption analysis, the most pervasive method of mercury analysis, is not suitable because tailings and water around mining facilities have extremely high mercury concentrations. On the other hand, PGA can determine high mercury concentrations in polluted samples as it has an appropriate level of sensitivity. Moreover, gold concentrations can be determined sequentially by using INAA after PGA. In conclusion, the analytical procedure established in this work using PGA and INAA is the best way to evaluate both the degree of pollution and the resource value of the tailings. This method will significantly contribute to mitigating problems in the global environment.
EPA is currently considering a quantitative polymerase chain reaction (qPCR) method, targeting Enterococcus spp., for beach monitoring. Improvements in the method’s cost-effectiveness may be realized by the use of newer instrumentation such as the Applied Biosystems StepOneTM a...
Using Social Network Methods to Study School Leadership
ERIC Educational Resources Information Center
Pitts, Virginia M.; Spillane, James P.
2009-01-01
Social network analysis is increasingly used in the study of policy implementation and school leadership. A key question that remains is that of instrument validity--that is, the question of whether these social network survey instruments measure what they purport to measure. In this paper, we describe our work to examine the validity of the…
New instrument expanding individual tree stem analysis
Neil A. Clark
2001-01-01
Forest health, vitality, and productivity are interrelated and are maintained by using sound forest management. There are some standard indicators that are measured to assess the extent and severity of damage inflicted by biotic and abiotic agents. Assessment of these indicators using affordable methods is a subjective process. A video rangefinder instrument is...
From air to rubber: New techniques for measuring and replicating mouthpieces, bocals, and bores
NASA Astrophysics Data System (ADS)
Fuks, Leonardo
2002-11-01
The history of musical instruments comprises a long genealogy of models and prototypes that results from a combination of copying existing specimens with changes in constructive parameters and the addition of new devices. In making wind instruments, several techniques have traditionally been employed for extracting the external and internal dimensions of toneholes, air columns, bells, and mouthpieces. In the twentieth century, methods such as pulse reflectometry, x-ray, magnetic resonance, and ultrasound imaging have been made available for bore measurement. Advantages and drawbacks of the existing methods are discussed, and a new method is presented that makes use of the injection and coating of silicone rubber for accurate molding of the instrument. This technique is harmless to all traditional materials, making it suitable also for measurements of historical instruments. The paper presents dimensional data obtained from clarinet and saxophone mouthpieces. A set of replicas of top-quality clarinet and saxophone mouthpieces, trombone bocals, and flute headjoints is shown, with comparative acoustical and performance analyses. The application of such techniques for historical and modern instrument analysis, restoration, and manufacturing is proposed.
Ostrinskaya, Alla; Kunz, Roderick R; Clark, Michelle; Kingsborough, Richard P; Ong, Ta-Hsuan; Deneault, Sandra
2018-05-24
A flow-injection analysis tandem mass spectrometry (FIA MS/MS) method was developed for rapid quantitative analysis of 10 different inorganic and organic explosives. Performance is optimized by tailoring the ionization method (APCI/ESI), declustering potentials, and collision energies for each specific analyte. In doing so, a single instrument can be used to detect urea nitrate, potassium chlorate, 2,4,6-trinitrotoluene, 2,4,6-trinitrophenylmethylnitramine, triacetone triperoxide, hexamethylene triperoxide diamine, pentaerythritol tetranitrate, 1,3,5-trinitroperhydro-1,3,5-triazine, nitroglycerin, and octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine, with sensitivities all in the picogram per milliliter range. In conclusion, FIA APCI/ESI MS/MS is a fast (<1 min/sample), sensitive (~pg/mL LOQ), and precise (intraday RSD < 10%) method for trace explosive detection that can play an important role in criminal and attributional forensics, counterterrorism, and environmental protection, and has the potential to augment or replace several existing explosive detection methods. © 2018 American Academy of Forensic Sciences.
Inan, U; Gurel, M
2017-02-01
Instrument fracture is a serious concern in endodontic practice. The aim of this study was to investigate the surface quality of new and used rotary nickel-titanium (NiTi) instruments manufactured by the traditional grinding process and by the twisting method. A total of 16 instruments from two rotary NiTi systems were used in this study: 8 Twisted Files (TF) (SybronEndo, Orange, CA, USA) and 8 Mtwo (VDW, Munich, Germany) instruments. The 4 experimental groups of new and used instruments were evaluated using atomic force microscopy (AFM). New and used instruments were analyzed at 3 points along a 3 mm section at the tip of the instrument. Quantitative measurements of topographical deviations were recorded. The data were statistically analyzed with paired-samples and independent-samples t-tests. Mean root mean square (RMS) values for new and used TF 25.06 files were 10.70 ± 2.80 nm and 21.58 ± 6.42 nm, respectively, and the difference between them was statistically significant (P < 0.05). Mean RMS values for new and used Mtwo 25.06 files were 24.16 ± 9.30 nm and 39.15 ± 16.20 nm, respectively; this difference was also statistically significant (P < 0.05). According to the AFM analysis, instruments produced by the twisting method (TF 25.06) had better surface quality than instruments produced by the traditional grinding process (Mtwo 25.06 files).
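The RMS statistic reported above is the root-mean-square deviation of the AFM height map about its mean level. A minimal sketch with synthetic scans whose roughness roughly mirrors the reported new/used TF values (the scan size and height distribution are assumptions):

```python
import numpy as np

def rms_roughness(height_map):
    """Root-mean-square deviation of an AFM height map about its mean —
    the surface-quality statistic reported in the study (in nm here)."""
    z = np.asarray(height_map, dtype=float)
    return np.sqrt(np.mean((z - z.mean()) ** 2))

rng = np.random.default_rng(4)
smooth_scan = rng.normal(0.0, 10.7, size=(64, 64))   # ~ new TF file, nm
worn_scan = rng.normal(0.0, 21.6, size=(64, 64))     # ~ used TF file, nm
```

Higher RMS after clinical use corresponds to the rougher, more defect-prone surfaces the study associates with fracture risk.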
Laboratory and field-based evaluation of chromatography ...
The Monitor for AeRosols and GAses in ambient air (MARGA) is an on-line ion-chromatography-based instrument designed for speciation of the inorganic gas and aerosol ammonium-nitrate-sulfate system. Previous work to characterize the performance of the MARGA has been primarily based on field comparisons with other measurement methods to evaluate accuracy. While such studies are useful, the underlying reasons for disagreement among methods are not always clear. This study examines aspects of MARGA accuracy and precision specifically related to automated chromatography analysis. Using laboratory standards, analytical accuracy, precision, and method detection limits derived from the MARGA chromatography software are compared to an alternative software package (Chromeleon, Thermo Scientific Dionex). Field measurements are used to further evaluate instrument performance, including the MARGA’s use of an internal LiBr standard to control accuracy. Using gas/aerosol ratios and aerosol neutralization state as a case study, the impact of chromatography on measurement error is assessed. The new generation of on-line chromatography-based gas and particle measurement systems has many advantages, including simultaneous analysis of multiple pollutants. The Monitor for Aerosols and Gases in Ambient Air (MARGA) is such an instrument that is used in North America, Europe, and Asia for atmospheric process studies as well as routine monitoring. While the instrument has been evaluat
Laser Capture Microdissection for Protein and NanoString RNA analysis
Golubeva, Yelena; Salcedo, Rosalba; Mueller, Claudius; Liotta, Lance A.; Espina, Virginia
2013-01-01
Laser capture microdissection (LCM) allows the precise procurement of enriched cell populations from a heterogeneous tissue, or live cell culture, under direct microscopic visualization. Histologically enriched cell populations can be procured by harvesting cells of interest directly, or isolating specific cells by ablating unwanted cells. The basic components of laser microdissection technology are a) visualization of cells via light microscopy, b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatilize a region of tissue (cutting method), and c) removal of cells of interest from the heterogeneous tissue section. The capture and cutting methods (instruments) for laser microdissection differ in the manner by which cells of interest are removed from the heterogeneous sample. Laser energy in the capture method is infrared (810 nm), while in the cutting mode the laser is ultraviolet (355 nm). Infrared lasers melt a thermolabile polymer that adheres to the cells of interest, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes laser capture microdissection using an ArcturusXT instrument for protein LCM sample analysis, and using a mmi CellCut Plus® instrument for RNA analysis via NanoString technology. PMID:23027006
NASA Astrophysics Data System (ADS)
Rachmatullah, A.; Octavianda, R. P.; Ha, M.; Rustaman, N. Y.; Diana, S.
2017-02-01
Along with the numerous instruments developed and used in science education research, some instruments have been translated into the local language of the country where they were used. Most researchers who used these translated instruments did not report their quality. One such instrument is the Scientific Literacy Assessment (SLA), which includes the Science Motivation and Beliefs component (SLA-MB). In this study, the SLA-MB was translated into Indonesian (Bahasa Indonesia). The purpose of this study is to investigate the translated SLA-MB in terms of dimensionality, reliability, item quality, and differential item functioning (DIF) based on IRT-Rasch analysis. We used ConQuest and Winstep as the programs for the IRT-Rasch analysis, and employed a quantitative school-survey method. The research subjects were 223 Indonesian middle school students (ages 13-16): 64 boys and 159 girls. IRT-Rasch analysis of the Indonesian version of the SLA-MB indicated that a three-dimensional model fit significantly better than a one-dimensional model, and the reliability of each dimension ranged from 0.60 to 0.82. In addition, the fit values of all items were acceptable, and we found no DIF for any of the SLA-MB items. Overall, our study suggests that the Indonesian version of the SLA-MB is acceptable for use as a research instrument in Indonesia.
Determining the risk of cardiovascular disease using ion mobility of lipoproteins
Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.
2010-05-11
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
A scalable correlator for multichannel diffuse correlation spectroscopy.
Stapels, Christopher J; Kolodziejski, Noah J; McAdams, Daniel; Podolsky, Matthew J; Fernandez, Daniel E; Farkas, Dana; Christian, James F
2016-02-01
Diffuse correlation spectroscopy (DCS) is a technique which enables powerful and robust non-invasive optical studies of tissue micro-circulation and vascular blood flow. The technique amounts to autocorrelation analysis of coherent photons after their migration through moving scatterers and subsequent collection by single-mode optical fibers. A primary cost driver of DCS instruments is the commercial hardware-based correlator, limiting the proliferation of multi-channel instruments for validation of perfusion analysis as a clinical diagnostic metric. We present the development of a low-cost scalable correlator enabled by microchip-based time-tagging, and a software-based multi-tau data analysis method. We will discuss the capabilities of the instrument as well as the implementation and validation of 2- and 8-channel systems built for live animal and pre-clinical settings.
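A minimal software multi-tau autocorrelator can be sketched as below. This assumes a quasi-logarithmic lag grid (m lags per level, samples binned by 2 between levels); the instrument's actual lag scheme and normalization may differ.

```python
def multi_tau_g2(counts, m=8, levels=4):
    """Normalized intensity autocorrelation g2(tau) of a photon-count trace."""
    g2, taus = [], []
    data, bin_width = list(counts), 1
    for level in range(levels):
        # level 0 covers lags 1..m; later levels only the upper half,
        # since shorter lags are already covered at finer binning
        lags = range(1, m + 1) if level == 0 else range(m // 2 + 1, m + 1)
        for lag in lags:
            pairs = len(data) - lag
            if pairs <= 0:
                continue
            num = sum(data[i] * data[i + lag] for i in range(pairs)) / pairs
            mean_sq = (sum(data) / len(data)) ** 2
            taus.append(lag * bin_width)
            g2.append(num / mean_sq)
        # coarsen: average adjacent samples, doubling the effective bin width
        data = [(data[i] + data[i + 1]) / 2 for i in range(0, len(data) - 1, 2)]
        bin_width *= 2
    return taus, g2
```

For an uncorrelated constant-intensity trace, g2(tau) is 1 at every lag; decorrelation from moving scatterers shows up as decay toward 1 from above.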
CIEL*a*b* color space predictive models for colorimetry devices--analysis of perfume quality.
Korifi, Rabia; Le Dréau, Yveline; Antinelli, Jean-François; Valls, Robert; Dupuy, Nathalie
2013-01-30
Color perception plays a major role in the consumer evaluation of perfume quality. Consumers need first to be entirely satisfied with the sensory properties of products before other quality dimensions become relevant. The evaluation of the color of complex mixtures presents a challenge even for modern analytical techniques. A variety of instruments are available for color measurement; they can be classified as tristimulus colorimeters and spectrophotometers. Obsolescence of the electronics of old tristimulus colorimeters arises from the difficulty in finding repair parts and leads to their replacement by more modern instruments. High quality levels in color measurement, i.e., accuracy and reliability in color control, are the major advantages of the new generation of color instrumentation, the integrating sphere spectrophotometer. Two models of spectrophotometer were tested in transmittance mode, employing the d/0° geometry. The CIEL*a*b* color space parameters were measured with each instrument for 380 samples of raw materials and bases used in perfume compositions. The results were graphically compared between the colorimeter device and the spectrophotometer devices. All color space parameters obtained with the colorimeter were used as dependent variables to generate regression equations with values obtained from the spectrophotometers. The data were statistically analyzed to create predictive models between the reference and the target instruments through two methods: the first uses linear regression analysis, and the second consists of partial least squares (PLS) regression on each component. Copyright © 2012 Elsevier B.V. All rights reserved.
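The first (linear regression) method above amounts to a per-component transfer model that predicts the reference colorimeter reading from a spectrophotometer reading. A sketch with synthetic L* values (purely illustrative, not the paper's data):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx, my - (sxy / sxx) * mx

spectro_L = [20.0, 35.0, 50.0, 65.0, 80.0, 95.0]   # target instrument
colorim_L = [21.1, 35.8, 50.4, 64.9, 79.5, 94.2]   # reference instrument
a, b = fit_line(spectro_L, colorim_L)

def predict(L):
    """Map a spectrophotometer L* reading onto the colorimeter scale."""
    return a * L + b
```

The paper fits one such model per CIEL*a*b* component; the PLS variant plays the same role but handles correlated predictors jointly.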
Akamaru, Tomoyuki; Kawahara, Norio; Sakamoto, Jiro; Yoshida, Akira; Murakami, Hideki; Hato, Taizo; Awamori, Serina; Oda, Juhachi; Tomita, Katsuro
2005-12-15
A finite-element study of posterior alone or anterior/posterior combined instrumentation following total spondylectomy and replacement with a titanium mesh cage used as an anterior strut. To compare the effect of posterior instrumentation versus anterior/posterior instrumentation on transmission of the stress to grafted bone inside a titanium mesh cage following total spondylectomy. The most recent reconstruction techniques following total spondylectomy for malignant spinal tumor include a titanium mesh cage filled with autologous bone as an anterior strut. The need for additional anterior instrumentation with posterior pedicle screws and rods is controversial. Transmission of the mechanical stress to grafted bone inside a titanium mesh cage is important for fusion and remodeling. To our knowledge, there are no published reports comparing the load-sharing properties of the different reconstruction methods following total spondylectomy. A 3-dimensional finite-element model of the reconstructed spine (T10-L4) following total spondylectomy at T12 was constructed. A Harms titanium mesh cage (DePuy Spine, Raynham, MA) was positioned as an anterior replacement, and 3 types of the reconstruction methods were compared: (1) multilevel posterior instrumentation (MPI) (i.e., posterior pedicle screws and rods at T10-L2 without anterior instrumentation); (2) MPI with anterior instrumentation (MPAI) (i.e., MPAI [Kaneda SR; DePuy Spine] at T11-L1); and (3) short posterior and anterior instrumentation (SPAI) (i.e., posterior pedicle screws and rods with anterior instrumentation at T11-L1). The mechanical energy stress distribution exerted inside the titanium mesh cage was evaluated and compared by finite-element analysis for the 3 different reconstruction methods. Simulated forces were applied to give axial compression, flexion, extension, and lateral bending. 
In flexion mode, the energy stress distribution in MPI was higher than 3.0 x 10 MPa in 73.0% of the total volume inside the titanium mesh cage, compared with 38.0% in MPAI and 43.3% in SPAI. In axial compression and extension modes, there were no remarkable differences among the reconstruction methods. In left-bending mode, there was little stress energy in the cancellous bone inside the titanium mesh cage in MPAI and SPAI. This experiment shows that, from the viewpoint of stress shielding, the reconstruction methods using additional anterior instrumentation with posterior pedicle screws (MPAI and SPAI) shield the cancellous bone inside the titanium mesh cage from stress to a higher degree than does posterior pedicle screw fixation alone (MPI). Thus, a reconstruction method with no anterior fixation should be better at allowing stress for remodeling of the bone graft inside the titanium mesh cage.
NASA Astrophysics Data System (ADS)
Papadopoulou, D. N.; Zachariadis, G. A.; Anthemidis, A. N.; Tsirliganis, N. C.; Stratis, J. A.
2004-12-01
Two multielement instrumental methods of analysis, micro X-ray fluorescence spectrometry (micro-XRF) and inductively coupled plasma atomic emission spectrometry (ICP-AES) were applied for the analysis of 7th and 5th century B.C. ancient ceramic sherds in order to evaluate the above two methods and to assess the potential to use the current compact and portable micro-XRF instrument for the in situ analysis of ancient ceramics. The distinguishing factor of interest is that micro-XRF spectrometry offers the possibility of a nondestructive analysis, an aspect of primary importance in the compositional analysis of cultural objects. Micro-XRF measurements were performed firstly directly on the ceramic sherds with no special pretreatment apart from surface cleaning (micro-XRF on sherds) and secondly on pressed pellet disks which were prepared for each ceramic sherd (micro-XRF on pellet). For the ICP-AES determination of elements, test solutions were prepared by the application of a microwave-assisted decomposition procedure in closed high-pressure PFA vessels. Also, the standard reference material SARM 69 was used for the efficiency calibration of the micro-XRF instrument and was analysed by both methods. In order to verify the calibration, the standard reference materials NCS DC 73332 and SRM620 as well as the reference materials AWI-1 and PRI-1 were analysed by micro-XRF. Elemental concentrations determined by the three analytical procedures (ICP-AES, micro-XRF on sherds and micro-XRF on pellets) were statistically treated by correlation analysis and Student's t-test (at the 95% confidence level).
Schut, Henk; Stroebe, Margaret S.; Wilson, Stewart; Birrell, John
2016-01-01
Objective This study assessed the validity of the Indicator of Bereavement Adaptation Cruse Scotland (IBACS). Designed for use in clinical and non-clinical settings, the IBACS measures severity of grief symptoms and risk of developing complications. Method N = 196 (44 male, 152 female) help-seeking, bereaved Scottish adults participated at two timepoints: T1 (baseline) and T2 (after 18 months). Four validated assessment instruments were administered: CORE-R, ICG-R, IES-R, SCL-90-R. Discriminative ability was assessed using ROC curve analysis. Concurrent validity was tested through correlation analysis at T1. Predictive validity was assessed using correlation analyses and ROC curve analysis. Optimal IBACS cutoff values were obtained by calculating a maximal Youden index J in ROC curve analysis. Clinical implications were compared across instruments. Results ROC curve analysis results (AUC = .84, p < .01, 95% CI between .77 and .90) indicated the IBACS is a good diagnostic instrument for assessing complicated grief. Positive correlations (p < .01, 2-tailed) with all four instruments at T1 demonstrated the IBACS' concurrent validity, strongest with complicated grief measures (r = .82). Predictive validity was shown to be fair in T2 ROC curve analysis results (n = 67, AUC = .78, 95% CI between .65 and .92; p < .01). Predictive validity was also supported by stable positive correlations between IBACS and other instruments at T2. Clinical indications were found not to differ across instruments. Conclusions The IBACS offers effective grief symptom and risk assessment for use by non-clinicians. Indications are sufficient to support intake assessment for a stepped model of bereavement intervention. PMID:27741246
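The cutoff selection described above, maximizing Youden's J = sensitivity + specificity - 1 over the ROC curve, can be sketched as follows. Scores and outcomes are invented for illustration.

```python
def best_cutoff(scores, labels):
    """Return (cutoff, J) maximizing Youden's J; labels are 0/1 outcomes."""
    pos = sum(labels)
    neg = len(labels) - pos
    best = (None, -1.0)
    for c in sorted(set(scores)):
        # classify score >= c as positive
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        j = tp / pos + tn / neg - 1   # sensitivity + specificity - 1
        if j > best[1]:
            best = (c, j)
    return best

# Illustrative risk scores; 1 = complicated grief on the criterion measure.
scores = [1, 2, 3, 4, 5, 6, 7, 8]
labels = [0, 0, 0, 1, 0, 1, 1, 1]
cutoff, j = best_cutoff(scores, labels)
```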
Davidson, Thomas; Levin, Lars-Ake
2010-01-01
It is important for economic evaluations in healthcare to cover all relevant information. However, many existing evaluations fall short of this goal, as they fail to include all the costs and effects for the relatives of a disabled or sick individual. The objective of this study was to analyse how relatives' costs and effects could be measured, valued and incorporated into a cost-effectiveness analysis. In this article, we discuss the theories underlying cost-effectiveness analyses in the healthcare arena; the general conclusion is that it is hard to find theoretical arguments for excluding relatives' costs and effects if a societal perspective is used. We argue that the cost of informal care should be calculated according to the opportunity cost method. To capture relatives' effects, we construct a new term, the R-QALY weight, which is defined as the effect on relatives' QALY weight of being related to a disabled or sick individual. We examine methods for measuring, valuing and incorporating the R-QALY weights. One suggested method is to estimate R-QALYs and incorporate them together with the patient's QALY in the analysis. However, there is no well established method as yet that can create R-QALY weights. One difficulty with measuring R-QALY weights using existing instruments is that these instruments are rarely focused on relative-related aspects. Even if generic quality-of-life instruments do cover some aspects relevant to relatives and caregivers, they may miss important aspects and potential altruistic preferences. A further development and validation of the existing caregiving instruments used for eliciting utility weights would therefore be beneficial for this area, as would further studies on the use of time trade-off or Standard Gamble methods for valuing R-QALY weights. Another potential method is to use the contingent valuation method to find a monetary value for all the relatives' costs and effects. 
Because cost-effectiveness analyses are used for decision making, and this is often achieved by comparing different cost-effectiveness ratios, we argue that it is important to find ways of incorporating all relatives' costs and effects into the analysis. This may not be necessary for every analysis of every intervention, but for treatments where relatives' costs and effects are substantial there may be some associated influence on the cost-effectiveness ratio.
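The arithmetic of the argument above can be made concrete: moving to a societal perspective adds relatives' costs (e.g., informal care valued at opportunity cost) to the numerator and R-QALY effects to the denominator of the cost-effectiveness ratio. All figures below are invented for illustration.

```python
def icer(d_cost_patient, d_qaly_patient, d_cost_relatives=0.0, d_rqaly=0.0):
    """Incremental cost-effectiveness ratio (cost per QALY gained)."""
    return (d_cost_patient + d_cost_relatives) / (d_qaly_patient + d_rqaly)

narrow   = icer(30_000, 1.5)                                   # patient only
societal = icer(30_000, 1.5, d_cost_relatives=-6_000, d_rqaly=0.3)
# Relieved caregiver burden lowers net costs and adds R-QALYs,
# so the societal ratio here is lower than the patient-only one.
```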
Development of a computer-assisted system for model-based condylar position analysis (E-CPM).
Ahlers, M O; Jakstat, H
2009-01-01
Condylar position analysis is a measuring method for the three-dimensional quantitative acquisition of the position of the mandible in different conditions or at different points in time. Originally, the measurement was done based on a model, using special mechanical condylar position measuring instruments, and on a research scale with mechanical-electronic measuring instruments. Today, as an alternative, it is possible to take measurements with electronic measuring instruments applied directly to the patient. The computerization of imaging has also facilitated condylar position measurement by means of three-dimensional data records obtained by imaging examination methods, which has been used in connection with the simulation and quantification of surgical operation results. However, the comparative measurement of the condylar position at different points in time has so far not been possible to the required degree. An electronic measuring instrument, allowing acquisition of the condylar position in clinical routine and facilitating calibration against measurements from later examinations through data storage and the use of precise equalizing systems, was therefore designed by the present authors. This measuring instrument was implemented on the basis of already existing components from the Reference CPM and Cadiax Compact articulator and registration systems (Gamma Dental, Klosterneuburg, Austria) as well as the matching CMD3D evaluation software (dentaConcept, Hamburg).
NASA Astrophysics Data System (ADS)
Landsberger, S.; Peshev, S.; Becker, D. A.
1994-12-01
Silicon determination in sixteen botanical and biological standard reference materials is described using the 29Si(n, p) 29Al reaction through instrumental epithermal neutron activation analysis and Compton suppression gamma-ray spectroscopy. By simultaneous utilization of both cadmium and boron epithermal filters along with anticoincidence gamma-counting, detection limits as low as 12 ppm were obtained for certain matrices, much lower than previously reported values for this type of analysis. The method is applicable to many botanical and biological matrices and is attractive for its interference-free, purely instrumental nature compared with methods using the 28Si(n, p) 28Al reaction or chemical separation techniques.
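Detection limits in activation analysis are commonly estimated with Currie's formula; the sketch below is a generic illustration (not necessarily the exact formulation used in this paper) of why Compton suppression, which lowers the background continuum B under a gamma-ray peak, lowers the attainable limit.

```python
import math

def currie_ld(background_counts):
    """Currie detection limit in counts at ~95% confidence: 2.71 + 4.65*sqrt(B)."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

# Hypothetical peak-region background counts with and without suppression:
ld_unsuppressed = currie_ld(10_000)
ld_suppressed   = currie_ld(1_000)
```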
Augmented assessment as a means to augmented reality.
Bergeron, Bryan
2006-01-01
Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation, highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object oriented programming.
Comparison of scoring approaches for the NEI VFQ-25 in low vision.
Dougherty, Bradley E; Bullimore, Mark A
2010-08-01
The aim of this study was to evaluate different approaches to scoring the National Eye Institute Visual Functioning Questionnaire-25 (NEI VFQ-25) in patients with low vision including scoring by the standard method, by Rasch analysis, and by use of an algorithm created by Massof to approximate Rasch person measure. Subscale validity and use of a 7-item short form instrument proposed by Ryan et al. were also investigated. NEI VFQ-25 data from 50 patients with low vision were analyzed using the standard method of summing Likert-type scores and calculating an overall average, Rasch analysis using Winsteps software, and the Massof algorithm in Excel. Correlations between scores were calculated. Rasch person separation reliability and other indicators were calculated to determine the validity of the subscales and of the 7-item instrument. Scores calculated using all three methods were highly correlated, but evidence of floor and ceiling effects was found with the standard scoring method. None of the subscales investigated proved valid. The 7-item instrument showed acceptable person separation reliability and good targeting and item performance. Although standard scores and Rasch scores are highly correlated, Rasch analysis has the advantages of eliminating floor and ceiling effects and producing interval-scaled data. The Massof algorithm for approximation of the Rasch person measure performed well in this group of low-vision patients. The validity of the subscales VFQ-25 should be reconsidered.
Simulating a Time-of-Flight Mass Spectrometer: A LabView Exercise
ERIC Educational Resources Information Center
Marty, Michael T.; Beussman, Douglas J.
2013-01-01
An in-depth understanding of all parameters that affect an instrumental analysis method, allowing students to explore how these instruments work so that they are not just a "black box," is key to being able to optimize the technique and obtain the best possible results. It is, however, impractical to provide such in depth coverage of…
Application of Handheld Laser-Induced Breakdown Spectroscopy (LIBS) to Geochemical Analysis.
Connors, Brendan; Somers, Andrew; Day, David
2016-05-01
While laser-induced breakdown spectroscopy (LIBS) has been in use for decades, only within the last two years has technology progressed to the point of enabling true handheld, self-contained instruments. Several instruments are now commercially available with a range of capabilities and features. In this paper, the SciAps Z-500 handheld LIBS instrument functionality and sub-systems are reviewed. Several assayed geochemical sample sets, including igneous rocks and soils, are investigated. Calibration data are presented for multiple elements of interest along with examples of elemental mapping in heterogeneous samples. Sample preparation and the data collection method from multiple locations and data analysis are discussed. © The Author(s) 2016.
On-ground tests of the NISP infrared spectrometer instrument for Euclid
NASA Astrophysics Data System (ADS)
Jomni, Cyril; Ealet, Anne; Gillard, William; Prieto, Éric; Grupp, Frank U.
2017-09-01
Euclid is an ESA mission dedicated to understanding the acceleration of the expansion of the Universe. The mission will measure hundreds of millions of galaxies in spectrophotometry and photometry in the near infrared thanks to a spectrophotometer called NISP. This instrument will be assembled and tested in Marseille. To prepare the on-ground test plan and develop the test procedure, we have used simulated PSF images based on a Zemax optical design of the instrument. We have developed the analysis tools that will later be used to build the verification procedure. We present here the method and analysis results for adjusting the focus of the instrument. In particular, we show that, because of the sampling of the PSF, a dithering strategy must be adopted, which will constrain the development of the test plan.
Chandramohan, Daniel; Clark, Samuel J.; Jakob, Robert; Leitao, Jordana; Rao, Chalapati; Riley, Ian; Setel, Philip W.
2018-01-01
Background Verbal autopsy (VA) is a practical method for determining probable causes of death at the population level in places where systems for medical certification of cause of death are weak. VA methods suitable for use in routine settings, such as civil registration and vital statistics (CRVS) systems, have developed rapidly in the last decade. These developments have been part of a growing global momentum to strengthen CRVS systems in low-income countries. With this momentum has come pressure for continued research and development of VA methods and the need for a single standard VA instrument on which multiple automated diagnostic methods can be developed. Methods and findings In 2016, partners harmonized a WHO VA standard instrument that fully incorporates the indicators necessary to run currently available automated diagnostic algorithms. The WHO 2016 VA instrument, together with validated approaches to analyzing VA data, offers countries solutions to improving information about patterns of cause-specific mortality. This VA instrument offers the opportunity to harmonize the automated diagnostic algorithms in the future. Conclusions Despite all improvements in design and technology, VA is only recommended where medical certification of cause of death is not possible. The method can nevertheless provide sufficient information to guide public health priorities in communities in which physician certification of deaths is largely unavailable. PMID:29320495
Data analysis of the COMPTEL instrument on the NASA gamma ray observatory
NASA Technical Reports Server (NTRS)
Diehl, R.; Bennett, K.; Collmar, W.; Connors, A.; Denherder, J. W.; Hermsen, W.; Lichti, G. G.; Lockwood, J. A.; Macri, J.; Mcconnell, M.
1992-01-01
The Compton imaging telescope (COMPTEL) on the Gamma Ray Observatory (GRO) is a wide field of view instrument. The coincidence measurement technique in two scintillation detector layers requires specific analysis methods. Straightforward event projection into the sky is impossible. Therefore, detector events are analyzed in a multi-dimensional dataspace using a gamma ray sky hypothesis convolved with the point spread function of the instrument in this dataspace. Background suppression and analysis techniques have important implications on the gamma ray source results for this background limited telescope. The COMPTEL collaboration applies a software system of analysis utilities, organized around a database management system. The use of this system for the assistance of guest investigators at the various collaboration sites and external sites is foreseen and allows different detail levels of cooperation with the COMPTEL institutes, dependent on the type of data to be studied.
NASA Technical Reports Server (NTRS)
Levy, G.; Brown, R. A.
1986-01-01
A simple economical objective analysis scheme is devised and tested on real scatterometer data. It is designed to treat dense data such as those of the Seasat A Satellite Scatterometer (SASS) for individual or multiple passes, and preserves subsynoptic scale features. Errors are evaluated with the aid of sampling ('bootstrap') statistical methods. In addition, sensitivity tests have been performed which establish qualitative confidence in calculated fields of divergence and vorticity. The SASS wind algorithm could be improved; however, the data at this point are limited by instrument errors rather than analysis errors. The analysis error is typically negligible in comparison with the instrument error, but amounts to 30 percent of the instrument error in areas of strong wind shear. The scheme is very economical, and thus suitable for large volumes of dense data such as SASS data.
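The 'bootstrap' error evaluation mentioned above resamples the data with replacement many times and takes the spread of the recomputed statistic as its standard error. A sketch with invented residual values (the statistic here is a simple mean; the study applied the idea to gridded analysis fields):

```python
import random

def bootstrap_se(sample, stat, n_boot=2000, seed=42):
    """Bootstrap standard error of stat(sample)."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        # draw len(sample) values with replacement
        resampled = [rng.choice(sample) for _ in sample]
        reps.append(stat(resampled))
    m = sum(reps) / n_boot
    return (sum((r - m) ** 2 for r in reps) / (n_boot - 1)) ** 0.5

wind_residuals = [0.3, -0.1, 0.7, -0.4, 0.2, 0.5, -0.6, 0.1, 0.0, 0.4]
se = bootstrap_se(wind_residuals, lambda s: sum(s) / len(s))
```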
Instrumental Analysis in Environmental Chemistry - Liquid and Solid Phase Detection Systems
ERIC Educational Resources Information Center
Stedman, Donald H.; Meyers, Philip A.
1974-01-01
This is the second of two reviews dealing with analytical methods applicable to environmental chemistry. Methods are discussed under gas, liquid, or solid depending upon the state of the analyte during detection. (RH)
Hojat, Mohammadreza; Spandorfer, John; Isenberg, Gerald A; Vergare, Michael J; Fassihi, Reza; Gonnella, Joseph S
2012-01-01
Despite the emphasis placed on interdisciplinary education and interprofessional collaboration between physicians and pharmacists, no psychometrically sound instrument is available to measure attitudes toward collaborative relationships. This study was designed to examine the psychometrics of an instrument for measuring attitudes toward physician-pharmacist collaborative relationships, for administration to students in medical and pharmacy schools and to physicians and pharmacists. The Scale of Attitudes Toward Physician-Pharmacist Collaboration was completed by 210 students at Jefferson Medical College. Factor analysis and correlational methods were used to examine the psychometrics of the instrument. Consistent with the conceptual framework of interprofessional collaboration, three underlying constructs, namely "responsibility and accountability," "shared authority," and "interdisciplinary education," emerged from the factor analysis, providing support for the instrument's construct validity. The reliability coefficient alpha for the instrument was 0.90. The instrument's criterion-related validity coefficient with scores on a validated instrument (Jefferson Scale of Attitudes Toward Physician-Nurse Collaboration) was 0.70. The findings support the validity and reliability of the instrument for medical students. The instrument has the potential to be used for the evaluation of interdisciplinary education in medical and pharmacy schools, and for the evaluation of patient outcomes resulting from collaborative physician-pharmacist relationships.
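The reliability coefficient alpha reported above (Cronbach's alpha) is computed from item-level response data as sketched here; the responses below are invented for illustration.

```python
def cronbach_alpha(items):
    """items: one list of scores per item, respondents in the same order."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)                     # number of items
    n = len(items[0])                  # number of respondents
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Three Likert-type items answered by five respondents:
alpha = cronbach_alpha([[1, 2, 3, 4, 5],
                        [2, 2, 3, 5, 5],
                        [1, 3, 3, 4, 4]])
```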
Development of a new diffuse near-infrared food measuring
NASA Astrophysics Data System (ADS)
Zhang, Jun; Piao, Renguan
2006-11-01
Industries from agriculture to petrochemistry have found near-infrared (NIR) spectroscopic analysis useful for quality control and quantitative analysis of materials and products. The general chemical, polymer, petrochemical, agricultural, food and textile industries currently use NIR spectroscopic methods for analysis. In this study, we developed a new type of NIR instrument for food measurement. The instrument consists of a light source and 12 filters serving as the wavelength-selecting part. A distinctive feature is the use of a mirror to split the light into two beams, detected by two PbS detectors. One detector collects the radiation of one beam directly, and its value is used as the reference instead of a standard white surface. The other beam irradiates the sample surface, and the diffusely reflected light is collected by the second detector. The values from the two detectors are compared and the absorbance is computed. We tested the performance of the NIR instrument in determining the protein and fat content of milk powder. The calibration showed the accuracy of the instrument in practice.
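The dual-beam computation described above reduces to a ratio: one PbS detector reads the reference beam, the other the diffuse light from the sample, and absorbance follows from comparing the two readings. Signal values here are made up for illustration.

```python
import math

def absorbance(sample_signal, reference_signal):
    """Absorbance from the ratio of sample and reference detector signals."""
    return -math.log10(sample_signal / reference_signal)

A = absorbance(0.25, 1.0)   # sample beam at a quarter of the reference intensity
```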
Frontiers in In-Situ Cosmic Dust Detection and Analysis
NASA Astrophysics Data System (ADS)
Sternovsky, Zoltán; Auer, Siegfried; Drake, Keith; Grün, Eberhard; Horányi, Mihály; Le, Huy; Srama, Ralf; Xie, Jianfeng
2011-11-01
In-situ cosmic dust instruments and measurements played a critical role in the emergence of the field of dusty plasmas. The major breakthroughs included the discovery of β-meteoroids, interstellar dust particles within the solar system, Jovian stream particles, and the detection and analysis of Enceladus's plumes. The science goals of cosmic dust research require the measurements of the charge, the spatial, size and velocity distributions, and the chemical and isotopic compositions of individual dust particles. In-situ dust instrument technology has improved significantly in the last decade. Modern dust instruments with high sensitivity can detect submicron-sized particles even at low impact velocities. Innovative ion optics methods deliver high mass resolution, m/dm>100, for chemical and isotopic analysis. The accurate trajectory measurement of cosmic dust is made possible even for submicron-sized grains using the Dust Trajectory Sensor (DTS). This article is a brief review of the current capabilities of modern dust instruments, future challenges and opportunities in cosmic dust research.
Silvestre, Dolores; Fraga, Miriam; Gormaz, María; Torres, Ester; Vento, Máximo
2014-07-01
The variability of human milk (HM) composition renders analysis of its components essential for optimal nutrition of preterm infants fed either donor milk or their own mother's milk. To fulfil this requirement, various analytical instruments have been subjected to scientific and clinical evaluation. The objective of this study was to evaluate the suitability of a rapid method for the analysis of macronutrients in HM as compared with the analytical methods applied by the cow's milk industry. Mature milk from 39 donors was analysed using an infrared human milk analyser (HMA) and compared with biochemical reference laboratory methods. The statistical analysis was based on the use of paired data tests. The use of an infrared HMA for the analysis of lipids, proteins and lactose in HM proved satisfactory as regards rapidity, simplicity and the required sample volume. The instrument afforded good linearity and precision in application to all three nutrients. However, accuracy was not acceptable when compared with the reference methods, with overestimation of the lipid content and underestimation of the protein and lactose contents. The use of mid-infrared HMA might become the standard for rapid analysis of HM once standardisation and rigorous and systematic calibration are provided. © 2012 John Wiley & Sons Ltd.
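The paired-data testing mentioned above compares the two methods on the same milk samples. A sketch of the paired t statistic, with hypothetical readings rather than the study's data:

```python
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic: mean of within-sample differences divided by
    its standard error."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / len(d) ** 0.5)

hma = [4.1, 3.9, 4.4]  # hypothetical analyser readings
ref = [3.6, 3.5, 3.8]  # hypothetical reference-laboratory readings
t = paired_t(hma, ref)
```

A consistently positive difference, as in this toy example, is the kind of systematic overestimation the study reports for lipids.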
A BLIND METHOD TO DETREND INSTRUMENTAL SYSTEMATICS IN EXOPLANETARY LIGHT CURVES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morello, G., E-mail: giuseppe.morello.11@ucl.ac.uk
2015-07-20
The study of the atmospheres of transiting exoplanets requires a photometric precision, and repeatability, of one part in ~10^4. This is beyond the original calibration plans of current observatories, hence the necessity to disentangle the instrumental systematics from the astrophysical signals in raw data sets. Most methods used in the literature are based on an approximate instrument model. The choice of parameters of the model and their functional forms can sometimes be subjective, causing controversies in the literature. Recently, Morello et al. (2014, 2015) developed a non-parametric detrending method that gave coherent and repeatable results when applied to Spitzer/IRAC data sets that were debated in the literature. Said method is based on independent component analysis (ICA) of individual pixel time series, hereafter “pixel-ICA”. The main purpose of this paper is to investigate the limits and advantages of pixel-ICA on a series of simulated data sets with different instrument properties, and a range of jitter timescales and shapes, non-stationarity, sudden change points, etc. The performance of pixel-ICA is compared against that of other methods, in particular polynomial centroid division and the pixel-level decorrelation method. We find that in simulated cases pixel-ICA performs as well as or better than other methods, and it also guarantees a higher degree of objectivity, because of its purely statistical foundation with no prior information on the instrument systematics. The results of this paper, together with previous analyses of Spitzer/IRAC data sets, suggest that photometric precision and repeatability of one part in 10^4 can be achieved with current infrared space instruments.
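The pixel-level decorrelation idea named above can be illustrated with a toy model (assuming numpy; all numbers are simulated, not Spitzer data): pointing jitter shifts starlight between pixels of unequal gain, imprinting a systematic on the summed flux, which a regression on the normalized pixel time series then removes.

```python
import numpy as np

# Synthetic jitter: the fraction of starlight in pixel 1 oscillates,
# and the two pixels have unequal gains, so the summed raw flux
# carries a purely instrumental modulation.
t = np.linspace(0, 10, 500)
f1 = 0.5 + 0.05 * np.sin(t)          # fraction of light in pixel 1
f2 = 1.0 - f1                        # fraction of light in pixel 2
g1, g2 = 1.0, 0.8                    # unequal pixel gains
pix = np.vstack([g1 * f1, g2 * f2])  # per-pixel time series
raw = pix.sum(axis=0)                # raw flux with jitter systematic

# Pixel-level decorrelation sketch: regress the raw flux on the
# normalized pixel time series and keep the residual as the signal.
basis = (pix / raw).T                # (n_samples, n_pixels) design matrix
coef, *_ = np.linalg.lstsq(basis, raw, rcond=None)
detrended = raw - basis @ coef
```

The jitter systematic lives in the pixel fractions, so the fit absorbs it and the residual scatter drops well below that of the raw light curve.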
Application of type synthesis theory to the redesign of a complex surgical instrument.
Lim, Jonas J B; Erdman, Arthur G
2002-06-01
Surgical instruments consist of basic mechanical components such as gears, links, pivots, sliders, etc., which are common in mechanical design. This paper describes the application of a method in the analysis and design of complex surgical instruments such as those employed in laparoscopic surgery. This is believed to be the first application of type synthesis theory to a complex medical instrument. Type synthesis is a methodology that can be applied during the conceptual phase of mechanical design. A handle assembly from a patented laparoscopic surgical stapler is used to illustrate the application of the design method developed. Type synthesis is applied on specific subsystems of the mechanism within the handle assembly where alternative design concepts are generated. Chosen concepts are then combined to form a new conceptual design for the handle assembly. The new handle assembly is improved because it has fewer parts, a simpler design, and is easier to assemble. Surgical instrument designers may use the methodology presented here to analyze the mechanical subsystems within complex instruments and to create new options that may offer improvements to the original design.
Two Instruments for Measuring Distributions of Low-Energy Charged Particles in Space
NASA Technical Reports Server (NTRS)
Bader, Michel; Fryer, Thomas B.; Witteborn, Fred C.
1961-01-01
Current estimates indicate that the bulk of interplanetary gas consists of protons with energies between 0 and 20 keV and concentrations of 1 to 10^5 particles/cm^3. Methods and instrumentation for measuring the energy and density distribution of such a gas are considered from the standpoint of suitability for space vehicle payloads. It is concluded that electrostatic analysis of the energy distribution can provide sufficient information in initial experiments. Both magnetic and electrostatic analyzers should eventually be used. Several instruments designed and constructed at the Ames Research Center for space plasma measurements, and the methods of calibration and data reduction, are described. In particular, the instrument designed for operation on solar cell power has the following characteristics: weight, 1.1 pounds; size, 2 by 3 by 4 inches; and power consumption, 145 mW. The instrument is designed to yield information on the concentration, energy distribution, and the anisotropy of ion trajectories in the 0.2 to 20 keV range.
Preetam, C. S.; Chandrashekhar, M.; Gunaranjan, T.; Kumar, S. Kishore; Miskeen Sahib, S. A.; Kumar, M. Senthil
2016-01-01
Aim: The purpose of this study is to achieve an effective method to remove root canal filling material from the root canal system. The study thus aims to evaluate the cleaning efficacy of two different rotary Ni-Ti systems, ProTaper Retreatment files and the RaCe system, compared to hand instrumentation with Hedstrom files for the removal of gutta-percha during retreatment. Materials and Methods: Thirty mandibular premolars with one single straight canal were decoronated and instrumented with ProTaper files and filled with thermoplastic gutta-percha. After 30 days, the samples were divided into three groups and gutta-percha was removed with the test instruments. The postoperative radiographs were evaluated with known criteria by dividing the root into cervical third, middle third, and apical third. The results were tabulated and Statistical Package for Social Sciences software (IBM Corporation) was used for analysis. Results: The mean deviation of the results was first calculated and then the t-test and analysis of variance test (two-tailed P value) were used to establish significant differences. The rotary instruments were effective in removing the gutta-percha from the canals. A significant difference was observed between the efficacies of the two rotary systems used. The rotary instruments showed effective gutta-percha removal in the cervical and middle thirds (P > 0.05). However, apical debridement was more effective with Hedstrom files. Conclusion: The study supports the use of both rotary and hand instrumentation for effective removal of gutta-percha during retreatment. PMID:27652245
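The analysis of variance test mentioned above compares the mean removal scores of the three groups. A minimal sketch of the one-way F statistic; the group scores below are hypothetical, not the study's measurements:

```python
from statistics import mean

def one_way_f(groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical debridement scores for three retreatment groups.
f_stat = one_way_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
```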
Instrumental variable methods in comparative safety and effectiveness research†
Brookhart, M. Alan; Rassen, Jeremy A.; Schneeweiss, Sebastian
2010-01-01
Summary Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will often be underpowered for drug safety studies of very rare outcomes, but may be useful in studies of intended effects where uncontrolled confounding may be substantial. PMID:20354968
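For a binary instrument, the IV logic outlined above reduces to the Wald estimator: the instrument's effect on the outcome divided by its effect on the treatment. A minimal sketch with synthetic data (not drawn from any study):

```python
from statistics import mean

def wald_iv(z, x, y):
    """Wald IV estimate for binary instrument z, treatment x, outcome y."""
    def z_diff(v):
        return (mean(vi for zi, vi in zip(z, v) if zi == 1)
                - mean(vi for zi, vi in zip(z, v) if zi == 0))
    return z_diff(y) / z_diff(x)

# Synthetic example: the instrument raises treatment uptake, and the
# outcome moves by twice the change in treatment.
z = [0, 0, 1, 1]
x = [0, 1, 1, 1]
y = [0, 2, 2, 2]
effect = wald_iv(z, x, y)
```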
Della, Lindsay J.; DeJoy, David M.; Goetzel, Ron Z.; Ozminkowski, Ronald J.; Wilson, Mark G.
2009-01-01
Objective This paper describes the development of the Leading by Example (LBE) instrument. Methods Exploratory factor analysis was used to obtain an initial factor structure. Factor validity was evaluated using confirmatory factor analysis methods. Cronbach’s alpha and item-total correlations provided information on the reliability of the factor subscales. Results Four subscales were identified: business alignment with health promotion objectives; awareness of the health-productivity link; worksite support for health promotion; leadership support for health promotion. Factor by group comparisons revealed that the initial factor structure is effective in detecting differences in organizational support for health promotion across different employee groups. Conclusions Management support for health promotion can be assessed using the LBE, a brief, self-report questionnaire. Researchers can use the LBE to diagnose, track, and evaluate worksite health promotion programs. PMID:18517097
NASA Technical Reports Server (NTRS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.;
2016-01-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.
2016-09-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
Scales and Exercises with Aksak Meters in Flute Education: A Study with Turkish and Italian Students
ERIC Educational Resources Information Center
Sakin, Ajda Senol; Öztürk, Ferda Gürgan
2016-01-01
Musical scale and exercise studies in instrumental education are considered as a fundamental component of music education. During an analysis of methods prepared for instrumental education, it was detected that scale and exercise studies for Aksak meters generally did not exist. This study was conducted to identify the effects of scales and…
Wu, Zhiyuan; Yuan, Hong; Zhang, Xinju; Liu, Weiwei; Xu, Jinhua; Zhang, Wei; Guan, Ming
2011-01-01
JAK2 V617F, a somatic point mutation that leads to constitutive JAK2 phosphorylation and kinase activation, has been incorporated into the WHO classification and diagnostic criteria of myeloid neoplasms. Although various approaches such as restriction fragment length polymorphism, amplification refractory mutation system and real-time PCR have been developed for its detection, a generic rapid closed-tube method, which can be utilized on routine genetic testing instruments with stability and cost-efficiency, has not been described. Asymmetric PCR for detection of JAK2 V617F with a 3'-blocked unlabeled probe, saturating dye and subsequent melting curve analysis was performed on a Rotor-Gene® Q real-time cycler to establish the methodology. We compared this method to the existing amplification refractory mutation systems and direct sequencing. Thereafter, the broad applicability of this unlabeled probe melting method was also validated on three diverse real-time systems (Roche LightCycler® 480, Applied Biosystems ABI® 7500 and Eppendorf Mastercycler® ep realplex) in two different laboratories. The unlabeled probe melting analysis could genotype the JAK2 V617F mutation explicitly with a detection sensitivity of 3% mutation load. At a level of 5% mutation load, the intra- and inter-assay CVs of the probe-DNA heteroduplex (mutation/wild type) were 3.14%/3.55% and 1.72%/1.29%, respectively. The method could equally discriminate mutant from wild-type samples on the other three real-time instruments. With its high detection sensitivity, unlabeled probe melting curve analysis is better suited to detecting the JAK2 V617F mutation than conventional methodologies. Given its favorable inter- and intra-assay reproducibility, unlabeled probe melting analysis provides a generic mutation-detection alternative for real-time instruments.
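Melting curve analysis of the kind described above locates the melting temperature as the peak of -dF/dT. A generic sketch using a synthetic sigmoid melt, not real JAK2 data:

```python
import math

def melt_peak(temps, fluor):
    """Melting temperature as the maximum of -dF/dT, estimated with
    central differences over a fluorescence melting curve."""
    deriv = [-(fluor[i + 1] - fluor[i - 1]) / (temps[i + 1] - temps[i - 1])
             for i in range(1, len(temps) - 1)]
    i = max(range(len(deriv)), key=deriv.__getitem__)
    return temps[i + 1]

temps = list(range(70, 91))                                # deg C
fluor = [1.0 / (1.0 + math.exp(t - 80.0)) for t in temps]  # synthetic melt
tm = melt_peak(temps, fluor)
```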
Lichte, F.E.; Meier, A.L.; Crock, J.G.
1987-01-01
A method of analysis of geological materials for the determination of the rare-earth elements using inductively coupled plasma mass spectrometry (ICP-MS) has been developed. Instrumental parameters and factors affecting analytical results were first studied and then optimized. Samples are analyzed directly following an acid digestion, without the need for separation or preconcentration, with limits of detection of 2-11 ng/g, precision of ±2.5% relative standard deviation, and accuracy comparable to inductively coupled plasma emission spectrometry and instrumental neutron activation analysis. A commercially available ICP-MS instrument is used with modifications to the sample introduction system, torch, and sampler orifice to reduce the effects of the high salt content of sample solutions prepared from geologic materials. Corrections for isobaric interferences from oxide ions and other diatomic and triatomic ions are made mathematically. Special internal standard procedures are used to compensate for drift in metal/metal oxide ratios and sensitivity. Reference standard values are used to verify the accuracy and utility of the method.
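The mathematical correction for an isobaric oxide interference amounts to subtracting the oxide contribution of a lighter element, scaled by an independently measured oxide ratio. A hedged sketch of that arithmetic, with hypothetical count rates:

```python
def correct_isobaric(measured, parent_intensity, oxide_ratio):
    """Subtract an isobaric oxide overlap: the parent ion's oxide
    (parent mass + 16 u) contributes parent_intensity * oxide_ratio
    counts at the analyte's mass. The values used below are hypothetical."""
    return measured - parent_intensity * oxide_ratio

# E.g. 1000 counts measured at the analyte mass, of which a parent at
# 5000 counts with a 2% oxide ratio contributes 100 counts.
corrected = correct_isobaric(1000.0, 5000.0, 0.02)
```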
Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.
Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel
2017-01-01
This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 till May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted desorption/ionization and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples illustrate also applications in the proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography, and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Alternatives to current flow cytometry data analysis for clinical and research studies.
Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul
2018-02-01
Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now a standard on virtually every new instrument, and so users can easily accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already defined when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow-cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, rapid, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis - any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.
A New Principle of Sound Frequency Analysis
NASA Technical Reports Server (NTRS)
Theodorsen, Theodore
1932-01-01
In connection with the study of aircraft and propeller noises, the National Advisory Committee for Aeronautics has developed an instrument for sound-frequency analysis which differs fundamentally from previous types, and which, owing to its simplicity of principle, construction, and operation, has proved to be of value in this investigation. The method is based on the well-known fact that the Ohmic loss in an electrical resistance is equal to the sum of the losses of the harmonic components of a complex wave, except for the case in which any two components approach or attain vectorial identity, in which case the Ohmic loss is increased by a definite amount. The principle of frequency analysis has been presented mathematically and a number of distinct advantages relative to previous methods have been pointed out. An automatic recording instrument embodying this principle is described in detail. It employs a beat-frequency oscillator as a source of variable frequency. A large number of experiments have verified the predicted superiority of the method. A number of representative records are presented.
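The "well-known fact" the instrument exploits is Parseval's relation: the mean-square (ohmic) power of a complex wave equals the sum of its harmonic components' powers when no two components coincide. A quick numeric check with two synthetic harmonics:

```python
import math

def mean_square_power(signal):
    """Mean-square power of a sampled waveform (proportional to the
    ohmic loss in a resistor)."""
    return sum(v * v for v in signal) / len(signal)

n = 1000
t = [2.0 * math.pi * i / n for i in range(n)]
a = [1.0 * math.sin(3 * x) for x in t]   # 3rd harmonic
b = [0.5 * math.sin(7 * x) for x in t]   # 7th harmonic

combined = mean_square_power([p + q for p, q in zip(a, b)])
separate = mean_square_power(a) + mean_square_power(b)
```

`combined` and `separate` agree to rounding error; only when two components approach the same frequency does a cross term appear, which is the extra loss the analyzer detects.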
Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.
Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron
2018-02-01
New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.
2018-01-01
Objective To investigate the psychometric properties of the activities of daily living (ADL) instrument used in the analysis of Korean Longitudinal Study of Ageing (KLoSA) dataset. Methods A retrospective study was carried out involving 2006 KLoSA records of community-dwelling adults diagnosed with stroke. The ADL instrument used for the analysis of KLoSA included 17 items, which were analyzed using Rasch modeling to develop a robust outcome measure. The unidimensionality of the ADL instrument was examined based on confirmatory factor analysis with a one-factor model. Item-level psychometric analysis of the ADL instrument included fit statistics, internal consistency, precision, and the item difficulty hierarchy. Results The study sample included a total of 201 community-dwelling adults (1.5% of the Korean population with an age over 45 years; mean age=70.0 years, SD=9.7) having a history of stroke. The ADL instrument demonstrated a unidimensional construct. Two misfit items, money management (mean square [MnSq]=1.56, standardized Z-statistics [ZSTD]=2.3) and phone use (MnSq=1.78, ZSTD=2.3) were removed from the analysis. The remaining 15 items demonstrated good item fit, high internal consistency (person reliability=0.91), and good precision (person strata=3.48). The instrument precisely estimated person measures within a wide range of theta (−4.75 logits < θ < 3.97 logits) and a reliability of 0.9, with a conceptual hierarchy of item difficulty. Conclusion The findings indicate that the 15 ADL items met Rasch expectations of unidimensionality and demonstrated good psychometric properties. It is proposed that the validated ADL instrument can be used as a primary outcome measure for assessing longitudinal disability trajectories in the Korean adult population and can be employed for comparative analysis of international disability across national aging studies. PMID:29765888
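The Rasch modeling used above rests on a simple item response function: the probability of succeeding on an item depends only on the gap between person ability and item difficulty, both in logits. A minimal sketch (the ability and difficulty values are illustrative):

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability of success for a person of
    ability theta on an item of difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A person of average ability (theta = 0) against an easy and a hard
# ADL item, reflecting the item difficulty hierarchy described above.
p_easy = rasch_p(0.0, -2.0)
p_hard = rasch_p(0.0, 2.0)
```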
ERIC Educational Resources Information Center
Fedick, Patrick W.; Bain, Ryan M.; Bain, Kinsey; Cooks, R. Graham
2017-01-01
The goal of this laboratory exercise was for students to understand the concept of chirality and how enantiomeric excess (ee) is experimentally determined using the analysis of ibuprofen as an example. Students determined the enantiomeric excess of the analyte by three different instrumental methods: mass spectrometry, nuclear magnetic resonance…
An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.
ERIC Educational Resources Information Center
Moehs, Peter J.; Levine, Samuel
1982-01-01
A simple isotopic dilution analysis whose principles apply to methods of more complex radioanalyses is described. Suitable for students of clinical and instrumental analytical chemistry, the experiment keeps manipulations to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…
NASA Astrophysics Data System (ADS)
Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.
2017-06-01
We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.
Gordon, H R; Du, T; Zhang, T
1997-09-20
We provide an analysis of the influence of instrument polarization sensitivity on the radiance measured by spaceborne ocean color sensors. Simulated examples demonstrate the influence of polarization sensitivity on the retrieval of the water-leaving reflectance rho(w). A simple method for partially correcting for polarization sensitivity--replacing the linear polarization properties of the top-of-atmosphere reflectance with those from a Rayleigh-scattering atmosphere--is provided and its efficacy is evaluated. It is shown that this scheme improves rho(w) retrievals as long as the polarization sensitivity of the instrument does not vary strongly from band to band. Of course, a complete polarization-sensitivity characterization of the ocean color sensor is required to implement the correction.
2011-01-01
Background The cleaning stage of the instrument decontamination process has come under increased scrutiny due to the increasing complexity of surgical instruments and the adverse effects of residual protein contamination on surgical instruments. Instruments used in the podiatry field have a complex surface topography and are exposed to a wide range of biological contamination. Currently, podiatry instruments are reprocessed locally within surgeries while national strategies are favouring a move toward reprocessing in central facilities. The aim of this study was to determine the efficacy of local and central reprocessing on podiatry instruments by measuring residual protein contamination of instruments reprocessed by both methods. Methods The residual protein of 189 instruments reprocessed centrally and 189 instruments reprocessed locally was determined using a fluorescent assay based on the reaction of proteins with o-phthaldialdehyde/sodium 2-mercaptoethanesulfonate. Results Residual protein was detected on 72% (n = 136) of instruments reprocessed centrally and 90% (n = 170) of instruments reprocessed locally. Significantly less protein (p < 0.001) was recovered from instruments reprocessed centrally (median 20.62 μg, range 0 - 5705 μg) than local reprocessing (median 111.9 μg, range 0 - 6344 μg). Conclusions Overall, the results show the superiority of central reprocessing for complex podiatry instruments when protein contamination is considered, though no significant difference was found in residual protein between local decontamination unit and central decontamination unit processes for Black's files. Further research is needed to undertake qualitative identification of protein contamination to identify any cross contamination risks and a standard for acceptable residual protein contamination applicable to different instruments and specialities should be considered as a matter of urgency. PMID:21219613
Edward's sword? - A non-destructive study of a medieval king's sword
NASA Astrophysics Data System (ADS)
Segebade, Chr.
2013-04-01
Non-destructive and instrumental methods, including photon activation analysis, were applied in an examination of an ancient sword. The aim was to find indications of forgery or, if the sword proved authentic, of any later processing or alteration. Metal components of the hilt and the blade were analysed by instrumental photon activation. Non-destructive metallurgical studies (hardness measurements, microscopic microstructure analysis) are also briefly described. The results of these investigations yielded no indication of non-authenticity. This is in agreement with the results of stylistic and scientific studies by weapon experts.
Analysis of XMM-Newton Data from Extended Sources and the Diffuse X-Ray Background
NASA Technical Reports Server (NTRS)
Snowden, Steven
2011-01-01
Reduction of X-ray data from extended objects and the diffuse background is a complicated process that requires attention to the details of the instrumental response as well as an understanding of the multiple background components. We present methods and software that we have developed to reduce data from XMM-Newton EPIC imaging observations for both the MOS and PN instruments. The software has now been included in the Science Analysis System (SAS) package available through the XMM-Newton Science Operations Center (SOC).
Berens, Angelique M; Harbison, Richard Alex; Li, Yangming; Bly, Randall A; Aghdasi, Nava; Ferreira, Manuel; Hannaford, Blake; Moe, Kris S
2017-08-01
To develop a method to measure intraoperative surgical instrument motion. This model will be applicable to the study of surgical instrument kinematics, including surgical training, skill verification, and the development of surgical warning systems that detect aberrant instrument motion that may result in patient injury. We developed an algorithm to automate derivation of surgical instrument kinematics in an endoscopic endonasal skull base surgery model. Surgical instrument motion was recorded during a cadaveric endoscopic transnasal approach to the pituitary using a navigation system modified to record intraoperative time-stamped Euclidean coordinates and Euler angles. Microdebrider tip coordinates and angles were referenced to the cadaver's preoperative computed tomography scan, allowing us to assess surgical instrument kinematics over time. A representative cadaveric endoscopic endonasal approach to the pituitary was performed to demonstrate the feasibility of our algorithm for deriving surgical instrument kinematics. Technical feasibility of automatically measuring intraoperative surgical instrument motion and deriving kinematics measurements was demonstrated using standard navigation equipment.
Spectroscopic Chemical Analysis Methods and Apparatus
NASA Technical Reports Server (NTRS)
Hug, William F.; Reid, Ray D.
2012-01-01
This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). The method can be used in microscopic or macroscopic mode to provide measurement of Raman and/or native fluorescence emission spectra, either by point-by-point measurement or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method produces incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample.
To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.
Developing evaluation instrument based on CIPP models on the implementation of portfolio assessment
NASA Astrophysics Data System (ADS)
Kurnia, Feni; Rosana, Dadan; Supahar
2017-08-01
This study aimed to develop an evaluation instrument constructed on the CIPP model for the implementation of portfolio assessment in science learning. The study used the research and development (R&D) method, adapting the 4-D model to the development of a non-test instrument: an evaluation instrument constructed on the CIPP model. CIPP is the abbreviation of Context, Input, Process, and Product. The techniques of data collection were interviews, questionnaires, and observations. The data collection instruments were: 1) interview guidelines for the analysis of the problems and the needs, 2) a questionnaire to gauge the level of accomplishment of the portfolio assessment instrument, and 3) observation sheets for teachers and students to elicit responses to the portfolio assessment instrument. The data were quantitative ratings obtained from several validators: two lecturers as evaluation experts, two practitioners (science teachers), and three colleagues. This paper presents the content validity results obtained from the validators and the analysis of those data using Aiken's V formula. The results show that the evaluation instrument based on the CIPP model is appropriate for evaluating the implementation of portfolio assessment instruments. Based on the judgments of the experts, practitioners, and colleagues, the Aiken's V coefficient was between 0.86 and 1.00, which means the instrument is valid and can be used in the limited trial and the operational field trial.
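The content-validity coefficient used above has a simple closed form, V = Σ(r − lo) / (n·(hi − lo)), where each rater gives a rating r on a scale from lo to hi and n is the number of raters. A minimal sketch follows; the seven example ratings are invented for illustration, not the study's data:

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for one item.

    ratings: rater scores on an ordinal scale from `lo` to `hi`.
    V = sum(r - lo) / (n * (hi - lo)), ranging from 0 (all raters at the
    bottom of the scale) to 1 (all raters at the top).
    """
    n = len(ratings)
    if n == 0:
        raise ValueError("need at least one rating")
    return sum(r - lo for r in ratings) / (n * (hi - lo))

# Seven hypothetical validators (2 experts, 2 practitioners, 3 colleagues)
# rating one item on a 1-5 scale:
print(aikens_v([5, 4, 5, 5, 4, 5, 5]))  # 26/28 ≈ 0.93
```

Coefficients near 1, as reported in the abstract, indicate strong rater agreement that the item is relevant.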
Research of metal solidification in zero-g state. [test apparatus and instrumentation
NASA Technical Reports Server (NTRS)
Aubin, W. M.; Larson, D., Jr.; Geschwind, G. I.
1973-01-01
An experimental test apparatus that allows metal melting and resolidification in the three seconds of free fall available in a drop tower was built and tested in the tower. Droplets (approximately 0.05 cm) of pure nickel and 1090 steel were prepared in this fashion. The apparatus, including its instrumentation, is described. As part of the instrumentation, a method for measuring temperature-time histories of the free-floating metal droplets was developed. Finally, a metallurgical analysis of the specimens prepared in the apparatus is presented.
The use of biochemical methods in extraterrestrial life detection
NASA Astrophysics Data System (ADS)
McDonald, Gene
2006-08-01
Instrument development for in situ extraterrestrial life detection focuses primarily on the ability to distinguish between biological and non-biological material, mostly through chemical analysis for potential biosignatures (e.g., biogenic minerals, enantiomeric excesses). In contrast, biochemical analysis techniques commonly applied to Earth life focus primarily on the exploration of cellular and molecular processes, not on the classification of a given system as biological or non-biological. This focus has developed because of the relatively large functional gap between life and non-life on Earth today. Life on Earth is very diverse from an environmental and physiological point of view, but is highly conserved from a molecular point of view. Biochemical analysis techniques take advantage of this similarity of all terrestrial life at the molecular level, particularly through the use of biologically-derived reagents (e.g., DNA polymerases, antibodies), to enable analytical methods with enormous sensitivity and selectivity. These capabilities encourage consideration of such reagents and methods for use in extraterrestrial life detection instruments. The utility of this approach depends in large part on the (unknown at this time) degree of molecular compositional differences between extraterrestrial and terrestrial life. The greater these differences, the less useful laboratory biochemical techniques will be without significant modification. Biochemistry and molecular biology methods may need to be "de-focused" in order to produce instruments capable of unambiguously detecting a sufficiently wide range of extraterrestrial biochemical systems. Modern biotechnology tools may make that possible in some cases.
Cline, James P; Mendenhall, Marcus H; Black, David; Windover, Donald; Henins, Albert
2015-01-01
The laboratory X-ray powder diffractometer is one of the primary analytical tools in materials science. It is applicable to nearly any crystalline material, and with advanced data analysis methods, it can provide a wealth of information concerning sample character. Data from these machines, however, are beset by a complex aberration function that can be addressed through calibration with the use of NIST Standard Reference Materials (SRMs). Laboratory diffractometers can be set up in a range of optical geometries; considered herein are those of Bragg-Brentano divergent beam configuration using both incident and diffracted beam monochromators. We review the origin of the various aberrations affecting instruments of this geometry and the methods developed at NIST to align these machines in a first principles context. Data analysis methods are considered as being in two distinct categories: those that use empirical methods to parameterize the nature of the data for subsequent analysis, and those that use model functions to link the observation directly to a specific aspect of the experiment. We consider a multifaceted approach to instrument calibration using both the empirical and model based data analysis methods. The particular benefits of the fundamental parameters approach are reviewed.
Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.
Summers, A E
2000-01-01
ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
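The simplified-equation route the technical report discusses can be illustrated with the textbook single-channel (1oo1) approximation, PFDavg ≈ λ_DU·TI/2, checked against the IEC 61508 low-demand SIL bands. This is a hedged sketch only: the failure rate and proof-test interval below are invented, and real SIL verification must also account for common-cause failures, diagnostics, and voting architecture:

```python
# IEC 61508 low-demand bands: SIL n corresponds to a PFDavg interval.
SIL_BANDS = {4: (1e-5, 1e-4), 3: (1e-4, 1e-3), 2: (1e-3, 1e-2), 1: (1e-2, 1e-1)}

def pfd_avg_1oo1(lambda_du, ti_hours):
    """Simplified equation for a single channel (1oo1): PFDavg ~ lambda_DU*TI/2.

    lambda_du: dangerous undetected failure rate, per hour.
    ti_hours:  proof-test interval, hours.
    """
    return lambda_du * ti_hours / 2.0

def achieved_sil(pfd):
    """Map a PFDavg value onto its low-demand SIL band (0 if outside all)."""
    for sil, (lo, hi) in sorted(SIL_BANDS.items(), reverse=True):
        if lo <= pfd < hi:
            return sil
    return 0

# Hypothetical sensor: lambda_DU = 2e-7 /h, proof-tested yearly (8760 h).
pfd = pfd_avg_1oo1(2e-7, 8760)
print(pfd, achieved_sil(pfd))  # 8.76e-4 -> SIL 3 band
```

Fault tree analysis reaches the same PFDavg for simple architectures but scales better when channels share components or common-cause terms.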
Comparison of the effect of three autogenous bone harvesting methods on cell viability in rabbits
Moradi Haghgoo, Janet; Arabi, Seyed Reza; Hosseinipanah, Seyyed Mohammad; Solgi, Ghasem; Rastegarfard, Neda; Farhadian, Maryam
2017-01-01
Background. This study was designed to compare the viability of autogenous bone grafts, harvested using different methods, in order to determine the best harvesting technique with respect to more viable cells. Methods. In this animal experimental study, three harvesting methods, including a manual instrument (chisel), a rotary device and piezosurgery, were used for harvesting bone grafts from the lateral body of the mandible on the left and right sides of 10 rabbits. In each group, 20 bone samples were collected and their viability was assessed using an MTS assay kit. Statistical analyses, including ANOVA and post hoc Tukey tests, were used for evaluating significant differences between the groups. Results. One-way ANOVA showed significant differences between all the groups (P < 0.001). Data analysis using post hoc Tukey tests indicated that the manual instrument and piezosurgery had no significant difference with regard to cell viability (P = 0.749), and cell viability in both groups was higher than with the use of a rotary instrument (P < 0.001). Conclusion. Autogenous bone grafts harvested with a manual instrument and piezosurgery had more viable cells in comparison to bone chips harvested with a rotary device. PMID:28748046
The Impact of Intervention Methods on Emotional Intelligence
ERIC Educational Resources Information Center
Davis, Christopher M.
2013-01-01
This experimental study continued the exploration surrounding emotional intelligence (EI). Emotional intelligence was examined through past and present literature, instrumentation, didactic teaching methods employing EI concepts, and data analysis. The experiment involved participants from two sections of an undergraduate economics class at a…
Flux-gate magnetometer spin axis offset calibration using the electron drift instrument
NASA Astrophysics Data System (ADS)
Plaschke, Ferdinand; Nakamura, Rumi; Leinweber, Hannes K.; Chutter, Mark; Vaith, Hans; Baumjohann, Wolfgang; Steller, Manfred; Magnes, Werner
2014-10-01
Spin-stabilization of spacecraft greatly supports the in-flight calibration of on-board flux-gate magnetometers (FGMs). Of the 12 calibration parameters in total, 8 can easily be obtained by spectral analysis. Of the remaining 4, the spin axis offset is known to be particularly variable. It is usually determined by analysis of Alfvénic fluctuations embedded in the solar wind. In the absence of solar wind observations, the spin axis offset may be obtained by comparison of FGM and electron drift instrument (EDI) measurements. The aim of our study is to develop methods that are readily usable for routine FGM spin axis offset calibration with EDI. This paper represents a major step forward in this direction. We improve an existing method to determine FGM spin axis offsets from EDI time-of-flight measurements by providing it with a comprehensive error analysis. In addition, we introduce a new, complementary method that uses EDI beam direction data instead of time-of-flight data. Using Cluster data, we show that both methods yield similarly accurate results, which are comparable yet more stable than those from a commonly used solar wind-based method.
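A minimal numerical sketch of the time-of-flight idea (not the authors' full method, which adds a comprehensive error analysis and beam-direction data): an electron's gyro period fixes |B| independently of the FGM via T = 2π m_e/(e|B|), and if the spin-plane components are already calibrated, the spin-axis offset follows from a single triangle identity. The example field values are invented; relativistic corrections and the sign ambiguity of Bz are glossed over here:

```python
import math

M_E = 9.1093837e-31    # electron mass [kg]
Q_E = 1.602176634e-19  # elementary charge [C]

def b_mag_from_tof(t_gyro_s):
    """|B| in tesla from an EDI electron time of flight of one gyro period."""
    return 2.0 * math.pi * M_E / (Q_E * t_gyro_s)

def spin_axis_offset(bx, by, bz_meas, b_edi, sign=+1.0):
    """Spin-axis offset, assuming calibrated spin-plane components:
    (bz_meas - O)^2 = |B|_EDI^2 - bx^2 - by^2, with the sign of Bz known.
    All field arguments must share one unit (e.g. nT)."""
    bz_true = sign * math.sqrt(b_edi**2 - bx**2 - by**2)
    return bz_meas - bz_true

# Hypothetical example: true field (30, 40, 50) nT, FGM reads Bz = 53 nT
# because of a +3 nT spin-axis offset; EDI independently gives |B| = sqrt(5000) nT.
print(spin_axis_offset(30.0, 40.0, 53.0, math.sqrt(5000.0)))  # ≈ 3.0
```

In practice the offset would be estimated by least squares over many such measurements rather than from a single point.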
Ehlers, Jan P.; Kaap-Fröhlich, Sylvia; Mahler, Cornelia; Scherer, Theresa; Huber, Marion
2017-01-01
Background: More and more institutions worldwide and in German-speaking countries are developing and establishing interprofessional seminars in the undergraduate education of health professions. In order to evaluate the different didactic approaches and different outcomes regarding the anticipated interprofessional competencies, it is necessary to apply appropriate instruments. Cross-cultural instruments are particularly helpful for international comparability. The Interprofessional Education working group of the German Medical Association (GMA) aims at identifying existing instruments for the evaluation of interprofessional education in order to make recommendations for German-speaking countries. Methods: A systematic literature search was performed on the websites of international interprofessional organisations (CAIPE, EIPEN, AIPEN), as well as in the PubMed and Cinahl databases. Reviews focusing on quantitative instruments to evaluate competencies according to the modified Kirkpatrick competency levels were searched for. Psychometrics, language/country, and the setting in which each instrument was applied were recorded. Results: Six reviews out of 73 search hits were included. A large number of instruments were identified; however, their psychometrics and the settings in which they were applied were very heterogeneous. The instruments can mainly be assigned to Kirkpatrick levels 1, 2a & 2b. Most instruments have been developed in English, but their psychometrics were not always reported rigorously. Only very few instruments are available in German. Conclusion: It is difficult to find appropriate instruments in German. Internationally, there are different approaches and objectives in the measurement and evaluation of interprofessional competencies. The question arises whether it makes sense to translate existing instruments or to go through the lengthy process of developing new ones.
The evaluation of interprofessional seminars with quantitative instruments remains mainly on Kirkpatrick levels 1 and 2. Levels 3 and 4 can probably only be assessed with qualitative or mixed methods. German language instruments are necessary. PMID:28890927
NASA Technical Reports Server (NTRS)
Cremers, D. A.; Wiens, R. C.; Arp, Z. A.; Harris, R. D.; Maurice, S.
2003-01-01
One of the most fundamental pieces of information about any planetary body is the elemental composition of its surface materials. The Viking Martian landers employed XRF (x-ray fluorescence) and the MER rovers are carrying APXS (alpha-proton x-ray spectrometer) instruments upgraded from that used on the Pathfinder rover to supply elemental composition information for soils and rocks to which direct contact is possible. These in-situ analyses require that the lander or rover be in contact with the sample. In addition to in-situ instrumentation, the present generation of rovers carry instruments that operate at stand-off distances. The Mini-TES is an example of a stand-off instrument on the MER rovers. Other examples for future missions include infrared point spectrometers and microscopic-imagers that can operate at a distance. The main advantage of such types of analyses is obvious: the sensing element does not need to be in contact or even adjacent to the target sample. This opens up new sensing capabilities. For example, targets that cannot be reached by a rover due to impassable terrain or targets positioned on a cliff face can now be accessed using stand-off analysis. In addition, the duty cycle of stand-off analysis can be much greater than that provided by in-situ measurements because the stand-off analysis probe can be aimed rapidly at different features of interest eliminating the need for the rover to actually move to the target. Over the past five years we have been developing a stand-off method of elemental analysis based on atomic emission spectroscopy called laser-induced breakdown spectroscopy (LIBS). A laser-produced spark vaporizes and excites the target material, the elements of which emit at characteristic wavelengths. Using this method, material can be analyzed from within a radius of several tens of meters from the instrument platform.
A relatively large area can therefore be sampled from a simple lander without requiring a rover or sampling arms. The placement of such an instrument on a rover would allow the sampling of locations distant from the landing site. Here we give a description of the LIBS method and its advantages. We discuss recent work on determining its characteristics for Mars exploration, including accuracy, detection limits, and suitability for determining the presence of water ice and hydrated minerals. We also give a description of prototype instruments we have tested in field settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hallerman, G.; Gray, R.J.
An instrument for crushing-strength determinations of uncoated and pyrolytic-carbon-coated fuel particles (50 to 500 μm in diameter) was developed to relate the crushing strength of the particles to their fabricability. The instrument consists of a loading mechanism, a load cell, and a power supply/readout unit. The information that can be obtained by statistical analysis of the data is illustrated by results on two batches of fuel particles. (auth)
Soil carbon analysis using gamma rays induced by neutrons
USDA-ARS?s Scientific Manuscript database
Agronomy is a research field where various physics concepts and experimental methods are widely used, particularly in agro-chemistry and soil elemental analysis. The evolution of methodology and instrumentation of nuclear physics combined with the availability of not highly expensive commercial prod...
NASA Instrument Cost/Schedule Model
NASA Technical Reports Server (NTRS)
Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George
2011-01-01
NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, the data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves, and a demonstration of the NICM tool suite.
Multi-element analysis of emeralds and associated rocks by k0 neutron activation analysis
Acharya; Mondal; Burte; Nair; Reddy; Reddy; Reddy; Manohar
2000-12-01
Multi-element analysis was carried out in natural emeralds, their associated rocks and one sample of beryl obtained from Rajasthan, India. The concentrations of 21 elements were assayed by Instrumental Neutron Activation Analysis using the k0 method (k0 INAA method) and high-resolution gamma ray spectrometry. The data reveal the segregation of some elements from associated (trapped and host) rocks to the mineral beryl forming the gemstones. A reference rock standard of the US Geological Survey (USGS BCR-1) was also analysed as a control of the method.
Richardson, Jeffrey R J; Peacock, Stuart J; Hawthorne, Graeme; Iezzi, Angelo; Elsworth, Gerald; Day, Neil A
2012-04-17
Multi-attribute utility (MAU) instruments are used to include health-related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D MAU instrument. The AQoL program introduced the use of psychometric methods in the construction of health-related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. The AQoL-6D descriptive system has good psychometric properties. These imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health-related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs.
Too much ado about instrumental variable approach: is the cure worse than the disease?
Baser, Onur
2009-01-01
To review the efficacy of instrumental variable (IV) models in addressing a variety of assumption violations to ensure standard ordinary least squares (OLS) estimates are consistent. IV models gained popularity in outcomes research because of their ability to consistently estimate average causal effects even in the presence of unmeasured confounding. However, in order for this consistent estimation to be achieved, several conditions must hold. In this article, we provide an overview of the IV approach, examine possible tests to check the prerequisite conditions, and illustrate how weak instruments may produce inconsistent and inefficient results. We use two IVs and apply Shea's partial R-square method, the Anderson canonical correlation test, and the Cragg-Donald test to check for weak instruments. Hall-Peixe tests are applied to see if any of these instruments are redundant in the analysis. A total of 14,952 asthma patients from the MarketScan Commercial Claims and Encounters Database were examined in this study. Patient health care was provided under a variety of fee-for-service, fully capitated, and partially capitated health plans, including preferred provider organizations, point of service plans, indemnity plans, and health maintenance organizations. We used the controller-reliever copay ratio and physician practice/prescribing patterns as instruments. We demonstrated that the former was a weak and redundant instrument, producing inconsistent and inefficient estimates of the effect of treatment. The results were worse than the results from standard regression analysis. Despite the obvious benefit of IV models, the method should not be used blindly. Several strong conditions are required for these models to work, and each of them should be tested. Otherwise, the bias and precision of the results will be statistically worse than the results achieved by simply using standard OLS.
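As a hedged sketch of the machinery at issue (not the paper's MarketScan analysis), the simulation below shows how a valid, strong instrument recovers a causal effect that OLS misestimates under unmeasured confounding; all variable names and coefficients are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Simulated data with an unmeasured confounder u and a strong instrument z:
# z affects treatment x, is independent of u, and affects y only through x.
z = rng.normal(size=n)
u = rng.normal(size=n)                        # unmeasured confounder
x = 0.8 * z + u + rng.normal(size=n)          # treatment
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # true causal effect of x is 2.0

def ols_slope(xv, yv):
    """Slope from an OLS regression of yv on xv (with intercept)."""
    X = np.column_stack([np.ones(len(yv)), xv])
    return np.linalg.lstsq(X, yv, rcond=None)[0][1]

# OLS is biased by the confounder; the Wald/2SLS IV estimator
# beta_IV = cov(z, y) / cov(z, x) recovers the causal effect.
beta_ols = ols_slope(x, y)
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]
print(beta_ols, beta_iv)  # OLS biased upward (~3.1); IV near the true 2.0
```

A weak instrument, by contrast, makes the denominator cov(z, x) small, inflating the variance of beta_IV and amplifying any small violation of the exclusion restriction, which is the failure mode the paper documents.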
NASA Astrophysics Data System (ADS)
Hu, Taiyang; Lv, Rongchuan; Jin, Xu; Li, Hao; Chen, Wenxin
2018-01-01
The nonlinear bias analysis and correction of receiving channels in the Chinese FY-3C meteorological satellite Microwave Temperature Sounder (MWTS) is a key technology of data assimilation for satellite radiance data. The thermal-vacuum chamber calibration data acquired from the MWTS can be analyzed to evaluate the instrument performance, including radiometric temperature sensitivity, channel nonlinearity and calibration accuracy. In particular, the nonlinearity parameters due to imperfect square-law detectors are calculated from the calibration data and further used to correct the nonlinear bias contributions of the microwave receiving channels. Based upon the operational principles and thermal-vacuum chamber calibration procedures of the MWTS, this paper mainly focuses on the nonlinear bias analysis and correction methods for improving the calibration accuracy of this important instrument onboard the FY-3C meteorological satellite, from the perspective of theoretical and experimental studies. Furthermore, a series of original results are presented to demonstrate the feasibility and significance of the methods.
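A hedged sketch of the kind of correction involved: fit a quadratic counts-to-radiance transfer from thermal-vacuum calibration points and compare it with a purely linear two-point calibration, whose mid-scale error is the nonlinear bias being removed. All numbers below are invented for illustration; the MWTS derives its per-channel nonlinearity parameters from the actual chamber data:

```python
import numpy as np

# Illustrative thermal-vacuum calibration points for one channel: counts
# reported by the receiver vs. target radiance known from PRT temperatures.
counts = np.array([1000.0, 4000.0, 7000.0, 10000.0, 13000.0])
radiance = 2.0 + 0.02 * counts + 3e-8 * counts**2  # imperfect square-law term

# Quadratic transfer R = a0 + a1*C + a2*C^2; a2 is the nonlinearity
# parameter that a linear two-point calibration ignores.
a2, a1, a0 = np.polyfit(counts, radiance, 2)

def counts_to_radiance(c):
    """Nonlinearity-corrected calibration for this channel."""
    return a0 + a1 * c + a2 * c * c

# Linear two-point calibration through the cold/warm endpoints, for
# comparison: it is exact at the endpoints but sags in between.
slope = (radiance[-1] - radiance[0]) / (counts[-1] - counts[0])
linear = radiance[0] + slope * (counts - counts[0])
print(np.max(np.abs(linear - radiance)))  # ≈ 1.1, the peak nonlinear bias
```

The error of the linear transfer vanishes at the two calibration targets and peaks mid-scale, which is why nonlinearity must be characterized with intermediate-temperature targets in the chamber.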
NASA Technical Reports Server (NTRS)
Mueller, J. L. (Editor); Fargion, Giulietta S. (Editor); McClain, Charles R. (Editor)
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.
Advances in Molecular Rotational Spectroscopy for Applied Science
NASA Astrophysics Data System (ADS)
Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.
2017-06-01
Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy as a research tool into applied science. Molecular rotational spectroscopy is familiar as a technique for air analysis. Those techniques are included in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases with parts-per-trillion detection limits in solid materials, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.
New trends in beer flavour compound analysis.
Andrés-Iglesias, Cristina; Montero, Olimpio; Sancho, Daniel; Blanco, Carlos A
2015-06-01
As the beer market is steadily expanding, it is important for the brewing industry to offer consumers a product with the best organoleptic characteristics, flavour being one of the key characteristics of beer. New trends in instrumental methods of beer flavour analysis are described. In addition to successfully applied methods in beer analysis such as chromatography, spectroscopy, nuclear magnetic resonance, mass spectrometry or electronic nose and tongue techniques, among others, sample extraction and preparation such as derivatization or microextraction methods are also reviewed. © 2014 Society of Chemical Industry.
Jindal, Rahul; Singh, Smita; Gupta, Siddharth; Jindal, Punita
2012-01-01
The purpose of this study was to evaluate and compare the apical extrusion of debris and irrigant using various rotary instruments with the crown-down technique in the instrumentation of root canals. Thirty freshly extracted human permanent straight-rooted mandibular premolars with root curvatures of 0-10° were divided into three groups of 10 teeth each. Each group was instrumented using one of three rotary instrumentation systems: Rotary Hero Shapers, Rotary ProTaper, or Rotary Mtwo. One ml of sterile water was used as an irrigant after each instrument. Extruded debris was collected in preweighed glass vials, and the extruded irrigant was measured quantitatively by the Myers and Montgomery method and later evaporated. The weight of the dry extruded debris was calculated by comparing the pre- and post-instrumentation weights of the glass vials for each group. Statistical analysis was done using a Kruskal-Wallis one-way ANOVA test. All the rotary instruments used in this study caused apical extrusion of debris and irrigant. A statistically significant difference was observed for the Rotary ProTaper and Rotary Mtwo groups compared with Rotary Hero Shapers, but no significant difference was observed between the Rotary ProTaper and Rotary Mtwo groups. After instrumentation with the different rotary instruments, Hero Shapers showed the least apical extrusion of debris and irrigant.
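As a rough illustration of the statistical comparison described above, the Kruskal-Wallis H statistic can be computed from ranked debris weights. The masses below are hypothetical stand-ins, not the study's data, and the helper assumes distinct values (no tie correction).

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic from ranked pooled data.

    Assumes all values are distinct, so no tie correction is applied.
    """
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n = len(pooled)
    rank_term = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * rank_term - 3 * (n + 1)

# Hypothetical dry-debris masses (mg) for the three rotary systems
hero     = [0.171, 0.182, 0.190, 0.201, 0.213]
protaper = [0.334, 0.352, 0.367, 0.381, 0.402]
mtwo     = [0.311, 0.329, 0.345, 0.359, 0.378]

h = kruskal_wallis_h([hero, protaper, mtwo])
print(round(h, 2))  # → 10.22
```

The H statistic is then referred to a chi-squared distribution with k - 1 degrees of freedom (here k = 3 groups) to obtain the P value.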
Khanna, Rajesh; Handa, Aashish; Virk, Rupam Kaur; Ghai, Deepika; Handa, Rajni Sharma; Goel, Asim
2017-01-01
Background: Cleaning and shaping the canal is not an easy goal to achieve, as canal curvature plays a significant role during the instrumentation of curved canals. Aim: The present in vivo study was conducted to evaluate procedural errors during the preparation of curved root canals using hand Nitiflex and rotary K3XF instruments. Materials and Methods: Procedural errors such as ledge formation, instrument separation, and perforation (apical, furcal, strip) were determined in sixty patients, divided into two groups. In Group I, thirty teeth in thirty patients were prepared using the hand Nitiflex system, and in Group II, thirty teeth in thirty patients were prepared using the rotary K3XF system. The evaluation was done clinically as well as radiographically. The results from both groups were compiled and subjected to statistical analysis. Statistical Analysis: The Chi-square test was used to compare the procedural errors (instrument separation, ledge formation, and perforation). Results: Both hand Nitiflex and rotary K3XF showed ledge formation and instrument separation, although both errors were less frequent with the rotary K3XF file system than with hand Nitiflex. No perforation was seen in either instrument group. Conclusion: Canal curvature plays a significant role during the instrumentation of curved canals. Procedural errors such as ledge formation and instrument separation were less frequent with the rotary K3XF file system than with hand Nitiflex. PMID:29042727
A study of the river velocity measurement techniques and analysis methods
NASA Astrophysics Data System (ADS)
Chung Yang, Han; Lun Chiang, Jie
2013-04-01
Velocity measurement technology can be traced back to the pitot tube in the 18th century; today's techniques use acoustics and radar built on the Doppler principle. These advances aim to develop methods better suited to each measurement situation, to obtain more accurate data, and to derive the mean velocity using surface velocity theory, maximum velocity theory, or indicator theory. The main aim of this article is to review the literature on river velocity measurement techniques and analysis methods, to explore the applicability of the various measurement instruments, and to describe the advantages and disadvantages of the different methods for analyzing mean velocity profiles. This review can serve as a reference for follow-up studies of velocity measurement. The literature shows that different flow conditions call for different measurement methods, because each method has its own advantages and disadvantages. Traditional velocity instruments can be used at low flow, while RiverRAD microwave radar or imaging techniques may be applied at high flow. In tidal rivers, an ADCP can quickly measure the vertical velocity distribution. In urban rivers, CW radar can be set up on a bridge, and in wide rivers RiverRAD microwave radar can be used to measure velocities.
The literature also shows that automating velocity observation with an Ultrasonic Doppler Current Profiler combined with Chiu's theory can save manpower and resources, improve measurement accuracy, and reduce measurement risk. However, the great variability of river characteristics in Taiwan and the large amount of floating debris in the water at high flow mean that automated measurement still needs further study. If the safety of personnel and instruments is the priority, non-contact velocity measurement methods combined with theoretical analysis can achieve real-time monitoring.
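To make the surface-velocity approach mentioned above concrete, the sketch below converts surface readings to depth-averaged velocities with a constant velocity index and sums mid-section discharge. The 0.85 index and the cross-section values are illustrative assumptions, not from the article; in practice the index is calibrated per site.

```python
def mean_velocity_from_surface(v_surface, index=0.85):
    """Depth-averaged velocity from a surface reading.

    A constant velocity index near 0.85 is a common rule of thumb
    for turbulent open-channel flow (assumed here, not site-calibrated).
    """
    return index * v_surface

def discharge(verticals):
    """Total discharge (m^3/s) from (width_m, depth_m, surface_velocity_m_s) verticals."""
    return sum(w * d * mean_velocity_from_surface(v) for w, d, v in verticals)

# Hypothetical cross-section sampled at three verticals
q = discharge([(2.0, 1.2, 0.9), (2.0, 1.8, 1.4), (2.0, 1.1, 0.8)])
print(round(q, 3))  # → 7.616
```

Each vertical contributes width × depth × mean velocity; radar and imaging methods supply the surface velocity term without contact with the flow.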
NAA For Human Serum Analysis: Comparison With Conventional Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Laura C.; Zamboni, Cibele B.; Medeiros, Jose A. G.
2010-08-04
Instrumental and comparator methods of neutron activation analysis (NAA) were applied to determine elements of clinical relevance in serum samples from an adult population (Sao Paulo city, Brazil). A comparison with conventional analyses (colorimetric for calcium, titrimetric for chlorine, and ion-specific electrode for sodium and potassium) was also performed, permitting a discussion of the performance of NAA methods for clinical chemistry research.
Examining Classification Criteria: A Comparison of Three Cut Score Methods
ERIC Educational Resources Information Center
DiStefano, Christine; Morgan, Grant
2011-01-01
This study compared 3 different methods of creating cut scores for a screening instrument, T scores, receiver operating characteristic curve (ROC) analysis, and the Rasch rating scale method (RSM), for use with the Behavioral and Emotional Screening System (BESS) Teacher Rating Scale for Children and Adolescents (Kamphaus & Reynolds, 2007).…
ANALYTICAL METHOD COMPARISONS BY ESTIMATES OF PRECISION AND LOWER DETECTION LIMIT
The paper describes the use of principal component analysis to estimate the operating precision of several different analytical instruments or methods simultaneously measuring a common sample of a material whose actual value is unknown. This approach is advantageous when none of ...
Automated processing for proton spectroscopic imaging using water reference deconvolution.
Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W
1994-06-01
Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied for analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547(1988)) to correct for line shape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors for correction of line shape distortions. These occur due to differences between the water reference and the metabolite distributions.
Analysis of HbA1c on an automated multicapillary zone electrophoresis system.
Rollborn, Niclas; Åkerfeldt, Torbjörn; Nordin, Gunnar; Xu, Xiao Yan; Mandic-Havelka, Aleksandra; Hansson, Lars-Olof; Larsson, Anders
2017-02-01
Hemoglobin A1c (HbA1c) is a frequently requested laboratory test, and there is thus a need for high-throughput instruments for this assay. We evaluated a new automated multicapillary zone electrophoresis instrument (Capillarys 3 Tera, Sebia, Lisses, France) for analysis of HbA1c in venous samples. Routinely requested HbA1c samples were analyzed immunologically on a Roche c6000 instrument (n = 142) and then with the Capillarys 3 Tera instrument. The Capillarys 3 Tera instrument performed approximately 70 HbA1c tests/hour. There was a strong linear correlation between the Capillarys 3 Tera and Roche Tina-Quant HbA1c Gen 3 assays (y = 1.003x - 0.3246, R² = 0.996). The total CV for the 12 capillaries varied between 0.8 and 2.2%, and there was good agreement between duplicate samples (R² = 0.997). In conclusion, the Capillarys 3 Tera instrument has a high assay capacity for HbA1c. It has good precision and agreement with the Roche Tina-Quant HbA1c method and is well suited for high-volume testing of HbA1c.
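The method comparison above boils down to an ordinary least-squares fit of one instrument's results against the other's. The sketch below shows that calculation on hypothetical paired HbA1c values (mmol/mol); the study's actual fit was y = 1.003x - 0.3246 with R² = 0.996.

```python
def linear_fit(x, y):
    """Least-squares slope, intercept, and R-squared for paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - slope * xi - intercept) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical paired results: routine immunoassay (x) vs. capillary electrophoresis (y)
x = [31, 38, 42, 48, 53, 59, 64, 75, 86, 97]
y = [31.2, 37.6, 41.9, 48.3, 52.8, 59.4, 63.7, 75.1, 86.4, 96.8]

slope, intercept, r2 = linear_fit(x, y)
print(f"y = {slope:.3f}x + {intercept:.3f}, R2 = {r2:.4f}")
```

A slope near 1, intercept near 0, and R² near 1 indicate close agreement; more rigorous method-comparison statistics (e.g. Passing-Bablok or Bland-Altman) relax the OLS assumptions.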
Verma, Mudita; Meena, N.; Kumari, R. Anitha; Mallandur, Sudhanva; Vikram, R.; Gowda, Vishwas
2017-01-01
Aims: The aim of this study was to quantify the debris extruded apically from teeth using rotary and reciprocating instrumentation systems. Subjects and Methods: Eighty extracted human mandibular premolars with single canals and similar lengths were instrumented using ProTaper Universal (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), ProTaper Next (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), WaveOne (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), and Reciproc (R40; VDW GmbH, Munich, Germany). Debris extruded during instrumentation was collected into preweighed Eppendorf tubes, which were then stored in an incubator at 70°C for 5 days. The final weight of each Eppendorf tube with the extruded debris was calculated as the mean of three consecutive weighings. Statistical Analysis Used: Statistical analysis was performed using SPSS version 16.0 software. The groups were compared using the Kruskal-Wallis test for all variables. Results: There was no statistically significant difference between the groups (P = 0.1114). However, the ProTaper Universal group produced the most debris extrusion and the ProTaper Next group the least among the instrument groups (P > 0.05). Conclusions: All instrumentation techniques were associated with extruded debris. PMID:28855755
Trace-Level Automated Mercury Speciation Analysis
Taylor, Vivien F.; Carter, Annie; Davies, Colin; Jackson, Brian P.
2011-01-01
An automated system for methyl Hg analysis by purge-and-trap gas chromatography (GC) was evaluated, with comparison of several different instrument configurations, including chromatography columns (packed or capillary) and detectors (atomic fluorescence spectrometry, AFS, or inductively coupled plasma mass spectrometry, ICP-MS, using quadrupole and sector-field ICP-MS instruments). Method detection limits (MDL) of 0.042 pg and 0.030 pg for CH3Hg+ were achieved with the automated Hg analysis system configured with AFS and ICP-MS detection, respectively. Capillary GC with temperature programming was effective in improving resolution and decreasing retention times of heavier Hg species (in this case C3H7Hg+), although carryover between samples was increased. With capillary GC, the MDL for CH3Hg+ was 0.25 pg for AFS detection and 0.060 pg for ICP-MS detection. The automated system was demonstrated to have high throughput (72 samples analyzed in 8 hours), requiring considerably less analyst time than the manual method for methyl mercury analysis described in EPA 1630. PMID:21572543
INVESTIGATION OF RESPONSE DIFFERENCES BETWEEN ...
Total organic carbon (TOC) and dissolved organic carbon (DOC) have long been used to estimate the amount of natural organic matter (NOM) found in raw and finished drinking water. In recent years, computer automation and improved instrumental analysis technologies have created a variety of TOC instrument systems. However, the amount of organic carbon (OC) measured in a sample has been found to depend upon the way a specific TOC instrument treats the sample and the way the OC is calculated and reported. Specifically, relative instrument response differences for TOC/DOC, ranging from 15% to 62%, were documented when five different source waters were each analyzed by five different TOC instrument systems operated according to the manufacturer's specifications. Problems and possible solutions for minimizing these differences are discussed. Objectives: establish optimum performance criteria for current TOC technologies for application to the Stage 2 D/DBP Rule; develop a TOC and SUVA (incorporating DOC and UV254) method to be published in the Stage 2 D/DBP Rule that will meet requirements as stated in the Stage 1 D/DBP Rule (Revise Method 415.3,
Breckons, Matthew; Jones, Ray; Morris, Jenny; Richardson, Janet
2008-01-22
Developers of health information websites aimed at consumers need methods to assess whether their website is of "high quality." Due to the nature of complementary medicine, website information is diverse and may be of poor quality. Various methods have been used to assess the quality of websites, the two main approaches being (1) to compare the content against some gold standard, and (2) to rate various aspects of the site using an assessment tool. We aimed to review available evaluation instruments to assess their performance when used by a researcher to evaluate websites containing information on complementary medicine and breast cancer. In particular, we wanted to see if instruments used the same criteria, agreed on the ranking of websites, were easy to use by a researcher, and if use of a single tool was sufficient to assess website quality. Bibliographic databases, search engines, and citation searches were used to identify evaluation instruments. Instruments were included that enabled users with no subject knowledge to make an objective assessment of a website containing health information. The elements of each instrument were compared to nine main criteria defined by a previous study. Google was used to search for complementary medicine and breast cancer sites. The first six results and a purposive six from different origins (charities, sponsored, commercial) were chosen. Each website was assessed using each tool, and the percentage of criteria successfully met was recorded. The ranking of the websites by each tool was compared. The use of the instruments by others was estimated by citation analysis and Google searching. A total of 39 instruments were identified, 12 of which met the inclusion criteria; the instruments contained between 4 and 43 questions. When applied to 12 websites, there was agreement of the rank order of the sites with 10 of the instruments. Instruments varied in the range of criteria they assessed and in their ease of use. 
Comparing the content of websites against a gold standard is time consuming and only feasible for very specific advice. Evaluation instruments offer gateway providers a method to assess websites. The checklist approach has face validity when results are compared to the actual content of "good" and "bad" websites. Although instruments differed in the range of items assessed, there was fair agreement between most available instruments. Some were easier to use than others, but these were not necessarily the instruments most widely used to date. Combining some of the better features of instruments to provide fewer, easy-to-use methods would be beneficial to gateway providers.
Yao, Yuan; Zhang, Huiyu; Liu, Huan; Zhang, Zhengfeng; Tang, Yu; Zhou, Yue
2017-08-01
Anterior debridement/bone grafting/posterior instrumentation is a common choice for the treatment of lumbar spinal tuberculosis (LST). To date, no study has focused on the prognostic factors for recovery after this surgery. We included 144 patients who underwent anterior debridement/bone grafting/posterior instrumentation for LST. The recovery rate based on the Japanese Orthopedic Association (JOA) score was used to assess recovery. The Kaplan-Meier method and Cox regression analysis were used to identify the prognostic factors for postoperative recovery. For the prognostic factors worth further consideration, the changes in JOA scores within the 24-month follow-up period were identified by repeated-measures analysis of variance. Paralysis/nonparalysis, duration of symptoms (≥3/<3 months), number of involved vertebrae (>2/≤2), and posterior open/percutaneous instrumentation were identified as prognostic factors for postoperative recovery. The prognostic factor of open/percutaneous instrumentation was then further compared for potential clinical application. Patients in the percutaneous instrumentation group achieved higher JOA scores than those in the open instrumentation group in the early stages postoperatively (1-3 months), but this effect equalized at 6 months postoperatively. Patients in the open instrumentation group had longer operation times and lower costs than those in the percutaneous instrumentation group. Nonparalysis, shorter symptom duration, fewer involved vertebrae, and posterior percutaneous instrumentation (compared with open instrumentation) are considered favorable prognostic factors. Patients in the percutaneous instrumentation group achieved higher JOA scores in the early stages postoperatively (1-3 months), but no significant difference was observed in long-term JOA scores (6-24 months). Copyright © 2017. Published by Elsevier Inc.
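The Kaplan-Meier analysis mentioned above estimates the fraction of patients who have not yet reached the endpoint at each follow-up time. A minimal product-limit sketch, with hypothetical months-to-endpoint data (1 = event observed, 0 = censored), is:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    times:  follow-up time for each patient
    events: 1 if the endpoint (e.g. recovery) was observed, 0 if censored
    Returns (time, survival) pairs at each distinct event time.
    """
    s = 1.0
    curve = []
    for t in sorted({t for t, e in zip(times, events) if e}):
        at_risk = sum(1 for ti in times if ti >= t)
        d = sum(1 for ti, e in zip(times, events) if ti == t and e)
        s *= 1 - d / at_risk
        curve.append((t, s))
    return curve

# Hypothetical months-to-endpoint data for eight patients
times = [1, 2, 3, 3, 6, 6, 12, 24]
events = [1, 1, 1, 0, 1, 0, 1, 0]

for t, s in kaplan_meier(times, events):
    print(f"month {t:2d}: S(t) = {s:.3f}")
```

The study paired this estimator with Cox regression to test covariates such as paralysis status and symptom duration; standard statistical packages implement both.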
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
The availability of analytical instruments for methodical detection of known and unknown effluents poses a serious hindrance to qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive and time consuming but also require maintenance and replacement of damaged essential parts, which is a serious concern. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every site. Therefore, a pre-concentration technique for metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key immobilization technique: it is simple, user friendly, highly effective, inexpensive, and time efficient, and a 10-20 g vial is easy to carry to the experimental field/site, as has been demonstrated.
Samide, Michael J; Smith, Gregory D
2015-12-24
Construction materials used in museums for the display, storage, and transportation of artwork must be assessed for their tendency to emit harmful pollution that could potentially damage cultural treasures. Traditionally, a subjective metals corrosion test known as the Oddy test has been widely utilized in museums for this purpose. To augment the Oddy test, an instrumental sampling approach based on evolved gas analysis (EGA) coupled to gas chromatography (GC) with mass spectral (MS) detection has been implemented for the first time to qualitatively identify off-gassed pollutants under specific conditions. This approach is compared to other instrumental methods reported in the literature. This novel application of the EGA sampling technique yields several benefits over traditional testing, including rapidity, high sensitivity, and broad detectability of volatile organic compounds (VOCs). Furthermore, unlike other reported instrumental approaches, the EGA method was used to determine quantitatively the amount of VOCs emitted by acetate resins and polyurethane foams under specific conditions using both an external calibration method as well as surrogate response factors. EGA was successfully employed to rapidly characterize emissions from 12 types of common plastics. This analysis is advocated as a rapid pre-screening method to rule out poorly performing materials prior to investing time and energy in Oddy testing. The approach is also useful for rapid, routine testing of construction materials previously vetted by traditional testing, but which may experience detrimental formulation changes over time. As an example, a case study on batch re-orders of rigid expanded poly(vinyl chloride) board stock is presented. Copyright © 2015 Elsevier B.V. All rights reserved.
Pincus MR, Lifshitz MS, Bock JL. Analysis: principles of instrumentation. In: McPherson RA, Pincus MR, eds. Henry's Clinical Diagnosis and Management by Laboratory Methods . 23rd ed. St Louis, MO: Elsevier; 2017:chap 4.
Seismic response analysis of an instrumented building structure
Li, H.-J.; Zhu, S.-Y.; Celebi, M.
2003-01-01
The Sheraton-Universal Hotel, an instrumented building in North Hollywood, USA, is selected as a case study in this paper. The finite element method is used to produce a linear time-invariant structural model, and the SAP2000 program is employed for the time-history analysis of the instrumented structure under the base excitation of strong motions recorded in the basement during the Northridge, California earthquake of 17 January 1994. The calculated structural responses are compared with the recorded data in both the time domain and the frequency domain, and the effects of structural parameter evaluation and indeterminate factors are discussed. Some features of the structural response, such as why the peak acceleration responses on the ninth floor are larger than those on the sixteenth floor, are also explained.
NASA Astrophysics Data System (ADS)
Trahan, Alexis Chanel
New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (alpha, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (alpha,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (alpha,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. 
Simulation of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were tested on a variety of spontaneous fission-driven fresh fuel assemblies at Los Alamos National Laboratory and the BeRP ball at the Nevada National Security Site. The development of the new, improved analysis and characterization methods with the DDSI instrument makes it a viable technique for implementation in a facility to meet material control and safeguards needs.
Lin, Wei; Feng, Rui; Li, Hongzhe
2014-01-01
In genetical genomics studies, it is important to jointly analyze gene expression data and genetic variants in exploring their associations with complex traits, where the dimensionality of gene expressions and genetic variants can both be much larger than the sample size. Motivated by such modern applications, we consider the problem of variable selection and estimation in high-dimensional sparse instrumental variables models. To overcome the difficulty of high dimensionality and unknown optimal instruments, we propose a two-stage regularization framework for identifying and estimating important covariate effects while selecting and estimating optimal instruments. The methodology extends the classical two-stage least squares estimator to high dimensions by exploiting sparsity using sparsity-inducing penalty functions in both stages. The resulting procedure is efficiently implemented by coordinate descent optimization. For the representative L1 regularization and a class of concave regularization methods, we establish estimation, prediction, and model selection properties of the two-stage regularized estimators in the high-dimensional setting where the dimensionalities of covariates and instruments are both allowed to grow exponentially with the sample size. The practical performance of the proposed method is evaluated by simulation studies and its usefulness is illustrated by an analysis of mouse obesity data. Supplementary materials for this article are available online. PMID:26392642
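The two-stage idea underlying the paper's estimator can be seen in the classical low-dimensional case: regress the endogenous covariate on the instrument, then regress the outcome on the fitted values. The paper's contribution is to add sparsity-inducing penalties to both stages; the toy data below are constructed (hypothetically) so that the instrument z is exactly uncorrelated with the confounder u, while naive OLS of y on x is biased.

```python
def ols(x, y):
    """Simple-regression intercept and slope by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return my - slope * mx, slope

def two_stage_ls(z, x, y):
    """Classical 2SLS with one instrument and one endogenous covariate."""
    a0, a1 = ols(z, x)                 # stage 1: project x onto the instrument
    xhat = [a0 + a1 * zi for zi in z]  # fitted (exogenous) part of x
    return ols(xhat, y)                # stage 2: outcome on the fitted values

# Toy data: x = z + u, y = 2*x - 3*u, true causal effect = 2
z = [0, 1, 2, 3, 4, 5]
u = [1, -1, 0, 0, -1, 1]       # confounder, orthogonal to z by construction
x = [zi + ui for zi, ui in zip(z, u)]
y = [2 * xi - 3 * ui for xi, ui in zip(x, u)]

_, beta_2sls = two_stage_ls(z, x, y)
_, beta_ols = ols(x, y)
print(round(beta_2sls, 3), round(beta_ols, 3))  # 2SLS recovers 2.0; OLS is biased
```

In the high-dimensional setting of the paper, both `ols` calls would be replaced by penalized regressions (e.g. L1) solved by coordinate descent.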
This research plan has several objectives: 1) develop new or refine existing chemical, instrument and biological methods for the detection of cyanobacteria and their toxins; test such methods in field studies in both HAB and non HAB environments; 2) determine the method(s) that c...
Method and apparatus for ceramic analysis
Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr
2003-04-01
The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.
Measuring the Solar Radius from Space during the 2012 Venus Transit
NASA Astrophysics Data System (ADS)
Emilio, M.; Couvidat, S.; Bush, R. I.; Kuhn, J. R.; Scholl, I. F.
2015-01-01
We report in this work the determination of the solar radius from observations by the Helioseismic and Magnetic Imager (HMI) and the Atmospheric Imaging Assembly (AIA) instruments on board the Solar Dynamics Observatory during the 2012 June Venus transit of the Sun. Two different methods were utilized to determine the solar radius using images of the Sun taken by the HMI instrument. The first technique fit the measured trajectory of Venus in front of the Sun for seven wavelengths across the Fe I absorption line at 6173 Å. The solar radius determined from this method varies with the measurement wavelength, reflecting the variation in the height of line formation. The second method measured the area of the Sun obscured by Venus to determine the transit duration, from which the solar radius was derived. This analysis focused on measurements taken in the continuum wing of the line and applied a correction for the instrumental point spread function (PSF) of the HMI images. Measurements taken in the continuum wing of the 6173 Å line resulted in a derived solar radius at 1 AU of 959.''57 ± 0.''02 (695,946 ± 15 km). The AIA instrument observed the Venus transit at ultraviolet wavelengths. Using the solar disk obscuration technique, similar to that applied to the HMI images, analysis of the AIA data resulted in values of R ⊙ = 963.''04 ± 0.''03 at 1600 Å and R ⊙ = 961.''76 ± 0.''03 at 1700 Å.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoegg, Edward D.; Marcus, R. Kenneth; Hager, George J.
RATIONALE: The field of highly accurate and precise isotope ratio (IR) analysis has been dominated by inductively coupled plasma and thermal ionization mass spectrometers. While these instruments are considered the gold standard for IR analysis, the International Atomic Energy Agency desires a field-deployable instrument capable of accurately and precisely measuring U isotope ratios. METHODS: The proposed system interfaces the liquid sampling - atmospheric pressure glow discharge (LS-APGD) ion source with a high-resolution Exactive Orbitrap mass spectrometer. With this experimental setup, certified U isotope standards and unknown samples were analyzed, and the accuracy and precision of the system were then determined. RESULTS: The LS-APGD/Exactive instrument measures a certified reference material of natural U (235U/238U = 0.007258) as 0.007041 with a relative standard deviation of 0.158%, meeting the International Target Values for Uncertainty for the destructive analysis of U. Additionally, when three unknowns were measured and compared with the results from an ICP multi-collector instrument, there was no statistical difference between the two instruments. CONCLUSIONS: The LS-APGD/Orbitrap system, while still in the preliminary stages of development, offers highly accurate and precise IR analysis that suggests a paradigm shift in the world of IR analysis. Furthermore, the portability of the LS-APGD as an elemental ion source, combined with the low overhead and small size of the Orbitrap, suggests that the instrumentation is capable of being field deployable. With liquid sampling glow discharge-Orbitrap MS, isotope ratio and precision performance improves with rejection of concomitant ion species.
ERIC Educational Resources Information Center
de Leng, Bas A.; Dolmans, Diana H. J. M.; Donkers, H. H. L. M.; Muijtjens, Arno M. M.; van der Vleuten, Cees P. M.
2010-01-01
In the complex practice of today's blended learning, educators need to be able to evaluate both online and face-to-face communication in order to get the full picture of what is going on in blended learning scenarios. The aim of this study was to investigate the reliability and feasibility of a practical instrument for analysing face-to-face…
Establishment of gold-quartz standard GQS-1
Millard, Hugh T.; Marinenko, John; McLane, John E.
1969-01-01
A homogeneous gold-quartz standard, GQS-1, was prepared from a heterogeneous gold-bearing quartz by chemical treatment. The concentration of gold in GQS-1 was determined by both instrumental neutron activation analysis and radioisotope dilution analysis to be 2.61 ± 0.10 parts per million. Analysis of 10 samples of the standard by both instrumental neutron activation analysis and radioisotope dilution analysis failed to reveal heterogeneity within the standard. The precision of the analytical methods, expressed as standard error, was approximately 0.1 part per million. The analytical data were also used to estimate the average size of gold particles. The chemical treatment apparently reduced the average diameter of the gold particles by at least an order of magnitude and increased the concentration of gold grains by a factor of at least 4,000.
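The "mean ± standard error" summary reported for GQS-1 can be sketched directly from replicate analyses. The ten values below are hypothetical; only the summary statistics (≈2.61 ppm with a standard error of roughly 0.1 ppm) appear in the abstract.

```python
# Sketch: a concentration expressed as mean +/- standard error over
# replicate analyses. The replicate values are hypothetical.
import math
import statistics

def mean_and_se(values):
    """Mean and standard error (sample stdev / sqrt(n)) of replicates."""
    return statistics.mean(values), statistics.stdev(values) / math.sqrt(len(values))

replicates = [2.5, 2.8, 2.4, 2.7, 2.6, 2.9, 2.3, 2.6, 2.5, 2.8]  # hypothetical ppm
m, se = mean_and_se(replicates)
print(f"Au = {m:.2f} ± {se:.2f} ppm")
```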
PASTERNAK-JÚNIOR, Braulio; de SOUSA NETO, Manoel Damião; DIONÍSIO, Valdeci Carlos; PÉCORA, Jesus Djalma; SILVA, Ricardo Gariba
2012-01-01
Objective This study assessed muscular activity during root canal preparation through kinematics, kinetics, and electromyography (EMG). Material and Methods The operators prepared one canal with RaCe rotary instruments and another with Flexo-files. The kinematics of the major joints was reconstructed using an optoelectronic system, and electromyographic responses of the flexor carpi radialis, extensor carpi radialis, brachioradialis, biceps brachii, triceps brachii, middle deltoid, and upper trapezius were recorded. The joint torques of the shoulder, elbow, and wrist were calculated using inverse dynamics. In the kinematic analysis, angular movements of the wrist and elbow were classified as low-risk factors for work-related musculoskeletal disorders; for the shoulder, the classification was medium-risk. Results The kinetic reports revealed no significant difference. The EMG results showed that rotary instrumentation elicited higher values for the middle deltoid and upper trapezius, whereas the flexor carpi radialis, extensor carpi radialis, and brachioradialis showed higher values with the manual method. Conclusion The muscular recruitment underlying the joint movements of root canal preparation is distinct for the rotary and manual techniques. Nevertheless, the rotary instrument presented less difficulty in generating joint torque at each articulation, thus presenting greater uniformity of joint torques. PMID:22437679
Design, validation, and use of an evaluation instrument for monitoring systemic reform
NASA Astrophysics Data System (ADS)
Scantlebury, Kathryn; Boone, William; Butler Kahle, Jane; Fraser, Barry J.
2001-08-01
Over the past decade, state and national policymakers have promoted systemic reform as a way to achieve high-quality science education for all students. However, few instruments are available to measure changes in key dimensions relevant to systemic reform such as teaching practices, student attitudes, or home and peer support. Furthermore, Rasch methods of analysis are needed to permit valid comparison of different cohorts of students during different years of a reform effort. This article describes the design, development, validation, and use of an instrument that measures student attitudes and several environment dimensions (standards-based teaching, home support, and peer support) using a three-step process that incorporated expert opinion, factor analysis, and item response theory. The instrument was validated with over 8,000 science and mathematics students, taught by more than 1,000 teachers in over 200 schools as part of a comprehensive assessment of the effectiveness of Ohio's systemic reform initiative. When the new four-factor, 20-item questionnaire was used to explore the relative influence of the class, home, and peer environment on student achievement and attitudes, findings were remarkably consistent across 3 years and different units and methods of analysis. All three environments accounted for unique variance in student attitudes, but only the environment of the class accounted for unique variance in student achievement. However, the class environment (standards-based teaching practices) was the strongest independent predictor of both achievement and attitude, and appreciable amounts of the total variance in attitudes were common to the three environments.
The clinical learning environment and supervision by staff nurses: developing the instrument.
Saarikoski, Mikko; Leino-Kilpi, Helena
2002-03-01
The aims of this study were (1) to describe students' perceptions of the clinical learning environment and clinical supervision and (2) to develop an evaluation scale by using the empirical results of this study. The data were collected using the Clinical Learning Environment and Supervision instrument (CLES). The instrument was based on the literature review of earlier studies. The derived instrument was tested empirically in a study involving nurse students (N=416) from four nursing colleges in Finland. The results demonstrated that the method of supervision, the number of separate supervision sessions and the psychological content of supervisory contact within a positive ward atmosphere are the most important variables in the students' clinical learning. The results also suggest that ward managers can create the conditions of a positive ward culture and a positive attitude towards students and their learning needs. The construct validity of the instrument was analysed by using exploratory factor analysis. The analysis indicated that the most important factor in the students' clinical learning is the supervisory relationship. The two most important factors constituting a 'good' clinical learning environment are the management style of the ward manager and the premises of nursing on the ward. The results of the factor analysis support the theoretical construction of the clinical learning environment modelled by earlier empirical studies.
Sims, Mario; Wyatt, Sharon B.; Gutierrez, Mary Lou; Taylor, Herman A.; Williams, David R.
2009-01-01
Objective Assessing the discrimination-health disparities hypothesis requires psychometrically sound, multidimensional measures of discrimination. Among the available discrimination measures, few are multidimensional and none have adequate psychometric testing in a large African American sample. We report the development and psychometric testing of the multidimensional Jackson Heart Study Discrimination (JHSDIS) Instrument. Methods A multidimensional measure assessing the occurrence, frequency, attribution, and coping responses to perceived everyday and lifetime discrimination; lifetime burden of discrimination; and effect of skin color was developed and tested in the 5302-member cohort of the Jackson Heart Study. Internal consistency was calculated using the Cronbach α coefficient. Confirmatory factor analysis established the dimensions, and intercorrelation coefficients assessed the discriminant validity of the instrument. Setting Tri-county area of the Jackson, MS metropolitan statistical area. Results The JHSDIS was psychometrically sound (overall α = .78; .84 and .77 for the everyday and lifetime subscales, respectively). Confirmatory factor analysis yielded 11 factors, which confirmed the a priori dimensions represented. Conclusions The JHSDIS combined three scales into a single multidimensional instrument with good psychometric properties in a large sample of African Americans. This analysis lays the foundation for using this instrument in research that will examine the association between perceived discrimination and CVD among African Americans. PMID:19341164
Topics in Chemical Instrumentation.
ERIC Educational Resources Information Center
Settle, Frank A. Jr., Ed.
1989-01-01
Using Fourier transformation methods in nuclear resonance has made possible increased sensitivity in chemical analysis. This article describes data acquisition, data processing, and the frequency spectrum as they relate to this technique. (CW)
Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course
ERIC Educational Resources Information Center
Lanigan, Katherine C.
2008-01-01
Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…
NASA Astrophysics Data System (ADS)
Kanjanapen, Manorth; Kunsombat, Cherdsak; Chiangga, Surasak
2017-09-01
The functional transformation method (FTM) is a powerful tool for digital sound synthesis by physical modeling: it solves the underlying partial differential equation (PDE) directly, yielding the synthesized sound and the vibrational characteristics at discretized points on real instruments. In this paper, we apply Higuchi's method to examine differences in timbre between tones and to estimate the fractal dimension of musical signals synthesized by the FTM, which carries information about their geometrical structure. The whole process based on Higuchi's method is uncomplicated and fast, allowing analysis without expertise in physics or virtuoso musicianship, so that non-specialists can readily judge whether sounds are similar.
Psychometric Properties of a Screening Instrument for Domestic Violence in a Sample of Iranian Women
Azadarmaki, Taghi; Kassani, Aziz; Menati, Rostam; Hassanzadeh, Jafar; Menati, Walieh
2016-01-01
Background Domestic violence against women is regarded as an important health problem among women and a serious concern in issues related to human rights. To date, a few screening tools for domestic violence exist for Iranian married women, but they assess only some of the domestic violence components. Objectives The present study aimed to design and determine the validity and reliability of a screening instrument for domestic violence in a sample of Iranian women. Materials and Methods The present study was a cross-sectional psychometric evaluation conducted on 350 married women in Ilam, Iran, in 2014. The samples were selected through multistage sampling and the main method was cluster sampling. A 20-item, self-administered questionnaire was validated by exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). An Eigen value > 1 and a loading factor > 0.3 for each component were considered as indices for extracting domestic violence components. Reliability was calculated by test-retest and Cronbach's alpha. Also, the content validity index (CVI) and content validity ratio (CVR) were used to measure content validity. The data were analyzed using SPSS-13 and LISREL 8.8 software programs. Results The self-administered instrument was completed by 334 women. The CFA and EFA confirmed the placement of items and the three-factor structure of the instrument, comprising psychological, physical, and sexual violence, which explained 66% of the total variance of domestic violence. The ICC and Cronbach's alpha coefficients were > 0.7 for the components of the questionnaire. The test-retest also revealed strong correlations for each of the domestic violence components (r > 0.6). Conclusions The instrument for measuring domestic violence had desirable validity and reliability and can be used as a suitable instrument in health and social research in the local population. PMID:27331052
NASA Astrophysics Data System (ADS)
Takatsuka, Toshiko; Hirata, Kouichi; Kobayashi, Yoshinori; Kuroiwa, Takayoshi; Miura, Tsutomu; Matsue, Hideaki
2008-11-01
Certified reference materials (CRMs) of shallow arsenic implants in silicon are now under development at the National Metrology Institute of Japan (NMIJ). The amount of ion-implanted arsenic atoms is quantified by Instrumental Neutron Activation Analysis (INAA) using research reactor JRR-3 in Japan Atomic Energy Agency (JAEA). It is found that this method can evaluate arsenic amounts of 1015 atoms/cm2 with small uncertainties, and is adaptable to shallower dopants. The estimated uncertainties can satisfy the industrial demands for reference materials to calibrate the implanted dose of arsenic at shallow junctions.
Isothermal Titration Calorimetry Can Provide Critical Thinking Opportunities
ERIC Educational Resources Information Center
Moore, Dale E.; Goode, David R.; Seney, Caryn S.; Boatwright, Jennifer M.
2016-01-01
College chemistry faculties might not have considered including isothermal titration calorimetry (ITC) in their majors' curriculum because experimental data from this instrumental method are often analyzed via automation (software). However, the software-based data analysis can be replaced with a spreadsheet-based analysis that is readily…
Turbine Engine Hot Section Technology, 1987
NASA Technical Reports Server (NTRS)
1987-01-01
Presentations were made concerning the development of design analysis tools for combustor liners, turbine vanes, and turbine blades. Presentations were divided into six sections: instrumentation, combustion, turbine heat transfer, structural analysis, fatigue and fracture, surface protective coatings, constitutive behavior of materials, stress-strain response and life prediction methods.
Data analysis strategies for reducing the influence of the bias in cross-cultural research.
Sindik, Josko
2012-03-01
In cross-cultural research, researchers have to adjust the constructs and associated measurement instruments that have been developed in one culture and then imported for use in another culture. Importing concepts from other cultures is often simply reduced to language adjustment of the content in the items of the measurement instruments that define a certain (psychological) construct. In the context of cross-cultural research, test bias can be defined as a generic term for all nuisance factors that threaten the validity of cross-cultural comparisons. Bias can be an indicator that instrument scores based on the same items measure different traits and characteristics across different cultural groups. To reduce construct, method and item bias, the researcher can consider these strategies: (1) simply comparing average results in certain measuring instruments; (2) comparing only the reliability of certain dimensions of the measurement instruments, applied to the "target" and "source" samples of participants, i.e. from different cultures; (3) comparing the "framed" factor structure (fixed number of factors) of the measurement instruments, applied to the samples from the "target" and "source" cultures, using an exploratory factor analysis strategy on separate samples; (4) comparing the complete constructs ("unframed" factor analysis, i.e. unlimited number of factors) in relation to their best psychometric properties and interpretability (best suited to certain cultures, applying an exploratory factor analysis strategy); or (5) checking the similarity of the constructs in the samples from different cultures (using a structural equation modeling approach). Each approach has its advantages and disadvantages, and these are discussed.
NASA Astrophysics Data System (ADS)
Panczyk, E.; Ligeza, M.; Walis, L.
1999-01-01
In the Institute of Nuclear Chemistry and Technology in Warsaw, in collaboration with the Department of Preservation and Restoration of Works of Art of the Academy of Fine Arts in Cracow and the National Museum in Warsaw, systematic studies using nuclear methods, particularly instrumental neutron activation analysis and X-ray fluorescence analysis, have been carried out on panel paintings of the Krakowska-Nowosadecka School and the Silesian School from the XIV-XVII century, on Chinese and Thai porcelains, and on the mummy fillings of Egyptian sarcophagi. These studies will add new data to the existing database and will permit comparison of the materials used by various schools and individual artists.
EASY-An Instrument for Surveillance of Physical Activity in Youth.
Pate, Russell R; McIver, Kerry; Dowda, Marsha; Schenkelberg, Michaela A; Beets, Michael; DiStefano, Christine
2018-01-23
Physical activity (PA) promotion among youth is a public health priority and there is a need for robust surveillance systems to help support such initiatives. Existing youth PA self-report instruments that are used for surveillance lack information regarding the types and contexts of activity. Further, these instruments have limited validity with accelerometry. The purpose of the present study was to develop a self-report instrument, with sound psychometric properties, for monitoring compliance with PA guidelines in youth. In focus groups, 162 middle school students identified 30 forms of PA that are highly prevalent in that age group. We incorporated these activities into three preliminary forms of a self-report instrument. An independent sample of middle school students (n = 537) was randomly assigned to complete one of the three preliminary versions of the instrument. Rasch analysis was applied to the responses to the three formats, and a yes/no plus frequency format emerged as the preferred method. A third sample of 342 middle school students then completed the yes/no plus frequency instrument twice following a seven-day period during which they wore an accelerometer. Using both Rasch analysis and traditional correlational methods, validity and reliability of a 14-item instrument were established. Data were collected during 2012-2015. The Spearman correlation coefficient for the association between the cumulative score for the 14 items and minutes per day of accelerometry-derived moderate-to-vigorous physical activity (MVPA) was 0.33 (95% CI 0.22, 0.43; p<.001). Sensitivity and specificity of the 14-item instrument were 0.90 and 0.44, respectively.
The study produced a PA self-report instrument for youth that was found to be reliable (r = 0.91), valid versus accelerometry (r = 0.33), and acceptably specific and sensitive in detecting compliance with PA guidelines.
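The sensitivity (0.90) and specificity (0.44) reported above come from a standard confusion-matrix calculation against accelerometer-derived compliance. The counts below are hypothetical, chosen only to reproduce the shape of the computation; they are not the study's data.

```python
# Sketch: sensitivity and specificity from a 2x2 classification table.
# Counts are hypothetical; only the two summary values appear in the abstract.

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts for 342 students screened against accelerometry
sens, spec = sens_spec(tp=90, fn=10, tn=106, fp=136)
print(f"sensitivity={sens:.2f}  specificity={spec:.2f}")
```

The trade-off visible here (many false positives at a high-sensitivity threshold) is exactly the pattern the abstract describes.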
SVD analysis of Aura TES spectral residuals
NASA Technical Reports Server (NTRS)
Beer, Reinhard; Kulawik, Susan S.; Rodgers, Clive D.; Bowman, Kevin W.
2005-01-01
Singular Value Decomposition (SVD) analysis is both a powerful diagnostic tool and an effective method of noise filtering. We present the results of an SVD analysis of an ensemble of spectral residuals acquired in September 2004 from a 16-orbit Aura Tropospheric Emission Spectrometer (TES) Global Survey and compare them to alternative methods such as zonal averages. In particular, the technique highlights issues such as the orbital variation of instrument response and incompletely modeled effects of surface emissivity and atmospheric composition.
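The noise-filtering use of SVD described in this abstract, truncating to the leading singular components of an ensemble of residual spectra, can be sketched on synthetic data. The "residuals" below are illustrative; TES data are not reproduced here, and the rank choice k=1 is an assumption of the toy setup.

```python
# Sketch: SVD-based noise filtering of an ensemble of spectral residuals.
# A shared systematic pattern plus white noise is filtered by keeping only
# the leading singular component. All data here are synthetic.
import numpy as np

def svd_filter(residuals, k):
    """Keep the k leading singular components of a (spectra x channels) matrix."""
    u, s, vt = np.linalg.svd(residuals, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k, :]

rng = np.random.default_rng(0)
pattern = np.sin(np.linspace(0, 2 * np.pi, 200))    # shared systematic residual
data = np.outer(rng.normal(1.0, 0.1, 50), pattern)  # 50 spectra x 200 channels
noisy = data + rng.normal(0.0, 0.05, data.shape)    # add white noise

filtered = svd_filter(noisy, k=1)
print(np.linalg.norm(noisy - data), np.linalg.norm(filtered - data))
```

Because the underlying ensemble is rank 1, the truncated reconstruction lies much closer to the noise-free data than the raw noisy matrix does, which is the diagnostic/filtering property exploited in the analysis.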
Estimation of ovular fiber production in cotton
Van't Hof, Jack
1998-09-01
The present invention is a method for rendering cotton fiber cells that are post-anthesis and pre-harvest available for analysis of their physical properties. The method includes the steps of hydrolyzing cotton fiber cells and separating cotton fiber cells from cotton ovules thereby rendering the cells available for analysis. The analysis of the fiber cells is through any suitable means, e.g., visual inspection. Visual inspection of the cells can be accomplished by placing the cells under an instrument for detection, such as microscope or other means.
Transportation forecasting : analysis and quantitative methods
DOT National Transportation Integrated Search
1983-01-01
This Record contains the following papers: Development of Survey Instruments Suitable for Determining Non-Home Activity Patterns; Sequential, History-Dependent Approach to Trip-Chaining Behavior; Identifying Time and History Dependencies of Activity ...
QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.
Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O
2018-04-17
Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has similar accuracy as standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments.
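The core idea of near-real-time QC flagging, comparing each new run's quality metric against a rolling baseline of recent runs as data arrive, can be sketched as follows. This is a minimal illustration of the concept, not the actual QC-ART algorithm; the window size, z-threshold, and metric stream are all assumptions.

```python
# Sketch: rolling z-score outlier flagging of a per-run QC metric,
# in the spirit of real-time quality control (not the QC-ART method itself).
import statistics
from collections import deque

def make_flagger(window=10, z_threshold=3.0):
    """Return a function that flags each incoming metric value as it arrives."""
    history = deque(maxlen=window)
    def flag(value):
        outlier = False
        if len(history) >= 3:  # need a minimal baseline before flagging
            mu = statistics.mean(history)
            sd = statistics.stdev(history)
            outlier = sd > 0 and abs(value - mu) / sd > z_threshold
        history.append(value)
        return outlier
    return flag

flag = make_flagger()
stream = [10.1, 9.9, 10.0, 10.2, 9.8, 25.0, 10.1]  # hypothetical QC metric per run
flags = [flag(v) for v in stream]
print(flags)  # the 25.0 run should be flagged
```

An outlier flag here would be the trigger for intervention (cleaning, re-calibration) before more low-quality runs accumulate.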
Edwards, D. L.; Saleh, A. A.; Greenspan, S. L.
2015-01-01
Summary We performed a systematic review and meta-analysis of the performance of clinical risk assessment instruments for screening for DXA-determined osteoporosis or low bone density. Commonly evaluated risk instruments showed high sensitivity approaching or exceeding 90 % at particular thresholds within various populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. Introduction The purpose of the study is to systematically review the performance of clinical risk assessment instruments for screening for dual-energy X-ray absorptiometry (DXA)-determined osteoporosis or low bone density. Methods Systematic review and meta-analysis were performed. Multiple literature sources were searched, and data extracted and analyzed from included references. Results One hundred eight references met inclusion criteria. Studies assessed many instruments in 34 countries, most commonly the Osteoporosis Self-Assessment Tool (OST), the Simple Calculated Osteoporosis Risk Estimation (SCORE) instrument, the Osteoporosis Self-Assessment Tool for Asians (OSTA), the Osteoporosis Risk Assessment Instrument (ORAI), and body weight criteria. Meta-analyses of studies evaluating OST using a cutoff threshold of <1 to identify US postmenopausal women with osteoporosis at the femoral neck provided summary sensitivity and specificity estimates of 89 % (95%CI 82–96 %) and 41 % (95%CI 23–59 %), respectively. Meta-analyses of studies evaluating OST using a cutoff threshold of 3 to identify US men with osteoporosis at the femoral neck, total hip, or lumbar spine provided summary sensitivity and specificity estimates of 88 % (95%CI 79–97 %) and 55 % (95%CI 42–68 %), respectively. 
Frequently evaluated instruments each had thresholds and populations for which sensitivity for osteoporosis or low bone mass detection approached or exceeded 90 % but always with a trade-off of relatively low specificity. Conclusions Commonly evaluated clinical risk assessment instruments each showed high sensitivity approaching or exceeding 90 % for identifying individuals with DXA-determined osteoporosis or low BMD at certain thresholds in different populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. PMID:25644147
Dynamics of land - use change in urban area in West Jakarta
NASA Astrophysics Data System (ADS)
Pangaribowo, R. L.
2018-01-01
The aim of this research is to determine how land use changed in West Jakarta over the period 2000-2010. The research method is descriptive with a quantitative approach. Data gathered with the research instruments, observation and documentation, were analyzed to identify the drivers of land-use change, and the changes themselves were analyzed using GIS (Geographic Information System) in the ArcView GIS 3.3 program together with the quantitative Location Quotient (LQ) and Shift-Share Analysis (SSA) models. Based on the analysis conducted, land-use change in West Jakarta over the 10-year period from 2000 to 2010 was driven by several interrelated aspects: political, economic, demographic, and cultural. Land-use change occurred in an area that decreased by 367.79 hectares (2.87%); open space decreased by 103.36 hectares (0.8%), built-up area increased by 201.13 hectares (1.57%), and settlement area changed by 27.14 hectares (0.21%).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldmann, I. P., E-mail: ingo@star.ucl.ac.uk
2014-01-01
Independent component analysis (ICA) has recently been shown to be a promising new path in data analysis and de-trending of exoplanetary time series signals. Such approaches do not require or assume any prior or auxiliary knowledge about the data or instrument in order to de-convolve the astrophysical light curve signal from instrument or stellar systematic noise. These methods are often known as 'blind-source separation' (BSS) algorithms. Unfortunately, all BSS methods suffer from an amplitude and sign ambiguity of their de-convolved components, which severely limits these methods in low signal-to-noise (S/N) observations where their scalings cannot be determined otherwise. Here we present a novel approach to calibrate ICA using sparse wavelet calibrators. The Amplitude Calibrated Independent Component Analysis (ACICA) allows for the direct retrieval of the independent components' scalings and the robust de-trending of low S/N data. Such an approach gives us a unique and unprecedented insight into the underlying morphology of a data set, which makes this method a powerful tool for exoplanetary data de-trending and signal diagnostics.
ERIC Educational Resources Information Center
Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang
2006-01-01
This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Movahed, M. Sadegh; Khosravi, Shahram, E-mail: m.s.movahed@ipm.ir, E-mail: khosravi@ipm.ir
2011-03-01
In this paper we study the footprint of cosmic strings, as topological defects in the very early universe, on the cosmic microwave background radiation. We develop the method of level crossing analysis in the context of the well-known Kaiser-Stebbins phenomenon for exploring the signature of cosmic strings. We simulate a Gaussian map using the best-fit parameters given by WMAP-7 and then superimpose cosmic string effects on it as incoherent and active fluctuations. In order to investigate the capability of our method to detect cosmic strings for various values of the tension, Gμ, a simulated pure Gaussian map is compared with one including cosmic strings. Based on the level crossing analysis, superimposed cosmic strings with Gμ ≳ 4 × 10^−9 could be detected in the simulated map without instrumental noise at resolution R = 1'. In the presence of anticipated instrumental noise the lower bound increases to Gμ ≳ 5.8 × 10^−9.
Instrumentation for studying binder burnout in an immobilized plutonium ceramic wasteform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, M; Pugh, D; Herman, C
The Plutonium Immobilization Program produces a ceramic wasteform that utilizes organic binders. Several techniques and instruments were developed to study binder burnout on full-size ceramic samples in a production environment. This approach provides a method for developing process parameters on production scale to optimize throughput, product quality, offgas behavior, and plant emissions. These instruments allow for offgas analysis, large-scale TGA, product quality observation, and thermal modeling. Using these tools, results from lab-scale techniques such as laser dilatometry studies and traditional TGA/DTA analysis can be integrated. Often, the sintering step of a ceramification process is the limiting process step that controls the production throughput; therefore, optimization of sintering behavior is important for overall process success. Furthermore, the capabilities of this instrumentation allow better understanding of plant emissions of key gases: volatile organic compounds (VOCs), volatile inorganics including some halide compounds, NO{sub x}, SO{sub x}, carbon dioxide, and carbon monoxide.
A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Jin; Yu, Yaming; Van Dyk, David A.
2014-10-20
Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.
John-Baptiste, A.; Sowerby, L.J.; Chin, C.J.; Martin, J.; Rotenberg, B.W.
2016-01-01
Background: When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays containing redundant instruments with that of trays with reduced instrument counts ("reduced trays"). Methods: We performed a cost analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to those of reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. Results: The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Interpretation: Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems. PMID:27975045
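The headline arithmetic of the cost comparison can be checked directly. A minimal sketch using only the totals reported in the abstract (the component costs behind them are not reproduced here):

```python
# One-year tray costs reported in the abstract (2013 Canadian dollars).
cost_redundant_trays = 21_806
cost_reduced_trays = 8_803

annual_saving = cost_redundant_trays - cost_reduced_trays
print(annual_saving)  # 13003, the reported 1-year cost saving
```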
Customisation of an instrument to assess anaesthesiologists' non-technical skills.
Jepsen, Rikke M H G; Spanager, Lene; Lyk-Jensen, Helle T; Dieckmann, Peter; Østergaard, Doris
2015-02-22
The objectives of the study were to identify Danish anaesthesiologists' non-technical skills and to customise the Scottish-developed Anaesthetists' Non-Technical Skills instrument for Danish anaesthesiologists. Six semi-structured group interviews were conducted with 31 operating room team members: anaesthesiologists, nurse anaesthetists, surgeons, and scrub nurses. Interviews were transcribed verbatim and analysed using directed content analysis. Anaesthesiologists' non-technical skills were identified, coded, and sorted using the original instrument as a basis. The resulting prototype instrument was discussed with anaesthesiologists from 17 centres to ensure face validity. Interviews lasted 46-67 minutes. Identified examples of anaesthesiologists' good or poor non-technical skills fit the four categories in the original instrument: situation awareness; decision making; team working; and task management. Anaesthesiologists' leadership role in the operating room was emphasised: the original 'Task Management' category was renamed 'Leadership'. One new element, 'Demonstrating self-awareness', was added under the category 'Situation Awareness'. Compared with the original instrument, half of the behavioural markers were new, reflecting that being aware of and communicating one's own abilities to the team, working systematically, and speaking up to avoid adverse events were important skills. The Anaesthetists' Non-Technical Skills instrument was customised to a Danish setting using the identified non-technical skills for anaesthesiologists and the original instrument as a basis. The customised instrument comprises four categories and 16 underpinning elements supported by multiple behavioural markers. Identifying non-technical skills through semi-structured group interviews and analysing them using directed content analysis proved a useful method for customising an assessment instrument to another setting.
Instruments for Gathering Data
ERIC Educational Resources Information Center
Canals, Laia
2017-01-01
This chapter sets out various methods for gathering important data on the language uses of participants in a research project. These methods imply interaction between students, teachers and researchers. They are used in the design of research projects based on action research, ethnography or conversational analysis, this being the case with the…
de Arruda Santos, Leandro; López, Javier Bayod; de Las Casas, Estevam Barbosa; de Azevedo Bahia, Maria Guiomar; Buono, Vicente Tadeu Lopes
2014-04-01
To assess the flexibility and torsional stiffness of three nickel-titanium rotary instruments by finite element analysis and compare the numerical results with experiment. Mtwo (VDW, Munich, Germany) and RaCe (FKG Dentaire, La Chaux-de-Fonds, Switzerland) size 25, .06 taper (0.25-mm tip diameter, 6% conicity) and PTU F1 (Dentsply Maillefer, Ballaigues, Switzerland) instruments were selected for this study. Experimental tests to assess the flexibility and torsional stiffness of the files were performed according to specification ISO 3630-1. Geometric models for finite element analysis were obtained by micro-CT scanning. Boundary conditions for the numerical analysis were based on specification ISO 3630-1. A good agreement between the simulated and experimental moment-displacement curves was found for the three types of instruments studied. RaCe exhibited the highest flexibility and PTU presented the highest torsional stiffness. Maximum values of von Mises stress were found for the PTU F1 file (1185 MPa) under bending, whereas the values of von Mises stress for the three instruments were quite similar under torsion. The stress patterns proved to be different in Mtwo under bending, according to the displacement orientation. The favorable agreement found between simulation and experiment for the three types of instruments studied confirmed the potential of the numerical method to assess the mechanical behavior of endodontic instruments. Thus, a methodology is established to predict the failure of the instruments under bending and torsion. Copyright © 2014 Elsevier B.V. All rights reserved.
2013-01-01
Background Antibiotics overuse is a global public health issue influenced by several factors, of which some are parent-related psychosocial factors that can only be measured using valid and reliable psychosocial measurement instruments. The PAPA scale was developed to measure these factors and the content validity of this instrument was assessed. Aim This study further validated the recently developed instrument in terms of (1) face validity and (2) construct validity including: deciding the number and nature of factors, and item selection. Methods Questionnaires were self-administered to parents of children between the ages of 0 and 12 years old. Parents were conveniently recruited from schools’ parental meetings in the Eastern Province, Saudi Arabia. Face validity was assessed with regard to questionnaire clarity and unambiguity. Construct validity and item selection processes were conducted using exploratory factor analysis. Results Parallel analysis and exploratory factor analysis using principal axis factoring produced six factors in the developed instrument: knowledge and beliefs, behaviours, sources of information, adherence, awareness about antibiotic resistance, and parents’ perception regarding doctors’ prescribing behaviours. Reliability was assessed (Cronbach’s alpha = 0.78), demonstrating that the instrument is reliable. Conclusion The ‘factors’ produced in this study coincide with the constructs contextually identified in the development phase of other instruments used to study antibiotic use. However, no other study considering perceptions of antibiotic use had gone beyond content validation of such instruments. This study is the first to constructively validate the factors underlying perceptions regarding antibiotic use in any population and in parents in particular. PMID:23497151
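The reported reliability figure (Cronbach's alpha = 0.78) comes from a standard formula over item and total-score variances. A minimal sketch on simulated responses; the response matrix, item count, and sample size below are illustrative assumptions, not the PAPA data:

```python
import numpy as np

# Sketch of the Cronbach's alpha computation on a simulated parent-by-item
# response matrix (rows = respondents, columns = questionnaire items).
rng = np.random.default_rng(1)
true_score = rng.normal(size=(300, 1))                    # latent attitude
items = true_score + rng.normal(scale=1.0, size=(300, 8)) # 8 correlated items

def cronbach_alpha(x):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

print(round(cronbach_alpha(items), 2))
```

Values above roughly 0.7, like the 0.78 reported here, are conventionally read as acceptable internal consistency.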
Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Naik, Saraswathi V
2016-01-01
Background: Primary root canals are considered to be most challenging due to their complex anatomy. "Wave one" and "one shape" are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of wave one and one shape files in primary root canals using a cone beam computed tomographic (CBCT) analysis. Study design: This is an experimental, in vitro study comparing the two groups. Materials and methods: A total of 24 extracted human primary teeth with minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation times were evaluated for each group. Results: A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave one showed less canal transportation as compared with one shape, and the mean instrumentation time of wave one was significantly less than that of one shape. Conclusion: Reciprocating single-file systems were found to be faster with far fewer procedural errors and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49. PMID:27274155
Black carbon emissions in gasoline vehicle exhaust: a measurement and instrument comparison.
Kamboures, Michael A; Hu, Shishan; Yu, Yong; Sandoval, Julia; Rieger, Paul; Huang, Shiou-Mei; Zhang, Sherry; Dzhema, Inna; Huo, Darey; Ayala, Alberto; Chang, M C Oliver
2013-08-01
A pilot study was conducted to evaluate the performance and agreement of several commercially available black carbon (BC) measurement instruments, when applied to the quantification of BC in light-duty vehicle (LDV) exhaust. Samples from six vehicles, three fuels, and three driving cycles were used. The pilot study included determinations of the method detection limit (MDL) and repeatability. With respect to the MDL, the real-time instruments outperformed the time-integrated instruments, with MDL = 0.12 mg/mi for the AE51 Aethalometer, and 0.15 mg/mi for the Micro Soot Sensor (MSS), versus 0.38 mg/mi for the IMPROVE_A thermal/optical method, and 0.35 mg/mi for the OT21_T Optical Transmissometer. The real-time instruments had repeatability values ranging from 30% to 35%, which are somewhat better than those of the time-integrated instruments (40-41%). These results suggest that, despite being less resource intensive, real-time methods can be equivalent or superior to time-integrated methods in terms of sensitivity and repeatability. BC mass data, from the photoacoustic and light attenuation instruments, were compared against same-test EC data, determined using the IMPROVE_A method. The MSS BC data were well correlated with EC, with R2 = 0.85 for the composite results and R2 = 0.86 for the phase-by-phase (PBP) results. The correlation of BC, by the AE51, AE22, and OT21_T with EC was moderate to weak. The weaker correlation was driven by the inclusion of US06 test data in the linear regression analysis. We hypothesize that test-cycle-dependent BC:EC ratios are due to the different physicochemical properties of particulate matter (PM) in US06 and Federal Test Procedure (FTP) tests. Correlation amongst the real-time MSS, PASS-1, AE51, and AE22 instruments was excellent (R2 = 0.83-0.95) at levels below 1 mg/mi.
In the process of investigating these BC instruments, we learned that BC emissions at sub-1 mg/mi levels can be measured and are achievable by current-generation gasoline engines. Most comparison studies of black carbon (BC) measurement methods have been carried out in ambient air. This study assesses the agreement among various BC measurement instruments in emissions from light-duty gasoline vehicles (LDGVs) on standard test cycles, and evaluates the applicability of these methods under various fuel types, driving cycles, and engine combustion technologies. This research helps to fill the knowledge gap in BC method standardization identified in the U.S. Environmental Protection Agency (EPA) 2011 Report to Congress on Black Carbon, and these results demonstrate the feasibility of quantification of BC at the 1 mg/mi PM standard in the California Low Emission Vehicle III regulations.
Adolescent Domain Screening Inventory-Short Form: Development and Initial Validation
ERIC Educational Resources Information Center
Corrigan, Matthew J.
2017-01-01
This study sought to develop a short version of the ADSI, and investigate its psychometric properties. Methods: This is a secondary analysis. Analysis to determine the Cronbach's Alpha, correlations to determine concurrent criterion validity and known instrument validity and a logistic regression to determine predictive validity were conducted.…
ERIC Educational Resources Information Center
Campbell, Dean J.; Xia, Younan
2007-01-01
The physical phenomenon of plasmons and the techniques that build upon them are discussed. Plasmon-enhanced applications are well-suited for introduction in physical chemistry and instrumental analysis classes and some methods of fabrication and analysis of plasmon-producing structures are simple for use in labs in general, physical and inorganic…
A Document Analysis of Teacher Evaluation Systems Specific to Physical Education
ERIC Educational Resources Information Center
Norris, Jason M.; van der Mars, Hans; Kulinna, Pamela; Kwon, Jayoun; Amrein-Beardsley, Audrey
2017-01-01
Purpose: The purpose of this document analysis study was to examine current teacher evaluation systems, understand current practices, and determine whether the instrumentation is a valid measure of teaching quality as reflected in teacher behavior and effectiveness specific to physical education (PE). Method: An interpretive document analysis…
Analysis of a Suspected Drug Sample
ERIC Educational Resources Information Center
Schurter, Eric J.; Zook-Gerdau, Lois Anne; Szalay, Paul
2011-01-01
This general chemistry laboratory uses differences in solubility to separate a mixture of caffeine and aspirin while introducing the instrumental analysis methods of GCMS and FTIR. The drug mixture is separated by partitioning aspirin and caffeine between dichloromethane and aqueous base. TLC and reference standards are used to identify aspirin…
NASA Astrophysics Data System (ADS)
Bhartia, R.; Wanger, G.; Orphan, V. J.; Fries, M.; Rowe, A. R.; Nealson, K. H.; Abbey, W. J.; DeFlores, L. P.; Beegle, L. W.
2014-12-01
Detection of in situ biosignatures on terrestrial and planetary missions is becoming increasingly important. Missions that target the Earth's deep biosphere, Mars, moons of Jupiter (including Europa), moons of Saturn (Titan and Enceladus), and small bodies such as asteroids or comets require methods that enable detection of materials both for in-situ analysis that preserves context and as a means to select high-priority samples for return to Earth. In situ instrumentation for biosignature detection spans a wide range of analytical and spectroscopic methods that capitalize on amino acid distribution, chirality, lipid composition, isotopic fractionation, or textures that persist in the environment. Many of the existing analytical instruments are bulk analysis methods and, while highly sensitive, these require sample acquisition and sample processing. However, by combining them with triaging spectroscopic methods, biosignatures can be targeted on a surface while preserving spatial context (including mineralogy, textures, and organic distribution). To provide spatially correlated chemical analysis at multiple spatial scales (meters to microns), we have employed a dual spectroscopic approach that capitalizes on high-sensitivity deep UV native fluorescence detection and high-specificity deep UV Raman analysis. Recently selected as a payload on the Mars 2020 mission, SHERLOC incorporates these optical methods for potential biosignature detection on Mars. We present data both from Earth analogs, which serve as our only known examples of biosignatures, and from meteorite samples, which provide an example of abiotic organic formation, and demonstrate how provenance affects the spatial distribution and composition of organics.
Developing a pressure ulcer risk assessment scale for patients in long-term care.
Lepisto, Mervi; Eriksson, Elina; Hietanen, Helvi; Lepisto, Jyri; Lauri, Sirkka
2006-02-01
Previous pressure ulcer risk assessment scales appear to have relied on opinions about risk factors and are based on care setting rather than research evidence. Utilizing 21 existing risk assessment scales and relevant risk factor literature, an instrument was developed by Finnish researchers that takes into account individual patient risk factors, devices and methods applied in nursing care, and organizational characteristics. The instrument underwent two pilot tests to assess the relevance and clarity of the instrument: the first involved 43 nurses and six patients; the second involved 50 nurses with expertise in wound care. Changes to questionnaire items deemed necessary as a result of descriptive analysis and agreement percentages were completed. After pilot testing, the final instrument addressed the following issues: 1) patient risks: activity, mobility in bed, mental status, nutrition, urinary incontinence, fecal incontinence, sensory perception, and skin condition; 2) devices and methods used in patient care: technical devices, bed type, mattress, overlay, seat cushions, and care methods; and 3) staff number and structure, maximum number of beds, and beds in use (the last group of questions were included to ensure participants understood the items; results were not analyzed). The phases of the study provided an expeditious means of data collection and a suitable opportunity to assess how the instrument would function in practice. Instrument reliability and validity were improved as a result of the pilot testing and can be enhanced further with continued use and assessment.
Boncyk, Wayne C.; Markham, Brian L.; Barker, John L.; Helder, Dennis
1996-01-01
The Landsat-7 Image Assessment System (IAS), part of the Landsat-7 Ground System, will calibrate and evaluate the radiometric and geometric performance of the Enhanced Thematic Mapper Plus (ETM+) instrument. The IAS incorporates new instrument radiometric artifact correction and absolute radiometric calibration techniques which overcome some limitations to calibration accuracy inherent in historical calibration methods. Knowledge of ETM+ instrument characteristics gleaned from analysis of archival Thematic Mapper in-flight data and from ETM+ prelaunch tests allows the determination and quantification of the sources of instrument artifacts. This a priori knowledge will be utilized in IAS algorithms designed to minimize the effects of the noise sources before calibration, in both ETM+ image and calibration data.
Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis
2017-06-23
Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13 260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40; P < 0.001). When the fixed effects analysis was combined with the instrumental variable analysis, a 1-point increase in psychosocial job quality was related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
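The instrumental variable logic this study applies can be illustrated with a manual two-stage least squares (2SLS) sketch. All data below are simulated, and the variable names (entitlement, job_quality, mhi5) are stand-ins for the study's measures, not its dataset; the fixed-effects component of the authors' combined approach is omitted for brevity.

```python
import numpy as np

# Simulate a setting with an unmeasured confounder, a binary instrument
# (a workplace entitlement), an exposure (job quality), and an outcome (MHI-5).
rng = np.random.default_rng(0)
n = 5000
confounder = rng.normal(size=n)               # unmeasured confounding
entitlement = rng.binomial(1, 0.5, size=n)    # instrument
job_quality = 1.0 * entitlement + confounder + rng.normal(size=n)
mhi5 = 1.3 * job_quality - confounder + rng.normal(size=n)  # true effect = 1.3

def ols(y, x):
    # Simple OLS with an intercept; returns (intercept, slope).
    X = np.column_stack([np.ones(len(y)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: predict the exposure from the instrument.
a0, a1 = ols(job_quality, entitlement)
fitted = a0 + a1 * entitlement
# Stage 2: regress the outcome on the fitted (instrumented) exposure.
b0, b1 = ols(mhi5, fitted)

naive = ols(mhi5, job_quality)[1]  # biased by the unmeasured confounder
print(round(b1, 2), round(naive, 2))
```

Because the confounder enters the exposure and outcome with opposite signs, the naive regression is biased toward zero, while the instrumented estimate recovers an effect close to the true 1.3.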
Moubarac, Jean-Claude; Cargo, Margaret; Receveur, Olivier; Daniel, Mark
2012-01-01
Little is known about the situational contexts in which individuals consume processed sources of dietary sugars. This study aimed to describe the situational contexts associated with the consumption of sweetened food and drink products in a Catholic Middle Eastern Canadian community. A two-stage exploratory sequential mixed-method design was employed with a rationale of triangulation. In stage 1 (n = 62), items and themes describing the situational contexts of sweetened food and drink product consumption were identified from semi-structured interviews and were used to develop the content for the Situational Context Instrument for Sweetened Product Consumption (SCISPC). Face validity, readability and cultural relevance of the instrument were assessed. In stage 2 (n = 192), a cross-sectional study was conducted and exploratory factor analysis was used to examine the structure of themes that emerged from the qualitative analysis as a means of furthering construct validation. The SCISPC reliability and predictive validity on the daily consumption of sweetened products were also assessed. In stage 1, six themes and 40 items describing the situational contexts of sweetened product consumption emerged from the qualitative analysis and were used to construct the first draft of the SCISPC. In stage 2, factor analysis enabled the clarification and/or expansion of the instrument's initial thematic structure. The revised SCISPC has seven factors and 31 items describing the situational contexts of sweetened product consumption. Initial validation of the instrument indicated it has excellent internal consistency and adequate test-retest reliability. Two factors of the SCISPC had predictive validity for the daily consumption of total sugar from sweetened products (Snacking and Energy demands), while the other factors (Socialization, Indulgence, Constraints, Visual Stimuli and Emotional needs) were associated rather with occasional consumption of these products. PMID:23028597
Adaptation of the Practice Environment Scale for military nurses: a psychometric analysis.
Swiger, Pauline A; Raju, Dheeraj; Breckenridge-Sproat, Sara; Patrician, Patricia A
2017-09-01
The aim of this study was to confirm the psychometric properties of the Practice Environment Scale of the Nursing Work Index in a military population. This study also demonstrates association rule analysis, a contemporary exploratory technique. One of the instruments most commonly used to evaluate the nursing practice environment is the Practice Environment Scale of the Nursing Work Index. Although the instrument has been widely used, the reliability, validity and individual item function are not commonly evaluated. Gaps exist with regard to confirmatory evaluation of the subscale factors, individual item analysis and evaluation in the outpatient setting and with non-registered nursing staff. This was a secondary analysis of survey data collected in 2014, using multiple psychometric methods. First, descriptive analyses were conducted, including exploration using association rules. Next, internal consistency was tested and confirmatory factor analysis was performed to test the factor structure. The specified factor structure did not hold; therefore, exploratory factor analysis was performed. Finally, item analysis was executed using item response theory. The differential item functioning technique allowed the comparison of responses by care setting and nurse type. The results of this study indicate that responses differ between groups and that several individual items could be removed without altering the psychometric properties of the instrument. The instrument functions moderately well in a military population; however, researchers may want to consider nurse type and care setting during analysis to identify any meaningful variation in responses. © 2017 John Wiley & Sons Ltd.
Fluorescence fingerprint as an instrumental assessment of the sensory quality of tomato juices.
Trivittayasil, Vipavee; Tsuta, Mizuki; Imamura, Yoshinori; Sato, Tsuneo; Otagiri, Yuji; Obata, Akio; Otomo, Hiroe; Kokawa, Mito; Sugiyama, Junichi; Fujita, Kaori; Yoshimura, Masatoshi
2016-03-15
Sensory analysis is an important standard for evaluating food products. However, as trained panelists and time are required for the process, the potential of using fluorescence fingerprint as a rapid instrumental method to approximate sensory characteristics was explored in this study. Thirty-five out of 44 descriptive sensory attributes were found to show a significant difference between samples (analysis of variance test). Principal component analysis revealed that principal component 1 could capture 73.84% and 75.28% of the variance for the aroma category and the combined flavor and taste category, respectively. Fluorescence fingerprints of tomato juices consisted of two visible peaks at excitation/emission wavelengths of 290/350 and 315/425 nm and a long narrow emission peak at 680 nm. The 680 nm peak was only clearly observed in juices obtained from tomatoes cultivated to be eaten raw. The ability to predict overall sensory profiles was investigated by using principal component 1 as a regression target. Fluorescence fingerprint could predict principal component 1 of both aroma and combined flavor and taste with a coefficient of determination above 0.8. The results obtained in this study indicate the potential of using fluorescence fingerprint as an instrumental method for assessing sensory characteristics of tomato juices. © 2015 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Sharma, Nayan
2017-05-01
This research paper focuses on the need for turbulence study, instruments reliable enough to capture turbulence, different turbulence parameters and some advanced methodology which can decompose various turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has valid prospects in open channel flow. The relevance of the study is amplified when any hydraulic structure introduced in the channel disturbs the natural flow and creates discontinuity. To recover from this discontinuity, the piano key weir (PKW) might be used with sloped keys. Constraints of empirical results in the vicinity of the PKW necessitate extensive laboratory experiments with fair and reliable instrumentation techniques. The acoustic Doppler velocimeter was established, using principal component analysis, to be best suited within a range of limitations. Wavelet analysis is proposed to decompose the underlying turbulence structure in a better way.
Hu, Jianwei; Gauld, Ian C.
2014-12-01
The U.S. Department of Energy's Next Generation Safeguards Initiative Spent Fuel (NGSI-SF) project is nearing the final phase of developing several advanced nondestructive assay (NDA) instruments designed to measure spent nuclear fuel assemblies for the purpose of improving nuclear safeguards. Current efforts are focusing on calibrating several of these instruments with spent fuel assemblies at two international spent fuel facilities. Modelling and simulation is expected to play an important role in predicting nuclide compositions, neutron and gamma source terms, and instrument responses in order to inform the instrument calibration procedures. As part of the NGSI-SF project, this work was carried out to assess the impacts of uncertainties in the nuclear data used in the calculations of spent fuel content, radiation emissions and instrument responses. Nuclear data is an essential part of nuclear fuel burnup and decay codes and nuclear transport codes. Such codes are routinely used for analysis of spent fuel and NDA safeguards instruments. Hence, the uncertainties existing in the nuclear data used in these codes affect the accuracies of such analysis. In addition, nuclear data uncertainties represent the limiting (smallest) uncertainties that can be expected from nuclear code predictions, and therefore define the highest attainable accuracy of the NDA instrument. This work studies the impacts of nuclear data uncertainties on calculated spent fuel nuclide inventories and the associated NDA instrument response. Recently developed methods within the SCALE code system are applied in this study. The Californium Interrogation with Prompt Neutron instrument was selected to illustrate the impact of these uncertainties on NDA instrument response.
Use of the SDO Pointing Controllers for Instrument Calibration Maneuvers
NASA Technical Reports Server (NTRS)
Vess, Melissa F.; Starin, Scott R.; Morgenstern, Wendy M.
2005-01-01
During the science phase of the Solar Dynamics Observatory mission, the three science instruments require periodic instrument calibration maneuvers with a frequency of up to once per month. The command sequences for these maneuvers vary in length from a handful of steps to over 200 steps, and individual steps vary in size from 5 arcsec per step to 22.5 degrees per step. Early in the calibration maneuver development, it was determined that the original attitude sensor complement could not meet the knowledge requirements for the instrument calibration maneuvers in the event of a sensor failure. Because the mission must be single fault tolerant, an attitude determination trade study was undertaken to determine the impact of adding an additional attitude sensor versus developing alternative, potentially complex, methods of performing the maneuvers in the event of a sensor failure. To limit the impact to the science data capture budget, these instrument calibration maneuvers must be performed as quickly as possible while maintaining the tight pointing and knowledge required to obtain valid data during the calibration. To this end, the decision was made to adapt a linear pointing controller by adjusting gains and adding an attitude limiter so that it would be able to slew quickly and still achieve steady pointing once on target. During the analysis of this controller, questions arose about the stability of the controller during slewing maneuvers due to the combination of the integral gain, attitude limit, and actuator saturation. Analysis was performed and a method for disabling the integral action while slewing was incorporated to ensure stability. A high fidelity simulation is used to simulate the various instrument calibration maneuvers.
NASA Astrophysics Data System (ADS)
Wahl, Michael; Rahn, Hans-Jürgen; Gregor, Ingo; Erdmann, Rainer; Enderlein, Jörg
2007-03-01
Time-correlated single photon counting is a powerful method for sensitive time-resolved fluorescence measurements down to the single molecule level. The method is based on the precisely timed registration of single photons of a fluorescence signal. Historically, its primary goal was the determination of fluorescence lifetimes upon optical excitation by a short light pulse. This goal is still important today and therefore has a strong influence on instrument design. However, modifications and extensions of the early designs allow for the recovery of much more information from the detected photons and enable entirely new applications. Here, we present a new instrument that captures single photon events on multiple synchronized channels with picosecond resolution and over virtually unlimited time spans. This is achieved by means of crystal-locked time digitizers with high resolution and very short dead time. Subsequent event processing in programmable logic permits classical histogramming as well as time tagging of individual photons and their streaming to the host computer. Through the latter, any algorithms and methods for the analysis of fluorescence dynamics can be implemented either in real time or offline. Instrument test results from single molecule applications will be presented.
NASA Astrophysics Data System (ADS)
Rimantho, Dino; Rahman, Tomy Abdul; Cahyadi, Bambang; Tina Hernawati, S.
2017-02-01
Calibration of instrumentation equipment in the pharmaceutical industry is an important activity to determine the true value of a measurement. Preliminary studies indicated that calibration lead times disrupted production and laboratory activities. This study aimed to analyze the causes of calibration lead time. Several methods were used: Six Sigma, to determine the process capability of equipment calibration; brainstorming, Pareto diagrams, and fishbone diagrams, to identify and analyze the problems; and the Analytic Hierarchy Process (AHP), to create a hierarchical structure and prioritize the problems. The results showed a DPMO value of approximately 40769.23, equivalent to a sigma level of approximately 3.24 in equipment calibration. This indicated the need for improvements in the calibration process. Furthermore, problem-solving strategies for calibration lead time were determined: shortening the preventive maintenance schedule, increasing the number of calibration instruments, and training personnel. Consistency tests on all pairwise comparison matrices showed consistency ratios (CR) below 0.1.
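The DPMO-to-sigma conversion reported above can be reproduced with the stdlib normal quantile function. A minimal sketch, assuming the conventional 1.5-sigma long-term shift used in standard Six Sigma practice:

```python
from statistics import NormalDist

def sigma_level(dpmo, shift=1.5):
    """Convert defects-per-million-opportunities to a Six Sigma level.

    The short-term sigma level is the standard normal quantile of the
    process yield plus the conventional 1.5-sigma long-term shift.
    """
    yield_fraction = 1.0 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

# The abstract's DPMO of 40769.23 reproduces the reported ~3.24 sigma:
print(round(sigma_level(40769.23), 2))  # → 3.24
```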
Survey of Voyager plasma science ions at Jupiter: 1. Analysis method
NASA Astrophysics Data System (ADS)
Bagenal, F.; Dougherty, L. P.; Bodisch, K. M.; Richardson, J. D.; Belcher, J. M.
2017-08-01
The Voyagers 1 and 2 spacecraft flew by Jupiter in March and July of 1979, respectively. The Plasma Science instrument (PLS) acquired detailed measurements of the plasma environment in the equatorial region of the magnetosphere between 4.9 and 4 RJ. While bulk plasma properties such as charge density, ion temperature, and bulk flow were reasonably well determined, the ion composition was only well constrained in occasional regions of cold plasma. The ion data obtained by the PLS instrument have been reanalyzed using physical chemistry models to constrain the composition and reduce the number of free parameters, particularly in regions of hotter plasma. This paper describes the method used for fitting the plasma data and presents the results versus time. Two companion papers describe the composition of heavy ions and present analysis of protons plus other minor ions.
Method and apparatus for calibrating a particle emissions monitor
Flower, W.L.; Renzi, R.F.
1998-07-07
The invention discloses a method and apparatus for calibrating particulate emissions monitors, in particular, and sampling probes, in general, without removing the instrument from the system being monitored. A source of one or more specific metals in aerosol (either solid or liquid) or vapor form is housed in the instrument. The calibration operation is initiated by moving a focusing lens, used to focus a light beam onto an analysis location and collect the output light response, from an operating position to a calibration position such that the focal point of the focusing lens is now within a calibration stream issuing from a calibration source. The output light response from the calibration stream can be compared to that derived from an analysis location in the operating position to more accurately monitor emissions within the emissions flow stream. 6 figs.
Quality of life in end stage renal disease patients
Joshi, Veena D
2014-01-01
AIM: To understand factors associated with quality of life (QOL), examine types of QOL instruments, and determine the need for further improvements in QOL assessment. METHODS: Databases (PubMed, Google Scholar) and a bibliographic search were used with the key words QOL, end stage renal disease, hemodialysis, peritoneal dialysis, instruments to measure QOL, patients, and qualitative/quantitative analysis, covering publications from 1990 to June 2014. Each article was assessed for sample size, demographics of participants, study design and type of QOL instruments used. We used the WHO definition of QOL. RESULTS: For this review, 109 articles were screened, out of which 65 articles were selected. Of the 65 articles, there were 19 reports/reviews and 12 questionnaire manuals. Of the 34 studies, 82% were quantitative while only 18% were qualitative. QOL instruments measured several phenomena associated with QOL, such as physical/psychological health, effects and burdens of kidney disease, and social support. Few studies looked at spiritual beliefs, cultural beliefs, or personal concerns, as per the WHO definition. Telemedicine and palliative care have now been used successfully; however, the QOL instruments in the articles reviewed seldom addressed them. Longitudinal studies were also rarely conducted. Existing QOL instruments only partially measure QOL, which may limit their validity and predictive power. CONCLUSION: Culture- and disease-specific QOL instruments that assess patients’ objective and subjective experiences covering most aspects of QOL are urgently needed. PMID:25374827
NASA Astrophysics Data System (ADS)
de León, Jesús Ponce; Beltrán, José Ramón
2012-12-01
In this study, a new method of blind audio source separation (BASS) of monaural musical harmonic notes is presented. The input (mixed notes) signal is processed using a flexible analysis and synthesis algorithm (complex wavelet additive synthesis, CWAS), which is based on the complex continuous wavelet transform. When the harmonics from two or more sources overlap in a certain frequency band (or group of bands), a new technique based on amplitude similarity criteria is used to obtain an approximation to the original partial information. The aim is to show that the CWAS algorithm can be a powerful tool in BASS. Compared with other existing techniques, the main advantages of the proposed algorithm are its accuracy in the instantaneous phase estimation, its synthesis capability and that the only input information needed is the mixed signal itself. A set of synthetically mixed monaural isolated notes has been analyzed using this method, in eight different experiments: the same instrument playing two notes within the same octave and two harmonically related notes (5th and 12th intervals), two different musical instruments playing 5th and 12th intervals, two different instruments playing non-harmonic notes, major and minor chords played by the same musical instrument, three different instruments playing non-harmonically related notes and finally the mixture of an inharmonic instrument (piano) and a harmonic instrument. The results obtained show the strength of the technique.
Valente, Ana Rita S; Hall, Andreia; Alvelos, Helena; Leahy, Margaret; Jesus, Luis M T
2018-04-12
The appropriate use of language in context depends on the speaker's pragmatic language competencies. A coding system was used to develop a specific and adult-focused self-administered questionnaire for adults who stutter and adults who do not stutter, The Assessment of Language Use in Social Contexts for Adults, with three categories: precursors, basic exchanges, and extended literal/non-literal discourse. This paper presents the content validity, item analysis, reliability coefficients and evidence of construct validity of the instrument. Content validity analysis was based on a two-stage process: first, 11 pragmatic questionnaires were assessed to identify items that probe each pragmatic competency and to create the first version of the instrument; second, items were assessed qualitatively by an expert panel composed of adults who stutter and controls, and quantitatively and qualitatively by an expert panel composed of clinicians. A pilot study was conducted with five adults who stutter and five controls to analyse items and calculate reliability. Construct validity evidence was obtained using the hypothesized relationships method and factor analysis with 28 adults who stutter and 28 controls. Concerning content validity, the questionnaires assessed up to 13 pragmatic competencies. Qualitative and quantitative analysis revealed ambiguities in item construction. Disagreement between experts was solved through item modification. The pilot study showed that the instrument presented internal consistency and temporal stability. Significant differences between adults who stutter and controls and different response profiles revealed the instrument's underlying construct. The instrument is reliable and presented evidence of construct validity.
Sánchez-Margallo, Juan A; Sánchez-Margallo, Francisco M; Oropesa, Ignacio; Enciso, Silvia; Gómez, Enrique J
2017-02-01
The aim of this study is to present the construct and concurrent validity of a motion-tracking method of laparoscopic instruments based on an optical pose tracker and determine its feasibility as an objective assessment tool of psychomotor skills during laparoscopic suturing. A group of novice ([Formula: see text] laparoscopic procedures), intermediate (11-100 laparoscopic procedures) and experienced ([Formula: see text] laparoscopic procedures) surgeons performed three intracorporeal sutures on an ex vivo porcine stomach. Motion analysis metrics were recorded using the proposed tracking method, which employs an optical pose tracker to determine the laparoscopic instruments' position. Construct validation was measured for all 10 metrics across the three groups and between pairs of groups. Concurrent validation was measured against a previously validated suturing checklist. Checklists were completed by two independent surgeons over blinded video recordings of the task. Eighteen novices, 15 intermediates and 11 experienced surgeons took part in this study. Execution time and path length travelled by the laparoscopic dissector presented construct validity. Experienced surgeons required significantly less time ([Formula: see text]), travelled less distance using both laparoscopic instruments ([Formula: see text]) and made more efficient use of the work space ([Formula: see text]) compared with novice and intermediate surgeons. Concurrent validation showed strong correlation between both the execution time and path length and the checklist score ([Formula: see text] and [Formula: see text], [Formula: see text]). The suturing performance was successfully assessed by the motion analysis method. Construct and concurrent validity of the motion-based assessment method has been demonstrated for the execution time and path length metrics. This study demonstrates the efficacy of the presented method for objective evaluation of psychomotor skills in laparoscopic suturing. 
However, this method does not take into account the quality of the suture. Thus, future works will focus on developing new methods combining motion analysis and qualitative outcome evaluation to provide a complete performance assessment to trainees.
Strand, Pia; Sjöborg, Karolina; Stalmeijer, Renée; Wichmann-Hansen, Gitte; Jakobsson, Ulf; Edgren, Gudrun
2013-12-01
There is a paucity of instruments designed to evaluate the multiple dimensions of the workplace as an educational environment for undergraduate medical students. The aim was to develop and psychometrically evaluate an instrument to measure how undergraduate medical students perceive the clinical workplace environment, based on workplace learning theories and empirical findings. Development of the instrument relied on established standards including theoretical and empirical grounding, systematic item development and expert review at various stages to ensure content validity. Qualitative and quantitative methods were employed using a series of steps from conceptualization through psychometric analysis of scores in a Swedish medical student population. The final result was a 25-item instrument with two overarching dimensions, experiential learning and social participation, and four subscales that coincided well with theory and empirical findings: Opportunities to learn in and through work & quality of supervision; Preparedness for student entry; Workplace interaction patterns & student inclusion; and Equal treatment. Evidence from various sources supported content validity, construct validity and reliability of the instrument. The Undergraduate Clinical Education Environment Measure represents a valid, reliable and feasible multidimensional instrument for evaluation of the clinical workplace as a learning environment for undergraduate medical students. Further validation in different populations using various psychometric methods is needed.
Methods for collection and analysis of water samples
Rainwater, Frank Hays; Thatcher, Leland Lincoln
1960-01-01
This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.
X-ray microprobe analysis of platelets. Principles, methods and review of the literature.
Yarom, R
1983-01-01
Platelets are well suited to X-ray microanalysis as there is no need for chemical fixation or sectioning, and the concentrations of calcium and phosphorus are above 10^(-3). The principles of the technique, the methods of specimen preparation, instrumental conditions during analysis and ways of quantitation are described. This is followed by a review of published reports and a brief summary of the author's own work in the field.
ERIC Educational Resources Information Center
Huang, Shu Rong; Palmer, Peter T.
2017-01-01
This paper describes a method for determination of trihalomethanes (THMs) in drinking water via solid-phase microextraction (SPME) GC/MS as a means to develop and improve student understanding of the use of GC/MS for qualitative and quantitative analysis. In the classroom, students are introduced to SPME, GC/MS instrumentation, and the use of MS…
Bioanalytical methods for food contaminant analysis.
Van Emon, Jeanette M
2010-01-01
Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.
Van Dijk-de Vries, Anneke N.; Duimel-Peeters, Inge G. P.; Muris, Jean W.; Wesseling, Geertjan J.; Beusmans, George H. M. I.
2016-01-01
Introduction: Teamwork between healthcare providers is conditional for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for developing and testing of the Integrated Team Effectiveness Instrument. Theory and methods: Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice out of this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach’s alpha. Results: The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach’s alpha between 0.76 and 0.81). Conclusions and discussion: The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis to self-evaluate teamwork effectiveness in integrated COPD care by healthcare providers. Recommendations are provided for the improvement of the instrument. PMID:27616953
Electronic Implementation of Integrated End-of-life Care: A Local Approach
Schlieper, Daniel; Altreuther, Christiane; Schallenburger, Manuela; Neukirchen, Martin; Schmitz, Andrea
2017-01-01
Introduction: The Liverpool Care Pathway for the Dying Patient is an instrument to deliver integrated care for patients in their last hours of life. Originally a paper-based system, this study investigates the feasibility of an electronic version. Methods: An electronic Liverpool Care Pathway was implemented in a specialized palliative care unit of a German university hospital. Its use is exemplified by means of auditing and analysis of the proportion of recorded items. Results: In the years 2013 and 2014 the electronic Liverpool Care Pathway was used for the care of 159 patients. The uptake of the instrument was high (67%). Most items were recorded. Apart from a high usability, the fast data retrieval allows fast analysis for auditing and research. Conclusions and discussion: The electronic instrument is feasible in a computerized ward and has strong advantages for retrospective analysis. Trial registration: Internal Clinical Trial Register of the Medical Faculty, Heinrich Heine University Düsseldorf, No. 2015124683 (7 December 2015). PMID:28970746
The impact of clinical use on the torsional behavior of Reciproc and WaveOne instruments
de MAGALHÃES, Rafael Rodrigues Soares; BRAGA, Lígia Carolina Moreira; PEREIRA, Érika Sales Joviano; PEIXOTO, Isabella Faria da Cunha; BUONO, Vicente Tadeu Lopes; BAHIA, Maria Guiomar de Azevedo
2016-01-01
Torsional overload is a representative fracture parameter for instruments in single-file techniques. Objective The aim of this study was to assess the influence of clinical use, in vivo, on the torsional behavior of Reciproc and WaveOne instruments, considering the possibility that they degraded with use. Material and Methods Diameter at each millimeter, pitch length, and area at 3 mm from the tip were determined for both types of instruments. Twenty-four instruments, size 25, 0.08 taper, of each system were divided into two groups (n=12 each): Control Group (CG), in which new Reciproc (RC) and WaveOne Primary (WO) instruments were tested in torsion until rupture based on ISO 3630-1; and Experimental Group (EG), in which each new instrument was clinically used to clean and shape the root canals of one molar. After clinical use, the instruments were analyzed using optical and scanning electron microscopy and subsequently tested in torsion until fracture. Data were analyzed using one-way analysis of variance at α=.05. Results WO instruments showed significantly higher mean values of cross-sectional area A3 (P=0.000) and smaller pitch lengths than RC instruments, with no statistically significant differences in the diameter at D3 (P=0.521). No significant differences in torsional resistance between the new RC and WO instruments (P=0.134) were found. Clinical use resulted in a tendency toward reduced maximum torque in the analyzed instruments, but no statistically significant difference was observed between them (P=0.327). During the preparation of the root canals, two fractured RC instruments and longitudinal and transversal cracks in RC and WO instruments were observed through SEM analysis. Conclusion After clinical use, no statistically significant reduction in the torsional resistance was observed. PMID:27556200
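The one-way analysis of variance used above can be sketched with a hand-rolled F statistic. The torque samples below are hypothetical, invented for illustration; they are not the study's data:

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (len(groups) - 1)
    ms_within = ss_within / (len(all_vals) - len(groups))
    return ms_between / ms_within

# Hypothetical maximum-torque values (N.cm) for new and clinically used files:
new_files = [1.02, 0.98, 1.05, 1.01, 0.97, 1.03]
used_files = [0.99, 1.00, 0.97, 1.03, 0.96, 1.01]
f_stat = one_way_anova_f(new_files, used_files)
# Compared against the alpha = .05 critical value F(1, 10) ~= 4.96,
# an f_stat below it means no significant torque difference in this toy data.
```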
A Study of Morrison's Iterative Noise Removal Method. Final Report M. S. Thesis
NASA Technical Reports Server (NTRS)
Ioup, G. E.; Wright, K. A. R.
1985-01-01
Morrison's iterative noise removal method is studied by characterizing its effect upon systems of differing noise level and response function. The nature of data acquired from a linear shift invariant instrument is discussed so as to define the relationship between the input signal, the instrument response function, and the output signal. Fourier analysis is introduced, along with several pertinent theorems, as a tool to more thorough understanding of the nature of and difficulties with deconvolution. In relation to such difficulties the necessity of a noise removal process is discussed. Morrison's iterative noise removal method and the restrictions upon its application are developed. The nature of permissible response functions is discussed, as is the choice of the response functions used.
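The deconvolution difficulties discussed in this thesis abstract are commonly illustrated with iterative schemes. Below is a minimal Van Cittert-style iteration, a close relative of the iterative methods the thesis treats but not necessarily Morrison's exact noise-removal iteration; the kernel and data are invented for the example:

```python
import numpy as np

def van_cittert(observed, response, iterations=200):
    """Van Cittert-style iterative deconvolution sketch.

    Iterates f_{k+1} = f_k + (d - r * f_k), with * denoting convolution,
    d the observed data and r the instrument response. The estimate
    converges wherever the response's frequency-domain magnitude lies
    strictly between 0 and 2.
    """
    f = observed.copy()
    for _ in range(iterations):
        f = f + (observed - np.convolve(f, response, mode="same"))
    return f

# A unit spike blurred by a symmetric 3-point response, then restored:
response = np.array([0.25, 0.5, 0.25])
spike = np.zeros(9)
spike[4] = 1.0
blurred = np.convolve(spike, response, mode="same")
recovered = van_cittert(blurred, response)
```

With noisy data the iteration amplifies noise at frequencies where the response is small, which is exactly why a noise-removal step of the kind studied in the thesis is needed before deconvolution.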
NASA Astrophysics Data System (ADS)
Tonutare, Tonu; Krebstein, Kadri; Rodima, Ako; Kõlli, Raimo; Künnapas, Allan; Rebane, Jaanus; Penu, Priit; Vennik, Kersti; Soobik, Liina
2015-04-01
Soils provide vital ecosystem functions, playing an important role in our economy and in a healthy living environment. However, soils are increasingly degrading in Europe and at the global level. Knowledge about the content of major plant available nutrients, i.e. calcium, magnesium, potassium and phosphorus, plays an important role in sustainable soil management. Mobility of nutrients depends directly on the environmental conditions; two of the most important factors are pH and organic matter content. Therefore it is essential to have correct information about the content and behaviour of the above named elements in soil, both from the environmental and agronomical viewpoint. During the last decades several extracting solutions which are suitable for the evaluation of nutrient status of soils have been developed for this purpose. One of them, Mehlich 3, is widely used in the USA, Canada and some European countries (e.g. Estonia, Czech Republic) because of its suitability for extracting several major plant nutrients from the soil simultaneously. There are several different instrumental methods used for the analysis of nutrient elements in the soil extract. Potassium, magnesium and calcium are widely analysed by the AAS (atomic absorption spectroscopy) method or by ICP (inductively coupled plasma) spectroscopic methods. Molecular spectroscopy and ICP spectroscopy are used for phosphorus determination. In 2011 a new multielemental instrumental method, MP-AES (microwave plasma atomic emission spectroscopy), was added to these. Due to its lower detection limits and multielemental character, compared with AAS, and lower exploitation costs, compared with ICP, MP-AES has good potential to achieve a leading position in soil nutrient analysis in the future.
The objective of this study was to investigate (i) the impact of soil pH and humus content and (ii) the applicability of the MP-AES instrumental method for the determination of soil nutrients extracted according to Mehlich 3. For the experiment, 100 soil samples with different organic matter contents and pH were used. Ca, Mg, K and P were determined by the MP-AES and ICP methods, and P was additionally analysed by molecular spectroscopy. Within the framework of the study, regressions between the MP-AES and ICP methods were created for all the analysed elements, i.e. Ca, Mg, K and P. The relationships between the analysed major soil nutrient contents at different humus levels and pH ranges were determined for both methods to evaluate the impact of these factors. The optimal instrumental settings for calcium, magnesium and potassium analysis according to Mehlich 3 using the MP-AES method are reported.
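Method-comparison regressions like those described above (MP-AES versus ICP) are ordinary least-squares fits of one method's results against the other's. A minimal sketch; the paired potassium values are invented for illustration:

```python
def linear_regression(x, y):
    """Ordinary least-squares fit y = a + b*x, returned as (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical paired K determinations (mg/kg) by ICP (x) and MP-AES (y):
icp = [120.0, 250.0, 310.0, 480.0, 655.0]
mp_aes = [118.0, 255.0, 305.0, 490.0, 650.0]
intercept, slope = linear_regression(icp, mp_aes)
# A slope near 1 and an intercept near 0 indicate the two methods agree.
```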
Factor analysis of an instrument to measure the impact of disease on daily life.
Pedrosa, Rafaela Batista Dos Santos; Rodrigues, Roberta Cunha Matheus; Padilha, Kátia Melissa; Gallani, Maria Cecília Bueno Jayme; Alexandre, Neusa Maria Costa
2016-01-01
to verify the structure of factors of an instrument to measure the Heart Valve Disease Impact on Daily Life (IDCV) when applied to coronary artery disease patients. the study included 153 coronary artery disease patients undergoing outpatient follow-up care. The IDCV structure of factors was initially assessed by means of confirmatory factor analysis and, subsequently, by exploratory factor analysis. The Varimax rotation method was used to estimate the main components of analysis, eigenvalues greater than one for extraction of factors, and factor loading greater than 0.40 for selection of items. Internal consistency was estimated using Cronbach's alpha coefficient. confirmatory factor analysis did not confirm the original structure of factors of the IDCV. Exploratory factor analysis showed three dimensions, which together explained 78% of the measurement variance. future studies with expansion of case selection are necessary to confirm the IDCV new structure of factors.
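Cronbach's alpha, used above to estimate internal consistency, can be computed directly from item-score columns. A minimal sketch; the respondent scores are invented for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per item,
    one entry per respondent): k/(k-1) * (1 - sum(item variances) /
    variance of respondent totals)."""
    k = len(items)
    n = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(pvar(item) for item in items) / pvar(totals))

# Hypothetical scores of 4 respondents on a 3-item scale:
items = [[3, 4, 2, 5], [3, 5, 2, 4], [4, 4, 1, 5]]
alpha = cronbach_alpha(items)  # high alpha: the items covary strongly
```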
2013-01-01
Background High resolution melting analysis (HRM) is a rapid and cost-effective technique for the characterisation of PCR amplicons. Because the reverse genetics of segmented influenza A viruses allows the generation of numerous influenza A virus reassortants within a short time, methods for the rapid selection of the correct recombinants are very useful. Methods PCR primer pairs covering the single nucleotide polymorphism (SNP) positions of two different influenza A H5N1 strains were designed. Reassortants of the two different H5N1 isolates were used as a model to prove the suitability of HRM for the selection of the correct recombinants. Furthermore, two different cycler instruments were compared. Results Both cycler instruments generated comparable average melting peaks, which allowed the easy identification and selection of the correct cloned segments or reassorted viruses. Conclusions HRM is a highly suitable method for the rapid and precise characterisation of cloned influenza A genomes. PMID:24028349
MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications
Medina, Isabel; Cappiello, Achille; Careri, Maria
2018-01-01
Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at the early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017. PMID:29850370
Evaluation of methods for rapid determination of freezing point of aviation fuels
NASA Technical Reports Server (NTRS)
Mathiprakasam, B.
1982-01-01
Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
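Locating the point of inflection in digitized thermal data, as described above, amounts to finding where the discrete second difference changes sign. A minimal sketch; the tanh curve is a stand-in for a real digitized cooling trace:

```python
import math

def inflection_index(signal):
    """Index of the first sample past the point of inflection, found
    where the discrete second difference changes sign (curvature flip)."""
    second = [signal[i - 1] - 2 * signal[i] + signal[i + 1]
              for i in range(1, len(signal) - 1)]
    for j in range(1, len(second)):
        if second[j - 1] * second[j] < 0:
            return j + 1  # map back to the original sample indexing
    return None

# A tanh curve stands in for a digitized trace; its inflection sits at
# x = 5.2, between samples 10 (x = 5.0) and 11 (x = 5.5).
xs = [0.5 * i for i in range(21)]
signal = [math.tanh(x - 5.2) for x in xs]
idx = inflection_index(signal)
```

With noisy data the second difference should be smoothed before the sign test; the sketch omits that step for clarity.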
A Comparative Study of Shaping Ability of four Rotary Systems
Zarzosa, José Ignacio; Pallarés, Antonio
2015-01-01
Purpose This study compared the cutting area, instrumentation time, root canal anatomy preservation and non-instrumented areas obtained by F360®, Mtwo®, RaCe® and Hyflex® files with ISO size 35. Material and Methods 120 teeth with a single straight root and root canal were divided into 4 groups. Working length was calculated by using X-rays. The teeth were sectioned with a handpiece and a diamond disc, and the sections were observed with Nikon SMZ-2T stereoscopic microscope and an Intralux 4000-1 light source. The groups were adjusted with a preoperative analysis with AutoCAD. The teeth were reconstructed by a #10 K-File and epoxy glue. Each group was instrumented with one of the four file systems. The instrumentation time was calculated with a 1/100 second chronometer. The area of the thirds and root canal anatomy preservation were analyzed with AutoCAD 2013 and the non-instrumented areas with AutoCAD 2013 and SMZ-2T stereoscopic microscope. The statistical analysis was made with Levene’s Test, ANOVA, Bonferroni Test and Pearson´s Chi-square. Results Equal variances were shown by Levene’s Test (P > 0.05). ANOVA (P > 0.05) showed the absence of significant differences. There were significant differences in the instrumentation time (P < 0.05). For root canal anatomy preservation and non-instrumented areas, there were no significant differences between all systems (P > 0.05). Conclusions The 4 different rotary systems produced similar cutting area, root canal anatomy preservation and non-instrumented areas. Regarding instrumentation time, F360® was the fastest system statistically. PMID:27688412
Instrument-independent analysis of music by means of the continuous wavelet transform
NASA Astrophysics Data System (ADS)
Olmo, Gabriella; Dovis, Fabio; Benotto, Paolo; Calosso, Claudio; Passaro, Pierluigi
1999-10-01
This paper deals with the problem of automatic recognition of music. Segments of digitized music are processed by means of a continuous wavelet transform, properly chosen so as to match the spectral characteristics of the signal. In order to achieve a good time-scale representation of the signal components, a novel wavelet suited to the features of musical signals has been designed. Particular care has been devoted to an efficient implementation, which operates in the frequency domain and includes proper segmentation and aliasing-reduction techniques to make the analysis of long signals feasible. The method achieves very good performance in terms of both time and frequency selectivity, and can yield the estimate and the localization in time of both the fundamental frequency and the main harmonics of each tone. The analysis is used as a preprocessing step for a recognition algorithm, which we show to be almost independent of the instrument reproducing the sounds. Simulations are provided to demonstrate the effectiveness of the proposed method.
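A frequency-domain wavelet analysis of the kind described can be sketched with a standard Morlet wavelet: the signal spectrum is multiplied by the wavelet spectrum at each scale and inverse-transformed. This is a generic illustration of fundamental-frequency estimation, not the paper's custom wavelet; the function name and parameters are assumptions.

```python
import numpy as np

def morlet_cwt_f0(signal, fs, freqs, w0=6.0):
    """Estimate the fundamental frequency of a tone using a Morlet CWT
    implemented in the frequency domain (spectrum product + inverse FFT)."""
    n = len(signal)
    sig_f = np.fft.fft(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, 1 / fs)
    energy = []
    for f in freqs:
        s = w0 / (2 * np.pi * f)                      # scale for center frequency f
        psi_f = np.exp(-0.5 * (s * omega - w0) ** 2)  # Morlet spectrum at scale s
        coef = np.fft.ifft(sig_f * psi_f)
        energy.append(np.sum(np.abs(coef) ** 2))      # energy in this band
    return freqs[int(np.argmax(energy))]

# A 440 Hz tone with a weaker 880 Hz harmonic: the strongest band is the fundamental
fs = 8000
t = np.arange(0, 0.5, 1 / fs)
tone = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
freqs = np.arange(200, 1200, 10.0)
print(morlet_cwt_f0(tone, fs, freqs))   # strongest response near 440 Hz
```

Working in the frequency domain, as the paper advocates, turns each scale's convolution into a cheap elementwise product.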
New pediatric vision screener, part II: electronics, software, signal processing and validation.
Gramatikov, Boris I; Irsch, Kristina; Wu, Yi-Kai; Guyton, David L
2016-02-04
We have developed an improved pediatric vision screener (PVS) that can reliably detect central fixation, eye alignment and focus. The instrument identifies risk factors for amblyopia, namely eye misalignment and defocus. The device uses the birefringence of the human fovea (the most sensitive part of the retina). The optics have been reported in more detail previously. The present article focuses on the electronics and the analysis algorithms used. The objective of this study was to optimize the analog design, data acquisition, noise suppression techniques, the classification algorithms and the decision-making thresholds, as well as to validate the performance of the research instrument on an initial group of young test subjects: 18 patients with known vision abnormalities (eight male and 10 female), ages 4-25 (only one above 18), and 19 controls with a proven lack of vision issues. Four statistical methods were used to derive decision-making thresholds that would best separate patients with abnormalities from controls. Sensitivity and specificity were calculated for each method, and the most suitable one was selected. Both the central fixation and the focus detection criteria worked robustly and allowed reliable separation between normal test subjects and symptomatic subjects. The sensitivity of the instrument was 100% for both central fixation and focus detection. The specificity was 100% for central fixation and 89.5% for focus detection. The overall sensitivity was 100% and the overall specificity was 94.7%. Despite the relatively small initial sample size, we believe that the PVS instrument design, the analysis methods employed, and the device as a whole will prove valuable for mass screening of children.
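One common way to derive a decision threshold separating patients from controls, as described above, is to maximize Youden's J (sensitivity + specificity - 1) over candidate cutoffs. This sketch is an assumption about the general approach, not the article's specific statistical methods; the scores below are toy values, and the direction (patients scoring lower) is a stated assumption.

```python
import numpy as np

def best_threshold(patient_scores, control_scores):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.
    Assumes patients score LOWER than controls on the index being thresholded."""
    candidates = np.sort(np.concatenate([patient_scores, control_scores]))
    best = None
    for c in candidates:
        sens = np.mean(patient_scores < c)    # patients flagged below the cutoff
        spec = np.mean(control_scores >= c)   # controls passing at/above the cutoff
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

patients = np.array([0.10, 0.20, 0.25, 0.30, 0.35])   # hypothetical scores
controls = np.array([0.60, 0.70, 0.75, 0.80, 0.90])
j, cutoff, sens, spec = best_threshold(patients, controls)
print(cutoff, sens, spec)   # perfectly separable toy data: sens = spec = 1.0
```

With overlapping real-world distributions the same loop trades sensitivity against specificity, which is why the article compares several threshold-selection methods.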
Najimi, Arash; Mostafavi, Firoozeh; Sharifirad, Gholamreza; Golshiri, Parastoo
2017-01-01
BACKGROUND: This study aimed to develop and evaluate a scale of self-efficacy in adherence to treatment in Iranian patients with hypertension. METHODS: A mixed-methods study was conducted in two stages: in the first phase, a qualitative study was done using content analysis through deep and semi-structured interviews. After data analysis, a draft of the tool was prepared; items in the draft were selected based on the extracted concepts. In the second phase, the validity and reliability of the instrument were assessed in a quantitative study, in which the instrument prepared in the first phase was administered to 612 participants. To test construct validity and internal consistency, exploratory factor analysis and Cronbach's alpha were used, respectively. To study the validity of the final scale, the average self-efficacy score of patients with controlled hypertension was compared with that of patients with uncontrolled hypertension. RESULTS: Overall, 16 patients were interviewed. Twenty-six items were developed to assess different concepts of self-efficacy. Concept-related items extracted from the interviews were used to study the face validity of the tool from the patients' point of view. Four items were deleted because they scored below 0.79 in content validity; the mean content validity of the questionnaire was 0.85. Items loaded on two factors with an eigenvalue >1, and four items with factor loadings <0.4 were deleted. Reliability was 0.84 for the entire instrument. CONCLUSION: The self-efficacy scale for patients with hypertension is a valid and reliable instrument that can effectively evaluate self-efficacy in medication adherence in the management of hypertension. PMID:29114551
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashby, H.A.; Carlson, T.R.; Husson, L.
1986-01-01
Significant reductions in motor vehicle emissions are possible through the implementation of inspection and maintenance (I/M) programs. However, the potential benefits of I/M are obviously not achieved when specific inspection requirements are ignored or improperly performed. In addition, I/M benefits may be substantially reduced when improper repair procedures are used on vehicles which fail the test. In order for the "theoretical" benefits of I/M to be achieved, certain program design and enforcement procedures are necessary. The use of instrumentation and data analysis methods capable of identifying individuals who are improperly performing inspections and repairs is critical.
A strategy for compression and analysis of massive geophysical data sets
NASA Technical Reports Server (NTRS)
Braverman, A.
2001-01-01
This paper describes a method for summarizing data in a way that approximately preserves high-resolution data structure while reducing data volume and maintaining global integrity of very large remote sensing data sets. The method is under development for one of Terra's instruments, the Multi-angle Imaging SpectroRadiometer (MISR).
ERIC Educational Resources Information Center
McChlery, Stuart; Visser, Susan
2009-01-01
Learners preferentially take in and process information in diverse ways whilst teaching methods also vary presenting the possibility for mismatching teaching methods with learners' preferences leading to disengagement, ineffective learning and potential underperformance. Different research instruments have been used in the past to assess the…
Acoustic inspection of concrete bridge decks
NASA Astrophysics Data System (ADS)
Henderson, Mark E.; Dion, Gary N.; Costley, R. Daniel
1999-02-01
The determination of concrete integrity, especially in concrete bridge decks, is of extreme importance. Current systems for testing concrete structures are expensive, slow, or tedious. State of the art systems use ground penetrating radar, but they have inherent problems especially with ghosting and signal signature overlap. The older method of locating delaminations in bridge decks involves either tapping on the surface with a hammer or metal rod, or dragging a chain-bar across the bridge deck. Both methods require a `calibrated' ear to determine the difference between good sections and bad sections of concrete. As a consequence, the method is highly subjective, different from person to person and even day to day for a given person. In addition, archival of such data is impractical, or at least improbable, in most situations. The Diagnostic Instrumentation and Analysis Laboratory has constructed an instrument that implements the chain-drag method of concrete inspection. The system is capable of real-time analysis of recorded signals, archival of processed data, and high-speed data acquisition so that post-processing of the data is possible for either research purposes or for listening to the recorded signals.
NASA Astrophysics Data System (ADS)
Borovski, A.; Postylyakov, O.; Elokhov, A.; Bruchkovski, I.
2017-11-01
An instrument for measuring atmospheric trace gases by the DOAS method using scattered solar radiation was developed at the A.M. Obukhov IAP RAS. The instrument layout is based on the lab Shamrock 303i spectrograph supplemented by a 2-port radiation input system employing optical fiber. The optical ports may be used with a telescope with a fixed field of view or with a scanning MAX-DOAS unit. The MAX-DOAS unit port will be used for investigation of gas contents and profiles in the low troposphere. In September 2016 the IAP instrument participated in the CINDI-2 campaign, held in the Netherlands. CINDI-2 (2nd Cabauw Intercomparison of Nitrogen Dioxide Measuring Instruments) involved about 40 instruments quasi-synchronously performing DOAS measurements of NO2 and other trace gases. During the campaign the instrument ports had telescopes A and B with a similar field of view of about 0.3°. Telescope A was always directed to the zenith; telescope B was directed at a 5° elevation angle. Two gratings were installed in the spectrometer, providing different spectral resolutions (FWHM 0.4 and 0.8 nm, respectively) and spectral window widths (70 and 140 nm, respectively). During the CINDI-2 campaign we performed test measurements in the UV and visible wavelength ranges to investigate instrument stability and retrieval errors of NO2 and HCHO contents. In this paper we perform a preliminary analysis, based on residual noise, of the errors in retrieving NO2 and HCHO differential slant column densities (DSCDs) from spectra measured in the four modes of the instrument. It was found that rotation of the grating turret did not significantly affect the quality of NO2 DSCD retrieval from spectra measured in the visible spectral region; its influence is much more significant for gas DSCD retrieval from spectra measured in the UV spectral region. The standard deviation of the retrieval error points to the presence of some systematic error.
Methodological considerations when translating “burnout”☆
Squires, Allison; Finlayson, Catherine; Gerchow, Lauren; Cimiotti, Jeannie P.; Matthews, Anne; Schwendimann, Rene; Griffiths, Peter; Busse, Reinhard; Heinen, Maude; Brzostek, Tomasz; Moreno-Casbas, Maria Teresa; Aiken, Linda H.; Sermeus, Walter
2014-01-01
No study has systematically examined how researchers address cross-cultural adaptation of burnout. We conducted an integrative review to examine how researchers had adapted the instruments to the different contexts. We reviewed the Content Validity Indexing scores for the Maslach Burnout Inventory-Human Services Survey from the 12-country comparative nursing workforce study, RN4CAST. In the integrative review, multiple issues related to translation were found in existing studies. In the cross-cultural instrument analysis, 7 out of 22 items on the instrument received an extremely low kappa score. Investigators may need to employ more rigorous cross-cultural adaptation methods when attempting to measure burnout. PMID:25343131
Error Ratio Analysis: Alternate Mathematics Assessment for General and Special Educators.
ERIC Educational Resources Information Center
Miller, James H.; Carr, Sonya C.
1997-01-01
Eighty-seven elementary students in grades four, five, and six, were administered a 30-item multiplication instrument to assess performance in computation across grade levels. An interpretation of student performance using error ratio analysis is provided and the use of this method with groups of students for instructional decision making is…
E-Books in Academic Libraries: Results of a Survey Carried out in Sweden and Lithuania
ERIC Educational Resources Information Center
Maceviciute, Elena; Wilson, T. D.; Gudinavicius, Arunas; Šuminas, Andrius
2017-01-01
Introduction: This paper reports on a study of e-book issues in academic libraries in two European countries representative of small language markets--Sweden and Lithuania. Method: Questionnaire surveys, using the same instrument, were carried out in Swedish and Lithuanian academic libraries. Analysis: Quantitative analysis was performed using…
A Standards-Based Content Analysis of Selected Biological Science Websites
ERIC Educational Resources Information Center
Stewart, Joy E.
2010-01-01
The purpose of this study was to analyze the biology content, instructional strategies, and assessment methods of 100 biological science websites that were appropriate for Grade 12 educational purposes. For the analysis of each website, an instrument, developed from the National Science Education Standards (NSES) for Grade 12 Life Science coupled…
An instrument to assess the statistical intensity of medical research papers.
Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu
2017-01-01
There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measurement scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007 and 2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess intensity across sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs; the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.
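The pairwise percentage agreement reported above has a simple form: the share of items on which two raters gave the same score. The sketch below uses hypothetical ratings from four raters on a short checklist; the rater labels and scores are invented for illustration.

```python
from itertools import combinations

# Hypothetical binary item scores (1 = statistical feature present)
# by four raters on six checklist items
ratings = {
    "R1": [1, 0, 1, 1, 0, 1],
    "R2": [1, 0, 1, 1, 0, 1],
    "R3": [1, 0, 1, 0, 0, 1],
    "R4": [1, 1, 1, 1, 0, 1],
}

def percent_agreement(a, b):
    """Share of items on which two raters gave the same score, in percent."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

for r1, r2 in combinations(ratings, 2):
    print(r1, r2, f"{percent_agreement(ratings[r1], ratings[r2]):.1f}%")
```

Percent agreement ignores chance agreement, which is why the study also reports an ICC across all four raters.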
Stanifer, John W.; Karia, Francis; Voils, Corrine I.; Turner, Elizabeth L.; Maro, Venance; Shimbi, Dionis; Kilawe, Humphrey; Lazaro, Matayo; Patel, Uptal D.
2015-01-01
Introduction Non-communicable diseases are a growing global burden, and structured surveys can identify critical gaps to address this epidemic. In sub-Saharan Africa, there are very few well-tested survey instruments measuring population attributes related to non-communicable diseases. To meet this need, we have developed and validated the first instrument evaluating knowledge, attitudes and practices pertaining to chronic kidney disease in a Swahili-speaking population. Methods and Results Between December 2013 and June 2014, we conducted a four-stage, mixed-methods study among adults from the general population of northern Tanzania. In stage 1, the survey instrument was constructed in English by a group of cross-cultural experts from multiple disciplines and through content analysis of focus group discussions to ensure local significance. Following translation, in stage 2, we piloted the survey through cognitive and structured interviews, and in stage 3, in order to obtain initial evidence of reliability and construct validity, we recruited and then administered the instrument to a random sample of 606 adults. In stage 4, we conducted analyses to establish test-retest reliability and known-groups validity which was informed by thematic analysis of the qualitative data in stages 1 and 2. The final version consisted of 25 items divided into three conceptual domains: knowledge, attitudes and practices. Each item demonstrated excellent test-retest reliability with established content and construct validity. Conclusions We have developed a reliable and valid cross-cultural survey instrument designed to measure knowledge, attitudes and practices of chronic kidney disease in a Swahili-speaking population of Northern Tanzania. 
This instrument may be valuable for addressing gaps in non-communicable diseases care by understanding preferences regarding healthcare, formulating educational initiatives, and directing development of chronic disease management programs that incorporate chronic kidney disease across sub-Saharan Africa. PMID:25811781
Correcting for the effects of pupil discontinuities with the ACAD method
NASA Astrophysics Data System (ADS)
Mazoyer, Johan; Pueyo, Laurent; N'Diaye, Mamadou; Mawet, Dimitri; Soummer, Rémi; Norman, Colin
2016-07-01
The current generation of ground-based coronagraphic instruments uses deformable mirrors to correct for phase errors and to improve contrast levels at small angular separations. Building on these techniques, several space- and ground-based instruments are currently being developed that use two deformable mirrors to correct for both phase and amplitude errors. However, as wavefront control techniques improve, more complex telescope pupil geometries (support structures, segmentation) will soon be a limiting factor for these next-generation coronagraphic instruments. The technique presented in this proceeding, the Active Correction of Aperture Discontinuities method, takes advantage of the fact that most future coronagraphic instruments will include two deformable mirrors, and finds the mirror shapes and actuator movements that correct for the effects introduced by these complex pupil geometries. For any coronagraph previously designed for continuous apertures, this technique allows similar contrast performance to be obtained with a complex aperture (with segmentation and secondary-mirror support structures), with high throughput and flexibility to adapt to changing pupil geometry (e.g., in case of segment failure or maintenance of the segments). We present the results of a parametric analysis performed on the WFIRST pupil, for which we obtained high contrast levels with several deformable mirror setups (mirror size, separation between mirrors), coronagraphs (vortex charge 2, vortex charge 4, APLC) and spectral bandwidths. Because contrast levels and separations are not the only metrics that determine the scientific return of an instrument, we also included in this study the influence of the deformable mirror shapes on the throughput of the instrument and its sensitivity to pointing jitter. Finally, we present results obtained on another potential space-based segmented-aperture telescope. 
The main result of this proceeding is that we now obtain performance comparable to that of the coronagraphs previously designed for WFIRST. First results from the parametric analysis strongly suggest that the two-deformable-mirror setup (mirror size and distance between the mirrors) has an important impact on the final instrument's contrast and throughput performance.
The holistic analysis of gamma-ray spectra in instrumental neutron activation analysis
NASA Astrophysics Data System (ADS)
Blaauw, Menno
1994-12-01
A method for the interpretation of γ-ray spectra as obtained in INAA using linear least squares techniques is described. Results obtained using this technique and the traditional method previously in use at IRI are compared. It is concluded that the method presented performs better with respect to the number of detected elements, the resolution of interferences and the estimation of the accuracies of the reported element concentrations. It is also concluded that the technique is robust enough to obviate the deconvolution of multiplets.
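The holistic, linear-least-squares interpretation described above treats the measured spectrum as a linear combination of known single-element response spectra plus background, and fits all channels at once instead of deconvolving individual multiplets. The sketch below is a toy version of that idea with an invented 8-channel library; real INAA libraries span thousands of channels and include detector response shapes.

```python
import numpy as np

# Hypothetical single-element library spectra (counts per unit concentration)
# over 8 energy channels, plus a flat continuum shape
lib = np.array([
    [0, 5, 20, 5, 0, 0, 0, 0],    # element A: photopeak at channel 2
    [0, 0, 0, 3, 12, 3, 0, 0],    # element B: photopeak at channel 4
    [1, 1, 1, 1, 1, 1, 1, 1],     # continuum / background shape
], dtype=float)

true_amounts = np.array([2.0, 1.5, 4.0])
observed = true_amounts @ lib       # synthetic noise-free measured spectrum

# Fit the whole spectrum at once: minimize ||lib.T @ x - observed||^2
amounts, *_ = np.linalg.lstsq(lib.T, observed, rcond=None)
print(np.round(amounts, 3))         # recovers the true amounts exactly
```

Because overlapping peaks (channels 3-5 here) are fitted jointly, interferences are resolved by the least-squares solution rather than by per-peak deconvolution, which is the robustness the abstract reports.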
Hu, Yannan; van Lenthe, Frank J; Hoffmann, Rasmus; van Hedel, Karen; Mackenbach, Johan P
2017-04-20
The scientific evidence-base for policies to tackle health inequalities is limited. Natural policy experiments (NPE) have drawn increasing attention as a means to evaluating the effects of policies on health. Several analytical methods can be used to evaluate the outcomes of NPEs in terms of average population health, but it is unclear whether they can also be used to assess the outcomes of NPEs in terms of health inequalities. The aim of this study therefore was to assess whether, and to demonstrate how, a number of commonly used analytical methods for the evaluation of NPEs can be applied to quantify the effect of policies on health inequalities. We identified seven quantitative analytical methods for the evaluation of NPEs: regression adjustment, propensity score matching, difference-in-differences analysis, fixed effects analysis, instrumental variable analysis, regression discontinuity and interrupted time-series. We assessed whether these methods can be used to quantify the effect of policies on the magnitude of health inequalities either by conducting a stratified analysis or by including an interaction term, and illustrated both approaches in a fictitious numerical example. All seven methods can be used to quantify the equity impact of policies on absolute and relative inequalities in health by conducting an analysis stratified by socioeconomic position, and all but one (propensity score matching) can be used to quantify equity impacts by inclusion of an interaction term between socioeconomic position and policy exposure. Methods commonly used in economics and econometrics for the evaluation of NPEs can also be applied to assess the equity impact of policies, and our illustrations provide guidance on how to do this appropriately. The low external validity of results from instrumental variable analysis and regression discontinuity makes these methods less desirable for assessing policy effects on population-level health inequalities. 
Increased use of the methods in social epidemiology will help to build an evidence base to support policy making in the area of health inequalities.
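The stratified approach described above, i.e. running the same evaluation method separately by socioeconomic position and comparing the estimates, can be sketched for difference-in-differences. The data below are simulated purely for illustration (a built-in policy effect of 2 plus an extra 3 for the low-SEP group); the variable names are assumptions, not from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
sep_low = rng.integers(0, 2, n)   # 1 = low socioeconomic position
treated = rng.integers(0, 2, n)   # 1 = region exposed to the policy
post = rng.integers(0, 2, n)      # 1 = after policy introduction

# Simulated health score: the policy improves health by 2 on average,
# plus an extra 3 for the low-SEP group (the equity effect to recover)
y = (50 - 5 * sep_low + 2 * treated * post
     + 3 * treated * post * sep_low + rng.normal(0, 1, n))

def did(mask):
    """Difference-in-differences of mean outcome within one SEP stratum."""
    m = lambda t, p: y[mask & (treated == t) & (post == p)].mean()
    return (m(1, 1) - m(1, 0)) - (m(0, 1) - m(0, 0))

did_low, did_high = did(sep_low == 1), did(sep_low == 0)
print(round(did_high, 1), round(did_low, 1), round(did_low - did_high, 1))
```

The gap between the stratum-specific DiD estimates (about 3 here) is the policy's effect on the absolute health inequality; an equivalent regression would include a treated × post × SEP interaction term.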
Radioactive sample effects on EDXRF spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worley, Christopher G
2008-01-01
Energy dispersive X-ray fluorescence (EDXRF) is a rapid, straightforward method to determine sample elemental composition. A spectrum can be collected in a few minutes or less, and elemental content can be determined easily if there is adequate energy resolution. Radioactive alpha emitters, however, emit X-rays during the alpha decay process that complicate spectral interpretation. This is particularly noticeable when using a portable instrument where the detector is located in close proximity to the instrument analysis window held against the sample. A portable EDXRF instrument was used to collect spectra from specimens containing plutonium-239 (a moderate alpha emitter) and americium-241 (a heavy alpha emitter). These specimens were then analyzed with a wavelength dispersive XRF (WDXRF) instrument to demonstrate the different degrees to which sample radiation-induced X-ray emission affects the detectors on these two types of XRF instruments.
Estimation of ovular fiber production in cotton
Van't Hof, J.
1998-09-01
The present invention is a method for rendering cotton fiber cells that are post-anthesis and pre-harvest available for analysis of their physical properties. The method includes the steps of hydrolyzing cotton fiber cells and separating cotton fiber cells from cotton ovules thereby rendering the cells available for analysis. The analysis of the fiber cells is through any suitable means, e.g., visual inspection. Visual inspection of the cells can be accomplished by placing the cells under an instrument for detection, such as microscope or other means. 4 figs.
Calibration Issues and Operating System Requirements for Electron-Probe Microanalysis
NASA Technical Reports Server (NTRS)
Carpenter, P.
2006-01-01
Instrument purchase requirements and dialogue with manufacturers have established hardware parameters for alignment, stability, and reproducibility, which have helped improve the precision and accuracy of electron microprobe analysis (EPMA). The development of correction algorithms and the accurate solution to quantitative analysis problems requires the minimization of systematic errors and relies on internally consistent data sets. Improved hardware and computer systems have resulted in better automation of vacuum systems, stage and wavelength-dispersive spectrometer (WDS) mechanisms, and x-ray detector systems which have improved instrument stability and precision. Improved software now allows extended automated runs involving diverse setups and better integrates digital imaging and quantitative analysis. However, instrumental performance is not regularly maintained, as WDS are aligned and calibrated during installation but few laboratories appear to check and maintain this calibration. In particular, detector deadtime (DT) data is typically assumed rather than measured, due primarily to the difficulty and inconvenience of the measurement process. This is a source of fundamental systematic error in many microprobe laboratories and is unknown to the analyst, as the magnitude of DT correction is not listed in output by microprobe operating systems. The analyst must remain vigilant to deviations in instrumental alignment and calibration, and microprobe system software must conveniently verify the necessary parameters. Microanalysis of mission critical materials requires an ongoing demonstration of instrumental calibration. Possible approaches to improvements in instrument calibration, quality control, and accuracy will be discussed. Development of a set of core requirements based on discussions with users, researchers, and manufacturers can yield documents that improve and unify the methods by which instruments can be calibrated. 
These results can be used to continue improving EPMA.
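The dead-time (DT) correction whose magnitude the text notes is rarely reported has a standard closed form under the non-paralyzable detector model; the choice of that model here is an assumption, and the rates and dead time below are illustrative values only.

```python
def deadtime_corrected_rate(measured_cps, tau_s):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau).
    tau is the detector dead time in seconds; as the text stresses,
    it should be measured for the instrument, not assumed."""
    return measured_cps / (1.0 - measured_cps * tau_s)

# 50 kcps measured with tau = 1 microsecond -> about a 5.3% correction
m = 50_000.0
print(round(deadtime_corrected_rate(m, 1e-6)))   # -> 52632
```

Printing the corrected-to-measured ratio alongside quantitative results would make the magnitude of the DT correction visible to the analyst, addressing the gap the abstract describes.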
Analytical techniques and instrumentation: A compilation
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information on developments in instrumentation is arranged into four sections: (1) instrumentation for analysis; (2) analysis of matter; (3) analysis of electrical and mechanical phenomena; and (4) structural analysis. Patent information for two of the instruments described is presented.
Clinical protein mass spectrometry.
Scherl, Alexander
2015-06-15
Quantitative protein analysis is routinely performed in clinical chemistry laboratories for diagnosis, therapeutic monitoring, and prognosis. Today, protein assays are mostly performed either with non-specific detection methods or immunoassays. Mass spectrometry (MS) is a very specific analytical method potentially very well suited for clinical laboratories. Its unique advantage lies in the high specificity of the detection: any protein sequence variant, post-translational modification or degradation will differ in mass and structure, and these differences will appear in the mass spectrum of the protein. On the other hand, protein MS is a relatively young technique that demands specialized personnel and expensive instrumentation. Many scientists and opinion leaders predict that MS will replace immunoassays for routine protein analysis, but there are only a few protein MS applications routinely used in clinical chemistry laboratories today. The present review consists of a didactical introduction summarizing the pros and cons of MS assays compared to immunoassays, the different instrumentations, and various MS protein assays that have been proposed and/or are used in clinical laboratories. An important distinction is made between full-length protein analysis (top-down method) and peptide analysis after enzymatic digestion of the proteins (bottom-up method), and its implications for the protein assay. The document ends with an outlook on what types of analyses could be used in the future, and for what types of applications MS has a clear advantage compared to immunoassays.
NASA Technical Reports Server (NTRS)
Okada, Asahi A.
2005-01-01
Polycyclic aromatic hydrocarbons (PAHs) are a class of molecules composed of multiple bonded benzene rings. As PAHs are believed to be present on Mars, positive confirmation of their presence there is highly desirable. To extract PAHs, which have low volatility, a fluid extraction method is well suited, and one that does not use organic solvents is especially attractive for in situ instrumental analysis. Subcritical water extraction (SCWE), which exploits the fact that water at subcritical pressures and temperatures is a relatively non-polar solvent, has significant potential. As SCWE instruments have not yet been commercialized, all instruments are individually built research prototypes; thus, initial efforts were intended to determine whether extraction efficiencies on the JPL-built laboratory-scale SCWE instrument are comparable to those of differing designs built elsewhere. Samples of soil with certified reference concentrations of PAHs were extracted using SCWE as well as conventional Soxhlet extraction. Continuation of the work would involve extractions on JPL's newer, portable SCWE instrument prototype to determine its efficiency in extracting PAHs.
Hicks, Kathryn
2014-09-01
This article examines the influence of emotional and instrumental support on women's immune function, a biomarker of stress, in the city of El Alto, Bolivia. It tests the prediction that instrumental support is protective of immune function for women living in this marginal environment. Qualitative and quantitative ethnographic methods were employed to assess perceived emotional and instrumental support and common sources of support; multiple linear regression analysis was used to model the relationship between social support and antibodies to the Epstein-Barr virus. These analyses provided no evidence that instrumental social support is related to women's health, but there is some evidence that emotional support from compadres helps protect immune function. © 2014 by the American Anthropological Association.
An instrument to measure job satisfaction of nursing home administrators
Castle, Nicholas G
2006-01-01
Background The psychometric properties of the nursing home administrator job satisfaction questionnaire (NHA-JSQ) are presented, and the steps used to develop this instrument. Methods The NHA-JSQ subscales were developed from pilot survey activities with 93 administrators, content analysis, and a research panel. The resulting survey was sent to 1,000 nursing home administrators. Factor analyses were used to determine the psychometric properties of the instrument. Results Of the 1,000 surveys mailed, 721 usable surveys were returned (72 percent response rate). The factor analyses show that the items were representative of six underlying factors (i.e., coworkers, work demands, work content, work load, work skills, and rewards). Conclusion The NHA-JSQ represents a short, psychometrically sound job satisfaction instrument for use in nursing homes. PMID:17029644
High-throughput real-time quantitative reverse transcription PCR.
Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F
2006-02-01
Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected ΔCt method, and the comparative threshold cycle, or ΔΔCt, method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
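The comparative ΔΔCt calculation named above reduces to a few lines of arithmetic. The sketch below is a minimal illustration of the textbook formula (fold change = 2^−ΔΔCt), assuming roughly 100% amplification efficiency for both target and reference genes; the function name and the sample Ct values are hypothetical, not taken from the protocol.

```python
def ddct_fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the comparative (DeltaDeltaCt) method.

    Assumes target and reference amplify with ~100% efficiency; the
    reference (housekeeping) gene normalizes for RNA input.
    """
    dct_test = ct_target_test - ct_ref_test   # normalize the test sample
    dct_ctrl = ct_target_ctrl - ct_ref_ctrl   # normalize the calibrator
    ddct = dct_test - dct_ctrl
    return 2 ** (-ddct)                       # fold change vs. calibrator

# Hypothetical Ct values: the normalized target crosses threshold
# 3 cycles earlier in the test sample than in the calibrator.
fold = ddct_fold_change(22.0, 18.0, 25.0, 18.0)   # ΔΔCt = -3 → 8-fold
```

Each earlier cycle corresponds to a doubling of template, which is why the efficiency assumption matters: if efficiency is below 100%, the efficiency-corrected ΔCt method should be used instead.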
NASA Astrophysics Data System (ADS)
Rai, A. K.; Kumar, A.; Hies, T.; Nguyen, H. H.
2016-11-01
High sediment load passing through hydropower components erodes the hydraulic components, resulting in loss of efficiency, interruptions in power production, and downtime for repair and maintenance, especially in Himalayan regions. The size and concentration of sediment play a major role in silt erosion. The traditional process of collecting samples manually for laboratory analysis cannot meet the need to monitor temporal variation in sediment properties. In this study, a multi-frequency acoustic instrument was deployed at a desilting chamber to monitor the size and concentration of sediment entering the turbine. Sediment size and concentration entering the turbine were also measured in manual samples collected twice daily. The samples collected manually were analysed in the laboratory with a laser diffraction instrument for size and concentration, as well as by drying and filtering methods for concentration. A conductivity probe was used to calculate total dissolved solids, which was then combined with the drying-method results to calculate the suspended solid content of the samples. The acoustic instrument was found to provide sediment concentration values similar to those of the drying and filtering methods. However, at its current state of development, the mean grain size from the acoustic method did not match the laser diffraction results well in this first field application. Future versions of the software and significant sensitivity improvements of the ultrasonic transducers are expected to increase the accuracy of the obtained results. As the instrument can capture the concentration and, in the future, most likely more accurate mean grain sizes of the suspended sediments, its application for monitoring silt erosion in hydropower plants should be highly useful.
NASA Technical Reports Server (NTRS)
Myers, R. H.
1976-01-01
The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are discussed, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of observations for ground based observation of stratospheric ozone. Other ground based instruments used to measure ozone are also discussed. The statistical differences of ground based measurements of ozone from these different instruments are compared to each other, and to satellite measurements. Mathematical methods (i.e., trend analysis or linear regression analysis) of analyzing the variability of ozone concentration with respect to time and latitude are described. Various time series models which can be employed in accounting for ozone concentration variability are examined.
Ammonia Analysis by Gas Chromatograph/Infrared Detector (GC/IRD)
NASA Technical Reports Server (NTRS)
Scott, Joseph P.; Whitfield, Steve W.
2003-01-01
Methods are being developed at Marshall Space Flight Center's Toxicity Lab on a GC/IRD system that will be used to detect ammonia at low part per million (ppm) levels. These methods will allow analysis of gas samples by syringe injections. The GC is equipped with a unique cryogenic-cooled inlet system that will enable our lab to make large injections of a gas sample. Although the initial focus of the work will be analysis of ammonia, this instrument could identify other compounds on a molecular level. If proper methods can be developed, the IRD could work as a powerful addition to our offgassing capabilities.
A Multirater Instrument for the Assessment of Simulated Pediatric Crises
Calhoun, Aaron W; Boone, Megan; Miller, Karen H; Taulbee, Rebecca L; Montgomery, Vicki L; Boland, Kimberly
2011-01-01
Background Few validated instruments exist to measure pediatric code team skills. The goal of this study was to develop an instrument for the assessment of resuscitation competency and self-appraisal using multirater and gap analysis methodologies. Methods Multirater assessment with gap analysis is a robust methodology that enables the measurement of self-appraisal as well as competency, offering faculty the ability to provide enhanced feedback. The Team Performance during Simulated Crises Instrument (TPDSCI) was grounded in the Accreditation Council for Graduate Medical Education competencies. The instrument contains 5 competencies, each assessed by a series of descriptive rubrics. It was piloted during a series of simulation-based interdisciplinary pediatric crisis resource management education sessions. Course faculty assessed participants, who also completed self-assessments. Internal consistency and interrater reliability were analyzed using Cronbach α and intraclass correlation (ICC) statistics. Gap analysis results were examined descriptively. Results Cronbach α for the instrument was between 0.69 and 0.72. The overall ICC was 0.82. ICC values for the medical knowledge, clinical skills, communication skills, and systems-based practice competencies were between 0.72 and 0.87. The ICC for the professionalism domain was 0.22. Further examination of the professionalism competency revealed a positive skew. Forty-three simulated sessions (98%) had significant gaps for at least one of the competencies, 38 sessions (86%) had gaps indicating self-overappraisal, and 15 sessions (34%) had gaps indicating self-underappraisal. Conclusions The TPDSCI possesses good measures of internal consistency and interrater reliability with respect to medical knowledge, clinical skills, communication skills, systems-based practice, and overall competence in the context of simulated interdisciplinary pediatric medical crises. Professionalism remains difficult to assess. 
These results provide an encouraging first step toward instrument validation. Gap analysis reveals disparities between faculty and self-assessments that indicate inadequate participant self-reflection. Identifying self-overappraisal can facilitate focused interventions. PMID:22379528
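The internal-consistency and gap statistics reported above are standard calculations. A minimal sketch of both (illustrative only, not the study's code; each item holds one rating per session):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for k items, each a list of ratings (one per session).

    alpha = k/(k-1) * (1 - sum of item variances / variance of summed score).
    """
    k = len(item_scores)
    totals = [sum(session) for session in zip(*item_scores)]
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def gap(self_rating, faculty_rating):
    """Gap score: positive = self-overappraisal, negative = self-underappraisal."""
    return self_rating - faculty_rating
```

Perfectly consistent items yield α = 1; a large positive mean gap across sessions is the self-overappraisal pattern the study describes.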
MARTINS, Renata de Castro; BAHIA, Maria Guiomar de Azevedo; BUONO, Vicente Tadeu Lopes
2010-01-01
Objective This study identified which regions of ProTaper instruments work during curved root canal instrumentation. Material and methods Twelve ProTaper instruments of each type, S1, S2, F1, and F2, were assessed morphometrically by measuring tip angle, tip length, tip diameter, length of each pitch along the cutting blades, and instrument diameter at each millimeter from the tip. Curved canals in resin blocks were explored with manual stainless steel files and prepared with ProTaper instruments until the apical end following four distinct sequences of instrumentation: S1; S1 and S2; S1, S2, and F1; S1, S2, F1, and F2. Image analysis was employed for measuring canal diameters. The diameters of the canals and diameters of the instruments were compared. Data were analyzed by one-way ANOVA and Tukey’s test. Results No statistically significant difference was found between the canal and instrument diameters (p>0.05). The largest diameters in the end-point of the instrumented canals were obtained with F1 and F2 instruments, and in the initial and middle thirds with S1 and S2 instruments. Conclusions All instruments worked at the tip and along their cutting blades, making them susceptible to failure by torsion, fatigue, or a combination of these two mechanisms. PMID:20379681
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trahan, Alexis Chanel
New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (α, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (α,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (α,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. 
Simulation of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were tested on a variety of spontaneous fission-driven fresh fuel assemblies at Los Alamos National Laboratory and the BeRP ball at the Nevada National Security Site. The development of the new, improved analysis and characterization methods with the DDSI instrument makes it a viable technique for implementation in a facility to meet material control and safeguards needs.
NASA Astrophysics Data System (ADS)
Furton, Kenneth G.; Harper, Ross J.; Perr, Jeannette M.; Almirall, Jose R.
2003-09-01
A comprehensive study and comparison is underway using biological detectors and instrumental methods for the rapid detection of ignitable liquid residues (ILR) and high explosives. Headspace solid phase microextraction (SPME) has been demonstrated to be an effective sampling method helping to identify active odor signature chemicals used by detector dogs to locate forensic specimens as well as a rapid pre-concentration technique prior to instrumental detection. Common ignitable liquids and common military and industrial explosives have been studied including trinitrotoluene, tetryl, RDX, HMX, EGDN, PETN and nitroglycerine. This study focuses on identifying volatile odor signature chemicals present, which can be used to enhance the level and reliability of detection of ILR and explosives by canines and instrumental methods. While most instrumental methods currently in use focus on particles and on parent organic compounds, which are often involatile, characteristic volatile organics are generally also present and can be exploited to enhance detection particularly for well-concealed devices. Specific examples include the volatile odor chemicals 2-ethyl-1-hexanol and cyclohexanone, which are readily available in the headspace of the high explosive composition C-4; whereas, the active chemical cyclo-1,3,5-trimethylene-2,4,6-trinitramine (RDX) is not. The analysis and identification of these headspace 'fingerprint' organics is followed by double-blind dog trials of the individual components using certified teams in an attempt to isolate and understand the target compounds to which dogs are sensitive. Studies to compare commonly used training aids with the actual target explosive have also been undertaken to determine their suitability and effectiveness. 
The optimization of solid phase microextraction (SPME) combined with ion trap mobility spectrometry (ITMS) and gas chromatography/mass spectrometry/mass spectrometry (GC/MSn) is detailed including interface development and comparisons of limits of detection. These instrumental methods are being optimized in order to detect the same target odor chemicals used by detector dogs to reliably locate explosives and ignitable liquids.
Comparison of soil pollution concentrations determined using AAS and portable XRF techniques.
Radu, Tanja; Diamond, Dermot
2009-11-15
Past mining activities in the area of Silvermines, Ireland, have resulted in heavily polluted soils. The possibility of spreading pollution to the surrounding areas through dust blow-offs poses a potential threat for the local communities. Conventional environmental soil and dust analysis techniques are very slow and laborious and consequently there is a need for fast and accurate analytical methods, which can provide real-time in situ pollution mapping. Laboratory-based aqua regia acid digestion of the soil samples collected in the area followed by the atomic absorption spectrophotometry (AAS) analysis confirmed very high pollution, especially by Pb, As, Cu, and Zn. In parallel, samples were analyzed using portable X-ray fluorescence radioisotope and miniature tube powered (XRF) NITON instruments and their performance was compared. Overall, the portable XRF instrument gave excellent correlation with the laboratory-based reference AAS method.
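Agreement between a portable field instrument and a laboratory reference method is typically summarized by the correlation of paired measurements. A minimal sketch of that comparison, with hypothetical Pb readings (the study's actual data are not reproduced here):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two paired measurement series, e.g.
    portable-XRF vs. reference AAS concentrations (illustrative sketch)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x)
                      * sum((b - my) ** 2 for b in y))

# Hypothetical paired Pb readings (mg/kg) for five soil samples:
xrf = [120, 340, 560, 900, 1500]
aas = [120, 320, 600, 880, 1450]
r = pearson_r(xrf, aas)   # close to 1 for well-correlated methods
```

A correlation near 1 supports using the faster field method for in situ pollution mapping, with the laboratory method retained as the reference.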
Expression microdissection adapted to commercial laser dissection instruments
Hanson, Jeffrey C; Tangrea, Michael A; Kim, Skye; Armani, Michael D; Pohida, Thomas J; Bonner, Robert F; Rodriguez-Canales, Jaime; Emmert-Buck, Michael R
2016-01-01
Laser-based microdissection facilitates the isolation of specific cell populations from clinical or animal model tissue specimens for molecular analysis. Expression microdissection (xMD) is a second-generation technology that offers considerable advantages in dissection capabilities; however, until recently the method has not been accessible to investigators. This protocol describes the adaptation of xMD to commonly used laser microdissection instruments and to a commercially available handheld laser device in order to make the technique widely available to the biomedical research community. The method improves dissection speed for many applications by using a targeting probe for cell procurement in place of an operator-based, cell-by-cell selection process. Moreover, xMD can provide improved dissection precision because of the unique characteristics of film activation. The time to complete the protocol is highly dependent on the target cell population and the number of cells needed for subsequent molecular analysis. PMID:21412274
Mass Spectrometry in Studies of Protein Thiol Chemistry and Signaling: Opportunities and Caveats
Devarie Baez, Nelmi O.; Reisz, Julie A.; Furdui, Cristina M.
2014-01-01
Mass spectrometry (MS) has become a powerful and widely utilized tool in the investigation of protein thiol chemistry, biochemistry, and biology. Very early biochemical studies of metabolic enzymes have brought to light the broad spectrum of reactivity profiles that distinguish cysteine thiols with functions in catalysis and protein stability from other cysteine residues in proteins. The development of MS methods for the analysis of proteins using electrospray ionization (ESI) or matrix-assisted laser desorption/ionization (MALDI) coupled with the emergence of high-resolution mass analyzers have been instrumental in advancing studies of thiol modifications, both in single proteins and within the cellular context. This article reviews MS instrumentation and methods of analysis employed in investigations of thiols and their reactivity toward a range of small biomolecules. A selected number of studies are detailed to highlight the advantages brought about by the MS technologies along with the caveats associated with these analyses. PMID:25261734
Klepárník, Karel
2015-01-01
This review focuses on the latest developments in microseparation electromigration methods in capillaries and microfluidic devices with MS detection and identification. A wide selection of 183 relevant articles covers the literature published from June 2012 to May 2014 as a continuation of the review article on the same topic by Klepárník [Electrophoresis 2013, 34, 70-86]. Special attention is paid to new improvements in the theory, instrumentation, and methodology of MS interfacing with capillary versions of zone electrophoresis, ITP, and IEF. Ionization methods in MS include ESI, MALDI, and ICP. Although the main attention is paid to the development of instrumentation and methodology, representative examples also illustrate applications in proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography and micellar electrokinetic chromatography are not included. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[Isolation and identification methods of enterobacteria group and its technological advancement].
Furuta, Itaru
2007-08-01
In the last half-century, isolation and identification methods for enterobacteria groups have improved markedly through technological advancement. Clinical microbiology tests have changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original approach to identifying enterobacteria groups, and remain a basic, essential means of understanding bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as the utilization of carbohydrates and the indole, methyl red, citrate, and urease tests. Commercial identification kits and automated instruments with computer-based analysis are also discussed as current methods; these provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing methods using PCR analysis and immunochemical methods using monoclonal antibodies, can be developed further.
Major advances in testing of dairy products: milk component and dairy product attribute testing.
Barbano, D M; Lynch, J M
2006-04-01
Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.
Rutter, A.P.; Hanford, K.L.; Zwers, J.T.; Perillo-Nicholas, A. L.; Schauer, J.J.; Olson, M.L.
2008-01-01
Reactive gaseous mercury (RGM) and particulate mercury (PHg) were collected in Milwaukee, WI, between April 2004 and May 2005, and in Riverside, CA, between July 25 and August 7, 2005, using sorbent and filter substrates. The substrates were analyzed for mercury by thermal desorption analysis (TDA) using a purpose-built instrument. Results from this offline TDA method were compared with measurements using a real-time atmospheric mercury analyzer. RGM measurements made with offline TDA agreed well with the commercial real-time method. However, offline TDA reported PHg concentrations 2.7 times higher than the real-time method, indicating that evaporative losses might be occurring from the real-time instrument during sample collection. TDA combined with reactive mercury collection on filter and absorbent substrates was cheap, relatively easy to use, did not introduce biases due to a semicontinuous sample collection strategy, and had a dynamic range appropriate for use in rural and urban locations. The results of this study demonstrate that offline TDA is a feasible method for measuring reactive mercury concentrations in a large network of filter-based samplers. Copyright 2008 Air & Waste Management Association.
Wang, Yang; Liu, Fang; Li, Peng; He, Chengwei; Wang, Ruibing; Su, Huanxing; Wan, Jian-Bo
2016-07-13
Pseudotargeted metabolomics is a novel strategy integrating the advantages of both untargeted and targeted methods. The conventional pseudotargeted metabolomics required two MS instruments, i.e., ultra-high performance liquid chromatography/quadrupole time-of-flight mass spectrometry (UHPLC/Q-TOF MS) and UHPLC/triple quadrupole mass spectrometry (UHPLC/QQQ-MS), which makes method transformation inevitable. Furthermore, the picking of ion pairs from thousands of candidates and the swapping of the data between two instruments are the most labor-intensive steps, which greatly limit its application in metabolomic analysis. In the present study, we proposed an improved pseudotargeted metabolomics method that could be achieved on an UHPLC/Q-TOF/MS instrument operated in the multiple ion monitoring (MIM) mode with time-staggered ion lists (tsMIM). Full scan-based untargeted analysis was applied to extract the target ions. After peak alignment and ion fusion, a stepwise ion picking procedure was used to generate the ion lists for subsequent single MIM and tsMIM. The UHPLC/Q-TOF tsMIM MS-based pseudotargeted approach exhibited better repeatability and a wider linear range than the UHPLC/Q-TOF MS-based untargeted metabolomics method. Compared to the single MIM mode, the tsMIM significantly increased the coverage of the metabolites detected. The newly developed method was successfully applied to discover plasma biomarkers for alcohol-induced liver injury in mice, which indicated its practicability and great potential in future metabolomics studies. Copyright © 2016 Elsevier B.V. All rights reserved.
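The time-staggered ion-list idea can be illustrated with a small sketch: candidate ions from the untargeted run are partitioned into retention-time windows so that each MIM segment monitors only the ions expected to elute then, keeping per-scan target counts low. The windowing scheme and all parameter names below are assumptions for illustration, not the authors' implementation.

```python
def staggered_ion_lists(ions, window=1.0, margin=0.2):
    """Group (mz, retention_time) targets into time-staggered MIM lists.

    Each output segment is (start, end, ion_list), covering one
    retention-time window (minutes), widened by a margin to tolerate
    retention-time drift between runs. Illustrative sketch only.
    """
    lists = {}
    for mz, rt in ions:
        seg = int(rt // window)               # which time window this ion falls in
        lists.setdefault(seg, []).append((mz, rt))
    return [(s * window - margin, (s + 1) * window + margin, sorted(v))
            for s, v in sorted(lists.items())]

segs = staggered_ion_lists([(150.1, 0.5), (220.2, 1.4), (305.0, 1.8)])
```

Because the whole workflow stays on one Q-TOF instrument, no ion-pair transfer between platforms is needed, which is the labor-saving point the abstract emphasizes.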
Manard, Benjamin T.; Wylie, E. Miller; Willson, Stephen P.
2018-05-22
In this paper, a portable handheld laser-induced breakdown spectroscopy (HH LIBS) instrument was evaluated as a rapid method to qualitatively analyze rare earth elements in a uranium oxide matrix. This research is motivated by the need for development of a method to perform rapid, at-line chemical analysis in a nuclear facility, particularly to provide a rapid first pass analysis to determine if additional actions or measurements are warranted. This will result in the minimization of handling and transport of radiological and nuclear material and subsequent exposure to their associated hazards. In this work, rare earth elements (Eu, Nd, and Yb) were quantitatively spiked into a uranium oxide powder and analyzed by the HH LIBS instrumentation. This method demonstrates the ability to rapidly identify elemental constituents at sub-percent levels in a uranium matrix. Preliminary limits of detection (LODs) were determined, with values on the order of hundredths of a percent. Validity of this methodology was explored by employing National Institute of Standards and Technology (NIST) standard reference materials (SRMs) 610 and 612 (Trace Elements in Glass). Finally, it was determined that the HH LIBS method was able to clearly discern the rare earth elements of interest in the glass or uranium matrices.
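Preliminary LODs of the kind quoted above are often estimated from a linear calibration as k·σ/slope, commonly with k = 3.3 and σ the residual standard deviation of the fit. The sketch below is that textbook estimate, not necessarily the procedure used in this work:

```python
def lod_from_calibration(conc, signal, k=3.3):
    """Approximate limit of detection as k * (residual std) / slope from a
    linear calibration curve. Common textbook estimate; illustrative only."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # residual std, n-2 dof
    return k * sigma / slope
```

With spiked standards at known concentrations, the same fit also gives the sensitivity (slope) that determines whether sub-percent constituents are resolvable.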
Ab initio simulation of diffractometer instrumental function for high-resolution X-ray diffraction
Mikhalychev, Alexander; Benediktovitch, Andrei; Ulyanenkova, Tatjana; Ulyanenkov, Alex
2015-01-01
Modeling of the X-ray diffractometer instrumental function for a given optics configuration is important both for planning experiments and for the analysis of measured data. A fast and universal method for instrumental function simulation, suitable for fully automated computer realization and describing both coplanar and noncoplanar measurement geometries for any combination of X-ray optical elements, is proposed. The method can be identified as semi-analytical backward ray tracing and is based on the calculation of a detected signal as an integral of X-ray intensities for all the rays reaching the detector. The high speed of calculation is provided by the expressions for analytical integration over the spatial coordinates that describe the detection point. Consideration of the three-dimensional propagation of rays without restriction to the diffraction plane provides the applicability of the method for noncoplanar geometry and the accuracy for characterization of the signal from a two-dimensional detector. The correctness of the simulation algorithm is checked in the following two ways: by verifying the consistency of the calculated data with the patterns expected for certain simple limiting cases and by comparing measured reciprocal-space maps with the corresponding maps simulated by the proposed method for the same diffractometer configurations. Both kinds of tests demonstrate the agreement of the simulated instrumental function shape with the measured data. PMID:26089760
Impact of storage on dark chocolate: texture and polymorphic changes.
Nightingale, Lia M; Lee, Soo-Yeun; Engeseth, Nicki J
2011-01-01
Chocolate storage is critical to final product quality. Inadequate storage, especially with temperature fluctuations, may lead to rearrangement of triglycerides that make up the bulk of the chocolate matrix; this rearrangement may lead to fat bloom. Bloom is the main cause of quality loss in the chocolate industry. The effect of storage conditions leading to bloom formation on texture and flavor attributes by human and instrumental measures has yet to be reported. Therefore, the impact of storage conditions on the quality of dark chocolate by sensory and instrumental measurements was determined. Dark chocolate was kept under various conditions and analyzed at 0, 4, and 8 wk of storage. Ten members of a descriptive panel analyzed texture and flavor. Instrumental methods included texture analysis, color measurement, lipid polymorphism by X-ray diffraction and differential scanning calorimetry, triglyceride concentration by gas chromatography, and surface properties by atomic force microscopy. Results were treated by analysis of variance, cluster analysis, principal component analysis, and linear partial least squares regression analysis. Chocolate stored 8 wk at high temperature without fluctuations and 4 wk with fluctuations transitioned from form V to VI. Chocolates stored at high temperature with and without fluctuations were harder, more fracturable, more toothpacking, had longer melt time, were less sweet, and had less cream flavor. These samples had rougher surfaces, fewer but larger grains, and a heterogeneous surface. Overall, all stored dark chocolate experienced instrumental or perceptual changes attributed to storage condition. Chocolates stored at high temperature with and without fluctuations were most visually and texturally compromised. Practical Application: Many large chocolate companies do their own "in-house" unpublished research and smaller confectionery facilities do not have the means to conduct their own research. 
Therefore, this study relating sensory and instrumental data provides published evidence available for application throughout the confectionery industry.
Paar, Christian; Hammerl, Verena; Blessberger, Hermann; Stekel, Herbert; Steinwender, Clemens; Berg, Jörg
2016-12-01
High resolution melting (HRM) of amplicons is a simple method for genotyping single nucleotide polymorphisms (SNPs). Although many applications have been reported, HRM seems to be rarely used in clinical laboratories. The suitability of HRM-PCR for the clinical laboratory was investigated for genotyping SNPs of the vitamin K epoxide reductase complex subunit 1 gene. About 100 DNA samples were analyzed by two different HRM-PCRs on the Cobas z480 instrument and compared with a PCR with fluorescently labeled probes (HybProbe-PCR) on the LightCycler 2.0 instrument as reference. Reliable genotyping with 100% matching results was obtained when the amplicon size was small (63 bp) and DNA input was limited by, e.g., sample dilution with salt-free water. DNA extracted by differing methods may be used for genotyping by HRM-PCR. Compared with HybProbe-PCR, HRM-PCR on the Cobas z480 instrument allows for higher throughput, however at the cost of a greater degree of laboratory standardization and a slower turnaround.
Johnson, Sindhu R.; Naden, Raymond P.; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E.; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P.; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M.; Mayes, Maureen; Steen, Virginia; Seibold, James R.; Clements, Phillip; Medsger, Thomas A.; Carreira, Patricia E.; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J.; Merkel, Peter A.; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C.; Walker, Ulrich A.; Cappelli, Susanna; Khanna, Dinesh
2014-01-01
Objective Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to: develop an instrument for collating case data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement between experts on the probability that cases were classified as SSc. Study Design and Setting A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank-ordered cases from highest to lowest probability; reduced and weighted the criteria using forced-choice methods; and re-ranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICC). Results Experts endorsed clarity (83%), comprehensibility (100%), and face and content validity (100%). Criteria were weighted (points): finger skin thickening (14–22), finger-tip lesions (9–21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud’s phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7) and puffy fingers (5). The ICC across experts was 0.73 (95% CI 0.58–0.86) and improved to 0.80 (95% CI 0.68–0.90). Conclusions Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (from 23 to 14) and the criteria were weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. PMID:24721558
Sterilization of endoscopic instruments.
Sabnis, Ravindra B; Bhattu, Amit; Vijaykumar, Mohankumar
2014-03-01
Sterilization of endoscopic instruments is an important but often ignored topic. The purpose of this article is to review the current literature on the sterilization of endoscopic instruments and elaborate on appropriate sterilization practices. Autoclaving is an economical and excellent method of sterilizing instruments that are not heat sensitive. Heat-sensitive instruments may be damaged by hot sterilization methods. Several new endoscopic instruments, such as flexible ureteroscopes and chip-on-tip endoscopes, have been added to the urologist's armamentarium. Many of these instruments are heat sensitive, and hence alternative efficacious methods of sterilization are necessary. Although ethylene oxide and hydrogen peroxide are excellent methods of sterilization, they have some drawbacks. Gamma irradiation is mainly for disposable items. Various chemical agents are widely used even though they achieve high-level disinfection rather than sterilization. This article reviews various methods of endoscopic instrument sterilization with their advantages and drawbacks. If appropriate sterilization methods are adopted, they will not only protect patients from procedure-related infections but also prevent hypersensitivity (allergic) reactions. They will also protect instruments from damage and increase their longevity.
Cutting efficiency of four different rotary nickel-titanium instruments
Cecchin, Doglas; de Sousa-Neto, Manoel Damião; Pécora, Jesus Djalma; Gariba-Silva, Ricardo
2011-01-01
Aim: The aim of this study was to evaluate the cutting efficiency of the rotary nickel-titanium (NiTi) instruments K3, NiTi Tee, Profile, and Quantec with taper size 04/25. Materials and Methods: The number of samples was 10 for each group (n = 10). Cutting efficiency was measured by the mass loss from each acrylic resin block after instrumentation of a simulated canal using the crown-down technique. Results: Analysis of variance (ANOVA) showed a statistically significant difference among the studied groups. Tukey's test showed that the acrylic resin blocks prepared with the K3 (0.00369 ± 0.00022), NiTi Tee (0.00368 ± 0.00023), and Profile (0.00351 ± 0.00026) instruments presented the greatest mass loss, with no statistically significant difference among them (P > 0.05). The lowest mass loss was found in the blocks prepared with Quantec instruments (0.00311 ± 0.0003) (P < 0.05). Conclusions: The K3, NiTi Tee, and Profile instruments presented greater cutting efficiency than the Quantec instruments. PMID:21814349
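The ANOVA-then-Tukey workflow this abstract describes can be sketched in a few lines. The following Python sketch computes a one-way ANOVA F statistic from first principles on synthetic mass-loss data; the values are hypothetical draws centred on the group means quoted above, not the study's raw measurements.

```python
import random

def one_way_anova(groups):
    """One-way ANOVA: returns (F, df_between, df_within) for a list of samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical mass-loss samples (g), n = 10 per group, centred on the
# group means quoted in the abstract (the spread is invented).
random.seed(1)
means = {"K3": 0.00369, "NiTi Tee": 0.00368, "Profile": 0.00351, "Quantec": 0.00311}
groups = [[random.gauss(m, 0.00025) for _ in range(10)] for m in means.values()]
F, df_b, df_w = one_way_anova(groups)
```

A Tukey HSD step would then compare group means pairwise against a studentized-range critical value; the F test above only establishes that some difference exists.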
Escobar, Rogelio
2014-11-01
Enrique O. Aragón established the first psychological laboratory in Mexico in 1916. This laboratory was inspired by Wundt's laboratory and by those created afterward in Germany and the United States. It was equipped with state-of-the-art instruments imported from Germany in 1902 from Ernst Zimmermann, who supplied instruments for Wundt's laboratory. Although previous authors have described the social events leading to the creation of the laboratory, there are limited descriptions of the instruments, their use, and their influence. With the aid of archival resources, the initial location of the laboratory was determined. The analysis of instruments revealed a previously overlooked relation with an earlier laboratory of experimental physiology. The influence of the laboratory was traced by describing the careers of 4 students, 3 of them women, who worked with the instruments during the first 2 decades of the 20th century, each of whom became an accomplished scholar. In addition, by identifying and analyzing the instruments shown in photographs of the psychological laboratory and in 1 motion film, this article provides information on the class demonstrations and the experiments conducted in this laboratory.
NASA Astrophysics Data System (ADS)
Preusse, Peter; Dörnbrack, Andreas; Eckermann, Stephen D.; Riese, Martin; Schaeler, Bernd; Bacmeister, Julio T.; Broutman, Dave; Grossmann, Klaus U.
2002-09-01
The Cryogenic Infrared Spectrometers and Telescopes for the Atmosphere (CRISTA) instrument measured stratospheric temperatures and trace species concentrations with high precision and spatial resolution during two missions. The measuring technique is infrared limb-sounding of optically thin emissions. In a general approach, we investigate the applicability of the technique to measure gravity waves (GWs) in the retrieved temperature data. It is shown that GWs with horizontal wavelengths of the order of 100-200 km can be detected. The results are applicable to any instrument using the same technique. We discuss additional constraints inherent to the CRISTA instrument. The vertical field of view and the influence of the sampling and retrieval imply that waves with vertical wavelengths of ~3-5 km or larger can be retrieved. Global distributions of GW fluctuations were extracted from temperature data measured by CRISTA using the Maximum Entropy Method (MEM) and Harmonic Analysis (HA), yielding height profiles of vertical wavelength and peak amplitude for the fluctuations in each scanned profile. The method is discussed and compared to Fourier transform analyses and standard deviations. Analysis of data from the first mission reveals large GW amplitudes in the stratosphere over southernmost South America. These waves obey the dispersion relation for linear two-dimensional mountain waves (MWs). The horizontal structure on 6 November 1994 is compared to temperature fields calculated by the Pennsylvania State University (PSU)/National Center for Atmospheric Research (NCAR) mesoscale model (MM5). It is demonstrated that precise knowledge of the instrument's sensitivity is essential. Particularly good agreement is found at the southern tip of South America, where the MM5 accurately reproduces the amplitudes and phases of a large-scale wave with a 400 km horizontal wavelength. Targeted ray-tracing simulations allow us to interpret some of the observed wave features. A companion paper will discuss MWs on a global scale and estimate the fraction that MWs contribute to the total GW energy (Preusse et al., in preparation, 2002).
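The harmonic-analysis step, in which a height profile is searched for the amplitude and phase of a wave of given vertical wavelength, can be illustrated with a least-squares projection. This is a generic sketch on a synthetic temperature profile, not the CRISTA MEM/HA pipeline itself; the phase convention used is A*cos(kz - phi).

```python
import math

def harmonic_fit(z, T, wavelength):
    """Least-squares amplitude and phase of a single harmonic with the
    given vertical wavelength in a temperature profile T(z)."""
    k = 2 * math.pi / wavelength
    n = len(z)
    # Projections onto cos and sin; exact for uniform sampling over
    # an integer number of periods.
    a = sum(t * math.cos(k * zi) for zi, t in zip(z, T)) * 2 / n
    b = sum(t * math.sin(k * zi) for zi, t in zip(z, T)) * 2 / n
    return math.hypot(a, b), math.atan2(b, a)

# Synthetic profile: 4 K amplitude, 5 km vertical wavelength, sampled
# every 0.25 km over 20 km (an integer number of periods).
z = [0.25 * i for i in range(80)]
T = [4.0 * math.cos(2 * math.pi * zi / 5.0 + 1.0) for zi in z]
amp, phase = harmonic_fit(z, T, 5.0)
```

In practice the wavelength itself is unknown and is scanned over (or estimated by MEM) before this projection is applied at each height.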
Triqui, Réda; Bouchriti, Nourredine
2003-12-17
Freshness of ice-stored sardine was assessed by two sensory methods, the quality index method (QIM) and the European Union freshness grading system, and by instrumental means using the method of aroma extract dilution analysis. Screening of sardine potent volatiles was carried out at three freshness stages. In the very fresh state, the plant-like fresh volatiles dominated the odor pattern, with the exception of methional. Overall odor changes in sardine throughout storage correlated with changes in the concentration of some potent volatiles: after 2 days of ice storage, (Z)-4-heptenal, (Z)-1,5-octadien-3-one, and methional imparted an overall "fishy" odor character to sardine, whereas at a lower sensory grade (B), the compounds (E)-2-nonenal and (E,Z)-2,6-nonadienal could be, in part, associated with the slightly rancid aroma top notes. Trimethylamine was detected as a highly volatile odorant using solid-phase microextraction (SPME) headspace analysis of refrigerator-stored sardine. Intensity and sensory characteristics of some SPME determined volatiles, for example, 3-methylnonane-2,4-dione, were closely related to overall odor changes. SPME headspace analysis may be useful in the characterization of off-flavors in fish.
Validation of the organizational culture assessment instrument.
Heritage, Brody; Pollock, Clare; Roberts, Lynne
2014-01-01
Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted that many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four-factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction, but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged.
Sensitive ion detection device and method for analysis of compounds as vapors in gases
Denton, M. Bonner; Sperline, Roger P.
2015-09-15
An ion mobility spectrometer (IMS) for the detection of trace gaseous molecular compounds dissolved or suspended in a carrier gas, particularly in ambient air, without preconcentration or the trapping of analyte particles. The IMS of the invention comprises an ionization volume of greater than 5 cm^3 and preferably greater than 100 cm^3. The larger ionizers of this invention enable analysis of trace (<1 ppb) levels of sample compounds in the gas phase. To facilitate efficient ion motion through the large-volume ionization and reaction regions of the IMS, an electric field gradient can be provided in the ionization region or in both the ionization and reaction regions. The systems can be implemented with radioactive ionization sources or corona discharge ion sources, or ions can be formed by photoionization. In specific embodiments, particularly when the sample gas is ambient air, the sample gas is heated prior to entry into the instrument, the instrument is run at temperatures above ambient, and the instrument can be heated by contact with heated sample gas exiting the instrument.
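The patent text does not give the mobility equations, but the standard drift-tube relations behind any IMS measurement are worth stating: the mobility is K = L^2 / (V * t_d) for drift length L, drift voltage V, and drift time t_d, and the reduced mobility K0 normalises K to 273.15 K and 760 Torr. A minimal sketch of that conversion (the function name and example numbers are illustrative, not from the patent):

```python
def reduced_mobility(L_cm, V_volts, t_d_s, T_K, P_torr):
    """Reduced ion mobility K0 (cm^2 V^-1 s^-1) from drift-tube parameters.

    K = L^2 / (V * t_d), then normalised to 273.15 K and 760 Torr.
    """
    K = L_cm ** 2 / (V_volts * t_d_s)
    return K * (273.15 / T_K) * (P_torr / 760.0)

# Illustrative numbers: a 10 cm tube at 5 kV with a 10 ms drift time.
K0_std = reduced_mobility(10.0, 5000.0, 0.01, 273.15, 760.0)
K0_warm = reduced_mobility(10.0, 5000.0, 0.01, 298.15, 760.0)
```

K0 is the quantity tabulated for analyte identification, since it removes the dependence on ambient temperature and pressure.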
Sensitive ion detection device and method for analysis of compounds as vapors in gases
Denton, M. Bonner; Sperline, Roger P
2014-02-18
An ion mobility spectrometer (IMS) for the detection of trace gaseous molecular compounds dissolved or suspended in a carrier gas, particularly in ambient air, without preconcentration or the trapping of analyte particles. The IMS of the invention comprises an ionization volume of greater than 5 cm^3 and preferably greater than 100 cm^3. The larger ionizers of this invention enable analysis of trace (<1 ppb) levels of sample compounds in the gas phase. To facilitate efficient ion motion through the large-volume ionization and reaction regions of the IMS, an electric field gradient can be provided in the ionization region or in both the ionization and reaction regions. The systems can be implemented with radioactive ionization sources or corona discharge ion sources, or ions can be formed by photoionization. In specific embodiments, particularly when the sample gas is ambient air, the sample gas is heated prior to entry into the instrument, the instrument is run at temperatures above ambient, and the instrument can be heated by contact with heated sample gas exiting the instrument.
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for the estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
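Propagating individual measurement uncertainties through a defining functional expression, as described above, follows the first-order formula sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2 for independent inputs. A generic numerical sketch using central differences; the dynamic-pressure example q = 0.5*rho*v^2 is a hypothetical illustration, not taken from the paper.

```python
import math

def propagate(f, x, sigma):
    """First-order propagated standard uncertainty of f(x1..xn) given
    independent per-input standard uncertainties, via numerical partials."""
    var = 0.0
    for i, (xi, si) in enumerate(zip(x, sigma)):
        h = 1e-6 * (abs(xi) + 1.0)          # step scaled to the input
        xp, xm = list(x), list(x)
        xp[i], xm[i] = xi + h, xi - h
        dfdx = (f(xp) - f(xm)) / (2 * h)    # central-difference partial
        var += (dfdx * si) ** 2
    return math.sqrt(var)

# Hypothetical example: dynamic pressure q = 0.5 * rho * v**2 with
# rho = 1.2 +/- 0.01 kg/m^3 and v = 50 +/- 0.5 m/s.
q = lambda p: 0.5 * p[0] * p[1] ** 2
sq = propagate(q, [1.2, 50.0], [0.01, 0.5])
```

For correlated inputs (the paper's main concern), covariance terms 2*(df/dx_i)*(df/dx_j)*cov(x_i, x_j) must be added to the variance sum.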
Bayesian Analysis of the Cosmic Microwave Background
NASA Technical Reports Server (NTRS)
Jewell, Jeffrey
2007-01-01
There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. A Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP one- and three-year temperature and polarization data). Development continues for Planck, where the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to the total uncertainty in cosmological parameters.
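Gibbs sampling, the engine mentioned above, alternates draws from the full conditional distributions of each block of parameters. A minimal, self-contained illustration of the mechanics on a toy target (a standard bivariate normal with correlation rho, not the CMB posterior): each conditional is itself Gaussian, x|y ~ N(rho*y, 1 - rho^2) and symmetrically for y|x.

```python
import math
import random

def gibbs_bivariate_normal(rho, n, burn=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho,
    alternating exact conditional draws x|y and y|x."""
    rng = random.Random(seed)
    sd = math.sqrt(1 - rho ** 2)
    x = y = 0.0
    out = []
    for i in range(n + burn):
        x = rng.gauss(rho * y, sd)   # draw x | y
        y = rng.gauss(rho * x, sd)   # draw y | x
        if i >= burn:                # discard burn-in sweeps
            out.append((x, y))
    return out

draws = gibbs_bivariate_normal(0.8, 20000)
mean_x = sum(x for x, _ in draws) / len(draws)
corr_est = sum(x * y for x, y in draws) / len(draws)
```

The real CMB analyses cycle through much larger blocks (sky map, power spectrum, foreground and noise parameters), but the alternating-conditional structure is the same.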
Creating a Computer Adaptive Test Version of the Late-Life Function & Disability Instrument
Jette, Alan M.; Haley, Stephen M.; Ni, Pengsheng; Olarsch, Sippy; Moed, Richard
2009-01-01
Background This study applied Item Response Theory (IRT) and Computer Adaptive Test (CAT) methodologies to develop a prototype function and disability assessment instrument for use in aging research. Herein, we report on the development of the CAT version of the Late-Life Function & Disability instrument (Late-Life FDI) and evaluate its psychometric properties. Methods We employed confirmatory factor analysis, IRT methods, validation, and computer simulation analyses of data collected from 671 older adults residing in residential care facilities. We compared accuracy, precision, and sensitivity to change of scores from CAT versions of two Late-Life FDI scales with scores from the fixed-form instrument. Score estimates from the prototype CAT versus the original instrument were compared in a sample of 40 older adults. Results Distinct function and disability domains were identified within the Late-Life FDI item bank and used to construct two prototype CAT scales. Using retrospective data, scores from computer simulations of the prototype CAT scales were highly correlated with scores from the original instrument. The results of computer simulation, accuracy, precision, and sensitivity to change of the CATs closely approximated those of the fixed-form scales, especially for the 10- or 15-item CAT versions. In the prospective study each CAT was administered in less than 3 minutes and CAT scores were highly correlated with scores generated from the original instrument. Conclusions CAT scores of the Late-Life FDI were highly comparable to those obtained from the full-length instrument with a small loss in accuracy, precision, and sensitivity to change. PMID:19038841
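The IRT/CAT machinery referenced above rests on an item response model plus an item-selection rule. A toy sketch under the 2-parameter logistic model, where the next item administered is the one with maximum Fisher information at the current ability estimate; the five-item bank and ability value are invented for illustration, not drawn from the Late-Life FDI item bank.

```python
import math

def p_correct(theta, b, a=1.0):
    """2-parameter logistic IRT probability of endorsing an item."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, b, a=1.0):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, b, a)
    return a * a * p * (1 - p)

def next_item(theta, bank):
    """CAT step: pick the item with maximum information at theta."""
    return max(bank, key=lambda item: item_information(theta, *item))

# Hypothetical Rasch-style bank: difficulties only (discrimination a = 1).
bank = [(-2.0,), (-1.0,), (0.0,), (1.0,), (2.0,)]
chosen = next_item(0.3, bank)
```

Information for a Rasch item peaks at b = theta, so this rule simply administers the item whose difficulty best matches the current ability estimate; a full CAT would then update theta (e.g. by maximum likelihood) and repeat until a precision target is met.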
Improvements in Virtual Sensors: Using Spatial Information to Estimate Remote Sensing Spectra
NASA Technical Reports Server (NTRS)
Oza, Nikunj C.; Srivastava, Ashok N.; Stroeve, Julienne
2005-01-01
Various instruments are used to create images of the Earth and other objects in the universe in a diverse set of wavelength bands with the aim of understanding natural phenomena. Sometimes these instruments are built in a phased approach, with additional measurement capabilities added in later phases. In other cases, technology may mature to the point that the instrument offers new measurement capabilities that were not planned in the original design of the instrument. In still other cases, high resolution spectral measurements may be too costly to perform on a large sample and therefore lower resolution spectral instruments are used to take the majority of measurements. Many applied science questions that are relevant to the earth science remote sensing community require analysis of enormous amounts of data that were generated by instruments with disparate measurement capabilities. In past work [1], we addressed this problem using Virtual Sensors: a method that uses models trained on spectrally rich (high spectral resolution) data to "fill in" unmeasured spectral channels in spectrally poor (low spectral resolution) data. We demonstrated this method by using models trained on the high spectral resolution Terra MODIS instrument to estimate what the equivalent of the MODIS 1.6 micron channel would be for the NOAA AVHRR2 instrument. The scientific motivation for the simulation of the 1.6 micron channel is to improve the ability of the AVHRR2 sensor to detect clouds over snow and ice. This work contains preliminary experiments demonstrating that the use of spatial information can improve our ability to estimate these spectra.
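The Virtual Sensors idea, training a model on spectrally rich data so it can fill in an unmeasured channel, reduces in its simplest form to a regression from measured channels to the missing one. A self-contained least-squares sketch on synthetic, noiseless data; the channel values are invented, and the published work used MODIS/AVHRR2 radiances with more capable nonlinear models.

```python
def fit_two_channel_model(x1, x2, y):
    """OLS for y ~ w0 + w1*x1 + w2*x2 via the 3x3 normal equations,
    solved by Gaussian elimination with partial pivoting."""
    n = len(y)
    cols = [[1.0] * n, list(x1), list(x2)]          # design matrix columns
    A = [[sum(ci * cj for ci, cj in zip(c1, c2)) for c2 in cols] for c1 in cols]
    b = [sum(c * yi for c, yi in zip(col, y)) for col in cols]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [arj - f * aij for arj, aij in zip(A[r], A[i])]
            b[r] -= f * b[i]
    w = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                              # back substitution
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, 3))) / A[i][i]
    return w

# Hypothetical training data: the "missing" channel generated as a known
# linear combination of two measured channels, so the fit is checkable.
x1 = [0.1, 0.4, 0.5, 0.9, 1.3, 2.0]
x2 = [1.0, 0.8, 0.2, 0.5, 0.1, 0.3]
y = [2.0 + 0.7 * a - 0.3 * c for a, c in zip(x1, x2)]
w = fit_two_channel_model(x1, x2, y)
```

The paper's extension of adding spatial context amounts to appending neighbouring-pixel values as extra predictor columns in the same regression setup.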
Enke, Christie
2013-02-19
Methods and instruments for high dynamic range analysis of sample components are described. A sample is subjected to time-dependent separation, ionized, and the ions dispersed with a constant integration time across an array of detectors according to their m/z values. Each of the detectors in the array has a dynamically adjustable gain or a logarithmic response function, producing an instrument capable of detecting a ratio of responses of 4 or more orders of magnitude.
Shuttle Tethered Aerothermodynamics Research Facility (STARFAC) Instrumentation Requirements
NASA Technical Reports Server (NTRS)
Wood, George M.; Siemers, Paul M.; Carlomagno, Giovanni M.; Hoffman, John
1986-01-01
The instrumentation requirements for the Shuttle Tethered Aerothermodynamic Research Facility (STARFAC) are presented. The typical physical properties of the terrestrial atmosphere are given, along with representative atmospheric daytime ion concentrations and a comparison of equilibrium and nonequilibrium gas properties at a point away from a wall. STARFAC science and engineering measurements are given, as is the TSS free-stream gas analysis. The potential nonintrusive measurement techniques for hypersonic boundary layer research are outlined, along with the quantitative physical measurement methods for aerothermodynamic studies.
Evaluation of Acoustic Doppler Current Profiler measurements of river discharge
Morlock, S.E.
1996-01-01
The standard deviations of the ADCP measurements ranged from approximately 1 to 6 percent and were generally higher than the measurement errors predicted by error-propagation analysis of ADCP instrument performance. These error-prediction methods assume that the largest component of ADCP discharge measurement error is instrument related. The larger standard deviations indicate that substantial portions of measurement error may be attributable to sources unrelated to ADCP electronics or signal processing and are functions of the field environment.
Preetam, C S; Chandrashekhar, M; Gunaranjan, T; Kumar, S Kishore; Miskeen Sahib, S A; Kumar, M Senthil
2016-08-01
The purpose of this study was to identify an effective method for removing root canal filling material from the root canal system. The study thus aimed to evaluate the cleaning efficacy of two different rotary Ni-Ti systems, ProTaper Retreatment files and the RaCe system, compared to hand instrumentation with Hedstrom files for the removal of gutta-percha during retreatment. Thirty mandibular premolars with a single straight canal were decoronated, instrumented with ProTaper files, and filled with thermoplastic gutta-percha. After 30 days, the samples were divided into three groups and the gutta-percha was removed with the test instruments. The postoperative radiographs were evaluated against known criteria by dividing the root into cervical, middle, and apical thirds. The results were tabulated, and Statistical Package for Social Sciences software (IBM Corporation) was used for analysis. The mean deviation of the results was first calculated, and then the t-test and analysis of variance (two-tailed P value) were applied to establish significant differences. The rotary instruments were effective in removing gutta-percha from the canals, and a significant difference was observed between the efficacies of the two rotary systems used. The rotary instruments showed effective gutta-percha removal in the cervical and middle thirds (P > 0.05), whereas apical debridement was more effective with Hedstrom files. The study concluded that both rotary and hand instrumentation can be used for effective removal of gutta-percha during retreatment.
Newham, Rosemary; Bennie, Marion; Maxwell, David; Watson, Anne; de Wet, Carl; Bowie, Paul
2014-12-01
A positive and strong safety culture underpins effective learning from patient safety incidents in health care, including the community pharmacy (CP) setting. To build this culture, perceptions of safety climate must be measured with context-specific and reliable instruments. No pre-existing instruments were specifically designed or suitable for CP within Scotland. We therefore aimed to develop a psychometrically sound instrument to measure perceptions of safety climate within Scottish CPs. The first stage, development of a preliminary instrument, comprised three steps: (i) a literature review; (ii) focus group feedback; and (iii) content validation. The second stage, psychometric testing, consisted of three further steps: (iv) a pilot survey; (v) a survey of all CP staff within a single health board in NHS Scotland; and (vi) application of statistical methods, including principal components analysis and calculation of Cronbach's reliability coefficients, to derive the final instrument. The preliminary questionnaire was developed through a process of literature review and feedback. This questionnaire was completed by staff in 50 CPs from the 131 (38%) sampled. 250 completed questionnaires were suitable for analysis. Psychometric evaluation resulted in a 30-item instrument with five positively correlated safety climate factors: leadership, teamwork, safety systems, communication and working conditions. Reliability coefficients were satisfactory for the safety climate factors (α > 0.7) and overall (α = 0.93). The robust nature of the technical design and testing process has resulted in the development of an instrument with sufficient psychometric properties, which can be implemented in the community pharmacy setting in NHS Scotland. © 2014 John Wiley & Sons, Ltd.
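Cronbach's alpha, the reliability coefficient reported above, can be computed directly from item-score columns as alpha = k/(k-1) * (1 - sum(item variances)/variance(total scores)). A sketch on hypothetical Likert responses; the numbers are invented, not survey data from this study.

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns (equal lengths)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Hypothetical 4-item, 6-respondent safety-climate scores (1-5 Likert).
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 5, 2, 4, 3, 5],
    [3, 4, 3, 4, 2, 4],
]
alpha = cronbach_alpha(items)
```

Alpha approaches 1 as the items covary strongly relative to their individual variances, which is why a value like the study's 0.93 is read as high internal consistency.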
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
Blom, E M; Verdaasdonk, E G G; Stassen, L P S; Stassen, H G; Wieringa, P A; Dankelman, J
2007-09-01
Verbal communication in the operating room during surgical procedures affects team performance, reflects individual skills, and is related to the complexity of the operation process. During the procedural training of surgeons (residents), feedback and guidance are given through verbal communication. A classification method based on structural analysis of the contents was developed to analyze verbal communication. This study aimed to evaluate whether a classification method for the contents of verbal communication in the operating room could provide insight into the teaching processes. Eight laparoscopic cholecystectomies were videotaped. Two entire cholecystectomies and the dissection phase of six additional procedures were analyzed by categorizing the communication by type (4 categories: commanding, explaining, questioning, and miscellaneous) and content (9 categories: operation method, location, direction, instrument handling, visualization, anatomy and pathology, general, private, undefinable). The operation was divided into six phases: start, dissection, clipping, separating, control, and closing. Classification of the communication during two entire procedures showed that each phase of the operation was dominated by different kinds of communication. A high percentage of explaining anatomy and pathology was found throughout the whole procedure except for the control and closing phases. In the dissection phases, 60% of verbal communication concerned explaining. These explaining communication events were divided as follows: 27% operation method, 19% anatomy and pathology, 25% location (positioning of the instrument-tissue interaction), 15% direction (direction of tissue manipulation), 11% instrument handling, and 3% other nonclassified instructions. The proposed classification method is feasible for analyzing verbal communication during surgical procedures. Communication content objectively reflects the interaction between surgeon and resident. This information can potentially be used to specify training needs and may contribute to the evaluation of different training methods.
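Turning a log of classified communication events into percentage breakdowns like those reported above is a simple tally. A sketch on a hypothetical event list; the category labels follow the abstract, but the events themselves are invented.

```python
from collections import Counter

def percentages(labels):
    """Percentage share of each label in an iterable of labels."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {k: 100.0 * v / total for k, v in counts.items()}

# Hypothetical (type, content) labels for six communication events.
events = [
    ("explaining", "operation method"), ("explaining", "anatomy and pathology"),
    ("commanding", "instrument handling"), ("explaining", "location"),
    ("questioning", "visualization"), ("explaining", "operation method"),
]
by_type = percentages(t for t, _ in events)
```

The same function applied to the content labels, restricted to one operation phase, reproduces the kind of per-phase breakdown the study reports.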
Jakupovic, Vedran; Solakovic, Suajb; Celebic, Nedim; Kulovic, Dzenan
2018-01-01
Introduction: Diabetes is a progressive condition which requires various ways of treatment. Adequate therapy prescribed at the right time helps the patient postpone the development of complications. Adherence to complicated therapy is a challenge for both patients and HCPs and is the subject of research in many disciplines. Improved communication between HCPs and patients is very important for patients' adherence to therapy. Aim: The aim of this research was to explore the validity and reliability of a modified SERVQUAL instrument in an attempt to explore ways of motivating diabetic patients to accept prescribed insulin therapy. Material and Methods: We used a modified SERVQUAL questionnaire as the research instrument. It was necessary to check the validity and reliability of the new modified instrument. Results: The results show that the modified SERVQUAL instrument has excellent reliability (α=0.908), so we can say that it precisely measures Expectations, Perceptions, and Motivation in patients. Factor analysis (EFA method) with Varimax rotation extracted 4 factors which together explain 52.902% of the variance of the results on this subscale. The bifactorial solution could be seen on the scree-plot diagram (break at the second factor). Conclusion: The results of this research show that the modified SERVQUAL instrument, which was created to measure the expectations and perceptions of patients, is valid and reliable. Reliability and validity were also demonstrated for the additional dimension created originally for this research: motivation to accept insulin therapy. PMID:29670478
Analysis of standard reference materials by absolute INAA
NASA Astrophysics Data System (ADS)
Heft, R. E.; Koszykowski, R. F.
1981-07-01
Three standard reference materials (fly ash, soil, and AISI 4340 steel) were analyzed by a method of absolute instrumental neutron activation analysis. Two different light-water pool-type reactors produced equivalent analytical results even though the epithermal-to-thermal flux ratio in one reactor was higher than in the other by a factor of two.
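Absolute INAA rests on the standard activation equation, which relates the induced activity at the end of irradiation to the number of target atoms N, the capture cross section sigma, the neutron flux phi, and the irradiation time: A = N * sigma * phi * (1 - exp(-lambda * t)). A generic sketch of that relation; the symbols are standard, and the example inputs below are illustrative rather than taken from the abstract.

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, t_irr, half_life):
    """End-of-irradiation activity A = N * sigma * phi * (1 - exp(-lambda*t)),
    with the decay constant lambda derived from the half-life (same time units)."""
    lam = math.log(2) / half_life
    return n_atoms * sigma_cm2 * flux * (1 - math.exp(-lam * t_irr))

# Illustrative inputs: 1e20 target atoms, 1 barn (1e-24 cm^2) cross section,
# 1e13 n cm^-2 s^-1 thermal flux.
a_half = induced_activity(1e20, 1e-24, 1e13, 1.0, 1.0)   # one half-life
a_sat = induced_activity(1e20, 1e-24, 1e13, 1e9, 1.0)    # saturation
```

In absolute (as opposed to comparator) INAA, element concentrations are obtained directly from this equation using measured fluxes and tabulated nuclear data, which is why the epithermal-to-thermal flux ratio mentioned in the abstract matters: the epithermal component activates via resonance integrals rather than the thermal cross section.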
NASA Astrophysics Data System (ADS)
Chan, Y. C.; Vowles, P. D.; McTainsh, G. H.; Simpson, R. W.; Cohen, D. D.; Bailey, G. M.; McOrist, G. D.
This paper describes a method for the simultaneous collection of size-fractionated aerosol samples on several collection substrates, including glass-fibre filter, carbon tape and silver tape, with a commercially available high-volume cascade impactor. This permitted various chemical analysis procedures, including ion beam analysis (IBA), instrumental neutron activation analysis (INAA), carbon analysis and scanning electron microscopy (SEM), to be carried out on the samples.
Surakanti, Jayaprada Reddy; Venkata, Ravi Chandra Polavarapu; Vemisetty, Hari Kumar; Dandolu, Ram Kiran; Jaya, Nagendra Krishna Muppalla; Thota, Shirisha
2014-01-01
Background and Aims: Extrusion of any debris during endodontic treatment may potentially cause post-operative complications such as flare-ups. The purpose of this in vitro study was to assess the amount of apically extruded debris during root canal preparation using rotary and reciprocating nickel-titanium instrumentation systems. Materials and Methods: In this study, 60 human mandibular first premolars were randomly assigned to 3 groups (n = 20 teeth/group). The root canals were instrumented according to the manufacturers' instructions using the reciprocating single-file system WaveOne™ (Dentsply Maillefer, Ballaigues, Switzerland) and the full-sequence rotary Hyflex CM™ (Coltene Whaledent, Allstetten, Switzerland) and ProTaper™ (Dentsply Maillefer, Ballaigues, Switzerland) instruments. The canals were then irrigated using bidistilled water. The debris extruded apically was collected in preweighed Eppendorf tubes, assessed with an electronic balance and compared. Statistical Analysis Used: Debris extrusion was compared and statistically analyzed using analysis of variance and the post hoc Student-Newman-Keuls test. Results: The WaveOne™ and ProTaper™ instruments produced significantly more debris than the Hyflex CM™ rotary instruments (P < 0.05). Conclusions: Under the conditions of this study, all systems resulted in extrusion of apical debris. Full-sequence rotary instrumentation was associated with less debris extrusion compared with the use of reciprocating single-file systems. PMID:24778507
Review of Diesel Odor and Toxic Vapor Emissions
DOT National Transportation Integrated Search
1981-05-01
The purpose of the study was to attempt to assess the adequacy of the diesel engine exhaust chemical composition data base and instrumental analysis methods for the measurement of chemicals giving rise to sensory response, especially odor and irritat...
Effectively Transforming IMC Flight into VMC Flight: An SVS Case Study
NASA Technical Reports Server (NTRS)
Glaab, Louis J.; Hughes, Monic F.; Parrish, Russell V.; Takallu, Mohammad A.
2006-01-01
A flight-test experiment was conducted using the NASA LaRC Cessna 206 aircraft. Four primary flight and navigation display concepts, including baseline and Synthetic Vision System (SVS) concepts, were evaluated in the local area of Roanoke Virginia Airport, flying visual and instrument approach procedures. A total of 19 pilots, from 3 pilot groups reflecting the diverse piloting skills of the GA population, served as evaluation pilots. Multi-variable Discriminant Analysis was applied to three carefully selected and markedly different operating conditions with conventional instrumentation to provide an extension of traditional analysis methods as well as provide an assessment of the effectiveness of SVS displays to effectively transform IMC flight into VMC flight.
Spreadsheet for designing valid least-squares calibrations: A tutorial.
Bettencourt da Silva, Ricardo J N
2016-02-01
Instrumental methods of analysis are used to define the price of goods, the compliance of products with a regulation, or the outcome of fundamental or applied research. These methods can only play their role properly if the reported information is objective and its quality is fit for the intended use. If measurement results are reported with an adequately small measurement uncertainty, both of these goals are achieved. The evaluation of the measurement uncertainty can be performed by the bottom-up approach, which involves a detailed description of the measurement process, or by a pragmatic top-down approach that quantifies major uncertainty components from global performance data. The bottom-up approach is not so frequently used due to the need to master the quantification of the individual components responsible for random and systematic effects that affect measurement results. This work presents a tutorial that can be easily used by non-experts for the accurate evaluation of the measurement uncertainty of instrumental methods of analysis calibrated using least-squares regressions. The tutorial includes the definition of the calibration interval, the assessment of instrumental response homoscedasticity, the definition of the calibrator preparation procedure required for application of the least-squares regression model, the assessment of instrumental response linearity and the evaluation of measurement uncertainty. The developed measurement model is only applicable in calibration ranges where signal precision is constant. An MS-Excel file is made available to allow easy application of the tutorial. This tool can be useful in cases where top-down approaches cannot produce results with adequately low measurement uncertainty. An example of the application of this tool to the determination of nitrate in water by ion chromatography is presented. Copyright © 2015 Elsevier B.V. All rights reserved.
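The core of such a tutorial, an ordinary least-squares calibration with the standard prediction-uncertainty formula for an interpolated concentration, can be sketched as follows. The nitrate standards and responses below are hypothetical, and this is a simplified OLS sketch under constant signal precision, not the spreadsheet's full procedure:

```python
import numpy as np

def calibrate_and_predict(x, y, y0, m=1):
    """OLS calibration y = a + b*x; returns the predicted concentration x0
    and its standard uncertainty for a sample signal y0 (mean of m replicates)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    b, a = np.polyfit(x, y, 1)                    # slope, intercept
    resid = y - (a + b * x)
    s_y = np.sqrt((resid ** 2).sum() / (n - 2))   # residual standard deviation
    sxx = ((x - x.mean()) ** 2).sum()
    x0 = (y0 - a) / b
    # Standard uncertainty of x0 from the classical calibration formula
    u_x0 = (s_y / abs(b)) * np.sqrt(1/m + 1/n + (y0 - y.mean())**2 / (b**2 * sxx))
    return x0, u_x0

# Hypothetical nitrate standards (mg/L) and instrument responses
x_std = [0.5, 1.0, 2.0, 4.0, 8.0]
y_std = [0.051, 0.103, 0.198, 0.402, 0.801]
conc, u = calibrate_and_predict(x_std, y_std, y0=0.300)
```

The uncertainty grows toward the ends of the calibration interval, which is one reason the tutorial emphasises defining that interval explicitly.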
Berridge, Georgina; Chalk, Rod; D’Avanzo, Nazzareno; Dong, Liang; Doyle, Declan; Kim, Jung-In; Xia, Xiaobing; Burgess-Brown, Nicola; deRiso, Antonio; Carpenter, Elisabeth Paula; Gileadi, Opher
2011-01-01
We have developed a method for intact mass analysis of detergent-solubilized and purified integral membrane proteins using liquid chromatography–mass spectrometry (LC–MS) with methanol as the organic mobile phase. Membrane proteins and detergents are separated chromatographically during the isocratic stage of the gradient profile from a 150-mm C3 reversed-phase column. The mass accuracy is comparable to standard methods employed for soluble proteins; the sensitivity is 10-fold lower, requiring 0.2–5 μg of protein. The method is also compatible with our standard LC–MS method used for intact mass analysis of soluble proteins and may therefore be applied on a multiuser instrument or in a high-throughput environment. PMID:21093405
Development and validity of a method for the evaluation of printed education material
Castro, Mauro Silveira; Pilger, Diogo; Fuchs, Flávio Danni; Ferreira, Maria Beatriz Cardoso
Objectives To develop and study the validity of an instrument for the evaluation of Printed Education Materials (PEM); to evaluate the use of acceptability indices; to identify possible influences of professional aspects. Methods An instrument for PEM evaluation was developed in three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposively introduced into the PEM, which served as extreme groups. An acceptability index was applied taking into account the rate of professionals who approved each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results Many professionals identified the intentional errors of a crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required analysis of the literature. Physicians considered 95.8% of the items of the PEM acceptable; nurses, 29.2%. The differences between the scores were statistically significant for 27% of the items. In the overall evaluation, 66.6% of the items were considered acceptable. The analysis of each item revealed a behavioural pattern for each professional group. Conclusions Instruments for the evaluation of printed education materials are needed and may improve the quality of the PEM available to patients. Acceptability indices are not always entirely correct, nor do they always represent high quality of information. The professional experience, the practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material. PMID:25214924
A mathematical model for describing the mechanical behaviour of root canal instruments.
Zhang, E W; Cheung, G S P; Zheng, Y F
2011-01-01
The purpose of this study was to establish a general mathematical model for describing the mechanical behaviour of root canal instruments by combining a theoretical analytical approach with a numerical finite-element method. Mathematical formulas representing the longitudinal (taper, helical angle and pitch) and cross-sectional configurations and area, the bending and torsional inertia, the curvature of the boundary point and the geometry of the loading condition were derived. Torsional and bending stresses and the resultant deformation were expressed mathematically as functions of these geometric parameters, the modulus of elasticity of the material and the applied load. As illustrations, three brands of NiTi endodontic files of different cross-sectional configurations (ProTaper, Hero 642, and Mani NRT) were analysed under pure torsion and pure bending by entering the model into a finite-element analysis package (ANSYS). Numerical results confirmed that the mathematical models were a feasible means of analysing the mechanical properties and predicting the stress and deformation of root canal instruments during root canal preparation. Mathematical and numerical models can be a suitable way to examine mechanical behaviour as a criterion for instrument design and to predict the stress and strain experienced by endodontic instruments during root canal preparation. © 2010 International Endodontic Journal.
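As a toy illustration of the kind of relation the model encodes, the classical formulas σ = Mc/I for bending and τ = Tr/J for torsion can be evaluated for an idealized circular cross-section. The dimensions and loads below are invented, and real files have non-circular cross-sections, which is precisely what the authors' more general model addresses:

```python
import math

def circular_section_stresses(diameter_mm: float, moment_nmm: float, torque_nmm: float):
    """Max bending and torsional shear stress (MPa) for a solid circular
    cross-section: sigma = M*c/I, tau = T*r/J."""
    r = diameter_mm / 2
    I = math.pi * diameter_mm**4 / 64   # second moment of area
    J = math.pi * diameter_mm**4 / 32   # polar moment of inertia (J = 2I)
    sigma = moment_nmm * r / I          # max bending stress (N/mm^2 = MPa)
    tau = torque_nmm * r / J            # max torsional shear stress
    return sigma, tau

# Hypothetical file tip: 0.3 mm diameter, 1.5 N*mm bending moment, 1.0 N*mm torque
sigma, tau = circular_section_stresses(0.3, 1.5, 1.0)
```

The strong inverse dependence on diameter (fourth power in I and J) is why stresses concentrate near the instrument tip.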
Duong, Hong Anh; Le, Minh Duc; Nguyen, Kim Diem Mai; Hauser, Peter C; Pham, Hung Viet; Mai, Thanh Duc
2015-11-01
A simple and inexpensive method for the determination of various ionic species in different water matrices is discussed in this study. The approach is based on the employment of in-house-made capillary electrophoresis (CE) instruments with capacitively coupled contactless conductivity detection (C(4)D), which can be realized even when only a modest financial budget and limited expertise are available. Advantageous features and considerations of these instruments are detailed following their pilot deployment in Vietnam. Different categories of ionic species, namely major inorganic cations (K(+), Na(+), Ca(2+), Mg(2+), and NH4(+)) and major inorganic anions (Cl(-), NO3(-), NO2(-), SO4(2-), and phosphate), in different water matrices in Vietnam were determined using these in-house fabricated instruments. Inorganic trivalent arsenic (As(III)), which is the most abundant form of arsenic in reducing groundwater, was determined by CE-C(4)D. The effect of some interfering ions in groundwater on the analytical performance was investigated and is highlighted. The results from the in-house-made CE-C(4)D instruments were cross-checked against those obtained using standard methods (AAS, AES, UV and IC), with correlation coefficients r(2) ≥ 0.9 and deviations from the reference results of less than 15%.
Chow, Clara K.; Corsi, Daniel J.; Lock, Karen; Madhavan, Manisha; Mackie, Pam; Li, Wei; Yi, Sun; Wang, Yang; Swaminathan, Sumathi; Lopez-Jaramillo, Patricio; Gomez-Arbelaez, Diego; Avezum, Álvaro; Lear, Scott A.; Dagenais, Gilles; Teo, Koon; McKee, Martin; Yusuf, Salim
2014-01-01
Background Previous research has shown that environments with features that encourage walking are associated with increased physical activity. Existing methods to assess the built environment using geographical information systems (GIS) data, direct audit or large surveys of the residents face constraints, such as data availability and comparability, when used to study communities in countries in diverse parts of the world. The aim of this study was to develop a method to evaluate features of the built environment of communities using a standard set of photos. In this report we describe the method of photo collection, photo analysis instrument development and inter-rater reliability of the instrument. Methods/Principal Findings A minimum of 5 photos were taken per community in 86 communities in 5 countries according to a standard set of instructions from a designated central point of each community by researchers at each site. A standard pro forma derived from reviewing existing instruments to assess the built environment was developed and used to score the characteristics of each community. Photo sets from each community were assessed independently by three observers in the central research office according to the pro forma and the inter-rater reliability was compared by intra-class correlation (ICC). Overall 87% (53 of 60) items had an ICC of ≥0.70, 7% (4 of 60) had an ICC between 0.60 and 0.70 and 5% (3 of 60) items had an ICC ≤0.50. Conclusions/Significance Analysis of photos using a standardized protocol as described in this study offers a means to obtain reliable and reproducible information on the built environment in communities in very diverse locations around the world. The collection of the photographic data required minimal training and the analysis demonstrated high reliability for the majority of items of interest. PMID:25369366
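The inter-rater agreement statistic used above can be reproduced with a standard two-way intraclass correlation. A sketch assuming the ICC(2,1) absolute-agreement form (the abstract does not state which ICC form was used), with invented observer scores:

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """Two-way random-effects, absolute-agreement, single-rater ICC(2,1)
    for an (n_subjects, n_raters) matrix of scores."""
    n, k = ratings.shape
    grand = ratings.mean()
    ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    resid = (ratings - ratings.mean(axis=1, keepdims=True)
             - ratings.mean(axis=0, keepdims=True) + grand)
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return ((ms_rows - ms_err)
            / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n))

# Illustrative item scores from 3 observers across 6 communities
scores = np.array([[4, 4, 5],
                   [2, 2, 2],
                   [3, 4, 3],
                   [5, 5, 5],
                   [1, 2, 1],
                   [4, 3, 4]])
icc = icc_2_1(scores)
```

By the study's own threshold, values of 0.70 or above indicate acceptable agreement between the three central-office observers.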
Who's in and why? A typology of stakeholder analysis methods for natural resource management.
Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C
2009-04-01
Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.
Dose Calibration of the ISS-RAD Fast Neutron Detector
NASA Technical Reports Server (NTRS)
Zeitlin, C.
2015-01-01
The ISS-RAD instrument has been fabricated by Southwest Research Institute and delivered to NASA for flight to the ISS in late 2015 or early 2016. ISS-RAD is essentially two instruments that share a common interface to the ISS. The two instruments are the Charged Particle Detector (CPD), which is very similar to the MSL-RAD detector on Mars, and the Fast Neutron Detector (FND), which is a boron-loaded plastic scintillator with readout optimized for the 0.5 to 10 MeV energy range. As the FND is completely new, it has been necessary to develop methodology to allow it to be used to measure the neutron dose and dose equivalent. This talk will focus on the methods developed and their implementation using calibration data obtained in quasi-monoenergetic (QMN) neutron fields at the PTB facility in Braunschweig, Germany. The QMN data allow us to determine an approximate response function, from which we estimate dose and dose equivalent contributions per detected neutron as a function of the pulse height. We refer to these as the "pSv per count" curves for dose equivalent and the "pGy per count" curves for dose. The FND is required to provide a dose equivalent measurement with an accuracy of ±10% of the known value in a calibrated AmBe field. Four variants of the analysis method were developed, corresponding to two different approximations of the pSv per count curve and two different implementations, one for real-time analysis onboard the ISS and one for ground analysis. We will show that the preferred method, when applied in either real-time or ground analysis, yields good accuracy for the AmBe field. We find that the real-time algorithm is more susceptible to chance-coincidence background than the algorithm used in ground analysis, so the best estimates will come from the latter.
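The "pSv per count" approach reduces to weighting a pulse-height histogram by a response curve and summing. A minimal sketch with invented bin centres, counts and weighting values (not the FND's calibrated response):

```python
import numpy as np

# Hypothetical pulse-height bin centres (arbitrary units) and detected counts
bins = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
counts = np.array([1200, 800, 450, 150, 30])

# Assumed "pSv per count" weighting: dose-equivalent contribution per
# detected neutron as a function of pulse height (illustrative values only)
psv_per_count = np.array([0.8, 1.5, 3.2, 7.0, 15.0])

# Total dose equivalent is the count-weighted sum over the histogram
dose_equiv_psv = (counts * psv_per_count).sum()
dose_equiv_usv = dose_equiv_psv * 1e-6  # pSv -> microSv
```

The real analysis differs mainly in how the weighting curve is derived from the QMN response function and in the background handling that separates the real-time and ground variants.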
A stereotaxic method of recording from single neurons in the intact in vivo eye of the cat.
Molenaar, J; Van de Grind, W A
1980-04-01
A method is described for recording stereotaxically from single retinal neurons in the optically intact in vivo eye of the cat. The method is implemented with the help of a new type of stereotaxic instrument and a specially developed stereotaxic atlas of the cat's eye and retina. The instrument is extremely stable and facilitates intracellular recording from retinal neurons. The microelectrode can be rotated about two mutually perpendicular axes, which intersect in the freely positionable pivot point of the electrode manipulation system. When the pivot point is made to coincide with a small electrode-entrance hole in the sclera of the eye, a large retinal region can be reached through this fixed hole in the immobilized eye. The stereotaxic method makes it possible to choose a target point on the presented eye atlas and predict the settings of the instrument necessary to reach this target. This method also includes the prediction of the corresponding light stimulus position on a tangent screen and the calculation of the projection of the recording electrode on this screen. The sources of error in the method were studied experimentally and a numerical perturbation analysis was carried out to study the influence of each of the sources of error on the final result. The overall accuracy of the method is of the order of 5 degrees of visual angle, which will be sufficient for most purposes.
Wavelength Scanning with a Tilting Interference Filter for Glow-Discharge Elemental Imaging.
Storey, Andrew P; Ray, Steven J; Hoffmann, Volker; Voronov, Maxim; Engelhard, Carsten; Buscher, Wolfgang; Hieftje, Gary M
2017-06-01
Glow discharges have long been used for depth profiling and bulk analysis of solid samples. In addition, over the past decade, several methods of obtaining lateral surface elemental distributions have been introduced, each with its own strengths and weaknesses. Challenges for each of these techniques are acceptable optical throughput and added instrumental complexity. Here, these problems are addressed with a tilting-filter instrument. A pulsed glow discharge is coupled to an optical system comprising an adjustable-angle tilting filter, collimating and imaging lenses, and a gated, intensified charge-coupled device (CCD) camera, which together provide surface elemental mapping of solid samples. The tilting-filter spectrometer is instrumentally simpler, produces less image distortion, and achieves higher optical throughput than a monochromator-based instrument, but has a much more limited tunable spectral range and poorer spectral resolution. As a result, the tilting-filter spectrometer is limited to single-element or two-element determinations, and only when the target spectral lines fall within an appropriate spectral range and can be spectrally discerned. Spectral interferences that result from heterogeneous impurities can be flagged and overcome by observing the spatially resolved signal response across the available tunable spectral range. The instrument has been characterized and evaluated for the spatially resolved analysis of glow-discharge emission from selected but representative samples.
Validation of ocean color sensors using a profiling hyperspectral radiometer
NASA Astrophysics Data System (ADS)
Ondrusek, M. E.; Stengel, E.; Rella, M. A.; Goode, W.; Ladner, S.; Feinholz, M.
2014-05-01
Validation measurements of satellite ocean color sensors require in situ measurements that are accurate, repeatable and traceable enough to distinguish variability between in situ measurements and variability in the signal being observed on orbit. The utility of using a Satlantic Profiler II equipped with HyperOCR radiometers (Hyperpro) for validating ocean color sensors is tested by assessing the stability of the calibration coefficients and by comparing Hyperpro in situ measurements to other instruments and between different Hyperpros in a variety of water types. Calibration and characterization of the NOAA Satlantic Hyperpro instrument is described and concurrent measurements of water-leaving radiances conducted during cruises are presented between this profiling instrument and other profiling, above-water and moored instruments. The moored optical instruments are the US operated Marine Optical BuoY (MOBY) and the French operated Boussole Buoy. In addition, Satlantic processing versions are described in terms of accuracy and consistency. A new multi-cast approach is compared to the most commonly used single cast method. Analysis comparisons are conducted in turbid and blue water conditions. Examples of validation matchups with VIIRS ocean color data are presented. With careful data collection and analysis, the Satlantic Hyperpro profiling radiometer has proven to be a reliable and consistent tool for satellite ocean color validation.
Henley, W Hampton; He, Yan; Mellors, J Scott; Batz, Nicholas G; Ramsey, J Michael; Jorgenson, James W
2017-11-10
Ultra-high voltage capillary electrophoresis with high electric field strength has been applied to the separation of the charge variants, drug conjugates, and disulfide isomers of monoclonal antibodies. Samples composed of many closely related species are difficult to resolve and quantify using traditional analytical instrumentation. High-performance instrumentation can often save considerable time and effort otherwise spent on extensive method development. Ideally, the resolution obtained for a given CE buffer system scales with the square root of the applied voltage. Currently available commercial CE instrumentation is limited to an applied voltage of approximately 30 kV and a maximum electric field strength of 1 kV/cm due to design limitations. The instrumentation described here is capable of safely applying potentials of at least 120 kV with electric field strengths over 2000 V/cm, potentially doubling the resolution of the best conventional CE buffer/capillary systems while decreasing analysis time in some applications. Separations of these complex mixtures using this new instrumentation demonstrate the potential of ultra-high voltage CE to identify the presence of previously unresolved components and to reduce analysis time for complex mixtures of antibody variants and drug conjugates. Copyright © 2017 Elsevier B.V. All rights reserved.
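The square-root scaling mentioned above implies a specific resolution gain for the higher voltage; a one-line check under that ideal assumption:

```python
import math

def resolution_gain(v_new_kv: float, v_ref_kv: float = 30.0) -> float:
    """Ideal CE resolution gain relative to a reference voltage, assuming
    resolution scales with the square root of the applied voltage."""
    return math.sqrt(v_new_kv / v_ref_kv)

# 120 kV vs. the ~30 kV limit of commercial instruments
gain = resolution_gain(120.0)
```

This recovers the factor-of-two improvement the abstract cites for 120 kV over 30 kV, under the idealized scaling; real gains depend on the buffer and capillary system.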
Extended cyclic fatigue life of F2 ProTaper instruments used in reciprocating movement.
De-Deus, G; Moreira, E J L; Lopes, H P; Elias, C N
2010-12-01
To evaluate the cyclic fatigue fracture resistance of engine-driven F2 ProTaper instruments under reciprocating movement. A sample of 30 NiTi ProTaper F2 instruments was used. An artificial canal was made from a stainless steel tube, allowing the instruments to rotate freely. During mechanical testing, different movement kinematics and speeds were used, resulting in three experimental groups (n = 10). The instruments in the first group (G1) were rotated at a nominal speed of 250 rpm until fracture, whilst the instruments in the second group (G2) were rotated at 400 rpm. In the third group (G3), the files were driven under reciprocating movement. The time to fracture for each instrument was measured, and statistical analysis was performed using parametric methods. Reciprocating movement resulted in a significantly longer cyclic fatigue life (P < 0.05). Moreover, operating rpm was a significant factor affecting cyclic fatigue life (P < 0.05); instruments used at a rotational speed of 400 rpm (approximately 25 s) failed more rapidly than those used at 250 rpm (approximately 95 s). Movement kinematics is amongst the factors determining the resistance of rotary NiTi instruments to cyclic fracture. Moreover, the reciprocating movement promoted an extended cyclic fatigue life of the F2 ProTaper instrument in comparison with conventional rotation.
Erdmann, Włodzimierz S; Giovanis, Vassilis; Aschenbrenner, Piotr; Kiriakis, Vaios; Suchanowski, Andrzej
2017-01-01
This paper aims to describe and compare methods for the topographic analysis of racing courses in all disciplines of alpine skiing, for the purposes of obtaining terrain geomorphology (snowless and with snow), course geometry, and competitors' runs. The review presents specific methods and instruments in the order of their historical appearance, as follows: (1) the azimuth method, using a compass, tape and goniometer; (2) the optical method, using a geodetic theodolite, laser and photocells; (3) the triangulation method, with the aid of a tape and goniometer; (4) the image method, using video cameras; (5) the differential global positioning system and carrier-phase global positioning system methods. The described methods have been used in homologation procedures, in training sessions, during competitions at the local level and during International Ski Federation World Championships and World Cups. Some methods were used together. In order to provide detailed data on course setting and skiers' runs, it is recommended to analyse course geometry and the kinematic data of competitors' runs for all important competitions.
Rota, Cristina; Biondi, Marco; Trenti, Tommaso
2011-09-26
Aution Max AX-4030, a test strip analyzer recently introduced to the market, represents an upgrade of the Aution Max AX-4280 widely employed for urinalysis. This new instrument model can hold two different test strips at the same time. In the present study the two instruments were compared using Uriflet 9UB and the recently produced Aution Sticks 10PA urine strips, the latter featuring an additional test area for the measurement of urinary creatinine. Imprecision and correlation between instruments and strips were evaluated for chemical-physical parameters. Accuracy was evaluated for protein, glucose and creatinine by comparing the semi-quantitative results to those obtained by quantitative methods. The well-known interference of high ascorbic acid levels on urine glucose test strip determination was evaluated; the influence of ascorbic acid on protein and creatinine determination was also evaluated. The two instruments demonstrated comparable performance: precision and correlation between instruments and strips, evaluated for chemical-physical parameters, were always good. Furthermore, accuracy was always very good: semi-quantitative measurements of protein and glucose were highly correlated with those obtained by quantitative methods. Moreover, the semi-quantitative measurements of creatinine, employing Aution Sticks 10PA urine strips, were highly comparable with quantitative results. 10PA urine strips are suitable for urine creatinine determination, with the possibility of correcting urinalysis results for urinary creatinine concentration whenever necessary and of calculating the protein/creatinine ratio. Further studies should be carried out to evaluate the effectiveness and appropriateness of creatinine semi-quantitative analysis.
Sample preparation of metal alloys by electric discharge machining
NASA Technical Reports Server (NTRS)
Chapman, G. B., II; Gordon, W. A.
1976-01-01
Electric discharge machining was investigated as a noncontaminating method of comminuting alloys for subsequent chemical analysis. Particulate dispersions in water were produced from bulk alloys at a rate of about 5 mg/min by using a commercially available machining instrument. The utility of this approach was demonstrated by results obtained when acidified dispersions were substituted for true acid solutions in an established spectrochemical method. The analysis results were not significantly different for the two sample forms. Particle size measurements and preliminary results from other spectrochemical methods which require direct aspiration of liquid into flame or plasma sources are reported.
Effects of thermal deformation on optical instruments for space application
NASA Astrophysics Data System (ADS)
Segato, E.; Da Deppo, V.; Debei, S.; Cremonese, G.
2017-11-01
Optical instruments for space missions operate in a hostile environment, so it is necessary to accurately study the effects of variations in ambient parameters on the equipment. Optical instruments are particularly sensitive to ambient conditions, especially temperature. Temperature variations can cause dilatations and misalignments of the optical elements and can also give rise to dangerous stresses in the optics. The resulting displacements and deformations degrade the quality of the sampled images. In this work a method for studying the effects of temperature variations on the performance of an imaging instrument is presented. The optics and their mountings are modelled and processed by a thermo-mechanical finite-element model (FEM) analysis; the output data, which describe the deformations of the optical element surfaces, are then processed with an ad hoc MATLAB routine: a non-linear least-squares optimization algorithm is adopted to determine the surface equations (plane, spherical, nth-order polynomial) which best fit the data. The resulting mathematical surface representations are then imported directly into ZEMAX for sequential ray-tracing analysis. The results are the variations of the spot diagrams, the MTF curves and the diffraction ensquared energy under the simulated thermal loads. This method has been successfully applied to the Stereo Camera for the BepiColombo mission, reproducing expected operative conditions. The results help in designing and comparing different optical housing systems for a feasible solution, and show that it is preferable to use kinematic constraints on prisms and lenses to minimize the variation of the optical performance of the Stereo Camera.
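The surface-fitting step can be illustrated with a linear least-squares sphere fit to nodal coordinates (the paper uses a non-linear optimizer and additional surface types; the FEM nodes below are synthetic points sampled from a known sphere):

```python
import numpy as np

def fit_sphere(points: np.ndarray):
    """Linear least-squares sphere fit to surface nodes.
    Rewrites x^2+y^2+z^2 = 2ax + 2by + 2cz + d and solves for
    the centre (a, b, c) and radius sqrt(d + a^2 + b^2 + c^2)."""
    A = np.column_stack([2 * points, np.ones(len(points))])
    f = (points ** 2).sum(axis=1)
    (a, b, c, d), *_ = np.linalg.lstsq(A, f, rcond=None)
    centre = np.array([a, b, c])
    radius = np.sqrt(d + a*a + b*b + c*c)
    return centre, radius

# Synthetic "deformed mirror" nodes on a sphere of radius 100 mm, centred at origin
rng = np.random.default_rng(0)
theta = rng.uniform(0.1, 1.2, 200)
phi = rng.uniform(0, 2 * np.pi, 200)
pts = 100.0 * np.column_stack([np.sin(theta) * np.cos(phi),
                               np.sin(theta) * np.sin(phi),
                               np.cos(theta)])
centre, radius = fit_sphere(pts)
```

Comparing the fitted radius and centre before and after applying a thermal load is one way to quantify how much a mirror or lens surface has drifted from its nominal figure.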
Analysis of the Triglycerides of Some Vegetable Oils.
ERIC Educational Resources Information Center
Farines, Marie; And Others
1988-01-01
Explains that triglycerides consist of a mixture of different compounds, depending on the total number of fatty acid constituents. Details the method and instrumentation necessary for students to analyze a vegetable oil for its triglyceride content. Describes sample results. (CW)
Roughness analysis of grade breaks at intersections : final report.
DOT National Transportation Integrated Search
1993-03-01
A method to analyze the roughness of grade breaks at highway intersections is proposed. Although there are a variety of instruments to physically measure the road roughness, there are no known methodologies to analyze profile roughness during the des...
Remote In-Situ Quantitative Mineralogical Analysis Using XRD/XRF
NASA Technical Reports Server (NTRS)
Blake, D. F.; Bish, D.; Vaniman, D.; Chipera, S.; Sarrazin, P.; Collins, S. A.; Elliott, S. T.
2001-01-01
X-Ray Diffraction (XRD) is the most direct and accurate method for determining mineralogy. The CHEMIN XRD/XRF instrument has shown promising results on a variety of mineral and rock samples. Additional information is contained in the original extended abstract.
Method and Apparatus for Concentrating Vapors for Analysis
Grate, Jay W.; Baldwin, David L.; Anheier, Jr., Norman C.
2008-10-07
An apparatus and method are disclosed for pre-concentrating gaseous vapors for analysis. The invention finds application in conjunction with, e.g., analytical instruments where low detection limits for gaseous vapors are desirable. Vapors sorbed and concentrated within the bed of the apparatus can be thermally desorbed achieving at least partial separation of vapor mixtures. The apparatus is suitable, e.g., for preconcentration and sample injection, and provides greater resolution of peaks for vapors within vapor mixtures, yielding detection levels that are 10-10,000 times better than for direct sampling and analysis systems. Features are particularly useful for continuous unattended monitoring applications.
Comparison of UPLC and HPLC methods for determination of vitamin C.
Klimczak, Inga; Gliszczyńska-Świgło, Anna
2015-05-15
Ultra performance liquid chromatography (UPLC) and high-performance liquid chromatography (HPLC) methods for determination of ascorbic acid (AA) and total AA (TAA) contents (as the sum of AA and dehydroascorbic acid (DHAA) after its reduction to AA) in fruit beverages and in pharmaceutical preparations were compared. Both methods are rapid: total time of analysis was 15 and 6 min for the HPLC and UPLC methods, respectively. The methods were validated in terms of linearity, instrument precision, limits of detection (LOD) and quantification (LOQ), accuracy and recovery. Intra- and inter-day instrument precisions for fruit juices, expressed as RSD, were 2.2% and 2.4% for HPLC, respectively, and 1.7% and 1.9% for UPLC, respectively. For vitamin C tablets, inter- and intra-day precisions were 0.4% and 0.5%, respectively (HPLC), and 0.5% and 0.3%, respectively (UPLC). Both methods were sensitive: LOD was 0.049 μg/mL for HPLC and 0.024 μg/mL for UPLC, while LOQs were 0.149 and 0.073 μg/mL for HPLC and UPLC, respectively. These methods could be useful in the routine qualitative and quantitative analysis of AA or TAA in pharmaceutical preparations or fruit beverages. However, the UPLC method is more sensitive, faster, and consumes less eluent. Copyright © 2014 Elsevier Ltd. All rights reserved.
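LOD and LOQ figures like those quoted above are commonly derived from the calibration line using the ICH convention LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the regression and S its slope. A minimal sketch — the function name and the calibration numbers below are invented for illustration, not taken from the study:

```python
import numpy as np

def lod_loq(conc, response):
    """ICH-style detection/quantification limits from a calibration line.

    LOD = 3.3 * sigma / slope and LOQ = 10 * sigma / slope, where sigma
    is the standard deviation of the regression residuals and slope is
    the calibration sensitivity.
    """
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical standards (µg/mL) and peak areas:
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
resp = np.array([10.1, 19.8, 30.2, 39.9, 50.0])
lod, loq = lod_loq(conc, resp)
```

Note that σ can alternatively be taken from blank measurements; the calibration-residual variant is shown because it needs no extra data.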
Vieira, Maria Aparecida; Ohara, Conceição Vieira da Silva; de Domenico, Edvane Birelo Lopes
2016-01-01
Abstract Objective: to construct an instrument for the assessment of graduates of undergraduate nursing courses and to validate this instrument through the consensus of specialists. Method: methodological study. In order to elaborate the instrument, documental analysis and a literature review were undertaken. Validation took place through use of the Delphi Conference, between September 2012 and September 2013, in which 36 specialists from Brazilian Nursing participated. In order to analyze reliability, the Cronbach alpha coefficient, the item/total correlation, and the Pearson correlation coefficient were calculated. Results: the instrument was constructed with the participation of specialist nurses representing all regions of Brazil, with experience in lecturing and research. The first Delphi round led to changes in the first instrument, which was restructured and submitted to another round, with a response rate of 94.44%. In the second round, the instrument was validated with a Cronbach alpha of 0.75. Conclusion: the final instrument possessed three dimensions related to the characterization of the graduate, insertion in the job market, and evaluation of the professional training process. This instrument may be used across the territory of Brazil as it is based on the curricular guidelines and contributes to the process of regulation of the quality of the undergraduate courses in Nursing. PMID:27305184
NASA Astrophysics Data System (ADS)
Phedorin, M. A.; Bobrov, V. A.; Goldberg, E. L.; Navez, J.; Zolotaryov, K. V.; Grachev, M. A.
2000-06-01
Sediments of Lake Baikal obtained on top of the underwater Akademichesky Ridge for reconstruction of the palaeoclimates of the Holocene and Upper Pleistocene were subjected to elemental analysis with three methods: (i) synchrotron radiation X-ray fluorescence analysis (SR-XFA); (ii) instrumental neutron activation analysis (INAA); (iii) inductively coupled plasma mass spectrometry (ICP-MS). Comparison of the results obtained is accompanied by statistical tests and shows that, owing to its high sensitivity, simplicity, and non-destructive nature, SR-XFA can be recommended as the method of choice in the search for geochemical signals of changing palaeoclimates.
Polarization analysis for magnetic field imaging at RADEN in J-PARC/MLF
NASA Astrophysics Data System (ADS)
Shinohara, Takenao; Hiroi, Kosuke; Su, Yuhua; Kai, Tetsuya; Nakatani, Takeshi; Oikawa, Kenichi; Segawa, Mariko; Hayashida, Hirotoshi; Parker, Joseph D.; Matsumoto, Yoshihiro; Zhang, Shuoyuan; Kiyanagi, Yoshiaki
2017-06-01
Polarized neutron imaging is an attractive method for visualizing magnetic fields in a bulk object or in free space. In this technique polarization of neutrons transmitted through a sample is analyzed position by position to produce an image of the polarization distribution. In particular, the combination of three-dimensional spin analysis and the use of a pulsed neutron beam is very effective for the quantitative evaluation of both field strength and direction by means of the analysis of the wavelength dependent polarization vector. Recently a new imaging instrument “RADEN” has been constructed at the beam line of BL22 of the Materials and Life Science Experimental Facility (MLF) at J-PARC, which is dedicated to energy-resolved neutron imaging experiments. We have designed a polarization analysis apparatus for magnetic field imaging at the RADEN instrument and have evaluated its performance.
Designing instrumented walker to measure upper-extremity's efforts: A case study.
Khodadadi, Mohammad; Baniasad, Mina Arab; Arazpour, Mokhtar; Farahmand, Farzam; Zohoor, Hassan
2018-02-26
Shoulder pain is highly prevalent among walker users with spinal cord injury (SCI), and the options for measuring grip forces in walkers economically are limited, which drove the need to create one. This article describes a method for obtaining the upper extremities' forces and moments in a person with SCI by designing an appropriate instrumented walker. First, since commercial multidirectional load cells are too expensive, custom load cells were fabricated. Finally, a complete gait analysis using VICON motion analysis and the inverse-dynamics method was performed to measure the upper extremities' efforts. The results for a person with SCI using a two-wheel walker at low and high heights and a basic walker show higher shoulder and elbow flexion-extension moments, higher shoulder forces in the superior-inferior direction, and higher elbow and wrist forces in the anterior-posterior direction. The results did not differ much between the two types of walker. Using the proposed method, the upper extremities' forces and moments were obtained and the results were compared between the two walkers.
Determination of N epsilon-(carboxymethyl)lysine in foods and related systems.
Ames, Jennifer M
2008-04-01
The sensitive and specific determination of advanced glycation end products (AGEs) is of considerable interest because these compounds have been associated with pro-oxidative and proinflammatory effects in vivo. AGEs form when carbonyl compounds, such as glucose and its oxidation products, glyoxal and methylglyoxal, react with the epsilon-amino group of lysine and the guanidino group of arginine to give structures including N epsilon-(carboxymethyl)lysine (CML), N epsilon-(carboxyethyl)lysine, and hydroimidazolones. CML is frequently used as a marker for AGEs in general. It exists in both the free and peptide-bound forms. Analysis of CML involves its extraction from the food (including protein hydrolysis to release any peptide-bound adduct) and determination by immunochemical or instrumental means. Various factors must be considered at each step of the analysis. Extraction, hydrolysis, and sample clean-up are all less straightforward for food samples than for plasma and tissue. The immunochemical and instrumental methods all have their advantages and disadvantages, and no perfect method exists. Currently, different procedures are being used in different laboratories, and there is an urgent need to compare, improve, and validate methods.
Targeted Proteomic Quantification on Quadrupole-Orbitrap Mass Spectrometer*
Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno
2012-01-01
There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundreds, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it is constituted of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment. 
The increased quality of quadrupole-orbitrap data has the potential to improve existing protein quantification methods in complex samples and address the pressing demand of systems biology or biomarker evaluation studies. PMID:22962056
NASA Astrophysics Data System (ADS)
Haaland, Stein; Vaivads, Andris; Eriksson, Elin
2016-04-01
In the old Norse mythology Thor was a hammer-wielding god associated with thunder and lightning. THOR is also an acronym for Turbulence Heating ObserveR - a planned space mission dedicated to the study of space plasma turbulence. Whereas the mythological Thor did most of his work with a single tool, Mjølnir - his hammer, the modern version of THOR is far more versatile. The proposed THOR spacecraft comes with a comprehensive package of instruments to explore the energy dissipation and particle energization taking place in turbulent plasma environments. This paper presents a more detailed investigation of some of the analysis methods listed in the submitted THOR proposal. To demonstrate the methods, we have used data from existing spacecraft missions like Cluster and MMS to examine and compare single-spacecraft and multi-spacecraft methods to establish proper frames. The presented analysis methods are based on fundamental plasma laws, such as conservation of mass, momentum and energy, and do not require any triangulation or gradients based on multiple spacecraft. Our experience based on Cluster and MMS results shows that a well-equipped single-spacecraft platform, like the proposed THOR mission, very often provides better and less ambiguous results than a constellation of many spacecraft with less capable instrumentation. Limitations in underlying assumptions, such as planarity and linearity, as well as non-optimal spacecraft separation and configurations, often limit the possibility to utilize multi-spacecraft methods. We also investigate the role of time resolution and dynamical range of the measurements used in the methods. Since the particle instruments onboard THOR will have a much better time resolution than existing magnetospheric satellite missions, we infer that THOR will be far better suited to resolve time evolution in plasma structures. This is of particular importance in the solar wind and magnetosheath, where frame velocities can be very high.
With a larger dynamical range in many of the measurements, and thus the ability to utilize a larger part of the distribution function to calculate moments, the accuracy of key plasma parameters will also be better in THOR measurements.
A novel tensile test method to assess texture and gaping in salmon fillets.
Ashton, Thomas J; Michie, Ian; Johnston, Ian A
2010-05-01
A new tensile strength method was developed to quantify the force required to tear a standardized block of Atlantic salmon muscle, with the aim of identifying those samples more prone to factory downgrading as a result of softness and fillet gaping. The new method effectively overcomes problems of sample attachment encountered with previous tensile strength tests. The repeatability, sensitivity, and predictability of the new technique were evaluated against other common instrumental texture measurement methods. The relationship between sensory assessments of firmness and parameters from the instrumental texture methods was also determined. Data from the new method were shown to have the strongest correlations with gaping severity (r = -0.514, P < 0.001) and the highest level of repeatability when analyzing cold-smoked samples. The Warner Bratzler shear method gave the most repeatable data from fresh samples and had the highest correlations between fresh and smoked product from the same fish (r = 0.811, P < 0.001). A hierarchical cluster analysis placed the tensile test in the top cluster, alongside the Warner Bratzler method, demonstrating that it also yields adequate data with respect to these tests. None of the tested sensory analysis attributes showed significant relationships to mechanical tests except fillet firmness, with correlations (r) of 0.42 for cylinder probe maximum force (P = 0.005) and 0.31 for tensile work (P = 0.04). It was concluded that the tensile test method developed provides an important addition to the available tools for mechanical analysis of salmon quality, particularly with respect to the prediction of gaping during factory processing, which is a serious commercial problem. This novel, reliable method of measuring flesh tensile strength in salmon provides data of relevance to gaping.
Tian, Feng; Ni, Pengsheng; Mulcahey, M J; Hambleton, Ronald K; Tulsky, David; Haley, Stephen M; Jette, Alan M
2014-11-01
To use item response theory (IRT) methods to link scores from 2 recently developed contemporary functional outcome measures, the adult Spinal Cord Injury-Functional Index (SCI-FI) and the Pedi SCI (both the parent version and the child version). Secondary data analysis of the physical functioning items of the adult SCI-FI and the Pedi SCI instruments. We used a nonequivalent group design with items common to both instruments and the Stocking-Lord method for the linking. Linking was conducted so that the adult SCI-FI and Pedi SCI scaled scores could be compared. Community. This study included a total sample of 1558 participants. Pedi SCI items were administered to a sample of children (n=381) with SCI aged 8 to 21 years, and of parents/caregivers (n=322) of children with SCI aged 4 to 21 years. Adult SCI-FI items were administered to a sample of adults (n=855) with SCI aged 18 to 92 years. Not applicable. Five scales common to both instruments were included in the analysis: Wheelchair, Daily Routine/Self-care, Daily Routine/Fine Motor, Ambulation, and General Mobility functioning. Confirmatory factor analysis and exploratory factor analysis results indicated that the 5 scales are unidimensional. A graded response model was used to calibrate the items. Misfitting items were identified and removed from the item banks. Items that function differently between the adult and child samples (ie, exhibit differential item functioning) were identified and removed from the common items used for linking. Domain scores from the Pedi SCI instruments were transformed onto the adult SCI-FI metric. This IRT linking allowed estimation of adult SCI-FI scale scores based on Pedi SCI scale scores and vice versa; therefore, it provides clinicians with a means of tracking long-term functional data for children with an SCI across their entire lifespan. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Lehotay, Steven J; Lightfield, Alan R
2018-01-01
The way to maximize scope of analysis, sample throughput, and laboratory efficiency in the monitoring of veterinary drug residues in food animals is to determine as many analytes as possible as fast as possible in as few methods as possible. Capital and overhead expenses are also reduced by using fewer instruments in the overall monitoring scheme. Traditionally, the highly polar aminoglycoside antibiotics require different chromatographic conditions from other classes of drugs, but in this work, we demonstrate that an ion-pairing reagent (sodium 1-heptanesulfonate) added to the combined final extracts from two sample preparation methods attains good separation of 174 targeted drugs, including 9 aminoglycosides, in the same 10.5-min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis. The full method was validated in bovine kidney, liver, and muscle tissues according to US regulatory protocols, and 137-146 (79-84%) of the drugs gave between 70 and 120% average recoveries with ≤ 25% RSDs in the different types of tissues spiked at 0.5, 1, and 2 times the regulatory levels of interest (10-1000 ng/g depending on the drug). This method increases sample throughput and the possible number of drugs monitored in the US National Residue Program, and requires only one UHPLC-MS/MS method and instrument for analysis rather than two by the previous scheme. Graphical abstract Outline of the streamlined approach to monitor 174 veterinary drugs, including aminoglycosides, in bovine tissues by combining two extracts of the same sample with an ion-pairing reagent for analysis by UHPLC-MS/MS.
ERIC Educational Resources Information Center
Crocker, Linda M.; Mehrens, William A.
Four new methods of item analysis were used to select subsets of items which would yield measures of attitude change. The sample consisted of 263 students at Michigan State University who were tested on the Inventory of Beliefs as freshmen and retested on the same instrument as juniors. Item change scores and total change scores were computed for…
USDA-ARS?s Scientific Manuscript database
RATIONALE: Analyses for identification and quantification of regulated veterinary drug residues in foods are usually achieved by liquid chromatography coupled to tandem mass spectrometry. The instrument method requires the selection of characteristic ions, but structure elucidation is seldom perform...
Paiva, Carlos Eduardo; Siquelli, Felipe Augusto Ferreira; Zaia, Gabriela Rossi; de Andrade, Diocésio Alves Pinto; Borges, Marcos Aristoteles; Jácome, Alexandre A; Giroldo, Gisele Augusta Sousa Nascimento; Santos, Henrique Amorim; Hahn, Elizabeth A; Uemura, Gilberto; Paiva, Bianca Sakamoto Ribeiro
2016-01-01
To develop and validate a new multimedia instrument to measure health-related quality of life (HRQOL) in Portuguese-speaking patients with cancer. A mixed-methods study conducted in a large Brazilian Cancer Hospital. The instrument was developed along the following sequential phases: identification of HRQOL issues through qualitative content analysis of individual interviews, evaluation of the most important items according to the patients, review of the literature, evaluation by an expert committee, and pretesting. In sequence, an exploratory factor analysis was conducted (pilot testing, n = 149) to reduce the number of items and to define domains and scores. The psychometric properties of the IQualiV-OG-21 were measured in a large multicentre Brazilian study (n = 323). Software containing multimedia resources was developed to facilitate self-administration of the IQualiV-OG-21; its feasibility and patients' preferences ("paper and pencil" vs. software) were further tested (n = 54). An exploratory factor analysis reduced the 30-item instrument to 21 items. The IQualiV-OG-21 was divided into 6 domains: emotional, physical, existential, interpersonal relationships, functional and financial. The multicentre study confirmed that it was valid and reliable. The electronic multimedia instrument was easy to complete and acceptable to patients. Regarding preferences, 61.1% of patients preferred the electronic format over the paper and pencil format. The IQualiV-OG-21 is a new valid and reliable multimedia HRQOL instrument that is well-understood, even by patients with low literacy skills, and can be answered quickly. It is a useful new tool that can be translated and tested in other cultures and languages.
Parish, Chad M.; Miller, Michael K.
2014-12-09
Nanostructured ferritic alloys (NFAs) exhibit complex microstructures consisting of 100-500 nm ferrite grains, grain boundary solute enrichment, and multiple populations of precipitates and nanoclusters (NCs). Understanding these materials' excellent creep and radiation-tolerance properties requires a combination of multiple atomic-scale experimental techniques. Recent advances in scanning transmission electron microscopy (STEM) hardware and data analysis methods have the potential to revolutionize nanometer- to micrometer-scale materials analysis. These methods are applied to NFAs as a test case and compared to both conventional STEM methods and complementary methods such as scanning electron microscopy and atom probe tomography. In this paper, we review past results and present new results illustrating the effectiveness of latest-generation STEM instrumentation and data analysis.
Sangeux, Morgan; Mahy, Jessica; Graham, H Kerr
2014-01-01
Informed clinical decision making for femoral and/or tibial de-rotation osteotomies requires accurate measurement of patient function through gait analysis and of anatomy through physical examination of bony torsions. Validity of gait analysis has been extensively studied; however, controversy remains regarding the accuracy of physical examination measurements of femoral and tibial torsion. Comparisons between CT-scans and physical examination measurements of femoral neck anteversion (FNA) and external tibial torsion (ETT) were retrospectively obtained for 98 (FNA) and 64 (ETT) patients who attended a tertiary hospital for instrumented gait analysis between 2007 and 2010. The physical examination methods studied for femoral neck anteversion were the trochanteric prominence angle test (TPAT) and the maximum hip rotation arc midpoint (Arc midpoint), and for external tibial torsion the transmalleolar axis (TMA). Results showed that all physical examination measurements differed statistically from the CT-scans (bias(standard deviation): -2(14) for TPAT, -10(12) for Arc midpoint and -16(9) for TMA). Bland and Altman plots showed that method disagreements increased with increasing bony torsions in all cases, but notably for TPAT. Regression analysis showed that only TMA and CT-scan measurement of external tibial torsion demonstrated good (R(2)=57%) correlation. Correlations for both TPAT (R(2)=14%) and Arc midpoint (R(2)=39%) with CT-scan measurements of FNA were limited. We conclude that physical examination should be considered a screening technique rather than a definitive measurement method for FNA and ETT. Further research is required to develop more accurate measurement methods to accompany instrumented gait analysis. Copyright © 2013. Published by Elsevier B.V.
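The bias(standard deviation) figures and Bland-Altman plots referred to above summarize agreement between two measurement methods. A minimal sketch of the underlying computation — the function name and the values used are illustrative, not taken from the study:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics between two measurement methods.

    Returns (bias, sd, lower_loa, upper_loa): the mean difference, its
    standard deviation, and the 95% limits of agreement (bias +/- 1.96 sd).
    """
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, sd, bias - 1.96 * sd, bias + 1.96 * sd
```

The plot itself is the per-subject difference against the per-subject mean of the two methods, with horizontal lines at the bias and the limits of agreement.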
Validation of a moral distress instrument in nurses of primary health care 1
Barth, Priscila Orlandi; Ramos, Flávia Regina Souza; Barlem, Edison Luiz Devos; Dalmolin, Graziele de Lima; Schneider, Dulcinéia Ghizoni
2018-01-01
ABSTRACT Objective: to validate an instrument to identify situations that trigger moral distress in relation to intensity and frequency in primary health care nurses. Method: this is a methodological study carried out with 391 nurses of primary health care, applied to the Brazilian Scale of Moral Distress in Nurses with 57 questions. Validation for primary health care was performed through expert committee evaluation, pre-test, factorial analysis, and Cronbach’s alpha. Results: there were 46 questions validated divided into six constructs: Health Policies, Working Conditions, Nurse Autonomy, Professional ethics, Disrespect to patient autonomy and Work Overload. The instrument had satisfactory internal consistency, with Cronbach’s alpha 0.98 for the instrument, and between 0.96 and 0.88 for the constructs. Conclusion: the instrument is valid and reliable to be used in the identification of the factors that trigger moral distress in primary care nurses, providing subsidies for new research in this field of professional practice. PMID:29791671
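Cronbach's alpha, used above to assess the internal consistency of the instrument and its constructs, can be computed directly from the item-score matrix. A minimal sketch with hypothetical data (not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    """
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```

Items that always move together yield an alpha near 1, which is why values such as the 0.98 reported above indicate high (even redundant) internal consistency.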
2011-01-01
Background Psychometric properties include validity, reliability and sensitivity to change. Establishing the psychometric properties of an instrument which measures three-dimensional human posture is essential prior to applying it in clinical practice or research. Methods This paper reports the findings of a systematic literature review which aimed to 1) identify non-invasive three-dimensional (3D) human posture-measuring instruments; and 2) assess the quality of reporting of the methodological procedures undertaken to establish their psychometric properties, using a purpose-built critical appraisal tool. Results Seventeen instruments were identified, of which nine were supported by research into psychometric properties. Eleven and six papers respectively, reported on validity and reliability testing. Rater qualification and reference standards were generally poorly addressed, and there was variable quality reporting of rater blinding and statistical analysis. Conclusions There is a lack of current research to establish the psychometric properties of non-invasive 3D human posture-measuring instruments. PMID:21569486
Entropy, instrument scan and pilot workload
NASA Technical Reports Server (NTRS)
Tole, J. R.; Stephens, A. T.; Vivaudou, M.; Harris, R. L., Jr.; Ephrath, A. R.
1982-01-01
Correlation and information-theory methods that analyze the relationships between mental loading and the visual scanpath of aircraft pilots are described. The relationships between skill, performance, mental workload, and visual scanning behavior are investigated. The experimental method required pilots to maintain a general aviation flight simulator on a straight and level, constant sensitivity, Instrument Landing System (ILS) course with a low level of turbulence. An additional periodic verbal task whose difficulty increased with frequency was used to increment the subject's mental workload. The subject's lookpoint on the instrument panel during each ten-minute run was computed via a TV oculometer and stored. Several pilots ranging in skill from novices to test pilots took part in the experiment. Analysis of the periodicity of the subject's instrument scan was accomplished by means of correlation techniques. For skilled pilots, the autocorrelation of instrument dwell-time sequences showed the same periodicity as the verbal task. The ability to multiplex simultaneous tasks increases with skill. Thus autocorrelation provides a way of evaluating the operator's skill level.
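The autocorrelation analysis of instrument dwell-time sequences can be sketched as follows. The sequence below is synthetic, constructed only to show that a periodic scan (here, a long dwell every fifth fixation, as a periodic verbal task might induce) produces an autocorrelation peak at the task period:

```python
import numpy as np

def autocorr(x):
    """Normalized autocorrelation of a 1-D sequence (lag 0 -> 1.0)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")
    acf = full[full.size // 2:]  # keep non-negative lags
    return acf / acf[0]

# Synthetic dwell-time sequence with period 5 (seconds per fixation).
dwells = np.tile([1.0, 0.2, 0.2, 0.2, 0.2], 20)
acf = autocorr(dwells)
```

A peak in `acf` at lag 5 (and its multiples) reveals the periodicity, which is how a scan locked to the verbal task would show up in the dwell-time record.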
Distributed behavior model orchestration in cognitive internet of things solution
NASA Astrophysics Data System (ADS)
Li, Chung-Sheng; Darema, Frederica; Chang, Victor
2018-04-01
The introduction of pervasive and ubiquitous instrumentation within Internet of Things (IoT) leads to unprecedented real-time visibility (instrumentation), optimization and fault-tolerance of the power grid, traffic, transportation, water, oil & gas, to give some examples. Interconnecting those distinct physical, people, and business worlds through ubiquitous instrumentation, even though still in its embryonic stage, has the potential to create intelligent IoT solutions that are much greener, more efficient, comfortable, and safer. An essential new direction to materialize this potential is to develop comprehensive models of such systems dynamically interacting with the instrumentation in a feed-back control loop. We describe here opportunities in applying cognitive computing on interconnected and instrumented worlds (Cognitive Internet of Things-CIoT) and call out the system-of-systems trend among distinct but interdependent worlds, and Dynamic Data-Driven Application System (DDDAS)-based methods for advanced understanding, analysis, and real-time decision support capabilities with the accuracy of full-scale models.
Resident participation in neighbourhood audit tools — a scoping review
Hofland, Aafke C L; Devilee, Jeroen; van Kempen, Elise; den Broeder, Lea
2018-01-01
Abstract Background Healthy urban environments require careful planning and a testing of environmental quality that goes beyond statutory requirements. Moreover, it requires the inclusion of resident views, perceptions and experiences that help deepen the understanding of local (public health) problems. To facilitate this, neighbourhoods should be mapped in a way that is relevant to them. One way to do this is participative neighbourhood auditing. This paper provides an insight into availability and characteristics of participatory neighbourhood audit instruments. Methods A scoping review in scientific and grey literature, consisting of the following steps: literature search, identification and selection of relevant audit instruments, data extraction and data charting (including a work meeting to discuss outputs), reporting. Results In total, 13 participatory instruments were identified. The role of residents in most instruments was as ‘data collectors’; only few instruments included residents in other audit activities like problem definition or analysis of data. The instruments identified focus mainly on physical, not social, neighbourhood characteristics. Paper forms containing closed-ended questions or scales were the most often applied registration method. Conclusions The results show that neighbourhood auditing could be improved by including social aspects in the audit tools. They also show that the role of residents in neighbourhood auditing is limited; however, little is known about how their engagement takes place in practice. Developers of new instruments need to balance not only social and physical aspects, but also resident engagement and scientific robustness. Technologies like mobile applications pose new opportunities for participative approaches in neighbourhood auditing. PMID:29346663
Information Management Systems in the Undergraduate Instrumental Analysis Laboratory.
ERIC Educational Resources Information Center
Merrer, Robert J.
1985-01-01
Discusses two applications of Laboratory Information Management Systems (LIMS) in the undergraduate laboratory. They are the coulometric titration of thiosulfate with electrogenerated triiodide ion and the atomic absorption determination of calcium using both analytical calibration curve and standard addition methods. (JN)
Laboratory analytical methods for the determination of the hydrocarbon status of soils (a review)
NASA Astrophysics Data System (ADS)
Pikovskii, Yu. I.; Korotkov, L. A.; Smirnova, M. A.; Kovach, R. G.
2017-10-01
Laboratory analytical methods suitable for determining the hydrocarbon status of soils (a specific soil characteristic comprising information on the total content and qualitative features of soluble (bitumoid) carbonaceous substances and of individual hydrocarbons (polycyclic aromatic hydrocarbons, alkanes, etc.) in the bitumoid, as well as the composition and content of hydrocarbon gases) have been considered. Among the different physicochemical methods of study, attention is focused on those suitable for wide use. Luminescence-bituminological analysis, low-temperature spectrofluorimetry (Shpolskii spectroscopy), infrared (IR) spectroscopy, gas chromatography, chromatography-mass spectrometry, and some other methods have been characterized, along with sample preparation features. The advantages and limitations of each of these methods are described; their efficiency, instrumental complexity, analysis duration, and accuracy are assessed.
Zachariah, Marianne; Seidling, Hanna M; Neri, Pamela M; Cresswell, Kathrin M; Duke, Jon; Bloomrosen, Meryl; Volk, Lynn A; Bates, David W
2011-01-01
Background Medication-related decision support can reduce the frequency of preventable adverse drug events. However, the design of current medication alerts often results in alert fatigue and high over-ride rates, thus reducing any potential benefits. Methods The authors previously reviewed human-factors principles for relevance to medication-related decision support alerts. In this study, instrument items were developed for assessing the appropriate implementation of these human-factors principles in drug–drug interaction (DDI) alerts. User feedback regarding nine electronic medical records was considered during the development process. Content validity, construct validity through correlation analysis, and inter-rater reliability were assessed. Results The final version of the instrument included 26 items associated with nine human-factors principles. Content validation on three systems resulted in the addition of one principle (Corrective Actions) to the instrument and the elimination of eight items. Additionally, the wording of eight items was altered. Correlation analysis suggests a direct relationship between system age and performance of DDI alerts (p=0.0016). Inter-rater reliability indicated substantial agreement between raters (κ=0.764). Conclusion The authors developed and gathered preliminary evidence for the validity of an instrument that measures the appropriate use of human-factors principles in the design and display of DDI alerts. Designers of DDI alerts may use the instrument to improve usability and increase user acceptance of medication alerts, and organizations selecting an electronic medical record may find the instrument helpful in meeting their clinicians' usability needs. PMID:21946241
Shahi, Shahriar; Yavari, Hamid R; Rahimi, Saeed; Reyhani, Mohammad F; Kamarroosta, Zahra; Abdolrahimi, Majid
2009-03-01
The aim of this study was to evaluate the effect of RaCe, FlexMaster and ProFile rotary instruments on smear layer formation by scanning electron microscopy. Eighty-four caries-free freshly extracted human single-rooted teeth were selected and divided into three groups, each containing 28 teeth. The teeth were instrumented with rotary instruments sequentially: Group A: ProFile Rotary Instruments; Group B: FlexMaster Rotary Instruments; and Group C: RaCe Rotary Instruments. Instrumentation was performed by the crown-down method and according to the manufacturer's instructions. The specimens were then examined with SEM according to Hülsmann's classification. One-way ANOVA and a post hoc Tukey test were used for statistical analysis. The results showed that there were no statistically significant differences among the three groups in the coronal third (P = 0.39), but at the apical and middle thirds there were statistically significant differences between the RaCe group and the other groups (P < 0.05). Smear layer in the RaCe group was less than that in the ProFile and FlexMaster groups, but the difference between the ProFile group and FlexMaster group was not statistically significant (P > 0.05). It was concluded that RaCe Rotary Instruments produce less smear layer than FlexMaster and ProFile Rotary Instruments.
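The group comparison described above rests on a one-way ANOVA. As a minimal sketch of that statistic, the F ratio can be computed by hand; the smear-layer scores below are hypothetical illustration data, not the study's measurements:

```python
# One-way ANOVA F statistic computed by hand (no external libraries).
# Illustration only: the scores are invented, not the study's data.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of sample lists."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n_total - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical apical-third smear-layer scores for three instrument groups
profile    = [3, 4, 3, 4, 3]
flexmaster = [3, 3, 4, 4, 4]
race       = [1, 2, 1, 2, 2]

F, df_b, df_w = one_way_anova([profile, flexmaster, race])
print(F, df_b, df_w)  # large F suggests a between-group difference
```

A post hoc test such as Tukey's HSD would then be applied, as in the study, to locate which pairs of groups differ.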
NASA Astrophysics Data System (ADS)
Franz, Heather B.; Trainer, Melissa G.; Wong, Michael H.; Manning, Heidi L. K.; Stern, Jennifer C.; Mahaffy, Paul R.; Atreya, Sushil K.; Benna, Mehdi; Conrad, Pamela G.; Harpold, Dan N.; Leshin, Laurie A.; Malespin, Charles A.; McKay, Christopher P.; Nolan, J. Thomas; Raaen, Eric
2014-06-01
The Sample Analysis at Mars (SAM) instrument suite is the largest scientific payload on the Mars Science Laboratory (MSL) Curiosity rover, which landed in Mars' Gale Crater in August 2012. As a miniature geochemical laboratory, SAM is well-equipped to address multiple aspects of MSL's primary science goal, characterizing the potential past or present habitability of Gale Crater. Atmospheric measurements support this goal through compositional investigations relevant to martian climate evolution. SAM instruments include a quadrupole mass spectrometer, a tunable laser spectrometer, and a gas chromatograph that are used to analyze martian atmospheric gases as well as volatiles released by pyrolysis of solid surface materials (Mahaffy et al., 2012). This report presents analytical methods for retrieving the chemical and isotopic composition of Mars' atmosphere from measurements obtained with SAM's quadrupole mass spectrometer. It provides empirical calibration constants for computing volume mixing ratios of the most abundant atmospheric species and analytical functions to correct for instrument artifacts and to characterize measurement uncertainties. Finally, we discuss differences in volume mixing ratios of the martian atmosphere as determined by SAM (Mahaffy et al., 2013) and Viking (Owen et al., 1977; Oyama and Berdahl, 1977) from an analytical perspective. Although the focus of this paper is atmospheric observations, much of the material concerning corrections for instrumental effects also applies to reduction of data acquired with SAM from analysis of solid samples. The Sample Analysis at Mars (SAM) instrument measures the composition of the martian atmosphere. Rigorous calibration of SAM's mass spectrometer was performed with relevant gas mixtures. Calibration included derivation of a new model to correct for electron multiplier effects. Volume mixing ratios for Ar and N2 obtained with SAM differ from those obtained with Viking. 
Differences between SAM and Viking volume mixing ratios are under investigation.
NASA Astrophysics Data System (ADS)
Haciyakupoglu, Sevilay; Nur Esen, Ayse; Erenturk, Sema
2014-08-01
The purpose of this study is the optimization of experimental parameters for the analysis of a soil matrix by instrumental neutron activation analysis and the quantitative determination of barium, cerium, lanthanum, rubidium, scandium, and thorium in soil samples collected from industrialized urban areas near Istanbul. Samples were irradiated in the TRIGA MARK II Research Reactor of Istanbul Technical University. Two types of reference materials were used to check the accuracy of the applied method. The achieved results were found to be in compliance with the certified values of the reference materials. The calculated En numbers for the mentioned elements were found to be less than 1. The presented data on element concentrations in soil samples will help to trace pollution as an impact of urbanization and industrialization, as well as provide a database for future studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Heinrich, R.R.; Jensen, K.J.
Technical and administrative activities of the Analytical Chemistry Laboratory (ACL) are reported for fiscal year 1984. The ACL is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL is administratively within the Chemical Technology Division, the principal user, but provides technical support for all of the technical divisions and programs at ANL. The ACL has three technical groups - Chemical Analysis, Instrumental Analysis, and Organic Analysis. Under technical activities, 26 projects are briefly described. Under professional activities, a list is presented of publications and reports, oral presentations, awards, and meetings attended. 6 figs., 2 tabs.
Albarracín, Dolores; Wilson, Kristina; Durantini, Marta R.; Sunderrajan, Aashna; Livingood, William
2016-01-01
Objective A randomized control trial with 722 eligible clients from a health department in the State of Florida was conducted to identify a simple, effective meta-intervention to increase completion of an HIV-prevention counseling program. Method The overall design involved two factors representing an empowering and an instrumental message, as well as an additional factor indicating the presence or absence of expectations about the counseling. Completion of the three-session counseling was determined by recording attendance. Results A logistic regression analysis with the three factors of empowering message, instrumental message, and presence of mediator measures, as well as all interactions, revealed significant interactions between instrumental and empowering messages and between instrumental messages and presence of mediator measures. Results indicated that (a) the instrumental message alone produced more completion than any other message, and (b) when mediators were not measured, including the instrumental message led to greater completion. Conclusions The overall gain in completion as a result of the instrumental message was 16%, implying success in the intended facilitation of counseling completion. The measures of mediators did not detect any experimental effects, probably because the effects occurred without much conscious awareness. PMID:27786499
Double difference method in deep inelastic neutron scattering on the VESUVIO spectrometer
NASA Astrophysics Data System (ADS)
Andreani, C.; Colognesi, D.; Degiorgi, E.; Filabozzi, A.; Nardone, M.; Pace, E.; Pietropaolo, A.; Senesi, R.
2003-02-01
The principles of the Double Difference (DD) method, applied to the neutron spectrometer VESUVIO, are discussed. VESUVIO, an inverse geometry spectrometer operating at the ISIS pulsed neutron source in the eV energy region, has been specifically designed to measure single-particle dynamical properties in condensed matter. The width of the nuclear resonance of the absorbing filter, used for the neutron energy analysis, provides the most important contribution to the energy resolution of inverse geometry instruments. In this paper, the DD method, which is based on a linear combination of two measurements recorded with filter foils of the same resonance material but of different thickness, is shown to significantly improve the instrumental energy resolution, as compared with the Single Difference (SD) method. The asymptotic response functions, derived through Monte Carlo simulations for polycrystalline Pb and ZrH2 samples, are analysed for both DD and SD methods and compared with the experimental ones for the Pb sample. The response functions have been modelled for two distinct experimental configurations of the VESUVIO spectrometer, employing 6Li-glass neutron detectors and NaI γ detectors revealing the γ-ray cascade from the (n,γ) reaction, respectively. The DD method appears to be an effective experimental procedure for Deep Inelastic Neutron Scattering measurements on the VESUVIO spectrometer, since it narrows the experimental resolution function of the instrument in both the 6Li-glass neutron detector and γ detector configurations.
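The linear-combination idea behind the DD method can be illustrated numerically. The sketch below assumes an idealized Lorentzian resonance and illustrative optical depths (not VESUVIO's actual filter parameters): subtracting a scaled thick-filter absorption profile from a thin-filter one cancels the wings, where absorption is linear in thickness, while the saturated core survives, narrowing the effective resolution function.

```python
# Toy model of the Double Difference (DD) combination.
# Assumptions (illustrative, not instrument values): Lorentzian resonance,
# thin filter of optical depth 2.0, thick filter 3x thicker, DD weight 1/3.
import math

def lorentzian(e, gamma=1.0):
    return 1.0 / (1.0 + (e / gamma) ** 2)

def absorption(e, n_sigma):
    # Fraction of neutrons absorbed by a filter of peak optical depth n_sigma
    return 1.0 - math.exp(-n_sigma * lorentzian(e))

def fwhm(profile, energies):
    half = max(profile) / 2.0
    above = [e for e, p in zip(energies, profile) if p >= half]
    return max(above) - min(above)

energies = [i * 0.01 for i in range(-500, 501)]
sd = [absorption(e, 2.0) for e in energies]        # thin filter (SD response)
thick = [absorption(e, 6.0) for e in energies]     # 3x thicker filter
dd = [a - b / 3.0 for a, b in zip(sd, thick)]      # DD linear combination

print(fwhm(sd, energies), fwhm(dd, energies))  # DD profile is narrower
```

In the linear regime the two terms cancel exactly, so only the saturated center of the resonance contributes, which is why the DD response is narrower than the SD one.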
Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.
2015-10-30
The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
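The "range as a percentage of the mean" summary quoted above (26% to 42%) is straightforward to compute; a minimal sketch with hypothetical participant results:

```python
# Range-about-the-mean metric for inter-laboratory spread.
# The drop-height results below are hypothetical, not IDCA data.
def range_variation_pct(values):
    mean = sum(values) / len(values)
    return 100.0 * (max(values) - min(values)) / mean

# Hypothetical impact-sensitivity drop heights (cm) from five participants
results = [18.0, 21.5, 24.0, 20.0, 22.5]
print(round(range_variation_pct(results), 1))
```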
Hassoun, Abdo; Karoui, Romdhane
2017-06-13
Although fish and other seafoods are among the most vulnerable and perishable products, they provide a wide range of health-promoting compounds. Recently, the growing interest of consumers in food quality and safety issues has contributed to the increasing demand for sensitive and rapid analytical technologies. Several traditional physicochemical, textural, sensory, and electrical methods have been used to evaluate the freshness and authentication of fish and other seafood products. Despite the importance of these standard methods, they are expensive and time-consuming, and often susceptible to large sources of variation. Recently, spectroscopic methods and other emerging techniques have shown great potential due to speed of analysis, minimal sample preparation, high repeatability, low cost, and, most of all, the fact that these techniques are noninvasive and nondestructive and, therefore, could be applied to any online monitoring system. This review first briefly describes the basic principles of multivariate data analysis, followed by the most common traditional methods used for the determination of the freshness and authenticity of fish and other seafood products. A special focus is put on the use of rapid and nondestructive techniques (spectroscopic techniques and instrumental sensors) to address several issues related to the quality of these products. Moreover, the advantages and limitations of each technique are reviewed and some perspectives are also given.
Isolation of Circulating Tumor Cells by Dielectrophoresis
Gascoyne, Peter R. C.; Shim, Sangjo
2014-01-01
Dielectrophoresis (DEP) is an electrokinetic method that allows intrinsic dielectric properties of suspended cells to be exploited for discrimination and separation. It has emerged as a promising method for isolating circulating tumor cells (CTCs) from blood. DEP isolation of CTCs is independent of cell surface markers. Furthermore, isolated CTCs are viable and can be maintained in culture, suggesting that DEP methods should be more generally applicable than antibody-based approaches. The aim of this article is to review and synthesize, for both oncologists and biomedical engineers interested in CTC isolation, the pertinent characteristics of DEP and CTCs, and to promote an understanding of the factors involved in realizing DEP-based instruments having both sufficient discrimination and throughput to allow routine analysis of CTCs in clinical practice. The article brings together: (a) the principles of DEP; (b) the biological basis for the dielectric differences between CTCs and blood cells; (c) why such differences are expected to be present for all types of tumors; and (d) instrumentation requirements to process 10 mL blood specimens in less than 1 h to enable routine clinical analysis. The force equilibrium method of dielectrophoretic field-flow fractionation (DEP-FFF) is shown to offer higher discrimination and throughput than earlier DEP trapping methods and to be applicable to clinical studies. PMID:24662940
A new cation-exchange method for accurate field speciation of hexavalent chromium
Ball, J.W.; McCleskey, R. Blaine
2003-01-01
A new method for field speciation of Cr(VI) has been developed to meet present stringent regulatory standards and to overcome the limitations of existing methods. The method consists of passing a water sample through strong acid cation-exchange resin at the field site, where Cr(III) is retained while Cr(VI) passes into the effluent and is preserved for later determination. The method is simple, rapid, portable, and accurate, and makes use of readily available, inexpensive materials. Cr(VI) concentrations are determined later in the laboratory using any elemental analysis instrument sufficiently sensitive to measure the Cr(VI) concentrations of interest. The new method allows measurement of Cr(VI) concentrations as low as 0.05 μg l-1, storage of samples for at least several weeks prior to analysis, and use of readily available analytical instrumentation. Cr(VI) can be separated from Cr(III) between pH 2 and 11 at Cr(III)/Cr(VI) concentration ratios as high as 1000. The new method has demonstrated excellent comparability with two commonly used methods, the Hach Company direct colorimetric method and USEPA method 218.6. The new method is superior to the Hach direct colorimetric method owing to its relative sensitivity and simplicity. The new method is superior to USEPA method 218.6 in the presence of Fe(II) concentrations up to 1 mg l-1 and Fe(III) concentrations up to 10 mg l-1. Time stability of preserved samples is a significant advantage over the 24-h time constraint specified for USEPA method 218.6.
Material Analysis and Identification
NASA Technical Reports Server (NTRS)
2004-01-01
KeyMaster Technologies, Inc., develops and markets specialized, hand-held X-ray fluorescence (XRF) instruments and unique tagging technology used to identify and authenticate materials or processes. NASA first met with this Kennewick, Washington-based company as the Agency began seeking companies to develop a hand-held instrument that would detect data matrix symbols on parts covered by paint and other coatings. Since the Federal Aviation Administration was also searching for methods to detect and eliminate the use of unapproved parts, it recommended that NASA and KeyMaster work together to develop a technology that would benefit both agencies.
NASA Astrophysics Data System (ADS)
Rubinson, Judith F.; Neyer-Hilvert, Jennifer
1997-09-01
A laboratory experiment using a gas chromatography/mass selective detection method has been developed for the isolation, identification, and quantitation of fatty acid content of commercial fats and oils. Results for corn, nutmeg, peanut, and safflower oils are compared with literature values, and the results for corn oil are compared for two different trials of the experiment. In addition, a number of variations on the experiment are suggested including possible extension of the experiment for use in an instrumental analysis course.
Instrumental and atmospheric background lines observed by the SMM gamma-ray spectrometer
NASA Technical Reports Server (NTRS)
Share, G. H.; Kinzer, R. L.; Strickman, M. S.; Letaw, J. R.; Chupp, E. L.
1989-01-01
Preliminary identifications of instrumental and atmospheric background lines detected by the gamma-ray spectrometer on NASA's Solar Maximum Mission satellite (SMM) are presented. The long-term and stable operation of this experiment has provided data of high quality for use in this analysis. Methods are described for identifying radioactive isotopes which use their different decay times. Temporal evolution of the features are revealed by spectral comparisons, subtractions, and fits. An understanding of these temporal variations has enabled the data to be used for detecting celestial gamma-ray sources.
Cyclic fatigue resistance of four nickel-titanium rotary instruments: a comparative study
Pedullà, Eugenio; Plotino, Gianluca; Grande, Nicola Maria; Pappalardo, Alfio; Rapisarda, Ernesto
2012-01-01
Summary Aims The aim of this study was to investigate the cyclic fatigue resistance of four nickel-titanium rotary (NTR) instruments produced by a new method or by traditional grinding processes. Methods Four NTR instruments from different brands were selected: group 1, Twisted File, produced by a new thermal treatment of nickel-titanium alloy; group 2, Revo-S SU; group 3, Mtwo; and group 4, BioRaCe BR3, produced by traditional grinding processes. A total of 80 instruments (20 for each group) were tested for cyclic fatigue resistance inside a curved artificial canal with a 60 degree angle of curvature and a 5 mm radius of curvature. Time to fracture (TtF) from the start of the test until the moment of file breakage and the length of the fractured tip were recorded for each instrument. Means and standard deviations (SD) of TtF and fragment length were calculated. Data were subjected to one-way analysis of variance (ANOVA). Results Group 1 (Twisted File) showed the highest mean TtF. The cyclic fatigue resistance of Twisted File and Mtwo was significantly higher than that of group 2 (Revo-S SU) and group 4 (BioRaCe BR3), while no significant differences were found between group 1 (Twisted File) and group 3 (Mtwo) or between group 2 (Revo-S SU) and group 4 (BioRaCe BR3). Conclusions The cyclic fatigue resistance of Twisted File was significantly higher than that of instruments produced with traditional grinding processes, with the exception of Mtwo files. PMID:23087787
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strunk, W.D.
1987-01-01
Personnel at the Oak Ridge National Laboratory were tasked by the US Navy to assist in establishing a maintenance monitoring program for machinery aboard surface ships. Given the number of surface ships, the variety of locations in which they operate, the different types of equipment (rotating and reciprocating, as well as instrumentation), and the different procedures which control the operation and maintenance of a ship, it can be seen, apart from the logistics of organizing such a monitoring program, that the technical issues are as varied and numerous as the ships themselves. Unique methods and procedures have been developed to perform the tasks required on a large scale. Among the specific tasks and technical issues addressed were the development and installation of a data collection and communication instrumentation system for each port, the qualification of measurement methodologies and techniques, the establishment of computer data bases, the evaluation of the instrumentation used, training of civilian and military personnel, development of machinery condition assessment aids using machine design and modal analysis information, and development of computer displays. After these tasks were completed and the appropriate resolution integrated into the program, the final task was the development of a method to continually evaluate the effectiveness of the program, using actual maintenance records.
Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.
Sugino, T; Kawahira, H; Nakamura, R
2014-09-01
Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. The analysis comprised division of the surgical procedure, task progress analysis, and task efficiency analysis. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated based on experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
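The efficiency metrics described (mean velocity, acceleration, and an approximate ellipse of the location distribution) can be sketched as follows. The trajectory data and the choice of a 1-sigma covariance ellipse are illustrative assumptions, not the authors' exact definitions:

```python
# Motion-efficiency metrics from time-stamped instrument-tip positions:
# mean speed, mean acceleration magnitude, and the area of the covariance
# ("approximate") ellipse of visited locations. Synthetic illustration data.
import math

def metrics(times, xs, ys):
    # Finite-difference speeds between consecutive samples
    speeds = [math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i]) / (times[i + 1] - times[i])
              for i in range(len(times) - 1)]
    accels = [abs(speeds[i + 1] - speeds[i]) / (times[i + 1] - times[i])
              for i in range(len(speeds) - 1)]
    # 2x2 position covariance and its eigenvalues (closed form)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxx = sum((x - mx) ** 2 for x in xs) / len(xs)
    syy = sum((y - my) ** 2 for y in ys) / len(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    d = math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    l1, l2 = (sxx + syy) / 2 + d, (sxx + syy) / 2 - d
    ellipse_area = math.pi * math.sqrt(l1) * math.sqrt(l2)  # 1-sigma ellipse
    return sum(speeds) / len(speeds), sum(accels) / len(accels), ellipse_area

# Hypothetical tip trajectory: times (s) and positions (cm)
t = [0.0, 1.0, 2.0, 3.0]
x = [0.0, 1.0, 1.0, 2.0]
y = [0.0, 0.0, 1.0, 1.0]
mean_v, mean_a, area = metrics(t, x, y)
print(mean_v, mean_a, area)
```

Comparing such numbers between surgeons is the essence of the efficiency analysis: smaller ellipses and gentler accelerations are what the study associates with skilled operators.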
Shiroff, Jennifer J; Gregoski, Mathew J
2017-06-01
Measurement of attitudes toward recessive carrier screening related to conception and pregnancy is necessary to determine current acceptance and whether behavioral intervention strategies are needed in clinical practice. To evaluate quantitative survey instruments that measure patient attitudes regarding genetic carrier testing prior to conception and pregnancy, databases were searched for studies from 2003-2013 examining patient attitudes regarding genetic screening prior to conception and pregnancy, yielding 344 articles; eight studies with eight instruments met the criteria for inclusion. Data abstraction on theoretical framework, subjects, instrument description, scoring, method of measurement, reliability, validity, feasibility, level of evidence, and outcomes was completed. Reliability information was provided in five studies, with internal consistency of Cronbach's α > 0.70. Information pertaining to validity was presented in three studies and included construct validity via factor analysis. Despite limited psychometric information, these questionnaires are self-administered and can be completed quickly, making them a feasible method of evaluation.
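The internal-consistency criterion mentioned above (Cronbach's α > 0.70) has a simple closed form: α = k/(k−1) · (1 − Σ item variances / variance of totals). A minimal sketch with hypothetical item scores:

```python
# Cronbach's alpha computed by hand. The item scores are hypothetical
# illustration data, not from the reviewed instruments.
def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(var(i) for i in items) / var(totals))

# Three hypothetical attitude items answered by five respondents
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 1]]
print(round(cronbach_alpha(items), 2))  # above the 0.70 threshold
```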
Kumari, Manju Raj; Krishnaswamy, Manjunath Mysore
2016-07-01
Success of any endodontic treatment depends on strict adherence to the 'endodontic triad'. Preparation of the root canal system is recognized as one of the most important stages in root canal treatment. At times, root dentin is inevitably damaged, creating a gateway for infection through complications such as perforation, zipping, dentinal cracks, minute intricate fractures, or even vertical root fractures, thereby resulting in treatment failure. Several factors may be responsible for the formation of dentinal cracks, such as a high concentration of sodium hypochlorite, compaction methods, and various canal shaping methods. The aim was to compare and evaluate the effects of root canal preparation techniques and instrumentation length on the development of apical root cracks. Seventy extracted premolars with straight roots were mounted on resin blocks with simulated periodontal ligaments, exposing 1-2 mm of the apex, followed by sectioning of 1 mm of the root tip for better visualization under a stereomicroscope. The teeth were divided into seven groups of 10 teeth each - a control group and six experimental groups. Subgroups A and B were instrumented with stainless steel (SS) hand files up to the root canal length (RCL) and (RCL - 1 mm), respectively; subgroups C and D were instrumented using ProTaper Universal (PTU) up to RCL and (RCL - 1 mm), respectively; subgroups E and F were instrumented using ProTaper Next (PTN) up to RCL and (RCL - 1 mm), respectively. Stereomicroscopic images of the instrumentation sequence were compared for each tooth. The data were analyzed statistically using descriptive analysis with the Phi and Cramér's V tests to determine statistical significance between the groups. The level of significance was set at p < 0.05 using SPSS software. The stainless steel hand file group showed the most cracks, followed by ProTaper Universal and ProTaper Next, though the differences were not statistically significant. Samples instrumented up to 1 mm short of the working length (RCL - 1 mm) showed fewer cracks.
All groups showed crack formation, with the stainless steel group showing the highest incidence. Working 1 mm short of the apex reduces the incidence of crack formation.
Chemical Fingerprinting of Materials Developed Due to Environmental Issues
NASA Technical Reports Server (NTRS)
Smith, Doris A.; McCool, A. (Technical Monitor)
2000-01-01
Instrumental chemical analysis methods are developed and used to chemically fingerprint new and modified External Tank materials made necessary by changing environmental requirements. Chemical fingerprinting can detect and diagnose variations in material composition. To chemically characterize each material, fingerprint methods are selected from an extensive toolbox based on the material's chemistry and the ability of the specific methods to detect the material's critical ingredients. Fingerprint methods have been developed for a variety of materials including Thermal Protection System foams, adhesives, primers, and composites.
Hou, Zheng-Kun; Liu, Feng-Bin; Fang, Ji-Qian; Li, Xiao-Ying; Li, Li-Juan; Lin, Chu-Hua
2013-03-01
The reporting of patient-reported outcomes (PRO) instrument development is vital for both researchers and clinicians to determine its validity; thus, we propose the Preferred Reporting Items for PRO Instrument Development (PRIPROID) to improve the quality of reports. Following the guidance published by the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network, we performed six steps for item development: identified the need for a guideline, performed a literature review, obtained funding for the guideline initiative, identified participants, conducted a Delphi exercise, and generated a list of PRIPROID items for consideration at the face-to-face meeting. Twenty-three item subheadings under seven topics were included: title and structured abstract; rationale; objectives; intention; eligibility criteria; conceptual framework; item generation; response options; scoring; times; administrative modes; burden assessment; properties assessment; statistical methods; participants; main results and additional analysis; summary of evidence; limitations; clinical attentions and conclusions; item pools or final form; and funding. The PRIPROID contains many elements of PRO research, assisting researchers in reporting their results more accurately and, to a certain degree, in using this instrument to evaluate the quality of the research methods.
Apical extrusion of debris and irrigant using hand and rotary systems: A comparative study
Ghivari, Sheetal B; Kubasad, Girish C; Chandak, Manoj G; Akarte, NR
2011-01-01
Aim: To quantitatively evaluate and compare the amount of debris and irrigant extruded by hand and rotary nickel–titanium (Ni–Ti) instrumentation techniques. Materials and Methods: Eighty freshly extracted mandibular premolars having similar canal length and curvature were selected and mounted in a debris collection apparatus. After each instrument change, 1 ml of distilled water was used as an irrigant, and the amount of irrigant extruded was measured using the Meyers and Montgomery method. After drying, the debris was weighed using an electronic microbalance. Statistical analysis used: The data were analyzed statistically to determine the mean differences between the groups. The mean weights of dry debris and irrigant within and between groups were compared by one-way ANOVA and a multiple-comparison (Dunnett D) test. Results: The step-back technique extruded a greater quantity of debris and irrigant than the other hand and rotary Ni–Ti systems. Conclusions: Because all instrumentation techniques extrude debris and irrigant, it is prudent for the clinician to select the instrumentation technique that extrudes the least amount of debris and irrigant, to prevent flare-up phenomena. PMID:21814364
Ertefaie, Ashkan; Flory, James H; Hennessy, Sean; Small, Dylan S
2017-06-15
Instrumental variable (IV) methods provide unbiased treatment effect estimation in the presence of unmeasured confounders under certain assumptions. To provide valid estimates of treatment effect, treatment effect confounders that are associated with the IV (IV-confounders) must be included in the analysis, and not including observations with missing values may lead to bias. Missing covariate data are particularly problematic when the probability that a value is missing is related to the value itself, which is known as nonignorable missingness. In such cases, imputation-based methods are biased. Using health-care provider preference as an IV method, we propose a 2-step procedure with which to estimate a valid treatment effect in the presence of baseline variables with nonignorable missing values. First, the provider preference IV value is estimated by performing a complete-case analysis using a random-effects model that includes IV-confounders. Second, the treatment effect is estimated using a 2-stage least squares IV approach that excludes IV-confounders with missing values. Simulation results are presented, and the method is applied to an analysis comparing the effects of sulfonylureas versus metformin on body mass index, where the variables baseline body mass index and glycosylated hemoglobin have missing values. Our result supports the association of sulfonylureas with weight gain. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
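For orientation, with a single instrument and no covariates, the second-stage logic of two-stage least squares reduces to the Wald ratio cov(z, y)/cov(z, t). The sketch below illustrates only that textbook special case; it is not the authors' two-step procedure, which additionally handles IV-confounders and nonignorable missingness.

```python
from statistics import mean

def wald_iv_estimate(z, t, y):
    """Wald/IV estimator for a single instrument z, treatment t, outcome y:
    effect = cov(z, y) / cov(z, t). Both covariances share the same 1/(n-1)
    factor, so unscaled cross-product sums suffice."""
    zbar, tbar, ybar = mean(z), mean(t), mean(y)
    cov_zy = sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
    cov_zt = sum((zi - zbar) * (ti - tbar) for zi, ti in zip(z, t))
    return cov_zy / cov_zt
```

Because the instrument is, by assumption, independent of the unmeasured confounding, this ratio can recover the treatment effect even when a direct regression of y on t would be biased.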
Fotiadis, Dimitris A; Astaras, Alexandros; Bamidis, Panagiotis D; Papathanasiou, Kostas; Kalfas, Anestis
2015-09-01
This paper presents a novel method for tracking the position of a medical instrument's tip. The system is based on phase locking a high-frequency signal transmitted from the medical instrument's tip to a reference signal. Displacement measurement is established with the loop open, in order to obtain a low-frequency voltage representing the medical instrument's movement; positioning is therefore established by means of conventional measuring techniques. The voltage-controlled oscillator stage of the phase-locked loop (PLL), combined with an appropriate antenna, comprises the transmitter located inside the medical instrument tip. All the other low-frequency PLL components, the low-noise amplifier and the mixer, are located outside the human body, forming the receiver part of the system. The operating details of the proposed system were coded in Verilog-AMS. Simulation results indicate robust medical instrument tracking in 1-D. Experimental evaluation of the proposed position tracking system is also presented. The experiments described in this paper are based on a transmitter moving relative to a stationary receiver with either constant velocity or uniform acceleration, and relative to two stationary receivers with constant velocity; this latter setup was implemented in order to demonstrate the prototype's accuracy for planar (2-D) motion measurements. Error analysis and time-domain analysis are presented for system performance characterization. Furthermore, preliminary experimental assessment using a saline solution container to more closely approximate the human body as a radio-frequency wave transmission medium has proved the system's capability of operating underneath the skin.
Dimension Reduction of Hyperspectral Data on Beowulf Clusters
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek
2000-01-01
Traditional remote sensing instruments are multispectral, collecting observations at a few different spectral bands. Recently, many hyperspectral instruments, which can collect observations at hundreds of bands, have become operational. Furthermore, there have been ongoing research efforts on ultraspectral instruments that can produce observations at thousands of spectral bands. While these remote sensing technology developments hold great promise for new findings in Earth and space science, they present many challenges, including the need for faster processing of the increased data volumes and methods for data reduction. A spectral transformation widely used in remote sensing for dimension reduction is Principal Components Analysis (PCA). In light of the growing number of spectral channels of modern instruments, this paper reports on the development of a parallel PCA and its implementation on two Beowulf cluster configurations, one with a Fast Ethernet switch and the other with a Myrinet interconnect.
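To make the dimension-reduction step concrete, here is a minimal serial sketch of PCA restricted to two dimensions, where the leading eigenvector of the 2×2 covariance matrix has a closed form; the paper's contribution is a parallel implementation for hundreds of bands, which this toy example does not attempt.

```python
import math

def principal_axis_2d(points):
    """First principal component of 2-D points: build the 2x2 covariance
    matrix, take its largest eigenvalue from the characteristic quadratic,
    and return the corresponding unit eigenvector."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Eigenvector for lam (handle the diagonal case separately)
    if abs(sxy) > 1e-12:
        v = (lam - syy, sxy)
    else:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(v[0], v[1])
    return (v[0] / norm, v[1] / norm)
```

Projecting each point onto this axis reduces the data to one dimension while retaining the maximum variance, which is the same principle PCA applies across hundreds of spectral channels.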
Follow That Satellite: EO-1 Maneuvers Into Close Formation With Landsat-7
NASA Technical Reports Server (NTRS)
DeFazio, Robert L.; Owens, Skip; Good, Susan; Bauer, Frank H. (Technical Monitor)
2001-01-01
As the Landsat-7 (LS-7) spacecraft continued NASA's historic program of Earth imaging begun over three decades ago, NASA launched the Earth Observing-1 (EO-1) spacecraft carrying examples of the next generation of LS instruments. The validation method for these instruments was to have EO-1 fly in close formation behind LS-7 on the same World Reference System (WRS) path. From that formation, hundreds of near-coincident images would be taken by each spacecraft and compared to evaluate improvements in the EO-1 instruments. This paper addresses the mission analysis required to launch and maneuver EO-1 into the formation with LS-7, where instrument validation was to occur, and summarizes the completion of the formation acquisition. Each EO-1 launch opportunity that occurred on a different day of the LS-7 16-day repeat cycle required a separate and distinct maneuver profile.
NASA Technical Reports Server (NTRS)
1988-01-01
Viking landers touched down on Mars equipped with a variety of systems to conduct automated research, each carrying a compact but highly sophisticated instrument for analyzing Martian soil and atmosphere. The instrument, a gas chromatograph/mass spectrometer (GC/MS), had to be small, lightweight, shock resistant, highly automated, and extremely sensitive, yet require minimal electrical power. Viking Instruments Corporation commercialized this technology, targeting environmental monitoring as its primary market, especially toxic and hazardous waste site monitoring. Waste sites often contain chemicals in complex mixtures, and the conventional method of site characterization, taking samples on-site and sending them to a laboratory for analysis, is time-consuming and expensive. Other terrestrial applications are explosive detection in airports, drug detection, industrial air monitoring, medical metabolic monitoring and, for the military, detection of chemical warfare agents.
NASA Astrophysics Data System (ADS)
Carlson, Scott M.
1993-06-01
The design of a high-resolution, plane-grating, all-reflection Michelson interferometer for ionospheric spectroscopy was analyzed using ray tracing techniques. This interferometer produces an interference pattern whose spatial frequency is wavelength dependent. The instrument is intended for remote observations of the atomic oxygen triplet emission line profile at 1304 Å in the thermosphere from sounding rocket or satellite platforms. The device was modeled using the PC-based ray tracing application DART, and the results were analyzed through Fourier techniques using the Windows PC version of the Interactive Data Language (IDL). Through these methods, instrument resolution, resolving power, and bandpass were determined. An analysis of the effects of aperture size and shape on instrument performance was also conducted.
Experimental analysis of IMEP in a rotary combustion engine
NASA Technical Reports Server (NTRS)
Schock, H. J.; Rice, W. J.; Meng, P. R.
1981-01-01
A real-time indicated mean effective pressure (IMEP) measurement system is described, which is used to judge proposed improvements in cycle efficiency of a rotary combustion engine. This is the first self-contained instrument capable of making real-time measurements of IMEP in a rotary engine. Previous methods required data recording and later processing on a digital computer. The unique features of this instrumentation include its ability to measure IMEP on a cycle-by-cycle, real-time basis and the elimination of the need to differentiate the volume function in real time. Measurements at two engine speeds (2000 and 3000 rpm) and a full range of loads are presented, although the instrument was designed to operate at speeds up to 9000 rpm.
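For background, IMEP is the net indicated work per cycle (the cyclic integral of p dV) divided by the displaced volume. A minimal offline sketch of that definition follows; the instrument described above computes this in real time and specifically avoids differentiating the volume function, which this toy example does not attempt.

```python
def imep(pressures, volumes):
    """Indicated mean effective pressure: the cyclic integral of p dV over
    one closed cycle, via trapezoidal summation, divided by the displaced
    volume (max volume minus min volume). Consistent pressure/volume units
    are assumed; the sign depends on the traversal direction of the loop."""
    n = len(pressures)
    work = 0.0
    for i in range(n):
        j = (i + 1) % n  # wrap around to close the cycle
        work += 0.5 * (pressures[i] + pressures[j]) * (volumes[j] - volumes[i])
    displaced = max(volumes) - min(volumes)
    return work / displaced
```

For a rectangular p-V loop of height 2 and width 1, the enclosed area (net work) is 2, so the IMEP over the unit displacement is 2.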
Santos, Rafaella Zulianello Dos; Bonin, Christiani Decker Batista; Martins, Eliara Ten Caten; Pereira Junior, Moacir; Ghisi, Gabriela Lima de Melo; Macedo, Kassia Rosangela Paz de; Benetti, Magnus
2018-01-01
The absence of instruments capable of measuring the level of knowledge of hypertensive patients in cardiac rehabilitation programs about their disease reflects the lack of specific recommendations for these patients. To develop and validate a questionnaire to evaluate the knowledge of hypertensive patients in cardiac rehabilitation programs about their disease. A total of 184 hypertensive patients (mean age 60.5 ± 10 years, 66.8% men) were evaluated. Reproducibility was assessed by calculation of the intraclass correlation coefficient using the test-retest method. Internal consistency was assessed by Cronbach's alpha and construct validity by exploratory factor analysis. The final version of the instrument had 17 questions organized in areas considered important for patient education. The instrument proposed showed a clarity index of 8.7 (0.25). The intraclass correlation coefficient was 0.804 and Cronbach's alpha was 0.648. Factor analysis revealed five factors associated with knowledge areas. Regarding criterion validity, patients with higher education level and higher family income showed greater knowledge about hypertension. The instrument has a satisfactory clarity index and adequate validity, and can be used to evaluate the knowledge of hypertensive participants in cardiac rehabilitation programs.
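The internal-consistency statistic reported above follows directly from its definition; the sketch below is a generic illustration, not the authors' analysis code.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale. items: one list of scores per item,
    with respondents in the same order across items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Values near 1 indicate that the items vary together (high internal consistency); the 0.648 reported above is moderate for a 17-question knowledge scale.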
[A basic research to share Fourier transform near-infrared spectrum information resource].
Zhang, Lu-Da; Li, Jun-Hui; Zhao, Long-Lian; Zhao, Li-Li; Qin, Fang-Li; Yan, Yan-Lu
2004-08-01
A method to share the information resources in a database of Fourier transform near-infrared (FTNIR) spectrum information of agricultural products, and to utilize the spectrum information fully, is explored in this paper. Mapping spectrum information from one instrument to another is studied to express the spectrum information accurately between instruments. The mapped spectrum information is then used to establish a mathematical model of quantitative analysis without including standard samples. The correlation coefficient r is 0.941 and the relative error is 3.28% between the model estimates and the Kjeldahl values for the protein content of twenty-two wheat samples, while the correlation coefficient r is 0.963 and the relative error is 2.4% for the other model, which was established using standard samples. It is shown that spectrum information can be shared by using mapped spectrum information. It can thus be concluded that the spectrum information in one FTNIR spectrum information database can be transformed into another instrument's mapped spectrum information, which makes full use of the information resources in the database of FTNIR spectrum information to realize resource sharing between different instruments.
Apical extrusion of debris in four different endodontic instrumentation systems: A meta-analysis.
Western, J Sylvia; Dicksit, Daniel Devaprakash
2017-01-01
All endodontic instrumentation systems tested so far promote apical extrusion of debris, which is one of the main causes of postoperative pain, flare-ups, and delayed healing. The aim of this meta-analysis was to collect and analyze in vitro studies quantifying apically extruded debris while using Hand ProTaper (manual), ProTaper Universal (rotary), Wave One (reciprocating), and self-adjusting file (SAF; vibratory) endodontic instrumentation systems, and to determine which methods produce less apical extrusion of debris. An extensive electronic database search was done in PubMed, Scopus, Cochrane, LILACS, and Google Scholar from inception until February 2016 using the key terms "Apical Debris Extrusion, extruded material, and manual/rotary/reciprocating/SAF systems." A systematic search strategy was followed to extract 12 potential articles from a total of 1352 articles. The overall effect size was calculated from the raw mean difference of the weight of apically extruded debris. Statistically significant differences were seen in the following comparisons: SAF < Wave One, SAF < Rotary ProTaper. Apical extrusion of debris was invariably present in all the instrumentation systems analyzed. The SAF system appeared to be periapical-tissue friendly, as it caused less apical extrusion than Rotary ProTaper and Wave One.
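Effect sizes expressed as raw mean differences are conventionally pooled by inverse-variance weighting; the fixed-effect sketch below illustrates the idea (the abstract does not state which exact model the review used, so the fixed-effect choice here is an assumption).

```python
import math

def pooled_mean_difference(studies):
    """Fixed-effect inverse-variance pooling of raw mean differences.
    studies: list of (mean_difference, standard_error) tuples.
    Returns (pooled difference, pooled standard error)."""
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled
```

Each study contributes in proportion to its precision, so large, precise studies dominate the pooled estimate of extruded-debris weight.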
Yepes-Nuñez, Juan Jose; Zhang, Yuan; Xie, Feng; Alonso-Coello, Pablo; Selva, Anna; Schünemann, Holger; Guyatt, Gordon
2017-05-01
The objective of this study was to summarize the items and domains that authors of systematic reviews of patients' values and preferences studies have identified when considering the risk of bias (RoB) associated with primary studies. We conducted a systematic survey of systematic reviews of patients' values and preferences studies. Our search included three databases (MEDLINE, EMBASE, and PsycINFO) from their inception to August 2015. We conducted duplicate data extraction, focusing on items that authors used to address RoB in the primary studies included in their reviews and the associated underlying domains, and summarized the criteria in descriptive tables. We identified 42 eligible systematic reviews that addressed 23 items relevant to RoB and grouped the items into 7 domains: appropriate administration of instrument; instrument choice; instrument-described health state presentation; choice of participant group; description, analysis, and presentation of methods and results; patient understanding; and subgroup analysis. The items and domains identified provide insight into issues of RoB in patients' values and preferences studies and establish the basis for an instrument to assess RoB in such studies. Copyright © 2017 Elsevier Inc. All rights reserved.
Standard-less analysis of Zircaloy clad samples by an instrumental neutron activation method
NASA Astrophysics Data System (ADS)
Acharya, R.; Nair, A. G. C.; Reddy, A. V. R.; Goswami, A.
2004-03-01
A non-destructive method for analysis of irregular shape and size samples of Zircaloy has been developed using the recently standardized k0-based internal mono standard instrumental neutron activation analysis (INAA). The samples of Zircaloy-2 and -4 tubes, used as fuel cladding in Indian boiling water reactors (BWR) and pressurized heavy water reactors (PHWR), respectively, have been analyzed. Samples weighing in the range of a few tens of grams were irradiated in the thermal column of Apsara reactor to minimize neutron flux perturbations and high radiation dose. The method utilizes in situ relative detection efficiency using the γ-rays of selected activation products in the sample for overcoming γ-ray self-attenuation. Since the major and minor constituents (Zr, Sn, Fe, Cr and/or Ni) in these samples were amenable to NAA, the absolute concentrations of all the elements were determined using mass balance instead of using the concentration of the internal mono standard. Concentrations were also determined in a smaller size Zircaloy-4 sample by irradiating in the core position of the reactor to validate the present methodology. The results were compared with literature specifications and were found to be satisfactory. Values of sensitivities and detection limits have been evaluated for the elements analyzed.
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
Alignment of sensor arrays in optical instruments using a geometric approach.
Sawyer, Travis W
2018-02-01
Alignment of sensor arrays in optical instruments is critical to maximize the instrument's performance. While many commercial systems use standardized mounting threads for alignment, custom systems require specialized equipment and alignment procedures. These alignment procedures can be time-consuming, dependent on operator experience, and have low repeatability. Furthermore, each alignment solution must be considered on a case-by-case basis, leading to additional time and resource cost. Here I present a method to align a sensor array using geometric analysis. By imaging a grid pattern of dots, I show that it is possible to calculate the misalignment for a sensor in five degrees of freedom simultaneously. I first test the approach by simulating different cases of misalignment using Zemax before applying the method to experimentally acquired data of sensor misalignment for an echelle spectrograph. The results show that the algorithm effectively quantifies misalignment in five degrees of freedom for an F/5 imaging system, accurate to within ±0.87 deg in rotation and ±0.86 μm in translation. Furthermore, the results suggest that the method can also be applied to non-imaging systems with a small penalty to precision. This general approach can potentially improve the alignment of sensor arrays in custom instruments by offering an accurate, quantitative approach to calculating misalignment in five degrees of freedom simultaneously.
NASA Astrophysics Data System (ADS)
Nguyen, Huong Giang T.; Horn, Jarod C.; Thommes, Matthias; van Zee, Roger D.; Espinal, Laura
2017-12-01
Addressing reproducibility issues in adsorption measurements is critical to accelerating the path to discovery of new industrial adsorbents and to understanding adsorption processes. A National Institute of Standards and Technology Reference Material, RM 8852 (ammonium ZSM-5 zeolite), and two gravimetric instruments with asymmetric two-beam balances were used to measure high-pressure adsorption isotherms. This work demonstrates how common approaches to buoyancy correction, a key factor in obtaining the mass change due to surface excess gas uptake from the apparent mass change, can impact the adsorption isotherm data. Three different approaches to buoyancy correction were investigated and applied to the subcritical CO2 and supercritical N2 adsorption isotherms at 293 K. It was observed that measuring a collective volume for all balance components for the buoyancy correction (helium method) introduces an inherent bias in temperature partition when there is a temperature gradient (i.e. analysis temperature is not equal to instrument air bath temperature). We demonstrate that a blank subtraction is effective in mitigating the biases associated with temperature partitioning, instrument calibration, and the determined volumes of the balance components. In general, the manual and subtraction methods allow for better treatment of the temperature gradient during buoyancy correction. From the study, best practices specific to asymmetric two-beam balances and more general recommendations for measuring isotherms far from critical temperatures using gravimetric instruments are offered.
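Schematically, the buoyancy correction adds back the weight of gas displaced by the sample and balance components; the one-line sketch below lumps everything into a single displaced volume at one gas density, a simplification relative to the paper's three approaches (helium, manual, and blank-subtraction methods), which differ precisely in how that volume and its temperature partition are determined.

```python
def buoyancy_corrected_uptake(delta_m_apparent, gas_density, displaced_volume):
    """Surface-excess uptake (g) from the apparent mass change (g) on a
    gravimetric balance, adding back the buoyancy term
    rho_gas (g/cm^3) * V_displaced (cm^3).
    A single lumped volume at one density is an idealization: with a
    temperature gradient, the gas density differs along the balance."""
    return delta_m_apparent + gas_density * displaced_volume
```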
Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games.
Alber, Julia M; Watson, Anna M; Barnett, Tracey E; Mercado, Rebeccah; Bernhardt, Jay M
2015-07-01
Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development.
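The inter-rater statistic used in the field test follows from its definition, observed agreement corrected for chance agreement; the sketch below is a generic illustration, not the authors' code.

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical codes:
    kappa = (p_observed - p_chance) / (1 - p_chance),
    where p_chance comes from each rater's marginal category frequencies."""
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_chance = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)
```

Kappa of 1 means perfect agreement, 0 means agreement no better than chance; the per-item values of 0.66 to 1.00 reported above indicate substantial to perfect coder agreement.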
An Analysis Method for Superconducting Resonator Parameter Extraction with Complex Baseline Removal
NASA Technical Reports Server (NTRS)
Cataldo, Giuseppe
2014-01-01
A new semi-empirical model is proposed for extracting the quality (Q) factors of arrays of superconducting microwave kinetic inductance detectors (MKIDs). The determination of the total internal and coupling Q factors enables the computation of the loss in the superconducting transmission lines. The method used allows the simultaneous analysis of multiple interacting discrete resonators with the presence of a complex spectral baseline arising from reflections in the system. The baseline removal allows an unbiased estimate of the device response as measured in a cryogenic instrumentation setting.
X-ray fluorescence analysis of alloy and stainless steels using a mercuric iodide detector
NASA Technical Reports Server (NTRS)
Kelliher, Warren C.; Maddox, W. Gene
1988-01-01
A mercuric iodide detector was used for the XRF analysis of a number of NBS standard steels, applying a specially developed correction method for interelemental effects. It is shown that, using this method and a good peak-deconvolution technique, the HgI2 detector is capable of achieving the resolutions and count rates needed in the XRF analysis of multielement samples. The freedom from cryogenic cooling and from the power supplies necessary for an electrically cooled device makes this detector a very good candidate for a portable instrument.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Kuangcai
The goal of this study is to help with future data analysis and experiment designs in rotational dynamics research using the DIC-based SPORT technique. Most current studies using DIC-based SPORT techniques are technical demonstrations. Understanding the mechanisms behind the observed rotational behaviors of the imaging probes should be the focus of future SPORT studies. More effort is still needed in the development of new imaging probes, particle tracking methods, instrumentation, and advanced data analysis methods to further extend the potential of the DIC-based SPORT technique.
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness; also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
Rowley, Mark I.; Coolen, Anthonius C. C.; Vojnovic, Borivoj; Barber, Paul R.
2016-01-01
We present novel Bayesian methods for the analysis of exponential decay data that exploit the evidence carried by every detected decay event and enable robust extension to advanced processing. Our algorithms are presented in the context of fluorescence lifetime imaging microscopy (FLIM), and particular attention has been paid to modeling the time-domain system (based on time-correlated single photon counting) with unprecedented accuracy. We present estimates of decay parameters for mono- and bi-exponential systems, offering up to a factor of two improvement in accuracy compared to previous popular techniques. Results of the analysis of synthetic and experimental data are presented, and areas where the superior precision of our techniques can be exploited in Förster Resonance Energy Transfer (FRET) experiments are described. Furthermore, we demonstrate two advanced processing methods: decay model selection to choose between differing models such as mono- and bi-exponential, and the simultaneous estimation of instrument and decay parameters. PMID:27355322
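For orientation only: in an idealized mono-exponential model with no instrument response and an unbounded collection window, the maximum-likelihood lifetime estimate is simply the mean photon arrival time. The authors' Bayesian approach goes well beyond this sketch by modeling the TCSPC system, its finite window, and the instrument response.

```python
def mle_lifetime(arrival_times):
    """Maximum-likelihood estimate of tau for i.i.d. samples from an ideal
    exponential decay exp(-t/tau)/tau on [0, inf): the sample mean.
    Real FLIM data violate these assumptions (finite window, instrument
    response, background), which is what the Bayesian treatment addresses."""
    return sum(arrival_times) / len(arrival_times)
```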
Photometric method for determination of acidity constants through integral spectra analysis
NASA Astrophysics Data System (ADS)
Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich
2015-04-01
An express method for the determination of acidity constants of organic acids, based on analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method makes it possible to obtain pKa using only simple, low-cost instrumentation. The optical part of the experimental setup has been simplified by excluding the monochromator. As a result, it takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. Application limitations and the reliability of the method have been tested on a series of organic acids of various nature.
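A signal vs. pH curve of this kind follows a Henderson-Hasselbalch sigmoid, so pKa can be recovered by a nonlinear fit. The sketch below is illustrative only (the endpoint signal levels, pKa, and noise level are invented, not taken from the paper's apparatus):

```python
import numpy as np
from scipy.optimize import curve_fit

def signal(pH, pKa, s_acid, s_base):
    """Photometric signal as a weighted mix of protonated/deprotonated forms."""
    f = 1.0 / (1.0 + 10.0 ** (pKa - pH))      # deprotonated fraction (Henderson-Hasselbalch)
    return s_acid + (s_base - s_acid) * f

rng = np.random.default_rng(1)
pH = np.linspace(2.0, 9.0, 30)
# Synthetic titration curve for an acetic-acid-like analyte (pKa 4.76) plus noise
obs = signal(pH, 4.76, 0.20, 0.85) + rng.normal(0, 0.005, pH.size)

popt, _ = curve_fit(signal, pH, obs, p0=(5.0, 0.1, 1.0))
pKa_fit = popt[0]
```

The fit returns pKa well within the 0.15 pH-unit error bound quoted in the abstract for this noise level.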
An intercomparison of five ammonia measurement techniques
NASA Technical Reports Server (NTRS)
Williams, E. J.; Sandholm, S. T.; Bradshaw, J. D.; Schendel, J. S.; Langford, A. O.; Quinn, P. K.; Lebel, P. J.; Vay, S. A.; Roberts, P. D.; Norton, R. B.
1992-01-01
Results obtained from five techniques for measuring gas-phase ammonia at low concentration in the atmosphere are compared. These methods are: (1) a photofragmentation/laser-induced fluorescence (PF/LIF) instrument; (2) a molybdenum oxide annular denuder sampling/chemiluminescence detection technique; (3) a tungsten oxide denuder sampling/chemiluminescence detection system; (4) a citric-acid-coated denuder sampling/ion chromatographic analysis (CAD/IC) method; and (5) an oxalic-acid-coated filter pack sampling/colorimetric analysis method. It was found that two of the techniques, the PF/LIF and the CAD/IC methods, measured approximately 90 percent of the calculated ammonia added in the spiking tests and agreed very well with each other in the ambient measurements.
Methods for identification and verification using vacuum XRF system
NASA Technical Reports Server (NTRS)
Kaiser, Bruce (Inventor); Schramm, Fred (Inventor)
2005-01-01
Apparatus and methods in which one or more elemental taggants intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and provide non-line-of-sight detection to establish the origin of objects, their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower-atomic-number elements in the field with a portable instrument.
In-injection port thermal desorption for explosives trace evidence analysis.
Sigman, M E; Ma, C Y
1999-10-01
A gas chromatographic method utilizing thermal desorption of a dry surface wipe for the analysis of explosives trace chemical evidence has been developed and validated using electron capture and negative ion chemical ionization mass spectrometric detection. Thermal desorption was performed within a split/splitless injection port with minimal instrument modification. Surface-abraded Teflon tubing provided the solid support for sample collection and desorption. Performance was characterized by desorption efficiency, reproducibility, linearity of the calibration, and method detection and quantitation limits. Method validation was performed with a series of dinitrotoluenes, trinitrotoluene, two nitroester explosives, and one nitramine explosive. The method was applied to the sampling of a single piece of debris from an explosion containing trinitrotoluene.
Pawar, Ajinkya M.; Pawar, Mansing G.; Metzger, Zvi; Kokate, Sharad R.
2015-01-01
Aim: The present ex vivo study aimed to evaluate debris extrusion after instrumenting root canals with three different file systems. Materials and Methods: Sixty extracted human mandibular premolars with single canals were selected and randomly divided into three groups (n = 20) for instrumentation with three different files. Group 1: WaveOne (primary) single reciprocating file (WO; Dentsply Maillefer, Ballaigues, Switzerland) (25/08), Group 2: Self-adjusting file (SAF; ReDent-Nova, Ra’anana, Israel) (1.5 mm), and Group 3: ProTaper NEXT X1 and X2 (PTN; Dentsply Tulsa Dental, Tulsa, OK) (25/06). Debris extruded during instrumentation was collected into pre-weighed Eppendorf tubes. These tubes were then stored in an incubator at 70°C for 5 days and weighed again to obtain the final weight with the extruded debris. Statistical analysis of the apically extruded debris was performed using one-way analysis of variance and a post hoc Tukey's test. Results: The statistical analysis showed a significant difference among the three groups tested (P < 0.01). The post hoc Tukey's test confirmed that Group 2 (SAF) exhibited the least debris extrusion (P < 0.01) of the three groups tested. Conclusions: The SAF resulted in significantly less extrusion of debris when compared with the reciprocating WO and rotary PTN. PMID:25829683
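The analysis pipeline described (one-way ANOVA across three groups of n = 20, followed by pairwise comparisons) can be sketched as follows; the debris masses below are invented placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
# Hypothetical extruded-debris masses (mg) for the three file systems, n = 20 each
wave_one = rng.normal(1.20, 0.15, 20)   # reciprocating file
saf      = rng.normal(0.60, 0.15, 20)   # self-adjusting file (lowest mean, as in the study)
protaper = rng.normal(1.05, 0.15, 20)   # rotary file

# One-way ANOVA across the three independent groups
F, p = f_oneway(wave_one, saf, protaper)
# A post hoc Tukey HSD test would then locate which pairwise differences drive the result.
```

With group means this far apart relative to their spread, the ANOVA p-value falls well below the 0.01 threshold reported in the abstract.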
Laser spectrometer for CO2 clumped isotope analysis
NASA Astrophysics Data System (ADS)
Prokhorov, Ivan; Kluge, Tobias; Janssen, Christof
2017-04-01
Carbon dioxide clumped isotope thermometry has proven to be a reliable method for biogeochemical and atmospheric research. We present a new laser spectroscopic instrument for the analysis of doubly substituted isotopologues. In contrast to conventional isotope ratio mass spectrometry (IRMS), tunable laser direct absorption spectroscopy (TLDAS) has the advantage of isotopologue-specific determination free of isobaric interferences. The tunable infrared laser spectrometer for clumped isotope analysis is being developed in collaboration between Heidelberg University, Germany, and LERMA-IPSL, CNRS, France. The instrument employs two continuous-wave interband cascade lasers (ICLs) tuned to 4439 and 4329 nm. The spectral windows covered by the lasers contain absorption lines of the six most abundant CO2 isotopologues, including the two doubly substituted species 16O13C18O and 16O13C17O, and all singly substituted isotopologues with 13C, 18O and 17O. A Herriott-type multi-pass cell provides two different absorption pathlengths to compensate for the abundance difference between singly and doubly substituted isotopologues. We have reached the sub-permill precision required for clumped isotope measurements within an integration time of several seconds. The test version of the instrument demonstrates performance comparable to state-of-the-art IRMS. We highlight the following features of the instrument that are strong advantages over conventional mass spectrometry: a measurement cycle in the minute range, a simplified sample preparation routine, and a table-top layout with potential for in-situ applications.
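Precision-versus-integration-time claims of this kind are conventionally characterized with an Allan deviation: for white noise, the deviation of averaged ratios falls as one over the square root of the averaging factor. A minimal non-overlapping implementation, run on simulated white-noise isotopologue ratios (the noise level is an assumption, not the instrument's spec):

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation of series y at averaging factor m."""
    n = len(y) // m
    means = y[: n * m].reshape(n, m).mean(axis=1)   # block averages of length m
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

rng = np.random.default_rng(3)
# Simulated per-scan isotopologue ratio readings with white (uncorrelated) noise
ratios = 1.0 + rng.normal(0, 1e-3, 20000)

adev_1   = allan_deviation(ratios, 1)     # single-scan precision
adev_100 = allan_deviation(ratios, 100)   # after averaging 100 scans: ~10x better
```

For real spectrometers the curve eventually flattens or rises as drift dominates, which is what bounds the usable integration time.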
Parpa, Efi; Kostopoulou, Sotiria; Tsilika, Eleni; Galanos, Antonis; Katsaragakis, Stylianos; Mystakidou, Kyriaki
2017-09-01
The patient dignity inventory (PDI) is an instrument measuring dignity-related distress at the end of life. The aims of the present study were to translate the PDI into Greek and to evaluate its psychometric properties in a palliative care unit. A back-translation method was used to produce the Greek version. One hundred twenty advanced cancer patients completed the Greek version of the PDI, the Greek hospital anxiety and depression scale, the Greek schedule of attitudes toward hastened death (SAHD-Gr), and the Greek 12-item short form health survey. Confirmatory factor analysis failed to fit the original instrument's structure, so exploratory factor analysis was conducted, revealing five factors ("Psychological Distress," "Body Image and Role Identity," "Self-Esteem," "Physical Distress and Dependency," and "Social Support"). The psychometric analysis of the PDI-Gr demonstrated good concurrent validity, and the instrument discriminated well between subgroups of patients with respect to age. Cronbach's α values were between 0.71 and 0.9, indicating good internal consistency. The Greek version of the PDI showed good psychometric properties in advanced cancer patients, supporting the usefulness of the instrument in assessing dignity-related distress in terminally ill cancer patients. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
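The internal-consistency statistic used here, Cronbach's α, is a simple function of item and total-score variances. A self-contained sketch on synthetic questionnaire data (the sample size matches the study's n = 120, but the item structure is invented):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(0, 1, 120)                          # one underlying trait, n = 120
items = latent[:, None] + rng.normal(0, 0.8, (120, 6))  # six noisy, correlated items
alpha = cronbach_alpha(items)
```

Values in the 0.71 to 0.9 range reported for the PDI-Gr subscales indicate that items within each factor vary together rather than independently.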
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rockaway, J.D.; Stephenson, R.W.
The study described was accomplished through an investigation of both the intact and in situ geotechnical properties of the subcoal strata and through the installation of floor movement monitoring instrumentation. The method of bearing capacity analysis developed by A.S. Vesic was used to define pillar stability. This method includes provisions for taking into consideration a two-layer system, such as assumed for mine conditions where a weak underclay overlies firmer strata. 4 refs.
Measuring Workplace Climate in Community Clinics and Health Centers
Friedberg, Mark W.; Rodriguez, Hector P.; Martsolf, Grant; Edelen, Maria Orlando; Vargas-Bustamante, Arturo
2018-01-01
Background The effectiveness of community clinics and health centers’ efforts to improve the quality of care might be modified by clinics’ workplace climates. Several surveys to measure workplace climate exist, but their relationships to each other and to distinguishable dimensions of workplace climate are unknown. Objective To assess the psychometric properties of a survey instrument combining items from several existing surveys of workplace climate and to generate a shorter instrument for future use. Methods We fielded a 106-item survey, which included items from 9 existing instruments, to all clinicians and staff members (n=781) working in 30 California community clinics and health centers, receiving 628 responses (80% response rate). We performed exploratory factor analysis of survey responses, followed by confirmatory factor analysis of 200 reserved survey responses. We generated a new, shorter survey instrument of items with strong factor loadings. Results Six factors, including 44 survey items, emerged from the exploratory analysis. Two factors (Clinic Workload and Teamwork) were independent from the others. The remaining 4 factors (Staff Relationships, Quality Improvement Orientation, Managerial Readiness for Change, and Staff Readiness for Change) were highly correlated, indicating that these represented dimensions of a higher-order factor we called “Clinic Functionality.” This two-level, six-factor model fit the data well in the exploratory and confirmatory samples. For all but one factor, fewer than 20 survey responses were needed to achieve clinic-level reliability >0.7. Conclusion Survey instruments designed to measure workplace climate have substantial overlap. The relatively parsimonious item set we identified might help target and tailor clinics’ quality improvement efforts. PMID:27326549
Cloke, Jonathan; Arizanova, Julia; Crabtree, David; Simpson, Helen; Evans, Katharine; Vaahtoranta, Laura; Palomäki, Jukka-Pekka; Artimo, Paulus; Huang, Feng; Liikanen, Maria; Koskela, Suvi; Chen, Yi
2016-01-01
The Thermo Scientific™ SureTect™ Listeria species Real-Time PCR Assay was certified during 2013 by the AOAC Research Institute (RI) Performance Tested Methods(SM) program as a rapid method for the detection of Listeria species from a wide range of food matrixes and surface samples. A method modification study was conducted in 2015 to extend the matrix claims of the product to a wider range of food matrixes. This report details the method modification study undertaken to extend the use of this PCR kit to the Applied Biosystems™ 7500 Fast PCR Instrument and Applied Biosystems RapidFinder™ Express 2.0 software, allowing use of the assay on a 96-well format PCR cycler in addition to the current workflow, which uses the 24-well Thermo Scientific PikoReal™ PCR Instrument and Thermo Scientific SureTect software. The method modification study presented in this report was assessed by the AOAC-RI as being a level 2 method modification study, necessitating a method developer study on a representative range of food matrixes covering raw ground turkey, 2% fat pasteurized milk, and bagged lettuce as well as stainless steel surface samples. All testing was conducted in comparison to the reference method detailed in International Organization for Standardization (ISO) 11290-1. No significant difference by probability of detection statistical analysis was found between the SureTect Listeria species PCR Assay and the ISO reference method for any of the three food matrixes or the surface samples analyzed during the study.
Bing-You, Robert; Ramesh, Saradha; Hayes, Victoria; Varaklis, Kalli; Ward, Denham; Blanco, Maria
2018-01-01
Construct: Medical educators consider feedback a core component of the educational process. Effective feedback allows learners to acquire new skills, knowledge, and attitudes. Learners' perceptions of feedback are an important aspect to assess with valid methods in order to improve the feedback skills of educators and the feedback culture. Although guidelines for delivering effective feedback have existed for several decades, medical students and residents often indicate that they receive little feedback. A recent scoping review on feedback in medical education did not reveal any validity evidence on instruments to assess learner's perceptions of feedback. The purpose of our study was to gather validity evidence on two novel FEEDME (Feedback in Medical Education) instruments to assess medical students' and residents' perceptions of the feedback that they receive. After the authors developed an initial instrument with 54 items, cognitive interviews with medical students and residents suggested that 2 separate instruments were needed, one focused on the feedback culture (FEEDME-Culture) and the other on the provider of feedback (FEEDME-Provider). A Delphi study with 17 medical education experts and faculty members assessed content validity. The response process was explored involving 31 medical students and residents at 2 academic institutions. Exploratory factor analysis and reliability analyses were performed on completed instruments. Two Delphi consultation rounds refined the wording of items and eliminated several items. Learners found both instruments easy and quick to answer; it took them less than 5 minutes to complete. Learners preferred an electronic format of the instruments over paper. Factor analysis revealed a two- and three-factor solution for the FEEDME-Culture and FEEDME-Provider instruments, respectively. Cronbach's alpha was greater than 0.80 for all factors. Items on both instruments were moderately to highly correlated (range, r = .3-.7). 
Our results provide preliminary validity evidence of 2 novel feedback instruments. After further validation of both FEEDME instruments, sharing the results of the FEEDME-Culture instrument with educational leaders and faculty may improve the culture of feedback on specific educational rotations and at the institutional level. The FEEDME-Provider instrument could be useful for faculty development targeting feedback skills. Additional research studies could assess whether both instruments may be used to help learners receive feedback and prompt reflective learning.
ERIC Educational Resources Information Center
Cowan, Logan T.; Van Wagenen, Sarah A.; Brown, Brittany A.; Hedin, Riley J.; Seino-Stephan, Yukiko; Hall, P. Cougar; West, Joshua H.
2013-01-01
Objective. To quantify the presence of health behavior theory constructs in iPhone apps targeting physical activity. Methods. This study used a content analysis of 127 apps from Apple's (App Store) "Health & Fitness" category. Coders downloaded the apps and then used an established theory-based instrument to rate each app's inclusion of…
Transportable, Low-Dose Active Fast-Neutron Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mihalczo, John T.; Wright, Michael C.; McConchie, Seth M.
2017-08-01
This document contains a description of the method of transportable, low-dose active fast-neutron imaging as developed by ORNL. The discussion begins with the technique and instrumentation and continues with the image reconstruction and analysis. The analysis discussion includes an example of how a gap smaller than the neutron production spot size and detector size can be detected and characterized depending upon the measurement time.
Cost analysis in the toxicology laboratory.
Travers, E M
1990-09-01
The process of determining laboratory sectional and departmental costs and test costs for instrument-generated and manually generated reportable results for toxicology laboratories has been outlined in this article. It is hoped that the basic principles outlined in the preceding text will clarify and elucidate one of the most important areas needed for laboratory fiscal integrity and its survival in these difficult times for health care providers. The following general principles derived from this article are helpful aids for managers of toxicology laboratories. 1. To manage a cost-effective, efficient toxicology laboratory, several factors must be considered: the laboratory's instrument configuration, test turnaround time needs, the test menu offered, the analytic methods used, the cost of labor based on time expended and the experience and educational level of the staff, and logistics that determine specimen delivery time and costs. 2. There is a wide variation in costs for toxicologic methods, which requires that an analysis of capital (equipment) purchase and operational (test performance) costs be performed to avoid waste, purchase wisely, and determine which tests consume the majority of the laboratory's resources. 3. Toxicologic analysis is composed of many complex steps. Each step must be individually cost-accounted. Screening test results must be confirmed, and the cost for both steps must be included in the cost per reportable result. 4. Total costs will vary in the same laboratory and between laboratories based on differences in salaries paid to technical staff, differences in reagent/supply costs, the number of technical staff needed to operate the analyzer or perform the method, and the inefficient use of highly paid staff to operate the analyzer or perform the method. 5. 
Since direct test costs vary directly with the type and number of analyzers or methods and are dependent on the operational mode designed by the manufacturer, laboratory managers should construct an actual test-cost data base for instrument or method in use to accurately compare costs using the "bottom-up" approach. 6. Laboratory expenses can be examined from three perspectives: total laboratory, laboratory section, and subsection workstation. The objective is to track all laboratory expenses through each of these levels. 7. In the final analysis, a portion of total laboratory expenses must be allocated to each unit of laboratory output--the billable procedure or, in laboratories where tests are not billed, the tests produced.(ABSTRACT TRUNCATED AT 400 WORDS)
Smith, Gordon Wg; Goldie, Frank; Long, Steven; Lappin, David F; Ramage, Gordon; Smith, Andrew J
2011-01-10
The cleaning stage of the instrument decontamination process has come under increased scrutiny due to the increasing complexity of surgical instruments and the adverse effects of residual protein contamination on surgical instruments. Instruments used in the podiatry field have a complex surface topography and are exposed to a wide range of biological contamination. Currently, podiatry instruments are reprocessed locally within surgeries while national strategies are favouring a move toward reprocessing in central facilities. The aim of this study was to determine the efficacy of local and central reprocessing on podiatry instruments by measuring residual protein contamination of instruments reprocessed by both methods. The residual protein of 189 instruments reprocessed centrally and 189 instruments reprocessed locally was determined using a fluorescent assay based on the reaction of proteins with o-phthaldialdehyde/sodium 2-mercaptoethanesulfonate. Residual protein was detected on 72% (n = 136) of instruments reprocessed centrally and 90% (n = 170) of instruments reprocessed locally. Significantly less protein (p < 0.001) was recovered from instruments reprocessed centrally (median 20.62 μg, range 0 - 5705 μg) than from instruments reprocessed locally (median 111.9 μg, range 0 - 6344 μg). Overall, the results show the superiority of central reprocessing for complex podiatry instruments when protein contamination is considered, though no significant difference was found in residual protein between local decontamination unit and central decontamination unit processes for Blacks files. Further research is needed to undertake qualitative identification of protein contamination to identify any cross contamination risks, and a standard for acceptable residual protein contamination applicable to different instruments and specialities should be considered as a matter of urgency.
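Because the study reports medians and very wide ranges, the group comparison is naturally handled with a rank-based test such as Mann-Whitney U. A sketch on synthetic skewed data shaped to mimic the reported medians (the log-normal form and spread are assumptions, not the study's distributions):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
# Hypothetical right-skewed residual-protein data (μg), n = 189 per arm,
# centred on the reported medians of ~21 μg (central) and ~112 μg (local)
central = rng.lognormal(mean=np.log(21.0), sigma=1.2, size=189)
local   = rng.lognormal(mean=np.log(112.0), sigma=1.2, size=189)

# One-sided test: is central reprocessing stochastically lower than local?
u_stat, p = mannwhitneyu(central, local, alternative="less")
```

A rank-based test is appropriate here because a handful of multi-milligram outliers (the 5000-6000 μg maxima) would dominate a t-test on the raw values.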
Church, Russell M.
2002-04-28
This article provides an overview of the published research of John Gibbon. It describes his experimental research on scalar timing and his development of scalar timing theory. It also describes his methods of research which included mathematical analysis, conditioning methods, psychophysical methods and secondary data analysis. Finally, it describes his application of scalar timing theory to avoidance and punishment, autoshaping, temporal perception and timed behavior, foraging, circadian rhythms, human timing, and the effect of drugs on timed perception and timed performance of Parkinson's patients. The research of Gibbon has shown the essential role of timing in perception, classical conditioning, instrumental learning, behavior in natural environments and in neuropsychology.
A method for the measurement and analysis of ride vibrations of transportation systems
NASA Technical Reports Server (NTRS)
Catherines, J. J.; Clevenson, S. A.; Scholl, H. F.
1972-01-01
The measurement and recording of ride vibrations which affect passenger comfort in transportation systems and the subsequent data-reduction methods necessary for interpreting the data present exceptional instrumentation requirements and necessitate the use of computers for specialized analysis techniques. A method is presented for both measuring and analyzing ride vibrations of the type encountered in ground and air transportation systems. A portable system for measuring and recording low-frequency, low-amplitude accelerations and specialized data-reduction procedures are described. Sample vibration measurements in the form of statistical parameters representative of typical transportation systems are also presented to demonstrate the utility of the techniques.
Method and apparatus for concentrating vapors for analysis
Grate, Jay W [West Richland, WA; Baldwin, David L [Kennewick, WA; Anheier, Jr., Norman C.
2012-06-05
A pre-concentration device and a method are disclosed for concentrating gaseous vapors for analysis. Vapors sorbed and concentrated within the bed of the pre-concentration device are thermally desorbed, achieving at least partial separation of the vapor mixtures. The pre-concentration device is suitable, e.g., for pre-concentration and sample injection, and provides greater resolution of peaks for vapors within vapor mixtures, yielding detection levels that are 10-10,000 times better than direct sampling and analysis systems. Features are particularly useful for continuous unattended monitoring applications. The invention finds application in conjunction with, e.g., analytical instruments where low detection limits for gaseous vapors are desirable.
Zhang, Mengliang; Kruse, Natalie A; Bowman, Jennifer R; Jackson, Glen P
2016-05-01
An expedited field analysis method was developed for the determination of polychlorinated biphenyls (PCBs) in soil matrices using a portable gas chromatography-mass spectrometry (GC-MS) instrument. Soil samples of approximately 0.5 g were measured with a portable scale and PCBs were extracted by headspace solid-phase microextraction (SPME) with a 100 µm polydimethylsiloxane (PDMS) fiber. Two milliliters of 0.2 M potassium permanganate and 0.5 mL of 6 M sulfuric acid solution were added to the soil matrices to facilitate the extraction of PCBs. The extraction was performed for 30 min at 100°C in a portable heating block that was powered by a portable generator. The portable GC-MS instrument took less than 6 min per analysis and ran off an internal battery and helium cylinder. Seven commercial PCB mixtures, Aroclor 1016, 1221, 1232, 1242, 1248, 1254, and 1260, could be classified based on the GC chromatograms and mass spectra. The detection limit of this method for Aroclor 1260 in soil matrices is approximately 10 ppm, which is sufficient for guiding remediation efforts in contaminated sites. This method was applicable to the on-site analysis of PCBs with a total analysis time of 37 min per sample. However, the total analysis time could be improved to less than 7 min per sample by conducting the rate-limiting extraction step for different samples in parallel. © The Author(s) 2016.
Reliably detectable flaw size for NDE methods that use calibration
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2017-04-01
Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and associated mh1823 POD software gives most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known size artificial flaws such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors which are used to set instrument sensitivity for detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine reliably detectable flaw size.
Reliably Detectable Flaw Size for NDE Methods that Use Calibration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and associated mh1823 POD software gives most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known size artificial flaws such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors which are used to set instrument sensitivity for detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine reliably detectable flaw size.
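In POD analysis of the kind MIL-HDBK-1823 describes, the detection probability is often modeled as a logistic function of flaw size, and the reliably detectable size a90 is the size detected with 90% probability. A minimal sketch with illustrative parameters (mu and sigma below are invented, not handbook values):

```python
import math

# Logistic POD model: POD(a) = 1 / (1 + exp(-(a - mu) / sigma))
mu, sigma = 0.040, 0.006          # inches: location and steepness of the POD curve (assumed)

def pod(a):
    """Probability of detecting a flaw of size a under the logistic model."""
    return 1.0 / (1.0 + math.exp(-(a - mu) / sigma))

# Solve POD(a90) = 0.9 in closed form: 0.9 = 1/(1 + exp(-x)) gives x = ln 9
a90 = mu + sigma * math.log(9.0)
```

Practical POD studies go one step further to a90/95, adding a 95% confidence bound on a90 from the uncertainty of the fitted mu and sigma; the closed-form point estimate above is only the first half of that calculation.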
Rodrigues, Renata C V; Lopes, Hélio P; Elias, Carlos N; Amaral, Georgiana; Vieira, Victor T L; De Martin, Alexandre S
2011-11-01
The aim of this study was to evaluate, by static and dynamic cyclic fatigue tests, the number of cycles to fracture (NCF) of 2 types of rotary NiTi instruments: Twisted File (SybronEndo, Orange, CA), which is manufactured by a proprietary twisting process, and RaCe files (FKG Dentaire, La Chaux-de-Fonds, Switzerland), which are manufactured by grinding. Twenty Twisted Files (TFs) and 20 RaCe files #25/.006 taper instruments were allowed to rotate freely in an artificial curved canal at 310 rpm in a static or a dynamic model until fracture occurred. Measurements of the fractured fragments showed that fracture occurred at the point of maximum flexure in the midpoint of the curved segment. The NCF was significantly lower for RaCe instruments compared with TFs. The NCF was also lower for instruments subjected to the static test compared with the dynamic model in both groups. Scanning electron microscopic analysis revealed ductile morphologic characteristics on the fractured surfaces of all instruments and no plastic deformation in their helical shafts. Rotary NiTi endodontic instruments manufactured by twisting present greater resistance to cyclic fatigue compared with instruments manufactured by grinding. The fracture mode observed in all instruments was of the ductile type. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Coaching leadership: leaders' and followers' perception assessment questionnaires in nursing
Cardoso, Maria Lúcia Alves Pereira; Ramos, Laís Helena; D'Innocenzo, Maria
2014-01-01
ABSTRACT Objective: To describe the development, content analysis, and reliability of two questionnaires to assess the perception of nurse leaders, nurse technicians, and licensed practical nurses – coached in the practice of leadership and the relation with the dimensions of the coaching process. Methods: This was a methodological study with a quantitative and qualitative approach, whose goal was the construction and validation of measuring instruments. The instrument proposition design was based on the literature on leadership, coaching, and assessment of psychometric properties, and was subjected to content validation as to clarity, relevance, and applicability in order to validate the propositions through the consensus of judges, using the Delphi technique, in 2010. The final version of the questionnaires was administered to 279 nurses and 608 nurse technicians and licensed practical nurses, at two university hospitals and two private hospitals. Results: The Cronbach's alpha value with all items of the self-perception instrument was very high (0.911). The team members' instrument of perception showed that for all determinants and for each dimension of the coaching process, Cronbach's overall alpha value (0.952) was considered quite high, pointing to a very strong consistency of the scale. Confirmatory analysis showed that the models were well adjusted. Conclusion: From the statistical validation we confirmed the possibility of reusing the questionnaires for other study samples, because there was evidence of reliability and applicability. PMID:24728249
Nakagaki, Susumu; Yasuda, Yoshitaka; Handa, Keisuke; Koike, Toshiyuki; Saito, Takashi; Mizoguchi, Itaru
2016-01-01
Abstract Orthodontic implants may fracture at the cortical bone level upon rotational torque. The impacted fragment can be detached by a range of methods, which are all more or less time‐consuming and injurious to the cortical bone. The aim of this study was to compare three different methods for detaching an orthodontic implant impacted in cortical bone. Health Sciences University of Hokkaido animal ethics committee approved the study protocol. Orthodontic titanium‐alloy (Ti‐6Al‐4 V) implants were placed bilaterally on the buccal side of the mandible of beagle dogs. Subsequently, the implants were detached using either a low‐speed handpiece with a round bur, alternatively by use of a low‐power or a high‐power ultrasonic instrument. In the first experiment, 56 orthodontic implants were placed into the dissected mandible from 7 animals. The methods for detachment were compared with respect to time interval, as well as associated undesirable bone loss as appraised by use of cone‐beam computed tomography. In experiment two, 2x2 implants were placed bilaterally in the mandible of 8 animals and subsequently detached by manual rotational torque, and the described three methods for detachment. The implant socket was investigated histologically as a function of removal method immediately after removal, and after 1, 3 and 8 weeks and contrasted with the healing of the socket of the implant that was detached by manual rotational torque. Statistical significance was appraised by the use of non‐parametric Kruskal‐Wallis one‐way analysis of variance. The method using the low‐power ultrasonic required significantly longer removal time versus the two other methods, i.e. high‐power ultrasonic and low‐speed handpiece with a round bur (p < 0.02). The amount of undesirable bone loss was substantially larger with low‐speed handpiece with a round bur compared to the two ultrasonic methods (p < 0.05). 
Bone formation after 3 weeks of healing was more complete following the use of a low- or high-power ultrasonic instrument than following the low-speed handpiece rotary instrument method. Orthodontic implants likely to fracture upon rotational torque, and impacted fractured fragments, should preferably be detached with an ultrasonic instrument, because of the lesser associated bone loss and more rapid bone healing compared with a low-speed handpiece rotary instrument. PMID:29744149
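The group comparisons above rely on the non-parametric Kruskal-Wallis one-way analysis of variance. As a generic illustration (not the authors' code), the H statistic can be computed from pooled ranks; this minimal sketch uses midranks for ties and omits the tie-correction factor that full implementations such as SciPy apply:

```python
from itertools import chain

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (midranks for ties; no tie correction)."""
    data = list(chain.from_iterable(groups))
    n = len(data)
    # Assign midranks: tied values share the average of their ranks.
    order = sorted(range(n), key=lambda i: data[i])
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and data[order[j + 1]] == data[order[i]]:
            j += 1
        midrank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = midrank
        i = j + 1
    # H = 12/(N(N+1)) * sum(R_g^2 / n_g) - 3(N+1)
    h, start = 0.0, 0
    for g in groups:
        r = sum(ranks[start:start + len(g)])
        h += r * r / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)
```

The resulting H is then referred to a chi-squared distribution with (number of groups − 1) degrees of freedom to obtain a p-value.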
Larson, S.J.; Capel, P.D.; VanderLoop, A.G.
1996-01-01
Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction followed by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.
Fast 5DOF needle tracking in iOCT.
Weiss, Jakob; Rieke, Nicola; Nasseri, Mohammad Ali; Maier, Mathias; Eslami, Abouzar; Navab, Nassir
2018-06-01
Intraoperative optical coherence tomography (iOCT) is an increasingly available imaging technique for ophthalmic microsurgery that provides high-resolution cross-sectional information of the surgical scene. We propose to build on its desirable qualities and present a method for tracking the orientation and location of a surgical needle. Thereby, we enable analysis of instrument-tissue interaction directly in OCT space, without the complex multimodal calibration that would be required with traditional instrument tracking methods. The intersection of the needle with the iOCT scan is detected by a multistep ellipse fitting that takes advantage of the directionality of the modality. The geometric modeling allows us to take the ellipse parameters and feed them into a latency-aware estimator to infer the 5DOF pose during needle movement. Experiments on phantom data and ex vivo porcine eyes indicate that the algorithm retains angular precision, especially during lateral needle movement, and provides a more robust and consistent estimation than baseline methods. Using solely cross-sectional iOCT information, we are able to estimate a 5DOF pose of the instrument robustly in less than 5.4 ms on a CPU.
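The paper's estimator is not reproduced here, but the underlying geometry is standard: a cylindrical needle of radius r intersecting the planar B-scan produces an ellipse whose semi-minor axis equals r and whose semi-major axis equals r / sin θ, where θ is the angle between the needle axis and the scan plane. A minimal sketch of that relation (a hypothetical helper, not the authors' implementation):

```python
import math

def needle_tilt_from_ellipse(semi_major, semi_minor):
    """Angle (radians) between a cylindrical needle's axis and the B-scan plane.

    A cylinder of radius r cut by a plane yields an ellipse with
    semi-minor axis r and semi-major axis r / sin(theta), so
    theta = asin(semi_minor / semi_major).
    """
    if semi_major < semi_minor:
        raise ValueError("semi_major must be >= semi_minor")
    return math.asin(semi_minor / semi_major)
```

For example, a cross-section twice as long as it is wide implies a 30-degree tilt; a circular cross-section (equal axes) means the needle is perpendicular to the scan plane.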
Evaluation of Three Field-Based Methods for Quantifying Soil Carbon
Izaurralde, Roberto C.; Rice, Charles W.; Wielopolski, Lucian; Ebinger, Michael H.; Reeves, James B.; Thomson, Allison M.; Francis, Barry; Mitra, Sudeep; Rappaport, Aaron G.; Etchevers, Jorge D.; Sayre, Kenneth D.; Govaerts, Bram; McCarty, Gregory W.
2013-01-01
Three advanced technologies to measure soil carbon (C) density (g C m−2) are deployed in the field and the results compared against those obtained by the dry combustion (DC) method. The advanced methods are: a) Laser Induced Breakdown Spectroscopy (LIBS), b) Diffuse Reflectance Fourier Transform Infrared Spectroscopy (DRIFTS), and c) Inelastic Neutron Scattering (INS). The measurements and soil samples were acquired at Beltsville, MD, USA and at Centro International para el Mejoramiento del Maíz y el Trigo (CIMMYT) at El Batán, Mexico. At Beltsville, soil samples were extracted at three depth intervals (0–5, 5–15, and 15–30 cm) and processed for analysis in the field with the LIBS and DRIFTS instruments. The INS instrument determined soil C density to a depth of 30 cm via scanning and stationary measurements. Subsequently, soil core samples were analyzed in the laboratory for soil bulk density (kg m−3), C concentration (g kg−1) by DC, and results reported as soil C density (kg m−2). Results from each technique were derived independently and contributed to a blind test against results from the reference (DC) method. A similar procedure was employed at CIMMYT in Mexico, but only with the LIBS and DRIFTS instruments. Following conversion to common units, we found that the LIBS, DRIFTS, and INS results can be compared directly with those obtained by the DC method. The first two methods, like the standard DC method, require soil sampling and need soil bulk density information to convert soil C concentrations to soil C densities, while the INS method does not require soil sampling. We conclude that, in comparison with the DC method, the three instruments (a) showed acceptable performances, although further work is needed to improve calibration techniques, and (b) demonstrated their portability and their capacity to perform under field conditions. PMID:23383225
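The conversion the abstract refers to — turning per-layer C concentration (g C per kg soil) into areal C density (kg C per m²) using bulk density and layer thickness — is a simple weighted sum over the sampled depth intervals. A generic sketch with illustrative (not the study's) numbers:

```python
def soil_c_density(layers):
    """Areal soil C density (kg C m^-2) from per-layer measurements.

    layers: iterable of (depth_m, bulk_density_kg_m3, c_conc_g_per_kg)
    tuples, one per depth interval (e.g. 0-5, 5-15, 15-30 cm).
    """
    total = 0.0
    for depth_m, bulk_density, conc_g_kg in layers:
        # (kg soil per m^2 in this layer) * (kg C per kg soil)
        total += depth_m * bulk_density * (conc_g_kg / 1000.0)
    return total

# Hypothetical profile: 0-5 cm, 5-15 cm, 15-30 cm intervals
profile = [(0.05, 1300, 20), (0.10, 1400, 12), (0.15, 1500, 8)]
density = soil_c_density(profile)  # ≈ 4.78 kg C m^-2
```

This is exactly why the INS method, which senses C density directly, can skip both soil sampling and the bulk-density measurement.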
Workshop on "New Engineering Technology Transfer in Orthopaedic Surgery".
1999-04-01
anatomy is required to remain oriented to the tunnel-view of the arthroscope, as is the ability to triangulate the instruments through proprioception...required. Methods: Eight limbs from fresh, whole cadavers were used. Motion data were collected with the OPTOTRAK infrared motion analysis system
Teachers' Self-Efficacy: Progressing Qualitative Analysis
ERIC Educational Resources Information Center
Glackin, Melissa; Hohenstein, Jill
2018-01-01
Teacher self-efficacy has predominantly been explored using quantitative instruments such as Likert scale-based questionnaires. Several researchers have questioned these methods, suggesting they offer only a limited view of the concept. This paper considers their claim by exploring the self-efficacy of UK secondary science teachers participating…
Experiential Education: Enhancing the Liberal Arts Curriculum
ERIC Educational Resources Information Center
Graff, Elissa R.
2013-01-01
This mixed-methods study combined a survey instrument, the Learning Style Inventory (LSI), with a selected group of follow-up interviews for the purpose of determining how experiential practices affected student engagement and learning. Quantitative data analysis established students' preferences for more active involvement in learning practices…
Measurement of influence function using swing arm profilometer and laser tracker.
Jing, Hongwei; King, Christopher; Walker, David
2010-03-01
We present a novel method to accurately measure 3D polishing influence functions by using a swing arm profilometer (SAP) and a laser tracker. The laser tracker is used to align the SAP and measure the parameters of the SAP setup before measuring the influence function. The instruments and the measurement method are described, together with a measurement uncertainty analysis. An influence function deliberately produced with an asymmetric form, in order to create a challenging test, is measured and compared with that of a commercial 3D profilometer. The SAP result is 48.2 µm in PV, 7.271 mm(3) in volume. The 3D profilometer result is 48.4 µm in PV, 7.289 mm(3) in volume. The forms of the two results show excellent correlation. This gives confidence in the viability of the SAP method for larger influence functions out of range of the commercial instrument.
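The agreement quoted above (48.2 vs 48.4 µm PV; 7.271 vs 7.289 mm³ volume) corresponds to sub-percent relative differences, which is easy to verify:

```python
def percent_diff(value, reference):
    """Relative difference of `value` versus `reference`, in percent."""
    return abs(value - reference) / reference * 100.0

# Reported SAP vs commercial 3D profilometer results
pv_diff = percent_diff(48.2, 48.4)     # peak-to-valley (µm): ≈ 0.41 %
vol_diff = percent_diff(7.271, 7.289)  # removed volume (mm^3): ≈ 0.25 %
```

Both differences are well under half a percent, consistent with the "excellent correlation" claimed for the two measurements.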
Laboratory test for ice adhesion strength using commercial instrumentation.
Wang, Chenyu; Zhang, Wei; Siva, Adarsh; Tiea, Daniel; Wynne, Kenneth J
2014-01-21
A laboratory test method for evaluating ice adhesion has been developed employing a commercially available instrument normally used for dynamic mechanical analysis (TA RSA-III). This is the first laboratory ice adhesion test that does not require a custom-built apparatus. The upper grip range of ∼10 mm is an enabling feature that is essential for the test. The method involves removal of an ice cylinder from a polymer coating with a probe and the determination of peak removal force (Ps). To validate the test method, the strength of ice adhesion was determined for a prototypical glassy polymer, poly(methyl methacrylate). The distance of the probe from the PMMA surface has been identified as a critical variable for Ps. The new test provides a readily available platform for investigating fundamental surface characteristics affecting ice adhesion. In addition to the ice release test, PMMA coatings were characterized using DSC, DCA, and TM-AFM.
Measuring Aerosol Optical Properties with the Ozone Monitoring Instrument (OMI)
NASA Technical Reports Server (NTRS)
Veefkind, J. P.; Torres, O.; Syniuk, A.; Decae, R.; deLeeuw, G.
2003-01-01
The Ozone Monitoring Instrument (OMI) is the Dutch-Finnish contribution to the NASA EOS-Aura mission scheduled for launch in January 2004. OMI is an imaging spectrometer that will measure the backscattered solar radiance between 270 and 500 nm. With its relatively high spatial resolution (13 × 24 km at nadir) and daily global coverage, OMI will make a major contribution to our understanding of atmospheric chemistry and to climate research. OMI will provide data continuity with the TOMS instruments. One of the pleasant surprises of the TOMS data record was its information on aerosol properties. First, only the absorbing aerosol index, which is sensitive to elevated layers of aerosols such as desert dust and smoke, was derived. Recently these methods were further improved to yield aerosol optical thickness and single scattering albedo over land and ocean for 19 years of TOMS data (1979–1992, 1997–2002), making it one of the longest and most valuable time series for aerosols presently available. Such long time series are essential to quantify the effect of aerosols on the Earth's climate. The OMI instrument is better suited to measure aerosols than the TOMS instruments because of its smaller footprint and better spectral coverage. The better capabilities of OMI will enable us to provide an improved aerosol product, and the knowledge gained will also be used for further analysis of the aerosol record from TOMS. The OMI aerosol product currently being developed combines the TOMS experience and the multi-spectral techniques used in the visible and near infrared. The challenge for this new product is to provide aerosol optical thickness and single scattering albedo from the near ultraviolet to the visible (330–500 nm) over land and ocean. In this presentation the methods for deriving the OMI aerosol product will be presented. Some of the methods developed for OMI can already be applied to TOMS data, and results of such analysis will be shown.
Coleman, S; Nixon, J; Keen, J; Muir, D; Wilson, L; McGinnis, E; Stubbs, N; Dealey, C; Nelson, E A
2016-11-16
Variation in the development methods of Pressure Ulcer Risk Assessment Instruments has led to inconsistent inclusion of risk factors and concerns about content validity. A new evidence-based Risk Assessment Instrument, the Pressure Ulcer Risk Primary Or Secondary Evaluation Tool (PURPOSE-T), was developed as part of a National Institute for Health Research (NIHR) funded Pressure Ulcer Research Programme (PURPOSE: RP-PG-0407-10056). This paper reports the pre-test phase to assess and improve PURPOSE-T acceptability and usability and to confirm content validity. A descriptive study incorporating cognitive pre-testing methods and integration of service user views was undertaken over 3 cycles comprising PURPOSE-T training, a focus group and one-to-one think-aloud interviews. Clinical nurses from 2 acute and 2 community NHS Trusts were grouped according to job role. Focus group participants used 3 vignettes to complete PURPOSE-T assessments and then participated in the focus group. Think-aloud participants were interviewed during their completion of PURPOSE-T. After each pre-test cycle, analysis was undertaken and adjustments/improvements were made to PURPOSE-T in an iterative process. This incorporated the use of descriptive statistics for data completeness and decision rule compliance, and directed content analysis for interview and focus group data. Data were collected April 2012-June 2012. Thirty-four nurses participated in 3 pre-test cycles. Data from 3 focus groups and 12 think-aloud interviews, incorporating 101 PURPOSE-T assessments, led to changes to improve instrument content and design, flow and format, decision support and item-specific wording. Acceptability and usability were demonstrated by improved data completion and appropriate risk pathway allocation. The pre-test also confirmed content validity with clinical nurses. The pre-test was an important step in the development of the preliminary PURPOSE-T, and the methods used may have wider instrument-development application.
PURPOSE-T proposes a new approach to pressure ulcer risk assessment, incorporating a screening stage, the inclusion of skin status to distinguish between those who require primary prevention and those who require secondary prevention/treatment, and the use of colour to support pathway allocation and decision making. Further clinical evaluation is planned to assess the reliability and validity of PURPOSE-T and its impact on care processes and patient outcomes.
Miller, Jacob A; Balagamwala, Ehsan H; Berriochoa, Camille A; Angelov, Lilyana; Suh, John H; Benzel, Edward C; Mohammadi, Alireza M; Emch, Todd; Magnelli, Anthony; Godley, Andrew; Qi, Peng; Chao, Samuel T
2017-10-01
OBJECTIVE Spine stereotactic radiosurgery (SRS) is a safe and effective treatment for spinal metastases. However, it is unknown whether this highly conformal radiation technique is suitable at instrumented sites, given the potential for microscopic disease seeding. The authors hypothesized that spinal decompression with instrumentation is not associated with increased local failure (LF) following SRS. METHODS A 2:1 propensity-matched retrospective cohort study of patients undergoing SRS for spinal metastasis was conducted. Patients with less than 1 month of radiographic follow-up were excluded. Each SRS treatment with spinal decompression and instrumentation was propensity matched to 2 controls without decompression or instrumentation on the basis of demographic, disease-related, dosimetric, and treatment-site characteristics. Standardized differences were used to assess balance between matched cohorts. The primary outcome was the 12-month cumulative incidence of LF, with death as a competing risk. Lesions demonstrating any in-field progression were considered LFs. Secondary outcomes of interest were post-SRS pain flare, vertebral compression fracture, instrumentation failure, and any Grade ≥ 3 toxicity. Cumulative incidence analysis was used to estimate LF in each cohort, and the cohorts were compared via Gray's test. Multivariate competing-risks regression was then used to adjust for prespecified covariates. RESULTS Of 650 candidates for the control group, 166 were propensity matched to 83 patients with instrumentation. Baseline characteristics were well balanced. The median prescription dose was 16 Gy in each cohort. The 12-month cumulative incidence of LF was not statistically significantly different between cohorts (22.8% [instrumentation] vs 15.8% [control], p = 0.25).
After adjusting for the prespecified covariates in a multivariate competing-risks model, decompression with instrumentation did not contribute to a greater risk of LF (HR 1.21, 95% CI 0.74-1.98, p = 0.45). The incidences of post-SRS pain flare (11% vs 14%, p = 0.55), vertebral compression fracture (12% vs 22%, p = 0.04), and Grade ≥ 3 toxicity (1% vs 1%, p = 1.00) were not increased at instrumented sites. No instrumentation failures were observed. CONCLUSIONS In this propensity-matched analysis, LF and toxicity were similar among cohorts, suggesting that decompression with instrumentation does not significantly impact the efficacy or safety of spine SRS. Accordingly, spinal instrumentation may not be a contraindication to SRS. Future studies comparing SRS to conventional radiotherapy at instrumented sites in matched populations are warranted.
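The study's primary endpoint — cumulative incidence of local failure with death as a competing risk — is estimated nonparametrically (Aalen-Johansen style) rather than by the Kaplan-Meier complement, which would overestimate failure when deaths are frequent. A minimal illustrative sketch, not the study's actual analysis code:

```python
def cumulative_incidence(times, events, event_of_interest=1, horizon=None):
    """Nonparametric cumulative incidence with a competing risk.

    times:  follow-up time per subject
    events: 0 = censored, 1 = event of interest (e.g. local failure),
            2 = competing event (e.g. death)
    Returns the cumulative incidence of `event_of_interest` at `horizon`
    (default: after the last observed time).
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0  # overall event-free survival just before t
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        if horizon is not None and t > horizon:
            break
        d_int = d_all = censored = 0
        while i < len(data) and data[i][0] == t:
            if data[i][1] == event_of_interest:
                d_int += 1
            if data[i][1] != 0:
                d_all += 1
            else:
                censored += 1
            i += 1
        # Hazard of the event of interest at t, weighted by overall survival
        cif += surv * d_int / n_at_risk
        surv *= 1 - d_all / n_at_risk
        n_at_risk -= d_all + censored
    return cif
```

With no censoring and no competing events this reduces to one minus the Kaplan-Meier estimate; comparing two such curves formally is what Gray's test does.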
Guetterman, Timothy C; Creswell, John W; Wittink, Marsha; Barg, Fran K; Castro, Felipe G; Dahlberg, Britt; Watkins, Daphne C; Deutsch, Charles; Gallo, Joseph J
2017-01-01
Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a self-rated mixed methods skills assessment and provide validity evidence. The instrument taps six research domains: "Research question," "Design/approach," "Sampling," "Data collection," "Analysis," and "Dissemination." Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using the Cronbach alpha to assess reliability and an analysis of variance that compared a mixed methods experience index with assessment scores to assess criterion relatedness. Internal consistency reliability was high for the total set of items (0.95) and adequate (≥0.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (eg, published a mixed methods article) rated themselves as more skilled, which was statistically significant across the research domains. This self-rated mixed methods assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning.
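The reliability figures quoted above are Cronbach alpha values, computed from item variances and the variance of the summed scale. A generic sketch of that formula (not the authors' analysis code):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of item-score columns,
    each column holding one score per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per respondent across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly parallel items give alpha = 1; values near the reported 0.95 indicate highly internally consistent subscales, while the one subscale at 0.71 sits at the conventional "adequate" threshold.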
Pleil, Joachim; Giese, Roger
2017-09-07
Dogs have been studied for many years as a medical diagnostic tool to detect a pre-clinical disease state by sniffing emissions directly from a human or from an in vitro biological sample. Some of the studies report high sensitivity and specificity in blinded case-control studies. However, in these studies it is completely unknown which suites of chemicals the dogs detect and how they ultimately interpret this information amidst confounding background odors. Herein, we consider the advantages and challenges of canine olfaction for early (meaningful) detection of cancer, and propose an experimental concept to narrow the molecular signals used by the dog for sample classification to laboratory-based instrumental analysis. This serves two purposes: first, in contrast to dogs, analytical methods could be quickly up-scaled for high-throughput sampling. Second, the knowledge gained from identifying probative chemicals could be helpful in learning more about biochemical pathways and disease progression. We focus on exhaled breath aerosol, arguing that the semi-volatile fraction should be given more attention. Ultimately, we conclude that the interaction between dog-based and instrument-based research will be mutually beneficial and accelerate progress towards early detection of cancer by breath analysis.
Fluorescence Fluctuation Approaches to the Study of Adhesion and Signaling
Bachir, Alexia I.; Kubow, Kristopher E.; Horwitz, Alan R.
2013-01-01
Cell–matrix adhesions are large, multimolecular complexes through which cells sense and respond to their environment. They also mediate migration by serving as traction points and signaling centers, and they allow the cell to modify the surrounding tissue. Due to their fundamental role in cell behavior, adhesions are germane to nearly all major human health pathologies. However, adhesions are extremely complex and dynamic structures that include over 100 known interacting proteins and operate over multiple space (nm–µm) and time (ms–min) regimes. Fluorescence fluctuation techniques are well suited for studying adhesions. These methods are sensitive over a large spatiotemporal range and provide a wealth of information, including molecular transport dynamics, interactions, and stoichiometry, from a single time series. Earlier chapters in this volume have provided the theoretical background, instrumentation, and analysis algorithms for these techniques. In this chapter, we discuss their implementation in living cells to study adhesions in migrating cells. Although each technique and application has its own unique instrumentation and analysis requirements, we provide general guidelines for sample preparation, selection of imaging instrumentation, and optimization of data acquisition and analysis parameters. Finally, we review several recent studies that implement these techniques in the study of adhesions. PMID:23280111
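The core computation behind fluctuation techniques of this kind is the normalized autocorrelation of an intensity time series, G(τ) = ⟨δI(t)·δI(t+τ)⟩ / ⟨I⟩², whose amplitude and decay encode concentration and transport dynamics. A minimal generic sketch (illustrative only, not the chapter's analysis pipeline):

```python
def intensity_autocorrelation(trace, max_lag):
    """Normalized fluctuation autocorrelation G(tau) for lags 1..max_lag.

    trace: intensity time series I(t); fluctuations are taken about the
    series mean, and the covariance at each lag is divided by <I>^2.
    """
    n = len(trace)
    mean = sum(trace) / n
    d_i = [x - mean for x in trace]  # delta-I(t) = I(t) - <I>
    g = []
    for lag in range(1, max_lag + 1):
        cov = sum(d_i[i] * d_i[i + lag] for i in range(n - lag)) / (n - lag)
        g.append(cov / mean ** 2)
    return g
```

A constant trace has zero fluctuations and hence G(τ) = 0 at every lag; for real data, G(0⁺) scales inversely with the number of fluorescing molecules in the observation volume, which is how these methods extract stoichiometry.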