Sample records for additionally analytical instruments

  1. Developments in analytical instrumentation

    NASA Astrophysics Data System (ADS)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

  2. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  3. Study on bending behaviour of nickel–titanium rotary endodontic instruments by analytical and numerical analyses

    PubMed Central

    Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S

    2013-01-01

    Aim To develop analytical models and analyse the stress distribution and flexibility of nickel–titanium (NiTi) instruments subject to bending forces. Methodology The analytical method was used to analyse the behaviours of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived using Euler–Bernoulli nonlinear differential equations that took into account the screw pitch variation of these NiTi instruments. In addition, the nonlinear deformation analysis based on the analytical model and the finite element nonlinear analysis was carried out. Numerical results are obtained by carrying out a finite element method. Results According to analytical results, the maximum curvature of the instrument occurs near the instrument tip. Results of the finite element analysis revealed that the position of maximum von Mises stress was near the instrument tip. Therefore, the proposed analytical model can be used to predict the position of maximum curvature in the instrument where fracture may occur. Finally, results of analytical and numerical models were compatible. Conclusion The proposed analytical model was validated by numerical results in analysing bending deformation of NiTi instruments. The analytical model is useful in the design and analysis of instruments. The proposed theoretical model is effective in studying the flexibility of NiTi instruments. Compared with the finite element method, the analytical model can deal conveniently and effectively with the subject of bending behaviour of rotary NiTi endodontic instruments. PMID:23173762

  4. Microfabricated field calibration assembly for analytical instruments

    DOEpatents

    Robinson, Alex L [Albuquerque, NM; Manginell, Ronald P [Albuquerque, NM; Moorman, Matthew W [Albuquerque, NM; Rodacy, Philip J [Albuquerque, NM; Simonson, Robert J [Cedar Crest, NM

    2011-03-29

    A microfabricated field calibration assembly for use in calibrating analytical instruments and sensor systems. The assembly comprises a circuit board comprising one or more resistively heatable microbridge elements, an interface device that enables addressable heating of the microbridge elements, and, in some embodiments, a means for positioning the circuit board within an inlet structure of an analytical instrument or sensor system.

  5. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    Availability of analytical instruments for a methodical detection of known and unknown effluents imposes a serious hindrance in qualification and quantification. Several analytical instruments such as Elemental analyzer, ICP-MS, ICP-AES, EDXRF, ion chromatography, Electro-analytical instruments which are not only expensive but also time consuming, required maintenance, damaged essential parts replacement which are of serious concern. Move over for field study and instant detection installation of these instruments are not convenient to each and every place. Therefore, technique such as pre-concentration of metal ions especially for lean stream elaborated and justified. Chelation/sequestration is the key of immobilization technique which is simple, user friendly, most effective, least expensive, time efficient; easy to carry (10g - 20g vial) to experimental field/site has been demonstrated.

  6. Merging Old and New: An Instrumentation-Based Introductory Analytical Laboratory

    ERIC Educational Resources Information Center

    Jensen, Mark B.

    2015-01-01

    An instrumentation-based laboratory curriculum combining traditional unknown analyses with student-designed projects has been developed for an introductory analytical chemistry course. In the first half of the course, students develop laboratory skills and instrumental proficiency by rotating through six different instruments performing…

  7. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate if heteroskedasticity had a significant quantitative impact-naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Promoting Active Learning by Practicing the "Self-Assembly" of Model Analytical Instruments

    ERIC Educational Resources Information Center

    Algar, W. Russ; Krull, Ulrich J.

    2010-01-01

    In our upper-year instrumental analytical chemistry course, we have developed "cut-and-paste" exercises where students "build" models of analytical instruments from individual schematic images of components. These exercises encourage active learning by students. Instead of trying to memorize diagrams, students are required to think deeply about…

  9. Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Johnson, Dennis C.

    1980-01-01

    Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)

  10. Analytic Method for Computing Instrument Pointing Jitter

    NASA Technical Reports Server (NTRS)

    Bayard, David

    2003-01-01

    A new method of calculating the root-mean-square (rms) pointing jitter of a scientific instrument (e.g., a camera, radar antenna, or telescope) is introduced based on a state-space concept. In comparison with the prior method of calculating the rms pointing jitter, the present method involves significantly less computation. The rms pointing jitter of an instrument (the square root of the jitter variance shown in the figure) is an important physical quantity which impacts the design of the instrument, its actuators, controls, sensory components, and sensor- output-sampling circuitry. Using the Sirlin, San Martin, and Lucke definition of pointing jitter, the prior method of computing the rms pointing jitter involves a frequency-domain integral of a rational polynomial multiplied by a transcendental weighting function, necessitating the use of numerical-integration techniques. In practice, numerical integration complicates the problem of calculating the rms pointing error. In contrast, the state-space method provides exact analytic expressions that can be evaluated without numerical integration.

  11. Analytical methods for dating modern writing instrument inks on paper.

    PubMed

    Ezcurra, Magdalena; Góngora, Juan M G; Maguregui, Itxaso; Alonso, Rosa

    2010-04-15

    This work reviews the different analytical methods that have been proposed in the field of forensic dating of inks from different modern writing instruments. The reported works have been classified according to the writing instrument studied and the ink component analyzed in relation to aging. The study, done chronologically, shows the advances experienced in the ink dating field in the last decades. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  12. Method and apparatus for continuous fluid leak monitoring and detection in analytical instruments and instrument systems

    DOEpatents

    Weitz, Karl K [Pasco, WA; Moore, Ronald J [West Richland, WA

    2010-07-13

    A method and device are disclosed that provide for detection of fluid leaks in analytical instruments and instrument systems. The leak detection device includes a collection tube, a fluid absorbing material, and a circuit that electrically couples to an indicator device. When assembled, the leak detection device detects and monitors for fluid leaks, providing a preselected response in conjunction with the indicator device when contacted by a fluid.

  13. Assessment of economic instruments for countries with low municipal waste management performance: An approach based on the analytic hierarchy process.

    PubMed

    Kling, Maximilian; Seyring, Nicole; Tzanova, Polia

    2016-09-01

    Economic instruments provide significant potential for countries with low municipal waste management performance in decreasing landfill rates and increasing recycling rates for municipal waste. In this research, strengths and weaknesses of landfill tax, pay-as-you-throw charging systems, deposit-refund systems and extended producer responsibility schemes are compared, focusing on conditions in countries with low waste management performance. In order to prioritise instruments for implementation in these countries, the analytic hierarchy process is applied using results of a literature review as input for the comparison. The assessment reveals that pay-as-you-throw is the most preferable instrument when utility-related criteria are regarded (wb = 0.35; analytic hierarchy process distributive mode; absolute comparison) mainly owing to its waste prevention effect, closely followed by landfill tax (wb = 0.32). Deposit-refund systems (wb = 0.17) and extended producer responsibility (wb = 0.16) rank third and fourth, with marginal differences owing to their similar nature. When cost-related criteria are additionally included in the comparison, landfill tax seems to provide the highest utility-cost ratio. Data from literature concerning cost (contrary to utility-related criteria) is currently not sufficiently available for a robust ranking according to the utility-cost ratio. In general, the analytic hierarchy process is seen as a suitable method for assessing economic instruments in waste management. Independent from the chosen analytic hierarchy process mode, results provide valuable indications for policy-makers on the application of economic instruments, as well as on their specific strengths and weaknesses. Nevertheless, the instruments need to be put in the country-specific context along with the results of this analytic hierarchy process application before practical decisions are made. © The Author(s) 2016.

  14. Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.

    ERIC Educational Resources Information Center

    Gostowski, Rudy

    A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…

  15. Integrating laboratory robots with analytical instruments--must it really be so difficult?

    PubMed

    Kramer, G W

    1990-09-01

    Creating a reliable system from discrete laboratory instruments is often a task fraught with difficulties. While many modern analytical instruments are marvels of detection and data handling, attempts to create automated analytical systems incorporating such instruments are often frustrated by their human-oriented control structures and their egocentricity. The laboratory robot, while fully susceptible to these problems, extends such compatibility issues to the physical dimensions involving sample interchange, manipulation, and event timing. The workcell concept was conceived to describe the procedure and equipment necessary to carry out a single task during sample preparation. This notion can be extended to organize all operations in an automated system. Each workcell, no matter how complex its local repertoire of functions, must be minimally capable of accepting information (commands, data), returning information on demand (status, results), and being started, stopped, and reset by a higher level device. Even the system controller should have a mode where it can be directed by instructions from a higher level.

  16. Using chromatography – desorption method of manufacturing gas mixtures for analytical instruments calibration

    NASA Astrophysics Data System (ADS)

    Platonov, I. A.; Kolesnichenko, I. N.; Lange, P. K.

    2018-05-01

    In this paper, the chromatography desorption method of obtaining gas mixtures of known compositions stable for a time sufficient to calibrate analytical instruments is considered. The comparative analysis results of the preparation accuracy of gas mixtures with volatile organic compounds using diffusion, polyabarbotage and chromatography desorption methods are presented. It is shown that the application of chromatography desorption devices allows one to obtain gas mixtures that are stable for 10...60 hours in a dynamic condition. These gas mixtures contain volatile aliphatic and aromatic hydrocarbons with a concentration error of no more than 7%. It is shown that it is expedient to use such gas mixtures for analytical instruments calibration (chromatographs, spectrophotometers, etc.)

  17. 5. INTERIOR, INSTRUMENTATION AND CONTROL BUILDING ADDITION. Looking north. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. INTERIOR, INSTRUMENTATION AND CONTROL BUILDING ADDITION. Looking north. - Edwards Air Force Base, South Base Sled Track, Instrumentation & Control Building, South of Sled Track, Station "50" area, Lancaster, Los Angeles County, CA

  18. Development of collaborative-creative learning model using virtual laboratory media for instrumental analytical chemistry lectures

    NASA Astrophysics Data System (ADS)

    Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian

    2017-08-01

    The framework for teaching and learning in the 21st century was prepared with 4Cs criteria. Learning providing opportunity for the development of students' optimal creative skills is by implementing collaborative learning. Learners are challenged to be able to compete, work independently to bring either individual or group excellence and master the learning material. Virtual laboratory is used for the media of Instrumental Analytical Chemistry (Vis, UV-Vis-AAS etc) lectures through simulations computer application and used as a substitution for the laboratory if the equipment and instruments are not available. This research aims to design and develop collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures, to know the effectiveness of this design model adapting the Dick & Carey's model and Hannafin & Peck's model. The development steps of this model are: needs analyze, design collaborative-creative learning, virtual laboratory media using macromedia flash, formative evaluation and test of learning model effectiveness. While, the development stages of collaborative-creative learning model are: apperception, exploration, collaboration, creation, evaluation, feedback. Development of collaborative-creative learning model using virtual laboratory media can be used to improve the quality learning in the classroom, overcome the limitation of lab instruments for the real instrumental analysis. Formative test results show that the Collaborative-Creative Learning Model developed meets the requirements. The effectiveness test of students' pretest and posttest proves significant at 95% confidence level, t-test higher than t-table. It can be concluded that this learning model is effective to use for Instrumental Analytical Chemistry lectures.

  19. Validation of Reverse-Engineered and Additive-Manufactured Microsurgical Instrument Prototype.

    PubMed

    Singh, Ramandeep; Suri, Ashish; Anand, Sneh; Baby, Britty

    2016-12-01

    With advancements in imaging techniques, neurosurgical procedures are becoming highly precise and minimally invasive, thus demanding development of new ergonomically aesthetic instruments. Conventionally, neurosurgical instruments are manufactured using subtractive manufacturing methods. Such a process is complex, time-consuming, and impractical for prototype development and validation of new designs. Therefore, an alternative design process has been used utilizing blue light scanning, computer-aided designing, and additive manufacturing direct metal laser sintering (DMLS) for microsurgical instrument prototype development. Deviations of DMLS-fabricated instrument were studied by superimposing scan data of fabricated instrument with the computer-aided designing model. Content and concurrent validity of the fabricated prototypes was done by a group of 15 neurosurgeons by performing sciatic nerve anastomosis in small laboratory animals. Comparative scoring was obtained for the control and study instrument. T test was applied to the individual parameters and P values for force (P < .0001) and surface roughness (P < .01) were found to be statistically significant. These 2 parameters were further analyzed using objective measures. Results depicts that additive manufacturing by DMLS provides an effective method for prototype development. However, direct application of these additive-manufactured instruments in the operating room requires further validation. © The Author(s) 2016.

  20. Two-dimensional convolute integers for analytical instrumentation

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.

    1982-01-01

    As new analytical instruments and techniques emerge with increased dimensionality, a corresponding need is seen for data processing logic which can appropriately address the data. Two-dimensional measurements reveal enhanced unknown mixture analysis capability as a result of the greater spectral information content over two one-dimensional methods taken separately. It is noted that two-dimensional convolute integers are merely an extension of the work by Savitzky and Golay (1964). It is shown that these low-pass, high-pass and band-pass digital filters are truly two-dimensional and that they can be applied in a manner identical with their one-dimensional counterpart, that is, a weighted nearest-neighbor, moving average with zero phase shifting, convoluted integer (universal number) weighting coefficients.

  1. Novel approaches to the construction of miniaturized analytical instrumentation

    NASA Technical Reports Server (NTRS)

    Porter, Marc D.; Otoole, Ronald P.; Coldiron, Shelley J.; Deninger, William D.; Deinhammer, Randall S.; Burns, Stanley G.; Bastiaans, Glenn J.; Braymen, Steve D.; Shanks, Howard R.

    1992-01-01

    This paper focuses on the design, construction, preliminary testing, and potential applications of three forms of miniaturized analytical instrumentation. The first is an optical fiber instrument for monitoring pH and other cations in aqueous solutions. The instrument couples chemically selective indicators that were immobilized at porous polymeric films with a hardware package that provides the excitation light source, required optical components, and detection and data processing hardware. The second is a new form of a piezoelectric mass sensor. The sensor was fabricated by the deposition of a thin (5.5 micron) film of piezoelectric aluminum nitride (AIN). The completed deposition process yields a thin film resonator (TFR) that is shaped as a 400 micron square and supports a standing bulk acoustic wave in a longitudinal mode at frequencies of approx. 1 GHz. Various deposition and vapor sorption studies indicate that the mass sensitivity of the TFR's rival those of the most sensitive mass sensors currently available, though offering such performance in a markedly smaller device. The third couples a novel form of liquid chromatography with microlithographic miniaturization techniques. The status of the miniaturization effort, the goal of which is to achieve chip-scale separations, is briefly discussed.

  2. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  3. The development of an integrated assessment instrument for measuring analytical thinking and science process skills

    NASA Astrophysics Data System (ADS)

    Irwanto, Rohaeti, Eli; LFX, Endang Widjajanti; Suyanta

    2017-05-01

    This research aims to develop instrument and determine the characteristics of an integrated assessment instrument. This research uses 4-D model, which includes define, design, develop, and disseminate. The primary product is validated by expert judgment, tested it's readability by students, and assessed it's feasibility by chemistry teachers. This research involved 246 students of grade XI of four senior high schools in Yogyakarta, Indonesia. Data collection techniques include interview, questionnaire, and test. Data collection instruments include interview guideline, item validation sheet, users' response questionnaire, instrument readability questionnaire, and essay test. The results show that the integrated assessment instrument has Aiken validity value of 0.95. Item reliability was 0.99 and person reliability was 0.69. Teachers' response to the integrated assessment instrument is very good. Therefore, the integrated assessment instrument is feasible to be applied to measure the students' analytical thinking and science process skills.

  4. Development of internal controls for the Luminex instrument as part of a multiplex seven-analyte viral respiratory antibody profile.

    PubMed

    Martins, Thomas B

    2002-01-01

    The ability of the Luminex system to simultaneously quantitate multiple analytes from a single sample source has proven to be a feasible and cost-effective technology for assay development. In previous studies, my colleagues and I introduced two multiplex profiles consisting of 20 individual assays into the clinical laboratory. With the Luminex instrument's ability to classify up to 100 distinct microspheres, however, we have only begun to realize the enormous potential of this technology. By utilizing additional microspheres, it is now possible to add true internal controls to each individual sample. During the development of a seven-analyte serologic viral respiratory antibody profile, internal controls for detecting sample addition and interfering rheumatoid factor (RF) were investigated. To determine if the correct sample was added, distinct microspheres were developed for measuring the presence of sufficient quantities of immunoglobulin G (IgG) or IgM in the diluted patient sample. In a multiplex assay of 82 samples, the IgM verification control correctly identified 23 out of 23 samples with low levels (<20 mg/dl) of this antibody isotype. An internal control microsphere for RF detected 30 out of 30 samples with significant levels (>10 IU/ml) of IgM RF. Additionally, RF-positive samples causing false-positive adenovirus and influenza A virus IgM results were correctly identified. By exploiting the Luminex instrument's multiplexing capabilities, I have developed true internal controls to ensure correct sample addition and identify interfering RF as part of a respiratory viral serologic profile that includes influenza A and B viruses, adenovirus, parainfluenza viruses 1, 2, and 3, and respiratory syncytial virus. Since these controls are not assay specific, they can be incorporated into any serologic multiplex assay.

  5. Juicing the Juice: A Laboratory-Based Case Study for an Instrumental Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Schaber, Peter M.; Dinan, Frank J.; St. Phillips, Michael; Larson, Renee; Pines, Harvey A.; Larkin, Judith E.

    2011-01-01

    A young, inexperienced Food and Drug Administration (FDA) chemist is asked to distinguish between authentic fresh orange juice and suspected reconstituted orange juice falsely labeled as fresh. In an advanced instrumental analytical chemistry application of this case, inductively coupled plasma (ICP) spectroscopy is used to distinguish between the…

  6. Insights into the varnishes of historical musical instruments using synchrotron micro-analytical methods

    NASA Astrophysics Data System (ADS)

    Echard, J.-P.; Cotte, M.; Dooryhee, E.; Bertrand, L.

    2008-07-01

    Though ancient violins and other stringed instruments are often revered for the beauty of their varnishes, the varnishing techniques are not much known. In particular, very few detailed varnish analyses have been published so far. Since 2002, a research program at the Musée de la musique (Paris) is dedicated to a detailed description of varnishes on famous ancient musical instruments using a series of novel analytical methods. For the first time, results are presented on the study of the varnish from a late 16th century Venetian lute, using synchrotron micro-analytical methods. Identification of both organic and inorganic compounds distributed within the individual layers of a varnish microsample has been performed using spatially resolved synchrotron Fourier transform infrared microscopy. The univocal identification of the mineral phases is obtained through synchrotron powder X-ray diffraction. The materials identified may be of utmost importance to understand the varnishing process and its similarities with some painting techniques. In particular, the proteinaceous binding medium and the calcium sulfate components (bassanite and anhydrite) that have been identified in the lower layers of the varnish microsample could be related, to a certain extent, to the ground materials of earlier Italian paintings.

  7. Heat addition to a subsonic boundary layer: A preliminary analytical study

    NASA Technical Reports Server (NTRS)

    Macha, J. M.; Norton, D. J.

    1971-01-01

    A preliminary analytical study of the effects of heat addition to the subsonic boundary layer flow over a typical airfoil shape is presented. This phenomenon becomes of interest in the space shuttle mission since heat absorbed by the wing structure during re-entry will be rejected to the boundary layer during the subsequent low speed maneuvering and landing phase. A survey of existing literature and analytical solutions for both laminar and turbulent flow indicate that a heated surface generally destabilizes the boundary layer. Specifically, the boundary layer thickness is increased, the skin friction at the surface is decreased and the point of flow separation is moved forward. In addition, limited analytical results predict that the angle of attack at which a heated airfoil will stall is significantly less than the stall angle of an unheated wing. These effects could adversely affect the lift and drag, and thus the maneuvering capabilities of booster and orbiter shuttle vehicles.

  8. Analytical techniques and instrumentation: A compilation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information on developments in instrumentation is arranged into four sections: (1) instrumentation for analysis; (2) analysis of matter; (3) analysis of electrical and mechanical phenomena; and (4) structural analysis. Patent information for two of the instruments described is presented.

  9. Analytical evaluation of the automated galectin-3 assay on the Abbott ARCHITECT immunoassay instruments.

    PubMed

    Gaze, David C; Prante, Christian; Dreier, Jens; Knabbe, Cornelius; Collet, Corinne; Launay, Jean-Marie; Franekova, Janka; Jabor, Antonin; Lennartz, Lieselotte; Shih, Jessie; del Rey, Jose Manuel; Zaninotto, Martina; Plebani, Mario; Collinson, Paul O

    2014-06-01

    Galectin-3 is secreted from macrophages and binds and activates fibroblasts forming collagen. Tissue fibrosis is central to the progression of chronic heart failure (CHF). We performed a European multicentered evaluation of the analytical performance of the two-step routine and Short Turn-Around-Time (STAT) galectin-3 immunoassay on the ARCHITECT i1000SR, i2000SR, and i4000SR (Abbott Laboratories). We evaluated the assay precision and dilution linearity for both routine and STAT assays and compared serum and plasma, and fresh vs. frozen samples. The reference interval and biological variability were also assessed. Measurable samples were compared between ARCHITECT instruments and between the routine and STAT assays and also to a galectin-3 ELISA (BG Medicine). The total assay coefficient of variation (CV%) was 2.3%-6.2% and 1.7%-7.4% for the routine and STAT assays, respectively. Both assays demonstrated linearity up to 120 ng/mL. Galectin-3 concentrations were higher in plasma samples than in serum samples and correlated well between fresh and frozen samples (R=0.997), between the routine and STAT assays, between the ARCHITECT i1000 and i2000 instruments and with the galectin-3 ELISA. The reference interval on 627 apparently healthy individuals (53% male) yielded upper 95th and 97.5th percentiles of 25.2 and 28.4 ng/mL, respectively. Values were significantly lower in subjects younger than 50 years. The galectin-3 routine and STAT assays on the Abbott ARCHITECT instruments demonstrated good analytical performance. Further clinical studies are required to demonstrate the diagnostic and prognostic potential of this novel marker in patients with CHF.

  10. Utilizing global data to estimate analytical performance on the Sigma scale: A global comparative analysis of methods, instruments, and manufacturers through external quality assurance and proficiency testing programs.

    PubMed

    Westgard, Sten A

    2016-06-01

    To assess the analytical performance of instruments and methods through external quality assessment and proficiency testing data on the Sigma scale. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal. Then Sigma-metrics were calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of the chemistry analytes that are expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers in three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities. Quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself. It is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult this data as part of their decision-making process. To confirm that these assessments are stable and reliable, a longer term study should be conducted that examines more results over a longer time period. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  11. Analytical interference of drugs in clinical chemistry: I--Study of twenty drugs on seven different instruments.

    PubMed

    Letellier, G; Desjarlais, F

    1985-12-01

    We have investigated the effect of 20 drugs on the accuracy of results obtained from seven instruments now widely used in clinical biochemistry laboratories: Abbott VP, aca II, Cobas Bio, Ektachem 400, Hitachi 705, KDA and SMAC. Eleven to 18 constituents were analysed on each instrument. Our results lead us to the following conclusions: (1) only rarely does drug interference with a method lead to a clinically significant change in a measured value; (2) the magnitude of the change may relate linearly or non-linearly to the drug concentration but is usually independent of the target analyte concentration; (3) interference with a chemical reaction on one instrument does not always mean that the same reaction will be altered in the same way on other instruments; (4) no interferences were found for drugs with therapeutic levels in the low micro-molar range; (5) in most cases the interference could not be predicted from the chemical nature of drug.

  12. Sharing the Data along with the Responsibility: Examining an Analytic Scale-Based Model for Assessing School Climate.

    ERIC Educational Resources Information Center

    Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert

    This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…

  13. Analytical Chemistry in Russia.

    PubMed

    Zolotov, Yuri

    2016-09-06

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development and especially manufacturing of analytical instruments should be improved; in spite of this, there are several good domestic instruments and other satisfy some requirements. Russian AC has rather good historical roots.

  14. Instrumentation '79.

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1979

    1979-01-01

    Surveys the state of commerical development of analytical instrumentation as reflected by the Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. Includes optical spectroscopy, liquid chromatography, magnetic spectrometers, and x-ray. (Author/MA)

  15. Analytical relationships for prediction of the mechanical properties of additively manufactured porous biomaterials

    PubMed Central

    Hedayati, Reza

    2016-01-01

    Abstract Recent developments in additive manufacturing techniques have motivated an increasing number of researchers to study regular porous biomaterials that are based on repeating unit cells. The physical and mechanical properties of such porous biomaterials have therefore received increasing attention during recent years. One of the areas that have revived is analytical study of the mechanical behavior of regular porous biomaterials with the aim of deriving analytical relationships that could predict the relative density and mechanical properties of porous biomaterials, given the design and dimensions of their repeating unit cells. In this article, we review the analytical relationships that have been presented in the literature for predicting the relative density, elastic modulus, Poisson's ratio, yield stress, and buckling limit of regular porous structures based on various types of unit cells. The reviewed analytical relationships are used to compare the mechanical properties of porous biomaterials based on different types of unit cells. The major areas where the analytical relationships have improved during the recent years are discussed and suggestions are made for future research directions. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 3164–3174, 2016. PMID:27502358

  16. Analytical relationships for prediction of the mechanical properties of additively manufactured porous biomaterials.

    PubMed

    Zadpoor, Amir Abbas; Hedayati, Reza

    2016-12-01

    Recent developments in additive manufacturing techniques have motivated an increasing number of researchers to study regular porous biomaterials that are based on repeating unit cells. The physical and mechanical properties of such porous biomaterials have therefore received increasing attention during recent years. One of the areas that have revived is analytical study of the mechanical behavior of regular porous biomaterials with the aim of deriving analytical relationships that could predict the relative density and mechanical properties of porous biomaterials, given the design and dimensions of their repeating unit cells. In this article, we review the analytical relationships that have been presented in the literature for predicting the relative density, elastic modulus, Poisson's ratio, yield stress, and buckling limit of regular porous structures based on various types of unit cells. The reviewed analytical relationships are used to compare the mechanical properties of porous biomaterials based on different types of unit cells. The major areas where the analytical relationships have improved during the recent years are discussed and suggestions are made for future research directions. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 3164-3174, 2016. © 2016 The Authors Journal of Biomedical Materials Research Part A Published by Wiley Periodicals, Inc.

  17. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogens concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were instrumental and non-instrumental analytical techniques namely gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immuno sorbent assay (ELISA), radio immuno assay (RIA), yeast estrogen screen (YES) assay, and human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitates the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA analytical techniques, thanks to its lower detection limit and simplicity, its rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A.

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses onmore » validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.« less

  19. Integration of analytical instruments with computer scripting.

    PubMed

    Carvalho, Matheus C

    2013-08-01

    Automation of laboratory routines aided by computer software enables high productivity and is the norm nowadays. However, the integration of different instruments made by different suppliers is still difficult, because to accomplish it, the user must have knowledge of electronics and/or low-level programming. An alternative approach is to control different instruments without an electronic connection between them, relying only on their software interface on a computer. This can be achieved through scripting, which is the emulation of user operations (mouse clicks and keyboard inputs) on the computer. The main advantages of this approach are its simplicity, which enables people with minimal knowledge of computer programming to employ it, and its universality, which enables the integration of instruments made by different suppliers, meaning that the user is totally free to choose the devices to be integrated. Therefore, scripting can be a useful, accessible, and economic solution for laboratory automation.

  20. Application of the correlation constrained multivariate curve resolution alternating least-squares method for analyte quantitation in the presence of unexpected interferences using first-order instrumental data.

    PubMed

    Goicoechea, Héctor C; Olivieri, Alejandro C; Tauler, Romà

    2010-03-01

    Correlation constrained multivariate curve resolution-alternating least-squares is shown to be a feasible method for processing first-order instrumental data and achieve analyte quantitation in the presence of unexpected interferences. Both for simulated and experimental data sets, the proposed method could correctly retrieve the analyte and interference spectral profiles and perform accurate estimations of analyte concentrations in test samples. Since no information concerning the interferences was present in calibration samples, the proposed multivariate calibration approach including the correlation constraint facilitates the achievement of the so-called second-order advantage for the analyte of interest, which is known to be present for more complex higher-order richer instrumental data. The proposed method is tested using a simulated data set and two experimental data systems, one for the determination of ascorbic acid in powder juices using UV-visible absorption spectral data, and another for the determination of tetracycline in serum samples using fluorescence emission spectroscopy.

  1. Implementation and application of moving average as continuous analytical quality control instrument demonstrated for 24 routine chemistry assays.

    PubMed

    Rossum, Huub H van; Kemperman, Hans

    2017-07-26

    General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to lack of a simple method that allows optimization of MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated and were caused by ion selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup when an MA alarm required follow-up, a manageable number of MA alarms was generated that resulted in valuable MA alarms. For the management of MA alarms, several applications/requirements in the MA management software will simplify the use of MA procedures.

  2. Local density measurement of additive manufactured copper parts by instrumented indentation

    NASA Astrophysics Data System (ADS)

    Santo, Loredana; Quadrini, Fabrizio; Bellisario, Denise; Tedde, Giovanni Matteo; Zarcone, Mariano; Di Domenico, Gildo; D'Angelo, Pierpaolo; Corona, Diego

    2018-05-01

    Instrumented flat indentation has been used to evaluate local density of additive manufactured (AM) copper samples with different relative density. Indentations were made by using tungsten carbide (WC) flat pins with 1 mm diameter. Pure copper powders were used in a selective laser melting (SLM) machine to produce samples to test. By changing process parameters, samples density was changed from the relative density of 63% to 71%. Indentation tests were performed on the xy surface of the AM samples. In order to make a correlation between indentation test results and sample density, the indentation pressure at fixed displacement was selected. Results show that instrumented indentation is a valid technique to measure density distribution along the geometry of an SLM part. In fact, a linear trend between indentation pressure and sample density was found for the selected density range.

  3. Canine olfaction as an alternative to analytical instruments for ...

    EPA Pesticide Factsheets

    Recent literature has touted the use of canine olfaction as a diagnostic tool for identifying pre-clinical disease status, especially cancer and infection from biological media samples. Studies have shown a wide range of outcomes, ranging from almost perfect discrimination, all the way to essentially random results. This disparity is not likely to be a detection issue; dogs have been shown to have extremely sensitive noses as proven by their use for tracking, bomb detection and search and rescue. However, in contrast to analytical instruments, dogs are subject to boredom, fatigue, hunger and external distractions. These challenges are of particular importance in a clinical environment where task repetition is prized, but not as entertaining for a dog as chasing odours outdoors. The question addressed here is how to exploit the intrinsic sensitivity and simplicity of having a dog simply sniff out disease, in the face of variability in behavior and response. There is no argument that living cells emanate a variety of gas- and liquid-phase compounds as waste from normal metabolism, and that these compounds become easureable from various biological media including skin, blood, urine, breath, feces, etc. [1, 2] The overarching term for this phenomenon from the perspective of systems biology analysis is “cellular respiration”, which has become an important topic for the interpretation and documentation of the human exposome, the chemical counterpart to the genome.

  4. Automated novel high-accuracy miniaturized positioning system for use in analytical instrumentation

    NASA Astrophysics Data System (ADS)

    Siomos, Konstadinos; Kaliakatsos, John; Apostolakis, Manolis; Lianakis, John; Duenow, Peter

    1996-01-01

    The development of three-dimensional automotive devices (micro-robots) for applications in analytical instrumentation, clinical chemical diagnostics and advanced laser optics, depends strongly on the ability of such a device: firstly to be positioned with high accuracy, reliability, and automatically, by means of user friendly interface techniques; secondly to be compact; and thirdly to operate under vacuum conditions, free of most of the problems connected with conventional micropositioners using stepping-motor gear techniques. The objective of this paper is to develop and construct a mechanically compact computer-based micropositioning system for coordinated motion in the X-Y-Z directions with: (1) a positioning accuracy of less than 1 micrometer, (the accuracy of the end-position of the system is controlled by a hard/software assembly using a self-constructed optical encoder); (2) a heat-free propulsion mechanism for vacuum operation; and (3) synchronized X-Y motion.

  5. The relative responsiveness of test instruments can be estimated using a meta-analytic approach: an illustration with treatments for depression.

    PubMed

    Kounali, Daphne Z; Button, Katherine S; Lewis, Glyn; Ades, Anthony E

    2016-09-01

    We present a meta-analytic method that combines information on treatment effects from different instruments from a network of randomized trials to estimate instrument relative responsiveness. Five depression-test instruments [Beck Depression Inventory (BDI I/II), Patient Health Questionnaire (PHQ9), Hamilton Rating for Depression 17 and 24 items, Montgomery-Asberg Depression Rating] and three generic quality of life measures [EuroQoL (EQ-5D), SF36 mental component summary (SF36 MCS), and physical component summary (SF36 PCS)] were compared. Randomized trials of treatments for depression reporting outcomes on any two or more of these instruments were identified. Information on the within-trial ratios of standardized treatment effects was pooled across the studies to estimate relative responsiveness. The between-instrument ratios of standardized treatment effects vary across trials, with a coefficient of variation of 13% (95% credible interval: 6%, 25%). There were important differences between the depression measures, with PHQ9 being the most responsive instrument and BDI the least. Responsiveness of the EQ-5D and SF36 PCS was poor. SF36 MCS performed similarly to depression instruments. Information on relative responsiveness of several test instruments can be pooled across networks of trials reporting at least two outcomes, allowing comparison and ranking of test instruments that may never have been compared directly. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Analytical techniques for retrieval of atmospheric composition with the quadrupole mass spectrometer of the Sample Analysis at Mars instrument suite on Mars Science Laboratory

    NASA Astrophysics Data System (ADS)

    B. Franz, Heather; G. Trainer, Melissa; H. Wong, Michael; L. K. Manning, Heidi; C. Stern, Jennifer; R. Mahaffy, Paul; K. Atreya, Sushil; Benna, Mehdi; G. Conrad, Pamela; N. Harpold, Dan; A. Leshin, Laurie; A. Malespin, Charles; P. McKay, Christopher; Thomas Nolan, J.; Raaen, Eric

    2014-06-01

    The Sample Analysis at Mars (SAM) instrument suite is the largest scientific payload on the Mars Science Laboratory (MSL) Curiosity rover, which landed in Mars' Gale Crater in August 2012. As a miniature geochemical laboratory, SAM is well-equipped to address multiple aspects of MSL's primary science goal, characterizing the potential past or present habitability of Gale Crater. Atmospheric measurements support this goal through compositional investigations relevant to martian climate evolution. SAM instruments include a quadrupole mass spectrometer, a tunable laser spectrometer, and a gas chromatograph that are used to analyze martian atmospheric gases as well as volatiles released by pyrolysis of solid surface materials (Mahaffy et al., 2012). This report presents analytical methods for retrieving the chemical and isotopic composition of Mars' atmosphere from measurements obtained with SAM's quadrupole mass spectrometer. It provides empirical calibration constants for computing volume mixing ratios of the most abundant atmospheric species and analytical functions to correct for instrument artifacts and to characterize measurement uncertainties. Finally, we discuss differences in volume mixing ratios of the martian atmosphere as determined by SAM (Mahaffy et al., 2013) and Viking (Owen et al., 1977; Oyama and Berdahl, 1977) from an analytical perspective. Although the focus of this paper is atmospheric observations, much of the material concerning corrections for instrumental effects also applies to reduction of data acquired with SAM from analysis of solid samples. The Sample Analysis at Mars (SAM) instrument measures the composition of the martian atmosphere. Rigorous calibration of SAM's mass spectrometer was performed with relevant gas mixtures. Calibration included derivation of a new model to correct for electron multiplier effects. Volume mixing ratios for Ar and N2 obtained with SAM differ from those obtained with Viking. Differences between SAM and Viking

  7. Analytical instrumentation infrastructure for combinatorial and high-throughput development of formulated discrete and gradient polymeric sensor materials arrays

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Hassib, Lamyaa

    2005-06-01

    Multicomponent polymer-based formulations of optical sensor materials are difficult and time consuming to optimize using conventional approaches. To address these challenges, our long-term goal is to determine relationships between sensor formulation and sensor response parameters using new scientific methodologies. As the first step, we have designed and implemented an automated analytical instrumentation infrastructure for combinatorial and high-throughput development of polymeric sensor materials for optical sensors. Our approach is based on the fabrication and performance screening of discrete and gradient sensor arrays. Simultaneous formation of multiple sensor coatings into discrete 4×6, 6×8, and 8×12 element arrays (3-15μL volume per element) and their screening provides not only a well-recognized acceleration in the screening rate, but also considerably reduces or even eliminates sources of variability, which are randomly affecting sensors response during a conventional one-at-a-time sensor coating evaluation. The application of gradient sensor arrays provides additional capabilities for rapid finding of the optimal formulation parameters.

  8. Caesium sputter ion source compatible with commercial SIMS instruments

    NASA Astrophysics Data System (ADS)

    Belykh, S. F.; Palitsin, V. V.; Veryovkin, I. V.; Kovarsky, A. P.; Chang, R. J. H.; Adriaens, A.; Dowsett, M.; Adams, F.

    2006-07-01

    A simple design for a caesium sputter cluster ion source compatible with commercially available secondary ion mass spectrometers is reported. This source has been tested with the Cameca IMS 4f instrument using the cluster Si n- and Cu n- ions, and will shortly be retrofitted to the floating low energy ion gun (FLIG) of the type used on the Cameca 4500/4550 quadruple instruments. Our experiments with surface characterization and depth profiling conducted to date demonstrate improvements of analytical capabilities of the SIMS instrument due to the non-additive enhancement of secondary ion emission and shorter ion ranges of polyatomic projectiles compared to atomic ions with the same impact energy.

  9. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are showing an increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at the early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017. PMID:29850370

  10. Percutaneous Dorsal Instrumentation of Vertebral Burst Fractures: Value of Additional Percutaneous Intravertebral Reposition—Cadaver Study

    PubMed Central

    Krüger, Antonio; Schmuck, Maya; Noriega, David C.; Ruchholtz, Steffen; Baroud, Gamal; Oberkircher, Ludwig

    2015-01-01

    Purpose. The treatment of vertebral burst fractures is still controversial. The aim of the study is to evaluate the purpose of additional percutaneous intravertebral reduction when combined with dorsal instrumentation. Methods. In this biomechanical cadaver study twenty-eight spine segments (T11-L3) were used (male donors, mean age 64.9 ± 6.5 years). Burst fractures of L1 were generated using a standardised protocol. After fracture all spines were allocated to four similar groups and randomised according to surgical techniques (posterior instrumentation; posterior instrumentation + intravertebral reduction device + cement augmentation; posterior instrumentation + intravertebral reduction device without cement; and intravertebral reduction device + cement augmentation). After treatment, 100000 cycles (100–600 N, 3 Hz) were applied using a servohydraulic loading frame. Results. Overall anatomical restoration was better in all groups where the intravertebral reduction device was used (p < 0.05). In particular, it was possible to restore central endplates (p > 0.05). All techniques decreased narrowing of the spinal canal. After loading, clearance could be maintained in all groups fitted with the intravertebral reduction device. Narrowing increased in the group treated with dorsal instrumentation. Conclusions. For height and anatomical restoration, the combination of an intravertebral reduction device with dorsal instrumentation showed significantly better results than sole dorsal instrumentation. PMID:26137481

  11. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such valications. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validatemore » analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approches to validation and varying descriptions of the valication process at different stages in method development. This discussion focuses on valication and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.« less

  12. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work a new modification of the standard addition method called "net analyte signal standard addition method (NASSAM)" is presented for the simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of standard addition method with those of net analyte signal concept. The method can be applied for the determination of analyte in the presence of known interferents. The accuracy of the predictions against H-point standard addition method is not dependent on the shape of the analyte and interferent spectra. The method was successfully applied to simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.

  13. Determination of Ca content of coral skeleton by analyte additive method using the LIBS technique

    NASA Astrophysics Data System (ADS)

    Haider, A. F. M. Y.; Khan, Z. H.

    2012-09-01

    Laser-induced breakdown spectroscopic (LIBS) technique was used to study the elemental profile of coral skeletons. Apart from calcium and carbon, which are the main elemental constituents of coral skeleton, elements like Sr, Na, Mg, Li, Si, Cu, Ti, K, Mn, Zn, Ba, Mo, Br and Fe were detected in the coral skeletons from the Inani Beach and the Saint Martin's island of Bangladesh and the coral from the Philippines. In addition to the qualitative analysis, the quantitative analysis of the main elemental constituent, calcium (Ca), was done. The result shows the presence of (36.15±1.43)% by weight of Ca in the coral skeleton collected from the Inani Beach, Cox's Bazar, Bangladesh. It was determined by using six calibration curves, drawn for six emission lines of Ca I (428.301 nm, 428.936 nm, 431.865 nm, 443.544 nm, 443.569 nm, and 445.589 nm), by standard analyte additive method. Also from AAS measurement the percentage content of Ca in the same sample of coral skeleton obtained was 39.87% by weight which compares fairly well with the result obtained by the analyte additive method.

  14. Accommodating subject and instrument variations in spectroscopic determinations

    DOEpatents

    Haas, Michael J [Albuquerque, NM; Rowe, Robert K [Corrales, NM; Thomas, Edward V [Albuquerque, NM

    2006-08-29

    A method and apparatus for measuring a biological attribute, such as the concentration of an analyte, particularly a blood analyte in tissue such as glucose. The method utilizes spectrographic techniques in conjunction with an improved instrument-tailored or subject-tailored calibration model. In a calibration phase, calibration model data is modified to reduce or eliminate instrument-specific attributes, resulting in a calibration data set modeling intra-instrument or intra-subject variation. In a prediction phase, the prediction process is tailored for each target instrument separately using a minimal number of spectral measurements from each instrument or subject.

  15. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program inmore » analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaption of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet- chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.« less

  16. A novel second-order standard addition analytical method based on data processing with multidimensional partial least-squares and residual bilinearization.

    PubMed

    Lozano, Valeria A; Ibañez, Gabriela A; Olivieri, Alejandro C

    2009-10-05

    In the presence of analyte-background interactions and a significant background signal, both second-order multivariate calibration and standard addition are required for successful analyte quantitation achieving the second-order advantage. This report discusses a modified second-order standard addition method, in which the test data matrix is subtracted from the standard addition matrices, and quantitation proceeds via the classical external calibration procedure. It is shown that this novel data processing method allows one to apply not only parallel factor analysis (PARAFAC) and multivariate curve resolution-alternating least-squares (MCR-ALS), but also the recently introduced and more flexible partial least-squares (PLS) models coupled to residual bilinearization (RBL). In particular, the multidimensional variant N-PLS/RBL is shown to produce the best analytical results. The comparison is carried out with the aid of a set of simulated data, as well as two experimental data sets: one aimed at the determination of salicylate in human serum in the presence of naproxen as an additional interferent, and the second one devoted to the analysis of danofloxacin in human serum in the presence of salicylate.

  17. Pods: a Powder Delivery System for Mars In-situ Organic, Mineralogic and Isotopic Analysis Instruments

    NASA Technical Reports Server (NTRS)

    Saha, C. P.; Bryson, C. E.; Sarrazin, P.; Blake, D. F.

    2005-01-01

    Many Mars in situ instruments require fine-grained high-fidelity samples of rocks or soil. Included are instruments for the determination of mineralogy as well as organic and isotopic chemistry. Powder can be obtained as a primary objective of a sample collection system (e.g., by collecting powder as a surface is abraded by a rotary abrasion tool (RAT)), or as a secondary objective (e.g, by collecting drill powder as a core is drilled). In the latter case, a properly designed system could be used to monitor drilling in real time as well as to deliver powder to analytical instruments which would perform complementary analyses to those later performed on the intact core. In addition, once a core or other sample is collected, a system that could transfer intelligently collected subsamples of power from the intact core to a suite of analytical instruments would be highly desirable. We have conceptualized, developed and tested a breadboard Powder Delivery System (PoDS) intended to satisfy the collection, processing and distribution requirements of powder samples for Mars in-situ mineralogic, organic and isotopic measurement instruments.

  18. Science Update: Analytical Chemistry.

    ERIC Educational Resources Information Center

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  19. A generic standard additions based method to determine endogenous analyte concentrations by immunoassays to overcome complex biological matrix interference.

    PubMed

    Pang, Susan; Cowen, Simon

    2017-12-13

    We describe a novel generic method to derive the unknown endogenous concentrations of analyte within complex biological matrices (e.g. serum or plasma) based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated using a numerical method. This approach obviates conventional relative quantification using an internal standard curve and need for calibrant diluent, and takes into account the individual matrix interference on the immunoassay by spiking the test sample itself. This technique is based on standard additions for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.

  20. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  1. Instrument Attitude Precision Control

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan

    2004-01-01

    A novel approach is presented in this paper to analyze attitude precision and control for an instrument gimbaled to a spacecraft subject to an internal disturbance caused by a moving component inside the instrument. Nonlinear differential equations of motion for some sample cases are derived and solved analytically to gain insight into the influence of the disturbance on the attitude pointing error. A simple control law is developed to eliminate the instrument pointing error caused by the internal disturbance. Several cases are presented to demonstrate and verify the concept presented in this paper.

  2. [Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].

    PubMed

    Tanaka, Koichi

    2016-02-01

    Analytical instruments for clinical use are commonly required to confirm the compounds and forms related to diseases with the highest possible sensitivity, quantitative performance, and specificity and minimal invasiveness within a short time, easily, and at a low cost. Advancements of technical innovation for Mass Spectrometer (MS) have led to techniques that meet such requirements. Besides confirming known substances, other purposes and advantages of MS that are not fully known to the public are using MS as a tool to discover unknown phenomena and compounds. An example is clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of protein and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, development of new medicines, etc. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), and other individual basic technologies, we succeeded in discovering new disease biomarker candidates for Alzheimer's disease, cancer, etc. Further contribution of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communications with the medical front.

  3. The Use and Abuse of Limits of Detection in Environmental Analytical Chemistry

    PubMed Central

    Brown, Richard J. C.

    2008-01-01

    The limit of detection (LoD) serves as an important method performance measure that is useful for the comparison of measurement techniques and the assessment of likely signal to noise performance, especially in environmental analytical chemistry. However, the LoD is only truly related to the precision characteristics of the analytical instrument employed for the analysis and the content of analyte in the blank sample. This article discusses how other criteria, such as sampling volume, can serve to distort the quoted LoD artificially and make comparison between various analytical methods inequitable. In order to compare LoDs between methods properly, it is necessary to state clearly all of the input parameters relating to the measurements that have been used in the calculation of the LoD. Additionally, the article discusses that the use of LoDs in contexts other than the comparison of the attributes of analytical methods, in particular when reporting analytical results, may be confusing, less informative than quoting the actual result with an accompanying statement of uncertainty, and may act to bias descriptive statistics. PMID:18690384

  4. Analytical balance-based Faraday magnetometer

    NASA Astrophysics Data System (ADS)

    Riminucci, Alberto; Uhlarz, Marc; De Santis, Roberto; Herrmannsdörfer, Thomas

    2017-03-01

    We introduce a Faraday magnetometer based on an analytical balance in which we were able to apply magnetic fields up to 0.14 T. We calibrated it with a 1 mm Ni sphere previously characterized in a superconducting quantum interference device (SQUID) magnetometer. The proposed magnetometer reached a theoretical sensitivity of 3 × 10-8 A m2. We demonstrated its operation on magnetic composite scaffolds made of poly(ɛ-caprolactone)/iron-doped hydroxyapatite. To confirm the validity of the method, we measured the same scaffold properties in a SQUID magnetometer. The agreement between the two measurements was within 5% at 0.127 T and 12% at 24 mT. With the addition, for a small cost, of a permanent magnet and computer controlled linear translators, we were thus able to assemble a Faraday magnetometer based on an analytical balance, which is a virtually ubiquitous instrument. This will make simple but effective magnetometry easily accessible to most laboratories, in particular, to life sciences ones, which are increasingly interested in magnetic materials.

  5. The rise of environmental analytical chemistry as an interdisciplinary activity.

    PubMed

    Brown, Richard

    2009-07-01

    Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and the Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions of new data, methods, case studies, and instrumentation, or new interpretations and developments of existing data, case studies, methods, and instrumentation, relating to analytical and/or environmental chemistry, to the Analytical and Environmental Chemistry domains, are welcome and will be considered equally.

  6. Analytical performance of benchtop total reflection X-ray fluorescence instrumentation for multielemental analysis of wine samples

    NASA Astrophysics Data System (ADS)

    Dalipi, Rogerta; Marguí, Eva; Borgese, Laura; Bilo, Fabjola; Depero, Laura E.

    2016-06-01

    Recent technological improvements have led to a widespread adoption of benchtop total reflection X-ray fluorescence systems (TXRF) for analysis of liquid samples. However, benchtop TXRF systems usually present limited sensitivity compared with high-scale instrumentation which can restrict its application in some fields. The aim of the present work was to evaluate and compare the analytical capabilities of two TXRF systems, equipped with low power Mo and W target X-ray tubes, for multielemental analysis of wine samples. Using the Mo-TXRF system, the detection limits for most elements were one order of magnitude lower than those attained using the W-TXRF system. For the detection of high Z elements like Cd and Ag, however, W-TXRF remains a very good option due to the possibility of K-Lines detection. Accuracy and precision of the obtained results have been evaluated analyzing spiked real wine samples and comparing the TXRF results with those obtained by inductively coupled plasma emission spectroscopy (ICP-OES). In general, good agreement was obtained between ICP-OES and TXRF results for the analysis of both red and white wine samples except for light elements (i.e., K) which TXRF concentrations were underestimated. However, a further achievement of analytical quality of TXRF results can be achieved if wine analysis is performed after dilution of the sample with de-ionized water.

  7. 40 CFR 1066.130 - Measurement instrument calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Measurement instrument calibrations... (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.130 Measurement instrument calibrations and verifications. The...

  8. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  9. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  10. H-point standard additions method for simultaneous determination of sulfamethoxazole and trimethoprim in pharmaceutical formulations and biological fluids with simultaneous addition of two analytes

    NASA Astrophysics Data System (ADS)

    Givianrad, M. H.; Saber-Tehrani, M.; Aberoomand-Azar, P.; Mohagheghian, M.

    2011-03-01

    The applicability of H-point standard additions method (HPSAM) to the resolving of overlapping spectra corresponding to the sulfamethoxazole and trimethoprim is verified by UV-vis spectrophotometry. The results show that the H-point standard additions method with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media. The results of applying the H-point standard additions method showed that the two drugs could be determined simultaneously with the concentration ratios of sulfamethoxazole to trimethoprim varying from 1:18 to 16:1 in the mixed samples. Also, the limits of detections were 0.58 and 0.37 μmol L -1 for sulfamethoxazole and trimethoprim, respectively. In addition the means of the calculated RSD (%) were 1.63 and 2.01 for SMX and TMP, respectively in synthetic mixtures. The proposed method has been successfully applied to the simultaneous determination of sulfamethoxazole and trimethoprim in some synthetic, pharmaceutical formulation and biological fluid samples.

  11. Potential sources of analytical bias and error in selected trace element data-quality analyses

    USGS Publications Warehouse

    Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.

    2016-09-28

    Potential sources of analytical bias and error associated with laboratory analyses for selected trace elements where concentrations were greater in filtered samples than in paired unfiltered samples were evaluated by U.S. Geological Survey (USGS) Water Quality Specialists in collaboration with the USGS National Water Quality Laboratory (NWQL) and the Branch of Quality Systems (BQS).Causes for trace-element concentrations in filtered samples to exceed those in associated unfiltered samples have been attributed to variability in analytical measurements, analytical bias, sample contamination either in the field or laboratory, and (or) sample-matrix chemistry. These issues have not only been attributed to data generated by the USGS NWQL but have been observed in data generated by other laboratories. This study continues the evaluation of potential analytical bias and error resulting from matrix chemistry and instrument variability by evaluating the performance of seven selected trace elements in paired filtered and unfiltered surface-water and groundwater samples collected from 23 sampling sites of varying chemistries from six States, matrix spike recoveries, and standard reference materials.Filtered and unfiltered samples have been routinely analyzed on separate inductively coupled plasma-mass spectrometry instruments. Unfiltered samples are treated with hydrochloric acid (HCl) during an in-bottle digestion procedure; filtered samples are not routinely treated with HCl as part of the laboratory analytical procedure. To evaluate the influence of HCl on different sample matrices, an aliquot of the filtered samples was treated with HCl. The addition of HCl did little to differentiate the analytical results between filtered samples treated with HCl from those samples left untreated; however, there was a small, but noticeable, decrease in the number of instances where a particular trace-element concentration was greater in a filtered sample than in the associated

  12. The Influence of Modern Instrumentation on the Analytical and General Chemistry Curriculum at Bates College

    NASA Astrophysics Data System (ADS)

    Wenzel, Thomas J.

    2001-09-01

    The availability of state-of-the-art instruments such as high performance liquid chromatograph, gas chromatograph-mass spectrometer, inductively coupled plasma-atomic emission spectrometer, capillary electrophoresis system, and ion chromatograph obtained through four Instructional Laboratory Improvement and one Course, Curriculum, and Laboratory Improvement grants from the National Science Foundation has led to a profound change in the structure of the analytical and general chemistry courses at Bates College. Students in both sets of courses now undertake ambitious, semester-long, small-group projects. The general chemistry course, which fulfills the prerequisite requirement for all upper-level chemistry courses, focuses on the connection between chemistry and the study of the environment. The projects provide students with an opportunity to conduct a real scientific investigation. The projects emphasize problem solving, team work, and communication, while still fostering the development of important laboratory skills. Cooperative learning is also used extensively in the classroom portion of these courses.

  13. Net analyte signal standard addition method for simultaneous determination of sulphadiazine and trimethoprim in bovine milk and veterinary medicines.

    PubMed

    Hajian, Reza; Mousavi, Esmat; Shams, Nafiseh

    2013-06-01

    Net analyte signal standard addition method has been used for the simultaneous determination of sulphadiazine and trimethoprim by spectrophotometry in some bovine milk and veterinary medicines. The method combines the advantages of standard addition method with the net analyte signal concept which enables the extraction of information concerning a certain analyte from spectra of multi-component mixtures. This method has some advantages such as the use of a full spectrum realisation, therefore it does not require calibration and prediction step and only a few measurements require for the determination. Cloud point extraction based on the phenomenon of solubilisation used for extraction of sulphadiazine and trimethoprim in bovine milk. It is based on the induction of micellar organised media by using Triton X-100 as an extraction solvent. At the optimum conditions, the norm of NAS vectors increased linearly with concentrations in the range of 1.0-150.0 μmolL(-1) for both sulphadiazine and trimethoprim. The limits of detection (LOD) for sulphadiazine and trimethoprim were 0.86 and 0.92 μmolL(-1), respectively. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Post-analytical Issues in Hemostasis and Thrombosis Testing.

    PubMed

    Favaloro, Emmanuel J; Lippi, Giuseppe

    2017-01-01

    Analytical concerns within hemostasis and thrombosis testing are continuously decreasing. This is essentially attributable to modern instrumentation, improvements in test performance and reliability, as well as the application of appropriate internal quality control and external quality assurance measures. Pre-analytical issues are also being dealt with in some newer instrumentation, which are able to detect hemolysis, icteria and lipemia, and, in some cases, other issues related to sample collection such as tube under-filling. Post-analytical issues are generally related to appropriate reporting and interpretation of test results, and these are the focus of the current overview, which provides a brief description of these events, as well as guidance for their prevention or minimization. In particular, we propose several strategies for improved post-analytical reporting of hemostasis assays and advise that this may provide the final opportunity to prevent serious clinical errors in diagnosis.

  15. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  16. H-point standard additions method for simultaneous determination of sulfamethoxazole and trimethoprim in pharmaceutical formulations and biological fluids with simultaneous addition of two analytes.

    PubMed

    Givianrad, M H; Saber-Tehrani, M; Aberoomand-Azar, P; Mohagheghian, M

    2011-03-01

    The applicability of H-point standard additions method (HPSAM) to the resolving of overlapping spectra corresponding to the sulfamethoxazole and trimethoprim is verified by UV-vis spectrophotometry. The results show that the H-point standard additions method with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media. The results of applying the H-point standard additions method showed that the two drugs could be determined simultaneously with the concentration ratios of sulfamethoxazole to trimethoprim varying from 1:18 to 16:1 in the mixed samples. Also, the limits of detections were 0.58 and 0.37 μmol L(-1) for sulfamethoxazole and trimethoprim, respectively. In addition the means of the calculated RSD (%) were 1.63 and 2.01 for SMX and TMP, respectively in synthetic mixtures. The proposed method has been successfully applied to the simultaneous determination of sulfamethoxazole and trimethoprim in some synthetic, pharmaceutical formulation and biological fluid samples. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355

  18. Continuing evolution of in-vitro diagnostic instrumentation

    NASA Astrophysics Data System (ADS)

    Cohn, Gerald E.

    2000-04-01

    The synthesis of analytical instrumentation and analytical biochemistry technologies in modern in vitro diagnostic instrumentation continues to generate new systems with improved performance and expanded capability. Detection modalities have expanded to include multichip modes of fluorescence, scattering, luminescence and reflectance so as to accommodate increasingly sophisticated immunochemical and nucleic acid based reagent systems. The time line graph of system development now extends from the earliest automated clinical spectrophotometers through molecule recognition assays and biosensors to the new breakthroughs of biochip and DNA diagnostics. This brief review traces some of the major innovations in the evolution of system technologies and previews the conference program.

  19. Instrumental Analysis in Environmental Chemistry - Gas Phase Detection Systems

    ERIC Educational Resources Information Center

    Stedman, Donald H.; Meyers, Philip A.

    1974-01-01

    Discusses advances made in chemical analysis instrumentation used in environmental monitoring. This first of two articles is concerned with analytical instrumentation in which detection and dispersion depend ultimately on the properties of gaseous molecules. (JR)

  20. [Survey of analytical works for drugs at emergency and critical care centers with high-performance instruments provided by the Ministry of Health and Welfare (at present: Ministry of Health, Labour, and Welfare) in fiscal 1998--continuation of survey with 2008 survey results as point of reference].

    PubMed

    Saito, Takeshi; Tominaga, Aya; Nozawa, Mayu; Unei, Hiroko; Hatano, Yayoi; Fujita, Yuji; Iseki, Ken; Hori, Yasushi

    2013-09-01

    In a 2008 survey of the 73 emergency and critical care centers around the nation that were equipped with the drug and chemical analytical instrument provided by the Ministry of Welfare (currently the Ministry of Health, Labour, and Welfare) in 1998, 36 of those facilities were using the analytical instruments. Of these 36 facilities, a follow-up survey of the 17 facilities that recorded 50 or analyses per year. Responses were gained from 16 of the facilities and we learned that of those, 14 facilities (87.5%) were conducting analyses using the instrument. There was a positive mutual correlation between the annual number of cases of the 14 facilities conducting analyses with the instrument and the number of work hours. Depending on the instrument in use, average analytical instrument parts and maintenance expenses were roughly three million yen and consumables required a maximum three million yen for analysis of 51-200 cases per year. From this, we calculate that such expenses can be covered under the allowed budget for advanced emergency and critical care centers of 5,000 NHI points (1 point = 10 yen). We found there were few facilities using the instrument for all 15 of the toxic substances recommended for testing by the Japanese Society for Clinical Toxicology. There tended to be no use of the analytical instrument for compounds with no toxicology cases. However, flexible responses were noted at each facility in relation to frequently analyzed compounds. It is thought that a reevaluation of compounds subject to analysis is required.

  1. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  2. Selected Analytical Methods for Environmental Remediation and Recovery (SAM) - Home

    EPA Pesticide Factsheets

    The SAM Home page provides access to all information provided in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), and includes a query function allowing users to search methods by analyte, sample type and instrumentation.

  3. Selecting Suicide Ideation Assessment Instruments: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Erford, Bradley T.; Jackson, Jessica; Bardhoshi, Gerta; Duncan, Kelly; Atalay, Zumra

    2018-01-01

    Psychometric meta-analyses and reviews were provided for four commonly used suicidal ideation instruments: the Beck Scale for Suicide Ideation, the Suicide Ideation Questionnaire, the Suicide Probability Scale, and Columbia--Suicide Severity Rating Scale. Practical and technical issues and best use recommendations for screening and outcome…

  4. Electrochemical Detection of Multiple Bioprocess Analytes

    NASA Technical Reports Server (NTRS)

    Rauh, R. David

    2010-01-01

    An apparatus that includes highly miniaturized thin-film electrochemical sensor array has been demonstrated as a prototype of instruments for simultaneous detection of multiple substances of interest (analytes) and measurement of acidity or alkalinity in bioprocess streams. Measurements of pH and of concentrations of nutrients and wastes in cell-culture media, made by use of these instruments, are to be used as feedback for optimizing the growth of cells or the production of desired substances by the cultured cells. The apparatus is designed to utilize samples of minimal volume so as to minimize any perturbation of monitored processes. The apparatus can function in a potentiometric mode (for measuring pH), an amperometric mode (detecting analytes via oxidation/reduction reactions), or both. The sensor array is planar and includes multiple thin-film microelectrodes covered with hydrous iridium oxide. The oxide layer on each electrode serves as both a protective and electrochemical transducing layer. In its transducing role, the oxide provides electrical conductivity for amperometric measurement or pH response for potentiometric measurement. The oxide on an electrode can also serve as a matrix for one or more enzymes that render the electrode sensitive to a specific analyte. In addition to transducing electrodes, the array includes electrodes for potential control. The array can be fabricated by techniques familiar to the microelectronics industry. The sensor array is housed in a thin-film liquid-flow cell that has a total volume of about 100 mL. The flow cell is connected to a computer-controlled subsystem that periodically draws samples from the bioprocess stream to be monitored. Before entering the cell, each 100-mL sample is subjected to tangential-flow filtration to remove particles. In the present version of the apparatus, the electrodes are operated under control by a potentiostat and are used to simultaneously measure the pH and the concentration of glucose

  5. Mars Analytical Microimager

    NASA Astrophysics Data System (ADS)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  6. Spectral multivariate calibration without laboratory prepared or determined reference analyte values.

    PubMed

    Ottaway, Josh; Farrell, Jeremy A; Kalivas, John H

    2013-02-05

    An essential part to calibration is establishing the analyte calibration reference samples. These samples must characterize the sample matrix and measurement conditions (chemical, physical, instrumental, and environmental) of any sample to be predicted. Calibration usually requires measuring spectra for numerous reference samples in addition to determining the corresponding analyte reference values. Both tasks are typically time-consuming and costly. This paper reports on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory prepared or determined reference values. Instead, an analyte pure component spectrum is used in conjunction with nonanalyte spectra for calibration. Nonanalyte spectra can be from different sources including pure component interference samples, blanks, and constant analyte samples. The approach is also applicable to calibration maintenance when the analyte pure component spectrum is measured in one set of conditions and nonanalyte spectra are measured in new conditions. The PCTR method balances the trade-offs between calibration model shrinkage and the degree of orthogonality to the nonanalyte content (model direction) in order to obtain accurate predictions. Using visible and near-infrared (NIR) spectral data sets, the PCTR results are comparable to those obtained using ridge regression (RR) with reference calibration sets. The flexibility of PCTR also allows including reference samples if such samples are available.

  7. Foundations of measurement and instrumentation

    NASA Technical Reports Server (NTRS)

    Warshawsky, Isidore

    1990-01-01

    The user of instrumentation has provided an understanding of the factors that influence instrument performance, selection, and application, and of the methods of interpreting and presenting the results of measurements. Such understanding is prerequisite to the successful attainment of the best compromise among reliability, accuracy, speed, cost, and importance of the measurement operation in achieving the ultimate goal of a project. Some subjects covered are dimensions; units; sources of measurement error; methods of describing and estimating accuracy; deduction and presentation of results through empirical equations, including the method of least squares; experimental and analytical methods of determining the static and dynamic behavior of instrumentation systems, including the use of analogs.

  8. Instrumentation for analytical scale supercritical fluid chromatography.

    PubMed

    Berger, Terry A

    2015-11-20

    Analytical scale supercritical fluid chromatography (SFC) is largely a sub-discipline of high performance liquid chromatography (HPLC), in that most of the hardware and software can be used for either technique. The aspects that separate the 2 techniques stem from the use of carbon dioxide (CO2) as the main component of the mobile phase in SFC. The high compressibility and low viscosity of CO2 mean that pumps, and autosamplers designed for HPLC either need to be modified or an alternate means of dealing with compressibility needs to be found. The inclusion of a back pressure regulator and a high pressure flow cell for any UV-Vis detector are also necessary. Details of the various approaches, problems and solutions are described. Characteristics, such as adiabatic vs. isothermal compressibility, thermal gradients, and refractive index issues are dealt with in detail. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Digital Education Governance: Data Visualization, Predictive Analytics, and "Real-Time" Policy Instruments

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    Educational institutions and governing practices are increasingly augmented with digital database technologies that function as new kinds of policy instruments. This article surveys and maps the landscape of digital policy instrumentation in education and provides two detailed case studies of new digital data systems. The Learning Curve is a…

  10. Development of TPS flight test and operational instrumentation

    NASA Technical Reports Server (NTRS)

    Carnahan, K. R.; Hartman, G. J.; Neuner, G. J.

    1975-01-01

    Thermal and flow sensor instrumentation was developed for use as an integral part of the space shuttle orbiter reusable thermal protection system. The effort was performed in three tasks: a study to determine the optimum instruments and instrument installations for the space shuttle orbiter RSI and RCC TPS; tests and/or analysis to determine the instrument installations to minimize measurement errors; and analysis using data from the test program for comparison to analytical methods. A detailed review of existing state of the art instrumentation in industry was performed to determine the baseline for the departure of the research effort. From this information, detailed criteria for thermal protection system instrumentation were developed.

  11. Development and validation of a multi-analyte method for the regulatory control of carotenoids used as feed additives in fish and poultry feed.

    PubMed

    Vincent, Ursula; Serano, Federica; von Holst, Christoph

    2017-08-01

    Carotenoids are used in animal nutrition mainly as sensory additives that favourably affect the colour of fish, birds and food of animal origin. Various analytical methods exist for their quantification in compound feed, reflecting the different physico-chemical characteristics of the carotenoid and the corresponding feed additives. They may be natural products or specific formulations containing the target carotenoids produced by chemical synthesis. In this study a multi-analyte method was developed that can be applied to the determination of all 10 carotenoids currently authorised within the European Union for compound feedingstuffs. The method functions regardless of whether the carotenoids have been added to the compound feed via natural products or specific formulations. It is comprised of three steps: (1) digestion of the feed sample with an enzyme; (2) pressurised liquid extraction; and (3) quantification of the analytes by reversed-phase HPLC coupled to a photodiode array detector in the visible range. The method was single-laboratory validated for poultry and fish feed covering a mass fraction range of the target analyte from 2.5 to 300 mg kg - 1 . The following method performance characteristics were obtained: the recovery rate varied from 82% to 129% and precision expressed as the relative standard deviation of intermediate precision varied from 1.6% to 15%. Based on the acceptable performance obtained in the validation study, the multi-analyte method is considered fit for the intended purpose.

  12. Recommendations for fluorescence instrument qualification: the new ASTM Standard Guide.

    PubMed

    DeRose, Paul C; Resch-Genger, Ute

    2010-03-01

    Aimed at improving quality assurance and quantitation for modern fluorescence techniques, ASTM International (ASTM) is about to release a Standard Guide for Fluorescence, reviewed here. The guide's main focus is on steady state fluorometry, for which available standards and instrument characterization procedures are discussed along with their purpose, suitability, and general instructions for use. These include the most relevant instrument properties needing qualification, such as linearity and spectral responsivity of the detection system, spectral irradiance reaching the sample, wavelength accuracy, sensitivity or limit of detection for an analyte, and day-to-day performance verification. With proper consideration of method-inherent requirements and limitations, many of these procedures and standards can be adapted to other fluorescence techniques. In addition, procedures for the determination of other relevant fluorometric quantities including fluorescence quantum yields and fluorescence lifetimes are briefly introduced. The guide is a clear and concise reference geared for users of fluorescence instrumentation at all levels of experience and is intended to aid in the ongoing standardization of fluorescence measurements.

  13. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples.

    PubMed

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart; Larsen, Pia Bükmann

    2017-12-01

    Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytic stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability in incurred samples. We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10min. Immediately after centrifugation, initial concentration of analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10h in singletons. The concentration from reanalysis were normalized to initial concentration (t=0). Internal acceptance criteria for bias and total error were used to determine stability of each analyte. Additionally, evaporation from the decapped blood collection tubes and the residual platelet count in the plasma after centrifugation were quantified. We report a post-analytical stability of most routine analytes of ≥8h and do therefore - with few exceptions - suggest a standard 8hour-time limit for reordering and reanalysis of analytes in incurred samples. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  14. Accurate mass measurements and their appropriate use for reliable analyte identification.

    PubMed

    Godfrey, A Ruth; Brenton, A Gareth

    2012-09-01

    Accurate mass instrumentation is becoming increasingly available to non-expert users. This data can be mis-used, particularly for analyte identification. Current best practice in assigning potential elemental formula for reliable analyte identification has been described with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and Chemspider.

  15. Instrumental Surveillance of Water Quality.

    ERIC Educational Resources Information Center

    Miller, J. A.; And Others

    The role analytical instrumentation performs in the surveillance and control of the quality of water resources is reviewed. Commonly performed analyses may range from simple tests for physical parameters to more highly sophisticated radiological or spectrophotometric methods. This publication explores many of these types of water quality analyses…

  16. An Inexpensive, Open-Source USB Arduino Data Acquisition Device for Chemical Instrumentation.

    PubMed

    Grinias, James P; Whitfield, Jason T; Guetschow, Erik D; Kennedy, Robert T

    2016-07-12

    Many research and teaching labs rely on USB data acquisition devices to collect voltage signals from instrumentation. However, these devices can be cost-prohibitive (especially when large numbers are needed for teaching labs) and require software to be developed for operation. In this article, we describe the development and use of an open-source USB data acquisition device (with 16-bit acquisition resolution) built using simple electronic components and an Arduino Uno that costs under $50. Additionally, open-source software written in Python is included so that data can be acquired using nearly any PC or Mac computer with a simple USB connection. Use of the device was demonstrated for a sophomore-level analytical experiment using GC and a CE-UV separation on an instrument used for research purposes.

  17. Hydrothermal Alteration Mineralogy Characterized Through Multiple Analytical Methods: Implications for Mars

    NASA Astrophysics Data System (ADS)

    Black, S.; Hynek, B. M.; Kierein-Young, K. S.; Avard, G.; Alvarado-Induni, G.

    2015-12-01

    Proper characterization of mineralogy is an essential part of geologic interpretation. This process becomes even more critical when attempting to interpret the history of a region remotely, via satellites and/or landed spacecraft. Orbiters and landed missions to Mars carry with them a wide range of analytical tools to aid in the interpretation of Mars' geologic history. However, many instruments make a single type of measurement (e.g., APXS: elemental chemistry; XRD: mineralogy), and multiple data sets must be utilized to develop a comprehensive understanding of a sample. Hydrothermal alteration products often exist in intimate mixtures, and vary widely across a site due to changing pH, temperature, and fluid/gas chemistries. These characteristics require that we develop a detailed understanding regarding the possible mineral mixtures that may exist, and their detectability in different instrument data sets. This comparative analysis study utilized several analytical methods on existing or planned Mars rovers (XRD Raman, LIBS, Mössbauer, and APXS) combined with additional characterization (thin section, VNIR, XRF, SEM-EMP) to develop a comprehensive suite of data for hydrothermal alteration products collected from Poás and Turrialba volcanoes in Costa Rica. Analyzing the same samples across a wide range of instruments allows for direct comparisons of results, and identification of instrumentation "blind spots." This provides insight into the ability of in-situ analyses to comprehensively characterize sites on Mars exhibiting putative hydrothermal characteristics, such as the silica and sulfate deposits at Gusev crater [eg: Squyres et al., 2008], as well as valuable information for future mission planning and data interpretation. References: Squyres et al. (2008), Detection of Silica-Rich Deposits on Mars, Science, 320, 1063-1067, doi:10.1126/science.1155429.

  18. The role of light microscopy in aerospace analytical laboratories

    NASA Technical Reports Server (NTRS)

    Crutcher, E. R.

    1977-01-01

    Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems and the solution of those that develop necessitate the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize and often identify the cause of a problem in 5-15 minutes with confirmatory tests generally less than one hour. Light microscopy has and will make a very significant contribution to the analytical capabilities of aerospace laboratories.

  19. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    PubMed

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.

  20. INSTRUMENTS MEASURING PERCEIVED RACISM/RACIAL DISCRIMINATION: REVIEW AND CRITIQUE OF FACTOR ANALYTIC TECHNIQUES

    PubMed Central

    Atkins, Rahshida

    2015-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis. PMID:25626225

  1. Analytical research and development for the Whitney Programs. Automation and instrumentation. Computer automation of the Cary Model 17I spectrophotometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haugen, G.R.; Bystroff, R.I.; Downey, R.M.

    1975-09-01

    In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)$sub 4$, (thd = 2,2,6,6-tetramethylheptane-3,5-drone); study of the vaporization of U(C$sub 8$H$sub 8$)$sub 2$; determination of ethylenic unsaturation in polyimide resins; and, semimicrodetermination of hydroxylmore » and amino groups with pyromellitic dianhydride (PMDA). (JGB)« less

  2. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses, which plays an important role in our everyday lives. Volatile compounds are, for example, characteristics of food where some of them can be perceivable by humans because of their aroma. They have a great influence on the decision making of consumers when they choose to use a product or not. In the case where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food product is characterized by analytical means to provide a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product leading to developing new products, which are more acceptable by consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although requirements are simple, low-cost technology is an attractive research target in this domain; most of the data are generated with very high-resolution analytical instruments. Such data gathered based on different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the question of the application of chemometrics for aroma analysis but also of the use of different analytical instruments in this field, highlighting the research needed for future focus.

  3. Comparison of analytical tools appropriate for identification of proteinaceous additives in historical mortars.

    PubMed

    Krizova, Iva; Schultz, Julia; Nemec, Ivan; Cabala, Radomir; Hynek, Radovan; Kuckova, Stepanka

    2018-01-01

    Natural organic additives such as eggs, lard, resins, and oils have been added to mortars since ancient times, because the ancient builders knew of their positive effect on the mortar quality. The tradition of adding organic materials to mortars was commonly handed down only verbally for thousands years. However, this practice disappeared in the nineteenth century, when the usage of modern materials started. Today, one of the most recent topics in the industry of building materials is the reusing of natural organic materials and searching for the forgotten ancient recipes. The research of the old technological approaches involves currently the most advanced analytical techniques and methods. This paper is focussed on testing the possibility of identification of proteinaceous additives in historical mortars and model mortar samples containing blood, bone glue, curd, eggs and gelatine, by Fourier transform infrared (FTIR) and Raman spectroscopy, gas chromatography - mass spectrometry (GC-MS), matrix-assisted laser desorption/ionisation-time of flight mass spectrometry (MALDI-TOF MS), liquid chromatography-electrospray ionisation-quadrupole-time of flight mass spectrometry (LC-ESI-Q-TOF MS) and enzyme-linked immunosorbent assay (ELISA). All these methods were applied to the mortar sample taken from the interior of the medieval (sixteenth century) castle in Namest nad Oslavou in the Czech Republic and their comparison contributed to the rough estimation of the protein additive content in the mortar. The obtained results demonstrate that only LC-ESI-Q-TOF MS, MALDI-TOF MS and ELISA have the sufficiently low detection limits that enable the reliable identification of collagens in historical mortars. Graphical abstract Proteomics analyses of historical mortars.

  4. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be one highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  5. Tunable lasers and their application in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Steinfeld, J. I.

    1975-01-01

    The impact that laser techniques might have in chemical analysis is examined. Absorption, scattering, and heterodyne detection is considered. Particular emphasis is placed on the advantages of using frequency-tunable sources, and dye solution lasers are regarded as the outstanding example of this type of laser. Types of spectroscopy that can be carried out with lasers are discussed along with the ultimate sensitivity or minimum detectable concentration of molecules that can be achieved with each method. Analytical applications include laser microprobe analysis, remote sensing and instrumental methods such as laser-Raman spectroscopy, atomic absorption/fluorescence spectrometry, fluorescence assay techniques, optoacoustic spectroscopy, and polarization measurements. The application of lasers to spectroscopic methods of analysis would seem to be a rewarding field both for research in analytical chemistry and for investments in instrument manufacturing.

  6. Cross-instrument Analysis Correlation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy R.

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another such as SEM, microscopes, micro x-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed to easily enter the position of fiducials and locations of interest such that in a future session in the same of different instrument the positions of interest can be re-found through using the known location fiducials in the current and reference session to transform the point into the current sessions coordinate system. The software is dialog boxmore » driven guiding the user through the necessary data entry and program choices. Information is stored in a series of text based extensible markup language (XML) files.« less

  7. Ion Mobility-Derived Collision Cross Section As an Additional Measure for Lipid Fingerprinting and Identification

    PubMed Central

    2014-01-01

    Despite recent advances in analytical and computational chemistry, lipid identification remains a significant challenge in lipidomics. Ion-mobility spectrometry provides an accurate measure of the molecules’ rotationally averaged collision cross-section (CCS) in the gas phase and is thus related to ionic shape. Here, we investigate the use of CCS as a highly specific molecular descriptor for identifying lipids in biological samples. Using traveling wave ion mobility mass spectrometry (MS), we measured the CCS values of over 200 lipids within multiple chemical classes. CCS values derived from ion mobility were not affected by instrument settings or chromatographic conditions, and they were highly reproducible on instruments located in independent laboratories (interlaboratory RSD < 3% for 98% of molecules). CCS values were used as additional molecular descriptors to identify brain lipids using a variety of traditional lipidomic approaches. The addition of CCS improved the reproducibility of analysis in a liquid chromatography-MS workflow and maximized the separation of isobaric species and the signal-to-noise ratio in direct-MS analyses (e.g., “shotgun” lipidomics and MS imaging). These results indicate that adding CCS to databases and lipidomics workflows increases the specificity and selectivity of analysis, thus improving the confidence in lipid identification compared to traditional analytical approaches. The CCS/accurate-mass database described here is made publicly available. PMID:25495617

  8. Establishment of a reference collection of additives and an analytical handbook of reference data to support enforcement of EU regulations on food contact plastics.

    PubMed

    van Lierop, B; Castle, L; Feigenbaum, A; Ehlert, K; Boenke, A

    1998-10-01

    A collection has been made of additives that are required as analytical standards for enforcement of European Union legislation on food contact plastics. The 100 additives have been characterized by mass spectrometry, infra-red spectroscopy and proton nuclear magnetic resonance spectroscopy to provide reference spectra. Gas chromatographic retention times have been recorded to facilitate identification by retention index. This information has been further supplemented by physico-chemical data. Finally, chromatographic methods have been used to indicate the presence of any impurities in the commercial chemicals. Samples of the reference substances are available on request and the collection of spectra and other information will be made available in printed format and on-line through the Internet. This paper gives an overview of the work done to establish the reference collection and the spectral atlas, which together will assist enforcement laboratories in the characterization of plastics and the selection of analytical methods for additives that may migrate.

  9. High-sensitivity ESCA instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, R.D.; Herglotz, H.K.; Lee, J.D.

    1973-01-01

    A new electron spectroscopy for chemical analysis (ESCA) instrument has been developed to provide high sensitivity and efficient operation for laboratory analysis of composition and chemical bonding in very thin surface layers of solid samples. High sensitivity is achieved by means of the high-intensity, efficient x-ray source described by Davies and Herglotz at the 1968 Denver X-Ray Conference, in combination with the new electron energy analyzer described by Lee at the 1972 Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. A sample chamber designed to provide for rapid introduction and replacement of samples has adequate facilities for various sample treatmentsmore » and conditiouing followed immediately by ESCA analysis of the sample. Examples of application are presented, demonstrating the sensitivity and resolution achievable with this instrument. Its usefulness in trace surface analysis is shown and some chemical shifts'' measured by the instrument are compared with those obtained by x-ray spectroscopy. (auth)« less

  10. Serendipity: Genesis of the Electrochemical Instrumentation at Princeton Applied Research Corporation

    ERIC Educational Resources Information Center

    Flato, J. B.

    2007-01-01

    Princeton Applied Research Corporation (PAR) was a small electronic instrument company in early 1960s but once they entered electrochemistry they were very successful. Since then they have developed and designed successful instruments with their tremendous knowledge and have made great contribution to the field of analytical chemistry.

  11. Development of Internal Controls for the Luminex Instrument as Part of a Multiplex Seven-Analyte Viral Respiratory Antibody Profile

    PubMed Central

    Martins, Thomas B.

    2002-01-01

    The ability of the Luminex system to simultaneously quantitate multiple analytes from a single sample source has proven to be a feasible and cost-effective technology for assay development. In previous studies, my colleagues and I introduced two multiplex profiles consisting of 20 individual assays into the clinical laboratory. With the Luminex instrument’s ability to classify up to 100 distinct microspheres, however, we have only begun to realize the enormous potential of this technology. By utilizing additional microspheres, it is now possible to add true internal controls to each individual sample. During the development of a seven-analyte serologic viral respiratory antibody profile, internal controls for detecting sample addition and interfering rheumatoid factor (RF) were investigated. To determine if the correct sample was added, distinct microspheres were developed for measuring the presence of sufficient quantities of immunoglobulin G (IgG) or IgM in the diluted patient sample. In a multiplex assay of 82 samples, the IgM verification control correctly identified 23 out of 23 samples with low levels (<20 mg/dl) of this antibody isotype. An internal control microsphere for RF detected 30 out of 30 samples with significant levels (>10 IU/ml) of IgM RF. Additionally, RF-positive samples causing false-positive adenovirus and influenza A virus IgM results were correctly identified. By exploiting the Luminex instrument’s multiplexing capabilities, I have developed true internal controls to ensure correct sample addition and identify interfering RF as part of a respiratory viral serologic profile that includes influenza A and B viruses, adenovirus, parainfluenza viruses 1, 2, and 3, and respiratory syncytial virus. Since these controls are not assay specific, they can be incorporated into any serologic multiplex assay. PMID:11777827

  12. Surgical instrument similarity metrics and tray analysis for multi-sensor instrument identification

    NASA Astrophysics Data System (ADS)

    Glaser, Bernhard; Schellenberg, Tobias; Franke, Stefan; Dänzer, Stefan; Neumuth, Thomas

    2015-03-01

    A robust identification of the instrument currently used by the surgeon is crucial for the automatic modeling and analysis of surgical procedures. Various approaches for intra-operative surgical instrument identification have been presented, mostly based on radio-frequency identification (RFID) or endoscopic video analysis. A novel approach is to identify the instruments on the instrument table of the scrub nurse with a combination of video and weight information. In a previous article, we successfully followed this approach and applied it to multiple instances of an ear, nose and throat (ENT) procedure and the surgical tray used therein. In this article, we present a metric for the suitability of the instruments of a surgical tray for identification by video and weight analysis and apply it to twelve trays of four different surgical domains (abdominal surgery, neurosurgery, orthopedics and urology). The used trays were digitized at the central sterile services department of the hospital. The results illustrate that surgical trays differ in their suitability for the approach. In general, additional weight information can significantly contribute to the successful identification of surgical instruments. Additionally, for ten different surgical instruments, ten exemplars of each instrument were tested for their weight differences. The samples indicate high weight variability in instruments with identical brand and model number. The results present a new metric for approaches aiming towards intra-operative surgical instrument detection and imply consequences for algorithms exploiting video and weight information for identification purposes.

  13. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  14. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  15. Jonathan W. Amy and the Amy Facility for Instrumentation Development.

    PubMed

    Cooks, R Graham

    2017-05-16

    This Perspective describes the unique Jonathan Amy Facility for Chemical Instrumentation in the Department of Chemistry at Purdue University, tracing its history and mode of operation. It also describes aspects of the career of its namesake and some of his insights which have been central to analytical instrumentation development, improvement, and utilization, both at Purdue and nationally.

  16. Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics

    PubMed Central

    Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce

    2013-01-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of mass spectrometry with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments-lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328

  17. Biochemical Applications in the Analytical Chemistry Lab

    ERIC Educational Resources Information Center

    Strong, Cynthia; Ruttencutter, Jeffrey

    2004-01-01

    An HPLC and a UV-visible spectrophotometer are identified as instruments that helps to incorporate more biologically-relevant experiments into the course, in order to increase the students understanding of selected biochemistry topics and enhances their ability to apply an analytical approach to biochemical problems. The experiment teaches…

  18. Black Boxes in Analytical Chemistry: University Students' Misconceptions of Instrumental Analysis

    ERIC Educational Resources Information Center

    Carbo, Antonio Domenech; Adelantado, Jose Vicente Gimeno; Reig, Francisco Bosch

    2010-01-01

    Misconceptions of chemistry and chemical engineering university students concerning instrumental analysis have been established from coordinated tests, tutorial interviews and laboratory lessons. Misconceptions can be divided into: (1) formal, involving specific concepts and formulations within the general frame of chemistry; (2)…

  19. Does addition of crosslink to pedicle-screw-based instrumentation impact the development of the spinal canal in children younger than 5 years of age?

    PubMed

    Chen, Zhong-hui; Chen, Xi; Zhu, Ze-zhang; Wang, Bin; Qian, Bang-ping; Zhu, Feng; Sun, Xu; Qiu, Yong

    2015-07-01

    Use of pedicle screws has been popularized in the treatment of pediatric spinal deformity. Despite many studies regarding the effect of pedicle screws on the immature spine, there is no study concerning the impact of addition of crosslink to pedicle-screw-based instrumentation on the development of the spinal canal in young children. This study aims to determine the influence of the screw-rod-crosslink complex on the development of the spinal canal. This study reviewed 34 patients with congenital scoliosis (14 boys and 20 girls) who were treated with posterior-only hemivertebrectomy and pedicle-screw-based short-segment instrumentation before the age of 5 years. The mean age at surgery in this cohort was 37 ± 11 months (range 21-57 months). They were followed up for at least 24 months. Of these patients, 10 underwent only pedicle screw instrumentation without crosslink, and 24 with additional crosslink placement. The vertebrae were divided into three regions as follows: (1) S-CL (screw-crosslink) region, in which the vertebrae were inserted with bilateral pedicle screws and two rods connected with the crosslink; (2) S (screw) region, in which the vertebrae were inserted with bilateral pedicle screws but without crosslink; (3) NS (no screws) region, which comprised vertebrae cephalad or caudal to the instrumented region. The area, anteroposterior and transverse diameters of the spinal canal were measured at all vertebrae on the postoperative and last follow-up computed tomography axial images. The instrumentation-related parameters were also measured, including the distance between the bilateral screws and the screw base angles. The changes in the above measurements were compared between each region to evaluate the instrumentation's effect on the spinal canal growth. The mean follow-up was 37 ± 13 months (range 24-68 months) and the mean age at the last follow-up was 74 ± 20 months (range 46-119 months). In each region, the spinal canal dimensions significantly

  20. Contributions of Analytical Chemistry to the Clinical Laboratory.

    ERIC Educational Resources Information Center

    Skogerboe, Kristen J.

    1988-01-01

    Highlights several analytical techniques that are being used in state-of-the-art clinical labs. Illustrates how other advances in instrumentation may contribute to clinical chemistry in the future. Topics include: biosensors, polarization spectroscopy, chemiluminescence, fluorescence, photothermal deflection, and chromatography in clinical…

  1. Existential Measurement: A Factor Analytic Study of Some Current Psychometric Instruments.

    ERIC Educational Resources Information Center

    Thauberger, Patrick C.; And Others

    1982-01-01

    Research in existentialism and ontology has given rise to several psychometric instruments. Used both exploratory and confirmatory principal-factor analyses to study relationships among 16 existential scales. Exploratory factor analysis provided some support of the theory that the avoidance of existential confrontation is a central function of…

  2. Analytical design and performance studies of nuclear furnace tests of small nuclear light bulb models

    NASA Technical Reports Server (NTRS)

    Latham, T. S.; Rodgers, R. J.

    1972-01-01

    Analytical studies were continued to identify the design and performance characteristics of a small-scale model of a nuclear light bulb unit cell suitable for testing in a nuclear furnace reactor. Emphasis was placed on calculating performance characteristics based on detailed radiant heat transfer analyses, on designing the test assembly for ease of insertion, connection, and withdrawal at the reactor test cell, and on determining instrumentation and test effluent handling requirements. In addition, a review of candidate test reactors for future nuclear light bulb in-reactor tests was conducted.

  3. Monte Carlo simulations of neutron-scattering instruments using McStas

    NASA Astrophysics Data System (ADS)

    Nielsen, K.; Lefmann, K.

    2000-06-01

    Monte Carlo simulations have become an essential tool for improving the performance of neutron-scattering instruments, since the level of sophistication in the design of instruments is defeating purely analytical methods. The program McStas, being developed at Risø National Laboratory, includes an extension language that makes it easy to adapt it to the particular requirements of individual instruments, and thus provides a powerful and flexible tool for constructing such simulations. McStas has been successfully applied in such areas as neutron guide design, flux optimization, non-Gaussian resolution functions of triple-axis spectrometers, and time-focusing in time-of-flight instruments.

  4. The Southern Hemisphere Additional Ozonesondes (SHADOZ) 1998-2002 Tropical Ozone Climatology. 3; Instrumentation and Station-to-Station Variability

    NASA Technical Reports Server (NTRS)

    Thompson, Anne M.; Witte, Jacqueline C.; Smit, Herman G. J.; Oltmans, Samuel J.; Johnson, Bryan J.; Kirchhoff, Volker W. J. H.; Schmidlin, Francis J.

    2004-01-01

    Abstract: Since 1998 the Southern Hemisphere ADditional OZonesondes (SHADOZ) project has collected more than 2000 ozone profiles from a dozen tropical and subtropical sites using balloon-borne electrochemical concentration cell (ECC) ozonesondes. The data (with accompanying pressure-temperature-humidity soundings) are archived. Analysis of ozonesonde imprecision within the SHADOZ dataset revealed that variations in ozonesonde technique could lead to station-to-station biases in the measurements. In this paper imprecisions and accuracy in the SHADOZ dataset are examined in light of new data. When SHADOZ total ozone column amounts are compared to version 8 TOMS (2004 release), discrepancies between sonde and satellite datasets decline 1-2 percentage points on average, compared to version 7 TOMS. Variability among stations is evaluated using total ozone normalized to TOMS and results of laboratory tests on ozonesondes (JOSE-2O00, Julich Ozonesonde Intercomparison Experiment). Ozone deviations from a standard instrument in the JOSE flight simulation chamber resemble those of SHADOZ station data relative to a SHADOZ-defined climatological reference. Certain systematic variations in SHADOZ ozone profiles are accounted for by differences in solution composition, data processing and instrument (manufacturer). Instrument bias leads to a greater ozone measurement above 25 km over Nairobi and to lower total column ozone at three Pacific sites compared to other SHADOZ stations at 0-20 deg.S.

  5. 42 CFR 493.1252 - Standard: Test systems, equipment, instruments, reagents, materials, and supplies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Test systems, equipment, instruments... REQUIREMENTS Quality System for Nonwaived Testing Analytic Systems § 493.1252 Standard: Test systems, equipment...) Temperature. (3) Humidity. (4) Protection of equipment and instruments from fluctuations and interruptions in...

  6. Moving your laboratories to the field – Advantages and limitations of the use of field portable instruments in environmental sample analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek

    The recent rapid progress in technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer a possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have their portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector),more » ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis gives a possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than these obtained in laboratory conditions, a strong influence of environmental factors on the instrument performance and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. - Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical

  7. Thermo Scientific Ozone Analyzer Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springston, S. R.

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data ismore » being collected.« less

  8. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.

  9. Democratizing science with the aid of parametric design and additive manufacturing: Design and fabrication of a versatile and low-cost optical instrument for scattering measurement.

    PubMed

    Nadal-Serrano, Jose M; Nadal-Serrano, Adolfo; Lopez-Vallejo, Marisa

    2017-01-01

    This paper focuses on the application of rapid prototyping techniques using additive manufacturing in combination with parametric design to create low-cost, yet accurate and reliable instruments. The methodology followed makes it possible to make instruments with a degree of customization until now available only to a narrow audience, helping democratize science. The proposal discusses a holistic design-for-manufacturing approach that comprises advanced modeling techniques, open-source design strategies, and an optimization algorithm using free parametric software for both professional and educational purposes. The design and fabrication of an instrument for scattering measurement is used as a case of study to present the previous concepts.

  10. Democratizing science with the aid of parametric design and additive manufacturing: Design and fabrication of a versatile and low-cost optical instrument for scattering measurement

    PubMed Central

    Lopez-Vallejo, Marisa

    2017-01-01

    This paper focuses on the application of rapid prototyping techniques using additive manufacturing in combination with parametric design to create low-cost, yet accurate and reliable instruments. The methodology followed makes it possible to make instruments with a degree of customization until now available only to a narrow audience, helping democratize science. The proposal discusses a holistic design-for-manufacturing approach that comprises advanced modeling techniques, open-source design strategies, and an optimization algorithm using free parametric software for both professional and educational purposes. The design and fabrication of an instrument for scattering measurement is used as a case of study to present the previous concepts. PMID:29112987

  11. Predicting Team Performance through Human Behavioral Sensing and Quantitative Workflow Instrumentation

    DTIC Science & Technology

    2016-07-27

    make risk-informed decisions during serious games . Statistical models of intra- game performance were developed to determine whether behaviors in...specific facets of the gameplay workflow were predictive of analytical performance and games outcomes. A study of over seventy instrumented teams revealed...more accurate game decisions. 2 Keywords: Humatics · Serious Games · Human-System Interaction · Instrumentation · Teamwork · Communication Analysis

  12. Seismic instrumentation of buildings

    USGS Publications Warehouse

    Çelebi, Mehmet

    2000-01-01

    The purpose of this report is to provide information on how and why we deploy seismic instruments in and around building structures. The recorded response data from buildings and other instrumented structures can be and are being primarily used to facilitate necessary studies to improve building codes and therefore reduce losses of life and property during damaging earthquakes. Other uses of such data can be in emergency response situations in large urban environments. The report discusses typical instrumentation schemes, existing instrumentation programs, the steps generally followed in instrumenting a structure, selection and type of instruments, installation and maintenance requirements and data retrieval and processing issues. In addition, a summary section on how recorded response data have been utilized is included. The benefits from instrumentation of structural systems are discussed.

  13. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…

  14. Mars Geochemical Instrument (MarGI): An instrument for the analysis of the Martian surface and the search for evidence of life

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Mancinelli, Rocco; Martin, Joe; Holland, Paul M.; Stimac, Robert M.; Kaye, William J.

    2005-01-01

    The Mars Geochemical Instrument, MarGI, was developed to provide a comprehensive analysis of the rocks and surface material on Mars. The instrument combines Differential Thermal Analysis (DTA) with miniature Gas Chromatography-Ion Mobility Spectrometry (GC-IMS) to identify minerals, the presence and state of water, and organic compounds. Miniature pyrolysis ovens are used to both, conduct DTA analysis of soil or crushed rocks samples, and pyrolyze the samples at temperatures up to 1000 degrees C for GC-IMS analysis of the released gases. This combination of analytical processes and techniques, which can characterize the mineralogy of the rocks and soil, and identify and quantify volatiles released during pyrolysis, has applications across a wide range of target sites including comets, planets, asteroids, and moons such as Titan and Europa. The MarGI analytical approach evolved from the Cometary Ice and Dust Experiment (CIDEX) selected to fly on the Comet Rendezvous Asteroid Flyby Mission (CRAF).

  15. Martian Soil Delivery to Analytical Instrument on Phoenix

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Robotic Arm of NASA's Phoenix Mars Lander released a sample of Martian soil onto a screened opening of the lander's Thermal and Evolved-Gas Analyzer (TEGA) during the 12th Martian day, or sol, since landing (June 6, 2008). TEGA did not confirm that any of the sample had passed through the screen.

    The Robotic Arm Camera took this image on Sol 12. Soil from the sample delivery is visible on the sloped surface of TEGA, which has a series of parallel doors. The two doors for the targeted cell of TEGA are the one positioned vertically, at far right, and the one partially open just to the left of that one. The soil between those two doors is resting on a screen designed to let fine particles through while keeping bigger ones Efrom clogging the interior of the instrument. Each door is about 10 centimeters (4 inches) long.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  16. A Critical Review of Screening and Diagnostic Instruments for Autism Spectrum Disorders in People with Sensory Impairments in Addition to Intellectual Disabilities

    ERIC Educational Resources Information Center

    de Vaan, Gitta; Vervloed, Mathijs P. J.; Hoevenaars-van den Boom, Marella; Antonissen, Anneke; Knoors, Harry; Verhoeven, Ludo

    2016-01-01

    Instruments that are used for diagnosing of, or screening for, autism spectrum disorder (ASD) may not be applicable to people with sensory disabilities in addition to intellectual disabilities. First, because they do not account for equifinality, the possibility that different conditions may lead to the same outcome. Second, because they do not…

  17. The instrumental rationality of addiction.

    PubMed

    Pickard, Hanna

    2011-12-01

    The claim that non-addictive drug use is instrumental must be distinguished from the claim that its desired ends are evolutionarily adaptive or easy to comprehend. Use can be instrumental without being adaptive or comprehensible. This clarification, together with additional data, suggests that Müller & Schumann's (M&S's) instrumental framework may explain addictive, as well as non-addictive consumption.

  18. Facilitating Research and Learning in Petrology and Geochemistry through Classroom Applications of Remotely Operable Research Instrumentation

    NASA Astrophysics Data System (ADS)

    Ryan, J. G.

    2012-12-01

    Bringing the use of cutting-edge research tools into student classroom experiences has long been a popular educational strategy in the geosciences and other STEM disciplines. The NSF CCLI and TUES programs have funded a large number of projects that placed research-grade instrumentation at educational institutions for instructional use and use in supporting undergraduate research activities. While student and faculty response to these activities has largely been positive, a range of challenges exist related to their educational effectiveness. Many of the obstacles these approaches have faced relate to "scaling up" of research mentoring experiences (e.g., providing training and time for use for an entire classroom of students, as opposed to one or two), and to time tradeoffs associated with providing technical training for effective instrument use versus course content coverage. The biggest challenge has often been simple logistics: a single instrument, housed in a different space, is difficult to integrate effectively into instructional activities. My CCLI-funded project sought primarily to knock down the logistical obstacles to research instrument use by taking advantage of remote instrument operation technologies, which allow the in-classroom use of networked analytical tools. Remote use of electron microprobe and SEM instruments of the Florida Center for Analytical Electron Microscopy (FCAEM) in Miami, FL was integrated into two geoscience courses at USF in Tampa, FL. Remote operation permitted the development of whole-class laboratory exercises to familiarize students with the tools, their function, and their capabilities; and it allowed students to collect high-quality chemical and image data on their own prepared samples in the classroom during laboratory periods. These activities improve student engagement in the course, appear to improve learning of key concepts in mineralogy and petrology, and have led to students pursuing independent research projects, as

  19. Introduction to Instrumental Analysis of Water Pollutants. Training Manual.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This course is designed for those requiring an introduction to instruments commonly used in water pollution analyses. Examples are: pH, conductivity, dissolved oxygen meters, spectrophotometers, turbidimeters, carbon analyzer, and gas chromatographs. Students should have a basic knowledge of analytical chemistry. (CO)

  20. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analyst s memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit processmore » recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.« less

  1. Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel

    NASA Astrophysics Data System (ADS)

    Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.

    2008-12-01

    The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and home-made user applications for workflow- specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and

  2. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept gains more and more importance. It focusses on the total costs of the process from invest to operation and finally retirement. Also for analytical procedures an increasing interest for this concept exists in the recent years. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. It appears, that also regulatory bodies have increased their awareness on life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum discuss the enrollment of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into USP. Furthermore, also in the non-regulated environment a growing interest on life cycle management is seen. Quality-by-design based method development results in increased method robustness. Thereby a decreased effort is needed for method performance verification, and post-approval changes as well as minimized risk of method related out-of-specification results. This strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Analytical performance of the various acquisition modes in Orbitrap MS and MS/MS.

    PubMed

    Kaufmann, Anton

    2018-04-30

    Quadrupole Orbitrap instruments (Q Orbitrap) permit high-resolution mass spectrometry (HRMS)-based full scan acquisitions and have a number of acquisition modes where the quadrupole isolates a particular mass range prior to a possible fragmentation and HRMS-based acquisition. Selecting the proper acquisition mode(s) is essential if trace analytes are to be quantified in complex matrix extracts. Depending on the particular requirements, such as sensitivity, selectivity of detection, linear dynamic range, and speed of analysis, different acquisition modes may have to be chosen. This is particularly important in the field of multi-residue analysis (e.g., pesticides or veterinary drugs in food samples) where a large number of analytes within a complex matrix have to be detected and reliably quantified. Meeting the specific detection and quantification performance criteria for every targeted compound may be challenging. It is the aim of this paper to describe the strengths and the limitations of the currently available Q Orbitrap acquisition modes. In addition, the incorporation of targeted acquisitions between full scan experiments is discussed. This approach is intended to integrate compounds that require an additional degree of sensitivity or selectivity into multi-residue methods. This article is protected by copyright. All rights reserved.

  4. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni 2+ , Zn 2+ , Cu 2+ , PO 4 3- , bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step) as well as the introduction of sample liquid wicking areas allow to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu 2+ ) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  5. Build Your Own Photometer: A Guided-Inquiry Experiment to Introduce Analytical Instrumentation

    ERIC Educational Resources Information Center

    Wang, Jessie J.; Nun´ez, Jose´ R. Rodríguez; Maxwell, E. Jane; Algar, W. Russ

    2016-01-01

    A guided-inquiry project designed to teach students the basics of spectrophotometric instrumentation at the second year level is presented. Students design, build, program, and test their own single-wavelength, submersible photometer using low-cost light-emitting diodes (LEDs) and inexpensive household items. A series of structured prelaboratory…

  6. Instrumentation Performance During the TMI-2 Accident

    NASA Astrophysics Data System (ADS)

    Rempe, Joy L.; Knudson, Darrell L.

    2014-08-01

    The accident at the Three Mile Island Unit 2 (TMI-2) reactor provided a unique opportunity to evaluate sensors exposed to severe accident conditions. The loss of coolant and the hydrogen combustion that occurred during this accident exposed instrumentation to harsh conditions, including direct radiation, radioactive contamination, and high humidity with elevated temperatures and pressures. As part of a program initiated by the Department of Energy Office of Nuclear Energy (DOE-NE), a review was completed to gain insights from prior TMI-2 sensor survivability and data qualification efforts. This new effort focused upon a set of sensors that provided critical data to TMI-2 operators for assessing the condition of the plant and the effects of mitigating actions taken by these operators. In addition, the effort considered sensors providing data required for subsequent accident simulations. Over 100 references related to instrumentation performance and post-accident evaluations of TMI-2 sensors and measurements were reviewed. Insights gained from this review are summarized within this paper. As noted within this paper, several techniques were invoked in the TMI-2 post-accident program to evaluate sensor survivability status and data qualification, including comparisons with data from other sensors, analytical calculations, laboratory testing, and comparisons with sensors subjected to similar conditions in large-scale integral tests and with sensors that were similar in design but more easily removed from the TMI-2 plant for evaluations. Conclusions from this review provide important insights related to sensor survivability and enhancement options for improving sensor performance. In addition, this paper provides recommendations related to sensor survivability and the data evaluation process that could be implemented in upcoming Fukushima Daiichi recovery efforts.

  7. Moving your laboratories to the field--Advantages and limitations of the use of field portable instruments in environmental sample analysis.

    PubMed

    Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek

    2015-07-01

    The recent rapid progress in technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer a possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have their portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis gives a possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than these obtained in laboratory conditions, a strong influence of environmental factors on the instrument performance and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. The Pavlovian analysis of instrumental conditioning.

    PubMed

    Gormezano, I; Tait, R W

    1976-01-01

    An account was given of the development within the Russian literature of a uniprocess formulation of classical and instrumental conditioning, known as the bidirectional conditioning hypothesis. The hypothesis purports to offer a single set of Pavlovian principles to account for both paradigms, based upon a neural model which assumes that bidirectional (forward and backward) connections are formed in both calssical and instrumental conditioning situations. In instrumental conditioning, the bidirectional connections are hypothesized to be simply more complex than those in classical conditioning, and any differences in empirical functions are presumed to lie not in difference in mechanism, but in the strength of the forward and backward connections. Although bidirectional connections are assumed to develop in instrumental conditioning, the experimental investigation of the bidirectional conditioning hypothesis has been essentially restricted to the classical conditioning operations of pairing two CSs (sensory preconditioning training), a US followed by a CS (backward conditioning training) and two USs. However, the paradigm involving the pairing of two USs, because of theoretical and analytical considerations, is the one most commonly employed by Russian investigators. The results of an initial experiment involving the pairing of two USs, and reference to the results of a more extensive investigation, leads us to tentatively question the validity of the bidirectional conditioning account of instrumental conditioning.

  9. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Al Okab, Riyad Ahmed

    2013-02-01

    Green analytical methods using Cisapride (CPE) as green analytical reagent was investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water sample, bread and flour additives were developed. The proposed methods based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form red colored product with max at 520 nm. Phenoxazine and Cisapride and its reaction products were found to be environmentally friendly under the optimum experimental condition. The method obeys beers law in concentration range 0.11-4.00 g ml-1 and molar absorptivity 1.41 × 104 L mol-1 cm-1. All variables have been optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these method was evaluated in terms of Student's t-test and variance ratio F-test to find out the significance of proposed methods over the reference method. The combination of pharmaceutical drugs reagents with low concentration create some unique green chemical analyses.

  10. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument and the isolation of analytes from sample matrix. In this work the information about novel methodological and instrumental solutions in relation to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE) is presented, including practical applications of these techniques and a critical discussion about their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during the sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Sample Analysis at Mars Instrument Simulator

    NASA Technical Reports Server (NTRS)

    Benna, Mehdi; Nolan, Tom

    2013-01-01

    The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to plan and validate operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of a multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing or even eliminating entirely the need for a hardware engineer model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that takes many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical modules. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs to this synthetic C&DH are mapped to virtual sensors and command lines that mimic in their structure and connectivity the layout of the instrument harnesses. This module executes

  12. Reflectance Infrared Spectroscopy on Operating Surface Acoustic Wave Chemical Sensors During Exposure to Gas-Phase Analytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hierlemann, A.; Hill, M.; Ricco, A.J.

    We have developed instrumentation to enable the combination of surface acoustic wave (SAW) sensor measurements with direct, in-situ molecular spectroscopic measurements to understand the response of the SAW sensors with respect to the interfacial chemistry of surface-confined sensing films interacting with gas-phase analytes. Specifically, the instrumentation and software was developed to perform in-situ Fourier-transform infrared external-reflectance spectroscopy (FTIR-ERS) on operating SAW devices during dosing of their chemically modified surfaces with analytes. By probing the surface with IR spectroscopy during gas exposure, it is possible to understand in unprecedented detail the interaction processes between the sorptive SAW coatings and the gaseousmore » analyte molecules. In this report, we provide details of this measurement system, and also demonstrate the utility of these combined measurements by characterizing the SAW and FTIR-ERS responses of organic thin-film sensor coatings interacting with gas-phase analytes.« less

  13. Man vs. Machine: A Junior-level Laboratory Exercise Comparing Human and Instrumental Detection Limits

    ERIC Educational Resources Information Center

    Elias, Ryan J.; Hopfer, Helene; Hofstaedter, Amanda N.; Hayes, John E.

    2017-01-01

    The human nose is a very sensitive detector and is able to detect potent aroma compounds down to low ng/L levels. These levels are often below detection limits of analytical instrumentation. The following laboratory exercise is designed to compare instrumental and human methods for the detection of volatile odor active compounds. Reference…

  14. Development of an Analytical Method for Dibutyl Phthalate Determination Using Surrogate Analyte Approach

    PubMed Central

    Farzanehfar, Vahid; Faizi, Mehrdad; Naderi, Nima; Kobarfard, Farzad

    2017-01-01

    Dibutyl phthalate (DBP) is a phthalic acid ester and is widely used in polymeric products to make them more flexible. DBP is found in almost every plastic material and is believed to be persistent in the environment. Various analytical methods have been used to measure DBP in different matrices. Considering the ubiquitous nature of DBP, the most important challenge in DBP analyses is the contamination of even analytical grade organic solvents with this compound and lack of availability of a true blank matrix to construct the calibration line. Standard addition method or using artificial matrices reduce the precision and accuracy of the results. In this study a surrogate analyte approach that is based on using deuterium labeled analyte (DBP-d4) to construct the calibration line was applied to determine DBP in hexane samples. PMID:28496469

  15. Targeted analyte deconvolution and identification by four-way parallel factor analysis using three-dimensional gas chromatography with mass spectrometry data.

    PubMed

    Watson, Nathanial E; Prebihalo, Sarah E; Synovec, Robert E

    2017-08-29

    Comprehensive three-dimensional gas chromatography with time-of-flight mass spectrometry (GC 3 -TOFMS) creates an opportunity to explore a new paradigm in chemometric analysis. Using this newly described instrument and the well understood Parallel Factor Analysis (PARAFAC) model we present one option for utilization of the novel GC 3 -TOFMS data structure. We present a method which builds upon previous work in both GC 3 and targeted analysis using PARAFAC to simplify some of the implementation challenges previously discovered. Conceptualizing the GC 3 -TOFMS instead as a one-dimensional gas chromatograph with GC × GC-TOFMS detection we allow the instrument to create the PARAFAC target window natively. Each first dimension modulation thus creates a full GC × GC-TOFMS chromatogram fully amenable to PARAFAC. A simple mixture of 115 compounds and a diesel sample are interrogated through this methodology. All test analyte targets are successfully identified in both mixtures. In addition, mass spectral matching of the PARAFAC loadings to library spectra yielded results greater than 900 in 40 of 42 test analyte cases. Twenty-nine of these cases produced match values greater than 950. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Standard NIM Instrumentation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costrell, Louis; Lenkszus, Frank R.; Rudnick, Stanley J.

    NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev 4) dated July 1974. It includes all the addenda and errata items that were previously issued as well as numerous additional items to make the standard current with modern technology and manufacturing practice.

  17. ANALYTICAL METHOD COMPARISONS BY ESTIMATES OF PRECISION AND LOWER DETECTION LIMIT

    EPA Science Inventory

    The paper describes the use of principal component analysis to estimate the operating precision of several different analytical instruments or methods simultaneously measuring a common sample of a material whose actual value is unknown. This approach is advantageous when none of ...

  18. Social Learning Analytics: Navigating the Changing Settings of Higher Education

    ERIC Educational Resources Information Center

    de Laat, Maarten; Prinsen, Fleur R.

    2014-01-01

    Current trends and challenges in higher education (HE) require a reorientation towards openness, technology use and active student participation. In this article we will introduce Social Learning Analytics (SLA) as instrumental in formative assessment practices, aimed at supporting and strengthening students as active learners in increasingly open…

  19. The role of multi-target policy instruments in agri-environmental policy mixes.

    PubMed

    Schader, Christian; Lampkin, Nicholas; Muller, Adrian; Stolze, Matthias

    2014-12-01

    The Tinbergen Rule has been used to criticise multi-target policy instruments for being inefficient. The aim of this paper is to clarify the role of multi-target policy instruments using the case of agri-environmental policy. Employing an analytical linear optimisation model, this paper demonstrates that there is no general contradiction between multi-target policy instruments and the Tinbergen Rule, if multi-target policy instruments are embedded in a policy-mix with a sufficient number of targeted instruments. We show that the relation between cost-effectiveness of the instruments, related to all policy targets, is the key determinant for an economically sound choice of policy instruments. If economies of scope with respect to achieving policy targets are realised, a higher cost-effectiveness of multi-target policy instruments can be achieved. Using the example of organic farming support policy, we discuss several reasons why economies of scope could be realised by multi-target agri-environmental policy instruments. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    PubMed

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO 2 ), partial oxygen pressure (pO 2 ), sodium (Na + ), potassium (K + ), chloride (Cl - ), ionized calcium ( i Ca 2+ ), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. Obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations were studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement to the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO 2 , K + , lactate and tHb; i-STAT CG8+: pO 2 , Na + , i Ca 2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on CO-oximetry results was found. On the contrary, significant interference for benzalkonium and hemolysis on electrolyte measurements were found, for which the user is notified by an interferent specific flag. Identification of sample errors from pre-analytical sources, such as interferences and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, makes the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  1. Analytical capabilities and services of Lawrence Livermore Laboratory's General Chemistry Division. [Methods available at Lawrence Livermore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutmacher, R.; Crawford, R.

    This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.

  2. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  3. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.

  4. A review of instrumentation kinematics of engine-driven nickel-titanium instruments.

    PubMed

    Çapar, I D; Arslan, H

    2016-02-01

    Over the years, NiTi alloys have become indispensable materials in endodontic treatment. With technological advancements in metallurgy, manufacturers have attempted to produce instruments with enhanced features. In parallel with these developments, endodontic motors have undergone improvements in terms of torque control and kinematics that are adjustable in different directions. This review presents an overview of the advancements in instrumentation kinematics and the effect of instrumentation kinematics on root canal shaping procedures and instrument performance. The literature search for this narrative review was conducted in Google Scholar, Scopus, PubMed and Web of Science using the keywords 'kinematics and endodontics' and 'reciprocation and endodontics'. In addition, historical literature was searched using the keyword 'nickel-titanium and endodontics'. Overall, 143 articles were included up to 2015. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  5. Specifying and calibrating instrumentations for wideband electronic power measurements. [in switching circuits

    NASA Technical Reports Server (NTRS)

    Lesco, D. J.; Weikle, D. H.

    1980-01-01

    The wideband electric power measurement related topics of electronic wattmeter calibration and specification are discussed. Tested calibration techniques are described in detail. Analytical methods used to determine the bandwidth requirements of instrumentation for switching circuit waveforms are presented and illustrated with examples from electric vehicle type applications. Analog multiplier wattmeters, digital wattmeters and calculating digital oscilloscopes are compared. The instrumentation characteristics which are critical to accurate wideband power measurement are described.

  6. Analytical Protein Microarrays: Advancements Towards Clinical Applications

    PubMed Central

    Sauer, Ursula

    2017-01-01

    Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence measurements, and the possibility of functional integration. So far, especially fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care applications is not yet fully utilized. The question arises what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market are for instance: too low sensitivity and deficiency in reproducibility, inadequate analysis time, lack of high-quality antibodies and validated reagents, lack of automation and portable instruments, and cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solve these problems. PMID:28146048

  7. Laboratory evaluation of alcohol safety interlock systems. Volume 2 : instrument screening experiments

    DOT National Transportation Integrated Search

    1974-01-01

    The report contains the results of an experimental and analytical evaluation of instruments and techniques designed to prevent an intoxicated driver from operating his automobile. The prototype 'Alcohol Safety Interlock Systems' tested were developed...

  8. Chandra ACIS-I particle background: an analytical model

    NASA Astrophysics Data System (ADS)

    Bartalucci, I.; Mazzotta, P.; Bourdin, H.; Vikhlinin, A.

    2014-06-01

    Aims: Imaging and spectroscopy of X-ray extended sources require a proper characterisation of a spatially unresolved background signal. This background includes sky and instrumental components, each of which are characterised by its proper spatial and spectral behaviour. While the X-ray sky background has been extensively studied in previous work, here we analyse and model the instrumental background of the ACIS-I detector on board the Chandra X-ray observatory in very faint mode. Methods: Caused by interaction of highly energetic particles with the detector, the ACIS-I instrumental background is spectrally characterised by the superimposition of several fluorescence emission lines onto a continuum. To isolate its flux from any sky component, we fitted an analytical model of the continuum to observations performed in very faint mode with the detector in the stowed position shielded from the sky, and gathered over the eight-year period starting in 2001. The remaining emission lines were fitted to blank-sky observations of the same period. We found 11 emission lines. Analysing the spatial variation of the amplitude, energy and width of these lines has further allowed us to infer that three lines of these are presumably due to an energy correction artefact produced in the frame store. Results: We provide an analytical model that predicts the instrumental background with a precision of 2% in the continuum and 5% in the lines. We use this model to measure the flux of the unresolved cosmic X-ray background in the Chandra deep field south. We obtain a flux of 10.2+0.5-0.4 × 10-13 erg cm-2 deg-2 s-1 for the [1-2] keV band and (3.8 ± 0.2) × 10-12 erg cm-2 deg-2 s-1 for the [2-8] keV band.

  9. Gas-analytic measurement complexes of Baikal atmospheric-limnological observatory

    NASA Astrophysics Data System (ADS)

    Pestunov, D. A.; Shamrin, A. M.; Shmargunov, V. P.; Panchenko, M. V.

    2015-11-01

    The paper presents the present-day structure of stationary and mobile hardware-software gas-analytical complexes of Baikal atmospheric-limnological observatory (BALO) Siberian Branch Russian Academy of Sciences (SB RAS), designed to study the processes of gas exchange of carbon-containing gases in the "atmosphere-water" system, which are constantly updated to include new measuring and auxiliary instrumentation.

  10. Recent trends in atomic fluorescence spectrometry towards miniaturized instrumentation-A review.

    PubMed

    Zou, Zhirong; Deng, Yujia; Hu, Jing; Jiang, Xiaoming; Hou, Xiandeng

    2018-08-17

    Atomic fluorescence spectrometry (AFS), as one of the common atomic spectrometric techniques with high sensitivity, simple instrumentation, and low acquisition and running cost, has been widely used in various fields for trace elemental analysis, notably the determination of hydride-forming elements by hydride generation atomic fluorescence spectrometry (HG-AFS). In recent years, the soaring demand of field analysis has significantly promoted the miniaturization of analytical atomic spectrometers or at least instrumental components. Various techniques have also been developed to approach the goal of portable/miniaturized AFS instrumentation for field analysis. In this review, potentially portable/miniaturized AFS techniques, primarily involving advanced instrumental components and whole instrumentation with references since 2000, are summarized and discussed. The discussion mainly includes five aspects: radiation source, atomizer, detector, sample introduction, and miniaturized atomic fluorescence spectrometer/system. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Implementation of picoSpin Benchtop NMR Instruments into Organic Chemistry Teaching Laboratories through Spectral Analysis of Fischer Esterification Products

    ERIC Educational Resources Information Center

    Yearty, Kasey L.; Sharp, Joseph T.; Meehan, Emma K.; Wallace, Doyle R.; Jackson, Douglas M.; Morrison, Richard W.

    2017-01-01

    [Superscript 1]H NMR analysis is an important analytical technique presented in introductory organic chemistry courses. NMR instrument access is limited for undergraduate organic chemistry students due to the size of the instrument, price of NMR solvents, and the maintenance level required for instrument upkeep. The University of Georgia Chemistry…

  12. Construction of a Chemical Sensor/Instrumentation Package Using Fiber Optic and Miniaturization Technology

    NASA Technical Reports Server (NTRS)

    Newton, R. L.

    1999-01-01

    The objective of this research was to construct a chemical sensor/instrumentation package that was smaller in weight and volume than conventional instrumentation. This reduction in weight and volume is needed to assist in further reducing the cost of launching payloads into space. To accomplish this, fiber optic sensors, miniaturized spectrometers, and wireless modems were employed. The system was evaluated using iodine as a calibration analyte.

  13. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy five percent of the twenty analytes achieved on two COBASs INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Status of the Neutron Imaging and Diffraction Instrument IMAT

    NASA Astrophysics Data System (ADS)

    Kockelmann, Winfried; Burca, Genoveva; Kelleher, Joe F.; Kabra, Saurabh; Zhang, Shu-Yan; Rhodes, Nigel J.; Schooneveld, Erik M.; Sykora, Jeff; Pooley, Daniel E.; Nightingale, Jim B.; Aliotta, Francesco; Ponterio, Rosa C.; Salvato, Gabriele; Tresoldi, Dario; Vasi, Cirino; McPhate, Jason B.; Tremsin, Anton S.

    A cold neutron imaging and diffraction instrument, IMAT, is currently being constructed at the ISIS second target station. IMAT will capitalize on time-of-flight transmission and diffraction techniques available at a pulsed neutron source. Analytical techniques will include neutron radiography, neutron tomography, energy-selective neutron imaging, and spatially resolved diffraction scans for residual strain and texture determination. Commissioning of the instrument will start in 2015, with time-resolving imaging detectors and two diffraction detector prototype modules. IMAT will be operated as a user facility for material science applications and will be open for developments of time-of-flight imaging methods.

  15. A Database Management Assessment Instrument

    ERIC Educational Resources Information Center

    Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr.

    2013-01-01

    This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…

  16. Electric-Field Instrument With Ac-Biased Corona Point

    NASA Technical Reports Server (NTRS)

    Markson, R.; Anderson, B.; Govaert, J.

    1993-01-01

    Measurements indicative of incipient lightning yield additional information. New instrument gives reliable readings. High-voltage ac bias applied to needle point through high-resistance capacitance network provides corona discharge at all times, enabling more-slowly-varying component of electrostatic potential of needle to come to equilibrium with surrounding air. High resistance of high-voltage coupling makes instrument insensitive to wind. Improved corona-point instrument expected to yield additional information assisting in safety-oriented forecasting of lighting.

  17. 45 CFR 63.32 - Data collection instruments.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... collection instruments. (a) Definitions. For the purposes of this section “Child” means an individual who has..., sex, race, or politics. (2) A grantee which proposes to use a data collection instrument shall set... instruments in which children are involved as respondents, the grantee, in addition to observing the other...

  18. Comparison of removed dentin thickness with hand and rotary instruments

    PubMed Central

    Shahriari, Shahriar; Abedi, Hasan; Hashemi, Mahdi; Jalalzadeh, Seyed Mohsen

    2009-01-01

    INTRODUCTION: The aim of this study was to evaluate the amount of dentine removed after canal preparation using stainless steel (SS) hand instruments or rotary ProFile instruments. MATERIALS AND METHODS: Thirty-six extracted human teeth with root canal curvatures less than 30º were embedded in clear polyester resin. The roots were cut horizontally at apical 2, 4 and 7 mm. Dentin thickness was measured at each section and the sections were accurately reassembled using a muffle. Root canals were randomly prepared by SS hand instruments or rotary ProFile instruments. Root sections were again separated, and the remaining dentin thickness was measured. Mann-Whitney U and t tests were performed for analytic comparison of the results. RESULTS: The thickness of removed dentin was significantly different between the two used methods (P<0.05). Significantly greater amounts of dentin was removed mesially in all sections in hand instrumentation group (P<0.001). CONCLUSION: ProFile rotary instrumentation prepares root canals with a greater conservation of tooth structure. PMID:23940489

  19. Analytical optimal controls for the state constrained addition and removal of cryoprotective agents

    PubMed Central

    Chicone, Carmen C.; Critser, John K.

    2014-01-01

    Cryobiology is a field with enormous scientific, financial and even cultural impact. Successful cryopreservation of cells and tissues depends on the equilibration of these materials with high concentrations of permeating chemicals (CPAs) such as glycerol or 1,2 propylene glycol. Because cells and tissues are exposed to highly anisosmotic conditions, the resulting gradients cause large volume fluctuations that have been shown to damage cells and tissues. On the other hand, there is evidence that toxicity to these high levels of chemicals is time dependent, and therefore it is ideal to minimize exposure time as well. Because solute and solvent flux is governed by a system of ordinary differential equations, CPA addition and removal from cells is an ideal context for the application of optimal control theory. Recently, we presented a mathematical synthesis of the optimal controls for the ODE system commonly used in cryobiology in the absence of state constraints and showed that controls defined by this synthesis were optimal. Here we define the appropriate model, analytically extend the previous theory to one encompassing state constraints, and as an example apply this to the critical and clinically important cell type of human oocytes, where current methodologies are either difficult to implement or have very limited success rates. We show that an enormous increase in equilibration efficiency can be achieved under the new protocols when compared to classic protocols, potentially allowing a greatly increased survival rate for human oocytes, and pointing to a direction for the cryopreservation of many other cell types. PMID:22527943

  20. Recognizing and Reducing Analytical Errors and Sources of Variation in Clinical Pathology Data in Safety Assessment Studies.

    PubMed

    Schultze, A E; Irizarry, A R

    2017-02-01

    Veterinary clinical pathologists are well positioned via education and training to assist in investigations of unexpected results or increased variation in clinical pathology data. Errors in testing and unexpected variability in clinical pathology data are sometimes referred to as "laboratory errors." These alterations may occur in the preanalytical, analytical, or postanalytical phases of studies. Most of the errors or variability in clinical pathology data occur in the preanalytical or postanalytical phases. True analytical errors occur within the laboratory and are usually the result of operator or instrument error. Analytical errors are often ≤10% of all errors in diagnostic testing, and the frequency of these types of errors has decreased in the last decade. Analytical errors and increased data variability may result from instrument malfunctions, inability to follow proper procedures, undetected failures in quality control, sample misidentification, and/or test interference. This article (1) illustrates several different types of analytical errors and situations within laboratories that may result in increased variability in data, (2) provides recommendations regarding prevention of testing errors and techniques to control variation, and (3) provides a list of references that describe and advise how to deal with increased data variability.

  1. Analytic thinking promotes religious disbelief.

    PubMed

    Gervais, Will M; Norenzayan, Ara

    2012-04-27

    Scientific interest in the cognitive underpinnings of religious belief has grown in recent years. However, to date, little experimental research has focused on the cognitive processes that may promote religious disbelief. The present studies apply a dual-process model of cognitive processing to this problem, testing the hypothesis that analytic processing promotes religious disbelief. Individual differences in the tendency to analytically override initially flawed intuitions in reasoning were associated with increased religious disbelief. Four additional experiments provided evidence of causation, as subtle manipulations known to trigger analytic processing also encouraged religious disbelief. Combined, these studies indicate that analytic processing is one factor (presumably among several) that promotes religious disbelief. Although these findings do not speak directly to conversations about the inherent rationality, value, or truth of religious beliefs, they illuminate one cognitive factor that may influence such discussions.

  2. High-Level Disinfection of Otorhinolaryngology Clinical Instruments: An Evaluation of the Efficacy and Cost-effectiveness of Instrument Storage.

    PubMed

    Yalamanchi, Pratyusha; Yu, Jason; Chandler, Laura; Mirza, Natasha

    2018-01-01

    Objectives Despite increasing interest in individual instrument storage, risk of bacterial cross-contamination of otorhinolaryngology clinic instruments has not been assessed. This study is the first to determine the clinical efficacy and cost-effectiveness of standard high-level disinfection and clinic instrument storage. Methods To assess for cross-contamination, surveillance cultures of otorhinolaryngology clinic instruments subject to standard high-level disinfection and storage were obtained at the start and end of the outpatient clinical workday. Rate of microorganism recovery was compared with cultures of instruments stored in individual peel packs and control cultures of contaminated instruments. Based on historical clinic data, the direct allocation method of cost accounting was used to determine aggregate raw material cost and additional labor hours required to process and restock peel-packed instruments. Results Among 150 cultures of standard high-level disinfected and co-located clinic instruments, 3 positive bacterial cultures occurred; 100% of control cultures were positive for bacterial species ( P < .001). There was no statistical difference between surveillance cultures obtained before and after the clinic day. While there was also no significant difference in rate of contamination between peel-packed and co-located instruments, peel packing all instruments requires 6250 additional labor hours, and conservative analyses place the cost of individual semicritical instrument storage at $97,852.50 per year. Discussion With in vitro inoculation of >200 otorhinolaryngology clinic instruments, this study demonstrates that standard high-level disinfection and storage are equally efficacious to more time-consuming and expensive individual instrument storage protocols, such as peel packing, with regard to bacterial contamination. Implications for Practice Standard high-level disinfection and storage are equally effective to labor-intensive and costly

  3. The Effect of Emulsion Intensity on Selected Sensory and Instrumental Texture Properties of Full-Fat Mayonnaise

    PubMed Central

    Olsson, Viktoria; Håkansson, Andreas

    2018-01-01

    Varying processing conditions can strongly affect the microstructure of mayonnaise, opening up new applications for the creation of products tailored to meet different consumer preferences. The aim of the study was to evaluate the effect of emulsification intensity on sensory and instrumental characteristics of full-fat mayonnaise. Mayonnaise, based on a standard recipe, was processed at low and high emulsification intensities, with selected sensory and instrumental properties then evaluated using an analytical panel and a back extrusion method. The evaluation also included a commercial reference mayonnaise. The overall effects of a higher emulsification intensity on the sensory and instrumental characteristics of full-fat mayonnaise were limited. However, texture was affected, with a more intense emulsification resulting in a firmer mayonnaise according to both back extrusion data and the analytical sensory panel. Appearance, taste and flavor attributes were not affected by processing. PMID:29342128

  4. Proposed techniques for launching instrumented balloons into tornadoes

    NASA Technical Reports Server (NTRS)

    Grant, F. C.

    1971-01-01

    A method is proposed to introduce instrumented balloons into tornadoes by means of the radial pressure gradient, which supplies a buoyancy force driving to the center. Presented are analytical expressions, verified by computer calculations, which show the possibility of introducing instrumented balloons into tornadoes at or below the cloud base. The times required to reach the center are small enough that a large fraction of tornadoes are suitable for the technique. An experimental procedure is outlined in which a research airplane puts an instrumented, self-inflating balloon on the track ahead of the tornado. The uninflated balloon waits until the tornado closes to, typically, 750 meters; then it quickly inflates and spirals up and into the core, taking roughly 3 minutes. Since the drive to the center is automatically produced by the radial pressure gradient, a proper launch radius is the only guidance requirement.

  5. Emission quantification using the tracer gas dispersion method: The influence of instrument, tracer gas species and source simulation.

    PubMed

    Delre, Antonio; Mønster, Jacob; Samuelsson, Jerker; Fredenslund, Anders M; Scheutz, Charlotte

    2018-09-01

    The tracer gas dispersion method (TDM) is a remote sensing method used for quantifying fugitive emissions by relying on the controlled release of a tracer gas at the source, combined with concentration measurements of the tracer and target gas plumes. The TDM was tested at a wastewater treatment plant for plant-integrated methane emission quantification, using four analytical instruments simultaneously and four different tracer gases. Measurements performed using a combination of an analytical instrument and a tracer gas, with a high ratio between the tracer gas release rate and instrument precision (a high release-precision ratio), resulted in well-defined plumes with a high signal-to-noise ratio and a high methane-to-tracer gas correlation factor. Measured methane emission rates differed by up to 18% from the mean value when measurements were performed using seven different instrument and tracer gas combinations. Analytical instruments with a high detection frequency and good precision were established as the most suitable for successful TDM application. The application of an instrument with a poor precision could only to some extent be overcome by applying a higher tracer gas release rate. A sideward misplacement of the tracer gas release point of about 250m resulted in an emission rate comparable to those obtained using a tracer gas correctly simulating the methane emission. Conversely, an upwind misplacement of about 150m resulted in an emission rate overestimation of almost 50%, showing the importance of proper emission source simulation when applying the TDM. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. The 1985 pittsburgh conference: a special instrumentation report.

    PubMed

    1985-03-29

    For the first time in its 36 years of operation, the Pittsburgh Conference and Exposition on Analytical Chemistry and Applied Spectroscopy had a sharp drop in attendance-down 16 percent to 20,731. That loss was attributed to the fact that the meeting was held in New Orleans for the first time, and most of the lost attendees were students and young professionals who had previously come for only 1 day. The number of exhibitors and the number of booths, however, were both up about 15 percent, to 730 and 1856, respectively. A large proportion of that increase was contributed by foreign companies exhibiting for the first time, but there were also some well-known names, such as General Electric and Xerox, making first forays into analytical chemistry. There was also a sharp increase in the number and type of instruments displayed. "The key skill now in analytical chemistry," says Perkin-Elmer president Horace McDonell, Jr., "may be simply finding the right tool to obtain the answers you need." The predominant theme of the show, as it has been for the past few years, was automation of both laboratories and instruments. That trend is having major effects in chemical laboratories, but it is also affecting the instrument companies themselves. At large companies such as Varian, Beckman, and Perkin-Elmer, as much as 50 percent of the research and development budget is now going toward development of software-a much higher percentage than it was even 5 years ago. Another trend in automation also seemed clear at the show. As recently as 2 or 3 years ago, much of the available software for chemistry was designed for Apple and similar computers. Now, the laboratory standard is the IBM PC. As a representative of another company that manufactures computers noted with only slight exaggeration, "There's probably not a booth on the floor that doesn't have one."

  7. Earth Viewing Applications Laboratory (EVAL). Instrument catalog

    NASA Technical Reports Server (NTRS)

    1976-01-01

    There were 87 instruments described that are used in earth observation, with an additional 51 instruments containing references to programs and their major functions. These instruments were selected from such sources as: (1) earth observation flight program, (2) operational satellite improvement programs, (3) advanced application flight experiment program, (4) shuttle experiment definition program, and (5) earth observation aircraft program.

  8. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    PubMed

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  9. Instrumental images: the visual rhetoric of self-presentation in Hevelius's Machina Coelestis.

    PubMed

    Vertesi, Janet

    2010-06-01

    This article places the famous images of Johannes Hevelius's instruments in his Machina Coelestis (1673) in the context of Hevelius's contested cometary observations and his debate with Hooke over telescopic sights. Seen thus, the images promote a crafted vision of Hevelius's astronomical practice and skills, constituting a careful self-presentation to his distant professional network and a claim as to which instrumental techniques guarantee accurate observations. Reviewing the reception of the images, the article explores how visual rhetoric may be invoked and challenged in the context of controversy, and suggests renewed analytical attention to the role of laboratory imagery in instrumental cultures in the history of science.

  10. The Mars Organic Molecule Analyzer (MOMA) Instrument: Characterization of Organic Material in Martian Sediments

    PubMed Central

    Goesmann, Fred; Brinckerhoff, William B.; Raulin, François; Danell, Ryan M.; Getty, Stephanie A.; Siljeström, Sandra; Mißbach, Helge; Steininger, Harald; Arevalo, Ricardo D.; Buch, Arnaud; Freissinet, Caroline; Grubisic, Andrej; Meierhenrich, Uwe J.; Pinnick, Veronica T.; Stalport, Fabien; Szopa, Cyril; Vago, Jorge L.; Lindner, Robert; Schulte, Mitchell D.; Brucato, John Robert; Glavin, Daniel P.; Grand, Noel; Li, Xiang; van Amerom, Friso H. W.

    2017-01-01

    Abstract The Mars Organic Molecule Analyzer (MOMA) instrument onboard the ESA/Roscosmos ExoMars rover (to launch in July, 2020) will analyze volatile and refractory organic compounds in martian surface and subsurface sediments. In this study, we describe the design, current status of development, and analytical capabilities of the instrument. Data acquired on preliminary MOMA flight-like hardware and experimental setups are also presented, illustrating their contribution to the overall science return of the mission. Key Words: Mars—Mass spectrometry—Life detection—Planetary instrumentation. Astrobiology 17, 655–685.

  11. TOPLEX: Teleoperated Lunar Explorer. Instruments and Operational Concepts for an Unmanned Lunar Rover

    NASA Technical Reports Server (NTRS)

    Blacic, James D.

    1992-01-01

    A Teleoperated Lunar Explorer, or TOPLEX, consisting of a lunar lander payload in which a small, instrument-carrying lunar surface rover is robotically landed and teleoperated from Earth to perform extended lunar geoscience and resource evaluation traverses is proposed. The rover vehicle would mass about 100 kg and carry approximately 100 kg of analytic instruments. Four instruments are envisioned: (1) a Laser-Induced Breakdown Spectrometer (LIBS) for geochemical analysis at ranges up to 100 m, capable of operating in three different modes; (2) a combined x-ray fluorescence and x-ray diffraction (XRF/XRD) instrument for elemental and mineralogic analysis of acquired samples; (3) a mass spectrometer system for stepwise heating analysis of gases released from acquired samples; and (4) a geophysical instrument package for subsurface mapping of structures such as lava tubes.

  12. MEMS-Based Micro Instruments for In-Situ Planetary Exploration

    NASA Technical Reports Server (NTRS)

    George, Thomas; Urgiles, Eduardo R; Toda, Risaku; Wilcox, Jaroslava Z.; Douglas, Susanne; Lee, C-S.; Son, Kyung-Ah; Miller, D.; Myung, N.; Madsen, L.; hide

    2005-01-01

    NASA's planetary exploration strategy is primarily targeted to the detection of extant or extinct signs of life. Thus, the agency is moving towards more in-situ landed missions as evidenced by the recent, successful demonstration of twin Mars Exploration Rovers. Also, future robotic exploration platforms are expected to evolve towards sophisticated analytical laboratories composed of multi-instrument suites. MEMS technology is very attractive for in-situ planetary exploration because of the promise of a diverse and capable set of advanced, low mass and low-power devices and instruments. At JPL, we are exploiting this diversity of MEMS for the development of a new class of miniaturized instruments for planetary exploration. In particular, two examples of this approach are the development of an Electron Luminescence X-ray Spectrometer (ELXS), and a Force-Detected Nuclear Magnetic Resonance (FDNMR) Spectrometer.

  13. Analytic Solution of the Problem of Additive Formation of an Inhomogeneous Elastic Spherical Body in an Arbitrary Nonstationary Central Force Field

    NASA Astrophysics Data System (ADS)

    Parshin, D. A.

    2017-09-01

    We study the processes of additive formation of spherically shaped rigid bodies due to the uniform accretion of additional matter to their surface in an arbitrary centrally symmetric force field. A special case of such a field can be the gravitational or electrostatic force field. We consider the elastic deformation of the formed body. The body is assumed to be isotropic with elasticmoduli arbitrarily varying along the radial coordinate.We assume that arbitrary initial circular stresses can arise in the additional material added to the body in the process of its formation. In the framework of linear mechanics of growing bodies, the mathematical model of the processes under study is constructed in the quasistatic approximation. The boundary value problems describing the development of stress-strain state of the object under study before the beginning of the process and during the entire process of its formation are posed. The closed analytic solutions of the posed problems are constructed by quadratures for some general types of material inhomogeneity. Important typical characteristics of the mechanical behavior of spherical bodies additively formed in the central force field are revealed. These characteristics substantially distinguish such bodies from the already completely composed bodies similar in dimensions and properties which are placed in the force field and are described by problems of mechanics of deformable solids in the classical statement disregarding the mechanical aspects of additive processes.

  14. [Physical, chemical and morphological urine examination guidelines for the Analytical Phase from the Intersociety Urinalysis Group].

    PubMed

    Manoni, Fabio; Gessoni, Gianluca; Fogazzi, Giovanni Battista; Alessio, Maria Grazia; Caleffi, Alberta; Gambaro, Giovanni; Epifani, Maria Grazia; Pieretti, Barbara; Perego, Angelo; Ottomano, Cosimo; Saccani, Graziella; Valverde, Sara; Secchiero, Sandra

    2016-01-01

    With these guidelines the Intersociety Urinalysis Group (GIAU) aims to stimulate the following aspects: Improvement and standardization of the analytical approach to physical, chemical and morphological urine examination (ECMU). Improvement of the chemical analysis of urine with particular regard to the reconsideration of the diagnostic significance of the parameters that are traditionally evaluated in dipstick analysis together with an increasing awareness of the limits of sensitivity and specificity of this analytical method. Increase the awareness of the importance of professional skills in the field of urinary morphology and the relationship with the clinicians. Implement a policy of evaluation of the analytical quality by using, in addition to traditional internal and external controls, a program for the evaluation of morphological competence. Stimulate the diagnostics industry to focus research efforts and development methodology and instrumental catering on the needs of clinical diagnosis. The hope is to revalue the enormous diagnostic potential of 'ECMU, implementing a urinalysis on personalized diagnostic needs for each patient. Emphasize the value added to ECMU by automated analyzers for the study of the morphology of the corpuscular fraction urine. The hope is to revalue the enormous potential diagnostic of 'ECMU, implementing a urinalysis on personalized diagnostic needs that each patient brings with it.

  15. Quantifying risks with exact analytical solutions of derivative pricing distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin

    2017-04-01

    Derivative (i.e. option) pricing is essential for modern financial instrumentations. Despite of the previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing for the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing characterized by the distribution tail and their associations to value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.

  16. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample.

    PubMed

    Wang, Ching-Yun; Song, Xiao

    2016-11-01

    Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women's Health Initiative. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample

    PubMed Central

    Wang, Ching-Yun; Song, Xiao

    2017-01-01

    SUMMARY Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women’s Health Initiative. PMID:27546625

  18. How discriminating are discriminative instruments?

    PubMed

    Hankins, Matthew

    2008-05-27

    The McMaster framework introduced by Kirshner & Guyatt is the dominant paradigm for the development of measures of health status and health-related quality of life (HRQL). The framework defines the functions of such instruments as evaluative, predictive or discriminative. Evaluative instruments are required to be sensitive to change (responsiveness), but there is no corresponding index of the degree to which discriminative instruments are sensitive to cross-sectional differences. This paper argues that indices of validity and reliability are not sufficient to demonstrate that a discriminative instrument performs its function of discriminating between individuals, and that the McMaster framework would be augmented by the addition of a separate index of discrimination. The coefficient proposed by Ferguson (Delta) is easily adapted to HRQL instruments and is a direct, non-parametric index of the degree to which an instrument distinguishes between individuals. While Delta should prove useful in the development and evaluation of discriminative instruments, further research is required to elucidate the relationship between the measurement properties of discrimination, reliability and responsiveness.

  19. Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System

    NASA Astrophysics Data System (ADS)

    Lee, Chang Jae; Yun, Jae Hee

    2017-06-01

    Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.

  20. The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.

    PubMed

    Enke, Christie G

    2015-12-15

    The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided an unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage. Rather it added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time have improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the doors for the development of new techniques in all areas of chemical analysis. Along with some interesting history, the multiple roles op amps play in modern instrumentation are described along with a number of examples of new areas of analysis that have been enabled by their improvements.

  1. Walking towards Instrumental Appropriation of Mobile Devices. A Comparison of Studies

    ERIC Educational Resources Information Center

    Hernandez Serrano, Maria José; Yang, Lingling

    2013-01-01

    The study of instrumental appropriation is considered a relevant outstanding and productive perspective in the arena of Mobile ICT and learning. This paper seeks for the consolidation of this perspective at a theoretical and analytical level. Regarding the theoretical level, two characteristics of mobile devices--flexibility and mobility--are…

  2. 14 CFR 25.1333 - Instrument systems.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... For systems that operate the instruments required by § 25.1303(b) which are located at each pilot's... operating systems which are independent of the operating systems at other flight crew stations, or other...) Additional instruments, systems, or equipment may not be connected to the operating systems for the required...

  3. 14 CFR 25.1333 - Instrument systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... For systems that operate the instruments required by § 25.1303(b) which are located at each pilot's... operating systems which are independent of the operating systems at other flight crew stations, or other...) Additional instruments, systems, or equipment may not be connected to the operating systems for the required...

  4. 14 CFR 25.1333 - Instrument systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... For systems that operate the instruments required by § 25.1303(b) which are located at each pilot's... operating systems which are independent of the operating systems at other flight crew stations, or other...) Additional instruments, systems, or equipment may not be connected to the operating systems for the required...

  5. 14 CFR 25.1333 - Instrument systems.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... For systems that operate the instruments required by § 25.1303(b) which are located at each pilot's... operating systems which are independent of the operating systems at other flight crew stations, or other...) Additional instruments, systems, or equipment may not be connected to the operating systems for the required...

  6. 14 CFR 25.1333 - Instrument systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... For systems that operate the instruments required by § 25.1303(b) which are located at each pilot's... operating systems which are independent of the operating systems at other flight crew stations, or other...) Additional instruments, systems, or equipment may not be connected to the operating systems for the required...

  7. Management of surgical instruments with radio frequency identification tags.

    PubMed

    Kusuda, Kaori; Yamashita, Kazuhiko; Ohnishi, Akiko; Tanaka, Kiyohito; Komino, Masaru; Honda, Hiroshi; Tanaka, Shinichi; Okubo, Takashi; Tripette, Julien; Ohta, Yuji

    2016-01-01

    To prevent malpractices, medical staff has adopted inventory time-outs and/or checklists. Accurate inventory and maintenance of surgical instruments decreases the risk of operating room miscounting and malfunction. In our previous study, an individual management of surgical instruments was accomplished using Radio Frequency Identification (RFID) tags. The purpose of this paper is to evaluate a new management method of RFID-tagged instruments. The management system of RFID-tagged surgical instruments was used for 27 months in clinical areas. In total, 13 study participants assembled surgical trays in the central sterile supply department. While using the management system, trays were assembled 94 times. During this period, no assembly errors occurred. An instrument malfunction had occurred after the 19th, 56th, and 73 th uses, no malfunction caused by the RFID tags, and usage history had been recorded. Additionally, the time it took to assemble surgical trays was recorded, and the long-term usability of the management system was evaluated. The system could record the number of uses and the defective history of each surgical instrument. In addition, the history of the frequency of instruments being transferred from one tray to another was recorded. The results suggest that our system can be used to manage instruments safely. Additionally, the management system was acquired of the learning effect and the usability on daily maintenance. This finding suggests that the management system examined here ensures surgical instrument and tray assembly quality.

  8. Laboratory evaluation of alcohol safety interlock systems. Volume 3 : instrument performance at high BAL

    DOT National Transportation Integrated Search

    1974-01-01

    This report contains the results of an experimental and analytical evaluation of instruments and techniques designed to prevent an intoxicated driver from operating his automobile. The prototype 'Alcohol Safety Interlock Systems' tested were develope...

  9. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.

  10. Unexpected Analyte Oxidation during Desorption Electrospray Ionization - Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasilis, Sofie P; Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

    During the analysis of surface spotted analytes using desorption electrospray ionization mass spectrometry (DESI-MS), abundant ions are sometimes observed that appear to be the result of oxygen addition reactions. In this investigation, the effect of sample aging, the ambient lab environment, spray voltage, analyte surface concentration, and surface type on this oxidative modification of spotted analytes, exemplified by tamoxifen and reserpine, during analysis by desorption electrospray ionization mass spectrometry was studied. Simple exposure of the samples to air and to ambient lighting increased the extent of oxidation. Increased spray voltage lead also to increased analyte oxidation, possibly as a resultmore » of oxidative species formed electrochemically at the emitter electrode or in the gas - phase by discharge processes. These oxidative species are carried by the spray and impinge on and react with the sampled analyte during desorption/ionization. The relative abundance of oxidized species was more significant for analysis of deposited analyte having a relatively low surface concentration. Increasing spray solvent flow rate and addition of hydroquinone as a redox buffer to the spray solvent were found to decrease, but not entirely eliminate, analyte oxidation during analysis. The major parameters that both minimize and maximize analyte oxidation were identified and DESI-MS operational recommendations to avoid these unwanted reactions are suggested.« less

  11. FTIR instrumentation to monitor vapors from Shuttle tile waterproofing materials

    NASA Technical Reports Server (NTRS)

    Mattson, C. B.; Schwindt, C. J.

    1995-01-01

    The Space Shuttle Thermal Protection System (TPS) tiles and blankets are waterproofed using DimethylEthoxySilane (DMEX) in the Orbiter Processing Facilities (OPF). DMES has a Threshold Limit Value (TLV) for exposure of personnel to vapor concentration in air of 0.5 ppm. The OPF high bay cannot be opened for normal work after a waterproofing operation until the DMES concentration is verified by measurement to be below the TLV. On several occasions the high bay has been kept closed for up to 8 hours following waterproofing operations due to high DMES measurements. In addition, the Miran 203 and Miran 1 BX infrared analyzers calibrated at different wavelengths gave different readings under the same conditions. There was reason to believe that some of the high DMES concentration readings were caused by interference form water and ethanol vapors. The Toxic Vapor Detection Laboratory (TVDL) was asked to test the existing DMES instruments and identify the best qualified instrument. In addition the TVDL was requested to develop instrumentation to ensure the OPF high bay could be opened safely as soon as possible after a waterproofing operation. A Fourier Transform Infrared (FTIR) spectrophotometer instrument developed for an earlier project was reprogrammed to measure DMES vapor along with ethanol, water, and several common solvent vapors. The FTIR was then used to perform a series of laboratory and field tests to evaluate the performance of the single wavelength IR instruments in use. The results demonstrated that the single wavelength IR instruments did respond to ethanol and water vapors, more or less depending on the analytical IR wavelength selected. The FTIR was able to separate the responses to DMES, water and ethanol, and give consistent readings for the DMES vapor concentration. The FTIR was then deployed to the OPF to monitor real waterproofing operations. The FTIR was also used to measure the time for DMES to evaporate from TPS tile under a range of humidity

  12. A new generation of x-ray spectrometry UHV instruments at the SR facilities BESSY II, ELETTRA and SOLEIL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lubeck, J., E-mail: janin.lubeck@ptb.de; Fliegauf, R.; Holfelder, I.

    A novel type of ultra-high vacuum instrument for X-ray reflectometry and spectrometry-related techniques for nanoanalytics by means of synchrotron radiation (SR) has been constructed and commissioned at BESSY II. This versa-tile instrument was developed by the PTB, Germany’s national metrology institute, and includes a 9-axis manipulator that allows for an independent alignment of the samples with respect to all degrees of freedom. In addition, it integrates a rotational and translational movement of several photodiodes as well as a translational movement of a beam-geometry-defining aperture system. Thus, the new instrument enables various analytical techniques based on energy dispersive X-ray detectors suchmore » as reference-free X-Ray Fluorescence (XRF) analysis, total-reflection XRF, grazing-incidence XRF, in addition to optional X-Ray Reflectometry (XRR) measurements or polarization-dependent X-ray absorption fine structure analyses (XAFS). Samples having a size of up to (100 × 100) mm{sup 2}; can be analyzed with respect to their mass deposition, elemental, spatial or species composition. Surface contamination, nanolayer composition and thickness, depth pro-file of matrix elements or implants, nanoparticles or buried interfaces as well as molecular orientation of bonds can be accessed. Three technology transfer projects of adapted instruments have enhanced X-Ray Spectrometry (XRS) research activities within Europe at the synchrotron radiation facilities ELETTRA (IAEA) and SOLEIL (CEA/LNE-LNHB) as well as at the X-ray innovation laboratory BLiX (TU Berlin) where different laboratory sources are used. Here, smaller chamber requirements led PTB in cooperation with TU Berlin to develop a modified instrument equipped with a 7-axis manipulator: reduced freedom in the choice of experimental geometry modifications (absence of out-of-SR-plane and reference-free XRS options) has been compensated by encoder-enhanced angular accuracy for GIXRF and XRR.« less

  13. Instrument for Real-Time Digital Nucleic Acid Amplification on Custom Microfluidic Devices

    PubMed Central

    Selck, David A.

    2016-01-01

    Nucleic acid amplification tests that are coupled with a digital readout enable the absolute quantification of single molecules, even at ultralow concentrations. Digital methods are robust, versatile and compatible with many amplification chemistries including isothermal amplification, making them particularly invaluable to assays that require sensitive detection, such as the quantification of viral load in occult infections or detection of sparse amounts of DNA from forensic samples. A number of microfluidic platforms are being developed for carrying out digital amplification. However, the mechanistic investigation and optimization of digital assays has been limited by the lack of real-time kinetic information about which factors affect the digital efficiency and analytical sensitivity of a reaction. Commercially available instruments that are capable of tracking digital reactions in real-time are restricted to only a small number of device types and sample-preparation strategies. Thus, most researchers who wish to develop, study, or optimize digital assays rely on the rate of the amplification reaction when performed in a bulk experiment, which is now recognized as an unreliable predictor of digital efficiency. To expand our ability to study how digital reactions proceed in real-time and enable us to optimize both the digital efficiency and analytical sensitivity of digital assays, we built a custom large-format digital real-time amplification instrument that can accommodate a wide variety of devices, amplification chemistries and sample-handling conditions. Herein, we validate this instrument, we provide detailed schematics that will enable others to build their own custom instruments, and we include a complete custom software suite to collect and analyze the data retrieved from the instrument. We believe assay optimizations enabled by this instrument will improve the current limits of nucleic acid detection and quantification, improving our fundamental

  14. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the Word Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  15. A simple energy filter for low energy electron microscopy/photoelectron emission microscopy instruments.

    PubMed

    Tromp, R M; Fujikawa, Y; Hannon, J B; Ellis, A W; Berghaus, A; Schaff, O

    2009-08-05

    Addition of an electron energy filter to low energy electron microscopy (LEEM) and photoelectron emission microscopy (PEEM) instruments greatly improves their analytical capabilities. However, such filters tend to be quite complex, both electron optically and mechanically. Here we describe a simple energy filter for the existing IBM LEEM/PEEM instrument, which is realized by adding a single scanning aperture slit to the objective transfer optics, without any further modifications to the microscope. This energy filter displays a very high energy resolution ΔE/E = 2 × 10(-5), and a non-isochromaticity of ∼0.5 eV/10 µm. The setup is capable of recording selected area electron energy spectra and angular distributions at 0.15 eV energy resolution, as well as energy filtered images with a 1.5 eV energy pass band at an estimated spatial resolution of ∼10 nm. We demonstrate the use of this energy filter in imaging and spectroscopy of surfaces using a laboratory-based He I (21.2 eV) light source, as well as imaging of Ag nanowires on Si(001) using the 4 eV energy loss Ag plasmon.

  16. Mechanical Properties of Additively Manufactured Thick Honeycombs.

    PubMed

    Hedayati, Reza; Sadighi, Mojtaba; Mohammadi Aghdam, Mohammad; Zadpoor, Amir Abbas

    2016-07-23

    Honeycombs resemble the structure of a number of natural and biological materials such as cancellous bone, wood, and cork. Thick honeycomb could be also used for energy absorption applications. Moreover, studying the mechanical behavior of honeycombs under in-plane loading could help understanding the mechanical behavior of more complex 3D tessellated structures such as porous biomaterials. In this paper, we study the mechanical behavior of thick honeycombs made using additive manufacturing techniques that allow for fabrication of honeycombs with arbitrary and precisely controlled thickness. Thick honeycombs with different wall thicknesses were produced from polylactic acid (PLA) using fused deposition modelling, i.e., an additive manufacturing technique. The samples were mechanically tested in-plane under compression to determine their mechanical properties. We also obtained exact analytical solutions for the stiffness matrix of thick hexagonal honeycombs using both Euler-Bernoulli and Timoshenko beam theories. The stiffness matrix was then used to derive analytical relationships that describe the elastic modulus, yield stress, and Poisson's ratio of thick honeycombs. Finite element models were also built for computational analysis of the mechanical behavior of thick honeycombs under compression. The mechanical properties obtained using our analytical relationships were compared with experimental observations and computational results as well as with analytical solutions available in the literature. It was found that the analytical solutions presented here are in good agreement with experimental and computational results even for very thick honeycombs, whereas the analytical solutions available in the literature show a large deviation from experimental observation, computational results, and our analytical solutions.

  17. Mechanical Properties of Additively Manufactured Thick Honeycombs

    PubMed Central

    Hedayati, Reza; Sadighi, Mojtaba; Mohammadi Aghdam, Mohammad; Zadpoor, Amir Abbas

    2016-01-01

    Honeycombs resemble the structure of a number of natural and biological materials such as cancellous bone, wood, and cork. Thick honeycomb could be also used for energy absorption applications. Moreover, studying the mechanical behavior of honeycombs under in-plane loading could help understanding the mechanical behavior of more complex 3D tessellated structures such as porous biomaterials. In this paper, we study the mechanical behavior of thick honeycombs made using additive manufacturing techniques that allow for fabrication of honeycombs with arbitrary and precisely controlled thickness. Thick honeycombs with different wall thicknesses were produced from polylactic acid (PLA) using fused deposition modelling, i.e., an additive manufacturing technique. The samples were mechanically tested in-plane under compression to determine their mechanical properties. We also obtained exact analytical solutions for the stiffness matrix of thick hexagonal honeycombs using both Euler-Bernoulli and Timoshenko beam theories. The stiffness matrix was then used to derive analytical relationships that describe the elastic modulus, yield stress, and Poisson’s ratio of thick honeycombs. Finite element models were also built for computational analysis of the mechanical behavior of thick honeycombs under compression. The mechanical properties obtained using our analytical relationships were compared with experimental observations and computational results as well as with analytical solutions available in the literature. It was found that the analytical solutions presented here are in good agreement with experimental and computational results even for very thick honeycombs, whereas the analytical solutions available in the literature show a large deviation from experimental observation, computational results, and our analytical solutions. PMID:28773735

  18. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  19. The cobas p 630 instrument: a dedicated pre-analytic solution to optimize COBAS® AmpliPrep/COBAS® TaqMan® system workflow and turn-around-time.

    PubMed

    Vallefuoco, L; Sorrentino, R; Spalletti Cernia, D; Colucci, G; Portella, G

    2012-12-01

    The cobas p 630, a fully automated pre-analytical instrument for primary tube handling recently introduced to complete the Cobas(®) TaqMan systems portfolio, was evaluated in conjunction with: the COBAS(®) AmpliPrep/COBAS(®) TaqMan HBV Test, v2.0, COBAS(®) AmpliPrep/COBAS(®) TaqMan HCV Test, v1.0 and COBAS(®) AmpliPrep/COBAS(®) TaqMan HIV Test, v2.0. The instrument performance in transferring samples from primary to secondary tubes, its impact in improving COBAS(®) AmpliPrep/COBAS(®) TaqMan workflow and hands-on reduction and the risk of possible cross-contamination were assessed. Samples from 42 HBsAg positive, 42 HCV and 42 HIV antibody (Ab) positive patients as well as 21 healthy blood donors were processed with or without automated primary tubes. HIV, HCV and HBsAg positive samples showed a correlation index of 0.999, 0.987 and of 0.994, respectively. To assess for cross-contamination, high titer HBV DNA positive samples, HCV RNA and HIV RNA positive samples were distributed in the cobas p 630 in alternate tube positions, adjacent to negative control samples within the same rack. None of the healthy donor samples showed any reactivity. Based on these results, the cobas p 630 can improve workflow and sample tracing in laboratories performing molecular tests, and reduce turnaround time, errors, and risks. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Development of a canopy Solar-induced chlorophyll fluorescence measurement instrument

    NASA Astrophysics Data System (ADS)

    Sun, G.; Wang, X.; Niu, Zh; Chen, F.

    2014-02-01

    A portable solar-induced chlorophyll fluorescence detecting instrument based on Fraunhofer line principle was designed and tested. The instrument has a valid survey area of 1.3 × 1.3 meter when the height was fixed to 1.3 meter. The instrument uses sunlight as its light source. The instrument is quipped with two sets of special photoelectrical detectors with the centre wavelength at 760 nm and 771 nm respectively and bandwidth less than 1nm. Both sets of detectors are composed of an upper detector which are used for detecting incidence sunlight and a bottom detector which are used for detecting reflex light from the canopy of crop. This instrument includes photoelectric detector module, signal process module, A/D convert module, the data storage and upload module and human-machine interface module. The microprocessor calculates solar-induced fluorescence value based on the A/D values get from detectors. And the value can be displayed on the instrument's LCD, stored in the flash memory of instrument and can also be uploaded to PC through the PC's serial interface. The prototype was tested in the crop field and the results demonstrate that the instrument can measure the solar-induced chlorophyll value exactly with the correlation coefficients was 0.9 compared to the values got from Analytical Spectral Devices FieldSpec Pro spectrometer. This instrument can diagnose the plant growth status by the acquired spectral response.

  1. The Development of Proofs in Analytical Mathematics for Undergraduate Students

    NASA Astrophysics Data System (ADS)

    Ali, Maselan; Sufahani, Suliadi; Hasim, Nurnazifa; Saifullah Rusiman, Mohd; Roslan, Rozaini; Mohamad, Mahathir; Khalid, Kamil

    2018-04-01

    Proofs in analytical mathematics are essential parts of mathematics, difficult to learn because its underlying concepts are not visible. This research consists of problems involving logic and proofs. In this study, a short overview was provided on how proofs in analytical mathematics were used by university students. From the results obtained, excellent students obtained better scores compared to average and poor students. The research instruments used in this study consisted of two parts: test and interview. In this way, analysis of students’ actual performances can be obtained. The result of this study showed that the less able students have fragile conceptual and cognitive linkages but the more able students use their strong conceptual linkages to produce effective solutions

  2. Assessing Proposals for New Global Health Treaties: An Analytic Framework.

    PubMed

    Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio

    2015-08-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.

  3. Assessing Proposals for New Global Health Treaties: An Analytic Framework

    PubMed Central

    Røttingen, John-Arne; Frenk, Julio

    2015-01-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties. PMID:26066926

  4. Automated Flow-Injection Instrument for Chemiluminescence Detection Using a Low-Cost Photodiode Detector

    ERIC Educational Resources Information Center

    Economou, A.; Papargyris, D.; Stratis, J.

    2004-01-01

    The development of an FI analyzer for chemiluminescence detection using a low-cost photoiodide is presented. The experiment clearly demonstrates in a single interdisciplinary project the way in which different aspects in chemical instrumentation fit together to produce a working analytical system.

  5. SOLID2: An Antibody Array-Based Life-Detector Instrument in a Mars Drilling Simulation Experiment (MARTE)

    NASA Astrophysics Data System (ADS)

    Parro, Víctor; Fernández-Calvo, Patricia; Rodríguez Manfredi, José A.; Moreno-Paz, Mercedes; Rivas, Luis A.; García-Villadangos, Miriam; Bonaccorsi, Rosalba; González-Pastor, José Eduardo; Prieto-Ballesteros, Olga; Schuerger, Andrew C.; Davidson, Mark; Gómez-Elvira, Javier; Stoker, Carol R.

    2008-10-01

    A field prototype of an antibody array-based life-detector instrument, Signs Of LIfe Detector (SOLID2), has been tested in a Mars drilling mission simulation called MARTE (Mars Astrobiology Research and Technology Experiment). As one of the analytical instruments on the MARTE robotic drilling rig, SOLID2 performed automatic sample processing and analysis of ground core samples (0.5 g) with protein microarrays that contained 157 different antibodies. Core samples from different depths (down to 5.5 m) were analyzed, and positive reactions were obtained in antibodies raised against the Gram-negative bacterium Leptospirillum ferrooxidans, a species of the genus Acidithiobacillus (both common microorganisms in the Río Tinto area), and extracts from biofilms and other natural samples from the Río Tinto area. These positive reactions were absent when the samples were previously subjected to a high-temperature treatment, which indicates the biological origin and structural dependency of the antibody-antigen reactions. We conclude that an antibody array-based life-detector instrument like SOLID2 can detect complex biological material, and it should be considered as a potential analytical instrument for future planetary missions that search for life.

  6. SOLID2: an antibody array-based life-detector instrument in a Mars Drilling Simulation Experiment (MARTE).

    PubMed

    Parro, Víctor; Fernández-Calvo, Patricia; Rodríguez Manfredi, José A; Moreno-Paz, Mercedes; Rivas, Luis A; García-Villadangos, Miriam; Bonaccorsi, Rosalba; González-Pastor, José Eduardo; Prieto-Ballesteros, Olga; Schuerger, Andrew C; Davidson, Mark; Gómez-Elvira, Javier; Stoker, Carol R

    2008-10-01

    A field prototype of an antibody array-based life-detector instrument, Signs Of LIfe Detector (SOLID2), has been tested in a Mars drilling mission simulation called MARTE (Mars Astrobiology Research and Technology Experiment). As one of the analytical instruments on the MARTE robotic drilling rig, SOLID2 performed automatic sample processing and analysis of ground core samples (0.5 g) with protein microarrays that contained 157 different antibodies. Core samples from different depths (down to 5.5 m) were analyzed, and positive reactions were obtained in antibodies raised against the Gram-negative bacterium Leptospirillum ferrooxidans, a species of the genus Acidithiobacillus (both common microorganisms in the Río Tinto area), and extracts from biofilms and other natural samples from the Río Tinto area. These positive reactions were absent when the samples were previously subjected to a high-temperature treatment, which indicates the biological origin and structural dependency of the antibody-antigen reactions. We conclude that an antibody array-based life-detector instrument like SOLID2 can detect complex biological material, and it should be considered as a potential analytical instrument for future planetary missions that search for life.

  7. Addition of instrumented fusion after posterior decompression surgery suppresses thickening of ossification of the posterior longitudinal ligament of the cervical spine.

    PubMed

    Ota, Mitsutoshi; Furuya, Takeo; Maki, Satoshi; Inada, Taigo; Kamiya, Koshiro; Ijima, Yasushi; Saito, Junya; Takahashi, Kazuhisa; Yamazaki, Masashi; Aramomi, Masaaki; Mannoji, Chikato; Koda, Masao

    2016-12-01

    Laminoplasty (LMP) is a widely accepted surgical procedure for ossification of the posterior longitudinal ligament (OPLL) of the cervical spine. Progression of OPLL can occur in the long term after LMP. The aim of the present study was to determine whether addition of the instrumented fusion, (posterior decompression with instrumented fusion [PDF]), can suppress progression of OPLL or not. The present study included 50 patients who underwent LMP (n=23) or PDF (n=27) for OPLL of the cervical spine. We performed open door laminoplasty. PDF surgery was performed by double-door laminoplasty followed by instrumented fusion. We observed the non-ossified segment of the OPLL and measured the thickness of the OPLL at the thickest segment with pre- and postoperative sagittal CT multi-planar reconstruction images. Postoperative CT scan revealed fusion of the non-ossified segment of the OPLL was obtained in 4/23 patients (17%) in the LPM group and in 23/27 patients (85%) in the PDF group, showing a significant difference between both groups (p=0.003). Progression of the thickness of the OPLL in the PDF group (-0.1±0.4mm) was significantly smaller than in the LMP group (0.6±0.7mm, p=0.0002). The proportion of patients showing the decrease in thickness of OPLL was significantly larger in the PDF group (6/27 patients; 22%) than in the LMP group (0/23 patients; 0%, p=0.05). In conclusion, PDF surgery can suppress the thickening of OPLL. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Progress along the E-ELT instrumentation roadmap

    NASA Astrophysics Data System (ADS)

    Ramsay, Suzanne; Casali, Mark; Cirasuolo, Michele; Egner, Sebastian; Gray, Peter; Gonzáles Herrera, Juan Carlos; Hammersley, Peter; Haupt, Christoph; Ives, Derek; Jochum, Lieselotte; Kasper, Markus; Kerber, Florian; Lewis, Steffan; Mainieri, Vincenzo; Manescau, Antonio; Marchetti, Enrico; Oberti, Sylvain; Padovani, Paolo; Schmid, Christian; Schimpelsberger, Johannes; Siebenmorgen, Ralf; Szecsenyi, Orsolya; Tamai, Roberto; Vernet, Joël.

    2016-08-01

    A suite of seven instruments and associated AO systems have been planned as the "E-ELT Instrumentation Roadmap". Following the E-ELT project approval in December 2014, rapid progress has been made in organising and signing the agreements for construction with European universities and institutes. Three instruments (HARMONI, MICADO and METIS) and one MCAO module (MAORY) have now been approved for construction. In addition, Phase-A studies have begun for the next two instruments - a multi-object spectrograph and high-resolution spectrograph. Technology development is also ongoing in preparation for the final instrument in the roadmap, the planetary camera and spectrograph. We present a summary of the status and capabilities of this first set of instruments for the E-ELT.

  9. Thirty Meter Telescope science instruments: a status report

    NASA Astrophysics Data System (ADS)

    Simard, Luc; Ellerbroek, Brent; Bhatia, Ravinder; Radovan, Matthew; Chisholm, Eric

    2016-08-01

    An overview of the current status of the science instruments for the Thirty Meter Telescope is presented. Three first-light instruments as well as a science calibration unit for AO-assisted instruments are under development. Developing instrument collaborations that can design and build these challenging instruments remains an area of intense activity. In addition to the instruments themselves, a preliminary design for a facility cryogenic cooling system based on gaseous helium turbine expanders has been completed. This system can deliver a total of 2.4 kilowatts of cooling power at 65K to the instruments with essentially no vibrations. Finally, the process for developing future instruments beyond first light has been extensively discussed and will get under way in early 2017.

  10. Career Decision Statuses among Portuguese Secondary School Students: A Cluster Analytical Approach

    ERIC Educational Resources Information Center

    Santos, Paulo Jorge; Ferreira, Joaquim Armando

    2012-01-01

    Career indecision is a complex phenomenon and an increasing number of authors have proposed that undecided individuals do not form a group with homogeneous characteristics. This study examines career decision statuses among a sample of 362 12th-grade Portuguese students. A cluster-analytical procedure, based on a battery of instruments designed to…

  11. A comparison of analytical laboratory and optical in situ methods for the measurement of nitrate in north Florida water bodies

    NASA Astrophysics Data System (ADS)

    Rozin, A. G.; Clark, M. W.

    2013-12-01

    Assessing the impact of nutrient concentrations on aquatic ecosystems requires an in depth understanding of dynamic biogeochemical cycles that are often a challenge to monitor at the high spatial and temporal resolution necessary to understand these complex processes. Traditional sampling approaches involving discrete samples and laboratory analyses can be constrained by analytical costs, field time, and logistical details that can fail to accurately capture both spatial and temporal changes. Optical in situ instruments may provide the opportunity to continuously monitor a variety of water quality parameters at a high spatial or temporal resolution. This work explores the suitability of a Submersible Ultraviolet Nitrate Analyzer (SUNA), produced by Satlantic, to accurately assess in situ nitrate concentration in several freshwater systems in north Florida. The SUNA was deployed to measure nitrate at five different water bodies selected to represent a range of watershed land uses and water chemistry in the region. In situ nitrate measurements were compared to standard laboratory methods to evaluate the effectiveness of the SUNA's operation. Other optical sensors were used to measure the spectral properties of absorbance, fluorescence, and turbidity (scatter) in the same Florida water bodies. Data from these additional sensors were collected to quantify possible interferences that may affect SUNA performance. In addition, data from the SUNA and other sensors are being used to infer information about the quality and quantity of aqueous constituents besides nitrate. A better understanding of the capabilities and possible limitations of these relatively new analytical instruments will allow researchers to more effectively investigate biogeochemical processes and nutrient transport and enhance decision-making to protect our water bodies.

  12. Advancements in the safe identification of explosives using a Raman handheld instrument (ACE-ID)

    NASA Astrophysics Data System (ADS)

    Arnó, Josep; Frunzi, Michael; Kittredge, Marina; Sparano, Brian

    2014-05-01

    Raman spectroscopy is the technology of choice to identify bulk solid and liquid phase unknown samples without the need to contact the substance. Materials can be identified through transparent and semi-translucent containers such as plastic and glass. ConOps in emergency response and military field applications require the redesign of conventional laboratory units for: field portability; shock, thermal and chemical attack resistance; easy and intuitive use in restrictive gear; reduced size, weight, and power. This article introduces a new handheld instrument (ACE-IDTM) designed to take Raman technology to the next level in terms of size, safety, speed, and analytical performance. ACE-ID is ruggedized for use in severe climates and terrains. It is lightweight and can be operated with just one hand. An intuitive software interface guides users through the entire identification process, making it easy-to-use by personnel of different skill levels including military explosive ordinance disposal technicians, civilian bomb squads and hazmat teams. Through the use of embedded advanced algorithms, the instrument is capable of providing fluorescence correction and analysis of binary mixtures. Instrument calibration is performed automatically upon startup without requiring user intervention. ACE-ID incorporates an optical rastering system that diffuses the laser energy over the sample. This important innovation significantly reduces the heat induced in dark samples and the probability of ignition of susceptible explosive materials. In this article, the explosives identification performance of the instrument will be provided in addition to a quantitative evaluation of the safety improvements derived from the reduced ignition probabilities.

  13. Measurement instruments for automatically monitoring the water chemistry of reactor coolant at nuclear power stations equipped with VVER reactors. Selection of measurement instruments and experience gained from their operation at Russian and foreign NPSs

    NASA Astrophysics Data System (ADS)

    Ivanov, Yu. A.

    2007-12-01

    An analytical review is given of Russian and foreign measurement instruments employed in a system for automatically monitoring the water chemistry of the reactor coolant circuit and used in the development of projects of nuclear power stations equipped with VVER-1000 reactors and the nuclear station project AES 2006. The results of experience gained from the use of such measurement instruments at nuclear power stations operating in Russia and abroad are presented.

  14. An instrumental puzzle: the modular integration of AOLI

    NASA Astrophysics Data System (ADS)

    López, Roberto L.; Velasco, Sergio; Colodro-Conde, Carlos; Valdivia, Juan J. F.; Puga, Marta; Oscoz, Alejandro; Rebolo, Rafael; MacKay, Craig; Pérez-Garrido, Antonio; Rodríguez-Ramos, Luis Fernando; Rodríguez-Ramos, José Manuel M.; King, David; Labadie, Lucas; Muthusubramanian, Balaji; Rodríguez-Coira, Gustavo

    2016-08-01

    The Adaptive Optics Lucky Imager, AOLI, is an instrument developed to deliver the highest spatial resolution ever obtained in the visible, 20 mas, from ground-based telescopes. In AOLI a new philosophy of instrumental prototyping has been applied, based on the modularization of the subsystems. This modular concept offers maximum flexibility regarding the instrument, telescope or the addition of future developments.

  15. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  17. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    NASA Astrophysics Data System (ADS)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    In this research, the improvement of hybridization algorithm of Fuzzy Analytic Hierarchy Process (FAHP) with Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS) in selecting the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve the performance of the above research, FAHP algorithm hybridization with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm was adopted, which applied FAHP algorithm to the weighting process and SAW for the ranking process to determine the promotion of employee at a government institution. The result of improvement of the average value of Efficiency Rate (ER) is 85.24%, which means that this research has succeeded in improving the previous research that is equal to 77.82%. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.

  18. Mechanical properties of additively manufactured octagonal honeycombs.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-12-01

    Honeycomb structures have found numerous applications as structural and biomedical materials due to their favourable properties such as low weight, high stiffness, and porosity. Application of additive manufacturing and 3D printing techniques allows for manufacturing of honeycombs with arbitrary shape and wall thickness, opening the way for optimizing the mechanical and physical properties for specific applications. In this study, the mechanical properties of honeycomb structures with a new geometry, called octagonal honeycomb, were investigated using analytical, numerical, and experimental approaches. An additive manufacturing technique, namely fused deposition modelling, was used to fabricate the honeycomb from polylactic acid (PLA). The honeycombs structures were then mechanically tested under compression and the mechanical properties of the structures were determined. In addition, the Euler-Bernoulli and Timoshenko beam theories were used for deriving analytical relationships for elastic modulus, yield stress, Poisson's ratio, and buckling stress of this new design of honeycomb structures. Finite element models were also created to analyse the mechanical behaviour of the honeycombs computationally. The analytical solutions obtained using Timoshenko beam theory were close to computational results in terms of elastic modulus, Poisson's ratio and yield stress, especially for relative densities smaller than 25%. The analytical solutions based on the Timoshenko analytical solution and the computational results were in good agreement with experimental observations. Finally, the elastic properties of the proposed honeycomb structure were compared to those of other honeycomb structures such as square, triangular, hexagonal, mixed, diamond, and Kagome. The octagonal honeycomb showed yield stress and elastic modulus values very close to those of regular hexagonal honeycombs and lower than the other considered honeycombs. Copyright © 2016 Elsevier B.V. All rights

  19. Detecting Learning Strategies with Analytics: Links with Self-Reported Measures and Academic Performance

    ERIC Educational Resources Information Center

    Gaševic, Dragan; Jovanovic, Jelena; Pardo, Abelardo; Dawson, Shane

    2017-01-01

    The use of analytic methods for extracting learning strategies from trace data has attracted considerable attention in the literature. However, there is a paucity of research examining any association between learning strategies extracted from trace data and responses to well-established self-report instruments and performance scores. This paper…

  20. Opening Remarks for "Analytical Chemistry, Monitoring, and Environmental Fate and Transport" Session at Fluoros 2015

    EPA Science Inventory

    There have been a number of revolutionary developments during the past decade that have led to a much more comprehensive understanding of per- and polyfluoroalkyl substances (PFASs) in the environment. Improvements in analytical instrumentation have made liquid chromatography tri...

  1. Instrumental Analysis of Biodiesel Content in Commercial Diesel Blends: An Experiment for Undergraduate Analytical Chemistry

    ERIC Educational Resources Information Center

    Feng, Z. Vivian; Buchman, Joseph T.

    2012-01-01

    The potential of replacing petroleum fuels with renewable biofuels has drawn significant public interest. Many states have imposed biodiesel mandates or incentives to use commercial biodiesel blends. We present an inquiry-driven experiment where students are given the tasks to gather samples, develop analytical methods using various instrumental…

  2. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    PubMed Central

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples added with melamine standards was calculated and then the Euclidean norms of series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels 0~0.12% (w/w) indicate that SANAS obtained accurate results with the root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is to visualize and control the possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154

  3. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  4. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  5. ICDA: A Platform for Intelligent Care Delivery Analytics

    PubMed Central

    Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei

    2012-01-01

    The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA’s architecture is provided. Descriptions of four use cases are included to illustrate ICDA’s application within two different data environments. These use cases showcase the system’s flexibility and exemplify the types of analytics it enables. PMID:23304296

  6. Designing new guides and instruments using McStas

    NASA Astrophysics Data System (ADS)

    Farhi, E.; Hansen, T.; Wildes, A.; Ghosh, R.; Lefmann, K.

    With the increasing complexity of modern neutron-scattering instruments, the need for powerful tools to optimize their geometry and physical performances (flux, resolution, divergence, etc.) has become essential. As the usual analytical methods reach their limit of validity in the description of fine effects, the use of Monte Carlo simulations, which can handle these latter, has become widespread. The McStas program was developed at Riso National Laboratory in order to provide neutron scattering instrument scientists with an efficient and flexible tool for building Monte Carlo simulations of guides, neutron optics and instruments [1]. To date, the McStas package has been extensively used at the Institut Laue-Langevin, Grenoble, France, for various studies including cold and thermal guides with ballistic geometry, diffractometers, triple-axis, backscattering and time-of-flight spectrometers [2]. In this paper, we present some simulation results concerning different guide geometries that may be used in the future at the Institut Laue-Langevin. Gain factors ranging from two to five may be obtained for the integrated intensities, depending on the exact geometry, the guide coatings and the source.

  7. State-of-the-art Instruments for Detecting Extraterrestrial Life

    NASA Technical Reports Server (NTRS)

    Bada, Jeffrey L.

    2003-01-01

    In the coming decades, state-of-the-art spacecraft-based instruments that can detect key components associated with life as we know it on Earth will directly search for extinct or extant extraterrestrial life in our solar system. Advances in our analytical and detection capabilities, especially those based on microscale technologies, will be important in enhancing the abilities of these instruments. Remote sensing investigations of the atmospheres of extrasolar planets could provide evidence of photosynthetic-based life outside our solar system, although less advanced life will remain undetectable by these methods. Finding evidence of extraterrestrial life would have profound consequences both with respect to our understanding of chemical and biological evolution, and whether the biochemistry on Earth is unique in the universe.

  8. [Deformation evaluation of ProTaper nickel-titanium rotary instruments in curved canals instrumentation in vitro].

    PubMed

    Yan, Hong; Ren, Min; Yin, Xing-Zhe; Zhao, Shu-Yan; Zhang, Cheng-Fei

    2008-04-01

    To evaluate the deformation of ProTaper rotary instruments using in root canals of different curvature in vitro. Extracted first mandibular molars were divided into two experimental groups according to the curvature of mesial buccal canals: group A with moderate curvature and group B with severe curvature. Only the mesial buccal canals of all these teeth were prepared with 6 sets of new ProTaper rotary instruments individually. Additionally, the control group was established with a set of new ProTaper rotary instruments. After finishing each canal preparation, the instruments accompanied by control were examined under the stereomicroscope by an inspector without knowing the group. If distortion, unwinding, abrasion or fracture occurred within one set of instruments, then the whole set was disposed. The sets without problems were in use until 30 times. After 5, 10, 20 times canal preparation, S1, F1 files without deformation under stereomicroscope were examined under scanning electron microscope (SEM). Deformation of ProTaper rotary instruments happened after 12 times in group A and after 7 times in group B. In these two experimental groups, microcracks were found increasing with the times of use under SEM in the instruments without deformation under stereomicroscope. The microcracks on tip of instruments were deep in the early use and became smoother after 10 times. Similar changes happened on knife-edge of instruments, and the microcracks extended over the edge after 20 times use. Root canals with severe curvature tend to damage ProTaper rotary instruments more frequently than moderately curved canals. ProTaper rotary instruments are appropriate to treat less than 7 root canals with severe curvature or 12 root canals with moderate curvature. Stereomicroscope is recommended to detect early damages on Ni-Ti rotary instruments, for preventing instruments fracture in clinic.

  9. Hybrid Instrumentation in Lumbar Spinal Fusion: A Biomechanical Evaluation of Three Different Instrumentation Techniques.

    PubMed

    Obid, Peter; Danyali, Reza; Kueny, Rebecca; Huber, Gerd; Reichl, Michael; Richter, Alexander; Niemeyer, Thomas; Morlock, Michael; Püschel, Klaus; Übeyli, Hüseyin

    2017-02-01

    Ex vivo human cadaveric study. The development or progression of adjacent segment disease (ASD) after spine stabilization and fusion is a major problem in spine surgery. Apart from optimal balancing of the sagittal profile, dynamic instrumentation is often suggested to prevent or impede ASD. Hybrid instrumentation is used to gain stabilization while allowing motion to avoid hypermobility in the adjacent segment. In this biomechanical study, the effects of two different hybrid instrumentations on human cadaver spines were evaluated and compared with a rigid instrumentation. Eighteen human cadaver spines (T11-L5) were subdivided into three groups: rigid, dynamic, and hook comprising six spines each. Clinical parameters and initial mechanical characteristics were consistent among groups. All specimens received rigid fixation from L3-L5 followed by application of a free bending load of extension and flexion. The range of motion (ROM) for every segment was evaluated. For the rigid group, further rigid fixation from L1-L5 was applied. A dynamic Elaspine system (Spinelab AG, Winterthur, Switzerland) was applied from L1 to L3 for the dynamic group, and the hook group was instrumented with additional laminar hooks at L1-L3. ROM was then evaluated again. There was no significant difference in ROM among the three instrumentation techniques. Based on this data, the intended advantage of a hybrid or dynamic instrumentation might not be achieved.

  10. Low-Level Analytical Methodology Updates to Support Decontaminant Performance Evaluations

    DTIC Science & Technology

    2011-06-01

    from EPDM and tire rubber coupon materials that were spiked with a known amount of the chemical agent VX, treated with bleach decontaminant, and...to evaluate the performance of bleach decontaminant on EPDM and tire rubber coupons. Dose-confirmation or Tool samples were collected by delivering...components • An aging or damaged analytical column • Dirty detector • Other factors related to general instrument and/or sample analysis performance

  11. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  12. Applications of the Analytical Electron Microscope to Materials Science

    NASA Technical Reports Server (NTRS)

    Goldstein, J. I.

    1992-01-01

    In the last 20 years, the analytical electron microscope (AEM) as allowed investigators to obtain chemical and structural information from less than 50 nanometer diameter regions in thin samples of materials and to explore problems where reactions occur at boundaries and interfaces or within small particles or phases in bulk samples. Examples of the application of the AEM to materials science problems are presented in this paper and demonstrate the usefulness and the future potential of this instrument.

  13. Analytical method for nitroaromatic explosives in radiologically contaminated soil for ISO/IEC 17025 accreditation

    DOE PAGES

    Boggess, Andrew; Crump, Stephen; Gregory, Clint; ...

    2017-12-06

    Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automationmore » and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties of these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curve. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.« less

  14. Analytical method for nitroaromatic explosives in radiologically contaminated soil for ISO/IEC 17025 accreditation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggess, Andrew; Crump, Stephen; Gregory, Clint

    Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automationmore » and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties of these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curve. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.« less

  15. A Review of Calibration Transfer Practices and Instrument Differences in Spectroscopy.

    PubMed

    Workman, Jerome J

    2018-03-01

    Calibration transfer for use with spectroscopic instruments, particularly for near-infrared, infrared, and Raman analysis, has been the subject of multiple articles, research papers, book chapters, and technical reviews. There has been a myriad of approaches published and claims made for resolving the problems associated with transferring calibrations; however, the capability of attaining identical results over time from two or more instruments using an identical calibration still eludes technologists. Calibration transfer, in a precise definition, refers to a series of analytical approaches or chemometric techniques used to attempt to apply a single spectral database, and the calibration model developed using that database, for two or more instruments, with statistically retained accuracy and precision. Ideally, one would develop a single calibration for any particular application, and move it indiscriminately across instruments and achieve identical analysis or prediction results. There are many technical aspects involved in such precision calibration transfer, related to the measuring instrument reproducibility and repeatability, the reference chemical values used for the calibration, the multivariate mathematics used for calibration, and sample presentation repeatability and reproducibility. Ideally, a multivariate model developed on a single instrument would provide a statistically identical analysis when used on other instruments following transfer. This paper reviews common calibration transfer techniques, mostly related to instrument differences, and the mathematics of the uncertainty between instruments when making spectroscopic measurements of identical samples. It does not specifically address calibration maintenance or reference laboratory differences.

  16. Evaluation of a reconfigurable portable instrument for copper determination based on luminescent carbon dots.

    PubMed

    Salinas-Castillo, Alfonso; Morales, Diego P; Lapresta-Fernández, Alejandro; Ariza-Avidad, María; Castillo, Encarnación; Martínez-Olmos, Antonio; Palma, Alberto J; Capitan-Vallvey, Luis Fermin

    2016-04-01

    A portable reconfigurable platform for copper (Cu(II)) determination based on luminescent carbon dot (Cdots) quenching is described. The electronic setup consists of a light-emitting diode (LED) as the carbon dot optical exciter and a photodiode as a light-to-current converter integrated in the same instrument. Moreover, the overall analog conditioning is simply performed with one integrated solution, a field-programmable analog array (FPAA), which makes it possible to reconfigure the filter and gain stages in real time. This feature provides adaptability to use the platform as an analytical probe for carbon dots coming from different batches with some variations in luminescence characteristics. The calibration functions obtained that fit a modified Stern-Volmer equation were obtained using luminescence signals from Cdots quenching by Cu(II). The analytical applicability of the reconfigurable portable instrument for Cu(II) using Cdots has been successfully demonstrated in tap water analysis.

  17. INVESTIGATION OF RESPONSE DIFFERENCES BETWEEN DIFFERENT TYPES OF TOTAL ORGANIC CARBON (TOC) ANALYTICAL INSTRUMENT SYSTEMS

    EPA Science Inventory

    Total organic carbon (TOC) and dissolved organic carbon (DOC) have long been used to estimate the amount of natural organic matter (NOM) found in raw and finished drinking water. In recent years, computer automation and improved instrumental analysis technologies have created a ...

  18. Gyroscopic Instruments for Instrument Flying

    NASA Technical Reports Server (NTRS)

    Brombacher, W G; Trent, W C

    1938-01-01

    The gyroscopic instruments commonly used in instrument flying in the United States are the turn indicator, the directional gyro, the gyromagnetic compass, the gyroscopic horizon, and the automatic pilot. These instruments are described. Performance data and the method of testing in the laboratory are given for the turn indicator, the directional gyro, and the gyroscopic horizon. Apparatus for driving the instruments is discussed.

  19. Meta-analytic guidelines for evaluating single-item reliabilities of personality instruments.

    PubMed

    Spörrle, Matthias; Bekk, Magdalena

    2014-06-01

    Personality is an important predictor of various outcomes in many social science disciplines. However, when personality traits are not the principal focus of research, for example, in global comparative surveys, it is often not possible to assess them extensively. In this article, we first provide an overview of the advantages and challenges of single-item measures of personality, a rationale for their construction, and a summary of alternative ways of assessing their reliability. Second, using seven diverse samples (Ntotal = 4,263) we develop the SIMP-G, the German adaptation of the Single-Item Measures of Personality, an instrument assessing the Big Five with one item per trait, and evaluate its validity and reliability. Third, we integrate previous research and our data into a first meta-analysis of single-item reliabilities of personality measures, and provide researchers with guidelines and recommendations for the evaluation of single-item reliabilities. © The Author(s) 2013.

  20. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; the theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  1. Infrared Instrument for Detecting Hydrogen Fires

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert; Ihlefeld, Curtis; Immer, Christopher; Oostdyk, Rebecca; Cox, Robert; Taylor, John

    2006-01-01

    The figure shows an instrument incorporating an infrared camera for detecting small hydrogen fires. The instrument has been developed as an improved replacement for prior infrared and ultraviolet instruments used to detect hydrogen fires. The need for this or any such instrument arises because hydrogen fires (e.g., those associated with leaks from tanks, valves, and ducts) pose a great danger, yet they emit so little visible light that they are mostly undetectable by the unaided human eye. The main performance advantage offered by the present instrument over prior hydrogen-fire-detecting instruments lies in its greater ability to avoid false alarms by discriminating against reflected infrared light, including that originating in (1) the Sun, (2) welding torches, and (3) deliberately ignited hydrogen flames (e.g., ullage-burn-off flames) that are nearby but outside the field of view intended to be monitored by the instrument. Like prior such instruments, this instrument is based mostly on the principle of detecting infrared emission above a threshold level. However, in addition, this instrument utilizes information on the spatial distribution of infrared light from a source that it detects. Because the combination of spatial and threshold information about a flame tends to constitute a unique signature that differs from that of reflected infrared light originating in a source not in the field of view, the incidence of false alarms is reduced substantially below that of related prior threshold- based instruments.

  2. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585

  3. Design and validation of instruments to measure knowledge.

    PubMed

    Elliott, T E; Regal, R R; Elliott, B A; Renier, C M

    2001-01-01

    Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.

  4. Second-order advantage obtained from standard addition first-order instrumental data and multivariate curve resolution-alternating least squares. Calculation of the feasible bands of results.

    PubMed

    Mohseni, Naimeh; Bahram, Morteza; Olivieri, Alejandro C

    2014-03-25

    In order to achieve the second-order advantage, second-order data per sample is usually required, e.g., kinetic-spectrophotometric data. In this study, instead of monitoring the time evolution of spectra (and collecting the kinetic-spectrophotometric data) replicate spectra are used to build a virtual second order data. This data matrix (replicate mode×λ) is rank deficient. Augmentation of these data with standard addition data [or standard sample(s)] will break the rank deficiency, making the quantification of the analyte of interest possible. The MCR-ALS algorithm was applied for the resolution and quantitation of the analyte in both simulated and experimental data sets. In order to evaluate the rotational ambiguity in the retrieved solutions, the MCR-BANDS algorithm was employed. It has been shown that the reliability of the quantitative results significantly depends on the amount of spectral overlap in the spectral region of occurrence of the compound of interest and the remaining constituent(s). Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Aquatic concentrations of chemical analytes compared to ...

    EPA Pesticide Factsheets

    We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting more detailed characterization of these analytes. Purpose: to provide sc

  6. Conversion of multiple analyte cation types to a single analyte anion type via ion/ion charge inversion.

    PubMed

    Hassell, Kerry M; LeBlanc, Yves; McLuckey, Scott A

    2009-11-01

    Charge inversion ion/ion reactions can convert several cation types associated with a single analyte molecule to a single anion type for subsequent mass analysis. Specifically, analyte ions present with one of a variety of cationizing agents, such as an excess proton, excess sodium ion, or excess potassium ion, can all be converted to the deprotonated molecule, provided that a stable anion can be generated for the analyte. Multiply deprotonated species that are capable of exchanging a proton for a metal ion serve as the reagent anions for the reaction. This process is demonstrated here for warfarin and for a glutathione conjugate. Examples for several other glutathione conjugates are provided as supplementary material to demonstrate the generality of the reaction. In the case of glutathione conjugates, multiple metal ions can be associated with the singly-charged analyte due to the presence of two carboxylate groups. The charge inversion reaction involves the removal of the excess cationizing agent, as well as any metal ions associated with anionic groups to yield a singly deprotonated analyte molecule. The ability to convert multiple cation types to a single anion type is analytically desirable in cases in which the analyte signal is distributed among several cation types, as is common in the electrospray ionization of solutions with relatively high salt contents. For analyte species that undergo efficient charge inversion, such as glutathione conjugates, there is the additional potential advantage for significantly improved signal-to-noise ratios when species that give rise to 'chemical noise' in the positive ion spectrum do not undergo efficient charge inversion.

  7. ATR NSUF Instrumentation Enhancement Efforts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joy L. Rempe; Mitchell K. Meyer; Darrell L. Knudson

    A key component of the Advanced Test Reactor (ATR) National Scientific User Facility (NSUF) effort is to expand instrumentation available to users conducting irradiation tests in this unique facility. In particular, development of sensors capable of providing real-time measurements of key irradiation parameters is emphasized because of their potential to increase data fidelity and reduce posttest examination costs. This paper describes the strategy for identifying new instrumentation needed for ATR irradiations and the program underway to develop and evaluate new sensors to address these needs. Accomplishments from this program are illustrated by describing new sensors now available to users ofmore » the ATR NSUF. In addition, progress is reported on current research efforts to provide improved in-pile instrumentation to users.« less

  8. Approach and Instrument Placement Validation

    NASA Technical Reports Server (NTRS)

    Ator, Danielle

    2005-01-01

    The Mars Exploration Rovers (MER) from the 2003 flight mission represents the state of the art technology for target approach and instrument placement on Mars. It currently takes 3 sols (Martian days) for the rover to place an instrument on a designated rock target that is about 10 to 20 m away. The objective of this project is to provide an experimentally validated single-sol instrument placement capability to future Mars missions. After completing numerous test runs on the Rocky8 rover under various test conditions, it has been observed that lighting conditions, shadow effects, target features and the initial target distance have an effect on the performance and reliability of the tracking software. Additional software validation testing will be conducted in the months to come.

  9. GMI Instrument Spin Balance Method, Optimization, Calibration, and Test

    NASA Technical Reports Server (NTRS)

    Ayari, Laoucet; Kubitschek, Michael; Ashton, Gunnar; Johnston, Steve; Debevec, Dave; Newell, David; Pellicciotti, Joseph

    2014-01-01

    The Global Microwave Imager (GMI) instrument must spin at a constant rate of 32 rpm continuously for the 3 year mission life. Therefore, GMI must be very precisely balanced about the spin axis and CG to maintain stable scan pointing and to minimize disturbances imparted to the spacecraft and attitude control on-orbit. The GMI instrument is part of the core Global Precipitation Measurement (GPM) spacecraft and is used to make calibrated radiometric measurements at multiple microwave frequencies and polarizations. The GPM mission is an international effort managed by the National Aeronautics and Space Administration (NASA) to improve climate, weather, and hydro-meteorological predictions through more accurate and frequent precipitation measurements. Ball Aerospace and Technologies Corporation (BATC) was selected by NASA Goddard Space Flight Center to design, build, and test the GMI instrument. The GMI design has to meet a challenging set of spin balance requirements and had to be brought into simultaneous static and dynamic spin balance after the entire instrument was already assembled and before environmental tests began. The focus of this contribution is on the analytical and test activities undertaken to meet the challenging spin balance requirements of the GMI instrument. The novel process of measuring the residual static and dynamic imbalances with a very high level of accuracy and precision is presented together with the prediction of the optimal balance masses and their locations.

  10. GMI Instrument Spin Balance Method, Optimization, Calibration and Test

    NASA Technical Reports Server (NTRS)

    Ayari, Laoucet; Kubitschek, Michael; Ashton, Gunnar; Johnston, Steve; Debevec, Dave; Newell, David; Pellicciotti, Joseph

    2014-01-01

    The Global Microwave Imager (GMI) instrument must spin at a constant rate of 32 rpm continuously for the 3-year mission life. Therefore, GMI must be very precisely balanced about the spin axis and center of gravity (CG) to maintain stable scan pointing and to minimize disturbances imparted to the spacecraft and attitude control on-orbit. The GMI instrument is part of the core Global Precipitation Measurement (GPM) spacecraft and is used to make calibrated radiometric measurements at multiple microwave frequencies and polarizations. The GPM mission is an international effort managed by the National Aeronautics and Space Administration (NASA) to improve climate, weather, and hydro-meteorological predictions through more accurate and frequent precipitation measurements. Ball Aerospace and Technologies Corporation (BATC) was selected by NASA Goddard Space Flight Center to design, build, and test the GMI instrument. The GMI design has to meet a challenging set of spin balance requirements and had to be brought into simultaneous static and dynamic spin balance after the entire instrument was already assembled and before environmental tests began. The focus of this contribution is on the analytical and test activities undertaken to meet the challenging spin balance requirements of the GMI instrument. The novel process of measuring the residual static and dynamic imbalances with a very high level of accuracy and precision is presented together with the prediction of the optimal balance masses and their locations.

  11. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite dimensional complete analytic integrable dynamical systems. More details, we will prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (Cn,0) with B having eigenvalues not modulus 1 and f(x)=O(|) is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system x˙=Ax+f(x) in (Cn,0) with A having nonzero eigenvalues and f(x)=O(|) is locally analytically conjugate to its normal form. Furthermore we will prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results are the improvement of Moser's one in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692 and of Poincaré's one in H. Poincaré, Sur l'intégration des équations différentielles du premier order et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092 in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228 in the way that our paper presents the concrete expression of the normal form in a restricted case.

  12. Annual banned-substance review: Analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans

    2018-01-01

    Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.

  13. 14 CFR Appendix C to Part 141 - Instrument Rating Course

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Instrument Rating Course C Appendix C to... Rating Course 1. Applicability. This appendix prescribes the minimum curriculum for an instrument rating course and an additional instrument rating course, required under this part, for the following ratings...

  14. 14 CFR Appendix C to Part 141 - Instrument Rating Course

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Instrument Rating Course C Appendix C to... Rating Course 1. Applicability. This appendix prescribes the minimum curriculum for an instrument rating course and an additional instrument rating course, required under this part, for the following ratings...

  15. 14 CFR Appendix C to Part 141 - Instrument Rating Course

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Instrument Rating Course C Appendix C to... Rating Course 1. Applicability. This appendix prescribes the minimum curriculum for an instrument rating course and an additional instrument rating course, required under this part, for the following ratings...

  16. Impulsive-Analytic Disposition in Mathematical Problem Solving: A Survey and a Mathematics Test

    ERIC Educational Resources Information Center

    Lim, Kien H.; Wagler, Amy

    2012-01-01

    The Likelihood-to-Act (LtA) survey and a mathematics test were used in this study to assess students' impulsive-analytic disposition in the context of mathematical problem solving. The results obtained from these two instruments were compared to those obtained using two widely-used scales: Need for Cognition (NFC) and Barratt Impulsivity Scale…

  17. Statistical Segmentation of Surgical Instruments in 3D Ultrasound Images

    PubMed Central

    Linguraru, Marius George; Vasilyev, Nikolay V.; Del Nido, Pedro J.; Howe, Robert D.

    2008-01-01

    The recent development of real-time 3D ultrasound enables intracardiac beating heart procedures, but the distorted appearance of surgical instruments is a major challenge to surgeons. In addition, tissue and instruments have similar gray levels in US images and the interface between instruments and tissue is poorly defined. We present an algorithm that automatically estimates instrument location in intracardiac procedures. Expert-segmented images are used to initialize the statistical distributions of blood, tissue and instruments. Voxels are labeled through an iterative expectation-maximization algorithm using information from the neighboring voxels through a smoothing kernel. Once the three classes of voxels are separated, additional neighboring information is combined with the known shape characteristics of instruments in order to correct for misclassifications. We analyze the major axis of segmented data through their principal components and refine the results by a watershed transform, which corrects the results at the contact between instrument and tissue. We present results on 3D in-vitro data from a tank trial, and 3D in-vivo data from cardiac interventions on porcine beating hearts, using instruments of four types of materials. The comparison of algorithm results to expert-annotated images shows the correct segmentation and position of the instrument shaft. PMID:17521802

  18. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding

    PubMed Central

    Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James

    2014-01-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259

  19. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding.

    PubMed

    MacKenzie, Todd A; Tosteson, Tor D; Morden, Nancy E; Stukel, Therese A; O'Malley, A James

    2014-06-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival.

  20. Utility perspective on USEPA analytical methods program redirection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, B.; Davis, M.K.; Krasner, S.W.

    1996-11-01

    The Metropolitan Water District of Southern California (Metropolitan) is a public, municipal corporation, created by the State of California, which wholesales supplemental water trough 27 member agencies (cities and water districts). Metropolitan serves nearly 16 million people in an area along the coastal plain of Southern California that covers approximately 5200 square miles. Water deliveries have averaged up to 2.5 million acre-feet per year. Metropolitan`s Water Quality Laboratory (WQL) conducts compliance monitoring of its source and finished drinking waters for chemical and microbial constituents. The laboratory maintains certification of a large number and variety of analytical procedures. The WQL operatesmore » in a 17,000-square-foot facility. The equipment is state-of-the-art analytical instrumentation. The staff consists of 40 professional chemists and microbiologists whose experience and expertise are extensive and often highly specialized. The staff turnover is very low, and the laboratory is consistently, efficiently, and expertly run.« less

  1. Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.

    PubMed

    Malá, Zdena; Gebauer, Petr; Boček, Petr

    2017-01-01

    This review brings a survey of papers on analytical ITP published since 2014 until the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view on its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies including use of porous media and new on-chip assays, where ITP is often included in a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Guided-inquiry laboratory experiments to improve students' analytical thinking skills

    NASA Astrophysics Data System (ADS)

    Wahyuni, Tutik S.; Analita, Rizki N.

    2017-12-01

    This study aims to improve the experiment implementation quality and analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. This study was a classroom action research conducted in three cycles. The study has been carried out with 38 undergraduate students of the second semester of Biology Education Department of State Islamic Institute (SII) of Tulungagung, as a part of Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets and undergraduate students' experimental procedure. Research data were analyzed using quantitative-descriptive method. The increasing of analytical thinking skills could be measured using gain score normalized and statistical paired t-test. The results showed that guided-inquiry laboratory experiments model was able to improve both the experiment implementation quality and the analytical thinking skills. N-gain score of the analytical thinking skills was increased, in spite of just 0.03 with low increase category, indicated by experimental reports. Some of undergraduate students have had the difficulties in detecting the relation of one part to another and to an overall structure. The findings suggested that giving feedback the procedural knowledge and experimental reports were important. Revising the experimental procedure that completed by some scaffolding questions were also needed.

  3. Toward improved understanding and control in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hieftje, Gary M.

    1989-01-01

    As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.

  4. Instrument performance enhancement and modification through an extended instrument paradigm

    NASA Astrophysics Data System (ADS)

    Mahan, Stephen Lee

    An extended instrument paradigm is proposed, developed and shown in various applications. The CBM (Chin, Blass, Mahan) method is an extension to the linear systems model of observing systems. In the most obvious and practical application of image enhancement of an instrument characterized by a time-invariant instrumental response function, CBM can be used to enhance images or spectra through a simple convolution application of the CBM filter for a resolution improvement of as much as a factor of two. The CBM method can be used in many applications. We discuss several within this work including imaging through turbulent atmospheres, or what we've called Adaptive Imaging. Adaptive Imaging provides an alternative approach for the investigator desiring results similar to those obtainable with adaptive optics, however on a minimal budget. The CBM method is also used in a backprojected filtered image reconstruction method for Positron Emission Tomography. In addition, we can use information theoretic methods to aid in the determination of model instrumental response function parameters for images having an unknown origin. Another application presented herein involves the use of the CBM method for the determination of the continuum level of a Fourier transform spectrometer observation of ethylene, which provides a means for obtaining reliable intensity measurements in an automated manner. We also present the application of CBM to hyperspectral image data of the comet Shoemaker-Levy 9 impact with Jupiter taken with an acousto-optical tunable filter equipped CCD camera to an adaptive optics telescope.

  5. Developing Guidelines for Assessing Visual Analytics Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domainsmore » and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation.1 Based on these efforts, we produced some initial guidelines for evaluating visual analytic environment and for evaluation of analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement is needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.« less

  6. Using a Systematic Approach to Develop a Chemistry Course Introducing Students to Instrumental Analysis

    ERIC Educational Resources Information Center

    Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher

    2013-01-01

    A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…

  7. Contribution of Electrochemistry to the Biomedical and Pharmaceutical Analytical Sciences.

    PubMed

    Kauffmann, Jean-Michel; Patris, Stephanie; Vandeput, Marie; Sarakbi, Ahmad; Sakira, Abdul Karim

    2016-01-01

    All analytical techniques have experienced major progress since the last ten years and electroanalysis is also involved in this trend. The unique characteristics of phenomena occurring at the electrode-solution interface along with the variety of electrochemical methods currently available allow for a broad spectrum of applications. Potentiometric, conductometric, voltammetric and amperometric methods are briefly reviewed with a critical view in terms of performance of the developed instrumentation with special emphasis on pharmaceutical and biomedical applications.

  8. The work of the European Union Reference Laboratory for Food Additives (EURL) and its support for the authorisation process of feed additives in the European Union: a review

    PubMed Central

    von Holst, Christoph; Robouch, Piotr; Bellorini, Stefano; de la Huebra, María José González; Ezerskis, Zigmas

    2016-01-01

    ABSTRACT This paper describes the operation of the European Union Reference Laboratory for Feed Additives (EURL) and its role in the authorisation procedure of feed additives in the European Union. Feed additives are authorised according to Regulation (EC) No. 1831/2003, which introduced a completely revised authorisation procedure and also established the EURL. The regulations authorising feed additives contain conditions of use such as legal limits of the feed additives, which require the availability of a suitable method of analysis for official control purposes under real world conditions. It is the task of the EURL to evaluate the suitability of analytical methods as proposed by the industry for this purpose. Moreover, the paper shows that one of the major challenges is the huge variety of the methodology applied in feed additive analysis, thus requiring expertise in quite different analytical areas. In order to cope with this challenge, the EURL is supported by a network of national reference laboratories (NRLs) and only the merged knowledge of all NRLs allows for a scientifically sound assessment of the analytical methods. PMID:26540604

  9. Realistic Analytical Polyhedral MRI Phantoms

    PubMed Central

    Ngo, Tri M.; Fung, George S. K.; Han, Shuo; Chen, Min; Prince, Jerry L.; Tsui, Benjamin M. W.; McVeigh, Elliot R.; Herzka, Daniel A.

    2015-01-01

    Purpose Analytical phantoms have closed form Fourier transform expressions and are used to simulate MRI acquisitions. Existing 3D analytical phantoms are unable to accurately model shapes of biomedical interest. It is demonstrated that polyhedral analytical phantoms have closed form Fourier transform expressions and can accurately represent 3D biomedical shapes. Theory The derivations of the Fourier transform of a polygon and polyhedron are presented. Methods The Fourier transform of a polyhedron was implemented and its accuracy in representing faceted and smooth surfaces was characterized. Realistic anthropomorphic polyhedral brain and torso phantoms were constructed and their use in simulated 3D/2D MRI acquisitions was described. Results Using polyhedra, the Fourier transform of faceted shapes can be computed to within machine precision. Smooth surfaces can be approximated with increasing accuracy by increasing the number of facets in the polyhedron; the additional accumulated numerical imprecision of the Fourier transform of polyhedra with many faces remained small. Simulations of 3D/2D brain and 2D torso cine acquisitions produced realistic reconstructions free of high frequency edge aliasing as compared to equivalent voxelized/rasterized phantoms. Conclusion Analytical polyhedral phantoms are easy to construct and can accurately simulate shapes of biomedical interest. PMID:26479724

  10. ANALYTICAL CHEMISTRY DIVISION ANNUAL PROGRESS REPORT FOR PERIOD ENDING DECEMBER 31, 1961

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1962-02-01

    Research and development progress is reported on analytlcal instrumentation, dlssolver-solution analyses, special research problems, reactor projects analyses, x-ray and spectrochemical analyses, mass spectrometry, optical and electron microscopy, radiochemical analyses, nuclear analyses, inorganic preparations, organic preparations, ionic analyses, infrared spectral studies, anodization of sector coils for the Analog II Cyclotron, quality control, process analyses, and the Thermal Breeder Reactor Projects Analytical Chemistry Laboratory. (M.C.G.)

  11. Analytical display design for flight tasks conducted under instrument meteorological conditions. [human factors engineering of pilot performance for display device design in instrument landing systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1976-01-01

    Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.

  12. The Assessment Journey: Defining and Refining Instrument Dilemmas

    ERIC Educational Resources Information Center

    Gallo, Ann Marie; Carr, Michael T.; Gallo, Joseph A.

    2010-01-01

    Designing and implementing assessment instruments are ongoing challenges for physical educators. The initial design phase includes asking the question, "What do I want the students to know and/or be able to do?" Seemingly, the question is direct and should lead to a concrete answer. In addition to designing assessment instruments, one of the most…

  13. Comparison of thermal analytic model with experimental test results for 30-sentimeter-diameter engineering model mercury ion thruster

    NASA Technical Reports Server (NTRS)

    Oglebay, J. C.

    1977-01-01

    A thermal analytic model for a 30-cm engineering model mercury-ion thruster was developed and calibrated using the experimental test results of tests of a pre-engineering model 30-cm thruster. A series of tests, performed later, simulated a wide range of thermal environments on an operating 30-cm engineering model thruster, which was instrumented to measure the temperature distribution within it. The modified analytic model is described and analytic and experimental results compared for various operating conditions. Based on the comparisons, it is concluded that the analytic model can be used as a preliminary design tool to predict thruster steady-state temperature distributions for stage and mission studies and to define the thermal interface bewteen the thruster and other elements of a spacecraft.

  14. THE COST-EFFECTIVENESS OF ALTERNATIVE INSTRUMENTS FOR ENVIRONMENTAL PROTECTION IN A SECOND-BEST SETTING. (R825313)

    EPA Science Inventory

    Abstract

    This paper employs analytical and numerical general equilibrium models to examine the significance of pre-existing factor taxes for the costs of pollution reduction under a wide range of environmental policy instruments. Pre-existing taxes imply significantly ...

  15. New software solutions for analytical spectroscopists

    NASA Astrophysics Data System (ADS)

    Davies, Antony N.

    1999-05-01

    Analytical spectroscopists must be computer literate to effectively carry out the tasks assigned to them. This has often been resisted within organizations with insufficient funds to equip their staff properly, a lack of desire to deliver the essential training and a basic resistance amongst staff to learn the new techniques required for computer assisted analysis. In the past these problems were compounded by seriously flawed software which was being sold for spectroscopic applications. Owing to the limited market for such complex products the analytical spectroscopist often was faced with buying incomplete and unstable tools if the price was to remain reasonable. Long product lead times meant spectrometer manufacturers often ended up offering systems running under outdated and sometimes obscure operating systems. Not only did this mean special staff training for each instrument where the knowledge gained on one system could not be transferred to the neighbouring system but these spectrometers were often only capable of running in a stand-alone mode, cut-off from the rest of the laboratory environment. Fortunately a number of developments in recent years have substantially changed this depressing picture. A true multi-tasking operating system with a simple graphical user interface, Microsoft Windows NT4, has now been widely introduced into the spectroscopic computing environment which has provided a desktop operating system which has proved to be more stable and robust as well as requiring better programming techniques of software vendors. The opening up of the Internet has provided an easy way to access new tools for data handling and has forced a substantial re-think about results delivery (for example Chemical MIME types, IUPAC spectroscopic data exchange standards). Improved computing power and cheaper hardware now allows large spectroscopic data sets to be handled without too many problems. This includes the ability to carry out chemometric operations in

  16. Semi-empirical and phenomenological instrument functions for the scanning tunneling microscope

    NASA Astrophysics Data System (ADS)

    Feuchtwang, T. E.; Cutler, P. H.; Notea, A.

    1988-08-01

    Recent progress in the development of a convenient algorithm for the determination of a quantitative local density of states (LDOS) of the sample, from data measured in the STM, is reviewd. It is argued that the sample LDOS strikes a good balance between the information content of a surface characteristic and effort required to obtain it experimentally. Hence, procedures to determine the sample LDOS as directly and as tip-model independently as possible are emphasized. The solution of the STM's "inverse" problem in terms of novel versions of the instrument (or Green) function technique is considered in preference to the well known, more direct solutions. Two types of instrument functions are considered: Approximations of the basic tip-instrument function obtained from the transfer Hamiltonian theory of the STM-STS. And, phenomenological instrument functions devised as a systematic scheme for semi-empirical first order corrections of "ideal" models. The instrument function, in this case, describes the corrections as the response of an independent component of the measuring apparatus inserted between the "ideal" instrument and the measured data. This linear response theory of measurement is reviewed and applied. A procedure for the estimation of the consistency of the model and the systematic errors due to the use of an approximate instrument function is presented. The independence of the instrument function techniques from explicit microscopic models of the tip is noted. The need for semi-empirical, as opposed to strictly empirical or analytical determination of the instrument function is discussed. The extension of the theory to the scanning tunneling spectrometer is noted, as well as its use in a theory of resolution.

  17. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the

  18. An alpha particle instrument with alpha, proton, and X-ray modes for planetary chemical analyses

    NASA Technical Reports Server (NTRS)

    Economou, T. E.; Turkevich, A. L.

    1976-01-01

    The interaction of alpha particles with matter is employed in a compact instrument that could provide rather complete in-situ chemical analyses of surfaces and thin atmospheres of extraterrestrial bodies. The instrument is a miniaturized and improved version of the Surveyor lunar instrument. The backscattering of alpha particles and (alpha, p) reactions provide analytical data on the light elements (carbon-iron). An X-ray mode that detects the photons produced by the alpha sources provides sensitivity and resolution for the chemical elements heavier than about silicon. The X-rays are detected by semiconductor detectors having a resolution between 150 and 250 eV at 5.9 keV. Such an instrument can identify and determine with good accuracy 99 percent of the atoms (except hydrogen) in rocks. For many trace elements, the detecting sensitivity is a few ppm. Auxiliary sources could be used to enhance the sensitivities for elements of special interest. The instrument could probably withstand the acceleration involved in semi-hard landings.

  19. Demonstration of the ExoMars sample preparation and distribution system jointly with an optical instrument head

    NASA Astrophysics Data System (ADS)

    Schulte, Wolfgang; Thiele, Hans; Hofmann, Peter; Baglioni, Pietro

    The ExoMars program will search for past and present life on Mars. ExoMars will address important scientific goals and demonstrate key in-situ enabling technologies. Among such technologies are the acquisition, preparation, distribution and analysis of samples from Mars surface rocks and from the subsurface. The 2018 mission will land an ESA rover on Mars which carries a sample preparation and distribution system (SPDS) and a suite of analytical instruments, the Pasteur Payload with its Analytical Laboratory Drawer (ALD). Kayser-Threde GmbH (Germany) will be responsible for the SPDS as a subcontractor under the mission prime Thales Alenia Space. The SPDS comprises a number of complex mechanisms and mechanical devices designed to transport drill core samples within the rover analytical laboratory, to crush them to powder with a fine grain size, to portion discrete amounts of powdered sample material, to distribute and fill the material into sample containers and to prepare flat sample surfaces for scientific analysis. Breadboards of the crushing mechanism, the dosing mechanism and a distribution carousel with sample containers and a powder sample surface flattening mechanism were built and tested. Kayser-Threde, as a member of the Spanish led ExoMars Raman Instrument team, is also responsible for development of the Raman optical head, which will be mounted inside ALD and will inspect the crushed samples, when they are presented to the instrument by the distribution carousel. Within this activity, which is performed under contract with the Institute of Physical Chemistry of the University of Jena (Germany) and funded by the German DLR, Kayser-Threde can demonstrate Raman measurements with the optical head and a COTS laser and spectrometer and thus simulate the full Raman instrument optical path. An autofocus system with actuator and feedback optics is also part of this activity, which allows focusing the 50 m Raman spot on the surface of the powdered sample

  20. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2g/day identification of impurities is set at 0.1% levels and above (ICH Q3B(R2), 2006). For some impurities, this is a simple undertaking as generally available analytical techniques can address the prevailing analytical challenges; whereas, for others this may be much more challenging requiring more sophisticated analytical approaches. The present review provides an insight into current development of analytical techniques to investigate and quantify impurities in drug substances and drug products providing discussion of progress particular within the field of chromatography to ensure separation of and quantification of those related impurities. Further, a section is devoted to the identification of classical impurities, but in addition, inorganic (metal residues) and solid state impurities are also discussed. Risk control strategies for pharmaceutical impurities aligned with several of the ICH guidelines, are also discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. A multilaboratory comparison of calibration accuracy and the performance of external references in analytical ultracentrifugation.

    PubMed

    Zhao, Huaying; Ghirlando, Rodolfo; Alfonso, Carlos; Arisaka, Fumio; Attali, Ilan; Bain, David L; Bakhtina, Marina M; Becker, Donald F; Bedwell, Gregory J; Bekdemir, Ahmet; Besong, Tabot M D; Birck, Catherine; Brautigam, Chad A; Brennerman, William; Byron, Olwyn; Bzowska, Agnieszka; Chaires, Jonathan B; Chaton, Catherine T; Cölfen, Helmut; Connaghan, Keith D; Crowley, Kimberly A; Curth, Ute; Daviter, Tina; Dean, William L; Díez, Ana I; Ebel, Christine; Eckert, Debra M; Eisele, Leslie E; Eisenstein, Edward; England, Patrick; Escalante, Carlos; Fagan, Jeffrey A; Fairman, Robert; Finn, Ron M; Fischle, Wolfgang; de la Torre, José García; Gor, Jayesh; Gustafsson, Henning; Hall, Damien; Harding, Stephen E; Cifre, José G Hernández; Herr, Andrew B; Howell, Elizabeth E; Isaac, Richard S; Jao, Shu-Chuan; Jose, Davis; Kim, Soon-Jong; Kokona, Bashkim; Kornblatt, Jack A; Kosek, Dalibor; Krayukhina, Elena; Krzizike, Daniel; Kusznir, Eric A; Kwon, Hyewon; Larson, Adam; Laue, Thomas M; Le Roy, Aline; Leech, Andrew P; Lilie, Hauke; Luger, Karolin; Luque-Ortega, Juan R; Ma, Jia; May, Carrie A; Maynard, Ernest L; Modrak-Wojcik, Anna; Mok, Yee-Foong; Mücke, Norbert; Nagel-Steger, Luitgard; Narlikar, Geeta J; Noda, Masanori; Nourse, Amanda; Obsil, Tomas; Park, Chad K; Park, Jin-Ku; Pawelek, Peter D; Perdue, Erby E; Perkins, Stephen J; Perugini, Matthew A; Peterson, Craig L; Peverelli, Martin G; Piszczek, Grzegorz; Prag, Gali; Prevelige, Peter E; Raynal, Bertrand D E; Rezabkova, Lenka; Richter, Klaus; Ringel, Alison E; Rosenberg, Rose; Rowe, Arthur J; Rufer, Arne C; Scott, David J; Seravalli, Javier G; Solovyova, Alexandra S; Song, Renjie; Staunton, David; Stoddard, Caitlin; Stott, Katherine; Strauss, Holger M; Streicher, Werner W; Sumida, John P; Swygert, Sarah G; Szczepanowski, Roman H; Tessmer, Ingrid; Toth, Ronald T; Tripathy, Ashutosh; Uchiyama, Susumu; Uebel, Stephan F W; Unzai, Satoru; Gruber, Anna Vitlin; von Hippel, Peter H; Wandrey, Christine; Wang, Szu-Huan; Weitzel, Steven E; Wielgus-Kutrowska, Beata; Wolberger, Cynthia; Wolff, Martin; Wright, Edward; Wu, Yu-Sung; Wubben, Jacinta M; Schuck, Peter

    2015-01-01

    Analytical ultracentrifugation (AUC) is a first principles based method to determine absolute sedimentation coefficients and buoyant molar masses of macromolecules and their complexes, reporting on their size and shape in free solution. The purpose of this multi-laboratory study was to establish the precision and accuracy of basic data dimensions in AUC and validate previously proposed calibration techniques. Three kits of AUC cell assemblies containing radial and temperature calibration tools and a bovine serum albumin (BSA) reference sample were shared among 67 laboratories, generating 129 comprehensive data sets. These allowed for an assessment of many parameters of instrument performance, including accuracy of the reported scan time after the start of centrifugation, the accuracy of the temperature calibration, and the accuracy of the radial magnification. The range of sedimentation coefficients obtained for BSA monomer in different instruments and using different optical systems was from 3.655 S to 4.949 S, with a mean and standard deviation of (4.304 ± 0.188) S (4.4%). After the combined application of correction factors derived from the external calibration references for elapsed time, scan velocity, temperature, and radial magnification, the range of s-values was reduced 7-fold with a mean of 4.325 S and a 6-fold reduced standard deviation of ± 0.030 S (0.7%). In addition, the large data set provided an opportunity to determine the instrument-to-instrument variation of the absolute radial positions reported in the scan files, the precision of photometric or refractometric signal magnitudes, and the precision of the calculated apparent molar mass of BSA monomer and the fraction of BSA dimers. These results highlight the necessity and effectiveness of independent calibration of basic AUC data dimensions for reliable quantitative studies.

  2. A Multilaboratory Comparison of Calibration Accuracy and the Performance of External References in Analytical Ultracentrifugation

    PubMed Central

    Zhao, Huaying; Ghirlando, Rodolfo; Alfonso, Carlos; Arisaka, Fumio; Attali, Ilan; Bain, David L.; Bakhtina, Marina M.; Becker, Donald F.; Bedwell, Gregory J.; Bekdemir, Ahmet; Besong, Tabot M. D.; Birck, Catherine; Brautigam, Chad A.; Brennerman, William; Byron, Olwyn; Bzowska, Agnieszka; Chaires, Jonathan B.; Chaton, Catherine T.; Cölfen, Helmut; Connaghan, Keith D.; Crowley, Kimberly A.; Curth, Ute; Daviter, Tina; Dean, William L.; Díez, Ana I.; Ebel, Christine; Eckert, Debra M.; Eisele, Leslie E.; Eisenstein, Edward; England, Patrick; Escalante, Carlos; Fagan, Jeffrey A.; Fairman, Robert; Finn, Ron M.; Fischle, Wolfgang; de la Torre, José García; Gor, Jayesh; Gustafsson, Henning; Hall, Damien; Harding, Stephen E.; Cifre, José G. Hernández; Herr, Andrew B.; Howell, Elizabeth E.; Isaac, Richard S.; Jao, Shu-Chuan; Jose, Davis; Kim, Soon-Jong; Kokona, Bashkim; Kornblatt, Jack A.; Kosek, Dalibor; Krayukhina, Elena; Krzizike, Daniel; Kusznir, Eric A.; Kwon, Hyewon; Larson, Adam; Laue, Thomas M.; Le Roy, Aline; Leech, Andrew P.; Lilie, Hauke; Luger, Karolin; Luque-Ortega, Juan R.; Ma, Jia; May, Carrie A.; Maynard, Ernest L.; Modrak-Wojcik, Anna; Mok, Yee-Foong; Mücke, Norbert; Nagel-Steger, Luitgard; Narlikar, Geeta J.; Noda, Masanori; Nourse, Amanda; Obsil, Tomas; Park, Chad K.; Park, Jin-Ku; Pawelek, Peter D.; Perdue, Erby E.; Perkins, Stephen J.; Perugini, Matthew A.; Peterson, Craig L.; Peverelli, Martin G.; Piszczek, Grzegorz; Prag, Gali; Prevelige, Peter E.; Raynal, Bertrand D. E.; Rezabkova, Lenka; Richter, Klaus; Ringel, Alison E.; Rosenberg, Rose; Rowe, Arthur J.; Rufer, Arne C.; Scott, David J.; Seravalli, Javier G.; Solovyova, Alexandra S.; Song, Renjie; Staunton, David; Stoddard, Caitlin; Stott, Katherine; Strauss, Holger M.; Streicher, Werner W.; Sumida, John P.; Swygert, Sarah G.; Szczepanowski, Roman H.; Tessmer, Ingrid; Toth, Ronald T.; Tripathy, Ashutosh; Uchiyama, Susumu; Uebel, Stephan F. W.; Unzai, Satoru; Gruber, Anna Vitlin; von Hippel, Peter H.; Wandrey, Christine; Wang, Szu-Huan; Weitzel, Steven E.; Wielgus-Kutrowska, Beata; Wolberger, Cynthia; Wolff, Martin; Wright, Edward; Wu, Yu-Sung; Wubben, Jacinta M.; Schuck, Peter

    2015-01-01

    Analytical ultracentrifugation (AUC) is a first principles based method to determine absolute sedimentation coefficients and buoyant molar masses of macromolecules and their complexes, reporting on their size and shape in free solution. The purpose of this multi-laboratory study was to establish the precision and accuracy of basic data dimensions in AUC and validate previously proposed calibration techniques. Three kits of AUC cell assemblies containing radial and temperature calibration tools and a bovine serum albumin (BSA) reference sample were shared among 67 laboratories, generating 129 comprehensive data sets. These allowed for an assessment of many parameters of instrument performance, including accuracy of the reported scan time after the start of centrifugation, the accuracy of the temperature calibration, and the accuracy of the radial magnification. The range of sedimentation coefficients obtained for BSA monomer in different instruments and using different optical systems was from 3.655 S to 4.949 S, with a mean and standard deviation of (4.304 ± 0.188) S (4.4%). After the combined application of correction factors derived from the external calibration references for elapsed time, scan velocity, temperature, and radial magnification, the range of s-values was reduced 7-fold with a mean of 4.325 S and a 6-fold reduced standard deviation of ± 0.030 S (0.7%). In addition, the large data set provided an opportunity to determine the instrument-to-instrument variation of the absolute radial positions reported in the scan files, the precision of photometric or refractometric signal magnitudes, and the precision of the calculated apparent molar mass of BSA monomer and the fraction of BSA dimers. These results highlight the necessity and effectiveness of independent calibration of basic AUC data dimensions for reliable quantitative studies. PMID:25997164

  3. Continuous electrophoretic purification of individual analytes from multicomponent mixtures.

    PubMed

    McLaren, David G; Chen, David D Y

    2004-04-15

    Individual analytes can be isolated from multicomponent mixtures and collected in the outlet vial by carrying out electrophoretic purification through a capillary column. Desired analytes are allowed to migrate continuously through the column under the electric field while undesired analytes are confined to the inlet vial by application of a hydrodynamic counter pressure. Using pressure ramping and buffer replenishment techniques, 18% of the total amount present in a bulk sample can be purified when the resolution to the adjacent peak is approximately 3. With a higher resolution, the yield could be further improved. Additionally, by periodically introducing fresh buffer into the sample, changes in pH and conductivity can be mediated, allowing higher purity (>or=99.5%) to be preserved in the collected fractions. With an additional reversed cycle of flow counterbalanced capillary electrophoresis, any individual component in a sample mixture can be purified providing it can be separated in an electrophoresis system.

  4. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Behavior of healthcare workers after injuries from sharp instruments.

    PubMed

    Adib-Hajbaghery, Mohsen; Lotfi, Mohammad Sajjad

    2013-09-01

    Injuries with sharps are common occupational hazards for healthcare workers. Such injuries predispose the staff to dangerous infections such as hepatitis B, C and HIV. The present study was conducted to investigate the behaviors of healthcare workers in Kashan healthcare centers after needle sticks and injuries with sharps in 2012. A cross-sectional study was conducted on 298 healthcare workers of medical centers governed by Kashan University of Medical Sciences. A questionnaire was used in this study. The first part included questions about demographic characteristics. The second part of the questionnaire consisted of 16 items related to the sharp instrument injuries. For data analysis, descriptive and analytical statistics (chi-square, ANOVA and Pearson correlation coefficient) SPSS version 16.0 software was used. From a total of 298 healthcare workers, 114 (38.3%) had a history of injury from needles and sharp instruments in the last six months. Most needle stick and sharp instrument injuries had occurred among the operating room nurses and midwifes; 32.5% of injuries from sharp instruments occurred in the morning shift. Needles were responsible for 46.5% of injuries. The most common actions taken after needle stick injuries were compression (27.2%) and washing the area with soap and water (15.8%). Only 44.6% of the injured personnel pursued follow-up measures after a needle stick or sharp instrument injury. More than a half of the healthcare workers with needle stick or sharp instrument injury had refused follow-up for various reasons. The authorities should implement education programs along with protocols to be implemented after needle stick injuries or sharps.

  6. Toward the characterization of biological toxins using field-based FT-IR spectroscopic instrumentation

    NASA Astrophysics Data System (ADS)

    Schiering, David W.; Walton, Robert B.; Brown, Christopher W.; Norman, Mark L.; Brewer, Joseph; Scott, James

    2004-12-01

    IR spectroscopy is a broadly applicable technique for the identification of covalent materials. Recent advances in instrumentation have made Fourier Transform infrared (FT-IR) spectroscopy available for field characterization of suspect materials. Presently, this instrumentation is broadly deployed and used for the identification of potential chemical hazards. This discussion concerns work towards expanding the analytical utility of field-based FT-IR spectrometry in the characterization of biological threats. Two classes of materials were studied: biologically produced chemical toxins which were non-peptide in nature and peptide toxin. The IR spectroscopic identification of aflatoxin-B1, trichothecene T2 mycotoxin, and strychnine was evaluated using the approach of spectral searching against large libraries of materials. For pure components, the IR method discriminated the above toxins at better than the 99% confidence level. The ability to identify non-peptide toxins in mixtures was also evaluated using a "spectral stripping" search approach. For the mixtures evaluated, this method was able to identify the mixture components from ca. 32K spectral library entries. Castor bean extract containing ricin was used as a representative peptide toxin. Due to similarity in protein spectra, a SIMCA pattern recognition methodology was evaluated for classifying peptide toxins. In addition to castor bean extract the method was validated using bovine serum albumin and myoglobin as simulants. The SIMCA approach was successful in correctly classifying these samples at the 95% confidence level.

  7. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  8. Status of TMI-2 instruments and electrical components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helbert, H J

    In the Task 1.0 section of the GEND 001 Planning Report, the Instrumentation and Electrical Equipment Survivability Planning Group (IEPG) supplied planning, guidance, and recommendations on collecting survivability data on instruments and electrical equipment involved in the March 28, 1979, accident at the Three Mile Island Unit 2 (TMI-2) Reactor. GEND 001 recommended collection of further data on the status of all the instruments and electrical equipment it listed. The current report supplies information concerning the operational status of instruments and electrical equipment listed in the Task 1.0 section of GEND 001. This document will be updated in the futuremore » as additional information is obtained.« less

  9. Dynamic Stability Instrumentation System (DSIS). Volume 3; User Manual

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.; Boyden, Richmond P.; Dress, David A.; Jordan, Thomas L.

    1996-01-01

    The paper is an operating manual for the Dynamic Stability Instrumentation System in specific NASA Langley wind tunnels. The instrumentation system performs either a synchronous demodulation or a Fast Fourier Transform on dynamic balance strain gage signals, and ultimately computes aerodynamic coefficients. The dynamic balance converts sting motor rotation into pitch or yaw plane or roll axis oscillation, with timing information provided by a shaft encoder. Additional instruments control model attitude and balance temperature and monitor sting vibrations. Other instruments perform self-calibration and diagnostics. Procedures for conducting calibrations and wind-off and wind-on tests are listed.

  10. Design and analysis of radiometric instruments using high-level numerical models and genetic algorithms

    NASA Astrophysics Data System (ADS)

    Sorensen, Ira Joseph

    A primary objective of the effort reported here is to develop a radiometric instrument modeling environment to provide complete end-to-end numerical models of radiometric instruments, integrating the optical, electro-thermal, and electronic systems. The modeling environment consists of a Monte Carlo ray-trace (MCRT) model of the optical system coupled to a transient, three-dimensional finite-difference electrothermal model of the detector assembly with an analytic model of the signal-conditioning circuitry. The environment provides a complete simulation of the dynamic optical and electrothermal behavior of the instrument. The modeling environment is used to create an end-to-end model of the CERES scanning radiometer, and its performance is compared to the performance of an operational CERES total channel as a benchmark. A further objective of this effort is to formulate an efficient design environment for radiometric instruments. To this end, the modeling environment is then combined with evolutionary search algorithms known as genetic algorithms (GA's) to develop a methodology for optimal instrument design using high-level radiometric instrument models. GA's are applied to the design of the optical system and detector system separately and to both as an aggregate function with positive results.

  11. Data acquisition instruments: Psychopharmacology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartley, D.S. III

    This report contains the results of a Direct Assistance Project performed by Lockheed Martin Energy Systems, Inc., for Dr. K. O. Jobson. The purpose of the project was to perform preliminary analysis of the data acquisition instruments used in the field of psychiatry, with the goal of identifying commonalities of data and strategies for handling and using the data in the most advantageous fashion. Data acquisition instruments from 12 sources were provided by Dr. Jobson. Several commonalities were identified and a potentially useful data strategy is reported here. Analysis of the information collected for utility in performing diagnoses is recommended.more » In addition, further work is recommended to refine the commonalities into a directly useful computer systems structure.« less

  12. Ab initio simulation of diffractometer instrumental function for high-resolution X-ray diffraction1

    PubMed Central

    Mikhalychev, Alexander; Benediktovitch, Andrei; Ulyanenkova, Tatjana; Ulyanenkov, Alex

    2015-01-01

    Modeling of the X-ray diffractometer instrumental function for a given optics configuration is important both for planning experiments and for the analysis of measured data. A fast and universal method for instrumental function simulation, suitable for fully automated computer realization and describing both coplanar and noncoplanar measurement geometries for any combination of X-ray optical elements, is proposed. The method can be identified as semi-analytical backward ray tracing and is based on the calculation of a detected signal as an integral of X-ray intensities for all the rays reaching the detector. The high speed of calculation is provided by the expressions for analytical integration over the spatial coordinates that describe the detection point. Consideration of the three-dimensional propagation of rays without restriction to the diffraction plane provides the applicability of the method for noncoplanar geometry and the accuracy for characterization of the signal from a two-dimensional detector. The correctness of the simulation algorithm is checked in the following two ways: by verifying the consistency of the calculated data with the patterns expected for certain simple limiting cases and by comparing measured reciprocal-space maps with the corresponding maps simulated by the proposed method for the same diffractometer configurations. Both kinds of tests demonstrate the agreement of the simulated instrumental function shape with the measured data. PMID:26089760

  13. [Eating, nourishment and nutrition: instrumental analytic categories in the scientific research field].

    PubMed

    da Veiga Soares Carvalho, Maria Cláudia; Luz, Madel Therezinha; Prado, Shirley Donizete

    2011-01-01

    Eating, nourishment or nutrition circulate in our culture as synonyms and thus do not account for the changes that occur in nourishment, which intended or unintended, have a hybridization pattern that represents a change of rules and food preferences. This paper aims to take these common sense conceptions as analytic categories for analyzing and interpreting research for the Humanities and Health Sciences in a theoretical perspective, through conceptualization. The food is associated with a natural function (biological), a concept in which nature is opposed to culture, and nourishment takes cultural meanings (symbolic), expressing the division of labor, wealth, and a historical and cultural creation through which one can study a society. One attributes to Nutrition a sense of rational action, derived from the constitution of this science in modernity, inserted in a historical process of scientific rationalization of eating and nourishing. We believe that through the practice of conceptualization in interdisciplinary research, which involves a shared space of knowledge, we can be less constrained by a unified theoretical model of learning and be freer to think about life issues.

  14. Hanford analytical sample projections FY 1998--FY 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management,more » and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.« less

  15. Analytical characterization of a new mobile X-ray fluorescence and X-ray diffraction instrument combined with a pigment identification case study

    NASA Astrophysics Data System (ADS)

    Van de Voorde, Lien; Vekemans, Bart; Verhaeven, Eddy; Tack, Pieter; De Wolf, Robin; Garrevoet, Jan; Vandenabeele, Peter; Vincze, Laszlo

    2015-08-01

    A new, commercially available, mobile system combining X-ray diffraction and X-ray fluorescence has been evaluated which enables both elemental analysis and phase identification simultaneously. The instrument makes use of a copper or molybdenum based miniature X-ray tube and a silicon-Pin diode energy-dispersive detector to count the photons originating from the samples. The X-ray tube and detector are both mounted on an X-ray diffraction protractor in a Bragg-Brentano θ:θ geometry. The mobile instrument is one of the lightest and most compact instruments of its kind (3.5 kg) and it is thus very useful for in situ purposes such as the direct (non-destructive) analysis of cultural heritage objects which need to be analyzed on site without any displacement. The supplied software allows both the operation of the instrument for data collection and in-depth data analysis using the International Centre for Diffraction Data database. This paper focuses on the characterization of the instrument, combined with a case study on pigment identification and an illustrative example for the analysis of lead alloyed printing letters. The results show that this commercially available light-weight instrument is able to identify the main crystalline phases non-destructively, present in a variety of samples, with a high degree of flexibility regarding sample size and position.

  16. Analytical performance of a bronchial genomic classifier.

    PubMed

    Hu, Zhanzhi; Whitney, Duncan; Anderson, Jessica R; Cao, Manqiu; Ho, Christine; Choi, Yoonha; Huang, Jing; Frink, Robert; Smith, Kate Porta; Monroe, Robert; Kennedy, Giulia C; Walsh, P Sean

    2016-02-26

    The current standard practice of lung lesion diagnosis often leads to inconclusive results, requiring additional diagnostic follow up procedures that are invasive and often unnecessary due to the high benign rate in such lesions (Chest 143:e78S-e92, 2013). The Percepta bronchial genomic classifier was developed and clinically validated to provide more accurate classification of lung nodules and lesions that are inconclusive by bronchoscopy, using bronchial brushing specimens (N Engl J Med 373:243-51, 2015, BMC Med Genomics 8:18, 2015). The analytical performance of the Percepta test is reported here. Analytical performance studies were designed to characterize the stability of RNA in bronchial brushing specimens during collection and shipment; analytical sensitivity defined as input RNA mass; analytical specificity (i.e. potentially interfering substances) as tested on blood and genomic DNA; and assay performance studies including intra-run, inter-run, and inter-laboratory reproducibility. RNA content within bronchial brushing specimens preserved in RNAprotect is stable for up to 20 days at 4 °C with no changes in RNA yield or integrity. Analytical sensitivity studies demonstrated tolerance to variation in RNA input (157 ng to 243 ng). Analytical specificity studies utilizing cancer positive and cancer negative samples mixed with either blood (up to 10 % input mass) or genomic DNA (up to 10 % input mass) demonstrated no assay interference. The test is reproducible from RNA extraction through to Percepta test result, including variation across operators, runs, reagent lots, and laboratories (standard deviation of 0.26 for scores on > 6 unit scale). Analytical sensitivity, analytical specificity and robustness of the Percepta test were successfully verified, supporting its suitability for clinical use.

  17. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, i.e. drug dealing. This is further complicated by the concept of "in plain view". When performing an analysis of a hard drive what would be considered "in plain view" when analyzing a hard drive. The purpose of this paper is to discuss the issues of digital forensics and the related issues as they apply to visual analytics and identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and identify what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  18. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument.

    PubMed

    Mokkink, Lidwine B; Prinsen, Cecilia A C; Bouter, Lex M; Vet, Henrica C W de; Terwee, Caroline B

    2016-01-19

    COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments both in research and in clinical practice by developing tools for selecting the most appropriate available instrument. In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. Also new standards for studies using Item Response Theory methods will be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. In summary, we plea for more standardization in the use of outcome measurement instruments, for conducting high quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments.

  19. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument

    PubMed Central

    Mokkink, Lidwine B.; Prinsen, Cecilia A. C.; Bouter, Lex M.; de Vet, Henrica C. W.; Terwee, Caroline B.

    2016-01-01

    Background: COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments both in research and in clinical practice by developing tools for selecting the most appropriate available instrument. Method: In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. Also new standards for studies using Item Response Theory methods will be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. Conclusions: In summary, we plea for more standardization in the use of outcome measurement instruments, for conducting high quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments. PMID:26786084

  20. Analytical Chemistry (edited by R. Kellner, J.- M. Mermet, M. Otto, and H. M. Widmer)

    NASA Astrophysics Data System (ADS)

    Thompson, Reviewed By Robert Q.

    2000-04-01

    marginal notes. The text is divided into 5 parts (General Topics, Chemical Analysis, Physical Analysis, Computer-Based Analytical Chemistry, and Total Analysis Systems), 16 sections, and many chapters and subsections, all numbered and with headings for easy reference. The book provides comprehensive coverage of analytical science. Many curricula in North America cling to the tired notion of one semester of classical analytical (wet) chemistry followed by a second semester of instrumental analysis, and publishers continue to respond by publishing separate texts for each course. The Europeans, in contrast, have a text that bridges this artificial gap. Included are chapters and subsections on chemical equilibrium, electronic and vibrational spectroscopy, separations, and electrochemistry (found in most first courses in analytical chemistry). The authors also address atomic spectroscopy in all of its forms, luminescence, mass spectrometry, NMR spectrometry, surface analysis, thermal methods, activation analysis, and automated methods of analysis (found in most instrumental courses). Additional, uncommon chapters on chemical and biochemical sensors, immunoassay, chemometrics, miniaturized systems, and process analytical chemistry point toward the present and future of analytical science. The only glaring omission in comparison to other instrumental texts is in the area of measurement systems and electronics. No mention is made of the analytical laboratory, such as descriptions of glassware calibration and suggested experiments, as is found in most quantitative analysis texts in the U.S. The dangers in any multi-authored book include an uneven treatment of topics and a lack of cohesiveness and logical development of topics. I found some evidence of these problems in Analytical Chemistry. My first reaction to the Table of Contents and the grouping of chapters was "Where is ?" and "What about ?" While the order of topics in an analytical chemistry course always is open to debate

  1. Analytical and Experimental Studies of Leak Location and Environment Characterization for the International Space Station

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael; Abel, Joshua; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin; hide

    2014-01-01

    The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lb-mass/yr. to about 1 lb-mass/day. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.

  2. Analytical and Experimental Studies of Leak Location and Environment Characterization for the International Space Station

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael S.; Abel, Joshua C.; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin; hide

    2014-01-01

    The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system.An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations (directionality).The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lbmyr. to about 1 lbmday. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ramwake flows and structural shadowing within low Earth orbit.

  3. Analytic materials

    PubMed Central

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882

  4. Vertebral Compression Fractures after Lumbar Instrumentation.

    PubMed

    Granville, Michelle; Berti, Aldo; Jacobson, Robert E

    2017-09-29

    Lumbar spinal stenosis (LSS) is primarily found in an older population. This is a similar demographic group that develops both osteoporosis and vertebral compression fractures (VCF). This report reviewed a series of patients treated for VCF that had previous lumbar surgery for symptomatic spinal stenosis. Patients that only underwent laminectomy or fusion without instrumentation had a similar distribution of VCF as the non-surgical population in the mid-thoracic, or lower thoracic and upper lumbar spine. However, in the patients that had previous short-segment spinal instrumentation, fractures were found to be located more commonly in the mid-lumbar spine or sacrum adjacent to or within one or two spinal segments of the spinal instrumentation. Adjacent-level fractures that occur due to vertebral osteoporosis after long spinal segment instrumentation has been discussed in the literature. The purpose of this report is to highlight the previously unreported finding of frequent lumbar and sacral osteoporotic fractures in post-lumbar instrumentation surgery patients. Important additional factors found were lack of preventative medical treatment for osteoporosis, and secondary effects related to inactivity, especially during the first year after surgery.

  5. Laser light scattering instrument advanced technology development

    NASA Technical Reports Server (NTRS)

    Wallace, J. F.

    1993-01-01

    The objective of this advanced technology development (ATD) project has been to provide sturdy, miniaturized laser light scattering (LLS) instrumentation for use in microgravity experiments. To do this, we assessed user requirements, explored the capabilities of existing and prospective laser light scattering hardware, and both coordinated and participated in the hardware and software advances needed for a flight hardware instrument. We have successfully breadboarded and evaluated an engineering version of a single-angle glove-box instrument which uses solid state detectors and lasers, along with fiber optics, for beam delivery and detection. Additionally, we have provided the specifications and written verification procedures necessary for procuring a miniature multi-angle LLS instrument which will be used by the flight hardware project which resulted from this work and from this project's interaction with the laser light scattering community.

  6. Modification of a commercial gas chromatography isotope ratio mass spectrometer for on-line carbon isotope dilution: Evaluation of its analytical characteristics for the quantification of organic compounds.

    PubMed

    Alonso Sobrado, Laura; Robledo Fernández, Mario; Cueto Díaz, Sergio; Ruiz Encinar, Jorge; García Alonso, J Ignacio

    2015-11-06

    We describe the instrumental modification of a commercial gas chromatography isotope ratio mass spectrometer (GC-IRMS) and its application for on-line carbon isotope dilution. The main modification consisted in the addition of a constant flow of enriched (13)CO2 diluted in helium after the chromatographic column through the splitter holder located inside the chromatographic oven of the instrument. In addition, and in contrast to the conventional mode of operation of GC-IRMS instruments where the signal at m/z 45 is amplified 100-fold with respect to the signal at m/z 44, the same signal amplification was used in both Faraday cups at m/z 44 and 45. Under these conditions isotope ratio precision for the ratio 44/45 was around 0.05% RSD (n=50). The evaluation of the instrument was performed with mixtures of organic compounds including 11 n-alkanes, 16 PAHs, 12 PCBs and 3 benzothiophenes. It was observed that compounds of very different boiling points could be analysed without discrimination in the injector when a Programmable Temperature Vaporizer (PTV) injector was employed. Moreover, the presence of heteroatoms (Cl or S) in the structure of the organic compounds did not affect their combustion efficiency and therefore the trueness of the results. Quantitative results obtained for all the analytes assayed were excellent in terms of precision (<3% RSD) and accuracy (average relative error≤4%) and what is more important using a single and simple generic internal standard for quantification. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Laboratory Instruments Available to Support Space Station Researchers at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Panda, Binayak; Gorti, Sridhar

    2013-01-01

    A number of research instruments are available at NASA's Marshall Space Flight Center (MSFC) to support ISS researchers and their investigations. These modern analytical tools yield valuable and sometimes new informative resulting from sample characterization. Instruments include modern scanning electron microscopes equipped with field emission guns providing analytical capabilities that include angstron-level image resolution of dry, wet and biological samples. These microscopes are also equipped with silicon drift X-ray detectors (SDD) for fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations in crystalline alloys. Sample chambers admit large samples, provide variable pressures for wet samples, and quantitative analysis software to determine phase relations. Advances in solid-state electronics have also facilitated improvements for surface chemical analysis that are successfully employed to analyze metallic materials and alloys, ceramics, slags, and organic polymers. Another analytical capability at MSFC is a mganetic sector Secondary Ion Mass Spectroscopy (SIMS) that quantitatively determines and maps light elements such as hydrogen, lithium, and boron along with their isotopes, identifies and quantifies very low level impurities even at parts per billion (ppb) levels. Still other methods available at MSFC include X-ray photo-electron spectroscopy (XPS) that can determine oxidation states of elements as well as identify polymers and measure film thicknesses on coated materials, Scanning Auger electron spectroscopy (SAM) which combines surface sensitivity, spatial lateral resolution (approximately 20 nm), and depth profiling capabilities to describe elemental compositions in near surface regions and even the chemical state of analyzed atoms. Conventional Transmission Electron Microscope (TEM) for observing internal microstructures at very high magnifications and the Electron Probe

  8. Hybrid Instrumentation in Lumbar Spinal Fusion

    PubMed Central

    Danyali, Reza; Kueny, Rebecca; Huber, Gerd; Reichl, Michael; Richter, Alexander; Niemeyer, Thomas; Morlock, Michael; Püschel, Klaus; Übeyli, Hüseyin

    2017-01-01

    Study Design Ex vivo human cadaveric study. Objective The development or progression of adjacent segment disease (ASD) after spine stabilization and fusion is a major problem in spine surgery. Apart from optimal balancing of the sagittal profile, dynamic instrumentation is often suggested to prevent or impede ASD. Hybrid instrumentation is used to gain stabilization while allowing motion to avoid hypermobility in the adjacent segment. In this biomechanical study, the effects of two different hybrid instrumentations on human cadaver spines were evaluated and compared with a rigid instrumentation. Methods Eighteen human cadaver spines (T11–L5) were subdivided into three groups: rigid, dynamic, and hook comprising six spines each. Clinical parameters and initial mechanical characteristics were consistent among groups. All specimens received rigid fixation from L3–L5 followed by application of a free bending load of extension and flexion. The range of motion (ROM) for every segment was evaluated. For the rigid group, further rigid fixation from L1–L5 was applied. A dynamic Elaspine system (Spinelab AG, Winterthur, Switzerland) was applied from L1 to L3 for the dynamic group, and the hook group was instrumented with additional laminar hooks at L1–L3. ROM was then evaluated again. Results There was no significant difference in ROM among the three instrumentation techniques. Conclusion Based on this data, the intended advantage of a hybrid or dynamic instrumentation might not be achieved. PMID:28451509

  9. Instrumentation Cables Test Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muna, Alice Baca; LaFleur, Chris Bensdotter

    A fire at a nuclear power plant (NPP) has the potential to damage structures, systems, and components important to safety, if not promptly detected and suppressed. At Browns Ferry Nuclear Power Plant on March 22, 1975, a fire in the reactor building damaged electrical power and control systems. Damage to instrumentation cables impeded the function of both normal and standby reactor coolant systems, and degraded the operators’ plant monitoring capability. This event resulted in additional NRC involvement with utilities to ensure that NPPs are properly protected from fire as intended by the NRC principle design criteria (i.e., general design criteriamore » 3, Fire Protection). Current guidance and methods for both deterministic and performance based approaches typically make conservative (bounding) assumptions regarding the fire-induced failure modes of instrumentation cables and those failure modes effects on component and system response. Numerous fire testing programs have been conducted in the past to evaluate the failure modes and effects of electrical cables exposed to severe thermal conditions. However, that testing has primarily focused on control circuits with only a limited number of tests performed on instrumentation circuits. In 2001, the Nuclear Energy Institute (NEI) and the Electric Power Research Institute (EPRI) conducted a series of cable fire tests designed to address specific aspects of the cable failure and circuit fault issues of concern1. The NRC was invited to observe and participate in that program. The NRC sponsored Sandia National Laboratories to support this participation, whom among other things, added a 4-20 mA instrumentation circuit and instrumentation cabling to six of the tests. Although limited, one insight drawn from those instrumentation circuits tests was that the failure characteristics appeared to depend on the cable insulation material. The results showed that for thermoset insulated cables, the instrument reading tended

  10. Earth observing system instrument pointing control modeling for polar orbiting platforms

    NASA Technical Reports Server (NTRS)

    Briggs, H. C.; Kia, T.; Mccabe, S. A.; Bell, C. E.

    1987-01-01

    An approach to instrument pointing control performance assessment for large multi-instrument platforms is described. First, instrument pointing requirements and reference platform control systems for the Eos Polar Platforms are reviewed. Performance modeling tools including NASTRAN models of two large platforms, a modal selection procedure utilizing a balanced realization method, and reduced order platform models with core and instrument pointing control loops added are then described. Time history simulations of instrument pointing and stability performance in response to commanded slewing of adjacent instruments demonstrates the limits of tolerable slew activity. Simplified models of rigid body responses are also developed for comparison. Instrument pointing control methods required in addition to the core platform control system to meet instrument pointing requirements are considered.

  11. The Sound Access Parent Outcomes Instrument (SAPOI): Construction of a new instrument for children with severe multiple disabilities who use cochlear implants or hearing aids.

    PubMed

    Hayward, Denyse V; Ritter, Kathryn; Mousavi, Amin; Vatanapour, Shabnam

    2016-01-01

    To report on the Phase 2 development of the Sound Access Parent Outcomes Instrument (SAPOI), a new instrument focused on formalizing outcomes that parents of children with severe multiple disabilities (SMD) who use amplification prioritize as important. Phase 2 of this project involved item selection and refinement of the SAPOI based on (a) Phase 1 study participant input, (b) clinical specialist feedback, and (c) test-retest instrument reliability. Phase 1 participant responses were utilized to construct a draft version of the SAPOI. Next, clinical specialists examined the instrument for content validity and utility and instrument reliability was examined through a test-retest process with parents of children with SMD. The draft SAPOI was constructed based on Phase 1 participant input. Clinical specialists supported content validity and utility of the instrument and the inclusion of 19 additional items across four categories, namely Child Affect, Child Interaction, Parent Well-being, and Child's Device Use. The SAPOI was completed twice at one-month intervals by parents of children with SMD to examine instrument reliability across the four categories (Child Affect, Child Interaction, Parent Well-being, and Child's Device Use). Instrument reliability was strong-to-excellent across all four sections. The SAPOI shows promise as a much-needed addition to the assessment battery currently used for children with SMD who use cochlear implants and hearing aids. It provides valuable information regarding outcomes resulting from access to sound in this population that currently used assessments do not identify.

  12. The Relationship Between Specific Pavlovian Instrumental Transfer and Instrumental Reward Probability

    PubMed Central

    Cartoni, Emilio; Moretta, Tania; Puglisi-Allegra, Stefano; Cabib, Simona; Baldassarre, Gianluca

    2015-01-01

    Goal-directed behavior is influenced by environmental cues: in particular, cues associated with a reward can bias action choice toward actions directed to that same reward. This effect is studied experimentally as specific Pavlovian-instrumental transfer (specific PIT). We have investigated the hypothesis that cues associated to an outcome elicit specific PIT by rising the estimates of reward probability of actions associated to that same outcome. In other words, cues reduce the uncertainty on the efficacy of instrumental actions. We used a human PIT experimental paradigm to test the effects of two different instrumental contingencies: one group of participants had a 33% chance of being rewarded for each button press, while another had a 100% chance. The group trained with 33% reward probability showed a stronger PIT effect than the 100% group, in line with the hypothesis that Pavlovian cues linked to an outcome work by reducing the uncertainty of receiving it. The 100% group also showed a significant specific PIT effect, highlighting additional factors that could contribute to specific PIT beyond the instrumental training contingency. We hypothesize that the uncertainty about reward delivery due to testing in extinction might be one of these factors. These results add knowledge on how goal-directed behavior is influenced by the presence of environmental cues associated with a reward: such influence depends on the probability that we have to reach a reward, namely when there is less chance of getting a reward we are more influenced by cues associated with it, and vice versa. PMID:26635645

  13. Economic impact of laparoscopic instrumentation: a company perspective.

    PubMed

    Swem, T; Fazzalari, R

    1995-01-01

    This report represents findings concerning the economic impact of laparoscopic surgery. Specifically, the study addresses hospital costs, and not the hospital charges often given attention by studies in the literature. Hospital expenditures for the equipment and instrumentation required for laparoscopic surgery are important cost factors in laparoscopic surgery. Data for determining hospital costs was obtained from nine hospitals throughout the United States. At each hospital, a research team spent four to five days interviewing surgeons, OR staff, hospital administrators and other personnel as well as gathering data. Analysis of operating room equipment and supplies indicates that single-use laparoscopic instruments are a cost-effective alternative to reusable instruments. In addition, single-use instruments have many benefits that were not possible to quantify accurately in this study.

  14. Integrated Array/Metadata Analytics

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are an ubiquitous data type, that we find at the core of virtually all science and engineering domains, as sensor, model, image, statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive or non-existent at all in modern relational DBMS. Recognizing this, we extended SQL with a new SQL/MDA part seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman, that already implements SQL/MDA.

  15. Development of an Instrument Performance Simulation Capability for an Infrared Correlation Radiometer for Troposheric Carbon Monoxide Measurements From Geo

    NASA Technical Reports Server (NTRS)

    OsowskiNeil, Doreen; Yee, Jeng-Hwa; Boldt, John; Edwards, David

    2010-01-01

    We present the progress toward an analytical performance model of a 2.3 micron infrared correlation radiometer (IRCRg) prototype subsystem for a future geostationary space-borne instrument. The prototype is designed specifically to measure carbon monoxide (CO) from geostationary orbit. NASA's Geostationary Coastal and Air Pollution Events (GEO-CAPE) mission, one of the United States Earth Science and Applications Decadal Survey missions, specifies the use of infrared correlation radiometry to measure CO in two spectral regions for this mission. GEO-CAPE will use the robust IRCR measurement technique at geostationary orbit, nearly 50 times farther away than the Terra/MOPITT orbit, to determine hourly changes in CO across a continental domain. The abundance of CO in Earth's troposphere directly affects the concentration of hydroxyl, which regulates the lifetimes of many tropospheric pollutants. In addition, CO is a precursor to ozone formation; CO is used as a tracer to study the transport of global and regional pollutants; and CO is used as an indicator of both natural and anthropogenic air pollution sources and sinks. We have structured our development project to enable rapid evaluation of future spaceborne instrument designs. The project is part of NASA's Instrument Incubator Program. We describe the architecture of the performance model and the planned evaluation of the performance model using laboratory test data.

  16. Fundamental (f) oscillations in a magnetically coupled solar interior-atmosphere system - An analytical approach

    NASA Astrophysics Data System (ADS)

    Pintér, Balázs; Erdélyi, R.

    2018-01-01

    Solar fundamental (f) acoustic mode oscillations are investigated analytically in a magnetohydrodynamic (MHD) model. The model consists of three layers in planar geometry, representing the solar interior, the magnetic atmosphere, and a transitional layer sandwiched between them. Since we focus on the fundamental mode here, we assume the plasma is incompressible. A horizontal, canopy-like, magnetic field is introduced to the atmosphere, in which degenerated slow MHD waves can exist. The global (f-mode) oscillations can couple to local atmospheric Alfvén waves, resulting, e.g., in a frequency shift of the oscillations. The dispersion relation of the global oscillation mode is derived, and is solved analytically for the thin-transitional layer approximation and for the weak-field approximation. Analytical formulae are also provided for the frequency shifts due to the presence of a thin transitional layer and a weak atmospheric magnetic field. The analytical results generally indicate that, compared to the fundamental value (ω =√{ gk }), the mode frequency is reduced by the presence of an atmosphere by a few per cent. A thin transitional layer reduces the eigen-frequencies further by about an additional hundred microhertz. Finally, a weak atmospheric magnetic field can slightly, by a few percent, increase the frequency of the eigen-mode. Stronger magnetic fields, however, can increase the f-mode frequency by even up to ten per cent, which cannot be seen in observed data. The presence of a magnetic atmosphere in the three-layer model also introduces non-permitted propagation windows in the frequency spectrum; here, f-mode oscillations cannot exist with certain values of the harmonic degree. The eigen-frequencies can be sensitive to the background physical parameters, such as an atmospheric density scale-height or the rate of the plasma density drop at the photosphere. Such information, if ever observed with high-resolution instrumentation and inverted, could help to

  17. First Reprocessing of Southern Hemisphere Additional Ozonesondes (SHADOZ) Ozone Profiles (1998-2016): 2. Comparisons With Satellites and Ground-Based Instruments

    NASA Astrophysics Data System (ADS)

    Thompson, Anne M.; Witte, Jacquelyn C.; Sterling, Chance; Jordan, Allen; Johnson, Bryan J.; Oltmans, Samuel J.; Fujiwara, Masatomo; Vömel, Holger; Allaart, Marc; Piters, Ankie; Coetzee, Gert J. R.; Posny, Françoise; Corrales, Ernesto; Diaz, Jorge Andres; Félix, Christian; Komala, Ninong; Lai, Nga; Ahn Nguyen, H. T.; Maata, Matakite; Mani, Francis; Zainal, Zamuna; Ogino, Shin-ya; Paredes, Francisco; Penha, Tercio Luiz Bezerra; da Silva, Francisco Raimundo; Sallons-Mitro, Sukarni; Selkirk, Henry B.; Schmidlin, F. J.; Stübi, Rene; Thiongo, Kennedy

    2017-12-01

    The Southern Hemisphere ADditional OZonesonde (SHADOZ) network was assembled to validate a new generation of ozone-monitoring satellites and to better characterize the vertical structure of tropical ozone in the troposphere and stratosphere. Beginning with nine stations in 1998, more than 7,000 ozone and P-T-U profiles are available from 14 SHADOZ sites that have operated continuously for at least a decade. We analyze ozone profiles from the recently reprocessed SHADOZ data set that is based on adjustments for inconsistencies caused by varying ozonesonde instruments and operating techniques. First, sonde-derived total ozone column amounts are compared to the overpasses from the Earth Probe/Total Ozone Mapping Spectrometer, Ozone Monitoring Instrument, and Ozone Mapping and Profiler Suite satellites that cover 1998-2016. Second, characteristics of the stratospheric and tropospheric columns are examined along with ozone structure in the tropical tropopause layer (TTL). We find that (1) relative to our earlier evaluations of SHADOZ data, in 2003, 2007, and 2012, sonde-satellite total ozone column offsets at 12 stations are 2% or less, a significant improvement; (2) as in prior studies, the 10 tropical SHADOZ stations, defined as within ±19° latitude, display statistically uniform stratospheric column ozone, 229 ± 3.9 DU (Dobson units), and a tropospheric zonal wave-one pattern with a 14 DU mean amplitude; (3) the TTL ozone column, which is also zonally uniform, masks complex vertical structure, and this argues against using satellites for lower stratospheric ozone trends; and (4) reprocessing has led to more uniform stratospheric column amounts across sites and reduced bias in stratospheric profiles. As a consequence, the uncertainty in total column ozone now averages 5%.

  18. Instrumental neutron activation analysis for studying size-fractionated aerosols

    NASA Astrophysics Data System (ADS)

    Salma, Imre; Zemplén-Papp, Éva

    1999-10-01

    Instrumental neutron activation analysis (INAA) was utilized for studying aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in an amount of about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute ( k0) standardization method. The calibration procedures, application protocol and the data evaluation process are described and discussed. They make it possible now to analyse a considerable number of samples, with assuring the quality of the results. As a means of demonstrating the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be beyond the uncertainty represented by the sampling techniques and sample variability.

  19. The Development of an Instrument to Measure Creative Teaching Abilities.

    ERIC Educational Resources Information Center

    Riley, John F.

    The development of an instrument to measure creative teaching abilities, the Creative Teaching Dilemma (CTD), involved three phases. The instrument was constructed and refined, and scoring procedures were outlined. The activities comprising the CTD included defining the teaching dilemma, gathering additional facts, identifying and stating the…

  20. Space telescope scientific instruments

    NASA Technical Reports Server (NTRS)

    Leckrone, D. S.

    1979-01-01

    The paper describes the Space Telescope (ST) observatory, the design concepts of the five scientific instruments which will conduct the initial observatory observations, and summarizes their astronomical capabilities. The instruments are the wide-field and planetary camera (WFPC) which will receive the highest quality images, the faint-object camera (FOC) which will penetrate to the faintest limiting magnitudes and achieve the finest angular resolution possible, and the faint-object spectrograph (FOS), which will perform photon noise-limited spectroscopy and spectropolarimetry on objects substantially fainter than those accessible to ground-based spectrographs. In addition, the high resolution spectrograph (HRS) will provide higher spectral resolution with greater photometric accuracy than previously possible in ultraviolet astronomical spectroscopy, and the high-speed photometer will achieve precise time-resolved photometric observations of rapidly varying astronomical sources on short time scales.

  1. Semantic Interaction for Visual Analytics: Toward Coupling Cognition and Computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander

    2014-07-01

    The dissertation discussed in this article [1] was written in the midst of an era of digitization. The world is becoming increasingly instrumented with sensors, monitoring, and other methods for generating data describing social, physical, and natural phenomena. Thus, data exist with the potential of being analyzed to uncover, or discover, the phenomena from which it was created. However, as the analytic models leveraged to analyze these data continue to increase in complexity and computational capability, how can visualizations and user interaction methodologies adapt and evolve to continue to foster discovery and sensemaking?

  2. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  3. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations have grown dramatically over the past few years with the highest-fidelity models reaching down to 1.5 KM global resolutions. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS) that combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  4. Analysis of commercial equipment and instrumentation for Spacelab payloads, volume 2

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical results are presented of a study to investigate analytically the feasibility of using commercially available laboratory equipment and instrumentation in the spacelab in support of various experiments. The feasibility is demonstrated by the breadth of application of commercial, airborne, and military equipment to experiment equipment requirements in the spacelab, and the cost effectiveness of utilizing this class of equipment instead of custom-built aerospace equipment typical of past designs. Equipment design and specifications are discussed.

  5. Analytical study of sandwich structures using Euler-Bernoulli beam equation

    NASA Astrophysics Data System (ADS)

    Xue, Hui; Khawaja, H.

    2017-01-01

    This paper presents an analytical study of sandwich structures. In this study, the Euler-Bernoulli beam equation is solved analytically for a four-point bending problem. Appropriate initial and boundary conditions are specified to enclose the problem. In addition, the balance coefficient is calculated and the Rule of Mixtures is applied. The focus of this study is to determine the effective material properties and geometric features such as the moment of inertia of a sandwich beam. The effective parameters help in the development of a generic analytical correlation for complex sandwich structures from the perspective of four-point bending calculations. The main outcomes of these analytical calculations are the lateral displacements and longitudinal stresses for each particular material in the sandwich structure.

  6. Aquatic concentrations of chemical analytes compared to ecotoxicity estimates

    USGS Publications Warehouse

    Kostich, Mitchell S.; Flick, Robert W.; Angela L. Batt,; Mash, Heath E.; Boone, J. Scott; Furlong, Edward T.; Kolpin, Dana W.; Glassmeyer, Susan T.

    2017-01-01

    We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting more detailed characterization of these analytes.

  7. Aquatic concentrations of chemical analytes compared to ecotoxicity estimates.

    PubMed

    Kostich, Mitchell S; Flick, Robert W; Batt, Angela L; Mash, Heath E; Boone, J Scott; Furlong, Edward T; Kolpin, Dana W; Glassmeyer, Susan T

    2017-02-01

    We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting more detailed characterization of these analytes. Published by Elsevier B.V.

  8. Transcutaneous analyte measuring method (TAMM): a reflective, noninvasive, near-infrared blood chemistry analyzer

    NASA Astrophysics Data System (ADS)

    Schlager, Kenneth J.; Ruchti, Timothy L.

    1995-04-01

    TAMM for Transcutaneous Analyte Measuring Method is a near infrared spectroscopic technique for the noninvasive measurement of human blood chemistry. A near infrared indium gallium arsenide (InGaAs) photodiode array spectrometer has been developed and tested on over 1,000 patients as a part of an SBIR program sponsored by the Naval Medical Research and Development Command. Nine (9) blood analytes have been measured and evaluated during pre-clinical testing: sodium, chloride, calcium, potassium, bicarbonate, BUN, glucose, hematocrit and hemoglobin. A reflective rather than a transmissive invasive approach to measurement has been taken to avoid variations resulting from skin color and sensor positioning. The current status of the instrumentation, neural network pattern recognition algorithms and test results will be discussed.

  9. The Planned Soil Moisture Active Passive (SMAP) Mission L-Band Radar/Radiometer Instrument

    NASA Technical Reports Server (NTRS)

    Spencer, Michael; Wheeler, Kevin; Chan, Samuel; Piepmeier, Jeffrey; Hudson, Derek; Medeiros, James

    2011-01-01

    The Soil Moisture Active/Passive (SMAP) mission is a NASA mission identified by the NRC 'decadal survey' to measure both soil moisture and freeze/thaw state from space. The mission will use both active radar and passive radiometer instruments at L-Band. In order to achieve a wide swath at sufficiently high resolution for both active and passive channels, an instrument architecture that uses a large rotating reflector is employed. The instrument system has completed the preliminary design review (PDR) stage, and detailed instrument design has begun. In addition to providing an overview of the instrument design, two recent design modifications are discussed: 1) The addition of active thermal control to the instrument spun side to provide a more stable, settable thermal environment for the radiometer electronics, and 2) A 'sequential transmit' strategy for the two radar polarization channels which allows a single high-power amplifier to be used.

  10. 7 CFR 1951.240 - State Director's additional authorizations and guidance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... instruments. (4) Approve the extension or expansion of facilities and services. (5) Require additional security when: (i) Existing security is inadequate and the loan or security instruments obligate the... borrower equal the present market value and are assigned and pledged to FmHA or its successor agency under...

  11. Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).

    PubMed

    Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D

    2017-01-01

    Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.

  12. Towards a standardized method to assess straylight in earth observing optical instruments

    NASA Astrophysics Data System (ADS)

    Caron, J.; Taccola, M.; Bézy, J.-L.

    2017-09-01

    Straylight is a spurious effect that can seriously degrade the radiometric accuracy achieved by Earth observing optical instruments, as a result of the high contrast in the observed Earth radiance scenes and spectra. It is considered critical for several ESA missions such as Sentinel-5, FLEX and potential successors to CarbonSat. Although it is traditionally evaluated by Monte-Carlo simulations performed with commercial softwares (e.g. ASAP, Zemax, LightTools), semi-analytical approximate methods [1,2] have drawn some interest in recent years due to their faster computing time and the greater insight they provide in straylight mechanisms. They cannot replace numerical simulations, but may be more advantageous in contexts where many iterations are needed, for instance during the early phases of an instrument design.

  13. Multi-analyte validation in heterogeneous solution by ELISA.

    PubMed

    Lakshmipriya, Thangavel; Gopinath, Subash C B; Hashim, Uda; Murugaiyah, Vikneswaran

    2017-12-01

    Enzyme Linked Immunosorbent Assay (ELISA) is a standard assay that has been used widely to validate the presence of analyte in the solution. With the advancement of ELISA, different strategies have shown and became a suitable immunoassay for a wide range of analytes. Herein, we attempted to provide additional evidence with ELISA, to show its suitability for multi-analyte detection. To demonstrate, three clinically relevant targets have been chosen, which include 16kDa protein from Mycobacterium tuberculosis, human blood clotting Factor IXa and a tumour marker Squamous Cell Carcinoma antigen. Indeed, we adapted the routine steps from the conventional ELISA to validate the occurrence of analytes both in homogeneous and heterogeneous solutions. With the homogeneous and heterogeneous solutions, we could attain the sensitivity of 2, 8 and 1nM for the targets 16kDa protein, FIXa and SSC antigen, respectively. Further, the specific multi-analyte validations were evidenced with the similar sensitivities in the presence of human serum. ELISA assay in this study has proven its applicability for the genuine multiple target validation in the heterogeneous solution, can be followed for other target validations. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-Iaunching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and provide interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.

  15. Investigation of Stability of Precise Geodetic Instruments Used in Deformation Monitoring

    NASA Astrophysics Data System (ADS)

    Woźniak, Marek; Odziemczyk, Waldemar

    2017-12-01

    Monitoring systems using automated electronic total stations are an important element of safety control of many engineering objects. In order to ensure the appropriate credibility of acquired data, it is necessary that instruments (total stations in most of the cases) used for measurements meet requirements of measurement accuracy, as well as the stability of instrument axis system geometry. With regards to the above, it is expedient to conduct quality control of data acquired using electronic total stations in the context of performed measurement procedures. This paper presents results of research conducted at the Faculty of Geodesy and Cartography at Warsaw University of Technology investigating the stability of "basic" error values (collimation, zero location for V circle, inclination), for two types of automatic total stations: TDA 5005 and TCRP 1201+. Research provided also information concerning the influence of temperature changes upon the stability of investigated instrument's optical parameters. Results are presented in graphical analytic technique. Final conclusions propose methods, which allow avoiding negative results of measuring tool-set geometry changes during conducting precise deformation monitoring measurements.

  16. Entropy, instrument scan and pilot workload

    NASA Technical Reports Server (NTRS)

    Tole, J. R.; Stephens, A. T.; Vivaudou, M.; Harris, R. L., Jr.; Ephrath, A. R.

    1982-01-01

    Correlation and information theory which analyze the relationships between mental loading and visual scanpath of aircraft pilots are described. The relationship between skill, performance, mental workload, and visual scanning behavior are investigated. The experimental method required pilots to maintain a general aviation flight simulator on a straight and level, constant sensitivity, Instrument Landing System (ILS) course with a low level of turbulence. An additional periodic verbal task whose difficulty increased with frequency was used to increment the subject's mental workload. The subject's looppoint on the instrument panel during each ten minute run was computed via a TV oculometer and stored. Several pilots ranging in skill from novices to test pilots took part in the experiment. Analysis of the periodicity of the subject's instrument scan was accomplished by means of correlation techniques. For skilled pilots, the autocorrelation of instrument/dwell times sequences showed the same periodicity as the verbal task. The ability to multiplex simultaneous tasks increases with skill. Thus autocorrelation provides a way of evaluating the operator's skill level.

  17. Spectroscopic Instrumentation in Undergraduate Astronomy Laboratories

    NASA Astrophysics Data System (ADS)

    Ludovici, Dominic; Mutel, Robert Lucien; Lang, Cornelia C.

    2017-01-01

    We have designed and built two spectrographs for use in undergraduate astronomy laboratories at the University of Iowa. The first, a low cost (appx. $500) low resolution (R ~ 150 - 300) grating-prism (grism) spectrometer consists of five optical elements and is easily modified to other telescope optics. The grism spectrometer is designed to be used in a modified filter wheel. This type of spectrometer allows students to undertake projects requiring sensitive spectral measurements, such as determining the redshifts of quasars. The second instrument is a high resolution (R ~ 8000), moderate cost (appx. $5000) fiber fed echelle spectrometer. The echelle spectrometer will allow students to conduct Doppler measurements such as those used to study spectroscopic binaries. Both systems are designed to be used with robotic telescope systems. The availability of 3D printing enables both of these spectrographs to be constructed in hands-on instrumentation courses where students build and commission their own instruments. Additionally, these instruments enable introductory majors and non-majors laboratory students to gain experience conducting their own spectroscopic observations.

  18. Instrumentation: Software-Driven Instrumentation: The New Wave.

    ERIC Educational Resources Information Center

    Salit, M. L.; Parsons, M. L.

    1985-01-01

    Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…

  19. Siberian lidar station: instruments and results

    NASA Astrophysics Data System (ADS)

    Matvienko, Gennadii G.; Balin, Yurii S.; Bobrovnikov, Sergey M.; Romanovskii, Oleg A.; Kokhanenko, Grigirii P.; Samoilova, Svetlana V.; Penner, Ioganes E.; Gorlov, Evgenii V.; Zharkov, Victir I.; Sadovnikov, Sergey A.; Yakovlev, Semen V.; Bazhenov, Oleg E.; Dolgii, Sergey I.; Makeev, Andrey P.; Nevzorov, Alexey A.; Nevzorov, Alexey V.; Kharchenko, Olga V.

    2018-04-01

    The Siberian Lidar Station created at V.E. Zuev Institute of Atmospheric Optics and operating in Tomsk (56.5° N, 85.0° E) is a unique atmospheric observatory. It combines up-to-date instruments for remote laser and passive sounding for the study of aerosol and cloud fields, air temperature and humidity, and ozone and gaseous components of the ozone cycles. In addition to controlling a wide range of atmospheric parameters, the observatory allows simultaneous monitoring of the atmosphere throughout the valuable altitude range 0-75 km. In this paper, the instruments and results received at the Station are described.

  20. Analytical surveillance of emerging drugs of abuse and drug formulations

    PubMed Central

    Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan

    2012-01-01

    Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240

  1. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, the interest for more efficient strategies to stimulate population health is increasing. A possible successful strategy is population management (PM). PM strives to address health needs for the population at-risk and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The Care Continuum Alliance (CCA) population health guide, which recently changed their name in Population Health Alliance (PHA) provides a useful instrument for implementing and evaluating such innovative approaches. This framework is developed for PM specifically and describes the core elements of the PM-concept on the basis of six subsequent interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. Quantitative methods are refined and we operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Avoidance-based human Pavlovian-to-instrumental transfer

    PubMed Central

    Lewis, Andrea H.; Niznikiewicz, Michael A.; Delamater, Andrew R.; Delgado, Mauricio R.

    2013-01-01

    The Pavlovian-to-instrumental transfer (PIT) paradigm probes the influence of Pavlovian cues over instrumentally learned behavior. The paradigm has been used extensively to probe basic cognitive and motivational processes in studies of animal learning but, more recently, PIT and its underlying neural basis have been extended to investigations in humans. These initial neuroimaging studies of PIT have focused on the influence of appetitively conditioned stimuli on instrumental responses maintained by positive reinforcement, and highlight the involvement of the striatum. In the current study, we sought to understand the neural correlates of PIT in an aversive Pavlovian learning situation when instrumental responding was maintained through negative reinforcement. Participants exhibited specific PIT, wherein selective increases in instrumental responding to conditioned stimuli occurred when the stimulus signaled a specific aversive outcome whose omission negatively reinforced the instrumental response. Additionally, a general PIT effect was observed such that when a stimulus was associated with a different aversive outcome than was used to negatively reinforce instrumental behavior, the presence of that stimulus caused a non-selective increase in overall instrumental responding. Both specific and general PIT behavioral effects correlated with increased activation in corticostriatal circuitry, particularly in the striatum, a region involved in cognitive and motivational processes. These results suggest that avoidance-based PIT utilizes a similar neural mechanism to that seen with PIT in an appetitive context, which has implications for understanding mechanisms of drug-seeking behavior during addiction and relapse. PMID:24118624

  3. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetic, pharmaceuticals, sunscreen, powdered food are only few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To evaluate correctly benefits vs. risks of engineered nanomaterials and consequently to legislate in favor of consumer's protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. On analytical point of view the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices, in which they are dispersed, are complexes and often incompatible with analytical instruments that would be required for their detection and characterization. This paper focused on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. The need of a characterization of the NPs as much as possible complete, matching complementary information about different metrics, possible achieved through validate procedures, is what clearly emerges from this research. More work should be done to produce standardized materials and to set-up methodologies to determine number-based size distributions and to get quantitative date about the NPs in such a complex matrices.

  4. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetic, pharmaceuticals, sunscreen, powdered food are only few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To evaluate correctly benefits versus risks of engineered nanomaterials and consequently to legislate in favor of consumer’s protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. On analytical point of view the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices, in which they are dispersed, are complexes and often incompatible with analytical instruments that would be required for their detection and characterization. This paper focused on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. The need of a characterization of the NPs as much as possible complete, matching complementary information about different metrics, possible achieved through validate procedures, is what clearly emerges from this research. More work should be done to produce standardized materials and to set-up methodologies to determine number-based size distributions and to get quantitative date about the NPs in such a complex matrices.

  5. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetic, pharmaceuticals, sunscreen, powdered food are only few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To evaluate correctly benefits vs. risks of engineered nanomaterials and consequently to legislate in favor of consumer's protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. On analytical point of view the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices, in which they are dispersed, are complexes and often incompatible with analytical instruments that would be required for their detection and characterization. This paper focused on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. The need of a characterization of the NPs as much as possible complete, matching complementary information about different metrics, possible achieved through validate procedures, is what clearly emerges from this research. More work should be done to produce standardized materials and to set-up methodologies to determine number-based size distributions and to get quantitative date about the NPs in such a complex matrices. PMID:26301216

  6. [Development of opened instrument for generating and measuring physiological signal].

    PubMed

    Chen, Longcong; Hu, Guohu; Gao, Bin

    2004-12-01

    An opened instrument with liquid crystal display (LCD) for generating and measuring physiological signal is introduced in this paper. Based on a single-chip microcomputer. the instrument uses the technique of LCD screen to display signal wave and information, and it realizes man-machine interaction by keyboard. This instrument can produce not only defined signal in common use by utilizing important saved data and relevant arithmetic, but also user-defined signal. Therefore, it is open to produce signal. In addition, this instrument has strong extension because of its modularized design as computer, which has much function such as displaying, measuring and saving physiological signal, and many features such as low power consumption, small volume, low cost and portability. Hence this instrument is convenient for experiment teaching, clinic examining, maintaining of medical instrument.

  7. TMI-2 - A Case Study for PWR Instrumentation Performance during a Severe Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joy L. Rempe; Darrell L. Knudson

    2013-03-01

    The accident at the Three Mile Island Unit 2 (TMI-2) reactor provided a unique opportunity to evaluate sensors exposed to severe accident conditions. Conditions associated with the release of coolant and the hydrogen burn that occurred during this accident exposed instrumentation to harsh conditions, including direct radiation, radioactive contamination, and high humidity with elevated temperatures and pressures. As part of a program initiated in 2012 by the Department of Energy Office of Nuclear Energy (DOE-NE), a review was completed to gain insights from prior TMI-2 sensor survivability and data qualification efforts. This new effort focussed upon a set of sensorsmore » that provided critical data to TMI-2 operators for assessing the condition of the plant and the effects of mitigating actions taken by these operators. In addition, the effort considered sensors providing data required for subsequent accident simulations. Over 100 references related to instrumentation performance and post-accident evaluations of TMI-2 sensors and measurements were reviewed. Insights gained from this review are summarized within this report. For each sensor, a description is provided with the measured data and conclusions related to the sensor’s survivability, and the basis for conclusions about its survivability. As noted within this document, several techniques were invoked in the TMI-2 post-accident evaluation program to assess sensor status, including comparisons with data from other sensors, analytical calculations, laboratory testing, and comparisons with sensors subjected to similar conditions in large-scale integral tests and with sensors that were similar in design but more easily removed from the TMI-2 plant for evaluations. Conclusions from this review provide important insights related to sensor survivability and enhancement options for improving sensor performance. In addition, this document provides recommendations related to the sensor survivability and data

  8. TMI-2 - A Case Study for PWR Instrumentation Performance during a Severe Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joy L. Rempe; Darrell L. Knudson

    2014-05-01

    The accident at the Three Mile Island Unit 2 (TMI-2) reactor provided a unique opportunity to evaluate sensors exposed to severe accident conditions. Conditions associated with the release of coolant and the hydrogen burn that occurred during this accident exposed instrumentation to harsh conditions, including direct radiation, radioactive contamination, and high humidity with elevated temperatures and pressures. As part of a program initiated in 2012 by the Department of Energy Office of Nuclear Energy (DOE-NE), a review was completed to gain insights from prior TMI-2 sensor survivability and data qualification efforts. This new effort focussed upon a set of sensorsmore » that provided critical data to TMI-2 operators for assessing the condition of the plant and the effects of mitigating actions taken by these operators. In addition, the effort considered sensors providing data required for subsequent accident simulations. Over 100 references related to instrumentation performance and post-accident evaluations of TMI-2 sensors and measurements were reviewed. Insights gained from this review are summarized within this report. For each sensor, a description is provided with the measured data and conclusions related to the sensor’s survivability, and the basis for conclusions about its survivability. As noted within this document, several techniques were invoked in the TMI-2 post-accident evaluation program to assess sensor status, including comparisons with data from other sensors, analytical calculations, laboratory testing, and comparisons with sensors subjected to similar conditions in large-scale integral tests and with sensors that were similar in design but more easily removed from the TMI-2 plant for evaluations. Conclusions from this review provide important insights related to sensor survivability and enhancement options for improving sensor performance. In addition, this document provides recommendations related to the sensor survivability and data

  9. Analytical effective tensor for flow-through composites

    DOEpatents

    Sviercoski, Rosangela De Fatima [Los Alamos, NM

    2012-06-19

    A machine, method and computer-usable medium for modeling an average flow of a substance through a composite material. Such a modeling includes an analytical calculation of an effective tensor K.sup.a suitable for use with a variety of media. The analytical calculation corresponds to an approximation to the tensor K, and follows by first computing the diagonal values, and then identifying symmetries of the heterogeneity distribution. Additional calculations include determining the center of mass of the heterogeneous cell and its angle according to a defined Cartesian system, and utilizing this angle into a rotation formula to compute the off-diagonal values and determining its sign.

  10. Reduction of dioxin-like toxicity in effluents by additional wastewater treatment and related effects in fish.

    PubMed

    Maier, Diana; Benisek, Martin; Blaha, Ludek; Dondero, Francesco; Giesy, John P; Köhler, Heinz-R; Richter, Doreen; Scheurer, Marco; Triebskorn, Rita

    2016-10-01

    Efficiency of advanced wastewater treatment technologies to reduce micropollutants which mediate dioxin-like toxicity was investigated. Technologies compared included ozonation, powdered activated carbon and granular activated carbon. In addition to chemical analyses in samples of effluents, surface waters, sediments, and fish, (1) dioxin-like potentials were measured in paired samples of effluents, surface waters, and sediments by use of an in vitro biotest (reporter gene assay) and (2) dioxin-like effects were investigated in exposed fish by use of in vivo activity of the mixed-function, monooxygenase enzyme, ethoxyresorufin O-deethylase (EROD) in liver. All advanced technologies studied, based on degradation or adsorption, significantly reduced dioxin-like potentials in samples and resulted in lesser EROD activity in livers of fish. Results of in vitro and in vivo biological responses were not clearly related to quantification of targeted analytes by use of instrumental analyses. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Focus determination for the James Webb Space Telescope Science Instruments: A Survey of Methods

    NASA Technical Reports Server (NTRS)

    Davila, Pamela S.; Bolcar, Matthew R.; Boss, B.; Dean, B.; Hapogian, J.; Howard, J.; Unger, B.; Wilson, M.

    2006-01-01

    The James Webb Space Telescope (JWST) is a segmented deployable telescope that will require on-orbit alignment using the Near Infrared Camera as a wavefront sensor. The telescope will be aligned by adjusting seven degrees of freedom on each of 18 primary mirror segments and five degrees of freedom on the secondary mirror to optimize the performance of the telescope and camera at a wavelength of 2 microns. With the completion of these adjustments, the telescope focus is set and the optical performance of each of the other science instruments should then be optimal without making further telescope focus adjustments for each individual instrument. This alignment approach requires confocality of the instruments after integration and alignment to the composite metering structure, which will be verified during instrument level testing at Goddard Space Flight Center with a telescope optical simulator. In this paper, we present the results from a study of several analytical approaches to determine the focus for each instrument. The goal of the study is to compare the accuracies obtained for each method, and to select the most feasible for use during optical testing.

  12. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number and volume, and sources, of globally available Earth science data measurements and datasets have afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https:earthdata.nasa.govaboutsystem-- performance). In addition, other US agency, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co--analyzing huge amounts of heterogeneous data to glean out unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases have been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. ESIP generated use case template, ESDA use cases, use case types, and preliminary use case analysis (this is a work in progress) will be presented.

  13. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  14. Analytical and experimental studies of leak location and environment characterization for the international space station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woronowicz, Michael; Blackmon, Rebecca; Brown, Martin

    2014-12-09

    The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to demonstrate the ability to detect NH{sub 3} coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performancemore » to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations (“directionality”). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lb{sub m/}/yr. to about 1 lb{sub m}/day. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.« less

  15. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  16. Analytical Chemical Sensing in the Submillimeter/terahertz Spectral Range

    NASA Astrophysics Data System (ADS)

    Moran, Benjamin L.; Fosnight, Alyssa M.; Medvedev, Ivan R.; Neese, Christopher F.

    2012-06-01

    Highly sensitive and selective Terahertz sensor utilized to quantitatively analyze a complex mixture of Volatile Organic Compounds is reported. To best demonstrate analytical capabilities of THz chemical sensors we chose to perform analytical quantitative analysis of a certified gas mixture using a novel prototype chemical sensor that couples a commercial preconcentration system (Entech 7100A) to a high resolution THz spectrometer. We selected Method TO-14A certified mixture of 39 volatile organic compounds (VOCs) diluted to 1 part per million (ppm) in nitrogen. 26 of the 39 chemicals were identified by us as suitable for THz spectroscopic detection. Entech 7100A system is designed and marketed as an inlet system for Gas Chromatography-Mass Spectrometry (GC-MS) instruments with a specific focus on TO-14 and TO-15 EPA sampling methods. Its preconcentration efficiency is high for the 39 chemicals in the mixture used for this study and our preliminary results confirm this. Here we present the results of this study which serves as basis for our ongoing research in environmental sensing and analysis of exhaled human breath.

  17. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

    A new era is dawning where vast amount of data is subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good performance at run-time to data analytics workload is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.

  18. 14 CFR Appendix B to Part 29 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  19. 14 CFR Appendix B to Part 29 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  20. 14 CFR Appendix B to Part 27 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  1. 14 CFR Appendix B to Part 27 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  2. 14 CFR Appendix B to Part 29 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  3. 14 CFR Appendix B to Part 29 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  4. 14 CFR Appendix B to Part 27 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  5. 14 CFR Appendix B to Part 27 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  6. 14 CFR Appendix B to Part 27 - Airworthiness Criteria for Helicopter Instrument Flight

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... systems that operate the required flight instruments at each pilot's station— (i) Only the required flight instruments for the first pilot may be connected to that operating system; (ii) Additional instruments, systems, or equipment may not be connected to an operating system for a second pilot unless provisions are...

  7. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  8. Line-Focused Optical Excitation of Parallel Acoustic Focused Sample Streams for High Volumetric and Analytical Rate Flow Cytometry.

    PubMed

    Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W

    2017-09-19

    Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.

  9. Results of instrument reliability study for high-level nuclear-waste repositories. [Geotechnical parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogue, F.; Binnall, E.P.

    1982-10-01

    Reliable instrumentation will be needed to monitor the performance of future high-level waste repository sites. A study has been made to assess instrument reliability at Department of Energy (DOE) waste repository related experiments. Though the study covers a wide variety of instrumentation, this paper concentrates on experiences with geotechnical instrumentation in hostile repository-type environments. Manufacturers have made some changes to improve the reliability of instruments for repositories. This paper reviews the failure modes, rates, and mechanisms, along with manufacturer modifications and recommendations for additional improvements to enhance instrument performance. 4 tables.

  10. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  11. Validating Affordances as an Instrument for Design and a Priori Analysis of Didactical Situations in Mathematics

    ERIC Educational Resources Information Center

    Sollervall, Håkan; Stadler, Erika

    2015-01-01

    The aim of the presented case study is to investigate how coherent analytical instruments may guide the a priori and a posteriori analyses of a didactical situation. In the a priori analysis we draw on the notion of affordances, as artefact-mediated opportunities for action, to construct hypothetical trajectories of goal-oriented actions that have…

  12. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market and their ingredients, must be safe under reasonable conditions of use, in accordance to the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with the possibilities of fulfilment of the current regulatory issues. The cosmetic legislation is frequently being updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to the sample pretreatment and extraction and the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In the last times, the research conducted covering this aspect, tended to the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single analyte methods have been proposed, although the most useful ones in the real life cosmetic analysis are the multianalyte approaches. Copyright © 2016

  13. Using Model-Based Reasoning for Autonomous Instrument Operation

    NASA Technical Reports Server (NTRS)

    Johnson, Mike; Rilee, M.; Truszkowski, W.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    Multiprobe missions are an important part of NASA's future: Cluster, Magnetospheric Multi Scale, Global Electrodynamics and Magnetospheric Constellation are representatives from the Sun-Earth Connections Theme. To make such missions robust, reliable, and affordable, ideally the many spacecraft of a constellation must be at least as easy to operate as one spacecraft is today. To support this need for scalability, science instrumentation must become increasingly easy to operate, even as this same instrumentation becomes more capable and advanced. Communication and control resources will be at a premium for future instruments. Many missions will be out of contact with ground operators for extended periods either to reduce operations cost or because of orbits that limit communication to weekly perigee transits. Autonomous capability is necessary if such missions are to effectively achieve their operational objectives. An autonomous system is one that acts given its situation in a mission appropriate manner without external direction to achieve mission goals. To achieve this capability autonomy must be built into the system through judicious design or through a built-in intelligence that recognizes system state and manages system response. To recognize desired or undesired system states, the system must have an implicit or explicit understanding of its expected states given its history and self observations. The systems we are concerned with, science instruments, can have stringent requirements for system state knowledge in addition to requirements driven by health and safety concerns. Without accurate knowledge of the system state, the usefulness of the science instrument may be severely limited. At the same time, health and safety concerns often lead to overly conservative instrument operations further reducing the effectiveness of the instrument. These requirements, coupled with overall mission requirements including lack of communication opportunities and tolerance

  14. Nuclear analytical techniques in medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesareo, R.

    1988-01-01

    This book acquaints one with the fundamental principles and the instrumentation relevant to analytical technique based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF-analysis; the ability of the PIXE-microprobe to analyze in detail and tomore » map trace elements in fragments of biomedical samples or inside the cells; the potentiality of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements which will undoubtedly see great development in the immediate future.« less

  15. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Astrophysics Data System (ADS)

    Jaggi, S.

    1993-02-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.

  16. Radiation Budget Instrument (RBI) for JPSS-2

    NASA Technical Reports Server (NTRS)

    Georgieva, Elena; Priestley, Kory; Dunn, Barry; Cageao, Richard; Barki, Anum; Osmundsen, Jim; Turczynski, Craig; Abedin, Nurul

    2015-01-01

    Radiation Budget Instrument (RBI) will be one of five instruments flying aboard the JPSS-2 spacecraft, a polar-orbiting sun-synchronous satellite in Low Earth Orbit. RBI is a passive remote sensing instrument that will follow the successful legacy of the Clouds and Earth's Radiant Energy System (CERES) instruments to make measurement of Earth's short and longwave radiation budget. The goal of RBI is to provide an independent measurement of the broadband reflected solar radiance and Earth's emitted thermal radiance by using three spectral bands (Shortwave, Longwave, and Total) that will have the same overlapped point spread function (PSF) footprint on Earth. To ensure precise NIST-traceable calibration in space the RBI sensor is designed to use a visible calibration target (VCT), a solar calibration target (SCT), and an infrared calibration target (ICT) containing phase change cells (PCC) to enable on-board temperature calibration. The VCT is a thermally controlled integrating sphere with space grade Spectralon covering the inner surface. Two sides of the sphere will have fiber-coupled laser diodes in the UV to IR wavelength region. An electrical substitution radiometer on the integrating sphere will monitor the long term stability of the sources and the possible degradation of the Spectralon in space. In addition the radiometric calibration operations will use the Spectralon diffusers of the SCT to provide accurate measurements of Solar degradation. All those stable on-orbit references will ensure that calibration stability is maintained over the RBI sensor lifetime. For the preflight calibration the RBI will view five calibration sources - two integrating spheres and three CrIS (Cross-track Infrared Sounder ) -like blackbodies whose outputs will be validated with NIST calibration approach. Thermopile are the selected detectors for the RBI. The sensor has a requirement to perform lunar calibration in addition to solar calibration in space in a way similar to CERES

  17. Analytical methods for human biomonitoring of pesticides. A review.

    PubMed

    Yusa, Vicent; Millet, Maurice; Coscolla, Clara; Roca, Marta

    2015-09-03

    Biomonitoring of both currently-used and banned-persistent pesticides is a very useful tool for assessing human exposure to these chemicals. In this review, we present current approaches and recent advances in the analytical methods for determining the biomarkers of exposure to pesticides in the most commonly used specimens, such as blood, urine, and breast milk, and in emerging non-invasive matrices such as hair and meconium. We critically discuss the main applications for sample treatment, and the instrumental techniques currently used to determine the most relevant pesticide biomarkers. We finally look at the future trends in this field. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Sources Sought for Innovative Scientific Instrumentation for Scientific Lunar Rovers

    NASA Technical Reports Server (NTRS)

    Meyer, C.

    1993-01-01

    Lunar rovers should be designed as integrated scientific measurement systems that address scientific goals as their main objective. Scientific goals for lunar rovers are presented. Teleoperated robotic field geologists will allow the science team to make discoveries using a wide range of sensory data collected by electronic 'eyes' and sophisticated scientific instrumentation. rovers need to operate in geologically interesting terrain (rock outcrops) and to identify and closely examine interesting rock samples. Enough flight-ready instruments are available to fly on the first mission, but additional instrument development based on emerging technology is desirable. Various instruments that need to be developed for later missions are described.

  19. Instrumentation progress at the Giant Magellan Telescope project

    NASA Astrophysics Data System (ADS)

    Jacoby, George H.; Bernstein, R.; Bouchez, A.; Colless, M.; Crane, Jeff; DePoy, D.; Espeland, B.; Hare, Tyson; Jaffe, D.; Lawrence, J.; Marshall, J.; McGregor, P.; Shectman, Stephen; Sharp, R.; Szentgyorgyi, A.; Uomoto, Alan; Walls, B.

    2016-08-01

    Instrument development for the 24m Giant Magellan Telescope (GMT) is described: current activities, progress, status, and schedule. One instrument team has completed its preliminary design and is currently beginning its final design (GCLEF, an optical 350-950 nm, high-resolution and precision radial velocity echelle spectrograph). A second instrument team is in its conceptual design phase (GMACS, an optical 350-950 nm, medium resolution, 6-10 arcmin field, multi-object spectrograph). A third instrument team is midway through its preliminary design phase (GMTIFS, a near-IR YJHK diffraction-limited imager/integral-field-spectrograph), focused on risk reduction prototyping and design optimization. A fourth instrument team is currently fabricating the 5 silicon immersion gratings needed to begin its preliminary design phase (GMTNIRS, a simultaneous JHKLM high-resolution, AO-fed, echelle spectrograph). And, another instrument team is focusing on technical development and prototyping (MANIFEST, a facility robotic, multifiber feed, with a 20 arcmin field of view). In addition, a medium-field (6 arcmin, 0.06 arcsec/pix) optical imager will support telescope and AO commissioning activities, and will excel at narrow-band imaging. In the spirit of advancing synergies with other groups, the challenges of running an ELT instrument program and opportunities for cross-ELT collaborations are discussed.

  20. Assessment regarding the use of the computer aided analytical models in the calculus of the general strength of a ship hull

    NASA Astrophysics Data System (ADS)

    Hreniuc, V.; Hreniuc, A.; Pescaru, A.

    2017-08-01

    Solving a general strength problem of a ship hull may be done using analytical approaches which are useful to deduce the buoyancy forces distribution, the weighting forces distribution along the hull and the geometrical characteristics of the sections. These data are used to draw the free body diagrams and to compute the stresses. The general strength problems require a large amount of calculi, therefore it is interesting how a computer may be used to solve such problems. Using computer programming an engineer may conceive software instruments based on analytical approaches. However, before developing the computer code the research topic must be thoroughly analysed, in this way being reached a meta-level of understanding of the problem. The following stage is to conceive an appropriate development strategy of the original software instruments useful for the rapid development of computer aided analytical models. The geometrical characteristics of the sections may be computed using a bool algebra that operates with ‘simple’ geometrical shapes. By ‘simple’ we mean that for the according shapes we have direct calculus relations. In the set of ‘simple’ shapes we also have geometrical entities bounded by curves approximated as spline functions or as polygons. To conclude, computer programming offers the necessary support to solve general strength ship hull problems using analytical methods.

  1. Data enhancement and analysis through mathematical deconvolution of signals from scientific measuring instruments

    NASA Technical Reports Server (NTRS)

    Wood, G. M.; Rayborn, G. H.; Ioup, J. W.; Ioup, G. E.; Upchurch, B. T.; Howard, S. J.

    1981-01-01

    Mathematical deconvolution of digitized analog signals from scientific measuring instruments is shown to be a means of extracting important information which is otherwise hidden due to time-constant and other broadening or distortion effects caused by the experiment. Three different approaches to deconvolution and their subsequent application to recorded data from three analytical instruments are considered. To demonstrate the efficacy of deconvolution, the use of these approaches to solve the convolution integral for the gas chromatograph, magnetic mass spectrometer, and the time-of-flight mass spectrometer are described. Other possible applications of these types of numerical treatment of data to yield superior results from analog signals of the physical parameters normally measured in aerospace simulation facilities are suggested and briefly discussed.

  2. A new innovative instrument for space plasma instrumentation

    NASA Technical Reports Server (NTRS)

    Torbert, Roy B.

    1993-01-01

    The Faraday Ring Ammeter was the subject of this grant for a new innovative instrument for space plasma instrumentation. This report summarizes our progress in this work. Briefly, we have conducted an intensive series of experiments and trials over three years, testing some five configurations of the instrument to measure currents, resulting in two Ph.D. theses, supported by this grant, and two flight configurations of the instrument. The first flight would have been on a NASA-Air Force collaborative sounding rocket, but was not flown because of instrumental difficulties. The second has been successfully integrated on the NASA Auroral Turbulence payload which is to be launched in February, 1994.

  3. Instrumentation development for drug detection on the breath

    DOT National Transportation Integrated Search

    1972-09-01

    Based on a survey of candidate analytical methods, mass spectrometry was identified as a promising technique for drug detection on the breath. To demonstrate its capabilities, an existing laboratory mass spectrometer was modified by the addition of a...

  4. GENES AS INSTRUMENTS FOR STUDYING RISK BEHAVIOR EFFECTS: AN APPLICATION TO MATERNAL SMOKING AND OROFACIAL CLEFTS

    PubMed Central

    Jugessur, Astanand; Murray, Jeffrey C.; Moreno, Lina; Wilcox, Allen; Lie, Rolv T.

    2011-01-01

    This study uses instrumental variable (IV) models with genetic instruments to assess the effects of maternal smoking on the child’s risk of orofacial clefts (OFC), a common birth defect. The study uses genotypic variants in neurotransmitter and detoxification genes relateded to smoking as instruments for cigarette smoking before and during pregnancy. Conditional maximum likelihood and two-stage IV probit models are used to estimate the IV model. The data are from a population-level sample of affected and unaffected children in Norway. The selected genetic instruments generally fit the IV assumptions but may be considered “weak” in predicting cigarette smoking. We find that smoking before and during pregnancy increases OFC risk substantially under the IV model (by about 4–5 times at the sample average smoking rate). This effect is greater than that found with classical analytic models. This may be because the usual models are not able to consider self-selection into smoking based on unobserved confounders, or it may to some degree reflect limitations of the instruments. Inference based on weak-instrument robust confidence bounds is consistent with standard inference. Genetic instruments may provide a valuable approach to estimate the “causal” effects of risk behaviors with genetic-predisposing factors (such as smoking) on health and socioeconomic outcomes. PMID:22102793

  5. Analytical Enantioseparation of β-Substituted-2-Phenylpropionic Acids by High-Performance Liquid Chromatography with Hydroxypropyl-β-Cyclodextrin as Chiral Mobile Phase Additive.

    PubMed

    Tong, Shengqiang; Zhang, Hu; Yan, Jizhong

    2016-04-01

    Analytical enantioseparation of five β-substituted-2-phenylpropionic acids by high-performance liquid chromatography with hydroxypropyl-β-cyclodextrin (HP-β-CD) as chiral mobile phase additive was established in this paper, and chromatographic retention mechanism was studied. The effects of various factors such as the organic modifier, different ODS C18 columns and concentration of HP-β-CD were investigated. The chiral mobile phase was composed of methanol or acetonitrile and 0.5% triethylamine acetate buffer at pH 3.0 added with 25 mmol L(-1) of HP-β-CD, and baseline separations could be reached for all racemates. As for chromatographic retention mechanism, it was found that there was a negative correlation between the concentration of HP-β-CD in mobile phase and the retention factor under constant pH value and column temperature. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Instrument Remote Control via the Astronomical Instrument Markup Language

    NASA Technical Reports Server (NTRS)

    Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard

    1998-01-01

    The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human readable manner, has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control applies to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.

  7. Control of vacuum induction brazing system for sealing of instrumentation feed-through

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung Ho Ahn; Jintae Hong; Chang Young Joung

    2015-07-01

    The integrity of instrumentation cables is an important performance parameter in addition to the sealing performance in the brazing process. An accurate brazing control was developed for the brazing of the instrumentation feed-through in the vacuum induction brazing system in this paper. The experimental results show that the accurate brazing temperature control performance is achieved by the developed control scheme. Consequently, the sealing performances of the instrumentation feed-through and the integrities of the instrumentation cables were satisfied after brazing. (authors)

  8. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  9. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82

  10. Systems and Methods for Composable Analytics

    DTIC Science & Technology

    2014-04-29

    simplistic module that performs a mathematical operation on two numbers. The most important method is the Execute() method. This will get called when it is...addition, an input control is also specified in the example below. In this example, the mathematical operator can only be chosen from a preconfigured...approaches. Some of the industries that could benefit from Composable Analytics include pharmaceuticals, health care, insurance, actuaries , and

  11. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is known now as AQbD (analytical quality by design). It allows the analytical method for movement within method operable design region (MODR). Unlike current methods, analytical method developed using analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend among pharmaceutical industry to implement analytical quality by design (AQbD) in method development process as a part of risk management, pharmaceutical development, and pharmaceutical quality system (ICH Q10). Owing to the lack explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in pharmaceutical quality system and also to correlate with product quality by design and pharmaceutical analytical technology (PAT).

  12. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is known now as AQbD (analytical quality by design). It allows the analytical method for movement within method operable design region (MODR). Unlike current methods, analytical method developed using analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend among pharmaceutical industry to implement analytical quality by design (AQbD) in method development process as a part of risk management, pharmaceutical development, and pharmaceutical quality system (ICH Q10). Owing to the lack explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in pharmaceutical quality system and also to correlate with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  13. The Quantitative Resolution of a Mixture of Group II Metal Ions by Thermometric Titration with EDTA. An Analytical Chemistry Experiment.

    ERIC Educational Resources Information Center

    Smith, Robert L.; Popham, Ronald E.

    1983-01-01

    Presents an experiment in thermometric titration used in an analytic chemistry-chemical instrumentation course, consisting of two titrations, one a mixture of calcium and magnesium, the other of calcium, magnesium, and barium ions. Provides equipment and solutions list/specifications, graphs, and discussion of results. (JM)

  14. Significance Testing in Confirmatory Factor Analytic Models.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; Hocevar, Dennis

    Traditionally, confirmatory factor analytic models are tested against a null model of total independence. Using randomly generated factors in a matrix of 46 aptitude tests, this approach is shown to be unlikely to reject even random factors. An alternative null model, based on a single general factor, is suggested. In addition, an index of model…

  15. Prediction of sickness absence: development of a screening instrument

    PubMed Central

    Duijts, S F A; Kant, IJ; Landeweerd, J A; Swaen, G M H

    2006-01-01

    Objectives To develop a concise screening instrument for early identification of employees at risk for sickness absence due to psychosocial health complaints. Methods Data from the Maastricht Cohort Study on “Fatigue at Work” were used to identify items to be associated with an increased risk of sickness absence. The analytical procedures univariate logistic regression, backward stepwise linear regression, and multiple logistic regression were successively applied. For both men and women, sum scores were calculated, and sensitivity and specificity rates of different cut‐off points on the screening instrument were defined. Results In women, results suggested that feeling depressed, having a burnout, being tired, being less interested in work, experiencing obligatory change in working days, and living alone, were strong predictors of sickness absence due to psychosocial health complaints. In men, statistically significant predictors were having a history of sickness absence, compulsive thinking, being mentally fatigued, finding it hard to relax, lack of supervisor support, and having no hobbies. A potential cut‐off point of 10 on the screening instrument resulted in a sensitivity score of 41.7% for women and 38.9% for men, and a specificity score of 91.3% for women and 90.6% for men. Conclusions This study shows that it is possible to identify predictive factors for sickness absence and to develop an instrument for early identification of employees at risk for sickness absence. The results of this study increase the possibility for both employers and policymakers to implement interventions directed at the prevention of sickness absence. PMID:16698807

  16. Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.

    PubMed

    Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel

    2017-01-01

    This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 till May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted desorption/ionization and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples illustrate also applications in the proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography, and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Electric vehicle power train instrumentation: Some constraints and considerations

    NASA Technical Reports Server (NTRS)

    Triner, J. E.; Hansen, I. G.

    1977-01-01

    The application of pulse modulation control (choppers) to dc motors creates unique instrumentation problems. In particular, the high harmonic components contained in the current waveforms require frequency response accommodations not normally considered in dc instrumentation. In addition to current sensing, accurate power measurement requires not only adequate frequency response but must also address phase errors caused by the finite bandwidths and component characteristics involved. The implications of these problems are assessed.

  18. Application of Steinberg vibration fatigue model for structural verification of space instruments

    NASA Astrophysics Data System (ADS)

    García, Andrés; Sorribes-Palmer, Félix; Alonso, Gustavo

    2018-01-01

    Electronic components in spaceships are subjected to vibration loads during the ascent phase of the launcher. It is important to verify by tests and analysis that all parts can survive in the most severe load cases. The purpose of this paper is to present the methodology and results of the application of the Steinberg's fatigue model to estimate the life of electronic components of the EPT-HET instrument for the Solar Orbiter space mission. A Nastran finite element model (FEM) of the EPT-HET instrument was created and used for the structural analysis. The methodology is based on the use of the FEM of the entire instrument to calculate the relative displacement RDSD and RMS values of the PCBs from random vibration analysis. These values are used to estimate the fatigue life of the most susceptible electronic components with the Steinberg's fatigue damage equation and the Miner's cumulative fatigue index. The estimations are calculated for two different configurations of the instrument and three different inputs in order to support the redesign process. Finally, these analytical results are contrasted with the inspections and the functional tests made after the vibration tests, concluding that this methodology can adequately predict the fatigue damage or survival of the electronic components.

  19. A mathematical model for describing the mechanical behaviour of root canal instruments.

    PubMed

    Zhang, E W; Cheung, G S P; Zheng, Y F

    2011-01-01

    The purpose of this study was to establish a general mathematical model for describing the mechanical behaviour of root canal instruments by combining a theoretical analytical approach with a numerical finite-element method. Mathematical formulas representing the longitudinal (taper, helical angle and pitch) and cross-sectional configurations and area, the bending and torsional inertia, the curvature of the boundary point and the (geometry of) loading condition were derived. Torsional and bending stresses and the resultant deformation were expressed mathematically as a function of these geometric parameters, modulus of elasticity of the material and the applied load. As illustrations, three brands of NiTi endodontic files of different cross-sectional configurations (ProTaper, Hero 642, and Mani NRT) were analysed under pure torsion and pure bending situation by entering the model into a finite-element analysis package (ANSYS). Numerical results confirmed that mathematical models were a feasible method to analyse the mechanical properties and predict the stress and deformation for root canal instruments during root canal preparation. Mathematical and numerical model can be a suitable way to examine mechanical behaviours as a criterion of the instrument design and to predict the stress and strain experienced by the endodontic instruments during root canal preparation. © 2010 International Endodontic Journal.

  20. High-speed noncontacting instrumentation for jet engine testing

    NASA Astrophysics Data System (ADS)

    Scotto, M. J.; Eismeier, M. E.

    1980-03-01

    This paper discusses high-speed, noncontacting instrumentation systems for measuring the operating characteristics of jet engines. The discussion includes optical pyrometers for measuring blade surface temperatures, capacitance clearanceometers for measuring blade tip clearance and vibration, and optoelectronic systems for measuring blade flex and torsion. In addition, engine characteristics that mandate the use of such unique instrumentation are pointed out as well as the shortcomings of conventional noncontacting devices. Experimental data taken during engine testing are presented and recommendations for future development discussed.

  1. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    There has been an immense amount of visibility of doping issues on the international stage over the past 12 months with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods continuously being updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2015-01-01

    Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation as well as most recent information about physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration, necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). In reference of the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.

  3. The AAO fiber instrument data simulator

    NASA Astrophysics Data System (ADS)

    Goodwin, Michael; Farrell, Tony; Smedley, Scott; Heald, Ron; Heijmans, Jeroen; De Silva, Gayandhi; Carollo, Daniela

    2012-09-01

    The fiber instrument data simulator is an in-house software tool that simulates detector images of fiber-fed spectrographs developed by the Australian Astronomical Observatory (AAO). In addition to helping validate the instrument designs, the resulting simulated images are used to develop the required data reduction software. Example applications that have benefited from the tool usage are the HERMES and SAMI instrumental projects for the Anglo-Australian Telescope (AAT). Given the sophistication of these projects an end-to-end data simulator that accurately models the predicted detector images is required. The data simulator encompasses all aspects of the transmission and optical aberrations of the light path: from the science object, through the atmosphere, telescope, fibers, spectrograph and finally the camera detectors. The simulator runs under a Linux environment that uses pre-calculated information derived from ZEMAX models and processed data from MATLAB. In this paper, we discuss the aspects of the model, software, example simulations and verification.

  4. Pharmaceutical supply chain risk assessment in Iran using analytic hierarchy process (AHP) and simple additive weighting (SAW) methods.

    PubMed

    Jaberidoost, Mona; Olfat, Laya; Hosseini, Alireza; Kebriaeezadeh, Abbas; Abdollahi, Mohammad; Alaeddini, Mahdi; Dinarvand, Rassoul

    2015-01-01

    Pharmaceutical supply chain is a significant component of the health system in supplying medicines, particularly in countries where main drugs are provided by local pharmaceutical companies. No previous studies exist assessing risks and disruptions in pharmaceutical companies while assessing the pharmaceutical supply chain. Any risks affecting the pharmaceutical companies could disrupt supply medicines and health system efficiency. The goal of this study was the risk assessment in pharmaceutical industry in Iran considering process's priority, hazard and probability of risks. The study was carried out in 4 phases; risk identification through literature review, risk identification in Iranian pharmaceutical companies through interview with experts, risk analysis through a questionnaire and consultation with experts using group analytic hierarchy process (AHP) method and rating scale (RS) and risk evaluation of simple additive weighting (SAW) method. In total, 86 main risks were identified in the pharmaceutical supply chain with perspective of pharmaceutical companies classified in 11 classes. The majority of risks described in this study were related to the financial and economic category. Also financial management was found to be the most important factor for consideration. Although pharmaceutical industry and supply chain were affected by current political conditions in Iran during the study time, but half of total risks in the pharmaceutical supply chain were found to be internal risks which could be fixed by companies, internally. Likewise, political status and related risks forced companies to focus more on financial and supply management resulting in less attention to quality management.

  5. Behavior Analytic Contributions to the Study of Creativity

    ERIC Educational Resources Information Center

    Kubina, Richard M., Jr.; Morrison, Rebecca S.; Lee, David L.

    2006-01-01

    As researchers continue to study creativity, a behavior analytic perspective may provide new vistas by offering an additional perspective. Contemporary behavior analysis began with B. F. Skinner and offers a selectionist approach to the scientific investigation of creativity. Behavior analysis contributes to the study of creativity by…

  6. Oral health, socio-economic and home environmental factors associated with general and oral-health related quality of life and convergent validity of two instruments.

    PubMed

    Paula, Janice S; Meneghim, Marcelo C; Pereira, Antônio C; Mialhe, Fábio L

    2015-02-24

    The objective of this study was to evaluate the convergent validity between the domains of the Autoquestionnaire Qualité de Vie Enfant image (AUQUEI) and the Child Perceptions Questionnaire instrument (CPQ(11-14)) among schoolchildren and to assess the difference between socio-economic and clinical variables associated with their scores. An analytical cross-sectional study was conducted in Juiz de Fora, Minas Gerais, Brazil, with 515 schoolchildren aged 12 years from 22 public and private schools, selected with the use of a random multistage sampling design. They were clinically examined for dental caries experience (DMFT and dmft index) and orthodontic treatments needs (DAI index) and were asked to complete the Brazilian versions of Child Perception Questionnaire (CPQ(11-14)) and Autoquestionnaire Qualité de Vie Enfant image (AUQUEI). In addition, a questionnaire was sent to their parents inquiring about their socio-economic status and home characteristics. The convergent validity of the Brazilian versions of CPQ(11-14) and AUQUEI instruments was analyzed by Spearman's correlation coefficients. For comparison between the summarized scores of each questionnaire with regard to the schoolchildren's socio-environmental and clinical aspects the nonparametric Mann-Whitney was used at level of significance of 5%. The mean DMFT index was 1.09 and 125 (24.3%) children had orthodontic treatment needs (DAI ≥ 31). There was a similarity and a weak correlation between the scores of the domains of CPQ(11-14) and AUQUEI (r ranged between -0.006 and 0.0296). In addition, a significant difference was found between the scores of the two instruments according to the socio-economic variables (p < 0.05) and presence of teeth with carious lesions (p < 0.05). The general and oral health-related quality of life instruments AUQUEI and CPQ(11-14) were both found to be useful, and significant influence of socio-economic and clinical variables were detected with both instruments.

  7. Nonparametric instrumental regression with non-convex constraints

    NASA Astrophysics Data System (ADS)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  8. Cervical shaping in curved root canals: comparison of the efficiency of two endodontic instruments.

    PubMed

    Busquim, Sandra Soares Kühne; dos Santos, Marcelo

    2002-01-01

    The aim of this study was to determine the removal of dentin produced by number 25 (0.08) Flare files (Quantec Flare Series, Analytic Endodontics, Glendora, California, USA) and number 1 e 2 Gates-Glidden burs (Dentsply - Maillefer, Ballaigues, Switzerland), in the mesio-buccal and mesio-lingual root canals, respectively, of extracted human permanent inferior molars, by means of measuring the width of dentinal walls prior and after instrumentation. The obtained values were compared. Due to the multiple analyses of data, a nonparametric test was used, and the Kruskal-Wallis test was chosen. There was no significant difference between the instruments as to the removal of dentin in the 1st and 2nd millimeters. However, when comparing the performances of the instruments in the 3rd millimeter, Flare files promoted a greater removal than Gates-Glidden drills (p > 0.05). The analysis revealed no significant differences as to mesial wear, which demonstrates the similar behavior of both instruments. Gates-Glidden drills produced an expressive mesial detour in the 2nd and 3rd millimeters, which was detected trough a statistically significant difference in the wear of this region (p > 0.05). There was no statistically significant difference between mesial and lateral wear when Flare instruments were employed.

  9. Rapid and continuous analyte processing in droplet microfluidic devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strey, Helmut; Kimmerling, Robert; Bakowski, Tomasz

    The compositions and methods described herein are designed to introduce functionalized microparticles into droplets that can be manipulated in microfluidic devices by fields, including electric (dielectrophoretic) or magnetic fields, and extracted by splitting a droplet to separate the portion of the droplet that contains the majority of the microparticles from the part that is largely devoid of the microparticles. Within the device, channels are variously configured at Y- or T junctions that facilitate continuous, serial isolation and dilution of analytes in solution. The devices can be limited in the sense that they can be designed to output purified analytes thatmore » are then further analyzed in separate machines or they can include additional channels through which purified analytes can be further processed and analyzed.« less

  10. IR Instruments | CTIO

    Science.gov Websites

    Visitor's Computer Guidelines Network Connection Request Instruments Instruments by Telescope IR Instruments ‹› You are here CTIO Home » Astronomers » Instruments » IR Instruments IR Instruments Infrared Imaging ANDICAM - Ohio State Visual/IR Imager (on SMARTS 1.3m Telescope) OSIRIS - The Ohio State

  11. Dreams In Jungian Psychology: The use of Dreams as an Instrument For Research, Diagnosis and Treatment of Social Phobia.

    PubMed

    Khodarahimi, Siamak

    2009-10-01

    The significance of dreams has been explained in psychoanalysis, depth psychology and gestalt therapy. There are many guidelines in analytic psychology for dream interpretation and integration in clinical practice. The present study, based on the Jungian analytic model, incorporated dreams as an instrument for assessment of aetiology, the psychotherapy process and the outcome of treatment for social phobia within a clinical case study. This case study describes the use of dream analysis in treating a female youth with social phobia. The present findings supported the three stage paradigm efficiency in the Jungian model for dream working within a clinical setting, i.e. written details, reassembly with amplification and assimilation. It was indicated that childhood and infantile traumatic events, psychosexual development malfunctions, and inefficient coping skills for solving current life events were expressed in the patient's dreams. Dreams can reflect a patient's aetiology, needs, illness prognosis and psychotherapy outcome. Dreams are an instrument for the diagnosis, research and treatment of mental disturbances in a clinical setting.

  12. Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes

    USGS Publications Warehouse

    Nord, G.L.

    1982-01-01

    Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, microanalytical techniques of x-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen thickness at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criteria"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV from the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ?? 1.5 mol% endmember. ?? 1982.

  13. Subminiaturization for ERAST instrumentation (Environmental Research Aircraft and Sensor Technology)

    NASA Technical Reports Server (NTRS)

    Madou, Marc; Lowenstein, Max; Wegener, Steven

    1995-01-01

    We are focusing on the Argus as an example to demonstrate our philosophy on miniaturization of airborne analytical instruments for the study of atmospheric chemistry. Argus is a two channel, tunable-diode laser absorption spectrometer developed at NASA for the measurement of nitrogen dioxide (N2O) (4.5 micrometers) and ammonia (CH3) (3.3 micrometers) at the 0.1 parts per billion (ppb) level from the Perseus aircraft platform at altitudes up to 30 km. Although Argus' mass is down to 23 kg from the 197 kg Atlas, its predecessor, our goal is to design a next-generation subminiaturized instrument weighing less than 1 kg, measuring a few cm(exp 3) and able to eliminate dewars for cooling. Current designs enable use to make a small,inexpensive, monolithic spectrometer without the required sensitivity range. Further work is on its way to increase sensitivity. We are continuing to zero-base the technical approach in terms of the specifications for the given instrument. We are establishing a check list of questions to hone into the best micromachining approach and to superpose on the answers insights in scaling laws and flexible engineering designs to enable more relaxed tolerances for the smallest of the components.

  14. Trends & Controversies: Sociocultural Predictive Analytics and Terrorism Deterrence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; McGrath, Liam R.

    2011-08-12

    The use of predictive analytics to model terrorist rhetoric is highly instrumental in developing a strategy to deter terrorism. Traditional (e.g. Cold-War) deterrence methods are ineffective with terrorist groups such as al Qaida. Terrorists typically regard the prospect of death or loss of property as acceptable consequences of their struggle. Deterrence by threat of punishment is therefore fruitless. On the other hand, isolating terrorists from the community that may sympathize with their cause can have a decisive deterring outcome. Without the moral backing of a supportive audience, terrorism cannot be successfully framed as a justifiable political strategy and recruiting ismore » curtailed. Ultimately, terrorism deterrence is more effectively enforced by exerting influence to neutralize the communicative reach of terrorists.« less

  15. LC-IM-TOF Instrument Control & Data Visualization Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-05-12

    Liquid Chromatography-Ion Mobility-time of Flight Instrument Control and Data Visualization software is designed to control instrument voltages for the Ion Mobility drift tube. It collects and stores information collected from the Agilent TOF instrument and analyses/displays the ion intensity information acquired. The software interface can be split into 3 categories -- Instrument Settings/Controls, Data Acquisition, and Viewer. The Instrument Settings/Controls prepares the instrument for Data Acquisition. The Viewer contains common objects that are used by Instrument Settings/Controls and Data Acquisition. Intensity information is collected in 1 nanosec bins and separated by TOF pulses called scans. A collection of scans aremore » stored side by side making up an accumulation. In order for the computer to keep up with the stream of data, 30-50 accumulations are commonly summed into a single frame. A collection of frames makes up an experiment. The Viewer software then takes the experiment and presents the data in several possible ways, each frame can be viewed in TOF bins or m/z (mass to charge ratio). The experiment can be viewed frame by frame, merging several frames, or by viewing the peak chromatogram. The user can zoom into the data, export data, and/or animate frames. Additional features include calibration of the data and even post-processing multiplexed data.« less

  16. MOEMs-based new functionalities for future instrumentation in space

    NASA Astrophysics Data System (ADS)

    Zamkotsian, Frédéric; Liotard, Arnaud; Viard, Thierry; Costes, Vincent; Hébert, Philippe-Jean; Hinglais, Emmanuel; Villenave, Michel

    2017-11-01

    Micro-Opto-Electro-Mechanical Systems (MOEMS) could be key components in future generation of space instruments. In Earth Observation, Universe Observation and Planet Exploration, scientific return of the instruments must be optimized in future missions. MOEMS devices are based on the mature micro-electronics technology and in addition to their compactness, scalability, and specific task customization, they could generate new functions not available with current technologies. CNES has initiated a study with LAM and TAS for listing the new functions associated with several types of MEMS (programmable slits, programmable micro-diffraction gratings, micro-deformable mirrors). Instrumental applications are then derived and promising concepts are described.

  17. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    analytics have been used in organizations for a variety of reasons for quite some time; ranging from the simple (generating and understanding business analytics...process. understanding business analytics 3 How well these two components are orchestrated will determine the level of success an organization has in

  18. Comparison of icing cloud instruments for 1982-1983 icing season flight program

    NASA Technical Reports Server (NTRS)

    Ide, R. F.; Richter, G. P.

    1984-01-01

    A number of modern and old style liquid water content (LWC) and droplet sizing instruments were mounted on a DeHavilland DHC-6 Twin Otter and operated in natural icing clouds in order to determine their comparative operating characteristics and their limitations over a broad range of conditions. The evaluation period occurred during the 1982-1983 icing season from January to March 1983. Time histories of all instrument outputs were plotted and analyzed to assess instrument repeatability and reliability. Scatter plots were also generated for comparison of instruments. The measured LWC from four instruments differed by as much as 20 percent. The measured droplet size from two instruments differed by an average of three microns. The overall effort demonstrated the need for additional data, and for some means of calibrating these instruments to known standards.

  19. Epilepsy analytic system with cloud computing.

    PubMed

    Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei

    2013-01-01

    Biomedical data analytic system has played an important role in doing the clinical diagnosis for several decades. Today, it is an emerging research area of analyzing these big data to make decision support for physicians. This paper presents a parallelized web-based tool with cloud computing service architecture to analyze the epilepsy. There are many modern analytic functions which are wavelet transform, genetic algorithm (GA), and support vector machine (SVM) cascaded in the system. To demonstrate the effectiveness of the system, it has been verified by two kinds of electroencephalography (EEG) data, which are short term EEG and long term EEG. The results reveal that our approach achieves the total classification accuracy higher than 90%. In addition, the entire training time accelerate about 4.66 times and prediction time is also meet requirements in real time.

  20. A survey on platforms for big data analytics.

    PubMed

    Singh, Dilpreet; Reddy, Chandan K

    The primary purpose of this paper is to provide an in-depth analysis of different platforms available for performing big data analytics. This paper surveys different hardware platforms available for big data analytics and assesses the advantages and drawbacks of each of these platforms based on various metrics such as scalability, data I/O rate, fault tolerance, real-time processing, data size supported and iterative task support. In addition to the hardware, a detailed description of the software frameworks used within each of these platforms is also discussed along with their strengths and drawbacks. Some of the critical characteristics described here can potentially aid the readers in making an informed decision about the right choice of platforms depending on their computational needs. Using a star ratings table, a rigorous qualitative comparison between different platforms is also discussed for each of the six characteristics that are critical for the algorithms of big data analytics. In order to provide more insights into the effectiveness of each of the platform in the context of big data analytics, specific implementation level details of the widely used k-means clustering algorithm on various platforms are also described in the form pseudocode.

  1. Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements

    NASA Astrophysics Data System (ADS)

    Bakker, M.

    2017-12-01

    Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping are needed. Analytic element formulations exist for steady state and transient flow in layered aquifers and unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage, horizontal wells of poly-line shape including skin effect, sharp changes in subsurface properties, and surface water features with leaky beds. Input files for analytic element models are simple, short and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic element in parts of grid-based models where additional detail is needed. This presentation will give an overview of advanced flow features that can be modeled, many of which are implemented in free and open-source software.

  2. LISA Pathfinder Instrument Data Analysis

    NASA Technical Reports Server (NTRS)

    Guzman, Felipe

    2010-01-01

    LISA Pathfinder (LPF) is an ESA-launched demonstration mission of key technologies required for the joint NASA-ESA gravitational wave observatory in space, LISA. As part of the LPF interferometry investigations, analytic models of noise sources and corresponding noise subtraction techniques have been developed to correct for effects like the coupling of test mass jitter into displacement readout, and fluctuations of the laser frequency or optical pathlength difference. Ground testing of pre-flight hardware of the Optical Metrology subsystem is currently ongoing at the Albert Einstein Institute Hannover. In collaboration with NASA Goddard Space Flight Center, the LPF mission data analysis tool LTPDA is being used to analyze the data product of these tests. Furthermore, the noise subtraction techniques and in-flight experiment runs for noise characterization are being defined as part of the mission experiment master plan. We will present the data analysis outcome of preflight hardware ground tests and possible noise subtraction strategies for in-flight instrument operations.

  3. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  4. Analytical model of the optical vortex microscope.

    PubMed

    Płocinniczak, Łukasz; Popiołek-Masajada, Agnieszka; Masajada, Jan; Szatkowski, Mateusz

    2016-04-20

    This paper presents an analytical model of the optical vortex scanning microscope. In this microscope the Gaussian beam with an embedded optical vortex is focused into the sample plane. Additionally, the optical vortex can be moved inside the beam, which allows fine scanning of the sample. We provide an analytical solution of the whole path of the beam in the system (within paraxial approximation)-from the vortex lens to the observation plane situated on the CCD camera. The calculations are performed step by step from one optical element to the next. We show that at each step, the expression for light complex amplitude has the same form with only four coefficients modified. We also derive a simple expression for the vortex trajectory of small vortex displacements.

  5. Drawing cure: children's drawings as a psychoanalytic instrument.

    PubMed

    Wittmann, Barbara

    2010-01-01

    This essay deals with the special case of drawings as psychoanalytical instruments. It aims at a theoretical understanding of the specific contribution made by children's drawings as a medium of the psychical. In the influential play technique developed by Melanie Klein, drawing continuously interacts with other symptomatic (play) actions. Nonetheless, specific functions of drawing within the play technique can be identified. The essay will discuss four crucial aspects in-depth: 1) the strengthening of the analysis's recursivity associated with the graphic artifact; 2) the opening of the analytic process facilitated by drawing; 3) the creation of a genuinely graphic mode of producing meaning that allows the child to develop a "theory" of the workings of his own psychic apparatus; and 4) the new possibilities of symbolization associated with the latter. In contrast to classical definitions of the psychological instrument, the child's drawing is a weakly structured tool that does not serve to reproduce psychic processes in an artificial, controlled setting. The introduction of drawing into the psychoanalytic cure is by no means interested in replaying past events, but in producing events suited to effecting a transformation of the synchronic structures of the unconscious.

  6. Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.

    PubMed

    Valcárcel, Miguel

    2017-11-07

    This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.

  7. Evaporative concentration on a paper-based device to concentrate analytes in a biological fluid.

    PubMed

    Wong, Sharon Y; Cabodi, Mario; Rolland, Jason; Klapperich, Catherine M

    2014-12-16

    We report the first demonstration of using heat on a paper device to rapidly concentrate a clinically relevant analyte of interest from a biological fluid. Our technology relies on the application of localized heat to a paper strip to evaporate off hundreds of microliters of liquid to concentrate the target analyte. This method can be used to enrich for a target analyte that is present at low concentrations within a biological fluid to enhance the sensitivity of downstream detection methods. We demonstrate our method by concentrating the tuberculosis-specific glycolipid, lipoarabinomannan (LAM), a promising urinary biomarker for the detection and diagnosis of tuberculosis. We show that the heat does not compromise the subsequent immunodetectability of LAM, and in 20 min, the tuberculosis biomarker was concentrated by nearly 20-fold in simulated urine. Our method requires only 500 mW of power, and sample flow is self-driven via capillary action. As such, our technology can be readily integrated into portable, battery-powered, instrument-free diagnostic devices intended for use in low-resource settings.

  8. Predicting playing frequencies for clarinets: A comparison between numerical simulations and simplified analytical formulas.

    PubMed

    Coyle, Whitney L; Guillemain, Philippe; Kergomard, Jean; Dalmont, Jean-Pierre

    2015-11-01

    When designing a wind instrument such as a clarinet, it can be useful to be able to predict the playing frequencies. This paper presents an analytical method to deduce these playing frequencies using the input impedance curve. Specifically there are two control parameters that have a significant influence on the playing frequency, the blowing pressure and reed opening. Four effects are known to alter the playing frequency and are examined separately: the flow rate due to the reed motion, the reed dynamics, the inharmonicity of the resonator, and the temperature gradient within the clarinet. The resulting playing frequencies for the first register of a particular professional level clarinet are found using the analytical formulas presented in this paper. The analytical predictions are then compared to numerically simulated results to validate the prediction accuracy. The main conclusion is that in general the playing frequency decreases above the oscillation threshold because of inharmonicity, then increases above the beating reed regime threshold because of the decrease of the flow rate effect.

  9. Mars2020 Entry, Descent, and Landing Instrumentation (MEDLI2): Science Objectives and Instrument Requirements

    NASA Technical Reports Server (NTRS)

    Bose, Deepak; White, Todd; Schoenenberger, Mark; Karlgaard, Chris; Wright, Henry

    2015-01-01

    NASAs exploration and technology roadmaps call for capability advancements in Mars entry, descent, and landing (EDL) systems to enable increased landed mass, a higher landing precision, and a wider planetary access. It is also recognized that these ambitious EDL performance goals must be met while maintaining a low mission risk in order to pave the way for future human missions. As NASA is engaged in developing new EDL systems and technologies via testing at Earth, instrumentation of existing Mars missions is providing valuable engineering data for performance improvement, risk reduction, and an improved definition of entry loads and environment. The most notable recent example is the Mars Entry, Descent and Landing Instrument (MEDLI) suite hosted by Mars Science Laboratory for its entry in Aug 2012. The MEDLI suite provided a comprehensive dataset for Mars entry aerodynamics, aerothermodynamics and thermal protection system (TPS) performance. MEDLI data has since been used for unprecedented reconstruction of aerodynamic drag, vehicle attitude, in-situ atmospheric density, aerothermal heating, and transition to turbulence, in-depth TPS performance and TPS ablation. [1,2] In addition to validating predictive models, MEDLI data has demonstrated extra margin available in the MSL forebody TPS, which can potentially be used to reduce vehicle parasitic mass. The presentation will introduce a follow-on MEDLI instrumentation suite (called MEDLI2) that is being developed for Mars-2020 mission. MEDLI2 has an enhanced scope that includes backshell instrumentation, a wider forebody coverage, and instruments that specifically target supersonic aerodynamics. Similar to MEDLI, MEDLI2 uses thermal plugs with embedded thermocouples and ports through the TPS to measure surface pressure. MEDLI2, however, also includes heat flux sensors in the backshell and a low range pressure transducer to measure afterbody pressure.

  10. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between themore » two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition is a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is

  11. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.

  12. Vision though afocal instruments: generalized magnification and eye-instrument interaction

    NASA Astrophysics Data System (ADS)

    Harris, William F.; Evans, Tanya

    2018-04-01

    In Gaussian optics all observers experience the same magnification, the instrument's angular magnification, when viewing distant objects though a telescope or other afocal instruments. However, analysis in linear optics shows that this is not necessarily so in the presence of astigmatism. Because astigmatism may distort and rotate images it is appropriate to work with generalized angular magnification represented by a 2 × 2 matrix. An expression is derived for the generalized magnification for an arbitrary eye looking through an arbitrary afocal instrument. With afocal instruments containing astigmatic refracting elements not all eyes experience the same generalized magnification; there is interaction between eye and instrument. Eye-instrument interaction may change as the instrument is rotated about its longitudinal axis, there being no interaction in particular orientations. A simple numerical example is given. For sake of completeness, expressions for generalized magnification are also presented in the case of instruments that are not afocal and objects that are not distant.

  13. Integrated polymerase chain reaction/electrophoresis instrument

    DOEpatents

    Andresen, Brian D.

    2000-01-01

    A new approach and instrument for field identification of micro-organisms and DNA fragments using a small and disposable device containing integrated polymerase chain reaction (PCR) enzymatic reaction wells, attached capillary electrophoresis (CE) channels, detectors, and read-out all on/in a small hand-held package. The analysis instrument may be made inexpensively, for example, of plastic, and thus is disposable, which minimizes cross contamination and the potential for false positive identification between samples. In addition, it is designed for multiple users with individual applications. The integrated PCR/CE is manufactured by the PCR well and CE channels are "stamped" into plastic depressions where conductive coatings are made in the wells and ends of the CE microchannels to carry voltage and current to heat the PCR reaction mixtures and simultaneously draw DNA bands up the CE channels. Light is transmitted through the instrument at appropriate points and detects PCR bands and identifies DNA fragments by size (retention time) and quantifies each by the amount of light generated as each phototransistor positioned below each CE channel detects a passing band. The instrument is so compact that at least 100 PCR/CE reactions/analyses can be performed easily on one detection device.

  14. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying Video Analytics in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third party and in-house built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which serves to Automatically detect a "Visual Event", and EventBrowser, which serves to Display & Peruse of "Visual Details" captured at the "Visual Event". To deal with Open architecture as well as with Closed architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  15. Development of an Analytical Method for Explosive Residues in Soil,

    DTIC Science & Technology

    1987-06-01

    confirm peak identities. The eluent for both columns should be 50:50 methanol-water. The elution time for all the analytes of interest on the LC -18 column...nitrate at 1.77 min for LC -8, 1.73 min for LC -DP, and 1.80 for LC -1. 23 Table A2. Instrument calibration results for HMX. Concentration Solution Soil* Peak ...LCT 12 AUG 2 0 W 1M 2j TNT Owl ""r’ L ,,,O MRYX TN L DNS 2 HMX 0 12 LC -CN 110 KMX S 8 TETRYL 6 RDXW 4 DNB and TNB 0 Approved for public release

  16. Instruments measuring spirituality in clinical research: a systematic review.

    PubMed

    Monod, Stéfanie; Brennan, Mark; Rochat, Etienne; Martin, Estelle; Rochat, Stéphane; Büla, Christophe J

    2011-11-01

    Numerous instruments have been developed to assess spirituality and measure its association with health outcomes. This study's aims were to identify instruments used in clinical research that measure spirituality; to propose a classification of these instruments; and to identify those instruments that could provide information on the need for spiritual intervention. A systematic literature search in MEDLINE, CINHAL, PsycINFO, ATLA, and EMBASE databases, using the terms "spirituality" and "adult$," and limited to journal articles was performed to identify clinical studies that used a spiritual assessment instrument. For each instrument identified, measured constructs, intended goals, and data on psychometric properties were retrieved. A conceptual and a functional classification of instruments were developed. Thirty-five instruments were retrieved and classified into measures of general spirituality (N = 22), spiritual well-being (N = 5), spiritual coping (N = 4), and spiritual needs (N = 4) according to the conceptual classification. Instruments most frequently used in clinical research were the FACIT-Sp and the Spiritual Well-Being Scale. Data on psychometric properties were mostly limited to content validity and inter-item reliability. According to the functional classification, 16 instruments were identified that included at least one item measuring a current spiritual state, but only three of those appeared suitable to address the need for spiritual intervention. Instruments identified in this systematic review assess multiple dimensions of spirituality, and the proposed classifications should help clinical researchers interested in investigating the complex relationship between spirituality and health. Findings underscore the scarcity of instruments specifically designed to measure a patient's current spiritual state. Moreover, the relatively limited data available on psychometric properties of these instruments highlight the need for additional research to

  17. Toxicologic evaluation of analytes from Tank 241-C-103

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC, and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team`s objectives would be to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to establish toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise,more » including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. this list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1, 3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propane nitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.« less

  18. Analytical fingerprint for tantalum ores from African deposits

    NASA Astrophysics Data System (ADS)

    Melcher, F.; Graupner, T.; Sitnikova, M.; Oberthür, T.; Henjes-Kunst, F.; Gäbler, E.; Rantitsch, G.

    2009-04-01

    Illegal mining of gold, diamonds, copper, cobalt and, in the last decade, "coltan" has fuelled ongoing armed conflicts and civil war in a number of African countries. Following the United Nations initiative to fingerprint the origin of conflict materials and to develop a traceability system, our working group is investigating "coltan" (i.e. columbite-tantalite) mineralization especially in Africa, also within the wider framework of establishing certified trading chains (CTC). Special attention is directed towards samples from the main Ta-Nb-Sn provinces in Africa: DR Congo, Rwanda, Mozambique, Ethiopia, Egypt and Namibia. The following factors are taken into consideration in a methodological approach capable of distinguishing the origin of tantalum ores and concentrates with the utmost probability: (1) Quality and composition of coltan concentrates vary considerably. (2) Mineralogical and chemical compositions of Ta-Nb ores are extremely complex due to the wide range of the columbite-tantalite solid solution series and its ability to incorporate many additional elements. (3) Coltan concentrates may contain a number of other tantalum-bearing minerals besides columbite-tantalite. In our approach, coltan concentrates are analyzed in a step-by-step mode. State-of-the-art analytical tools employed are automated scanning electron microscopy (Mineral Liberation Analysis; MLA), electron microprobe analysis (major and trace elements), laser ablation-ICP-MS (trace elements, isotopes), and TIMS (U-Pb dating). Mineral assemblages in the ore concentrates, major and trace element concentration patterns, and zoning characteristics in the different pegmatites from Africa distinctly differ from each other. Chondrite-normalized REE distribution patterns vary significantly between columbite, tantalite, and microlite, and also relative to major element compositions of columbites. Some locations are characterized by low REE concentrations, others are highly enriched. Samples with

  19. Improvement of analytical dynamic models using modal test data

    NASA Technical Reports Server (NTRS)

    Berman, A.; Wei, F. S.; Rao, K. V.

    1980-01-01

    A method developed to determine maximum changes in analytical mass and stiffness matrices to make them consistent with a set of measured normal modes and natural frequencies is presented. The corrected model will be an improved base for studies of physical changes, boundary condition changes, and for prediction of forced responses. The method features efficient procedures not requiring solutions of the eigenvalue problem, and the ability to have more degrees of freedom than the test data. In addition, modal displacements are obtained for all analytical degrees of freedom, and the frequency dependence of the coordinate transformations is properly treated.

  20. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25 years history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin layer as well as high pressure liquid chromatography. As from now mass spectroscopy will also be available as analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure in TCM drugs like authenticity, botanical nomenclature, variability of plant species and parts as well as processing are pointed out and possible ways to overcome them are sketched. © 2016 S. Karger GmbH, Freiburg.

  1. Empirically Optimized Flow Cytometric Immunoassay Validates Ambient Analyte Theory

    PubMed Central

    Parpia, Zaheer A.; Kelso, David M.

    2010-01-01

    Ekins’ ambient analyte theory predicts, counter intuitively, that an immunoassay’s limit of detection can be improved by reducing the amount of capture antibody. In addition, it also anticipates that results should be insensitive to the volume of sample as well as the amount of capture antibody added. The objective of this study is to empirically validate all of the performance characteristics predicted by Ekins’ theory. Flow cytometric analysis was used to detect binding between a fluorescent ligand and capture microparticles since it can directly measure fractional occupancy, the primary response variable in ambient analyte theory. After experimentally determining ambient analyte conditions, comparisons were carried out between ambient and non-ambient assays in terms of their signal strengths, limits of detection, and their sensitivity to variations in reaction volume and number of particles. The critical number of binding sites required for an assay to be in the ambient analyte region was estimated to be 0.1VKd. As predicted, such assays exhibited superior signal/noise levels and limits of detection; and were not affected by variations in sample volume and number of binding sites. When the signal detected measures fractional occupancy, ambient analyte theory is an excellent guide to developing assays with superior performance characteristics. PMID:20152793

  2. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference betweenmore » usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI and interested in visual analytics will find this information useful as well as a discussion on changes that need to be made to current HCI practices to make them more

  3. 26 CFR 1.163-7 - Deduction for OID on certain debt instruments.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... difference as an adjustment to the issuer's interest expense for the original and additional debt instruments... over the term of the instrument using constant yield principles. (2) Positive adjustment. If the difference is positive (that is, the holder pays more than the adjusted issue price of the original debt...

  4. Evaluation of FUS-2000 urine analyzer: analytical properties and particle recognition.

    PubMed

    Beňovská, Miroslava; Wiewiorka, Ondřej; Pinkavová, Jana

    This study evaluates the performance of microscopic part of a hybrid analyzer FUS-2000 (Dirui Industrial Co., Changchun, China), its analytical properties and particle recognition. The evaluation of trueness, repeatability, detection limit, carry-over, linearity range and analytical stability was performed according to Dirui protocol guidelines designed by Dirui Company to guarantee the quality of the instrument. Trueness for low, medium and high-value concentrations was calculated with bias of 15.5, 4.7 and -6.6%, respectively. Detection limit of 5 Ery/μl was confirmed. Coefficient of variation of 11.0, 5.2 and 3.8% was measured for within-run repeatability of low, medium and high concentration. Between-run repeatability for daily quality control had coefficient of variation of 3.0%. Carry-over did not exceed 0.05%. Linearity was confirmed for range of 0-16,000 particles/μl (R 2  = 0.9997). The analytical stability had coefficient of variation of 4.3%. Out of 1258 analyzed urine samples, 362 positive were subjected to light microscopy urine sediment analysis and compared to the analyzer results. Cohen's kappa coefficients were calculated to express the concordance. Squared kappa coefficient was 0.927 (red blood cells), 0.888 (white blood cells), 0.908 (squamous epithelia), 0.634 (transitional epithelia), 0.628 (hyaline casts), 0.843 (granular casts) and 0.623 (bacteria). Single kappa coefficients were 0.885 (yeasts) and 0.756 (crystals), respectively. Aforementioned results show good analytical performance of the analyzer and tight agreement with light microscopy of urine sediment.

  5. Aversive pavlovian responses affect human instrumental motor performance.

    PubMed

    Rigoli, Francesco; Pavone, Enea Francesco; Pezzulo, Giovanni

    2012-01-01

    IN NEUROSCIENCE AND PSYCHOLOGY, AN INFLUENTIAL PERSPECTIVE DISTINGUISHES BETWEEN TWO KINDS OF BEHAVIORAL CONTROL: instrumental (habitual and goal-directed) and Pavlovian. Understanding the instrumental-Pavlovian interaction is fundamental for the comprehension of decision-making. Animal studies (as those using the negative auto-maintenance paradigm), have demonstrated that Pavlovian mechanisms can have maladaptive effects on instrumental performance. However, evidence for a similar effect in humans is scarce. In addition, the mechanisms modulating the impact of Pavlovian responses on instrumental performance are largely unknown, both in human and non-human animals. The present paper describes a behavioral experiment investigating the effects of Pavlovian conditioned responses on performance in humans, focusing on the aversive domain. Results showed that Pavlovian responses influenced human performance, and, similar to animal studies, could have maladaptive effects. In particular, Pavlovian responses either impaired or increased performance depending on modulator variables such as threat distance, task controllability, punishment history, amount of training, and explicit punishment expectancy. Overall, these findings help elucidating the computational mechanisms underlying the instrumental-Pavlovian interaction, which might be at the base of apparently irrational phenomena in economics, social behavior, and psychopathology.

  6. Aversive Pavlovian Responses Affect Human Instrumental Motor Performance

    PubMed Central

    Rigoli, Francesco; Pavone, Enea Francesco; Pezzulo, Giovanni

    2012-01-01

    In neuroscience and psychology, an influential perspective distinguishes between two kinds of behavioral control: instrumental (habitual and goal-directed) and Pavlovian. Understanding the instrumental-Pavlovian interaction is fundamental for the comprehension of decision-making. Animal studies (as those using the negative auto-maintenance paradigm), have demonstrated that Pavlovian mechanisms can have maladaptive effects on instrumental performance. However, evidence for a similar effect in humans is scarce. In addition, the mechanisms modulating the impact of Pavlovian responses on instrumental performance are largely unknown, both in human and non-human animals. The present paper describes a behavioral experiment investigating the effects of Pavlovian conditioned responses on performance in humans, focusing on the aversive domain. Results showed that Pavlovian responses influenced human performance, and, similar to animal studies, could have maladaptive effects. In particular, Pavlovian responses either impaired or increased performance depending on modulator variables such as threat distance, task controllability, punishment history, amount of training, and explicit punishment expectancy. Overall, these findings help elucidating the computational mechanisms underlying the instrumental-Pavlovian interaction, which might be at the base of apparently irrational phenomena in economics, social behavior, and psychopathology. PMID:23060738

  7. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  8. Simultaneous control of multiple instruments at the Advanced Technology Solar Telescope

    NASA Astrophysics Data System (ADS)

    Johansson, Erik M.; Goodrich, Bret

    2012-09-01

    The Advanced Technology Solar Telescope (ATST) is a 4-meter solar observatory under construction at Haleakala, Hawaii. The simultaneous use of multiple instruments is one of the unique capabilities that makes the ATST a premier ground based solar observatory. Control of the instrument suite is accomplished by the Instrument Control System (ICS), a layer of software between the Observatory Control System (OCS) and the instruments. The ICS presents a single narrow interface to the OCS and provides a standard interface for the instruments to be controlled. It is built upon the ATST Common Services Framework (CSF), an infrastructure for the implementation of a distributed control system. The ICS responds to OCS commands and events, coordinating and distributing them to the various instruments while monitoring their progress and reporting the status back to the OCS. The ICS requires no specific knowledge about the instruments. All information about the instruments used in an experiment is passed by the OCS to the ICS, which extracts and forwards the parameters to the appropriate instrument controllers. The instruments participating in an experiment define the active instrument set. A subset of those instruments must complete their observing activities in order for the experiment to be considered complete and are referred to as the must-complete instrument set. In addition, instruments may participate in eavesdrop mode, outside of the control of the ICS. All instrument controllers use the same standard narrow interface, which allows new instruments to be added without having to modify the interface or any existing instrument controllers.

  9. An instrument to assess subjective task value beliefs regarding the decision to pursue postgraduate training.

    PubMed

    Hagemeier, Nicholas E; Murawski, Matthew M

    2014-02-12

    To develop and validate an instrument to assess subjective ratings of the perceived value of various postgraduate training paths followed using expectancy-value as a theoretical framework; and to explore differences in value beliefs across type of postgraduate training pursued and type of pharmacy training completed prior to postgraduate training. A survey instrument was developed to sample 4 theoretical domains of subjective task value: intrinsic value, attainment value, utility value, and perceived cost. Retrospective self-report methodology was employed to examine respondents' (N=1,148) subjective task value beliefs specific to their highest level of postgraduate training completed. Exploratory and confirmatory factor analytic techniques were used to evaluate and validate value belief constructs. Intrinsic, attainment, utility, cost, and financial value constructs resulted from exploratory factor analysis. Cross-validation resulted in a 26-item instrument that demonstrated good model fit. Differences in value beliefs were noted across type of postgraduate training pursued and pharmacy training characteristics. The Postgraduate Training Value Instrument demonstrated evidence of reliability and construct validity. The survey instrument can be used to assess value beliefs regarding multiple postgraduate training options in pharmacy and potentially inform targeted recruiting of individuals to those paths best matching their own value beliefs.

  10. Instrumental variables and Mendelian randomization with invalid instruments

    NASA Astrophysics Data System (ADS)

    Kang, Hyunseung

    Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or an intervention on an outcome of interest. The IV method relies on having a valid instrument, a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially those that satisfy (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are only identified as long as less than 50% of instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present both in simulation studies as well as in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. We propose an estimator along with

  11. Biomechanical testing of circumferential instrumentation after cervical multilevel corpectomy.

    PubMed

    Hartmann, Sebastian; Thomé, Claudius; Keiler, Alexander; Fritsch, Helga; Hegewald, Aldemar Andres; Schmölz, Werner

    2015-12-01

    Biomechanical investigation. This study describes ex vivo evaluation of the range of motion (ROM) to characterize the stability and need for additional dorsal fixation after cervical single-level, two-level or multilevel corpectomy (CE) to elucidate biomechanical differences between anterior-only and supplemental dorsal instrumentation. Twelve human cervical cadaveric spines were loaded in a spine tester with pure moments of 1.5 Nm in lateral bending (LB), flexion/extension (FE), and axial rotation (AR), followed by two cyclic loading periods for three-level corpectomies. After each cyclic loading session, flexibility tests were performed for anterior-only instrumentation (group_1, six specimens) and circumferential instrumentation (group_2, six specimens). The flexibility tests for all circumferential instrumentations showed a significant decrease in ROM in comparison with the intact state and anterior-only instrumentations. In comparison with the intact state, supplemental dorsal instrumentation after three-level CE reduced the ROM to 12% (±10%), 9% (±12%), and 22% (±18%) in LB, FE, and AR, respectively. The anterior-only construct outperformed the intact state only in FE, with a significant ROM reduction to 57% (±35 %), 60% (±27%), and 62% (±35%) for one-, two- and three-level CE, respectively. The supplemental dorsal instrumentation provided significantly more stability than the anterior-only instrumentation regardless of the number of levels resected and the direction of motion. After cyclic loading, the absolute differences in stability between the two instrumentations remained significant while both instrumentations showed a comparable increase of ROM after cyclic loading. The large difference in the absolute ROM of anterior-only compared to circumferential instrumentations supports a dorsal support in case of three-level approaches.

  12. Development of a versatile laser light scattering instrument

    NASA Astrophysics Data System (ADS)

    Meyer, William V.; Ansari, Rafat R.

    1990-10-01

    A versatile laser light scattering (LLS) instrument is developed for use in microgravity to measure microscopic particles of 30 A to above 3 microns. Since it is an optical technique, LLS does not affect the sample being studied. A LLS instrument built from modules allows several configurations, each optimized for a particular experiment. The multiangle LLS instrument can be mounted in the rack in the Space Shuttle and on Space Station Freedom. It is possible that a Space Shuttle glove-box and a lap-top computer containing a correlator card can be used to perform a number of experiments and to demonstrate the technology needed for more elaborate investigations. This offers simple means of flying a great number of experiments without the additional requirements of full-scale flight hardware experiments.

  13. Development of a versatile laser light scattering instrument

    NASA Technical Reports Server (NTRS)

    Meyer, William V.; Ansari, Rafat R.

    1990-01-01

    A versatile laser light scattering (LLS) instrument is developed for use in microgravity to measure microscopic particles of 30 A to above 3 microns. Since it is an optical technique, LLS does not affect the sample being studied. A LLS instrument built from modules allows several configurations, each optimized for a particular experiment. The multiangle LLS instrument can be mounted in the rack in the Space Shuttle and on Space Station Freedom. It is possible that a Space Shuttle glove-box and a lap-top computer containing a correlator card can be used to perform a number of experiments and to demonstrate the technology needed for more elaborate investigations. This offers simple means of flying a great number of experiments without the additional requirements of full-scale flight hardware experiments.

  14. Recent Advances in Analytical Pyrolysis to Investigate Organic Materials in Heritage Science.

    PubMed

    Degano, Ilaria; Modugno, Francesca; Bonaduce, Ilaria; Ribechini, Erika; Colombini, Maria Perla

    2018-06-18

    The molecular characterization of organic materials in samples from artworks and historical objects traditionally entailed qualitative and quantitative analyses by HPLC and GC. Today innovative approaches based on analytical pyrolysis enable samples to be analysed without any chemical pre-treatment. Pyrolysis, which is often considered as a screening technique, shows previously unexplored potential thanks to recent instrumental developments. Organic materials that are macromolecular in nature, or undergo polymerization upon curing and ageing can now be better investigated. Most constituents of paint layers and archaeological organic substances contain major insoluble and chemically non-hydrolysable fractions that are inaccessible to GC or HPLC. To date, molecular scientific investigations of the organic constituents of artworks and historical objects have mostly focused on the minor constituents of the sample. This review presents recent advances in the qualitative and semi-quantitative analyses of organic materials in heritage objects based on analytical pyrolysis coupled with mass spectrometry. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Pancreatitis Quality of Life Instrument: Development of a new instrument

    PubMed Central

    Bova, Carol; Barton, Bruce; Hartigan, Celia

    2014-01-01

    Objectives: The goal of this project was to develop the first disease-specific instrument for the evaluation of quality of life in chronic pancreatitis. Methods: Focus groups and interview sessions were conducted, with chronic pancreatitis patients, to identify items felt to impact quality of life which were subsequently formatted into a paper-and-pencil instrument. This instrument was used to conduct an online survey by an expert panel of pancreatologists to evaluate its content validity. Finally, the modified instrument was presented to patients during precognitive testing interviews to evaluate its clarity and appropriateness. Results: In total, 10 patients were enrolled in the focus groups and interview sessions where they identified 50 items. Once redundant items were removed, the 40 remaining items were made into a paper-and-pencil instrument referred to as the Pancreatitis Quality of Life Instrument. Through the processes of content validation and precognitive testing, the number of items in the instrument was reduced to 24. Conclusions: This marks the development of the first disease-specific instrument to evaluate quality of life in chronic pancreatitis. It includes unique features not found in generic instruments (economic factors, stigma, and spiritual factors). Although this marks a giant step forward, psychometric evaluation is still needed prior to its clinical use. PMID:26770703

  16. Cordless Instruments

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Black & Decker's new cordless lightweight battery powered precision instruments, adapted from NASA's Apollo Lunar Landing program, have been designed to give surgeons optimum freedom and versatility in the operating room. Orthopedic instrument line includes a drill, a driver/reamer and a sagittal saw. All provide up to 20 minutes on a single charge. Power pack is the instrument's handle which is removable for recharging. Microprocessor controlled recharging unit can recharge two power packs together in 30 minutes. Instruments can be gas sterilized, steam-sterilized in an autoclave or immersed for easy cleaning.

  17. Analytical method for tributyltin and triphenyltin contained in household products-preparing for the revision of authorized analytical method-.

    PubMed

    Nakashima, Harunobu; Tomiyama, Ken-Ichi; Kawakami, Tsuyoshi; Isama, Kazuo

    2010-07-01

    In preparing for the revision of the authorized analytical method for tributyltin (TBT) and triphenyltin (TPT), which are banned from using according to the "Act on the Control of Household Products Containing Harmful Substances", an examination was conducted on the detection method of these substances using gas chromatography/mass spectrometry (GC/MS), after derivatizing them (ethyl-derivatizing method and hydrogen-derivatizing method). Ethyl-derivatized compounds had stability, which enabled the detection of TPT with a higher sensitivity. In addition, a preparation suitable for the following analytical objects was established: (1) textile products, (2) water-based products (such as water-based paint), (3) oil-based products (such as wax), and (4) adhesives. Addition-recovery experiments were conducted using the prescribed pretreatment method, when each surrogate substances (TBT-d27, TPT-d15) were added and the data were corrected, good recovery rates (94.5-118.6% in TBT, and 86.6-110.1% in TPT) were obtained. When TBT and TPT in 31 commercially available products were analyzed based on the developed analytical method, an adhesive showed 13.2 microg/g of TBT content, which exceeded the regulatory criterion (1 microg/g as tin). Next, when the same products with different manufacturing date were analyzed, TBT (10.2-10.8 microg/g), which exceeded the regulatory criterion, was detected in 4 products among 8 products, and simultaneously, a high concentration (over 1000 microg/g) of dibutyltin (DBT) was detected. It was suggested that TBT as an impurity of DBT remained, and the manufacturer chose the voluntary recall of the product. The new method is considered sufficiently applicable as a revised method for the conventionally authorized method.

  18. Instrument Performance Monitoring at Gemini North

    NASA Astrophysics Data System (ADS)

    Emig, Kimberly; Pohlen, M.; Chene, A.

    2014-01-01

    An instrument performance monitoring (IPM) project at the Gemini North Observatory evaluates the delivered throughput and sensitivity of, among other instruments, the Near-Infrared Integral Field Spectrometer (NIFS), the Gemini Near-Infrared Spectrograph (GNIRS), and the Gemini Multi-Object Spectrograph (GMOS-N). Systematic observations of standard stars allow the quality of the instruments and mirror to be assessed periodically. An automated pipeline has been implemented to process and analyze data obtained with NIFS, GNIRS cross-dispersed (XD) and long slit (LS) modes, and GMOS (photometry and spectroscopy). We focus the discussion of this poster on NIFS and GNIRS. We present the spectroscopic throughput determined for ZJHK bands on NIFS, the XJHKLM band for GNIRS XD mode and the K band for GNIRS LS. Additionally, the sensitivity is available for the JHK bands in NIFS and GNIRS XD, and for the K band in GNIRS LS. We consider data taken as early as March 2011. Furthermore, the pipeline setup and the methods used to determine throughput and sensitivity are described.

  19. Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS

    NASA Astrophysics Data System (ADS)

    Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas

    2014-05-01

    Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can be best described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('The total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields like ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes and well established systems could be analyzed with improved techniques, especially using multi collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in case of S, isotope ratio measurements at high mass resolution could be achieved at much lower S concentrations with ICP-MS as compared to IRMS, still keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS, so far, with - besides Sr, Pb and U - Ca, Mg, Cd, Li, Hg, Si, Ge and B being the most prominent and considerably pushing the limits of plasma based mass spectrometry also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility to achieve isotopic information on high spatial (µm-range) and temporal scale (in case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected

  20. Lab-on-a-bubble: direct and indirect assays with portable Raman instrumentation

    NASA Astrophysics Data System (ADS)

    Carron, Keith; Schmit, Virginia; Scott, Brandon; Martoglio, Richard

    2012-10-01

    Lab-on-a-Bubble (LoB) is a new method for SERS (Surface Enhanced Raman Scattering) assays that combines separationand concentration of the assay results. A direct LoB assay is comprised of gold nanoparticles coupled directly to the ~30 μm diameter buoyant silica bubble. The direct LoB method was evaluated with cyanide and 5,5'-dithiobis(2-nitrobenzoic acid) (DTNB). An indirect assay uses the same ~ 30 μm diameter buoyant silica bubble and a silica coated SERS reporter. Both the bubble and SERS reporter are coated with a coupling agent for the analyte. The assay measures the amount of SERS reporter coupled to the bubble through a sandwich created by the analyte. The couling agent could consist of an immunological coupling agent (antibody) or a nucleic acid coupling agent (single strand DNA). The indirect LoB method was examined with Cholera toxin (CT) and antibodies against the β subunit. An LOD of ~ 170 pptrillion was measured for cyanide and a limit of detection of 1100 ng was found for CT. Instrumentation for the assay and a novel technique of dynamic SERS (DSERS) will also be discussed. The instrument is a small hand-held Raman device called the CBEx (Chemical Biological Explosive) with a novel raster system to detect heterogeneous or light sensitive materials. DSERS is a mathematical algorithm which eliminates background interference in SERS measurements with colloidal nanoparticles.

  1. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  2. Information Management Systems in the Undergraduate Instrumental Analysis Laboratory.

    ERIC Educational Resources Information Center

    Merrer, Robert J.

    1985-01-01

    Discusses two applications of Laboratory Information Management Systems (LIMS) in the undergraduate laboratory. They are the coulometric titration of thiosulfate with electrogenerated triiodide ion and the atomic absorption determination of calcium using both analytical calibration curve and standard addition methods. (JN)

  3. AC instrumentation amplifier for bioimpedance measurements.

    PubMed

    Pallás-Areny, R; Webster, J G

    1993-08-01

    We analyze the input impedance and CMRR requirements for an amplifier for bioimpedance measurements when considering the capacitive components of the electrode-skin contact impedance. We describe an ac-coupled instrumentation amplifier (IA) that, in addition to fulfilling those requirements, both provides interference and noise reduction, and yields a zero phase shift over a wide frequency band without using broadband op amps.

  4. The vertical accelerometer, a new instrument for air navigation

    NASA Technical Reports Server (NTRS)

    Laboccetta, Letterio

    1923-01-01

    This report endeavors to show the possibility of determining the rate of acceleration and the advantage of having such an accelerometer in addition to other aviation instruments. Most of the discussions concern balloons.

  5. Analytical Nanoscience and Nanotechnology: Where we are and where we are heading.

    PubMed

    Laura Soriano, María; Zougagh, Mohammed; Valcárcel, Miguel; Ríos, Ángel

    2018-01-15

    The main aim of this paper is to offer an objective and critical overview of the situation and trends in Analytical Nanoscience and Nanotechnology (AN&N), which is an important break point in the evolution of Analytical Chemistry in the XXI century as they were computers and instruments in the second half of XX century. The first part of this overview is devoted to provide a general approach to AN&N by describing the state of the art of this recent topic, being the importance of it also emphasized. Secondly, particular but very relevant trends in this topic are outlined: the analysis of the nanoworld, the so "third way" in AN&N, the growing importance of bioanalysis, the evaluation of both nanosensors and nanosorbents, the impact of AN&N in bioimaging and in nanotoxicological studies, as well as the crucial importance of reliability of the nanotechnological processes and results for solving real analytical problems in the frame of Social Responsibility (SR) of science and technology. Several reflections are included at the end of this overview written as a bird's eye view, which is not an easy task for experts in AN&N. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Comparison of the analytical capabilities of the BAC Datamaster and Datamaster DMT forensic breath testing devices.

    PubMed

    Glinn, Michele; Adatsi, Felix; Curtis, Perry

    2011-11-01

    The State of Michigan uses the Datamaster as an evidential breath testing device. The newest version, the DMT, will replace current instruments in the field as they are retired from service. The Michigan State Police conducted comparison studies to test the analytical properties of the new instrument and to evaluate its response to conditions commonly cited in court defenses. The effects of mouth alcohol, objects in the mouth, and radiofrequency interference on paired samples from drinking subjects were assessed on the DMT. The effects of sample duration and chemical interferents were assessed on both instruments, using drinking subjects and wet-bath simulators, respectively. Our testing shows that Datamaster and DMT results are essentially identical; the DMT gave accurate readings as compared with measurements made using simulators containing standard ethanol solutions and that the DMT did not give falsely elevated breath alcohol results from any of the influences tested. © 2011 American Academy of Forensic Sciences.

  7. Pain after root canal treatment with different instruments: A systematic review and meta-analysis.

    PubMed

    Sun, Chengjun; Sun, Jicheng; Tan, Minmin; Hu, Bo; Gao, Xiang; Song, Jinlin

    2018-03-07

    The aims of this systematic review were to compare the incidence and intensity of postoperative pain after single-visit root canal treatment using manual, rotary and reciprocating instruments. An extensive literature search in PubMed, EMBASE, Cochrane Library, and Web of Science was performed to identify investigations that evaluated the effects of different instruments on post-endodontic pain. Meta-analyses and additional analyses, including subgroup and sensitivity analyses, were conducted. We included seventeen trials in this study. Pooled results showed that patients treated with rotary instruments experienced a significantly lower incidence of postoperative pain (RR, 0.32, P = 0.0005) and reduced pain intensity than did patients treated with manual instruments. In addition, patients treated with multiple rotary-file systems experienced a significantly lower incidence of postoperative pain than did those treated with reciprocating systems (RR, 0.73; P < 0.0001). The use of rotary instruments contributed to a lower incidence and intensity of postoperative pain than did the use of hand files in patients who received single-visit root canal treatment. In addition, the use of multiple rotary-file systems contributed to a lower incidence of postoperative pain than did the use of reciprocating systems. This article is protected by copyright. All rights reserved. This article is protected by copyright. All rights reserved.

  8. The Nature and Development of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Byrnes, James P.; Dunbar, Kevin N.

    2014-01-01

    In this article, we attempt to provide an overview of the features of the abilities, aptitudes, and frames of minds that are attributed to critical thinking and provide the broad outlines of the development of critical-analytic thinking (CAT) abilities. In addition, we evaluate the potential viability of three main hypotheses regarding the reasons…

  9. How health leaders can benefit from predictive analytics.

    PubMed

    Giga, Aliyah

    2017-11-01

    Predictive analytics can support a better integrated health system providing continuous, coordinated, and comprehensive person-centred care to those who could benefit most. In addition to dollars saved, using a predictive model in healthcare can generate opportunities for meaningful improvements in efficiency, productivity, costs, and better population health with targeted interventions toward patients at risk.

  10. A systematic review on assessment instruments for dementia in persons with intellectual disabilities.

    PubMed

    Zeilinger, Elisabeth L; Stiehl, Katharina A M; Weber, Germain

    2013-11-01

    This work describes an extensive systematic literature review on assessment instruments for dementia in persons with intellectual disability (ID). Existing instruments for the detection of dementia in persons with ID were collected and described systematically. This allows a direct and quick overview of available tools. Additionally, it contributes to the availability and usability of information about these instruments, thus enhancing further developments in this field. A systematic literature search in five databases (CINAHL, PsycInfo, PubMed, Scopus, and Web of Science) was conducted. In order to include gray literature an invisible college approach was used. Relevant studies were identified and selected using defined inclusion and exclusion criteria. After the selection process all instruments were coded and classified. It was determined which concepts they assess, whether they were especially developed or adapted for persons with ID, and whether they were designed to assess dementia. The selection of relevant papers, as well as the coding of instruments was done independently by two researchers. In total, 97 records met the search criteria. Out of these, 114 different instruments were extracted. There were 79 instruments to be completed by the person with ID, and 35 informant-based instruments. Additionally, four test batteries were found. Some of these instruments were neither designed for the assessment of dementia, nor for persons with ID. There are a variety of different tools used for the assessment of dementia in ID. Nevertheless, an agreed-upon approach or instrument is missing. Establishing this would improve the quality of assessment in clinical practice, and benefit research. Data collected would become comparable and combinable, and allow research to have more informative value. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Instruments at the Lowell Observatory Discovery Channel Telescope (DCT)

    NASA Astrophysics Data System (ADS)

    Jacoby, George H.; Bida, Thomas A.; Fischer, Debra; Horch, Elliott; Kutyrev, Alexander; Mace, Gregory N.; Massey, Philip; Roe, Henry G.; Prato, Lisa A.

    2017-01-01

    The Lowell Observatory Discovery Channel Telescope (DCT) has been in full science operation for 2 years (2015 and 2016). Five instruments have been commissioned during that period, and two additional instruments are planned for 2017. These include:+ Large Monolithic Imager (LMI) - a CCD imager (12.6 arcmin FoV)+ DeVeny - a general purpose optical spectrograph (2 arcmin slit length, 10 grating choices)+ NIHTS - a low resolution (R=160) YJHK spectrograph (1.3 arcmin slit)+ DSSI - a two-channel optical speckle imager (5 arcsec FoV)+ IGRINS - a high resolution (45,000) HK spectrograph, on loan from the University of Texas.In the upcoming year, instruments will be delivered from the University of Maryland (RIMAS - a YJHK imager/spectrograph) and from Yale University (EXPRES - a very high resolution stabilized optical echelle for PRV).Each of these instruments will be described, along with their primary science goals.

  12. Vibration isolation and pressure compensation apparatus for sensitive instrumentation

    NASA Technical Reports Server (NTRS)

    Averill, R. D. (Inventor)

    1983-01-01

    A system for attenuating the inherent vibration associated with a mechanical refrigeration unit employed to cryogenically cool sensitive instruments used in measuring chemical constituents of the atmosphere is described. A modular system including an instrument housing and a reaction bracket with a refrigerator unit floated there between comprise the instrumentation system. A pair of evacuated bellows that "float' refrigerator unit and provide pressure compensation at all levels of pressure from seal level to the vacuum of space. Vibration isolators and when needed provide additional vibration damping for the refrigerator unit. A flexible thermal strap (20 K) serves to provide essentially vibration free thermal contact between cold tip of the refrigerator unit and the instrument component mounted on the IDL mount. Another flexible strap (77 K) serves to provide vibration free thermal contact between the TDL mount thermal shroud and a thermal shroud disposed about the thermal shaft.

  13. Physical and Chemical Analytical Analysis: A key component of Bioforensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, S P

    The anthrax letters event of 2001 has raised our awareness of the potential importance of non-biological measurements on samples of biological agents used in a terrorism incident. Such measurements include a variety of mass spectral, spectroscopic, and other instrumental techniques that are part of the current armamentarium of the modern materials analysis or analytical chemistry laboratory. They can provide morphological, trace element, isotopic, and other molecular ''fingerprints'' of the agent that may be key pieces of evidence, supplementing that obtained from genetic analysis or other biological properties. The generation and interpretation of such data represents a new domain of forensicmore » science, closely aligned with other areas of ''microbial forensics''. This paper describes some major elements of the R&D agenda that will define this sub-field in the immediate future and provide the foundations for a coherent national capability. Data from chemical and physical analysis of BW materials can be useful to an investigation of a bio-terror event in two ways. First, it can be used to compare evidence samples collected at different locations where such incidents have occurred (e.g. between the powders in the New York and Washington letters in the Amerithrax investigation) or between the attack samples and those seized during the investigation of sites where it is suspected the material was manufactured (if such samples exist). Matching of sample properties can help establish the relatedness of disparate incidents, and mis-matches might exclude certain scenarios, or signify a more complex etiology of the events under investigation. Chemical and morphological analysis for sample matching has a long history in forensics, and is likely to be acceptable in principle in court, assuming that match criteria are well defined and derived from known limits of precision of the measurement techniques in question. Thus, apart from certain operational issues (such as how

  14. Innovations for In-Pile Measurements in the Framework of the CEA-SCK•CEN Joint Instrumentation Laboratory

    NASA Astrophysics Data System (ADS)

    Villard, Jean-Francois; Schyns, Marc

    2010-12-01

    Optimizing the life cycle of nuclear systems under safety constraints requires high-performance experimental programs to reduce uncertainties on margins and limits. In addition to improvement in modeling and simulation, innovation in instrumentation is crucial for analytical and integral experiments conducted in research reactors. The quality of nuclear research programs relies obviously on an excellent knowledge of their experimental environment which constantly calls for better online determination of neutron and gamma flux. But the combination of continuously increasing scientific requirements and new experimental domains -brought for example by Generation IV programsnecessitates also major innovations for in-pile measurements of temperature, dimensions, pressure or chemical analysis in innovative mediums. At the same time, the recent arising of a European platform around the building of the Jules Horowitz Reactor offers new opportunities for research institutes and organizations to pool their resources in order to face these technical challenges. In this situation, CEA (French Nuclear Energy Commission) and SCK'CEN (Belgian Nuclear Research Centre) have combined their efforts and now share common developments through a Joint Instrumentation Laboratory. Significant progresses have thus been obtained recently in the field of in-pile measurements, on one hand by improvement of existing measurement methods, and on the other hand by introduction in research reactors of original measurement techniques. This paper highlights the state-of-the-art and the main requirements regarding in-pile measurements, particularly for the needs of current and future irradiation programs performed in material testing reactors. Some of the main on-going developments performed in the framework of the Joint Instrumentation Laboratory are also described, such as: - a unique fast neutron flux measurement system using fission chambers with 242Pu deposit and a specific online data processing

  15. An instrument thermal data base system. [for future shuttle missions

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Csigi, K. I.; Ollendorf, S.; Oberright, J. E.

    1981-01-01

    The rationale for the implementation of an Instrument Thermal Data Base System (ITDBS) is discussed and the potential application of a data base management system in support of future space missions, the design of scientific instruments needed, and the potential payload groupings is described. Two basic data files are suggested, the first containing a detailed narrative information list pertaining to design configurations and optimum performance of each instrument, and the second consisting of a description of the parameters pertinent to the instruments' thermal control and design in the form of a summary record of coded information, and serving as a recall record. The applicability of a data request sheet for preliminary planning is described and is concluded that the proposed system may additionally prove to be a method of inventory control.

  16. Comparisons of Two Plasma Instruments on the International Space Station

    NASA Astrophysics Data System (ADS)

    Balthazor, R.; McHarg, M. G.; Minow, J. I.; Chandler, M. O.; Musick, J. D.; Feldmesser, H.; Darrin, M. A.; Osiander, R.

    2011-12-01

    The United States Air Force Academy's Canary instrument, a low-cost ion spectrometer with integrated charge multiplication, was installed on the International Space Station (ISS) on shuttle flight STS-134. The primary goal of the Canary experiment is to measure ion signals in the wake when ISS is flying in the standard +XVV attitude. However, the instrument is pointed (approximately) into ram and detects ambient Low Earth Orbit ions when the ISS is flying in the -XVV attitude. Simultaneous observations with NASA's Floating Plasma Measurement Unit (FPMU) have been taken during these times, and the results from each instrument are compared, in order to determine the origin of energy variations observed in the Canary ion signal. In addition, insights into the ISS floating plasma potential at the two different instrument locations can be obtained.

  17. Test results of the highly instrumented Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Mcconnaughey, H. V.; Leopard, J. L.; Lightfoot, R. M.

    1992-01-01

    Test results of a highly instrumented Space Shuttle Main Engine (SSME) are presented. The instrumented engine, when combined with instrumented high pressure turbopumps, contains over 750 special measurements, including flowrates, pressures, temperatures, and strains. To date, two different test series, accounting for a total of sixteen tests and 1,667 seconds, have been conducted with this engine. The first series, which utilized instrumented turbopumps, characterized the internal operating environment of the SSME for a variety of operating conditions. The second series provided system-level validation of a high pressure liquid oxygen turbopump that had been retrofitted with a fluid-film bearing in place of the usual pump-end ball bearings. Major findings from these two test series are highlighted in this paper. In addition, comparisons are made between model predictions and measured test data.

  18. Authentic Education, the Deeper and Multidisciplinary Perspective of Education, from the Viewpoint of Analytical Psychology

    ERIC Educational Resources Information Center

    Watagodakumbura, Chandana

    2014-01-01

    In this paper, the authentic education system defined with multidisciplinary perspectives (Watagodakumbura, 2013a, 2013b) is viewed from an additional perspective of analytical psychology. Analytical psychology provides insights into human development and is becoming more and more popular among practicing psychologist in the recent past. In…

  19. Static Load Test on Instrumented Pile - Field Data and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Krasiński, Adam; Wiszniewski, Mateusz

    2017-09-01

    Static load tests on foundation piles are generally carried out in order to determine load - the displacement characteristic of the pile head. For standard (basic) engineering practices this type of test usually provides enough information. However, the knowledge of force distribution along the pile core and its division into the friction along the shaft and the resistance under the base can be very useful. Such information can be obtained by strain gage pile instrumentation [1]. Significant investigations have been completed on this technology, proving its utility and correctness [8], [10], [12]. The results of static tests on instrumented piles are not easy to interpret. There are many factors and processes affecting the final outcome. In order to understand better the whole testing process and soil-structure behavior some investigations and numerical analyses were done. In the paper, real data from a field load test on instrumented piles is discussed and compared with numerical simulation of such a test in similar conditions. Differences and difficulties in the results interpretation with their possible reasons are discussed. Moreover, the authors used their own analytical solution for more reliable determination of force distribution along the pile. The work was presented at the XVII French-Polish Colloquium of Soil and Rock Mechanics, Łódź, 28-30 November 2016.

  20. Energy conserving schemes for the simulation of musical instrument contact dynamics

    NASA Astrophysics Data System (ADS)

    Chatziioannou, Vasileios; van Walstijn, Maarten

    2015-03-01

    Collisions are an innate part of the function of many musical instruments. Due to the nonlinear nature of contact forces, special care has to be taken in the construction of numerical schemes for simulation and sound synthesis. Finite difference schemes and other time-stepping algorithms used for musical instrument modelling purposes are normally arrived at by discretising a Newtonian description of the system. However because impact forces are non-analytic functions of the phase space variables, algorithm stability can rarely be established this way. This paper presents a systematic approach to deriving energy conserving schemes for frictionless impact modelling. The proposed numerical formulations follow from discretising Hamilton's equations of motion, generally leading to an implicit system of nonlinear equations that can be solved with Newton's method. The approach is first outlined for point mass collisions and then extended to distributed settings, such as vibrating strings and beams colliding with rigid obstacles. Stability and other relevant properties of the proposed approach are discussed and further demonstrated with simulation examples. The methodology is exemplified through a case study on tanpura string vibration, with the results confirming the main findings of previous studies on the role of the bridge in sound generation with this type of string instrument.