Analytical methods for dating modern writing instrument inks on paper.
Ezcurra, Magdalena; Góngora, Juan M G; Maguregui, Itxaso; Alonso, Rosa
2010-04-15
This work reviews the different analytical methods that have been proposed in the field of forensic dating of inks from modern writing instruments. The reported works have been classified according to the writing instrument studied and the ink component analyzed in relation to aging. The chronological survey shows the advances made in the ink dating field over recent decades. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.
ERIC Educational Resources Information Center
Gostowski, Rudy
A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…
Modern Analytical Chemistry in the Contemporary World
ERIC Educational Resources Information Center
Šíma, Jan
2016-01-01
Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this view is evidently improper and misleading. Therefore, the position of modern analytical chemistry among…
Challenges in Modern Anti-Doping Analytical Science.
Ayotte, Christiane; Miller, John; Thevis, Mario
2017-01-01
The challenges facing modern anti-doping analytical science are increasingly complex, given the expanding list of target substances as the pharmaceutical industry introduces novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analysis and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, and biological assays for the detection of peptide hormones or their markers, all of which require substantial investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may, in the future, lead to a change in the strategy applied by the World Anti-Doping Agency to the introduction and performance of new techniques across the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.
Early modern mathematical instruments.
Bennett, Jim
2011-12-01
In considering the appropriate use of the terms "science" and "scientific instrument," tracing the history of "mathematical instruments" in the early modern period is offered as an illuminating alternative to the historian's natural instinct to follow the guiding lights of originality and innovation, even if the trail transgresses contemporary boundaries. The mathematical instrument was a well-defined category, shared across the academic, artisanal, and commercial aspects of instrumentation, and its narrative from the sixteenth to the eighteenth century was largely independent from other classes of device, in a period when a "scientific" instrument was unheard of.
Developments in analytical instrumentation
NASA Astrophysics Data System (ADS)
Petrie, G.
The situation regarding photogrammetric instrumentation has changed quite dramatically over the last two or three years, with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not previously been associated with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers, with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development, driven by the need to supply photogrammetric data as input to the more sophisticated GIS systems now being installed by clients…
Modern analytical chemistry in the contemporary world
NASA Astrophysics Data System (ADS)
Šíma, Jan
2016-12-01
Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this view is evidently improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character and the necessity of collaboration between analytical chemists and other experts in order to effectively solve the actual problems of human society and the environment are emphasized. The importance of analytical method validation in obtaining accurate and precise results is highlighted. Invalid results are not merely useless; they can often be even fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed; it is described as much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. Ultimately, the schooling of analytical chemistry should closely connect theory and practice.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
Integrating laboratory robots with analytical instruments--must it really be so difficult?
Kramer, G W
1990-09-01
Creating a reliable system from discrete laboratory instruments is often a task fraught with difficulties. While many modern analytical instruments are marvels of detection and data handling, attempts to create automated analytical systems incorporating such instruments are often frustrated by their human-oriented control structures and their egocentricity. The laboratory robot, while fully susceptible to these problems, extends such compatibility issues to the physical dimensions involving sample interchange, manipulation, and event timing. The workcell concept was conceived to describe the procedure and equipment necessary to carry out a single task during sample preparation. This notion can be extended to organize all operations in an automated system. Each workcell, no matter how complex its local repertoire of functions, must be minimally capable of accepting information (commands, data), returning information on demand (status, results), and being started, stopped, and reset by a higher level device. Even the system controller should have a mode where it can be directed by instructions from a higher level.
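The minimal workcell contract described above (accept information, return information on demand, and be started, stopped, and reset by a higher-level device) can be sketched as a small interface. The class and method names below are illustrative assumptions, not taken from the article:

```python
from abc import ABC, abstractmethod

class Workcell(ABC):
    """Minimal control contract for one automation workcell: accept
    information (commands, data), return information on demand
    (status, results), and be started, stopped, and reset by a
    higher-level controller."""

    def __init__(self):
        self.status = "idle"
        self.result = None

    @abstractmethod
    def execute(self, command, **params):
        """Carry out one command (e.g. 'dilute', 'weigh')."""

    def start(self):
        self.status = "running"

    def stop(self):
        self.status = "stopped"

    def reset(self):
        self.status = "idle"
        self.result = None

class DiluterCell(Workcell):
    """Toy workcell: dilutes a sample by a given factor."""
    def execute(self, command, **params):
        if command == "dilute":
            self.result = params["volume_ul"] * params["factor"]
            self.status = "done"
        return self.status

cell = DiluterCell()
cell.start()
cell.execute("dilute", volume_ul=50, factor=10)
print(cell.status, cell.result)  # done 500
```

However complex a cell's local repertoire, a system controller only ever needs this small surface, which is what makes the workcell notion composable.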
The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.
Enke, Christie G
2015-12-15
The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain, 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement, and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided essentially unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage; rather, they added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time has improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the door to the development of new techniques in all areas of chemical analysis. Along with some interesting history, this article describes the multiple roles op amps play in modern instrumentation, together with a number of examples of new areas of analysis enabled by these improvements.
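As a minimal illustration of why open-loop gain matters for measurement accuracy, the standard textbook relation for an inverting amplifier with finite open-loop gain (not taken from this article; component values are illustrative) shows the closed-loop gain error shrinking in proportion to the open-loop improvement:

```python
def inverting_gain(rf, rin, a_ol):
    """Closed-loop gain of an inverting op-amp stage with finite
    open-loop gain a_ol (standard textbook relation):
    G = -(rf/rin) / (1 + (1 + rf/rin) / a_ol)."""
    ideal = -rf / rin
    return ideal / (1 + (1 - ideal) / a_ol)

# A gain-of-100 stage set by the resistor ratio; higher open-loop
# gain pulls the closed-loop gain toward the ideal -100.
for a_ol in (1e3, 1e5, 1e7):
    g = inverting_gain(rf=100e3, rin=1e3, a_ol=a_ol)
    print(f"A_ol = {a_ol:.0e}:  G = {g:.4f}  (error {abs(g + 100) / 100:.2e})")
```

Each 100-fold increase in open-loop gain reduces the relative gain error by roughly 100-fold, which is one concrete sense in which the 3-4 orders of magnitude of op-amp improvement opened new measurement territory.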
Microfabricated field calibration assembly for analytical instruments
Robinson, Alex L [Albuquerque, NM; Manginell, Ronald P [Albuquerque, NM; Moorman, Matthew W [Albuquerque, NM; Rodacy, Philip J [Albuquerque, NM; Simonson, Robert J [Cedar Crest, NM
2011-03-29
A microfabricated field calibration assembly for use in calibrating analytical instruments and sensor systems. The assembly comprises a circuit board comprising one or more resistively heatable microbridge elements, an interface device that enables addressable heating of the microbridge elements, and, in some embodiments, a means for positioning the circuit board within an inlet structure of an analytical instrument or sensor system.
Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S
2013-01-01
Aim: To develop analytical models and analyse the stress distribution and flexibility of nickel-titanium (NiTi) instruments subjected to bending forces. Methodology: An analytical method was used to analyse the behaviour of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived from Euler-Bernoulli nonlinear differential equations that took into account the screw pitch variation of these instruments. In addition, nonlinear deformation analyses based on both the analytical model and finite element analysis were carried out, with numerical results obtained by the finite element method. Results: According to the analytical results, the maximum curvature of the instrument occurs near the instrument tip. The finite element analysis likewise placed the maximum von Mises stress near the instrument tip. The proposed analytical model can therefore be used to predict the position of maximum curvature in the instrument, where fracture may occur. The analytical and numerical results were compatible. Conclusion: The proposed analytical model, validated by the numerical results, is useful in the design and analysis of instruments and effective in studying the flexibility of NiTi instruments. Compared with the finite element method, the analytical model deals conveniently and effectively with the bending behaviour of rotary NiTi endodontic instruments. PMID:23173762
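A much simpler, hedged sketch of the underlying mechanics (a uniform small-deflection Euler-Bernoulli cantilever, not the paper's nonlinear tapered model; all values are illustrative) conveys how geometry controls flexibility:

```python
import math

def cantilever(force_n, length_m, diameter_m, e_pa):
    """Small-deflection Euler-Bernoulli results for a uniform circular
    cantilever under a tip load: tip deflection and maximum bending
    stress. For a uniform beam the critical section is the clamped end;
    the taper of a real endodontic file shifts it toward the tip, as
    the paper's nonlinear model shows."""
    i = math.pi * diameter_m**4 / 64               # second moment of area
    tip = force_n * length_m**3 / (3 * e_pa * i)   # tip deflection
    sigma_max = force_n * length_m * (diameter_m / 2) / i
    return tip, sigma_max

# Illustrative NiTi-like values (assumed, not from the paper):
tip, sigma = cantilever(force_n=0.1, length_m=0.016,
                        diameter_m=0.0006, e_pa=40e9)
print(f"tip deflection = {tip * 1e3:.2f} mm, max stress = {sigma / 1e6:.0f} MPa")
```

The fourth-power dependence of the second moment of area on diameter is why small changes in cross-section dominate the flexibility of these instruments.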
NASA Technical Reports Server (NTRS)
Panda, Binayak
2009-01-01
Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: secondary ion mass spectrometry (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron, along with their isotopes; its high sensitivity detects impurities at parts-per-billion (ppb) levels. X-ray photoelectron spectroscopy (XPS), also known as electron spectroscopy for chemical analysis (ESCA), can determine oxidation states of elements, as well as identify polymers and measure film thicknesses on coated composites. Scanning Auger microscopy (SAM) combines surface sensitivity, lateral spatial resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near- and below-surface regions down to the chemical state of an atom.
Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds
NASA Astrophysics Data System (ADS)
Johnson, C. E.
2017-12-01
Modern seismic networks present a number of challenges, perhaps most notably those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably well addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work remains to be done in this area. The latter two challenges, however, demand special attention. Station availability is impacted by weather, equipment failure, and the addition or removal of stations; and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower ones. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes, based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
The limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive but also time-consuming; they require maintenance and replacement of damaged essential parts, all of which are of serious concern. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every site. Therefore, a pre-concentration technique for metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user-friendly, highly effective, inexpensive, and time-efficient; being easy to carry (10 g - 20 g vial) to the experimental field/site, it has been demonstrated here.
Merging Old and New: An Instrumentation-Based Introductory Analytical Laboratory
ERIC Educational Resources Information Center
Jensen, Mark B.
2015-01-01
An instrumentation-based laboratory curriculum combining traditional unknown analyses with student-designed projects has been developed for an introductory analytical chemistry course. In the first half of the course, students develop laboratory skills and instrumental proficiency by rotating through six different instruments performing…
Promoting Active Learning by Practicing the "Self-Assembly" of Model Analytical Instruments
ERIC Educational Resources Information Center
Algar, W. Russ; Krull, Ulrich J.
2010-01-01
In our upper-year instrumental analytical chemistry course, we have developed "cut-and-paste" exercises where students "build" models of analytical instruments from individual schematic images of components. These exercises encourage active learning by students. Instead of trying to memorize diagrams, students are required to think deeply about…
NASA Astrophysics Data System (ADS)
Wenzel, Thomas J.
2001-09-01
The availability of state-of-the-art instruments such as high performance liquid chromatograph, gas chromatograph-mass spectrometer, inductively coupled plasma-atomic emission spectrometer, capillary electrophoresis system, and ion chromatograph obtained through four Instructional Laboratory Improvement and one Course, Curriculum, and Laboratory Improvement grants from the National Science Foundation has led to a profound change in the structure of the analytical and general chemistry courses at Bates College. Students in both sets of courses now undertake ambitious, semester-long, small-group projects. The general chemistry course, which fulfills the prerequisite requirement for all upper-level chemistry courses, focuses on the connection between chemistry and the study of the environment. The projects provide students with an opportunity to conduct a real scientific investigation. The projects emphasize problem solving, team work, and communication, while still fostering the development of important laboratory skills. Cooperative learning is also used extensively in the classroom portion of these courses.
Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.
ERIC Educational Resources Information Center
Johnson, Dennis C.
1980-01-01
Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)
Analytic Method for Computing Instrument Pointing Jitter
NASA Technical Reports Server (NTRS)
Bayard, David
2003-01-01
A new method of calculating the root-mean-square (rms) pointing jitter of a scientific instrument (e.g., a camera, radar antenna, or telescope) is introduced based on a state-space concept. In comparison with the prior method of calculating the rms pointing jitter, the present method involves significantly less computation. The rms pointing jitter of an instrument (the square root of the jitter variance shown in the figure) is an important physical quantity which impacts the design of the instrument, its actuators, controls, sensory components, and sensor-output-sampling circuitry. Using the Sirlin, San Martin, and Lucke definition of pointing jitter, the prior method of computing the rms pointing jitter involves a frequency-domain integral of a rational polynomial multiplied by a transcendental weighting function, necessitating the use of numerical-integration techniques. In practice, numerical integration complicates the problem of calculating the rms pointing error. In contrast, the state-space method provides exact analytic expressions that can be evaluated without numerical integration.
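The core state-space idea (a steady-state variance obtained from a Lyapunov equation in closed form, with no frequency-domain numerical integration) can be sketched as follows; the pointing model and matrices are illustrative assumptions, not Bayard's specific jitter formulation:

```python
import numpy as np

def steady_state_cov(a, q):
    """Solve the continuous Lyapunov equation A P + P A^T + Q = 0
    exactly via Kronecker products: (A (x) I + I (x) A) vec(P) = -vec(Q).
    No numerical integration is involved."""
    n = a.shape[0]
    k = np.kron(a, np.eye(n)) + np.kron(np.eye(n), a)
    return np.linalg.solve(k, -q.reshape(-1)).reshape(n, n)

# Illustrative pointing model (assumed, not from the report): a damped
# oscillator driven by white torque noise; the line-of-sight error is
# read from the position state through c.
wn, zeta, qc = 1.0, 0.5, 1.0
a = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
q = np.array([[0.0, 0.0], [0.0, qc]])
c = np.array([1.0, 0.0])

p = steady_state_cov(a, q)
rms = np.sqrt(c @ p @ c)        # rms pointing error
print(f"rms = {rms:.4f}")       # analytic value for this model: sqrt(0.5)
```

For this oscillator the known closed-form variance qc / (4 * zeta * wn**3) = 0.5 provides an independent check on the Lyapunov solution.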
Weitz, Karl K [Pasco, WA; Moore, Ronald J [West Richland, WA
2010-07-13
A method and device are disclosed that provide for detection of fluid leaks in analytical instruments and instrument systems. The leak detection device includes a collection tube, a fluid absorbing material, and a circuit that electrically couples to an indicator device. When assembled, the leak detection device detects and monitors for fluid leaks, providing a preselected response in conjunction with the indicator device when contacted by a fluid.
Accurate mass measurements and their appropriate use for reliable analyte identification.
Godfrey, A Ruth; Brenton, A Gareth
2012-09-01
Accurate mass instrumentation is becoming increasingly available to non-expert users, and these data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification is described, together with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing, and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and ChemSpider.
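One element of such best practice, brute-force elemental formula assignment within a ppm tolerance filtered by the ring-and-double-bond-equivalent rule, can be sketched as follows (CHNO-only compositions and element limits are illustrative assumptions):

```python
# Monoisotopic masses (u) of the principal isotopes of C, H, N, O.
MASS = {"C": 12.0, "H": 1.00782503, "N": 14.00307401, "O": 15.99491462}

def candidate_formulae(target_mass, ppm_tol=5.0, limits=(30, 60, 10, 10)):
    """Exhaustive CHNO formula search within a ppm mass window,
    filtered by the ring-and-double-bond-equivalent heuristic
    (RDBE = C - H/2 + N/2 + 1 >= 0) for chemical plausibility."""
    cmax, hmax, nmax, omax = limits
    hits = []
    for c in range(cmax + 1):
        for h in range(hmax + 1):
            for n in range(nmax + 1):
                for o in range(omax + 1):
                    m = (c * MASS["C"] + h * MASS["H"]
                         + n * MASS["N"] + o * MASS["O"])
                    rdbe = c - h / 2 + n / 2 + 1
                    if rdbe >= 0 and abs(m - target_mass) / target_mass * 1e6 <= ppm_tol:
                        hits.append((c, h, n, o))
    return hits

# Neutral monoisotopic mass of caffeine, C8H10N4O2:
print(candidate_formulae(194.08038))
```

Even at 5 ppm several formulae can survive the window, which is precisely why the editorial stresses orthogonal evidence (isotope patterns, database searching) before an identification is claimed.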
Continuing evolution of in-vitro diagnostic instrumentation
NASA Astrophysics Data System (ADS)
Cohn, Gerald E.
2000-04-01
The synthesis of analytical instrumentation and analytical biochemistry technologies in modern in vitro diagnostic instrumentation continues to generate new systems with improved performance and expanded capability. Detection modalities have expanded to include multichip modes of fluorescence, scattering, luminescence, and reflectance so as to accommodate increasingly sophisticated immunochemical and nucleic-acid-based reagent systems. The timeline of system development now extends from the earliest automated clinical spectrophotometers through molecular recognition assays and biosensors to the new breakthroughs of biochip and DNA diagnostics. This brief review traces some of the major innovations in the evolution of system technologies and previews the conference program.
The rise of environmental analytical chemistry as an interdisciplinary activity.
Brown, Richard
2009-07-01
Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research in which value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions relating to analytical and/or environmental chemistry, whether new data, methods, case studies, and instrumentation or new interpretations and developments of existing ones, are welcome in the Analytical and Environmental Chemistry domains and will be considered equally.
Development of Impurity Profiling Methods Using Modern Analytical Techniques.
Ramachandra, Bondigalla
2017-01-02
This review gives a brief introduction to process- and product-related impurities and emphasizes the development of novel analytical methods for their determination. It describes the application of modern analytical techniques, particularly ultra-performance liquid chromatography (UPLC), liquid chromatography-mass spectrometry (LC-MS), high-resolution mass spectrometry (HRMS), gas chromatography-mass spectrometry (GC-MS), and high-performance thin-layer chromatography (HPTLC). The application of nuclear magnetic resonance (NMR) spectroscopy to the characterization of impurities and degradation products is also discussed. The review addresses the significance of the quality, efficacy, and safety of drug substances/products, including the sources and kinds of impurities, adverse effects caused by their presence, quality control of impurities, the necessity for developing impurity profiling methods, identification of impurities, and regulatory aspects. Other important aspects discussed are forced degradation studies and the development of stability-indicating assay methods.
Post-analytical Issues in Hemostasis and Thrombosis Testing.
Favaloro, Emmanuel J; Lippi, Giuseppe
2017-01-01
Analytical concerns within hemostasis and thrombosis testing are continuously decreasing. This is essentially attributable to modern instrumentation, improvements in test performance and reliability, and the application of appropriate internal quality control and external quality assurance measures. Pre-analytical issues are also being addressed by some newer instruments, which are able to detect hemolysis, icterus, and lipemia and, in some cases, other issues related to sample collection, such as tube under-filling. Post-analytical issues are generally related to appropriate reporting and interpretation of test results, and these are the focus of the current overview, which provides a brief description of these events, as well as guidance for their prevention or minimization. In particular, we propose several strategies for improved post-analytical reporting of hemostasis assays and advise that this may provide the final opportunity to prevent serious clinical errors in diagnosis.
Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.
Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S
2016-04-07
Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite this assumption, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were examined in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs); cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices; and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even with moderate uncertainty (30%) in the variance function, weighted regression still outperforms unweighted regression. We recommend the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
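The recommended approach, weighted regression with weights taken from a power model of variance, can be sketched as follows (the parameters a and b are illustrative; in practice they are estimated from replicate measurements):

```python
import numpy as np

def wls_power_model(x, y, a=1.0, b=0.5):
    """Weighted least-squares line fit with weights from the power
    variance model  var(y) = (a * y**b)**2.  Points with larger
    variance (typically larger signals) receive smaller weights."""
    w = 1.0 / (a * np.abs(y) ** b) ** 2        # weight = 1 / variance
    X = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta                                # [intercept, slope]

x = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
y = 2.0 * x + 1.0                # noiseless calibration line for the demo
print(wls_power_model(x, y))     # intercept close to 1, slope close to 2
```

With heteroskedastic data the same routine downweights the noisy high-signal points, which is what restores valid confidence intervals and detection limits relative to an unweighted fit.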
Weng, Naidong; Needham, Shane; Lee, Mike
2015-01-01
The 17th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA), held 29 September-2 October 2014 at the Sheraton Bucks County Hotel, Langhorne, PA, USA, brought together the various analytical fields defining the challenges of the modern analytical laboratory. Discussions focused on the future application of bioanalysis and other disciplines to support investigational new drug (IND) and new drug application (NDA) submissions; on the clinical diagnostics and pathology laboratory personnel who support patient sample analysis; and on the clinical researchers who provide insights into new biomarkers, all within the context of the modern laboratory and personalized medicine.
NASA Astrophysics Data System (ADS)
Platonov, I. A.; Kolesnichenko, I. N.; Lange, P. K.
2018-05-01
In this paper, the chromatography desorption method of obtaining gas mixtures of known composition, stable for a time sufficient to calibrate analytical instruments, is considered. Comparative results on the preparation accuracy of gas mixtures containing volatile organic compounds using diffusion, polyabarbotage, and chromatography desorption methods are presented. It is shown that chromatography desorption devices yield gas mixtures that remain stable for 10-60 hours under dynamic conditions; these mixtures contain volatile aliphatic and aromatic hydrocarbons with a concentration error of no more than 7%. It is concluded that such gas mixtures are well suited for the calibration of analytical instruments (chromatographs, spectrophotometers, etc.).
NASA Astrophysics Data System (ADS)
Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian
2017-08-01
The framework for teaching and learning in the 21st century was prepared with the 4Cs criteria. Collaborative learning provides opportunities for the development of students' creative skills: learners are challenged to compete, to work independently, to bring out individual or group excellence, and to master the learning material. A virtual laboratory is used as the instructional medium for Instrumental Analytical Chemistry lectures (Vis, UV-Vis, AAS, etc.) through computer-based simulations, and serves as a substitute for the laboratory when equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures, and to evaluate the effectiveness of the design, which adapts the Dick & Carey and Hannafin & Peck models. The development steps of this model are: needs analysis; design of collaborative-creative learning; construction of the virtual laboratory media using Macromedia Flash; formative evaluation; and testing of the learning model's effectiveness. The stages of the collaborative-creative learning model itself are: apperception, exploration, collaboration, creation, evaluation, and feedback. The model can be used to improve the quality of classroom learning and to overcome the shortage of laboratory instruments for real instrumental analysis. Formative test results show that the developed model meets the requirements. A t-test comparing students' pretest and posttest scores was significant at the 95% confidence level (computed t higher than the critical t). It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.
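The pretest/posttest comparison described in the abstract above is a paired t-test; a minimal sketch with illustrative scores (not the study's data) shows the computation:

```python
import math

def paired_t(pre, post):
    """Paired-sample t statistic for pretest/posttest scores:
    t = mean(d) / (sd(d) / sqrt(n)), where d = post - pre."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((di - mean) ** 2 for di in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Illustrative scores for five students (assumed, not the study's data):
pre = [60, 55, 70, 62, 58]
post = [75, 72, 80, 74, 73]
t = paired_t(pre, post)
print(f"t = {t:.2f}")  # compare with the two-tailed critical value
                       # t_crit(df = 4, 95%) = 2.776
```

A computed t above the critical value, as in the study, rejects the hypothesis of no improvement between pretest and posttest.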
Two-dimensional convolute integers for analytical instrumentation
NASA Technical Reports Server (NTRS)
Edwards, T. R.
1982-01-01
As new analytical instruments and techniques emerge with increased dimensionality, a corresponding need is seen for data-processing logic that can appropriately address the data. Two-dimensional measurements reveal enhanced unknown-mixture analysis capability as a result of their greater spectral information content compared with two one-dimensional methods taken separately. It is noted that two-dimensional convolute integers are merely an extension of the work of Savitzky and Golay (1964). It is shown that these low-pass, high-pass and band-pass digital filters are truly two-dimensional and that they can be applied in a manner identical to their one-dimensional counterpart, that is, as a weighted nearest-neighbor moving average with zero phase shift, using convolute-integer (universal number) weighting coefficients.
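A minimal sketch of the low-pass case, assuming NumPy and SciPy are available (the function names are illustrative, not from the paper): the smoothing weights are obtained by least-squares fitting a bivariate polynomial over a square window, and the resulting kernel is applied as a zero-phase, nearest-neighbor moving average.

```python
import numpy as np
from scipy.ndimage import convolve

def sg2d_kernel(half, order):
    """Least-squares 2-D Savitzky-Golay smoothing kernel.

    Fits a bivariate polynomial of total degree `order` over a
    (2*half+1) x (2*half+1) window; the smoothing weights are the
    row of the pseudo-inverse that evaluates the fit at the window
    center (the constant term).
    """
    idx = np.arange(-half, half + 1)
    yy, xx = np.meshgrid(idx, idx, indexing="ij")
    # design matrix: one column per monomial x^i * y^j with i + j <= order
    cols = [(xx ** i * yy ** j).ravel()
            for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.stack(cols, axis=1)
    # first row of pinv(A) estimates the constant coefficient, i.e. the
    # fitted value at the center pixel
    return np.linalg.pinv(A)[0].reshape(2 * half + 1, 2 * half + 1)

def sg2d_smooth(img, half=2, order=2):
    """Zero-phase weighted nearest-neighbor moving average."""
    return convolve(img, sg2d_kernel(half, order), mode="nearest")
```

Because the kernel reproduces polynomials up to the chosen degree, planes and paraboloids pass through unchanged while higher-frequency noise is attenuated.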
Novel approaches to the construction of miniaturized analytical instrumentation
NASA Technical Reports Server (NTRS)
Porter, Marc D.; Otoole, Ronald P.; Coldiron, Shelley J.; Deninger, William D.; Deinhammer, Randall S.; Burns, Stanley G.; Bastiaans, Glenn J.; Braymen, Steve D.; Shanks, Howard R.
1992-01-01
This paper focuses on the design, construction, preliminary testing, and potential applications of three forms of miniaturized analytical instrumentation. The first is an optical fiber instrument for monitoring pH and other cations in aqueous solutions. The instrument couples chemically selective indicators immobilized on porous polymeric films with a hardware package that provides the excitation light source, the required optical components, and the detection and data-processing hardware. The second is a new form of piezoelectric mass sensor, fabricated by the deposition of a thin (5.5 micron) film of piezoelectric aluminum nitride (AlN). The completed deposition process yields a thin film resonator (TFR) shaped as a 400 micron square that supports a standing bulk acoustic wave in a longitudinal mode at frequencies of approximately 1 GHz. Various deposition and vapor sorption studies indicate that the mass sensitivity of the TFRs rivals that of the most sensitive mass sensors currently available, while offering such performance in a markedly smaller device. The third couples a novel form of liquid chromatography with microlithographic miniaturization techniques. The status of the miniaturization effort, the goal of which is to achieve chip-scale separations, is briefly discussed.
The modern rotor aerodynamic limits survey: A report and data survey
NASA Technical Reports Server (NTRS)
Cross, J.; Brilla, J.; Kufeld, R.; Balough, D.
1993-01-01
The first phase of the Modern Technology Rotor Program, the Modern Rotor Aerodynamic Limits Survey, was a flight test conducted by the United States Army Aviation Engineering Flight Activity for NASA Ames Research Center. The test was performed using a United States Army UH-60A Black Hawk aircraft and the United States Air Force HH-60A Night Hawk instrumented main-rotor blade. The primary purpose of this test was to gather high-speed, steady-state, and maneuvering data suitable for correlation with analytical prediction tools. All aspects of the database, flight-test instrumentation, and test procedures are presented and analyzed. Because of the high volume of data, only selected data points are presented; however, access to the entire data set is available upon request.
ERIC Educational Resources Information Center
Gao, Ruomei
2015-01-01
In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…
NASA Astrophysics Data System (ADS)
Irwanto, Rohaeti, Eli; LFX, Endang Widjajanti; Suyanta
2017-05-01
This research aims to develop an integrated assessment instrument and to determine its characteristics. The research uses the 4-D model, which includes define, design, develop, and disseminate. The primary product was validated by expert judgment, tested for readability by students, and assessed for feasibility by chemistry teachers. The research involved 246 students in grade XI of four senior high schools in Yogyakarta, Indonesia. Data collection techniques included interview, questionnaire, and test; the corresponding instruments were an interview guideline, an item validation sheet, a users' response questionnaire, an instrument readability questionnaire, and an essay test. The results show that the integrated assessment instrument has an Aiken validity value of 0.95. Item reliability was 0.99 and person reliability was 0.69. Teachers' response to the integrated assessment instrument was very good. Therefore, the integrated assessment instrument is feasible for measuring students' analytical thinking and science process skills.
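The Aiken validity coefficient reported above has a simple closed form, V = Σ(r − lo) / (n·(hi − lo)) over n judges rating on a scale from lo to hi; a minimal sketch (the ratings in the test are illustrative, not the study's actual judge scores):

```python
def aiken_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for one item.

    ratings: scores given by expert judges on a scale lo..hi.
    V ranges from 0 (all judges at the scale floor) to 1 (all judges
    at the scale ceiling); values near 1 indicate strong agreement
    that the item is valid.
    """
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))
```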
Kling, Maximilian; Seyring, Nicole; Tzanova, Polia
2016-09-01
Economic instruments provide significant potential for countries with low municipal waste management performance to decrease landfill rates and increase recycling rates for municipal waste. In this research, the strengths and weaknesses of landfill taxes, pay-as-you-throw charging systems, deposit-refund systems and extended producer responsibility schemes are compared, focusing on conditions in countries with low waste management performance. In order to prioritize instruments for implementation in these countries, the analytic hierarchy process is applied, using the results of a literature review as input for the comparison. The assessment reveals that pay-as-you-throw is the most preferable instrument when utility-related criteria are considered (wb = 0.35; analytic hierarchy process distributive mode; absolute comparison), mainly owing to its waste prevention effect, closely followed by landfill tax (wb = 0.32). Deposit-refund systems (wb = 0.17) and extended producer responsibility (wb = 0.16) rank third and fourth, with marginal differences owing to their similar nature. When cost-related criteria are additionally included in the comparison, landfill tax appears to provide the highest utility-cost ratio. Data from the literature concerning costs (in contrast to utility-related criteria) are currently not sufficient for a robust ranking by utility-cost ratio. In general, the analytic hierarchy process is seen as a suitable method for assessing economic instruments in waste management. Independent of the chosen analytic hierarchy process mode, the results provide valuable indications for policy-makers on the application of economic instruments, as well as on their specific strengths and weaknesses. Nevertheless, the instruments need to be put in the country-specific context, along with the results of this analytic hierarchy process application, before practical decisions are made. © The Author(s) 2016.
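The core of the analytic hierarchy process used above is the eigenvector prioritization step; a minimal sketch with NumPy (the matrix in the example is a perfectly consistent illustration built from the paper's reported weights, not the authors' actual judgment data, and the random-index table is an excerpt):

```python
import numpy as np

def ahp_weights(M):
    """Priority weights from an AHP pairwise-comparison matrix.

    M[i, j] holds how strongly alternative i is preferred over j
    (Saaty's 1-9 scale, with M[j, i] = 1 / M[i, j]). The weights are
    the principal right eigenvector, normalized to sum to 1.
    """
    M = np.asarray(M, dtype=float)
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(M):
    """Saaty consistency ratio; CR < 0.1 is conventionally acceptable."""
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    lam = np.max(np.linalg.eigvals(M).real)
    ci = (lam - n) / (n - 1)                 # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # random-index table (excerpt)
    return ci / ri
```

A judgment matrix that exactly encodes the reported ranking (pay-as-you-throw 0.35, landfill tax 0.32, deposit-refund 0.17, EPR 0.16) is recovered with zero inconsistency, as the test below shows.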
Cultural Heritage of Observatories and Instruments - From Classical Astronomy to Modern Astrophysics
NASA Astrophysics Data System (ADS)
Wolfschmidt, Gudrun
Until the middle of the 19th century, positional astronomy with meridian circles played the dominant role. Pulkovo Observatory, St. Petersburg, was the leading institution for this kind of research. The design of this observatory was a model for the construction of observatories in the 19th century. In addition, in Hamburg Observatory and in some other observatories near the coast, time keeping and the teaching of navigation were important tasks for astronomers. Around 1860 astronomy underwent a revolution: astronomers began to investigate the properties of celestial bodies with physical and chemical methods. In the context of “classical astronomy”, only the direction of starlight was studied; in the 1860s the quantity and quality of radiation were studied for the first time. This was the beginning of modern “astrophysics”, a notion coined in 1865 by the Leipzig astronomer Karl Friedrich Zöllner (1834-1882). It is remarkable that much of this new astrophysics was started by amateurs in private observatories, not in established observatories like Greenwich, Paris or Pulkovo. In Germany this development started in Bothkamp Observatory near Kiel, with Hermann Carl Vogel (1841-1907), strongly influenced by Zöllner. An important enterprise was the foundation of the Astrophysical Observatory in Potsdam, near Berlin, in 1874 as the first observatory in the world dedicated to astrophysics - a foundation that inspired others. Important innovations and discoveries were made in Potsdam. The new field of astrophysics caused, and was caused by, new instrumentation: spectrographs, instruments for astrophotography, photometers and solar physics instruments. In particular, the glass-mirror reflecting telescope was recognised as a more important instrument than a large refractor; for the new observatory in Hamburg-Bergedorf a 1-m reflector, the fourth largest in the world, made by Zeiss of Jena, was acquired in 1911. Another change was made in the architecture, the idea of a park
Juicing the Juice: A Laboratory-Based Case Study for an Instrumental Analytical Chemistry Course
ERIC Educational Resources Information Center
Schaber, Peter M.; Dinan, Frank J.; St. Phillips, Michael; Larson, Renee; Pines, Harvey A.; Larkin, Judith E.
2011-01-01
A young, inexperienced Food and Drug Administration (FDA) chemist is asked to distinguish between authentic fresh orange juice and suspected reconstituted orange juice falsely labeled as fresh. In an advanced instrumental analytical chemistry application of this case, inductively coupled plasma (ICP) spectroscopy is used to distinguish between the…
NASA Astrophysics Data System (ADS)
Echard, J.-P.; Cotte, M.; Dooryhee, E.; Bertrand, L.
2008-07-01
Though ancient violins and other stringed instruments are often revered for the beauty of their varnishes, little is known about the varnishing techniques, and very few detailed varnish analyses have been published so far. Since 2002, a research program at the Musée de la musique (Paris) has been dedicated to the detailed description of the varnishes on famous ancient musical instruments using a series of novel analytical methods. For the first time, results are presented on the study of the varnish of a late 16th-century Venetian lute, using synchrotron micro-analytical methods. Both organic and inorganic compounds distributed within the individual layers of a varnish microsample were identified using spatially resolved synchrotron Fourier transform infrared microscopy. Unambiguous identification of the mineral phases was obtained through synchrotron powder X-ray diffraction. The materials identified may be of utmost importance for understanding the varnishing process and its similarities with some painting techniques. In particular, the proteinaceous binding medium and the calcium sulfate components (bassanite and anhydrite) identified in the lower layers of the varnish microsample could be related, to a certain extent, to the ground materials of earlier Italian paintings.
Analytical description of the modern steam automobile
NASA Technical Reports Server (NTRS)
Peoples, J. A.
1974-01-01
The sensitivity of modern steam automobile performance to operating conditions is discussed. The word modern is used in the title to indicate that the emphasis is on miles per gallon rather than theoretical thermal efficiency. This is accomplished by combining classical power analysis with the ideal pressure-volume diagram. Several parameters are derived that characterize the performance capability of the modern steam car. The report illustrates that performance is dictated by the characteristics of the working medium and the supply temperature, and is nearly independent of pressures above 800 psia. The analysis techniques were developed specifically for reciprocating steam engines suitable for automotive application. Specific performance charts have been constructed on the basis of water as the working medium; the conclusions and data interpretation are therefore limited to this scope.
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e., option) pricing is essential for modern financial instruments. Despite previous efforts, exact analytical forms of derivative pricing distributions remain challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond-option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing, characterized by the distribution tail, and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
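For context, the Vasicek model is one of the few short-rate models with a closed-form zero-coupon bond price; a minimal sketch of that standard textbook formula (not the authors' path-integral distribution framework):

```python
import math

def vasicek_zcb_price(r, tau, a, b, sigma):
    """Exact zero-coupon bond price under the Vasicek short-rate model.

    Short-rate dynamics: dr = a*(b - r)*dt + sigma*dW.
    Price of a bond maturing in tau years given current rate r:
        P = A(tau) * exp(-B(tau) * r)
    with the standard affine coefficients below.
    """
    B = (1.0 - math.exp(-a * tau)) / a
    lnA = (B - tau) * (a * a * b - 0.5 * sigma * sigma) / (a * a) \
          - sigma * sigma * B * B / (4.0 * a)
    return math.exp(lnA) * math.exp(-B * r)
```

This gives only the expected price; the distributional results of the paper quantify the statistical fluctuations around such values.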
Analytical techniques and instrumentation: A compilation
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information on developments in instrumentation is arranged into four sections: (1) instrumentation for analysis; (2) analysis of matter; (3) analysis of electrical and mechanical phenomena; and (4) structural analysis. Patent information for two of the instruments described is presented.
Instrumentation and fusion for congenital spine deformities.
Hedequist, Daniel J
2009-08-01
A retrospective clinical review. To review the use of modern instrumentation of the spine for congenital spinal deformities. Spinal instrumentation has evolved since the advent of the Harrington rod, yet there is a paucity of literature discussing the use of modern spinal instrumentation in congenital spine deformity cases. This review focuses on modern instrumentation techniques for congenital scoliosis and kyphosis. A systematic review of the literature was performed to discuss spinal implant use for congenital deformities. Spinal instrumentation may be safely and effectively used in cases of congenital spinal deformity. Spinal surgeons caring for children with congenital spine deformities need to be trained in all aspects of modern spinal instrumentation.
Reviews in Modern Astronomy 12, Astronomical Instruments and Methods at the turn of the 21st Century
NASA Astrophysics Data System (ADS)
Schielicke, Reinhard E.
The yearbook series Reviews in Modern Astronomy of the Astronomische Gesellschaft (AG) was established in 1988 in order to bring the scientific events of the meetings of the society to the attention of the worldwide astronomical community. Reviews in Modern Astronomy is devoted exclusively to the invited reviews, the Karl Schwarzschild Lectures, the Ludwig Biermann Award Lectures, and the highlight contributions from leading scientists reporting on recent progress and scientific achievements at their respective research institutes. Volume 12 continues the yearbook series with 16 contributions that were presented during the International Scientific Conference of the AG on ``Astronomical Instruments and Methods at the Turn of the 21st Century'' at Heidelberg, September 14 to 19, 1998.
Gaze, David C; Prante, Christian; Dreier, Jens; Knabbe, Cornelius; Collet, Corinne; Launay, Jean-Marie; Franekova, Janka; Jabor, Antonin; Lennartz, Lieselotte; Shih, Jessie; del Rey, Jose Manuel; Zaninotto, Martina; Plebani, Mario; Collinson, Paul O
2014-06-01
Galectin-3 is secreted from macrophages and binds and activates fibroblasts forming collagen. Tissue fibrosis is central to the progression of chronic heart failure (CHF). We performed a European multicentered evaluation of the analytical performance of the two-step routine and Short Turn-Around-Time (STAT) galectin-3 immunoassay on the ARCHITECT i1000SR, i2000SR, and i4000SR (Abbott Laboratories). We evaluated the assay precision and dilution linearity for both routine and STAT assays and compared serum and plasma, and fresh vs. frozen samples. The reference interval and biological variability were also assessed. Measurable samples were compared between ARCHITECT instruments and between the routine and STAT assays and also to a galectin-3 ELISA (BG Medicine). The total assay coefficient of variation (CV%) was 2.3%-6.2% and 1.7%-7.4% for the routine and STAT assays, respectively. Both assays demonstrated linearity up to 120 ng/mL. Galectin-3 concentrations were higher in plasma samples than in serum samples and correlated well between fresh and frozen samples (R=0.997), between the routine and STAT assays, between the ARCHITECT i1000 and i2000 instruments and with the galectin-3 ELISA. The reference interval on 627 apparently healthy individuals (53% male) yielded upper 95th and 97.5th percentiles of 25.2 and 28.4 ng/mL, respectively. Values were significantly lower in subjects younger than 50 years. The galectin-3 routine and STAT assays on the Abbott ARCHITECT instruments demonstrated good analytical performance. Further clinical studies are required to demonstrate the diagnostic and prognostic potential of this novel marker in patients with CHF.
Westgard, Sten A
2016-06-01
To assess the analytical performance of instruments and methods on the Sigma scale through external quality assessment and proficiency testing data. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument-group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal, and Sigma-metrics were then calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of chemistry analytes expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers on three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities: quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself, and it is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult these data as part of their decision-making process. To confirm that these assessments are stable and reliable, a study examining more results over a longer time period should be conducted. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
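The grading step reduces to a one-line formula; a minimal sketch (TEa comes from the CLIA criterion and CV from the EQA/PT instrument-group SD, as in the paper; the numbers in the test are illustrative):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric for a laboratory method.

    Sigma = (TEa - |bias|) / CV, with all terms expressed as a
    percentage of the analyte value. TEa is the allowable total
    error (e.g. a CLIA proficiency-testing criterion); CV is the
    method imprecision. Methods above Five Sigma permit optimized,
    relaxed QC designs.
    """
    return (tea_pct - abs(bias_pct)) / cv_pct
```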
Life cycle management of analytical methods.
Parr, Maria Kristina; Schmidt, Alexander H
2018-01-05
In modern process management, the life cycle concept is gaining importance. It focuses on the total cost of a process from investment through operation to retirement. In recent years, interest in this concept has also grown for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrument qualification, continuous method performance verification and method transfer) and, finally, retirement of the method. Regulatory bodies, too, have become more aware of life cycle management for analytical methods: the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) and the United States Pharmacopeial Forum are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development increases method robustness, which reduces the effort needed for method performance verification and post-approval changes and minimizes the risk of method-related out-of-specification results. This contributes strongly to reduced costs over the method's life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.
Letellier, G; Desjarlais, F
1985-12-01
We have investigated the effect of 20 drugs on the accuracy of results obtained from seven instruments now widely used in clinical biochemistry laboratories: Abbott VP, aca II, Cobas Bio, Ektachem 400, Hitachi 705, KDA and SMAC. Eleven to 18 constituents were analysed on each instrument. Our results lead us to the following conclusions: (1) only rarely does drug interference with a method lead to a clinically significant change in a measured value; (2) the magnitude of the change may relate linearly or non-linearly to the drug concentration but is usually independent of the target analyte concentration; (3) interference with a chemical reaction on one instrument does not always mean that the same reaction will be altered in the same way on other instruments; (4) no interferences were found for drugs with therapeutic levels in the low micromolar range; (5) in most cases the interference could not be predicted from the chemical nature of the drug.
Analytical Chemistry in Russia.
Zolotov, Yuri
2016-09-06
Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service addresses practical tasks in geological surveying, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development, and especially the manufacturing, of analytical instruments should be improved; even so, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.
Identification of Microorganisms by Modern Analytical Techniques.
Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica
2017-11-01
Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.
Degradation of glass artifacts: application of modern surface analytical techniques.
Melcher, Michael; Wiesinger, Rita; Schreiner, Manfred
2010-06-15
A detailed understanding of the stability of glasses toward liquid or atmospheric attack is of considerable importance for preserving numerous objects of our cultural heritage. Glasses produced in the ancient periods (Egyptian, Greek, or Roman glasses), as well as modern glass, can be classified as soda-lime-silica glasses. In contrast, potash was used as a flux in medieval Northern Europe for the production of window panes for churches and cathedrals. The particular chemical composition of these potash-lime-silica glasses (low in silica and rich in alkali and alkaline earth components), in combination with increased levels of acidifying gases (such as SO(2), CO(2), NO(x), or O(3)) and airborne particulate matter in today's urban or industrial atmospheres, has resulted in severe degradation of important cultural relics, particularly over the last century. Rapid developments in the fields of microelectronics and computer sciences, however, have contributed to the development of a variety of nondestructive, surface analytical techniques for the scientific investigation and material characterization of these unique and valuable objects. These methods include scanning electron microscopy in combination with energy- or wavelength-dispersive spectrometry (SEM/EDX or SEM/WDX), secondary ion mass spectrometry (SIMS), and atomic force microscopy (AFM). In this Account, we address glass analysis and weathering mechanisms, exploring the possibilities (and limitations) of modern analytical techniques. Corrosion by liquid substances is well investigated in the glass literature. In a tremendous number of case studies, the basic reaction between aqueous solutions and the glass surfaces was identified as an ion-exchange reaction between hydrogen-bearing species of the attacking liquid and the alkali and alkaline earth ions in the glass, causing a depletion of the latter in the outermost surface layers. Although mechanistic analogies to liquid corrosion are obvious, atmospheric
ERIC Educational Resources Information Center
Chemical and Engineering News, 1979
1979-01-01
Surveys the state of commercial development of analytical instrumentation as reflected by the Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. Includes optical spectroscopy, liquid chromatography, magnetic spectrometers, and x-ray. (Author/MA)
NASA Technical Reports Server (NTRS)
Panda, Binayak; Gorti, Sridhar
2013-01-01
A number of research instruments are available at NASA's Marshall Space Flight Center (MSFC) to support ISS researchers and their investigations. These modern analytical tools yield valuable, and sometimes new, information resulting from sample characterization. Instruments include modern scanning electron microscopes equipped with field-emission guns, providing analytical capabilities that include angstrom-level image resolution of dry, wet and biological samples. These microscopes are also equipped with silicon drift X-ray detectors (SDD) for fast yet precise analytical mapping of phases, as well as electron backscatter diffraction (EBSD) units to map grain orientations in crystalline alloys. Sample chambers admit large samples and provide variable pressures for wet samples, and quantitative analysis software determines phase relations. Advances in solid-state electronics have also enabled improvements in surface chemical analysis that are successfully employed to analyze metallic materials and alloys, ceramics, slags, and organic polymers. Another analytical capability at MSFC is a magnetic-sector Secondary Ion Mass Spectrometry (SIMS) instrument that quantitatively determines and maps light elements such as hydrogen, lithium, and boron along with their isotopes, and identifies and quantifies very low-level impurities, even at parts-per-billion (ppb) levels. Still other methods available at MSFC include X-ray photoelectron spectroscopy (XPS), which can determine oxidation states of elements as well as identify polymers and measure film thicknesses on coated materials; scanning Auger electron spectroscopy (SAM), which combines surface sensitivity, lateral spatial resolution (approximately 20 nm) and depth-profiling capabilities to describe elemental compositions in near-surface regions and even the chemical state of the analyzed atoms; a conventional Transmission Electron Microscope (TEM) for observing internal microstructures at very high magnifications; and the Electron Probe
The NASA modern technology rotors program
NASA Technical Reports Server (NTRS)
Watts, M. E.; Cross, J. L.
1986-01-01
Existing helicopter databases are based on work conducted on 'old-technology' rotor systems. The Modern Technology Rotors (MTR) Program aims to provide extensive databases on rotor systems using present and emerging technology. The MTR is concerned with modern, four-bladed rotor systems presently being manufactured or under development. Aspects of the MTR philosophy are considered along with instrumentation, the MTR test program, the BV 360 rotor, and the UH-60 Black Hawk. The program phases include computer modeling, shake test, model-scale test, minimally instrumented flight test, extensively pressure-instrumented-blade flight test, and full-scale wind tunnel test.
NASA Astrophysics Data System (ADS)
Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.
2017-11-01
Observing the physical librations of celestial bodies, and of the Moon in particular, is one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper reviews recent advances in studying the Moon's structure using various methods of obtaining and applying lunar physical libration (LPhL) data. LPhL simulation methods are described for assessing the viscoelastic and dissipative properties of the lunar body and the parameters of the lunar core, whose existence has recently been confirmed during reprocessing of seismic data from the "Apollo" space missions. Much attention is paid to the physical interpretation of the free libration phenomenon and the methods for its determination. The practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed; these tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. An efficiency analysis of the two approaches to LPhL theory, numerical and analytical, is also conducted. It is shown that in lunar investigations the two approaches complement each other: the numerical approach provides the high accuracy of the theory required for the proper processing of modern observations, while the analytical approach makes it possible to comprehend the essence of the phenomena in lunar rotation and to predict and interpret new effects in observations of lunar body and lunar core parameters.
Veronesi, Umberto; Martinón-Torres, Marcos
2018-06-18
Glass distillation equipment from an early modern alchemical laboratory was analyzed for its technology of manufacture and potential origin. Chemical data show that the assemblage can be divided into sodium-rich, colorless distillation vessels made with glass from Venice or its European imitation, and potassium-rich dark-brown non-specialized forms produced within the technological tradition of forest glass typical for central and north-western Europe. These results complete our understanding of the supply of technical apparatus at one of the best-preserved alchemical laboratories and highlight an early awareness of the need for high-quality instruments to guarantee the successful outcome of specialized chemical operations. This study demonstrates the potential of archaeological science to inform historical research around the practice of early chemistry and the development of modern science. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.
Malá, Zdena; Gebauer, Petr; Boček, Petr
2017-01-01
This review presents a survey of papers on analytical ITP published from 2014 through the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view of its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies, including the use of porous media and new on-chip assays, where ITP is often included in a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Designing new guides and instruments using McStas
NASA Astrophysics Data System (ADS)
Farhi, E.; Hansen, T.; Wildes, A.; Ghosh, R.; Lefmann, K.
With the increasing complexity of modern neutron-scattering instruments, powerful tools to optimize their geometry and physical performance (flux, resolution, divergence, etc.) have become essential. As the usual analytical methods reach their limits of validity in describing fine effects, the use of Monte Carlo simulations, which can handle such effects, has become widespread. The McStas program was developed at Risø National Laboratory in order to provide neutron-scattering instrument scientists with an efficient and flexible tool for building Monte Carlo simulations of guides, neutron optics, and instruments [1]. To date, the McStas package has been extensively used at the Institut Laue-Langevin, Grenoble, France, for various studies including cold and thermal guides with ballistic geometry, diffractometers, and triple-axis, backscattering, and time-of-flight spectrometers [2]. In this paper, we present simulation results for different guide geometries that may be used in the future at the Institut Laue-Langevin. Gain factors ranging from two to five may be obtained for the integrated intensities, depending on the exact geometry, the guide coatings, and the source.
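The core of such guide simulations is ray tracing with a reflectivity rule at each wall bounce. As a rough illustration only (a 2-D toy model, not McStas itself; the uniform source, sharp critical-angle cutoff, and guide dimensions are all simplifying assumptions):

```python
import math
import random

def guide_transmission(n=20000, length=30.0, width=0.03,
                       theta_c=0.01, seed=1):
    """Estimate by Monte Carlo the fraction of neutrons transmitted
    through a straight 2-D mirror guide. Rays hitting a wall at a
    grazing angle above the critical angle theta_c are absorbed;
    shallower rays reflect specularly."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        y = rng.uniform(-width / 2, width / 2)          # entry position
        theta = rng.uniform(-2 * theta_c, 2 * theta_c)  # divergence (rad)
        x, alive = 0.0, True
        while alive and x < length:
            # distance along the guide to the wall the ray heads toward
            if theta > 0:
                dx = (width / 2 - y) / math.tan(theta)
            elif theta < 0:
                dx = (-width / 2 - y) / math.tan(theta)
            else:
                dx = float("inf")
            if x + dx >= length:          # exits before hitting a wall
                break
            x += dx
            y = width / 2 if theta > 0 else -width / 2
            if abs(theta) > theta_c:      # too steep: absorbed
                alive = False
            theta = -theta                # specular reflection
        if alive:
            passed += 1
    return passed / n
```

Shortening the guide (or raising the critical angle via a better coating) raises the transmitted fraction, which is the kind of trade-off a full McStas study quantifies with realistic components.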
Annual banned-substance review: Analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans
2018-01-01
Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.
Rogstad, Sarah; Pang, Eric; Sommers, Cynthia; Hu, Meng; Jiang, Xiaohui; Keire, David A; Boyne, Michael T
2015-11-01
Glatiramer acetate (GA) is a mixture of synthetic copolymers consisting of four amino acids (glutamic acid, lysine, alanine, and tyrosine) with a labeled molecular weight range of 5000 to 9000 Da. GA is marketed as Copaxone™ by Teva for the treatment of multiple sclerosis. Here, the agency has evaluated the structure and composition of GA and a commercially available comparator, Copolymer-1. Modern analytical technologies that can characterize these complex mixtures are desirable for analysis of their comparability and structural "sameness." In the studies herein, a molecular fingerprinting approach is taken using accurate-mass mass spectrometry (MS) analysis, nuclear magnetic resonance (NMR) (1D-(1)H-NMR, 1D-(13)C-NMR, and 2D NMR), and asymmetric flow field-flow fractionation (AFFF) coupled with multi-angle light scattering (MALS) for an in-depth characterization of three lots of the marketplace drug and a formulated sample of the comparator. Statistical analyses were applied to the MS and AFFF-MALS data to assess these methods' ability to detect analytical differences in the mixtures. The combination of multiple orthogonal measurements by liquid chromatography coupled with MS (LC-MS), AFFF-MALS, and NMR on the same sample set was found to be fit for the intended purpose of distinguishing analytical differences between these complex mixtures of peptide chains.
A review of modern instrumental techniques for measurements of ice cream characteristics.
Bahram-Parvar, Maryam
2015-12-01
There is an increasing demand from the food industries and research institutes for means of measurement that allow the characterization of foods. Ice cream, as a complex food system, consists of a frozen matrix containing air bubbles, fat globules, ice crystals, and an unfrozen serum phase. Deficiencies in conventional methods for testing this product encourage the use of alternative techniques such as rheometry, spectroscopy, X-ray methods, electro-analytical techniques, ultrasound, and lasers. Despite the development of novel instrumental applications in food science, the use of some of them in ice cream testing is still limited but has shown promising results. Developing these novel methods should increase our understanding of the characteristics of ice cream and may allow online testing of the product. This review article discusses the potential of destructive and non-destructive methodologies in determining the quality and characteristics of ice cream and similar products. Copyright © 2015. Published by Elsevier Ltd.
Analytical techniques for steroid estrogens in water samples - A review.
Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza
2016-12-01
In recent years, environmental concerns over ultra-trace levels of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The techniques highlighted in this review were gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental techniques (GC-MS and LC-MS) and non-instrumental techniques (ELISA, RIA, the YES assay, and the E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Modern Approach to College Analytical Chemistry.
ERIC Educational Resources Information Center
Neman, R. L.
1983-01-01
Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekechukwu, A.
This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO), and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.
Integration of analytical instruments with computer scripting.
Carvalho, Matheus C
2013-08-01
Automation of laboratory routines aided by computer software enables high productivity and is the norm nowadays. However, the integration of different instruments made by different suppliers is still difficult, because to accomplish it, the user must have knowledge of electronics and/or low-level programming. An alternative approach is to control different instruments without an electronic connection between them, relying only on their software interface on a computer. This can be achieved through scripting, which is the emulation of user operations (mouse clicks and keyboard inputs) on the computer. The main advantages of this approach are its simplicity, which enables people with minimal knowledge of computer programming to employ it, and its universality, which enables the integration of instruments made by different suppliers, meaning that the user is totally free to choose the devices to be integrated. Therefore, scripting can be a useful, accessible, and economic solution for laboratory automation.
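The approach above is concrete enough to sketch: a declarative "script" of UI actions is dispatched to a backend, and only the backend needs to know how to drive a particular vendor's software. Here a recording backend stands in for a real GUI-automation layer (wiring the same actions to a tool such as pyautogui is an assumption, not something shown in the paper):

```python
# Minimal sketch of script-driven instrument integration. Each step is an
# (action, payload) pair; the backend decides how to execute it, so the
# same script could target a recorder (below) or a GUI-automation tool.

def run_script(steps, backend):
    """Dispatch each (action, payload) step to the backend in order."""
    for action, payload in steps:
        backend(action, payload)

class RecordingBackend:
    """Stand-in backend that logs actions instead of moving the mouse."""
    def __init__(self):
        self.log = []

    def __call__(self, action, payload):
        self.log.append((action, payload))

# A toy "method": start the autosampler software, then trigger acquisition
# in a separate detector program (names are hypothetical placeholders).
script = [
    ("click",  {"target": "autosampler_start_button"}),
    ("type",   {"text": "sample_042"}),
    ("hotkey", {"keys": ("ctrl", "s")}),
    ("click",  {"target": "detector_acquire_button"}),
]

backend = RecordingBackend()
run_script(script, backend)
```

Separating the script from the backend mirrors the paper's point: no electronic link between instruments is needed, only their software interfaces on one computer.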
Recommendations for fluorescence instrument qualification: the new ASTM Standard Guide.
DeRose, Paul C; Resch-Genger, Ute
2010-03-01
Aimed at improving quality assurance and quantitation for modern fluorescence techniques, ASTM International (ASTM) is about to release a Standard Guide for Fluorescence, reviewed here. The guide's main focus is on steady state fluorometry, for which available standards and instrument characterization procedures are discussed along with their purpose, suitability, and general instructions for use. These include the most relevant instrument properties needing qualification, such as linearity and spectral responsivity of the detection system, spectral irradiance reaching the sample, wavelength accuracy, sensitivity or limit of detection for an analyte, and day-to-day performance verification. With proper consideration of method-inherent requirements and limitations, many of these procedures and standards can be adapted to other fluorescence techniques. In addition, procedures for the determination of other relevant fluorometric quantities including fluorescence quantum yields and fluorescence lifetimes are briefly introduced. The guide is a clear and concise reference geared for users of fluorescence instrumentation at all levels of experience and is intended to aid in the ongoing standardization of fluorescence measurements.
Goicoechea, Héctor C; Olivieri, Alejandro C; Tauler, Romà
2010-03-01
Correlation constrained multivariate curve resolution-alternating least-squares is shown to be a feasible method for processing first-order instrumental data and achieve analyte quantitation in the presence of unexpected interferences. Both for simulated and experimental data sets, the proposed method could correctly retrieve the analyte and interference spectral profiles and perform accurate estimations of analyte concentrations in test samples. Since no information concerning the interferences was present in calibration samples, the proposed multivariate calibration approach including the correlation constraint facilitates the achievement of the so-called second-order advantage for the analyte of interest, which is known to be present for more complex higher-order richer instrumental data. The proposed method is tested using a simulated data set and two experimental data systems, one for the determination of ascorbic acid in powder juices using UV-visible absorption spectral data, and another for the determination of tetracycline in serum samples using fluorescence emission spectroscopy.
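The alternating least-squares core of multivariate curve resolution can be sketched in a few lines. This toy version applies only a non-negativity constraint to synthetic two-component data (analyte plus an unexpected interference); the correlation constraint that gives the paper its quantitative calibration step is deliberately omitted:

```python
import numpy as np

def mcr_als(D, S0, n_iter=200):
    """Bare-bones MCR-ALS: resolve D (samples x wavelengths) into
    non-negative concentration profiles C and spectra S with
    D ~ C @ S.T, starting from spectral estimates S0."""
    S = S0.copy()
    for _ in range(n_iter):
        # alternate least-squares updates, clipped to non-negative values
        C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0, None)
        S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0, None)
    return C, S

# Synthetic system: analyte + interference as Gaussian spectral bands.
rng = np.random.default_rng(0)
wl = np.linspace(0, 1, 50)
S_true = np.vstack([np.exp(-((wl - 0.3) / 0.10) ** 2),
                    np.exp(-((wl - 0.7) / 0.15) ** 2)]).T   # 50 x 2
C_true = rng.uniform(0.1, 1.0, size=(20, 2))                # 20 x 2
D = C_true @ S_true.T + rng.normal(0, 1e-4, (20, 50))

# Initialize near the true spectra, as a feasibility-only demonstration.
C_hat, S_hat = mcr_als(D, S_true + rng.normal(0, 0.01, S_true.shape))
```

In the paper, an additional constraint correlating resolved analyte concentrations with reference calibration values is what turns this bilinear decomposition into an accurate quantitation tool despite uncalibrated interferences.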
da Veiga Soares Carvalho, Maria Cláudia; Luz, Madel Therezinha; Prado, Shirley Donizete
2011-01-01
Eating, nourishment, and nutrition circulate in our culture as synonyms and thus fail to account for the changes occurring in nourishment which, intended or not, follow a pattern of hybridization that represents a change of rules and food preferences. This paper takes these common-sense conceptions as analytic categories for analyzing and interpreting research in the humanities and health sciences from a theoretical perspective, through conceptualization. Food is associated with a natural (biological) function, a concept in which nature is opposed to culture, while nourishment takes on cultural (symbolic) meanings, expressing the division of labor and wealth, a historical and cultural creation through which one can study a society. Nutrition is attributed a sense of rational action, derived from the constitution of this science in modernity and inserted in a historical process of scientific rationalization of eating and nourishing. We believe that through the practice of conceptualization in interdisciplinary research, which involves a shared space of knowledge, we can be less constrained by a unified theoretical model of learning and freer to think about life issues.
Rossum, Huub H van; Kemperman, Hans
2017-07-26
General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method for optimizing MAs. A new method was applied to optimize MAs for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure; optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied to 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; they were caused by ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (a significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, when an MA alarm required follow-up, a manageable number of MA alarms was generated, and those alarms proved valuable. For the management of MA alarms, several applications/requirements in the MA management software will simplify the use of MA procedures.
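A bare-bones MA alarm rule illustrates the mechanism being optimized; the window size, target, and control limit here are arbitrary placeholders for the per-assay settings the paper derives from bias-detection simulations:

```python
from collections import deque

def moving_average_qc(results, window=20, target=100.0, limit=3.0):
    """Flag an MA alarm whenever the moving average of the last
    `window` patient results drifts more than `limit` units from
    `target`. Returns (index, moving_average) pairs for each alarm."""
    buf, alarms = deque(maxlen=window), []
    for i, x in enumerate(results):
        buf.append(x)
        if len(buf) == window:
            ma = sum(buf) / window
            if abs(ma - target) > limit:
                alarms.append((i, ma))
    return alarms
```

With stable results near the target no alarms fire; a sustained assay shift moves the moving average past the limit within roughly one window of results, which is the bias-detection delay the optimization procedure tunes.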
Canine olfaction as an alternative to analytical instruments for ...
Recent literature has touted the use of canine olfaction as a diagnostic tool for identifying pre-clinical disease status, especially cancer and infection, from biological media samples. Studies have shown a wide range of outcomes, ranging from almost perfect discrimination all the way to essentially random results. This disparity is not likely to be a detection issue; dogs have been shown to have extremely sensitive noses, as proven by their use for tracking, bomb detection, and search and rescue. However, in contrast to analytical instruments, dogs are subject to boredom, fatigue, hunger, and external distractions. These challenges are of particular importance in a clinical environment, where task repetition is prized but not as entertaining for a dog as chasing odours outdoors. The question addressed here is how to exploit the intrinsic sensitivity and simplicity of having a dog simply sniff out disease, in the face of variability in behavior and response. There is no argument that living cells emanate a variety of gas- and liquid-phase compounds as waste from normal metabolism, and that these compounds become measurable from various biological media including skin, blood, urine, breath, feces, etc. [1, 2] The overarching term for this phenomenon from the perspective of systems biology analysis is "cellular respiration", which has become an important topic for the interpretation and documentation of the human exposome, the chemical counterpart to the genome.
Automated novel high-accuracy miniaturized positioning system for use in analytical instrumentation
NASA Astrophysics Data System (ADS)
Siomos, Konstadinos; Kaliakatsos, John; Apostolakis, Manolis; Lianakis, John; Duenow, Peter
1996-01-01
The development of three-dimensional automated positioning devices (micro-robots) for applications in analytical instrumentation, clinical chemical diagnostics, and advanced laser optics depends strongly on the ability of such a device: firstly, to be positioned automatically and with high accuracy and reliability by means of user-friendly interface techniques; secondly, to be compact; and thirdly, to operate under vacuum conditions, free of most of the problems connected with conventional micropositioners using stepping-motor gear techniques. The objective of this paper is to develop and construct a mechanically compact, computer-based micropositioning system for coordinated motion in the X-Y-Z directions with: (1) a positioning accuracy of less than 1 micrometer (the accuracy of the end position of the system is controlled by a hardware/software assembly using a self-constructed optical encoder); (2) a heat-free propulsion mechanism for vacuum operation; and (3) synchronized X-Y motion.
Kounali, Daphne Z; Button, Katherine S; Lewis, Glyn; Ades, Anthony E
2016-09-01
We present a meta-analytic method that combines information on treatment effects from different instruments from a network of randomized trials to estimate instrument relative responsiveness. Five depression-test instruments [Beck Depression Inventory (BDI I/II), Patient Health Questionnaire (PHQ9), Hamilton Rating for Depression 17 and 24 items, Montgomery-Asberg Depression Rating] and three generic quality of life measures [EuroQoL (EQ-5D), SF36 mental component summary (SF36 MCS), and physical component summary (SF36 PCS)] were compared. Randomized trials of treatments for depression reporting outcomes on any two or more of these instruments were identified. Information on the within-trial ratios of standardized treatment effects was pooled across the studies to estimate relative responsiveness. The between-instrument ratios of standardized treatment effects vary across trials, with a coefficient of variation of 13% (95% credible interval: 6%, 25%). There were important differences between the depression measures, with PHQ9 being the most responsive instrument and BDI the least. Responsiveness of the EQ-5D and SF36 PCS was poor. SF36 MCS performed similarly to depression instruments. Information on relative responsiveness of several test instruments can be pooled across networks of trials reporting at least two outcomes, allowing comparison and ranking of test instruments that may never have been compared directly. Copyright © 2016 Elsevier Inc. All rights reserved.
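The within-trial ratio pooling can be illustrated with a deliberately simplified estimator: an unweighted geometric mean of the per-trial ratios of standardized effects, standing in for the full Bayesian network meta-analysis used in the study (no between-trial variance model, no precision weighting):

```python
import numpy as np

def pooled_responsiveness_ratio(effects_a, effects_b):
    """Pool within-trial ratios of standardized treatment effects for two
    instruments administered in the same trials. `effects_a` and
    `effects_b` hold one standardized effect size (e.g., Cohen's d)
    per trial. Averaging on the log scale keeps ratios symmetric
    (a ratio of 2 and a ratio of 1/2 cancel)."""
    ratios = np.asarray(effects_a, dtype=float) / np.asarray(effects_b, dtype=float)
    return float(np.exp(np.mean(np.log(ratios))))
```

A pooled ratio above 1 would indicate instrument A is more responsive to the treatment effect than instrument B, the kind of ranking (e.g., PHQ9 over BDI) that the network analysis produces across instruments never compared directly.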
NASA Astrophysics Data System (ADS)
Franz, Heather B.; Trainer, Melissa G.; Wong, Michael H.; Manning, Heidi L. K.; Stern, Jennifer C.; Mahaffy, Paul R.; Atreya, Sushil K.; Benna, Mehdi; Conrad, Pamela G.; Harpold, Dan N.; Leshin, Laurie A.; Malespin, Charles A.; McKay, Christopher P.; Nolan, J. Thomas; Raaen, Eric
2014-06-01
The Sample Analysis at Mars (SAM) instrument suite is the largest scientific payload on the Mars Science Laboratory (MSL) Curiosity rover, which landed in Mars' Gale Crater in August 2012. As a miniature geochemical laboratory, SAM is well-equipped to address multiple aspects of MSL's primary science goal, characterizing the potential past or present habitability of Gale Crater. Atmospheric measurements support this goal through compositional investigations relevant to martian climate evolution. SAM instruments include a quadrupole mass spectrometer, a tunable laser spectrometer, and a gas chromatograph that are used to analyze martian atmospheric gases as well as volatiles released by pyrolysis of solid surface materials (Mahaffy et al., 2012). This report presents analytical methods for retrieving the chemical and isotopic composition of Mars' atmosphere from measurements obtained with SAM's quadrupole mass spectrometer. It provides empirical calibration constants for computing volume mixing ratios of the most abundant atmospheric species and analytical functions to correct for instrument artifacts and to characterize measurement uncertainties. Finally, we discuss differences in volume mixing ratios of the martian atmosphere as determined by SAM (Mahaffy et al., 2013) and Viking (Owen et al., 1977; Oyama and Berdahl, 1977) from an analytical perspective. Although the focus of this paper is atmospheric observations, much of the material concerning corrections for instrumental effects also applies to reduction of data acquired with SAM from analysis of solid samples. The Sample Analysis at Mars (SAM) instrument measures the composition of the martian atmosphere. Rigorous calibration of SAM's mass spectrometer was performed with relevant gas mixtures. Calibration included derivation of a new model to correct for electron multiplier effects. Volume mixing ratios for Ar and N2 obtained with SAM differ from those obtained with Viking. Differences between SAM and Viking
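The basic shape of such a retrieval, converting background-corrected count rates into volume mixing ratios via per-species calibration constants, can be sketched as follows. The counts and relative sensitivity factors below are invented placeholders, not the SAM calibration values:

```python
def volume_mixing_ratios(counts, sensitivity):
    """Convert background-corrected detector count rates into volume
    mixing ratios. Dividing each species' counts by its relative
    sensitivity factor gives a relative abundance; normalizing the
    abundances to sum to 1 gives the mixing ratios."""
    abundance = {sp: counts[sp] / sensitivity[sp] for sp in counts}
    total = sum(abundance.values())
    return {sp: a / total for sp, a in abundance.items()}

# Hypothetical example: a CO2-dominated atmosphere with N2 and Ar.
counts = {"CO2": 9.5e5, "N2": 2.4e4, "Ar": 2.1e4}
sensitivity = {"CO2": 1.0, "N2": 0.9, "Ar": 1.1}
vmr = volume_mixing_ratios(counts, sensitivity)
```

The paper's analytical functions additionally correct for instrument artifacts (e.g., electron-multiplier effects) before this normalization step and propagate the associated measurement uncertainties.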
The modern trends in space electromagnetic instrumentation
NASA Astrophysics Data System (ADS)
Korepanov, V. E.
Future trends in experimental plasma physics in outer space demand ever more exact and sophisticated scientific instrumentation. Moreover, the situation is complicated by the constant reduction of financial support for scientific research, even in leading countries. This has resulted in the development of mini-, micro-, and nanosatellites with low cost and short preparation time and, consequently, has provoked the creation of a new generation of scientific instruments with reduced weight and power consumption but improved metrological parameters. The recent state of development of electromagnetic (EM) sensors for microsatellites is reported. For flux-gate magnetometers (FGM), the reduction in weight and power consumption was achieved not only through the use of new electronic components but also through the development of a new operation mode. Scientific and technological study has decreased FGM noise: the typical noise figure is now about 10 picotesla rms at 1 Hz, and the record one is below 1 picotesla. A super-light version of the search-coil magnetometer (SCM) was created as the result of intensive research. These new SCMs can have about six decades of operational frequency band with an upper limit of ~1 MHz and a noise level of a few femtotesla, with a total weight of about 75 grams including electronics. A new instrument, the wave probe (WP), which combines three independent sensors in one body (an SCM, a split Langmuir probe, and an electric potential sensor), was created. The developed theory confirms that the WP can directly measure the wave vector components in space plasmas.
Storey, Andrew P; Hieftje, Gary M
2016-12-01
Over the last several decades, science has benefited tremendously from the implementation of digital electronic components in analytical instrumentation. A pioneer in this area of scientific inquiry was Howard Malmstadt. Frequently, such revolutions in scientific history are viewed as a series of discoveries, without much attention to how mentorship shapes the careers and methodologies of those who made great strides forward for science. This paper focuses on the verifiable relationships of those who are connected through the academic tree of Malmstadt and how their experiences and the context of world events influenced their scientific pursuits. Particular attention is dedicated to the development of American chemistry departments and the critical role played by many of the individuals in the tree in this process. © The Author(s) 2016.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekechukwu, A
Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO), and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.
xQuake: A Modern Approach to Seismic Network Analytics
NASA Astrophysics Data System (ADS)
Johnson, C. E.; Aikin, K. E.
2017-12-01
While seismic networks have expanded over the past few decades and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation presents the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, which is essentially a self-organizing graph database: an xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach for event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; this is not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for the store-and-forward rings). The xQuake system is being released under an unrestricted open-source license to encourage and enable the seismic community to support further development of its capabilities.
Teaching social responsibility in analytical chemistry.
Valcárcel, M; Christian, G D; Lucena, R
2013-07-02
Analytical chemistry is key to the functioning of a modern society. From early days, ethics in measurements have been a concern and that remains today, especially as we have come to rely more on the application of analytical science in many aspects of our lives. The main aim of this Feature is to suggest ways of introducing the topic of social responsibility and its relation to analytical chemistry in undergraduate or graduate chemistry courses.
Accommodating subject and instrument variations in spectroscopic determinations
Haas, Michael J [Albuquerque, NM; Rowe, Robert K [Corrales, NM; Thomas, Edward V [Albuquerque, NM
2006-08-29
A method and apparatus for measuring a biological attribute, such as the concentration of an analyte, particularly a blood analyte such as glucose in tissue. The method utilizes spectrographic techniques in conjunction with an improved instrument-tailored or subject-tailored calibration model. In a calibration phase, calibration model data are modified to reduce or eliminate instrument-specific attributes, resulting in a calibration data set modeling intra-instrument or intra-subject variation. In a prediction phase, the prediction process is tailored for each target instrument separately using a minimal number of spectral measurements from each instrument or subject.
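The tailoring idea, removing instrument-specific structure during calibration and re-anchoring predictions with a few reference measurements on the target instrument, can be sketched as a simplified linear model. This is an illustrative sketch under that assumption, not the patented procedure itself:

```python
import numpy as np

def tailored_model(X_cal, y_cal, groups):
    """Fit a least-squares calibration after subtracting each
    instrument's (or subject's) mean spectrum and mean reference
    value, so the model captures intra-instrument variation only.
    groups[i] identifies the instrument that produced X_cal[i]."""
    X = X_cal.astype(float).copy()
    y = np.asarray(y_cal, dtype=float).copy()
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        X[idx] -= X[idx].mean(axis=0)      # remove instrument signature
        y[idx] -= y[idx].mean()
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict(coef, x_new, x_ref, y_ref):
    """Predict on a target instrument, tailored with its own small set
    of reference measurements (mean spectrum x_ref, mean value y_ref)."""
    return float(y_ref + (x_new - x_ref) @ coef)

# Synthetic demo: two instruments whose spectra differ by fixed offsets.
rng = np.random.default_rng(1)
S = rng.normal(size=(12, 3))                  # latent "true" spectra
w = np.array([1.0, 2.0, -1.0])                # true regression vector
groups = [0] * 6 + [1] * 6
offsets = {0: np.array([5.0, 0.0, -3.0]), 1: np.array([-2.0, 4.0, 1.0])}
X = np.vstack([S[i] + offsets[g] for i, g in enumerate(groups)])
y = S @ w
coef = tailored_model(X, y, groups)

# Tailor predictions to instrument 1 using its calibration measurements.
x_ref, y_ref = X[6:].mean(axis=0), y[6:].mean()
s_new = rng.normal(size=3)
pred = predict(coef, s_new + offsets[1], x_ref, y_ref)
```

Because the instrument offset cancels in both the centered calibration and the (x_new - x_ref) difference, the model never needs to learn inter-instrument variation, which is the claimed advantage of the tailored approach.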
A Business Case for Nuclear Plant Control Room Modernization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Ken; Lawrie, Sean; Niedermuller, Josef M.
This paper presents a generic business case for implementation of technology that supports Control Room Modernization (CRM). The analysis is presented in two forms: 1) a standalone technology upgrade, and 2) a technology upgrade that is built upon and incremental to a prior business case created for Mobile Work Packages (MWP). The business case contends that advanced communication, networking, and analytical technologies will allow NPPs to conduct control room operations with improved focus by reducing human-factors issues and redundant manpower, and therefore to operate with fewer errors. While some labor savings can be harvested in terms of overtime, the majority of savings are demonstrated as reduced time to take the plant offline and bring it back online in support of outages. The benefits are quantified to a rough order of magnitude that provides directional guidance to NPPs that are interested in developing a similar business case. This business case focuses on modernization of the operator control room and does not consider a complete overhaul and modernization of a plant's instrument and control systems. While operators may be considering such an investment at their plants, the sizable capital investment required is not likely supported by a cost/benefit analysis alone. More likely, it is driven by obsolescence and reliability issues, and requires consideration of the mechanical condition of plant systems, capital depreciation, financing, relicensing, and the overall viability of the plant asset over a 20-year horizon in a competitive market. Prior studies [REF] have indicated that such a modernization of plant I&C systems, alone or as part of a larger modernization effort, can yield very significant reductions in O&M costs. However, the depth of research and analysis required to develop a meaningful business case for a plant modernization effort is well beyond the scope of this study. While CRM as considered in this study can be easily integrated as part of grander plant
Analytical Chemistry Laboratory Progress Report for FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.
NASA Astrophysics Data System (ADS)
Goodall, Clive
1993-08-01
A decisive and lethal response to a naive radical skepticism concerning the prospects for the existence of Extraterrestrial Intelligence is derivable from core areas of Modern Analytic Philosophy. The naive skeptical view is fundamentally flawed in the way it oversimplifies certain complex issues, failing as it does to recognize a special class of conceptual problems for what they really are and mistakenly treating them instead as empirical issues. Specifically, this skepticism is based upon an untenable, oversimplifying model of the 'mind-brain' relation. Moreover, independent logical considerations concerning the mind-brain relation provide evidential grounds for why we should in fact expect a priori that an Alien Intelligence will face constraints upon, and immense difficulties in, making its existence known by non-electromagnetic means.
Ideologeme "Order" in Modern American Linguistic World Image
ERIC Educational Resources Information Center
Ibatova, Aygul Z.; Vdovichenko, Larisa V.; Ilyashenko, Lubov K.
2016-01-01
The paper studies the topic of modern American linguistic world image. It is known that any language is the most important instrument of cognition of the world by a person but there is also no doubt that any language is the way of perception and conceptualization of this knowledge about the world. In modern linguistics linguistic world image is…
Science Update: Analytical Chemistry.
ERIC Educational Resources Information Center
Worthy, Ward
1980-01-01
Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)
Computer-Aided Instruction in Automated Instrumentation.
ERIC Educational Resources Information Center
Stephenson, David T.
1986-01-01
Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…
The modern trends in space electromagnetic instrumentation
NASA Astrophysics Data System (ADS)
Korepanov, V.
Future experimental plasma physics in outer space demands ever more exact and sophisticated scientific instrumentation. Moreover, the situation is complicated by the steady reduction of financial support for scientific research, even in the leading countries. This has resulted in the development of mini-, micro- and nanosatellites with low price and short preparation time, and consequently has driven the creation of a new generation of scientific instruments with reduced weight and power consumption but improved metrological parameters. The recent state of development of electromagnetic (EM) sensors for microsatellites is reported. The set of EM sensors produced at LCISR includes the following devices. Flux-gate magnetometers (FGM): the weight and power consumption of the new satellite versions of the FGM were reduced not only through the use of new electronic components but also through the development of new operation modes. In addition, scientific and technological studies have decreased FGM noise; the typical figure is now about 10 picotesla rms at 1 Hz, and the record is below 1 picotesla. Because of satellite weight reduction, the possibility of using the FGM alone for satellite attitude control was also studied; a magnetic orientation and stabilization system was developed and a new FGM for orientation was created. It uses industrial components, and special measures are taken to increase its reliability. Search-coil magnetometers (SCM): a super-light version of the SCM was created as the result of intensive scientific and technological research. These new SCMs offer an operational frequency band of about six decades with an upper limit of ~1 MHz, a noise level of a few femtotesla, and a total weight of about 75 grams. Electric probes (EP): the study of the operating conditions of EPs immersed in space plasma revealed possibilities to decrease the EP weight while conserving the same noise factor. Two types of EP operating from DC and from 0…
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety, and velocity. New methodologies, approaches, and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and deliver its value to different applications. Beyond the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects such as assessing data quality, selecting data pre-processing strategies, data visualization, and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets is briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Cervical Spine Instrumentation in Children.
Hedequist, Daniel J; Emans, John B
2016-06-01
Instrumentation of the cervical spine enhances stability and improves arthrodesis rates in children undergoing surgery for deformity or instability. Various morphologic and clinical studies have been conducted in children, confirming the feasibility of anterior or posterior instrumentation of the cervical spine with modern implants. Knowledge of the relevant spine anatomy and preoperative imaging studies can aid the clinician in understanding the pitfalls of instrumentation for each patient. Preoperative planning, intraoperative positioning, and adherence to strict surgical techniques are required given the small size of children. Instrumentation options include anterior plating, occipital plating, and a variety of posterior screw techniques. Complications related to screw malposition include injury to the vertebral artery, neurologic injury, and instrumentation failure.
Advances in analytical chemistry
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.
1991-01-01
Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
The importance of system band broadening in modern size exclusion chromatography.
Goyon, Alexandre; Guillarme, Davy; Fekete, Szabolcs
2017-02-20
In the last few years, highly efficient UHP-SEC columns packed with sub-3 μm particles have been commercialized by several providers. Besides the particle size reduction, the dimensions of modern SEC stationary phases (150 × 4.6 mm) were also modified compared to regular SEC columns (300 × 6 or 300 × 8 mm). Because the analytes are excluded from the pores in SEC, the retention factors are very low, ranging from -1…
Analytical Chemistry Laboratory
NASA Technical Reports Server (NTRS)
Anderson, Mark
2013-01-01
The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Caltech.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-10-01
This report presents the proceedings of the Specialists' Meeting on Experience in Aging, Maintenance and Modernization of Instrumentation and Control Systems for Improving Nuclear Power Plant Availability that was held at the Ramada Inn in Rockville, Maryland on May 5--7, 1993. The Meeting was presented in cooperation with the Electric Power Research Institute, Oak Ridge National Laboratory and the International Atomic Energy Agency. There were approximately 65 participants from 13 countries at the Meeting. Individual reports have been cataloged separately.
Instrument Attitude Precision Control
NASA Technical Reports Server (NTRS)
Juang, Jer-Nan
2004-01-01
A novel approach is presented in this paper to analyze attitude precision and control for an instrument gimbaled to a spacecraft subject to an internal disturbance caused by a moving component inside the instrument. Nonlinear differential equations of motion for some sample cases are derived and solved analytically to gain insight into the influence of the disturbance on the attitude pointing error. A simple control law is developed to eliminate the instrument pointing error caused by the internal disturbance. Several cases are presented to demonstrate and verify the concept presented in this paper.
[Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].
Tanaka, Koichi
2016-02-01
Analytical instruments for clinical use are commonly required to confirm the compounds and forms related to diseases with the highest possible sensitivity, quantitative performance, and specificity, with minimal invasiveness, within a short time, easily, and at a low cost. Advances in mass spectrometry (MS) have produced techniques that meet such requirements. Besides confirming known substances, MS has other uses and advantages that are not widely appreciated: it serves as a tool to discover unknown phenomena and compounds, for example in clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of protein and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, the development of new medicines, etc. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), and other individual basic technologies, we succeeded in discovering new disease biomarker candidates for Alzheimer's disease, cancer, etc. Further contributions of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communication with the medical front.
Safina, Gulnara
2012-01-27
Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes. That makes these compounds important targets to be detected, monitored, and identified. The identification of the carbohydrate content in their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most of the conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of directly measuring biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern analytical instrumental techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific bindings. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
Climate Analytics as a Service
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.
2014-01-01
Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
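The abstract does not specify the MERRA/AS API, but the MapReduce pattern it names can be illustrated with a toy sketch: per-record partial sums are emitted in the map step and merged per key in the reduce step, the same shape a data-proximal climate average would take. All names and values here are illustrative, not MERRA data.

```python
from functools import reduce

# Toy gridded "climate variable": (lat_band, value) pairs standing in for
# reanalysis cells. Bands and temperatures are made up for illustration.
records = [("tropics", 299.1), ("tropics", 300.4),
           ("midlat", 287.2), ("midlat", 285.9), ("polar", 252.3)]

# Map: emit (key, (sum, count)) partials, one per record.
mapped = [(band, (value, 1)) for band, value in records]

# Reduce: merge partials per key, then finish with sum / count.
def merge(acc, item):
    band, (s, c) = item
    prev_s, prev_c = acc.get(band, (0.0, 0))
    acc[band] = (prev_s + s, prev_c + c)
    return acc

totals = reduce(merge, mapped, {})
means = {band: s / c for band, (s, c) in totals.items()}
print(means["polar"])  # single-record band: 252.3
```

In a real deployment the map and reduce steps would run on separate nodes next to the data; the point of the sketch is only the commutative per-key merge that makes that distribution possible.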
NASA Astrophysics Data System (ADS)
Dalipi, Rogerta; Marguí, Eva; Borgese, Laura; Bilo, Fabjola; Depero, Laura E.
2016-06-01
Recent technological improvements have led to widespread adoption of benchtop total reflection X-ray fluorescence (TXRF) systems for the analysis of liquid samples. However, benchtop TXRF systems usually present limited sensitivity compared with large-scale instrumentation, which can restrict their application in some fields. The aim of the present work was to evaluate and compare the analytical capabilities of two TXRF systems, equipped with low-power Mo and W target X-ray tubes, for multielemental analysis of wine samples. Using the Mo-TXRF system, the detection limits for most elements were one order of magnitude lower than those attained using the W-TXRF system. For the detection of high-Z elements like Cd and Ag, however, W-TXRF remains a very good option due to the possibility of K-line detection. The accuracy and precision of the obtained results were evaluated by analyzing spiked real wine samples and comparing the TXRF results with those obtained by inductively coupled plasma optical emission spectroscopy (ICP-OES). In general, good agreement was obtained between ICP-OES and TXRF results for the analysis of both red and white wine samples, except for light elements (i.e., K), for which TXRF concentrations were underestimated. However, the analytical quality of TXRF results can be further improved if wine analysis is performed after dilution of the sample with de-ionized water.
40 CFR 1066.130 - Measurement instrument calibrations and verifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
40 CFR, Protection of Environment (2014-07-01), Air Pollution Controls, Vehicle-Testing Procedures -- Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications: § 1066.130, Measurement instrument calibrations and verifications.
ENANTIOMER-SPECIFIC FATE AND EFFECTS OF MODERN CHIRAL PESTICIDES
This slide presentation presents enantiomer-specific fate and effects of modern chiral pesticides. The research areas presented were analytical separation of enantiomers; environmental occurrence of enantiomers; transformation rates and enantioselectivity; bioaccumulation; and e...
Trends in Analytical Scale Separations.
ERIC Educational Resources Information Center
Jorgenson, James W.
1984-01-01
Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)
New software solutions for analytical spectroscopists
NASA Astrophysics Data System (ADS)
Davies, Antony N.
1999-05-01
minutes rather than hours. Fast networks now enable analysis of even multi-dimensional spectroscopic data sets remote from the measuring instrument. A strong tendency to opt for a more unified, substantially more user-friendly graphical user interface allows even inexperienced users to become rapidly acquainted with complex mathematical analyses. Some examples of new spectroscopic software products are given to demonstrate these points and highlight the ease of integration into a modern analytical spectroscopy workplace.
NASA Astrophysics Data System (ADS)
Cristescu, Corneliu; Drumea, Petrin; Krevey, Petrica
2009-01-01
This work presents the modern instrumentation used for monitoring and controlling the main parameters of a regenerative drive system used to recover the kinetic energy of motor vehicles, lost in the braking phase, and to store and reuse this energy in the starting or accelerating phases. A Romanian technical solution for a regenerative driving system is presented, based on a hybrid arrangement containing a hydro-mechanic module and an existing thermal motor drive, all conceived as a mechatronic system. To monitor and control the evolution of the main parameters, the system contains a series of sensors and transducers that provide torque, rotation speed, temperature, flow, and pressure values. The main sensors and transducers of the regenerative drive system, their principal features, and the technical connection solutions are presented in this paper, together with the managing electronic and informational subsystems.
The Interfaces Between Historical, Paleo-, and Modern Climatology
NASA Astrophysics Data System (ADS)
Mock, C. J.
2011-12-01
Historical climatology, commonly defined as the study of reconstructing past climates from documentary and early instrumental data, has routinely utilized data within the last several hundred years down to sub-daily temporal resolution prior to the advent of "modern" instrumental records beginning in the late 19th and 20th centuries. Historical climate reconstruction methods generally share similar aspects conducted in both paleoclimate reconstruction and modern climatology, given the need to quantify, calibrate, and conduct careful data quality assessments. Although some studies have integrated historical climatic studies with other high resolution paleoclimatic proxies, very few efforts have integrated historical data with modern "systematic" climate networks to further examine spatial and temporal patterns of climate variability. This presentation describes historical climate examples of how such data can be integrated within modern climate timescales, including examples of documentary data on tropical cyclones from the Western Pacific and Atlantic Basins, colonial records from Belize and Constantinople, ship logbooks in the Western Arctic, plantation diaries from the American Southeast, and newspaper data from the Fiji Islands and Bermuda. Some results include a unique wet period in Belize and active tropical cyclone periods in the Western and South Pacific in the early 20th century - both are not reflected in conventional modern climate datasets. Documentary data examples demonstrate high feasibility in further understanding extreme weather events at daily timeframes such as false spring/killing frost episodes and hydrological extremes in southeastern North America. Recent unique efforts also involve community participation, secondary education, and web-based volunteer efforts to digitize and archive historical weather and climate information.
[Artistic creativity in the light of Jungian analytical psychology].
Trixler, Mátyás; Gáti, Agnes; Tényi, Tamás
2010-01-01
C.G. Jung's analytical psychology points at important issues in the psychological understanding of creativity. The theories of the Collective Unconscious and the Archetypes contributed to important discoveries in the interpretation of artistic creativity. Jung was concerned to show the relevance of Analytical Psychology to the understanding of European Modernism. Our paper deals with a short Jungian interpretation of Csontvary's art, too.
NASA Astrophysics Data System (ADS)
van Gend, Carel; Lombaard, Briehan; Sickafoose, Amanda; Whittal, Hamish
2016-07-01
Until recently, software for instruments on the smaller telescopes at the South African Astronomical Observatory (SAAO) has not been designed for remote accessibility and frequently has not been developed using modern software best practice. We describe a software architecture we have implemented for use with new and upgraded instruments at the SAAO. The architecture was designed to allow for multiple components and to be fast, reliable, remotely operable, support different user interfaces, employ as much non-proprietary software as possible, and take future-proofing into consideration. Individual component drivers exist as standalone processes, communicating over a network. A controller layer coordinates the various components and allows a variety of user interfaces to be used. The Sutherland High-speed Optical Cameras (SHOC) instruments incorporate an Andor electron-multiplying CCD camera, a GPS unit for accurate timing, and a pair of filter wheels. We have applied the new architecture to the SHOC instruments, with the camera driver developed using Andor's software development kit. We have used this to develop an innovative web-based user interface to the instrument.
Standard NIM Instrumentation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costrell, Louis; Lenkszus, Frank R.; Rudnick, Stanley J.
NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev 4) dated July 1974. It includes all the addenda and errata items that were previously issued as well as numerous additional items to make the standard current with modern technology and manufacturing practice.
ERIC Educational Resources Information Center
Akarsu, Bayram
2011-01-01
In the present paper, we propose a new diagnostic test to measure students' conceptual knowledge of modern physics topics. In the few decades since the birth of physics education research (PER), many diagnostic instruments have been developed that measure students' conceptual understanding of various topics in physics; the earliest tests developed in PER are the Force…
Martins, Thomas B
2002-01-01
The ability of the Luminex system to simultaneously quantitate multiple analytes from a single sample source has proven to be a feasible and cost-effective technology for assay development. In previous studies, my colleagues and I introduced two multiplex profiles consisting of 20 individual assays into the clinical laboratory. With the Luminex instrument's ability to classify up to 100 distinct microspheres, however, we have only begun to realize the enormous potential of this technology. By utilizing additional microspheres, it is now possible to add true internal controls to each individual sample. During the development of a seven-analyte serologic viral respiratory antibody profile, internal controls for detecting sample addition and interfering rheumatoid factor (RF) were investigated. To determine if the correct sample was added, distinct microspheres were developed for measuring the presence of sufficient quantities of immunoglobulin G (IgG) or IgM in the diluted patient sample. In a multiplex assay of 82 samples, the IgM verification control correctly identified 23 out of 23 samples with low levels (<20 mg/dl) of this antibody isotype. An internal control microsphere for RF detected 30 out of 30 samples with significant levels (>10 IU/ml) of IgM RF. Additionally, RF-positive samples causing false-positive adenovirus and influenza A virus IgM results were correctly identified. By exploiting the Luminex instrument's multiplexing capabilities, I have developed true internal controls to ensure correct sample addition and identify interfering RF as part of a respiratory viral serologic profile that includes influenza A and B viruses, adenovirus, parainfluenza viruses 1, 2, and 3, and respiratory syncytial virus. Since these controls are not assay specific, they can be incorporated into any serologic multiplex assay.
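The decision logic described — flagging samples whose IgM verification control suggests a sample-addition failure, or whose RF level warns of false-positive viral IgM results — can be sketched as a hypothetical rule using the thresholds quoted in the abstract (IgM < 20 mg/dl, RF > 10 IU/ml). This is not Luminex software; the function name and flag wording are invented for illustration.

```python
def screen_sample(igm_mg_dl: float, rf_iu_ml: float) -> list:
    """Return advisory flags for one multiplexed serum sample.

    Thresholds follow the abstract: IgM below 20 mg/dl means the internal
    control cannot verify sample addition for IgM assays, and RF above
    10 IU/ml warns of possible false-positive viral IgM results.
    """
    flags = []
    if igm_mg_dl < 20:
        flags.append("low-IgM: sample addition unverified for IgM assays")
    if rf_iu_ml > 10:
        flags.append("RF-positive: interpret viral IgM with caution")
    return flags

print(screen_sample(35.0, 2.0))   # no flags
print(screen_sample(12.0, 45.0))  # both flags raised
```

Because the control microspheres are read in the same well as the analytes, such per-sample rules add no extra handling steps, which is the economy the abstract emphasizes.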
The New Zealand Tsunami Database: historical and modern records
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.
2016-12-01
A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years, there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project, with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event has been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.
Instrumental Analysis in Environmental Chemistry - Gas Phase Detection Systems
ERIC Educational Resources Information Center
Stedman, Donald H.; Meyers, Philip A.
1974-01-01
Discusses advances made in chemical analysis instrumentation used in environmental monitoring. This first of two articles is concerned with analytical instrumentation in which detection and dispersion depend ultimately on the properties of gaseous molecules. (JR)
ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...
Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs), and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice, with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS, and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli…
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
Mechanical Study of a Modern Yo-Yo
ERIC Educational Resources Information Center
de Izarra, Charles
2011-01-01
This paper presents the study of a modern yo-yo having a centrifugal clutch that allows free rolling. First, the mechanical parts of the yo-yo are measured, allowing us to determine its velocity analytically as a function of its height of fall. Then, we focus on the centrifugal device constituted by springs and small…
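The velocity-versus-height relation the abstract refers to is not reproduced there; for an idealized yo-yo without the clutch (a simplifying assumption — the paper's clutched device departs from this once free rolling engages), energy conservation with no-slip unwinding on the axle gives the standard sketch:

```latex
% Ideal yo-yo: mass m, axle radius r, moment of inertia I,
% unwinding without slip after falling a height h.
\[
mgh = \tfrac{1}{2} m v^{2} + \tfrac{1}{2} I \omega^{2},
\qquad \omega = \frac{v}{r}
\quad\Longrightarrow\quad
v(h) = \sqrt{\frac{2gh}{1 + I/(mr^{2})}} .
\]
```

The factor $1 + I/(mr^{2})$ is why a yo-yo falls slower than a free body; measuring the mechanical parts, as the paper does, fixes $I$, $m$, and $r$ in this expression.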
Saito, Takeshi; Tominaga, Aya; Nozawa, Mayu; Unei, Hiroko; Hatano, Yayoi; Fujita, Yuji; Iseki, Ken; Hori, Yasushi
2013-09-01
In a 2008 survey of the 73 emergency and critical care centers around the nation that were equipped with the drug and chemical analytical instrument provided by the Ministry of Welfare (currently the Ministry of Health, Labour, and Welfare) in 1998, 36 of those facilities were using the analytical instruments. Of these 36 facilities, a follow-up survey was conducted of the 17 facilities that recorded 50 or more analyses per year. Responses were received from 16 of the facilities, and we learned that 14 of them (87.5%) were conducting analyses using the instrument. There was a positive correlation between the annual number of cases at the 14 facilities conducting analyses with the instrument and the number of work hours. Depending on the instrument in use, average analytical instrument parts and maintenance expenses were roughly three million yen, and consumables required a maximum of three million yen for analysis of 51-200 cases per year. From this, we calculate that such expenses can be covered under the allowed budget for advanced emergency and critical care centers of 5,000 NHI points (1 point = 10 yen). We found there were few facilities using the instrument for all 15 of the toxic substances recommended for testing by the Japanese Society for Clinical Toxicology. The analytical instrument tended not to be used for compounds with no toxicology cases, but flexible responses were noted at each facility in relation to frequently analyzed compounds. A reevaluation of the compounds subject to analysis therefore appears to be required.
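One possible reading of the cost arithmetic above can be sketched numerically. The figures are the survey's round numbers; the assumption that the 5,000-point budget is reimbursed per analyzed case is ours:

```python
# Survey's round figures (yen); per-case reimbursement is an assumption.
YEN_PER_NHI_POINT = 10
points_per_case = 5_000
revenue_per_case = points_per_case * YEN_PER_NHI_POINT  # 50,000 yen

maintenance = 3_000_000   # parts and maintenance, per year
consumables = 3_000_000   # maximum, for 51-200 cases per year
annual_costs = maintenance + consumables

# Cases per year needed for reimbursement to cover the instrument costs.
cases_to_break_even = annual_costs / revenue_per_case
print(cases_to_break_even)  # 120.0, inside the reported 51-200 case range
```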
Analytical Chemistry and the Microchip.
ERIC Educational Resources Information Center
Lowry, Robert K.
1986-01-01
Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…
Selected Analytical Methods for Environmental Remediation and Recovery (SAM) - Home
The SAM Home page provides access to all information provided in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), and includes a query function allowing users to search methods by analyte, sample type and instrumentation.
Selecting Suicide Ideation Assessment Instruments: A Meta-Analytic Review
ERIC Educational Resources Information Center
Erford, Bradley T.; Jackson, Jessica; Bardhoshi, Gerta; Duncan, Kelly; Atalay, Zumra
2018-01-01
Psychometric meta-analyses and reviews were provided for four commonly used suicidal ideation instruments: the Beck Scale for Suicide Ideation, the Suicide Ideation Questionnaire, the Suicide Probability Scale, and the Columbia-Suicide Severity Rating Scale. Practical and technical issues and best use recommendations for screening and outcome…
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have evolved greatly over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches, based on different techniques to study the chemical composition of a food at the molecular level, make it possible to define a 'food fingerprint' valuable for assessing the nutritional value, safety, quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to worldwide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies and advance our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review is to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Comparison of modern icing cloud instruments
NASA Technical Reports Server (NTRS)
Takeuchi, D. M.; Jahnsen, L. J.; Callander, S. M.; Humbert, M. C.
1983-01-01
Intercomparison tests with Particle Measuring Systems (PMS) probes were conducted. Cloud liquid water content (LWC) measurements were also taken with a Johnson and Williams (JW) hot-wire device and an icing rate device (Leigh IDS). Tests included varying cloud LWC (0.5 to 5 g/m3), cloud median volume diameter (MVD) (15 to 26 microns), temperature (-29 to 20 C), and air speeds (50 to 285 mph). Comparisons were based upon evaluating probe estimates of cloud LWC and median volume diameter for given tunnel settings. Variations of plus or minus 10% and plus or minus 5% in LWC and MVD, respectively, were found for spray clouds between tests made at given tunnel settings (fixed LWC, MVD, and air speed), indicating cloud conditions were highly reproducible. Although LWC measurements from the JW and Leigh devices were consistent with tunnel values, individual probe measurements either consistently over- or underestimated tunnel values by factors ranging from about 0.2 to 2, amounting to as much as a factor of 6 difference between the LWC estimates of different probes for given cloud conditions. For given cloud conditions, estimates of cloud MVD between probes were within plus or minus 3 microns in 93% of the test cases. Measurements overestimated tunnel values in the range between 10 to 20 microns. The need for improving currently used calibration procedures was indicated. Establishment of a test facility (or facilities), such as an icing tunnel where instruments can be calibrated against known cloud standards, would be a logical next step.
Pressure-Assisted Chelating Extraction as a Teaching Tool in Instrumental Analysis
ERIC Educational Resources Information Center
Sadik, Omowunmi A.; Wanekaya, Adam K.; Yevgeny, Gelfand
2004-01-01
A novel instrumental digestion technique using pressure-assisted chelating extraction (PACE) for the undergraduate laboratory is reported. The procedure is used to expose students to safe sample-preparation techniques, to correlate wet-chemical methods with modern instrumental analysis, and to compare the performance of PACE with conventional…
NASA Astrophysics Data System (ADS)
Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah
Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument that can both recognize potential biogenic specimens and successfully discriminate them from their geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually, most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically, they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching that of a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable-wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed
Laboratory instrumentation modernization at the WPI Nuclear Reactor Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1995-01-01
With partial funding from the Department of Energy (DOE) University Reactor Instrumentation Program, several laboratory instruments utilized by students and researchers at the WPI Nuclear Reactor Facility have been upgraded or replaced. Designed and built by General Electric in 1959, the open pool nuclear training reactor at WPI was one of the first such facilities in the nation located on a university campus. Devoted to undergraduate use, the reactor and its related facilities have since been used to train two generations of nuclear engineers and scientists for the nuclear industry. The low power output of the reactor and an ergonomic facility design make it an ideal tool for undergraduate nuclear engineering education and other training. The reactor, its control system, and the associated laboratory equipment are all located in the same room. Over the years, several important milestones have taken place at the WPI reactor. In 1969, the reactor power level was upgraded from 1 kW to 10 kW. The reactor's Nuclear Regulatory Commission operating license was renewed for 20 years in 1983. In 1988, under DOE Grant No. DE-FG07-86ER75271, the reactor was converted to low-enriched uranium fuel. In 1992, again with partial funding from DOE (Grant No. DE-FG02-90ER12982), the original control console was replaced.
Collaborative Thinking: The Challenge of the Modern University
ERIC Educational Resources Information Center
Corrigan, Kevin
2012-01-01
More collaborative work in the humanities could be instrumental in helping to break down the traditional rigid boundaries between academic divisions and disciplines in modern universities. The value of the traditional model of the solitary humanities scholar or the collaborative science paradigm should not be discounted. However, increasing the…
Foundations of measurement and instrumentation
NASA Technical Reports Server (NTRS)
Warshawsky, Isidore
1990-01-01
This work provides the user of instrumentation with an understanding of the factors that influence instrument performance, selection, and application, and of the methods of interpreting and presenting the results of measurements. Such understanding is prerequisite to attaining the best compromise among reliability, accuracy, speed, cost, and importance of the measurement operation in achieving the ultimate goal of a project. Subjects covered include dimensions; units; sources of measurement error; methods of describing and estimating accuracy; deduction and presentation of results through empirical equations, including the method of least squares; and experimental and analytical methods of determining the static and dynamic behavior of instrumentation systems, including the use of analogs.
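As an illustration of the least-squares fitting of empirical equations mentioned above, here is a minimal straight-line fit via the normal equations (the function name is ours, and this is the textbook method, not necessarily the report's own treatment):

```python
def least_squares_line(xs, ys):
    """Fit y = a + b*x by ordinary least squares (normal equations)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a = (sy - b * sx) / n                          # intercept
    return a, b

# Data lying exactly on y = 1 + 2x is recovered exactly:
a, b = least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])
```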
Instrumentation for analytical scale supercritical fluid chromatography.
Berger, Terry A
2015-11-20
Analytical scale supercritical fluid chromatography (SFC) is largely a sub-discipline of high performance liquid chromatography (HPLC), in that most of the hardware and software can be used for either technique. The aspects that separate the two techniques stem from the use of carbon dioxide (CO2) as the main component of the mobile phase in SFC. The high compressibility and low viscosity of CO2 mean that pumps and autosamplers designed for HPLC either need to be modified or an alternate means of dealing with compressibility needs to be found. The inclusion of a back pressure regulator and a high pressure flow cell for any UV-Vis detector is also necessary. Details of the various approaches, problems, and solutions are described. Characteristics such as adiabatic vs. isothermal compressibility, thermal gradients, and refractive index issues are dealt with in detail. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Williamson, Ben
2016-01-01
Educational institutions and governing practices are increasingly augmented with digital database technologies that function as new kinds of policy instruments. This article surveys and maps the landscape of digital policy instrumentation in education and provides two detailed case studies of new digital data systems. The Learning Curve is a…
Development of TPS flight test and operational instrumentation
NASA Technical Reports Server (NTRS)
Carnahan, K. R.; Hartman, G. J.; Neuner, G. J.
1975-01-01
Thermal and flow sensor instrumentation was developed for use as an integral part of the space shuttle orbiter reusable thermal protection system. The effort was performed in three tasks: a study to determine the optimum instruments and instrument installations for the space shuttle orbiter RSI and RCC TPS; tests and/or analysis to determine the instrument installations that minimize measurement errors; and analysis using data from the test program for comparison to analytical methods. A detailed review of existing state-of-the-art instrumentation in industry was performed to establish the baseline from which the research effort departed. From this information, detailed criteria for thermal protection system instrumentation were developed.
A new approach for instrument software at Gemini
NASA Astrophysics Data System (ADS)
Gillies, Kim; Nunez, Arturo; Dunn, Jennifer
2008-07-01
Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments, together with current hardware and software technology, to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach, which couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.
Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS
NASA Astrophysics Data System (ADS)
Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas
2014-05-01
Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can best be described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('the total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields such as ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes, and well-established systems can be analyzed with improved techniques, especially multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in the case of S, isotope ratio measurements at high mass resolution can be achieved at much lower S concentrations with ICP-MS than with IRMS, while still keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS so far, with, besides Sr, Pb and U, the systems Ca, Mg, Cd, Li, Hg, Si, Ge and B being the most prominent, considerably pushing the limits of plasma-based mass spectrometry, also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility to achieve isotopic information at high spatial (µm-range) and temporal resolution (in the case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected
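Isotope-ratio results of the kind surveyed here are conventionally reported in per-mil delta notation relative to a standard. A minimal sketch (the function name is ours; the reference ratio is the caller's choice, e.g. the accepted 34S/32S ratio of a standard such as VCDT for sulfur):

```python
def delta_permil(r_sample, r_standard):
    """Per-mil deviation of a measured isotope ratio from a reference.

    e.g. for sulfur, r = n(34S)/n(32S) and r_standard is the accepted
    ratio of the chosen reference material.
    """
    return (r_sample / r_standard - 1.0) * 1000.0
```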
Applications of Business Analytics in Healthcare.
Ward, Michael J; Marsolo, Keith A; Froehle, Craig M
2014-09-01
The American healthcare system is at a crossroads, and analytics, as an organizational skill, figures to play a pivotal role in its future. As more healthcare systems capture information electronically and as they begin to collect more novel forms of data, such as human DNA, how will we leverage these resources and use them to improve human health at a manageable cost? In this article, we argue that analytics will play a fundamental role in the transformation of the American healthcare system. However, there are numerous challenges to the application and use of analytics, namely the lack of data standards, barriers to the collection of high-quality data, and a shortage of qualified personnel to conduct such analyses. There are also multiple managerial issues, such as how to get end users of electronic data to employ it consistently for improving healthcare delivery, and how to manage the public reporting and sharing of data. In this article, we explore applications of analytics in healthcare, barriers and facilitators to its widespread adoption, and how analytics can help us achieve the goals of the modern healthcare system: high-quality, responsive, affordable, and efficient care.
Jiménez-Díaz, I; Vela-Soria, F; Rodríguez-Gómez, R; Zafra-Gómez, A; Ballesteros, O; Navalón, A
2015-09-10
In the present work, a review is presented of the analytical methods developed in the last 15 years for the determination of endocrine disrupting chemicals (EDCs) in human samples related to children, including placenta, cord blood, amniotic fluid, maternal blood, maternal urine and breast milk. Children are highly vulnerable to toxic chemicals in the environment; among the environmental contaminants to which they are at risk of exposure are EDCs, substances able to alter the normal hormone function of wildlife and humans. The work focuses mainly on the sample preparation and instrumental techniques used for the detection and quantification of the analytes. The sample preparation techniques include not only liquid-liquid extraction (LLE) and solid-phase extraction (SPE), but also modern microextraction techniques such as extraction with molecularly imprinted polymers (MIPs), stir-bar sorptive extraction (SBSE), hollow-fiber liquid-phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME), matrix solid-phase dispersion (MSPD) and ultrasound-assisted extraction (UAE), which are becoming alternatives in the analysis of human samples. Most studies focus on minimizing the number of steps and using the lowest solvent amounts in the sample treatment. The usual instrumental techniques employed include liquid chromatography (LC) and gas chromatography (GC), mainly coupled to tandem mass spectrometry. Multiresidue methods are being developed for the determination of several families of EDCs with one extraction step and limited sample preparation. Copyright © 2015 Elsevier B.V. All rights reserved.
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and to enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed; multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. © 2017 Elsevier Inc. All rights reserved.
Ultrasound physics and instrumentation for pathologists.
Lieu, David
2010-10-01
Interest in pathologist-performed ultrasound-guided fine-needle aspiration is increasing. Educational courses discuss clinical ultrasound and biopsy techniques but not ultrasound physics and instrumentation. This review of recent literature and textbooks covers modern ultrasound physics and instrumentation to help pathologists understand the basis of modern ultrasound. Ultrasound physics and instrumentation are the foundations of clinical ultrasound. The key physical principle is the piezoelectric effect: when stimulated by an electric current, certain crystals vibrate and produce ultrasound. A hand-held transducer converts electricity into ultrasound, transmits it into tissue, and listens for reflected ultrasound to return. The returning echoes are converted into electrical signals and used to create a 2-dimensional gray-scale image. Scanning at a high frequency improves axial resolution but has low tissue penetration. Electronic focusing moves the long-axis focus to the depth of the object of interest and improves lateral resolution. The short-axis focus in 1-dimensional transducers is fixed, which results in poor elevational resolution away from the focal zone. Using multiple foci improves lateral resolution but degrades temporal resolution. The sonographer can adjust the dynamic range to change contrast and bring out subtle masses. Contrast resolution is limited by processing speed, monitor resolution, and the gray-scale perception of the human eye. Ultrasound is an evolving field; new technologies include miniaturization, spatial compound imaging, tissue harmonics, and multidimensional transducers. Clinical cytopathologists who understand ultrasound physics, instrumentation, and clinical ultrasound are ready for the challenges of cytopathologist-performed ultrasound-guided fine-needle aspiration and core-needle biopsy in the 21st century.
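The frequency-resolution trade-off described above can be made concrete with textbook approximations: axial resolution is about half the spatial pulse length, and the wavelength follows from the speed of sound in soft tissue (about 1540 m/s). The function name and the two-cycle pulse length are our assumptions:

```python
def axial_resolution_mm(freq_mhz, c_m_per_s=1540.0, cycles=2):
    """Approximate axial resolution (mm) as half the spatial pulse length.

    c_m_per_s: assumed speed of sound in soft tissue;
    cycles: assumed number of wavelengths per transmitted pulse.
    """
    wavelength_mm = c_m_per_s / (freq_mhz * 1e6) * 1000.0
    return cycles * wavelength_mm / 2.0
```

Under these assumptions, a 10 MHz probe resolves about 0.15 mm axially, while a 3.5 MHz probe resolves only about 0.44 mm but penetrates much deeper.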
The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars
Brown, Adrian J.; Michaels, Timothy I.; Byrne, Shane; Sun, Wenbo; Titus, Timothy N.; Colaprete, Anthony; Wolff, Michael J.; Videen, Gorden; Grund, Christian J.
2014-01-01
We present the scientific case to build a multiple-wavelength, active, near-infrared (NIR) instrument to measure the reflected intensity and polarization characteristics of backscattered radiation from planetary surfaces and atmospheres. We focus on the ability of such an instrument to enhance, perhaps revolutionize, our understanding of climate, volatiles and astrobiological potential of modern-day Mars.
Chemical and pharmacological comparison of modern and traditional dosage forms of Joshanda.
Parveen, Sajida; Irfan Bukhari, Nadeem; Shehzadi, Naureen; Qamar, Shaista; Ali, Ejaz; Naheed, Surriya; Latif, Abida; Yuchi, Alamgeer; Hussain, Khalid
2017-12-11
Recently, the traditional remedy Joshanda has been largely replaced by modern ready-to-use dosage forms, which have not been compared with the original remedy. The present study therefore aimed to compare a number of modern dosage forms with the traditional remedy. Seven brands, 3 batches each, were compared with a lab-made formulation with reference to analytical (proximate analyses, spectroscopic and chromatographic metabolomes) and pharmacological profiles (anti-inflammatory and antibacterial activities). Chemical and pharmacological differences were found between the lab-made Joshanda and the modern dosage forms, and such variations were also found within the brands and batches of modern formulations (p < 0.05). The lab-made Joshanda showed significantly higher pharmacological activities than the modern brands (p ). The results of the present study indicate that modern dosage forms are unstandardised and less effective than the traditional remedy. Characteristic profiles obtained from the lab-made Joshanda may be used as a reference to produce comparable dosage forms.
Instrumental Surveillance of Water Quality.
ERIC Educational Resources Information Center
Miller, J. A.; And Others
The role analytical instrumentation performs in the surveillance and control of the quality of water resources is reviewed. Commonly performed analyses may range from simple tests for physical parameters to more highly sophisticated radiological or spectrophotometric methods. This publication explores many of these types of water quality analyses…
The role of light microscopy in aerospace analytical laboratories
NASA Technical Reports Server (NTRS)
Crutcher, E. R.
1977-01-01
Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems, and the solution of those that develop, necessitates the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize, and often identify the cause of, a problem in 5-15 minutes, with confirmatory tests generally taking less than one hour. Light microscopy has made, and will continue to make, a very significant contribution to the analytical capabilities of aerospace laboratories.
Atkins, Rahshida
2014-01-01
Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.
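The coefficient alphas evaluated in the review are Cronbach's alpha, the standard internal-consistency statistic for a subscale. A minimal pure-Python sketch of the computation (the function name is ours; sample variances are used):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a subscale.

    `items` is a list of item-score lists, one per item, all over the
    same respondents: alpha = k/(k-1) * (1 - sum(item variances) /
    variance of respondent totals).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))
```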
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, G.R.; Bystroff, R.I.; Downey, R.M.
1975-09-01
In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)
Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.
Lubes, Giuseppe; Goodarzi, Mohammad
2017-05-10
Smell is one of the five senses and plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of foods, and some of them can be perceived by humans because of their aroma. They have a great influence on the decision making of consumers when they choose whether to use a product. Where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food products be characterized by analytical means to provide a basis for further optimization. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Data gathered from different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we address not only the application of chemometrics to aroma analysis but also the use of different analytical instruments in this field, highlighting areas where future research is needed.
Pérez-Parada, Andrés; Gómez-Ramos, María del Mar; Martínez Bueno, María Jesús; Uclés, Samanta; Uclés, Ana; Fernández-Alba, Amadeo R
2012-02-01
Instrumental capabilities and software tools of modern hybrid mass spectrometry (MS) instruments such as high-resolution mass spectrometry (HRMS), quadrupole time-of-flight (QTOF), and quadrupole linear ion trap (QLIT) were experimentally investigated for the study of emerging contaminants in Henares River water samples. Automated screening and confirmatory capabilities of QTOF working in full-scan MS and tandem MS (MS/MS) were explored when dealing with real samples. The effect of sensitivity and resolving power on mass accuracy was studied for the correct assignment of the amoxicillin transformation product 5(R) amoxicillin-diketopiperazine-2',5' as an example of a nontarget compound. On the other hand, quantitative and qualitative strategies based on direct injection analysis and off-line solid-phase extraction sample treatment were compared using two different QLIT instruments for a selected group of emerging contaminants operating in selected reaction monitoring (SRM) and information-dependent acquisition (IDA) modes. Software-aided screening usually needs a further confirmatory step. The resolving power and MS/MS capabilities of QTOF were able to confirm or reject most findings in river water, although sensitivity-related limitations were often encountered. The superior sensitivity of modern QLIT-MS/MS offered the possibility of direct injection analysis for proper quantitative study of a variety of contaminants, while it simultaneously reduced the matrix effect and increased the reliability of the results. Confirmation of ethylamphetamine, which lacks a second SRM transition, was accomplished by using the IDA feature. Hybrid MS instruments combining high resolution and high sensitivity help enlarge the scope of targeted analytes in river waters. However, in the tested instruments there remains a margin for improvement, principally in the sensitivity required and in data treatment software tools devoted to reliable confirmation.
Instrumental Landing Using Audio Indication
NASA Astrophysics Data System (ADS)
Burlak, E. A.; Nabatchikov, A. M.; Korsun, O. N.
2018-02-01
The paper proposes an audio indication method for presenting to a pilot information on the relative position of an aircraft in precision piloting tasks. The implementation of the method is presented, and the use of such audio signal parameters as loudness, frequency, and modulation is discussed. To confirm the operability of the audio indication channel, experiments using a modern aircraft simulation facility were carried out. Test subjects performed instrument landings using the proposed audio method to indicate the aircraft's deviations from the glide path. The results proved comparable with simulated instrument landings using the traditional glideslope pointers. This encourages further development of the method for other precision piloting tasks.
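A minimal sketch of how one such audio parameter, frequency, might encode glide-path deviation. The center frequency, semitone scaling, and dot range below are illustrative assumptions, not values taken from the paper.

```python
def deviation_to_pitch(dev_dots, f_center=440.0, semitones_per_dot=2.0, max_dots=2.0):
    """Map glideslope deviation (in indicator 'dots'; negative = below path)
    to a tone frequency in Hz, centered on f_center when on the glide path.
    All constants are assumed for illustration."""
    dev_dots = max(-max_dots, min(max_dots, dev_dots))  # clamp to indicator range
    # Shift pitch by a fixed number of semitones per dot of deviation.
    return f_center * 2.0 ** (semitones_per_dot * dev_dots / 12.0)

print(round(deviation_to_pitch(0.0)))   # on the glide path -> 440
print(round(deviation_to_pitch(1.0)))   # one dot high -> 494
```

A real implementation would combine this with loudness and modulation cues, as the paper discusses, and would need psychoacoustic tuning of the mapping constants.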
Physical and Chemical Analytical Analysis: A key component of Bioforensics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Velsko, S P
The anthrax letters event of 2001 has raised our awareness of the potential importance of non-biological measurements on samples of biological agents used in a terrorism incident. Such measurements include a variety of mass spectral, spectroscopic, and other instrumental techniques that are part of the current armamentarium of the modern materials analysis or analytical chemistry laboratory. They can provide morphological, trace element, isotopic, and other molecular "fingerprints" of the agent that may be key pieces of evidence, supplementing that obtained from genetic analysis or other biological properties. The generation and interpretation of such data represents a new domain of forensic science, closely aligned with other areas of "microbial forensics". This paper describes some major elements of the R&D agenda that will define this sub-field in the immediate future and provide the foundations for a coherent national capability. Data from chemical and physical analysis of BW materials can be useful to an investigation of a bio-terror event in two ways. First, it can be used to compare evidence samples collected at different locations where such incidents have occurred (e.g. between the powders in the New York and Washington letters in the Amerithrax investigation) or between the attack samples and those seized during the investigation of sites where it is suspected the material was manufactured (if such samples exist). Matching of sample properties can help establish the relatedness of disparate incidents, and mis-matches might exclude certain scenarios, or signify a more complex etiology of the events under investigation. Chemical and morphological analysis for sample matching has a long history in forensics, and is likely to be acceptable in principle in court, assuming that match criteria are well defined and derived from known limits of precision of the measurement techniques in question. Thus, apart from certain operational issues (such as how
Developing automated analytical methods for scientific environments using LabVIEW.
Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard
2010-01-15
The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program are demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
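The script-command architecture described above can be sketched as a generic engine that executes text commands against registered per-instrument drivers. Python stands in for LabVIEW here, and the command names and drivers are invented placeholders, not the paper's actual command set.

```python
# Sketch of a script-driven instrument controller: one generic engine runs
# text scripts; instrument-specific behavior is plugged in as named actions.

class Controller:
    def __init__(self):
        self.commands = {}

    def register(self, name, action):
        """Attach a driver callable to a script command name."""
        self.commands[name] = action

    def run_script(self, script):
        """Execute a script, one whitespace-separated command per line."""
        log = []
        for line in script.strip().splitlines():
            name, *args = line.split()
            log.append(self.commands[name](*args))
        return log

ctrl = Controller()
# Hypothetical drivers; real ones would talk to hardware.
ctrl.register("VALVE", lambda pos: f"valve -> {pos}")
ctrl.register("PUMP", lambda vol, rate: f"pump {vol} uL at {rate} uL/s")

print(ctrl.run_script("VALVE inject\nPUMP 50 10"))
# -> ['valve -> inject', 'pump 50 uL at 10 uL/s']
```

The point of the design is that new analytical setups only add driver registrations and scripts; the engine code never changes.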
Tunable lasers and their application in analytical chemistry
NASA Technical Reports Server (NTRS)
Steinfeld, J. I.
1975-01-01
The impact that laser techniques might have in chemical analysis is examined. Absorption, scattering, and heterodyne detection are considered. Particular emphasis is placed on the advantages of using frequency-tunable sources, and dye solution lasers are regarded as the outstanding example of this type of laser. Types of spectroscopy that can be carried out with lasers are discussed along with the ultimate sensitivity or minimum detectable concentration of molecules that can be achieved with each method. Analytical applications include laser microprobe analysis, remote sensing, and instrumental methods such as laser-Raman spectroscopy, atomic absorption/fluorescence spectrometry, fluorescence assay techniques, optoacoustic spectroscopy, and polarization measurements. The application of lasers to spectroscopic methods of analysis would seem to be a rewarding field both for research in analytical chemistry and for investments in instrument manufacturing.
Modern Psychometrics for Assessing Achievement Goal Orientation: A Rasch Analysis
ERIC Educational Resources Information Center
Muis, Krista R.; Winne, Philip H.; Edwards, Ordene V.
2009-01-01
Background: A program of research is needed that assesses the psychometric properties of instruments designed to quantify students' achievement goal orientations to clarify inconsistencies across previous studies and to provide a stronger basis for future research. Aim: We conducted traditional psychometric and modern Rasch-model analyses of the…
ERIC Educational Resources Information Center
Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert
This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…
Cross-instrument Analysis Correlation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
McJunkin, Timothy R.
This program has been designed to assist with tracking a sample from one analytical instrument to another (SEM, microscopes, micro x-ray diffraction, and other instruments where particular positions/locations on the sample are examined, photographed, etc.). The software allows easy entry of the positions of fiducials and locations of interest, so that in a future session on the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform each point into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based extensible markup language (XML) files.
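The fiducial-based re-finding step can be sketched as a least-squares affine fit between the two sessions' coordinates. The coordinates below are invented for illustration, and the actual program's transform may differ (e.g. it might restrict to rotation plus translation).

```python
import numpy as np

def affine_from_fiducials(ref_pts, cur_pts):
    """Least-squares 2-D affine transform mapping reference-session fiducial
    coordinates onto the current session's coordinates (3x2 matrix)."""
    ref = np.asarray(ref_pts, float)
    cur = np.asarray(cur_pts, float)
    A = np.hstack([ref, np.ones((len(ref), 1))])  # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, cur, rcond=None)
    return M

def transform(M, pt):
    """Map a reference-session point into the current session's frame."""
    x, y = pt
    return np.array([x, y, 1.0]) @ M

# Three fiducials seen in both sessions; here the current stage is shifted by (+5, +2).
ref = [(0, 0), (10, 0), (0, 10)]
cur = [(5, 2), (15, 2), (5, 12)]
M = affine_from_fiducials(ref, cur)
print(transform(M, (3, 4)).round(3))  # a point of interest, re-found at [8. 6.]
```

Three non-collinear fiducials determine a 2-D affine transform exactly; more fiducials over-determine it, and the least-squares fit then averages out stage and entry error.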
Impact of natural products in modern drug development.
Dev, Sukh
2010-03-01
Usage of natural substances as therapeutic agents in modern medicine has sharply declined from the predominant position held in the early decades of last century, but search for bioactive molecules from nature (plants, animals, microflora) continues to play an important role in fashioning new medicinal agents. With the advent of modern techniques, instrumentation and automation in isolation and structural characterisation, we have on hand an enormous repository of natural compounds. In parallel to this, biology has also made tremendous progress in expanding its frontiers of knowledge. An interplay of these two disciplines constitutes the modern thrust in research in the realm of compounds elaborated by nature. The purpose of this article is to underline how natural products research continues to make significant contributions in the domain of discovery and development of new medicinal products. It is proposed to present the material under several heads, each of which has made natural products research relevant in the search for new and better medication.
CMOS Electrochemical Instrumentation for Biosensor Microsystems: A Review.
Li, Haitao; Liu, Xiaowen; Li, Lin; Mu, Xiaoyi; Genov, Roman; Mason, Andrew J
2016-12-31
Modern biosensors play a critical role in healthcare and have a quickly growing commercial market. Compared to traditional optical-based sensing, electrochemical biosensors are attractive due to superior performance in response time, cost, complexity and potential for miniaturization. To address the shortcomings of traditional benchtop electrochemical instruments, in recent years, many complementary metal oxide semiconductor (CMOS) instrumentation circuits have been reported for electrochemical biosensors. This paper provides a review and analysis of CMOS electrochemical instrumentation circuits. First, important concepts in electrochemical sensing are presented from an instrumentation point of view. Then, electrochemical instrumentation circuits are organized into functional classes, and reported CMOS circuits are reviewed and analyzed to illuminate design options and performance tradeoffs. Finally, recent trends and challenges toward on-CMOS sensor integration that could enable highly miniaturized electrochemical biosensor microsystems are discussed. The information in the paper can guide next generation electrochemical sensor design. PMID:28042860
Advanced Instrumentation for Positron Emission Tomography [PET
DOE R&D Accomplishments Database
Derenzo, S. E.; Budinger, T. F.
1985-04-01
This paper summarizes the physical processes and medical science goals that underlie modern instrumentation design for Positron Emission Tomography. The paper discusses design factors such as detector material, crystal-phototube coupling, shielding geometry, sampling motion, electronics design, and time-of-flight, and their interrelationships with quantitative accuracy, spatial resolution, temporal resolution, maximum data rates, and cost.
Integrated Array/Metadata Analytics
NASA Astrophysics Data System (ADS)
Misev, Dimitar; Baumann, Peter
2015-04-01
Data comes in various forms and types, and integration usually presents a problem that is often simply ignored or solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive or non-existent in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part, seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman that already implements SQL/MDA.
A Comparison of Wood Density between Classical Cremonese and Modern Violins
Stoel, Berend C.; Borman, Terry M.
2008-01-01
Classical violins created by Cremonese masters, such as Antonio Stradivari and Giuseppe Guarneri Del Gesu, have become the benchmark to which the sound of all violins is compared in terms of expressiveness and projection. By general consensus, no luthier since that time has been able to replicate the sound quality of these classical instruments. The vibration and sound radiation characteristics of a violin are determined by an instrument's geometry and the material properties of the wood. New test methods allow the non-destructive examination of one of the key material properties, the wood density, at the growth ring level of detail. The densities of five classical and eight modern violins were compared, using computed tomography and specially developed image-processing software. No significant differences were found between the median densities of the modern and the antique violins; however, the density difference between wood grains of early and late growth was significantly smaller in the classical Cremonese violins compared with modern violins, in both the top (Spruce) and back (Maple) plates (p = 0.028 and 0.008, respectively). The mean density differential (SE) of the top plates of the modern and classical violins was 274 (26.6) and 183 (11.7) gram/liter, respectively. For the back plates, the values were 128 (2.6) and 115 (2.0) gram/liter. These differences in density differentials may reflect similar changes in stiffness distributions, which could directly impact vibrational efficacy or indirectly modify sound radiation via altered damping characteristics. Either of these mechanisms may help explain the acoustical differences between the classical and modern violins. PMID:18596937
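The density differential reported above can be illustrated with a toy computation, taken here as the difference between representative late-growth and early-growth grain densities. The grain densities below are invented for illustration, not measurements from the study, and the study's exact estimator may differ.

```python
import statistics

def density_differential(early, late):
    """Density differential in gram/liter: median late-growth wood density
    minus median early-growth wood density (an assumed, simplified estimator)."""
    return statistics.median(late) - statistics.median(early)

# Illustrative (not measured) grain densities for a modern-style spruce top plate.
early = [330, 340, 335, 345]   # early-growth wood, g/L
late  = [610, 600, 615, 605]   # late-growth wood, g/L
print(density_differential(early, late))  # -> 270.0
```

A larger differential means a bigger density swing between early and late growth within each annual ring, which is the property the study found to distinguish modern from classical Cremonese plates.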
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, a lot of papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Using Longitudinal Scales Assessment for Instrumental Music Students
ERIC Educational Resources Information Center
Simon, Samuel H.
2014-01-01
In music education, current assessment trends emphasize student reflection, tracking progress over time, and formative as well as summative measures. This view of assessment requires instrumental music educators to modernize their approaches without interfering with methods that have proven to be successful. To this end, the Longitudinal Scales…
Computational overlay metrology with adaptive data analytics
NASA Astrophysics Data System (ADS)
Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris
2017-03-01
With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab will pass through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, known as adaptive analytics, can be used to show how data collected before the wafer is exposed can detect small process-dependent wafer-to-wafer changes in overlay.
The relationship of modernity of sex roles to pregnancy planning.
Jurich, J
1984-08-01
This study investigates the relationship of women's role modernity to pregnancy planning. The subjects were 59 married primiparous women aged 18 to 33 who had given birth in a metropolitan midwestern hospital. Over half the sample had some college education. The pregnancy planning variable is operationalized as the implementation of family planning goals. Subjects who desired pregnancy and actively attempted to conceive are considered planners. In contrast, nonplanners are defined as women who preferred to avoid pregnancy but were not successful and women who did not actively seek or avoid pregnancy. The modernity of sex roles variable is operationalized through use of the Scanzoni instrument. This instrument is constructed from a series of items that measure 3 social positions related to sex roles in the family context: those of wife, husband, and mother. The instrument is modified in this investigation, leaving 21 5-point scale items to be included in the data analysis. Smallest space analysis of the inter-item correlation matrix demonstrates that the social positions of wife and husband do not clearly reflect different aspects of sex role modernity. A comparison of the average inter-item correlation for the variables within each social position with the average inter-item correlation for the variables across the positions reveals that the dimensions proposed by Scanzoni are not empirically different. In light of these findings, further exploratory data analysis of all items was conducted to discern which items do empirically cluster together. Scanzoni's 21 sex role items were submitted to principal component factor analysis, and 3 factors emerged: 1) wife-husband equality; 2) flexibility in role integration; and 3) values regarding primary role. 3 new sex role modernity variables were created to correspond to the 3 factors and were then used to explore the relationship between sex role modernity and pregnancy planning. Chi square analyses were not statistically significant.
High-sensitivity ESCA instrument
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davies, R.D.; Herglotz, H.K.; Lee, J.D.
1973-01-01
A new electron spectroscopy for chemical analysis (ESCA) instrument has been developed to provide high sensitivity and efficient operation for laboratory analysis of composition and chemical bonding in very thin surface layers of solid samples. High sensitivity is achieved by means of the high-intensity, efficient x-ray source described by Davies and Herglotz at the 1968 Denver X-Ray Conference, in combination with the new electron energy analyzer described by Lee at the 1972 Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. A sample chamber designed for rapid introduction and replacement of samples has adequate facilities for various sample treatments and conditioning, followed immediately by ESCA analysis of the sample. Examples of application are presented, demonstrating the sensitivity and resolution achievable with this instrument. Its usefulness in trace surface analysis is shown, and some "chemical shifts" measured by the instrument are compared with those obtained by x-ray spectroscopy. (auth)
ERIC Educational Resources Information Center
Flato, J. B.
2007-01-01
Princeton Applied Research Corporation (PAR) was a small electronic instrument company in the early 1960s, but once it entered electrochemistry it was very successful. Since then the company has developed and designed successful instruments, drawing on its tremendous expertise, and has made great contributions to the field of analytical chemistry.
Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.
ERIC Educational Resources Information Center
Hercules, David M.; Hercules, Shirley H.
1984-01-01
Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)
2014-01-01
This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Hassib, Lamyaa
2005-06-01
Multicomponent polymer-based formulations of optical sensor materials are difficult and time consuming to optimize using conventional approaches. To address these challenges, our long-term goal is to determine relationships between sensor formulation and sensor response parameters using new scientific methodologies. As the first step, we have designed and implemented an automated analytical instrumentation infrastructure for combinatorial and high-throughput development of polymeric sensor materials for optical sensors. Our approach is based on the fabrication and performance screening of discrete and gradient sensor arrays. Simultaneous formation of multiple sensor coatings into discrete 4×6, 6×8, and 8×12 element arrays (3-15 μL volume per element) and their screening provides not only a well-recognized acceleration in screening rate, but also considerably reduces or even eliminates sources of variability that randomly affect sensor response during conventional one-at-a-time sensor coating evaluation. The application of gradient sensor arrays provides additional capabilities for rapidly finding the optimal formulation parameters.
Jonathan W. Amy and the Amy Facility for Instrumentation Development.
Cooks, R Graham
2017-05-16
This Perspective describes the unique Jonathan Amy Facility for Chemical Instrumentation in the Department of Chemistry at Purdue University, tracing its history and mode of operation. It also describes aspects of the career of its namesake and some of his insights which have been central to analytical instrumentation development, improvement, and utilization, both at Purdue and nationally.
TELICS—A Telescope Instrument Control System for Small/Medium Sized Astronomical Observatories
NASA Astrophysics Data System (ADS)
Srivastava, Mudit K.; Ramaprakash, A. N.; Burse, Mahesh P.; Chordia, Pravin A.; Chillal, Kalpesh S.; Mestry, Vilas B.; Das, Hillol K.; Kohok, Abhay A.
2009-10-01
For any modern astronomical observatory, it is essential to have an efficient interface between the telescope and its back-end instruments. However, for small and medium-sized observatories, this requirement is often limited by tight financial constraints. Therefore a simple yet versatile and low-cost control system is required for such observatories to minimize cost and effort. Here we report the development of a modern, multipurpose instrument control system TELICS (Telescope Instrument Control System) to integrate the controls of various instruments and devices mounted on the telescope. TELICS consists of an embedded hardware unit known as a common control unit (CCU) in combination with Linux-based data acquisition and user interface. The hardware of the CCU is built around the ATmega 128 microcontroller (Atmel Corp.) and is designed with a backplane, master-slave architecture. A Qt-based graphical user interface (GUI) has been developed and the back-end application software is based on C/C++. TELICS provides feedback mechanisms that give the operator good visibility and a quick-look display of the status and modes of instruments as well as data. TELICS has been used for regular science observations since 2008 March on the 2 m, f/10 IUCAA Telescope located at Girawali in Pune, India.
Biochemical Applications in the Analytical Chemistry Lab
ERIC Educational Resources Information Center
Strong, Cynthia; Ruttencutter, Jeffrey
2004-01-01
An HPLC and a UV-visible spectrophotometer are identified as instruments that help incorporate more biologically relevant experiments into the course, in order to increase students' understanding of selected biochemistry topics and enhance their ability to apply an analytical approach to biochemical problems. The experiment teaches…
Black Boxes in Analytical Chemistry: University Students' Misconceptions of Instrumental Analysis
ERIC Educational Resources Information Center
Carbo, Antonio Domenech; Adelantado, Jose Vicente Gimeno; Reig, Francisco Bosch
2010-01-01
Misconceptions of chemistry and chemical engineering university students concerning instrumental analysis have been established from coordinated tests, tutorial interviews and laboratory lessons. Misconceptions can be divided into: (1) formal, involving specific concepts and formulations within the general frame of chemistry; (2)…
Contributions of Analytical Chemistry to the Clinical Laboratory.
ERIC Educational Resources Information Center
Skogerboe, Kristen J.
1988-01-01
Highlights several analytical techniques that are being used in state-of-the-art clinical labs. Illustrates how other advances in instrumentation may contribute to clinical chemistry in the future. Topics include: biosensors, polarization spectroscopy, chemiluminescence, fluorescence, photothermal deflection, and chromatography in clinical…
Existential Measurement: A Factor Analytic Study of Some Current Psychometric Instruments.
ERIC Educational Resources Information Center
Thauberger, Patrick C.; And Others
1982-01-01
Research in existentialism and ontology has given rise to several psychometric instruments. Used both exploratory and confirmatory principal-factor analyses to study relationships among 16 existential scales. Exploratory factor analysis provided some support of the theory that the avoidance of existential confrontation is a central function of…
CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages
Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have begun replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440
Recent developments in hydrologic instrumentation
Latkovich, Vito J.; Futrell, James C.; Kane, Douglas L.
1986-01-01
The programs of the U.S. Geological Survey require instrumentation for collecting and monitoring hydrologic data in cold regions. The availability of space-age materials and the implementation of modern electronics and mechanics are making possible recent developments in hydrologic instrumentation, especially in the area of measuring streamflow under ice cover. Material developments include: synthetic-fiber sounding and tag lines; polymer (plastic) sheaves, pulleys, and sampler components; and polymer (plastic) current-meter bucket wheels. Electronic and mechanical developments include: a current-meter digitizer; a fiber-optic closure system for current meters; non-contact water-level sensors; an adaptable hydrologic data acquisition system; a minimum data recorder; an ice rod; an ice foot; a handled sediment sampler; a lightweight ice auger with improved cutter head and blades; and an ice chisel.
Monte Carlo simulations of neutron-scattering instruments using McStas
NASA Astrophysics Data System (ADS)
Nielsen, K.; Lefmann, K.
2000-06-01
Monte Carlo simulations have become an essential tool for improving the performance of neutron-scattering instruments, since the level of sophistication in the design of instruments is defeating purely analytical methods. The program McStas, being developed at Risø National Laboratory, includes an extension language that makes it easy to adapt it to the particular requirements of individual instruments, and thus provides a powerful and flexible tool for constructing such simulations. McStas has been successfully applied in such areas as neutron guide design, flux optimization, non-Gaussian resolution functions of triple-axis spectrometers, and time-focusing in time-of-flight instruments.
Comparison of icing cloud instruments for 1982-1983 icing season flight program
NASA Technical Reports Server (NTRS)
Ide, R. F.; Richter, G. P.
1984-01-01
A number of modern and old style liquid water content (LWC) and droplet sizing instruments were mounted on a DeHavilland DHC-6 Twin Otter and operated in natural icing clouds in order to determine their comparative operating characteristics and their limitations over a broad range of conditions. The evaluation period occurred during the 1982-1983 icing season from January to March 1983. Time histories of all instrument outputs were plotted and analyzed to assess instrument repeatability and reliability. Scatter plots were also generated for comparison of instruments. The measured LWC from four instruments differed by as much as 20 percent. The measured droplet size from two instruments differed by an average of three microns. The overall effort demonstrated the need for additional data, and for some means of calibrating these instruments to known standards.
Epilepsy analytic system with cloud computing.
Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei
2013-01-01
Biomedical data analytic systems have played an important role in clinical diagnosis for several decades, and analyzing these big data to provide decision support for physicians is now an emerging research area. This paper presents a parallelized web-based tool with a cloud computing service architecture for analyzing epilepsy. The system cascades several modern analytic functions: wavelet transform, a genetic algorithm (GA), and a support vector machine (SVM). To demonstrate the effectiveness of the system, it has been verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that our approach achieves a total classification accuracy higher than 90%. In addition, training is accelerated about 4.66-fold, and prediction time also meets real-time requirements.
42 CFR 493.1252 - Standard: Test systems, equipment, instruments, reagents, materials, and supplies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Test systems, equipment, instruments... REQUIREMENTS Quality System for Nonwaived Testing Analytic Systems § 493.1252 Standard: Test systems, equipment...) Temperature. (3) Humidity. (4) Protection of equipment and instruments from fluctuations and interruptions in...
Analysis of Variance in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2010-01-01
This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
Cranial index in a modern people of Thai ancestry
Jung, Hyunwoo
2018-01-01
The present research aims to examine the cranial index in a modern people of Thai ancestry. Ultimately, this study will help to create a databank containing a cranial index for the classifications of the people from Asia. In this study, 185 modern crania of people of supposed Thai ancestry were examined. They were collected from the Department of Anatomy at Chulalongkorn University in Bangkok, Thailand. The maximum cranial length and breadth were measured using standard anthropometric instruments based on Martin's methods. The cranial index was calculated using the equation ([maximum cranial breadth/maximum cranial length]×100). The mean cranial indices for the male and female skulls examined were 81.81±4.23 and 82.99±4.37, respectively. The most common type of skull in the modern Thai people in this study was the brachycranic type with a frequency of 42.7%, followed by the mesocranic (27.03%) and hyperbrachycranic types (25.59%). The rarest type observed in this study was the dolichocranic type (4.32%). The present study provides valuable data pertaining to the cranial index in a modern Thai population and reveals that modern Thai males and females belong to the brachycranic group. The results of this study will be of forensic anthropological importance to populations in close proximity to the location where the skulls studied here were sourced. PMID:29644107
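The index equation quoted in the abstract, together with the four skull types it reports, can be sketched directly. The cutoff values below are the commonly cited Martin-style thresholds, assumed here for illustration since the abstract does not state the exact boundaries used in the study.

```python
def cranial_index(breadth_mm: float, length_mm: float) -> float:
    """Cranial index = (maximum cranial breadth / maximum cranial length) x 100."""
    return breadth_mm / length_mm * 100.0

def classify(index: float) -> str:
    # Commonly cited Martin-style cutoffs (assumed; the study's exact
    # boundaries are not given in the abstract).
    if index < 75.0:
        return "dolichocranic"
    elif index < 80.0:
        return "mesocranic"
    elif index < 85.0:
        return "brachycranic"
    return "hyperbrachycranic"

# The reported male mean index of 81.81 falls in the brachycranic class,
# consistent with the study's conclusion.
print(classify(81.81))  # -> brachycranic
```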
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek
The recent rapid progress in the technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis offers the possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process-monitoring applications. However, quantification of results is still problematic in many cases. Other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. - Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical chemistry principles.
Engineering Bioluminescent Proteins: Expanding their Analytical Potential
Rowe, Laura; Dikici, Emre; Daunert, Sylvia
2009-01-01
Synopsis: Bioluminescence has been observed in nature since the dawn of time, but now, scientists are harnessing it for analytical applications. Laura Rowe, Emre Dikici, and Sylvia Daunert of the University of Kentucky describe the origins of bioluminescent proteins and explore their uses in the modern chemistry laboratory. The cover features spectra of bioluminescent light superimposed on an image of jellyfish, which are a common source of bioluminescent proteins. Images courtesy of Emre Dikici and Shutterstock. PMID:19725502
Thermo Scientific Ozone Analyzer Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springston, S. R.
The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3), reported at 1-s resolution in units of ppbv in ambient air. Note that, because of internal pneumatic switching limitations, the instrument only makes an independent measurement every 4 seconds; thus, the same concentration number is repeated roughly 4 times on the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments, which may be made at any time while data is being collected.
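The 1-s reporting versus 4-s independent-measurement behaviour described above can be illustrated with a short sketch. `independent_measurements` is a hypothetical helper, not part of the instrument software, and it assumes the repeated values align exactly with the 4-s boundaries.

```python
def independent_measurements(one_hz_ppbv, period=4):
    """Collapse a 1-s reported O3 stream to one value per instrument cycle.

    The analyzer repeats each independent reading roughly `period` times
    on the 1-s time base; taking every `period`-th sample recovers the
    underlying ~0.25 Hz measurement cadence. Simplified: real data may
    not align perfectly with the cycle boundaries.
    """
    return one_hz_ppbv[::period]

# Eight 1-s reports hide only two independent measurements.
stream = [30.1, 30.1, 30.1, 30.1, 29.8, 29.8, 29.8, 29.8]
print(independent_measurements(stream))  # -> [30.1, 29.8]
```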
Modern developments for ground-based monitoring of fire behavior and effects
Colin C. Hardy; Robert Kremens; Matthew B. Dickinson
2010-01-01
Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics and instrumentation to create a suite of ground-based diagnostics that can be used in the laboratory (~1 m2), at field scale...
MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications
Medina, Isabel; Cappiello, Achille; Careri, Maria
2018-01-01
Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural detail that gives a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 up to 2017. PMID:29850370
[Bavarian mental health reform 1851. An instrument of administrative modernization].
Burgmair, Wolfgang; Weber, Matthias M
2008-01-01
By 1850, the reform of institutional psychiatric care in Bavaria was given the highest priority by the monarchy and its administration. Cooperating with experts, especially the psychiatrist Karl August von Solbrig, they provided for new asylums to be established throughout Bavaria in a surprisingly short period of time. It was, however, only through the personal intervention of King Max II that the administrative and financial difficulties which had existed since the beginning of the 19th century could be overcome. The planning of asylums in each administrative district of Bavaria vividly reflects the rivalry as well as the cooperation among all governmental and professional agencies involved. The modernization of psychiatry was publicly justified by reference to scientism, the need for a more progressive restructuring of the administration, and the paternalistic care of the monarchy, whereas, from an administrative point of view, questions of psychiatric treatment, such as what kind of asylum would be best, were rather insignificant. The structures established by means of the alliance between state administration and psychiatric care under the rule of King Max II had a lasting effect on the further development of Bavaria.
Modern analytical methods for the detection of food fraud and adulteration by food category.
Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook
2017-09-01
This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.
Potential sources of analytical bias and error in selected trace element data-quality analyses
Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.
2016-09-28
Potential sources of analytical bias and error associated with laboratory analyses for selected trace elements, where concentrations were greater in filtered samples than in paired unfiltered samples, were evaluated by U.S. Geological Survey (USGS) Water Quality Specialists in collaboration with the USGS National Water Quality Laboratory (NWQL) and the Branch of Quality Systems (BQS). Causes for trace-element concentrations in filtered samples to exceed those in associated unfiltered samples have been attributed to variability in analytical measurements, analytical bias, sample contamination either in the field or laboratory, and (or) sample-matrix chemistry. These issues are not unique to data generated by the USGS NWQL; they have also been observed in data generated by other laboratories. This study continues the evaluation of potential analytical bias and error resulting from matrix chemistry and instrument variability by evaluating the performance of seven selected trace elements in paired filtered and unfiltered surface-water and groundwater samples collected from 23 sampling sites of varying chemistries in six States, along with matrix spike recoveries and standard reference materials. Filtered and unfiltered samples have been routinely analyzed on separate inductively coupled plasma-mass spectrometry instruments. Unfiltered samples are treated with hydrochloric acid (HCl) during an in-bottle digestion procedure; filtered samples are not routinely treated with HCl as part of the laboratory analytical procedure. To evaluate the influence of HCl on different sample matrices, an aliquot of the filtered samples was treated with HCl. The addition of HCl did little to differentiate the analytical results between filtered samples treated with HCl and those left untreated; however, there was a small, but noticeable, decrease in the number of instances where a particular trace-element concentration was greater in a filtered sample than in the associated unfiltered sample.
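The basic screening step behind this evaluation, counting paired samples in which the filtered concentration exceeds the unfiltered one, can be sketched as follows. `filtered_exceedances` is an illustrative helper, not the USGS procedure, which also weighs matrix chemistry, spike recoveries and standard reference materials.

```python
def filtered_exceedances(pairs, tolerance=0.0):
    """Count paired samples where the filtered concentration exceeds the
    unfiltered one by more than `tolerance` (same units for both).

    `pairs` is a sequence of (filtered, unfiltered) concentrations for a
    single trace element. A nonzero tolerance can absorb ordinary
    measurement variability before flagging an exceedance.
    """
    return sum(1 for filtered, unfiltered in pairs
               if filtered - unfiltered > tolerance)

# Hypothetical paired concentrations (ug/L): only the second pair has
# filtered > unfiltered.
pairs = [(1.2, 1.5), (2.4, 2.1), (0.9, 0.9)]
print(filtered_exceedances(pairs))  # -> 1
```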
NASA Astrophysics Data System (ADS)
Storchak, Dmitry; Di Giacomo, Domenico
2015-04-01
Systematic seismological observations of earthquakes using seismic instruments on a global scale began more than 100 years ago. Since then, seismologists have made many discoveries about the Earth's interior and the physics of earthquakes, thanks in part to major developments in the seismic instrumentation deployed around the world. Moreover, since the establishment of the first global networks (the Milne and Jesuit networks), seismologists around the world have stored and exchanged the results of routine observations (e.g., picking of arrival times, amplitude-period measurements) and of more sophisticated analyses (e.g., moment tensor inversion) in seismological bulletins and catalogues. With a project funded by the GEM Foundation (www.globalquakemodel.org), the ISC and a Team of International Experts released a new global earthquake catalogue, the ISC-GEM Global Instrumental Earthquake Catalogue (1900–2009) (www.isc.ac.uk/iscgem/index.php), which, unlike previous global seismic catalogues, covers the entire period of instrumental seismology, with locations and magnitudes re-assessed using modern approaches for the global earthquakes selected for processing (approximately 21,000 in the current version). During the 110 years covered by the ISC-GEM catalogue, many seismological developments occurred in instrumentation, seismological practice and knowledge of the physics of earthquakes. In this contribution we give a brief overview of the major milestones of the last 100+ years of instrumental seismology that were relevant for the production of the ISC-GEM catalogue, and of the major challenges we faced in obtaining a catalogue as homogeneous as possible.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.
2012-12-01
MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent preparing Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for the experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]
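The MapReduce approach to storage-based computation mentioned above can be illustrated with a minimal in-memory sketch: mappers emit partial results per data block, and a reducer combines them into a global answer. The partition values and helper names here are hypothetical; MERRA/AS itself runs such operations server-side against the vCDS rather than over local lists.

```python
from functools import reduce

# Hypothetical partitioned reanalysis values (e.g. temperatures in K),
# one list per data block held on a storage node.
partitions = [
    [280.1, 281.4, 279.9],
    [282.0, 280.7],
    [279.5, 281.1, 280.3],
]

def map_partition(block):
    # Map phase: each block independently emits a partial (sum, count).
    return (sum(block), len(block))

def reduce_pair(a, b):
    # Reduce phase: combine partial results pairwise.
    return (a[0] + b[0], a[1] + b[1])

total, count = reduce(reduce_pair, map(map_partition, partitions))
print(total / count)  # global mean computed without centralizing the data
```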
2016-07-27
make risk-informed decisions during serious games. Statistical models of intra-game performance were developed to determine whether behaviors in...specific facets of the gameplay workflow were predictive of analytical performance and game outcomes. A study of over seventy instrumented teams revealed...more accurate game decisions. Keywords: Humatics · Serious Games · Human-System Interaction · Instrumentation · Teamwork · Communication Analysis
Marketing urbanistyczny jako instrument aktywizacji turystyki
NASA Astrophysics Data System (ADS)
Polska, Anna
2009-01-01
The paper presents two case studies of local-government units from the Lublin voivodship. In order to stimulate the development of tourism and socio-economic activation, the authorities of both communes applied the tools of town-planning marketing. The instruments presented are the strategy of socio-economic development and the multi-year development plan. Particular attention is paid to the modernization of spatial structure and to transformations in the spheres of town planning and architecture.
Hyphenated analytical techniques for materials characterisation
NASA Astrophysics Data System (ADS)
Armstrong, Gordon; Kailas, Lekshmi
2017-09-01
This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the
ERIC Educational Resources Information Center
Vogt, Frank
2011-01-01
Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…
Streaming Swarm of Nano Space Probes for Modern Analytical Methods Applied to Planetary Science
NASA Astrophysics Data System (ADS)
Vizi, P. G.; Horvath, A. F.; Berczi, Sz.
2017-11-01
Streaming swarms make it possible to collect data from large fields at one time. The whole streaming fleet can behave like one big organization and can be realized as a planetary mission solution with stream-type analytical methods.
ERIC Educational Resources Information Center
Peters, Michael A.; Besley, Tina A. C.
2014-01-01
This article offers a broad philosophical and historical background to the dyad of social exclusion/inclusion by examining the analytics and politics of exclusion first by reference to Michel Foucault who studies the modern history of exclusion and makes it central to his approach in understanding the development of modern institutions of emerging…
Instrument control software requirement specification for Extremely Large Telescopes
NASA Astrophysics Data System (ADS)
Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca
2010-07-01
Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELT). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling tools like SysML.
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Mancinelli, Rocco; Martin, Joe; Holland, Paul M.; Stimac, Robert M.; Kaye, William J.
2005-01-01
The Mars Geochemical Instrument, MarGI, was developed to provide a comprehensive analysis of the rocks and surface material on Mars. The instrument combines Differential Thermal Analysis (DTA) with miniature Gas Chromatography-Ion Mobility Spectrometry (GC-IMS) to identify minerals, the presence and state of water, and organic compounds. Miniature pyrolysis ovens are used to both, conduct DTA analysis of soil or crushed rocks samples, and pyrolyze the samples at temperatures up to 1000 degrees C for GC-IMS analysis of the released gases. This combination of analytical processes and techniques, which can characterize the mineralogy of the rocks and soil, and identify and quantify volatiles released during pyrolysis, has applications across a wide range of target sites including comets, planets, asteroids, and moons such as Titan and Europa. The MarGI analytical approach evolved from the Cometary Ice and Dust Experiment (CIDEX) selected to fly on the Comet Rendezvous Asteroid Flyby Mission (CRAF).
Martian Soil Delivery to Analytical Instrument on Phoenix
NASA Technical Reports Server (NTRS)
2008-01-01
The Robotic Arm of NASA's Phoenix Mars Lander released a sample of Martian soil onto a screened opening of the lander's Thermal and Evolved-Gas Analyzer (TEGA) during the 12th Martian day, or sol, since landing (June 6, 2008). TEGA did not confirm that any of the sample had passed through the screen. The Robotic Arm Camera took this image on Sol 12. Soil from the sample delivery is visible on the sloped surface of TEGA, which has a series of parallel doors. The two doors for the targeted cell of TEGA are the one positioned vertically, at far right, and the one partially open just to the left of it. The soil between those two doors is resting on a screen designed to let fine particles through while keeping bigger ones from clogging the interior of the instrument. Each door is about 10 centimeters (4 inches) long. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
Caesium sputter ion source compatible with commercial SIMS instruments
NASA Astrophysics Data System (ADS)
Belykh, S. F.; Palitsin, V. V.; Veryovkin, I. V.; Kovarsky, A. P.; Chang, R. J. H.; Adriaens, A.; Dowsett, M.; Adams, F.
2006-07-01
A simple design for a caesium sputter cluster ion source compatible with commercially available secondary ion mass spectrometers is reported. The source has been tested with the Cameca IMS 4f instrument using Siₙ⁻ and Cuₙ⁻ cluster ions, and will shortly be retrofitted to the floating low-energy ion gun (FLIG) of the type used on the Cameca 4500/4550 quadrupole instruments. Our surface characterization and depth profiling experiments conducted to date demonstrate improvements in the analytical capabilities of the SIMS instrument due to the non-additive enhancement of secondary ion emission and the shorter ion ranges of polyatomic projectiles compared with atomic ions at the same impact energy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenneth Thomas
2012-02-01
Life extension beyond 60 years for the U.S. operating nuclear fleet requires that instrumentation and control (I&C) systems be upgraded to address aging and reliability concerns. It is impractical for the legacy systems based on 1970s-vintage technology to operate over this extended time period. Indeed, utilities have successfully engaged in such replacements when dictated by these operational concerns. However, the replacements have been approached in a like-for-like manner, meaning that they do not take advantage of the inherent capabilities of digital technology to improve business functions. And so, the improvement in I&C system performance has not translated to bottom-line performance improvement for the fleet. Therefore, wide-scale modernization of the legacy I&C systems could prove to be cost-prohibitive unless the technology is implemented in a manner that enables significant business innovation as a means of offsetting the cost of upgrades. A Future Vision of a transformed nuclear plant operating model based on an integrated digital environment has been developed as part of the Advanced Instrumentation, Information, and Control (II&C) research pathway, under the Light Water Reactor (LWR) Sustainability Program. This is a research and development program sponsored by the U.S. Department of Energy (DOE), performed in close collaboration with the nuclear utility industry, to provide the technical foundations for licensing and managing the long-term, safe and economical operation of current nuclear power plants. DOE's program focus is on longer-term and higher-risk/reward research that contributes to the national policy objectives of energy security and environmental security. The Advanced II&C research pathway is being conducted by the Idaho National Laboratory (INL). The Future Vision is based on a digital architecture that encompasses all aspects of plant operations and support, integrating plant systems, plant work processes, and plant workers in a
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenneth Thomas; Bruce Hallbert
2013-02-01
Life extension beyond 60 years for the U.S. operating nuclear fleet requires that instrumentation and control (I&C) systems be upgraded to address aging and reliability concerns. It is impractical for the legacy systems, based on 1970s-vintage technology, to operate over this extended time period. Indeed, utilities have successfully engaged in such replacements when dictated by these operational concerns. However, the replacements have been approached in a like-for-like manner, meaning that they do not take advantage of the inherent capabilities of digital technology to improve business functions. As a result, the improvement in I&C system performance has not translated to bottom-line performance improvement for the fleet. Therefore, wide-scale modernization of the legacy I&C systems could prove to be cost-prohibitive unless the technology is implemented in a manner that enables significant business innovation as a means of offsetting the cost of upgrades. A Future Vision of a transformed nuclear plant operating model based on an integrated digital environment has been developed as part of the Advanced Instrumentation, Information, and Control (II&C) research pathway under the Light Water Reactor (LWR) Sustainability Program. This is a research and development program sponsored by the U.S. Department of Energy (DOE), performed in close collaboration with the nuclear utility industry, to provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. DOE's program focus is on longer-term and higher-risk/reward research that contributes to the national policy objectives of energy security and environmental security. The Advanced II&C research pathway is being conducted by the Idaho National Laboratory (INL). The Future Vision is based on a digital architecture that encompasses all aspects of plant operations and support, integrating plant systems, plant work processes, and plant workers in a
NASA Astrophysics Data System (ADS)
Kabanov, Mikhail V.
2002-02-01
A peculiarity of nature and climate changes in the middle latitudes of the Northern Hemisphere, and in Siberia in particular, is that the temporal variability of meteorological quantities has a wide range while their spatial variability has a complicated zonal structure. Regional monitoring of modern nature and climate changes in Siberia is therefore of scientific interest from the viewpoint of the global changes observed. Another Siberian peculiarity is the presence of many unique objects of global importance, both natural complexes (boreal forests, water-bog systems, Lake Baikal, etc.) and technogenic objects (oil and gas production, coal mining, metallurgy, transport, etc.). Monitoring and modeling of regional nature and climate changes in Siberia therefore have great practical importance for the industrial development of Siberia, an importance that is currently underestimated. Taking into account the above peculiarities and the tendencies in the investigation of global and regional environmental and climate changes, the multidisciplinary project on Climate and Ecological Monitoring of Siberia (CEMS) was accepted into the research and development program "Sibir'" in 1993. To realize this project, the Climate and Ecological Observatory was established in Tomsk at the Institute for Optical Monitoring (IOM) SB RAS. At present, the stations (basic and background) of this observatory are in progress, and the theory and instruments for monitoring are being developed as well. In this paper we discuss some results obtained in the framework of the CEMS project, partially published in monographs and scientific journals and to appear in the Proceedings of the 8th Joint International Symposium on Atmospheric and Ocean Optics and Atmosphere Physics. The purpose of this review is not only to discuss the regularities obtained but also to formulate scientific and technical tasks for further investigations into the regional changes of technogenic, natural, and
Introduction to Instrumental Analysis of Water Pollutants. Training Manual.
ERIC Educational Resources Information Center
Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.
This course is designed for those requiring an introduction to instruments commonly used in water pollution analyses. Examples include pH meters, conductivity meters, dissolved oxygen meters, spectrophotometers, turbidimeters, carbon analyzers, and gas chromatographs. Students should have a basic knowledge of analytical chemistry. (CO)
Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel
NASA Astrophysics Data System (ADS)
Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.
2008-12-01
The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market-driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services, including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), a digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and homemade user applications for workflow-specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and
Random Forest as a Predictive Analytics Alternative to Regression in Institutional Research
ERIC Educational Resources Information Center
He, Lingjun; Levine, Richard A.; Fan, Juanjuan; Beemer, Joshua; Stronach, Jeanne
2018-01-01
In institutional research, modern data mining approaches are seldom considered to address predictive analytics problems. The goal of this paper is to highlight the advantages of tree-based machine learning algorithms over classic (logistic) regression methods for data-informed decision making in higher education problems, and stress the success of…
Build Your Own Photometer: A Guided-Inquiry Experiment to Introduce Analytical Instrumentation
ERIC Educational Resources Information Center
Wang, Jessie J.; Núñez, José R. Rodríguez; Maxwell, E. Jane; Algar, W. Russ
2016-01-01
A guided-inquiry project designed to teach students the basics of spectrophotometric instrumentation at the second year level is presented. Students design, build, program, and test their own single-wavelength, submersible photometer using low-cost light-emitting diodes (LEDs) and inexpensive household items. A series of structured prelaboratory…
Four Bad Habits of Modern Psychologists
Grice, James; Cota, Lisa; Taylor, Zachery; Garner, Samantha; Medellin, Eliwid; Vest, Adam
2017-01-01
Four data sets from studies included in the Reproducibility Project were re-analyzed to demonstrate a number of flawed research practices (i.e., “bad habits”) of modern psychology. Three of the four studies were successfully replicated, but re-analysis showed that in one study most of the participants responded in a manner inconsistent with the researchers’ theoretical model. In the second study, the replicated effect was shown to be an experimental confound, and in the third study the replicated statistical effect was shown to be entirely trivial. The fourth study was an unsuccessful replication, yet re-analysis of the data showed that questioning the common assumptions of modern psychological measurement can lead to novel techniques of data analysis and potentially interesting findings missed by traditional methods of analysis. Considered together, these new analyses show that while it is true replication is a key feature of science, causal inference, modeling, and measurement are equally important and perhaps more fundamental to obtaining truly scientific knowledge of the natural world. It would therefore be prudent for psychologists to confront the limitations and flaws in their current analytical methods and research practices. PMID:28805739
Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek
2015-07-01
The recent rapid progress in the technology of field-portable instruments has increased their application in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid, and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field-portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses, and electronic tongues. The use of portable instruments in environmental sample analysis offers the possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. Other disadvantages include higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field-portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.
The Pavlovian analysis of instrumental conditioning.
Gormezano, I; Tait, R W
1976-01-01
An account was given of the development within the Russian literature of a uniprocess formulation of classical and instrumental conditioning, known as the bidirectional conditioning hypothesis. The hypothesis purports to offer a single set of Pavlovian principles to account for both paradigms, based upon a neural model which assumes that bidirectional (forward and backward) connections are formed in both classical and instrumental conditioning situations. In instrumental conditioning, the bidirectional connections are hypothesized to be simply more complex than those in classical conditioning, and any differences in empirical functions are presumed to lie not in differences in mechanism, but in the strength of the forward and backward connections. Although bidirectional connections are assumed to develop in instrumental conditioning, the experimental investigation of the bidirectional conditioning hypothesis has been essentially restricted to the classical conditioning operations of pairing two CSs (sensory preconditioning training), a US followed by a CS (backward conditioning training), and two USs. However, the paradigm involving the pairing of two USs, because of theoretical and analytical considerations, is the one most commonly employed by Russian investigators. The results of an initial experiment involving the pairing of two USs, and reference to the results of a more extensive investigation, lead us tentatively to question the validity of the bidirectional conditioning account of instrumental conditioning.
NASA Technical Reports Server (NTRS)
Saha, C. P.; Bryson, C. E.; Sarrazin, P.; Blake, D. F.
2005-01-01
Many Mars in situ instruments require fine-grained, high-fidelity samples of rocks or soil. Included are instruments for the determination of mineralogy as well as organic and isotopic chemistry. Powder can be obtained as a primary objective of a sample collection system (e.g., by collecting powder as a surface is abraded by a rotary abrasion tool (RAT)) or as a secondary objective (e.g., by collecting drill powder as a core is drilled). In the latter case, a properly designed system could be used to monitor drilling in real time as well as to deliver powder to analytical instruments that would perform analyses complementary to those later performed on the intact core. In addition, once a core or other sample is collected, a system that could transfer intelligently collected subsamples of powder from the intact core to a suite of analytical instruments would be highly desirable. We have conceptualized, developed, and tested a breadboard Powder Delivery System (PoDS) intended to satisfy the collection, processing, and distribution requirements of powder samples for Mars in-situ mineralogic, organic, and isotopic measurement instruments.
Nineteenth Century Long-Term Instrumental Records, Examples From the Southeastern United States
NASA Astrophysics Data System (ADS)
Mock, C. J.
2001-12-01
Early instrumental records in the United States, defined as those operating before 1892 (the period prior to the modern climate record), provide a longer perspective on climatic variability at decadal and interannual timescales. Such reconstructions also provide a means of verification for other proxy data. This paper provides an American perspective on historical climatic research, emphasizing the urgent need to properly evaluate data quality and apply the corrections necessary to make early records compatible with the modern record. Different fixed observation times, different practices of weather instrument exposure, and statistical methods for calibration are the main issues in applying corrections and conducting proper climatic interpretations. I illustrate several examples of the methodologies of this historical climatic research, focusing on the following in the Southeastern United States: daily reconstructed temperature time series centered on Charleston, SC, and Natchez, MS, back to the late eighteenth century, and precipitation frequency reconstructions during the antebellum period for the Gulf Coast and coastal Southeast Atlantic states. Results indicate several prominent extremes unprecedented in the modern record, such as the widespread warm winter of 1827-28 and the severe cold winters of 1856 and 1857. The reconstructions also yield important information concerning responses to past ENSO events, the PNA, the NAO, and the PDO, particularly when compared with instrumental data from other regions. A high potential also exists for applying the climate reconstructions to assess historical climatic impacts on society in the Southeast, for example to understand climatic linkages to famous case studies of Yellow Fever epidemics and severe drought.
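The fixed-observation-time problem mentioned above can be made concrete with a small sketch. The weighting below is the classic "Mannheim hours" formula historically used to reduce fixed-hour readings (07:00, 14:00, 21:00) to a daily mean; the function names and sample values are illustrative, and a real reconstruction would calibrate station-specific corrections against overlapping modern data.

```python
# Hedged sketch: reducing fixed-hour temperature readings to a daily
# mean. The (T07 + T14 + 2*T21)/4 weighting is the historical
# "Mannheim hours" formula; modern records instead use (Tmax + Tmin)/2,
# so the two conventions must be reconciled before comparison.

def mannheim_daily_mean(t07, t14, t21):
    """Daily mean from three fixed-hour readings (evening counted twice)."""
    return (t07 + t14 + 2.0 * t21) / 4.0

def modern_daily_mean(tmax, tmin):
    """Modern convention: average of the daily extremes."""
    return (tmax + tmin) / 2.0

# Illustrative values (degrees C): the two conventions generally
# disagree by a station-dependent offset that must be estimated.
early = mannheim_daily_mean(10.0, 20.0, 14.0)   # 14.5
modern = modern_daily_mean(22.0, 8.0)           # 15.0
```

In practice the offset between the two conventions varies seasonally and by station exposure, which is why metadata screening matters as much as the formula itself.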
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow several operations to be integrated: sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information is presented about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir-bar sorptive extraction (SBSE), and magnetic solid-phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
Sample Analysis at Mars Instrument Simulator
NASA Technical Reports Server (NTRS)
Benna, Mehdi; Nolan, Tom
2013-01-01
The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and of volatiles extracted from solid samples. SAMSIM was developed using the Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument's electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing, or even eliminating entirely, the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs of this synthetic C&DH are mapped to virtual sensors and command lines that mimic, in their structure and connectivity, the layout of the instrument harnesses. This module executes
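The faster-than-real-time propagation idea can be illustrated with a toy model. This is not SAMSIM code; the component, parameters, and time constants are invented for illustration only.

```python
# Toy sketch of faster-than-real-time state propagation: a single
# hypothetical oven component modeled as a first-order lag toward its
# heater setpoint (Newton's law of heating). Hours of experiment time
# are stepped through in well under a second of wall time.

def simulate_oven(setpoint_c, t_env_c=20.0, k=0.001, dt_s=1.0, hours=6.0):
    """Propagate oven temperature with a fixed time step for `hours` hours."""
    temp = t_env_c
    for _ in range(int(hours * 3600.0 / dt_s)):
        temp += k * (setpoint_c - temp) * dt_s  # first-order response
    return temp

final_temp = simulate_oven(900.0)  # approaches the 900 C setpoint
```

A full simulator propagates many coupled states (electrical, thermal, gas flow) per component in the same fashion, which is what makes minutes-long predictions of hours-long sequences possible.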
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hierlemann, A.; Hill, M.; Ricco, A.J.
We have developed instrumentation to enable the combination of surface acoustic wave (SAW) sensor measurements with direct, in-situ molecular spectroscopic measurements, in order to understand the response of SAW sensors with respect to the interfacial chemistry of surface-confined sensing films interacting with gas-phase analytes. Specifically, instrumentation and software were developed to perform in-situ Fourier-transform infrared external-reflectance spectroscopy (FTIR-ERS) on operating SAW devices during dosing of their chemically modified surfaces with analytes. By probing the surface with IR spectroscopy during gas exposure, it is possible to understand in unprecedented detail the interaction processes between the sorptive SAW coatings and the gaseous analyte molecules. In this report, we provide details of this measurement system and demonstrate the utility of these combined measurements by characterizing the SAW and FTIR-ERS responses of organic thin-film sensor coatings interacting with gas-phase analytes.
ERIC Educational Resources Information Center
Elias, Ryan J.; Hopfer, Helene; Hofstaedter, Amanda N.; Hayes, John E.
2017-01-01
The human nose is a very sensitive detector and is able to detect potent aroma compounds down to low ng/L levels. These levels are often below detection limits of analytical instrumentation. The following laboratory exercise is designed to compare instrumental and human methods for the detection of volatile odor active compounds. Reference…
NASA Astrophysics Data System (ADS)
Bacon, Alice Louise
This thesis represents a new interdisciplinary approach to the conservation, care, and curatorial study of 'brass' wind musical instruments. It attempts to combine metallurgical, chronological, and historical aspects for a selection of instruments. The research consists of the systematic study of seventy-seven musical instruments, by known makers, using standardised non-destructive energy-dispersive X-ray fluorescence (XRF). Such compositional data are virtually non-existent for historical 'brass' instruments in Britain, and the few technical data that do exist are sporadic in quantity and quality. The development of brass instruments is interwoven with the history of brass making, but because there are a limited number of appropriate examples, such links can be difficult to identify. This thesis describes the development of brass production from the cementation process to the commercial production of zinc and modern brass, and links its relationship to the musical instrument industry in Britain with historical evidence. It is shown that innovation and known historical metallurgical achievements are reflected in the compositional changes of the alloys used for musical instruments. The thesis focuses on specific named makers of brass wind musical instruments and sets out to investigate the extent to which a single analytical technique, such as non-destructive XRF analysis, can be useful in the curatorial and conservation care of musical instruments. The results of the analyses revealed new aspects of the use of metals for making musical instruments. They give new information on approximate alloy compositions; in particular, the results have shown that in seventeenth-century England a ternary alloy of copper/tin/zinc was used, and that it was perhaps only superseded by brass (a copper/zinc alloy) in the eighteenth century. It has been possible to arrange the results into a chronology of alloys, particularly reflecting the change from the
The Importance of Proper Intensity Calibration for Raman Analysis of Low-Level Analytes in Water
Modern dispersive Raman spectroscopy offers unique advantages for the analysis of low-concentration analytes in aqueous solution. However, we have found that proper intensity calibration is critical for obtaining these benefits. This is true not only for producing spectra with ...
ERIC Educational Resources Information Center
Sabnani, Haresh B.; Ponterotto, Joseph G.
1992-01-01
Reviews eight instruments specifically conceptualized and developed for use in racial/ethnic minority-focused psychological research: Cultural Mistrust Inventory, African Self-Consciousness Scale, Cross-Cultural Counseling Inventory-Revised, Modern Racism Scale, Value Orientation Scale, Acculturation Rating Scale for Mexican Americans, Racial…
Benazzi, Stefano
2012-01-01
The discovery of new human fossil remains is one of the most obvious ways to improve our understanding of the dynamics of human evolution. The reanalysis of existing fossils using newer methods is also crucial, and may lead to a reconsideration of the biological and taxonomical status of some specimens and improve our understanding of highly debated periods in human prehistory. This is particularly true for those remains that have previously been studied using traditional approaches, with only morphological descriptions and standard calliper measurements available. My own interest in the Uluzzian and its associated human remains grew from my interest in applying recently developed analytical techniques to quantify morphological variation. Discovered more than 40 years ago, the two deciduous molars from Grotta del Cavallo (Apulia, Italy) are the only human remains associated with the Uluzzian culture (one of the three main European "transitional" cultures). These teeth were previously attributed to Neanderthals. This attribution contributed to a consensus view that the Uluzzian, with its associated ornament and tool complexes, was produced by Neanderthals. A reassessment of these deciduous teeth by means of digital morphometric analysis revealed that they belong to anatomically modern humans (AMHs). This finding contradicts previous assumptions and suggests that modern humans, and not Neanderthals, created the Uluzzian culture. Of equal importance, new chronometric analyses date these dental remains to 43,000-45,000 cal BP. Thus, the teeth from Grotta del Cavallo represent the oldest European AMH remains currently known.
NASA Astrophysics Data System (ADS)
Dodds, S. F.; Mock, C. J.
2009-12-01
All available instrumental winter precipitation data for the Central Valley of California back to 1850 were digitized and analyzed to construct continuous time series. Many of these data, in paper or microfilm format, predate modern National Weather Service Cooperative Data Program and Historical Climate Network data, and were recorded by volunteer observers from networks such as the US Army Surgeon General, the Smithsonian Institution, and the US Army Signal Service. Given the temporal incompleteness of individual records, detailed documentary data from newspapers, personal diaries and journals, ship logbooks, and weather enthusiasts' instrumental records were used in conjunction with the instrumental data to reconstruct precipitation frequency per month and season and continuous days of precipitation, and to identify anomalous precipitation events. Multilinear regression techniques, using surrounding stations and the relationships between modern and historical records, bridged timeframes lacking data and ensured the homogeneity of the time series. The metadata for each station were carefully screened, and notes were made about any possible changes to the instrumentation, the location of instruments, or untrained observers, to verify that anomalous events were not recorded incorrectly. Precipitation in the Central Valley varies throughout the entire region, but waterways link the differing elevations and latitudes. This study integrates the individual station data with additional accounts of flood descriptions from unique newspaper and journal sources. River heights and the extent of floods inundating cities, agricultural lands, and individual homes are often recorded within these documentary sources, which add to the understanding of flood occurrence within this area. Comparisons were also made between dam and levee construction through time and how waters are diverted through cities in natural and anthropogenically changed environments. Some precipitation that lead to flooding events that
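The regression-based bridging described above can be sketched as follows; the station names, values, and two-neighbor setup are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of gap-filling by multiple linear regression on
# neighboring stations: fit on months where all series overlap, then
# predict the target station's missing months from its neighbors.
import numpy as np

def fit_gap_model(target, neighbors):
    """Least-squares coefficients (intercept first), fit on rows
    where the target and every neighbor have data."""
    X = np.column_stack(neighbors)
    ok = ~np.isnan(target) & ~np.isnan(X).any(axis=1)
    A = np.column_stack([np.ones(ok.sum()), X[ok]])
    coef, *_ = np.linalg.lstsq(A, target[ok], rcond=None)
    return coef

def predict_gaps(coef, neighbors):
    X = np.column_stack(neighbors)
    return coef[0] + X @ coef[1:]

# Illustrative monthly precipitation totals (inches); NaN marks a gap.
sacramento = np.array([3.1, np.nan, 2.0, 4.2, 1.1, 0.3])
stockton   = np.array([2.8, 2.5,    1.9, 3.9, 1.0, 0.4])
marysville = np.array([3.4, 3.0,    2.2, 4.5, 1.3, 0.2])

coef = fit_gap_model(sacramento, [stockton, marysville])
filled = np.where(np.isnan(sacramento),
                  predict_gaps(coef, [stockton, marysville]),
                  sacramento)
```

A real reconstruction would also screen the regression residuals against documentary evidence before accepting a filled value as homogeneous.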
Sub-federal ecological modernization: A case study of Colorado's new energy economy
NASA Astrophysics Data System (ADS)
Giannakouros, Stratis
European nations have often employed policies of explicit government intervention as a preferred means of addressing environmental and economic challenges. These policies have ranged from grey industrial policies focused solely on industrial growth, competitiveness, and innovation to policies of stronger ecological modernization, which seek to align industrial interests with environmental protection. In recent years these policies have been mobilized to address the threat of climate change and promote environmental innovation. While some US Administrations have similarly recognized the need to address these challenges, the particular historical and political institutional dynamics of the US have meant that explicit government intervention has been eschewed in favor of more indirect strategies when dealing with economic and environmental challenges. This is evident in the rise of sub-federal policies at the level of US states. Supported by federal laboratories and public research, US states have adopted policies that look very much like sub-federal versions of industrial or ecological modernization policy. This thesis uses the Colorado case to highlight the importance of sub-federal institutions in addressing environmental and economic challenges in the US and to explore its similarities to, and differences from, European approaches. To achieve this goal it first develops an analytical scheme within which to place policy initiatives on a continuum from grey industrial policy to strong ecological modernization policy by identifying key institutions that are influential in each policy type. This analytical scheme is then applied to the transitional renewable energy policy period from 2004-2012 in the state of Colorado. This period starts with the adoption of a renewable energy portfolio in 2004 and includes the 'new energy economy' period from 2007-2010 as well as the years since. Looking at three key turning points, this thesis interprets the 'new energy economy' strategy
ANALYTICAL METHOD COMPARISONS BY ESTIMATES OF PRECISION AND LOWER DETECTION LIMIT
The paper describes the use of principal component analysis to estimate the operating precision of several different analytical instruments or methods simultaneously measuring a common sample of a material whose actual value is unknown. This approach is advantageous when none of ...
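The abstract above is truncated, but the core idea of the approach it describes, treating the variation shared across instruments as the signal and each instrument's residual variation about it as imprecision, can be sketched with simulated data. Everything below is an illustrative reconstruction, not the report's actual procedure; the sample values, noise levels, and variable names are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scenario: three instruments repeatedly measure common samples of
# a material whose actual value is unknown to the analyst.
true_values = rng.normal(50.0, 5.0, size=200)
sigmas = np.array([0.2, 0.5, 1.0])   # per-instrument imprecision (to be recovered)
X = true_values[:, None] + rng.normal(0.0, sigmas, size=(200, 3))

# The first principal component of the centered data approximates the common
# (true-value) axis; residuals about it reflect instrument-specific noise.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                              # loadings of the first component
scores = Xc @ pc1                        # common-factor score per sample
residuals = Xc - np.outer(scores, pc1)   # variation PC1 cannot explain
est_sd = residuals.std(axis=0, ddof=1)   # precision estimate per instrument

# est_sd ranks the instruments by imprecision without ever knowing true_values.
```

Note that because every instrument loads on the common component, the residual standard deviations are biased estimates of the underlying sigmas; they are useful for ranking and comparing instruments rather than as absolute imprecision values.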
Social Learning Analytics: Navigating the Changing Settings of Higher Education
ERIC Educational Resources Information Center
de Laat, Maarten; Prinsen, Fleur R.
2014-01-01
Current trends and challenges in higher education (HE) require a reorientation towards openness, technology use and active student participation. In this article we will introduce Social Learning Analytics (SLA) as instrumental in formative assessment practices, aimed at supporting and strengthening students as active learners in increasingly open…
The role of multi-target policy instruments in agri-environmental policy mixes.
Schader, Christian; Lampkin, Nicholas; Muller, Adrian; Stolze, Matthias
2014-12-01
The Tinbergen Rule has been used to criticise multi-target policy instruments for being inefficient. The aim of this paper is to clarify the role of multi-target policy instruments using the case of agri-environmental policy. Employing an analytical linear optimisation model, this paper demonstrates that there is no general contradiction between multi-target policy instruments and the Tinbergen Rule, if multi-target policy instruments are embedded in a policy-mix with a sufficient number of targeted instruments. We show that the relation between cost-effectiveness of the instruments, related to all policy targets, is the key determinant for an economically sound choice of policy instruments. If economies of scope with respect to achieving policy targets are realised, a higher cost-effectiveness of multi-target policy instruments can be achieved. Using the example of organic farming support policy, we discuss several reasons why economies of scope could be realised by multi-target agri-environmental policy instruments. Copyright © 2014 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Parkes, Jenny; Heslop, Jo; Januario, Francisco; Oando, Samwel; Sabaa, Susan
2016-01-01
This paper interrogates the influence of a tradition-modernity dichotomy on perspectives and practices on sexual violence and sexual relationships involving girls in three districts of Kenya, Ghana and Mozambique. Through deploying an analytical framework of positioning within multiple discursive sites, we argue that although the dichotomy…
Fogarty, Laurel; Wakano, Joe Yuichiro; Feldman, Marcus W; Aoki, Kenichi
2017-03-01
The forces driving cultural accumulation in human populations, both modern and ancient, are hotly debated. Did genetic, demographic, or cognitive features of behaviorally modern humans (as opposed to, say, early modern humans or Neanderthals) allow culture to accumulate to its current, unprecedented levels of complexity? Theoretical explanations for patterns of accumulation often invoke demographic factors such as population size or density, whereas statistical analyses of variation in cultural complexity often point to the importance of environmental factors, such as food stability, in determining cultural complexity. Here we use both an analytical model and an agent-based simulation model to show that a full understanding of the emergence of behavioral modernity, and the cultural evolution that has followed, depends on understanding and untangling the complex relationships among culture, genetically determined cognitive ability, and demographic history. For example, we show that a small but growing population could have a different number of cultural traits from a shrinking population with the same absolute number of individuals in some circumstances.
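As a toy illustration of the demographic side of this debate (this is not the authors' analytical or agent-based model; the invention and loss probabilities are invented for the sketch), a minimal trait-accumulation simulation shows how population size alone can shift the equilibrium number of cultural traits:

```python
import random

random.seed(1)

def simulate_traits(pop_size, generations=200, p_invent=0.01, p_loss=0.05):
    """Toy accumulation model: each generation, every individual may invent a
    new trait with probability p_invent, and each existing trait is
    independently lost with probability p_loss."""
    traits = 0
    for _ in range(generations):
        traits += sum(random.random() < p_invent for _ in range(pop_size))
        traits = sum(random.random() > p_loss for _ in range(traits))
    return traits

small_pop_traits = simulate_traits(50)    # equilibrium near 50 * 0.01 / 0.05 = 10
large_pop_traits = simulate_traits(500)   # equilibrium near 100
```

In this birth-death caricature the expected equilibrium is pop_size * p_invent / p_loss, so larger populations sustain more traits; the abstract's point is precisely that such demographic effects interact with cognition and environment in more complex ways than this sketch captures.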
Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique
2018-03-01
Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The data obtained were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate, and tHb; i-STAT CG8+: pO2, Na+, iCa2+, and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on co-oximetry results was found. In contrast, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources, such as interferences, and automatic corrective actions, along with the analytical performance, ease of use, and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Popplow, Marcus
2015-12-01
Recent critical approaches to what has conventionally been described as "scientific" and "technical" knowledge in early modern Europe have provided a wealth of new insights. So far, the various analytical concepts suggested by these studies have not yet been comprehensively discussed. The present essay argues that such comprehensive approaches might prove of special value for long-term and cross-cultural reflections on technology-related knowledge. As heuristic tools, the notions of "formalization" and "interaction" are proposed as part of alternative narratives to those highlighting the emergence of "science" as the most relevant development for technology-related knowledge in early modern Europe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi
2017-01-01
Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338
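The paper's own threshold definition is not reproduced in the abstract, but a common noise-based construction, the mean background level plus a multiple of its standard deviation, can be sketched as follows. The read counts and the multiplier k below are hypothetical, chosen only to illustrate the shape of such a calculation.

```python
import statistics

# Hypothetical per-locus noise read counts observed in negative or
# known-source control samples.
noise_reads = [3, 5, 2, 7, 4, 6, 3, 5, 4, 8, 2, 6]

def analytical_threshold(noise, k=3.0):
    """Generic noise-based threshold: mean + k standard deviations.

    This mirrors the common limit-of-detection construction; the paper's
    actual definition of background noise may differ in detail.
    """
    mu = statistics.mean(noise)
    sd = statistics.stdev(noise)
    return mu + k * sd

at = analytical_threshold(noise_reads)
# Read counts at or below `at` are treated as indistinguishable from noise.
```

With these example numbers the threshold comes out a little above 10 reads, so a locus observed with, say, 9 reads would be called noise under this sketch.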
NASA Technical Reports Server (NTRS)
Lesco, D. J.; Weikle, D. H.
1980-01-01
Topics related to wideband electric power measurement, namely electronic wattmeter calibration and specification, are discussed. Tested calibration techniques are described in detail. Analytical methods used to determine the bandwidth requirements of instrumentation for switching-circuit waveforms are presented and illustrated with examples from electric vehicle applications. Analog multiplier wattmeters, digital wattmeters and calculating digital oscilloscopes are compared. The instrumentation characteristics that are critical to accurate wideband power measurement are described.
Analytical Protein Microarrays: Advancements Towards Clinical Applications
Sauer, Ursula
2017-01-01
Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence measurements, and the possibility of functional integration. So far, especially fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, long analysis times, a lack of high-quality antibodies and validated reagents, a lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solving these problems. PMID:28146048
The XGS instrument on-board THESEUS
NASA Astrophysics Data System (ADS)
Fuschino, F.; Campana, R.; Labanti, C.; Marisaldi, M.; Amati, L.; Fiorini, M.; Uslenghi, M.; Baldazzi, G.; Evangelista, Y.; Elmi, I.; Feroci, M.; Frontera, F.; Rachevski, A.; Rignanese, L. P.; Vacchi, A.; Zampa, G.; Zampa, N.; Rashevskaya, I.; Bellutti, P.; Piemonte, C.
2016-10-01
Consolidated techniques for space-borne X-ray and gamma-ray instruments are based on the use of scintillators coupled to silicon photo-detectors. This technology, combined with modern very-low-noise read-out electronics, allows the design of innovative architectures that drastically reduce system complexity and power consumption, even with a moderate-to-high number of channels. These detector architectures can be exploited in the design of space instrumentation for gamma-ray spectroscopy, with the benefit of possible smart background-rejection strategies. We describe a detector prototype with 3D imaging capabilities to be employed in future gamma-ray and particle space missions in the 0.002-100 MeV energy range. The instrument is based on a stack of scintillating bars read out by Silicon Drift Detectors (SDDs) at both ends. The spatial segmentation and the crystal double-side readout allow a 3D position reconstruction with ∼3 mm accuracy within the full active volume, using a 2D readout along the two external faces of the detector. Furthermore, one side of the SDDs can be used simultaneously to detect X-rays in the 2-30 keV energy range. These characteristics make the instrument, which is briefly illustrated here, suitable for next-generation gamma-ray and particle space missions for Earth or outer-space observations.
DOT National Transportation Integrated Search
1974-01-01
The report contains the results of an experimental and analytical evaluation of instruments and techniques designed to prevent an intoxicated driver from operating his automobile. The prototype 'Alcohol Safety Interlock Systems' tested were developed...
Chandra ACIS-I particle background: an analytical model
NASA Astrophysics Data System (ADS)
Bartalucci, I.; Mazzotta, P.; Bourdin, H.; Vikhlinin, A.
2014-06-01
Aims: Imaging and spectroscopy of X-ray extended sources require a proper characterisation of a spatially unresolved background signal. This background includes sky and instrumental components, each of which is characterised by its own spatial and spectral behaviour. While the X-ray sky background has been extensively studied in previous work, here we analyse and model the instrumental background of the ACIS-I detector on board the Chandra X-ray observatory in very faint mode. Methods: Caused by the interaction of highly energetic particles with the detector, the ACIS-I instrumental background is spectrally characterised by the superimposition of several fluorescence emission lines onto a continuum. To isolate its flux from any sky component, we fitted an analytical model of the continuum to observations performed in very faint mode with the detector in the stowed position, shielded from the sky, gathered over the eight-year period starting in 2001. The remaining emission lines were fitted to blank-sky observations of the same period. We found 11 emission lines. Analysing the spatial variation of the amplitude, energy and width of these lines has further allowed us to infer that three of these lines are presumably due to an energy-correction artefact produced in the frame store. Results: We provide an analytical model that predicts the instrumental background with a precision of 2% in the continuum and 5% in the lines. We use this model to measure the flux of the unresolved cosmic X-ray background in the Chandra deep field south. We obtain a flux of 10.2 (+0.5/−0.4) × 10⁻¹³ erg cm⁻² deg⁻² s⁻¹ for the [1-2] keV band and (3.8 ± 0.2) × 10⁻¹² erg cm⁻² deg⁻² s⁻¹ for the [2-8] keV band.
Widening Participation, the Instrumentalization of Knowledge and the Reproduction of Inequality
ERIC Educational Resources Information Center
Mavelli, Luca
2014-01-01
According to Michel Foucault, modernity is predicated on the emergence of an instrumental idea of knowledge, which does not affect the constitution of the individual as a subject. This article aims to explore this thesis in the context of British Higher Education through a problematization of widening participation policies, and how they have been…
Gas-analytic measurement complexes of Baikal atmospheric-limnological observatory
NASA Astrophysics Data System (ADS)
Pestunov, D. A.; Shamrin, A. M.; Shmargunov, V. P.; Panchenko, M. V.
2015-11-01
The paper presents the present-day structure of the stationary and mobile hardware-software gas-analytical complexes of the Baikal Atmospheric-Limnological Observatory (BALO) of the Siberian Branch of the Russian Academy of Sciences (SB RAS), designed to study the processes of gas exchange of carbon-containing gases in the "atmosphere-water" system; the complexes are constantly updated with new measuring and auxiliary instrumentation.
Recent trends in atomic fluorescence spectrometry towards miniaturized instrumentation-A review.
Zou, Zhirong; Deng, Yujia; Hu, Jing; Jiang, Xiaoming; Hou, Xiandeng
2018-08-17
Atomic fluorescence spectrometry (AFS), as one of the common atomic spectrometric techniques with high sensitivity, simple instrumentation, and low acquisition and running cost, has been widely used in various fields for trace elemental analysis, notably the determination of hydride-forming elements by hydride generation atomic fluorescence spectrometry (HG-AFS). In recent years, the soaring demand of field analysis has significantly promoted the miniaturization of analytical atomic spectrometers or at least instrumental components. Various techniques have also been developed to approach the goal of portable/miniaturized AFS instrumentation for field analysis. In this review, potentially portable/miniaturized AFS techniques, primarily involving advanced instrumental components and whole instrumentation with references since 2000, are summarized and discussed. The discussion mainly includes five aspects: radiation source, atomizer, detector, sample introduction, and miniaturized atomic fluorescence spectrometer/system. Copyright © 2018 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Yearty, Kasey L.; Sharp, Joseph T.; Meehan, Emma K.; Wallace, Doyle R.; Jackson, Douglas M.; Morrison, Richard W.
2017-01-01
¹H NMR analysis is an important analytical technique presented in introductory organic chemistry courses. NMR instrument access is limited for undergraduate organic chemistry students due to the size of the instrument, price of NMR solvents, and the maintenance level required for instrument upkeep. The University of Georgia Chemistry…
NASA Technical Reports Server (NTRS)
Newton, R. L.
1999-01-01
The objective of this research was to construct a chemical sensor/instrumentation package that was smaller in weight and volume than conventional instrumentation. This reduction in weight and volume is needed to assist in further reducing the cost of launching payloads into space. To accomplish this, fiber optic sensors, miniaturized spectrometers, and wireless modems were employed. The system was evaluated using iodine as a calibration analyte.
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes assayed on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
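A minimal sketch of the monthly-median check described above, with invented patient results and an assumed allowable-bias specification (the actual specifications are derived from biological variation and differ per analyte):

```python
import statistics

# Hypothetical daily patient results for one analyte, grouped by month.
monthly_results = {
    "2015-01": [4.9, 5.1, 5.0, 5.2, 4.8, 5.0],
    "2015-02": [5.0, 5.1, 4.9, 5.0, 5.2, 5.1],
    "2015-03": [5.4, 5.6, 5.5, 5.3, 5.6, 5.5],  # drift introduced here
}

baseline_median = 5.0       # assumed long-term target median for this analyte
allowable_bias_pct = 5.0    # assumed desirable specification for bias

flagged = []
for month, values in monthly_results.items():
    med = statistics.median(values)
    bias_pct = 100.0 * (med - baseline_median) / baseline_median
    if abs(bias_pct) > allowable_bias_pct:
        flagged.append(month)

# `flagged` lists the months whose patient median drifted beyond the limit.
```

Using medians rather than means makes the check robust to the occasional extreme pathological result, which is part of why patient medians work as a stability monitor.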
Chapter 16 - Predictive Analytics for Comprehensive Energy Systems State Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yingchen; Yang, Rui; Hodge, Brian S
Energy sustainability is a subject of concern to many nations in the modern world. It is critical for electric power systems to diversify energy supply to include systems with different physical characteristics, such as wind energy, solar energy, electrochemical energy storage, thermal storage, bio-energy systems, geothermal, and ocean energy. Each system has its own range of control variables and targets. To be able to operate such a complex energy system, big-data analytics become critical to achieve the goal of predicting energy supplies and consumption patterns, assessing system operation conditions, and estimating system states - all providing situational awareness to power system operators. This chapter presents data analytics and machine learning-based approaches to enable predictive situational awareness of the power systems.
Status of the Neutron Imaging and Diffraction Instrument IMAT
NASA Astrophysics Data System (ADS)
Kockelmann, Winfried; Burca, Genoveva; Kelleher, Joe F.; Kabra, Saurabh; Zhang, Shu-Yan; Rhodes, Nigel J.; Schooneveld, Erik M.; Sykora, Jeff; Pooley, Daniel E.; Nightingale, Jim B.; Aliotta, Francesco; Ponterio, Rosa C.; Salvato, Gabriele; Tresoldi, Dario; Vasi, Cirino; McPhate, Jason B.; Tremsin, Anton S.
A cold neutron imaging and diffraction instrument, IMAT, is currently being constructed at the ISIS second target station. IMAT will capitalize on time-of-flight transmission and diffraction techniques available at a pulsed neutron source. Analytical techniques will include neutron radiography, neutron tomography, energy-selective neutron imaging, and spatially resolved diffraction scans for residual strain and texture determination. Commissioning of the instrument will start in 2015, with time-resolving imaging detectors and two diffraction detector prototype modules. IMAT will be operated as a user facility for material science applications and will be open for developments of time-of-flight imaging methods.
Andrei Andreevich Bolibrukh's works on the analytic theory of differential equations
NASA Astrophysics Data System (ADS)
Anosov, Dmitry V.; Leksin, Vladimir P.
2011-02-01
This paper contains an account of A.A. Bolibrukh's results obtained in the new directions of research that arose in the analytic theory of differential equations as a consequence of his sensational counterexample to the Riemann-Hilbert problem. A survey of results of his students in developing topics first considered by Bolibrukh is also presented. The main focus is on the role of the reducibility/irreducibility of systems of linear differential equations and their monodromy representations. A brief synopsis of results on the multidimensional Riemann-Hilbert problem and on isomonodromic deformations of Fuchsian systems is presented, and the main methods in the modern analytic theory of differential equations are sketched. Bibliography: 69 titles.
The Modern Design of Experiments for Configuration Aerodynamics: A Case Study
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2006-01-01
The impact of slowly varying and persisting covariate effects on the accuracy and precision of experimental results is reviewed, as is the rationale for run-order randomization as a quality assurance tactic employed in the Modern Design of Experiments (MDOE) to defend against such effects. Considerable analytical complexity is introduced by restrictions on randomization in configuration aerodynamics tests because they involve hard-to-change configuration variables that cannot be randomized conveniently. Tradeoffs are examined between quality and productivity associated with varying degrees of rigor in accounting for such randomization restrictions. Certain characteristics of a configuration aerodynamics test are considered that may justify a relaxed accounting for randomization restrictions to achieve a significant reduction in analytical complexity with a comparably negligible adverse impact on the validity of the experimental results.
Electrochemical Detection of Multiple Bioprocess Analytes
NASA Technical Reports Server (NTRS)
Rauh, R. David
2010-01-01
An apparatus that includes a highly miniaturized thin-film electrochemical sensor array has been demonstrated as a prototype of instruments for simultaneous detection of multiple substances of interest (analytes) and measurement of acidity or alkalinity in bioprocess streams. Measurements of pH and of concentrations of nutrients and wastes in cell-culture media, made by use of these instruments, are to be used as feedback for optimizing the growth of cells or the production of desired substances by the cultured cells. The apparatus is designed to utilize samples of minimal volume so as to minimize any perturbation of monitored processes. The apparatus can function in a potentiometric mode (for measuring pH), an amperometric mode (detecting analytes via oxidation/reduction reactions), or both. The sensor array is planar and includes multiple thin-film microelectrodes covered with hydrous iridium oxide. The oxide layer on each electrode serves as both a protective and electrochemical transducing layer. In its transducing role, the oxide provides electrical conductivity for amperometric measurement or pH response for potentiometric measurement. The oxide on an electrode can also serve as a matrix for one or more enzymes that render the electrode sensitive to a specific analyte. In addition to transducing electrodes, the array includes electrodes for potential control. The array can be fabricated by techniques familiar to the microelectronics industry. The sensor array is housed in a thin-film liquid-flow cell that has a total volume of about 100 mL. The flow cell is connected to a computer-controlled subsystem that periodically draws samples from the bioprocess stream to be monitored. Before entering the cell, each 100-mL sample is subjected to tangential-flow filtration to remove particles. In the present version of the apparatus, the electrodes are operated under control by a potentiostat and are used to simultaneously measure the pH and the concentration of glucose
Liberals think more analytically (more "WEIRD") than conservatives.
Talhelm, Thomas; Haidt, Jonathan; Oishi, Shigehiro; Zhang, Xuemin; Miao, Felicity F; Chen, Shimin
2015-02-01
Henrich, Heine, and Norenzayan summarized cultural differences in psychology and argued that people from one particular culture are outliers: people from societies that are Western, educated, industrialized, rich, and democratic (WEIRD). This study shows that liberals think WEIRDer than conservatives. In five studies with more than 5,000 participants, we found that liberals think more analytically (an element of WEIRD thought) than moderates and conservatives. Study 3 replicates this finding in the very different political culture of China, although it held only for people in more modernized urban centers. These results suggest that liberals and conservatives in the same country think as if they were from different cultures. Studies 4 and 5 show that briefly training people to think analytically causes them to form more liberal opinions, whereas training them to think holistically causes shifts to more conservative opinions. © 2014 by the Society for Personality and Social Psychology, Inc.
Comparison of removed dentin thickness with hand and rotary instruments
Shahriari, Shahriar; Abedi, Hasan; Hashemi, Mahdi; Jalalzadeh, Seyed Mohsen
2009-01-01
INTRODUCTION: The aim of this study was to evaluate the amount of dentine removed after canal preparation using stainless steel (SS) hand instruments or rotary ProFile instruments. MATERIALS AND METHODS: Thirty-six extracted human teeth with root canal curvatures less than 30° were embedded in clear polyester resin. The roots were cut horizontally at apical 2, 4 and 7 mm. Dentin thickness was measured at each section and the sections were accurately reassembled using a muffle. Root canals were randomly prepared by SS hand instruments or rotary ProFile instruments. Root sections were again separated, and the remaining dentin thickness was measured. Mann-Whitney U and t tests were performed for analytic comparison of the results. RESULTS: The thickness of removed dentin was significantly different between the two methods (P<0.05). Significantly greater amounts of dentin were removed mesially in all sections in the hand instrumentation group (P<0.001). CONCLUSION: ProFile rotary instrumentation prepares root canals with greater conservation of tooth structure. PMID:23940489
Curiosity: organic molecules on Mars? (Italian Title: Curiosity: molecole organiche su Marte?)
NASA Astrophysics Data System (ADS)
Guaita, C.
2015-05-01
First analytical results from the SAM instrument onboard Curiosity are consistent with the presence, on Mars, of organic molecules possibly linked to bacterial metabolism. These data also call for a modern re-examination of the debated results obtained by the Viking landers.
Modern air protection technologies at thermal power plants (review)
NASA Astrophysics Data System (ADS)
Roslyakov, P. V.
2016-07-01
Ecologically safe technologies for fuel combustion in steam boiler furnaces and effective methods for treating flue gases at modern thermal power plants are analyzed. The administrative and legal measures that stimulate the introduction of air-protection technologies at TPPs are considered. It is shown that both primary intrafurnace measures for nitrogen oxide suppression and secondary flue-gas treatment methods are needed to meet modern ecological standards. Examples of environmentally safe methods for flame combustion of gas-oil and solid fuels in boiler furnaces are provided. Effective methods and units for removing nitrogen and sulfur oxides and fly ash from flue gases are considered. It is demonstrated that air-protection measures should be accompanied by the introduction of systems for continuous instrumental monitoring of the composition of combustion products in the gas path of boiler units and of atmospheric emissions.
Schultze, A E; Irizarry, A R
2017-02-01
Veterinary clinical pathologists are well positioned via education and training to assist in investigations of unexpected results or increased variation in clinical pathology data. Errors in testing and unexpected variability in clinical pathology data are sometimes referred to as "laboratory errors." These alterations may occur in the preanalytical, analytical, or postanalytical phases of studies. Most of the errors or variability in clinical pathology data occur in the preanalytical or postanalytical phases. True analytical errors occur within the laboratory and are usually the result of operator or instrument error. Analytical errors are often ≤10% of all errors in diagnostic testing, and the frequency of these types of errors has decreased in the last decade. Analytical errors and increased data variability may result from instrument malfunctions, inability to follow proper procedures, undetected failures in quality control, sample misidentification, and/or test interference. This article (1) illustrates several different types of analytical errors and situations within laboratories that may result in increased variability in data, (2) provides recommendations regarding prevention of testing errors and techniques to control variation, and (3) provides a list of references that describe and advise how to deal with increased data variability.
The second modern condition? Compressed modernity as internalized reflexive cosmopolitization.
Kyung-Sup, Chang
2010-09-01
Compressed modernity is a civilizational condition in which economic, political, social and/or cultural changes occur in an extremely condensed manner with respect to both time and space, and in which the dynamic coexistence of mutually disparate historical and social elements leads to the construction and reconstruction of a highly complex and fluid social system. During what Beck considers the second modern stage of humanity, every society reflexively internalizes cosmopolitanized risks. Societies (or their civilizational conditions) are thereby being internalized into each other, making compressed modernity a universal feature of contemporary societies. This paper theoretically discusses compressed modernity as nationally ramified from reflexive cosmopolitization, and then comparatively illustrates varying instances of compressed modernity in advanced capitalist societies, un(der)developed capitalist societies, and system transition societies. In lieu of a conclusion, I point out the declining status of national societies as the dominant unit of (compressed) modernity and the interactive acceleration of compressed modernity among different levels of human life ranging from individuals to the global community. © London School of Economics and Political Science 2010.
Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo
2015-01-15
Microbial fuel cells (MFCs) were rediscovered twenty years ago and are now a very active research area. The reasons behind this renewed activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supplies for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand (BOD) sensing), only lately has a myriad of new uses of this technology been presented by research groups around the world, combining both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the analytical sciences. Part I gave a general introduction to biologically based analytical methods, including bioassays, biosensors, and MFC designs and operating principles, as well as perhaps the main and earliest proposed application, the use as a BOD sensor. In Part II, other proposed uses are presented and discussed. Like other microbially based analytical systems, MFCs are well suited to measuring and integrating complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity and microbial metabolism and, of special interest to space exploration, life sensors. Also presented are some methods with higher specificity, proposed to detect a single analyte. Different possibilities for increasing selectivity and sensitivity by using molecular biology or other modern techniques are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Olsson, Viktoria; Håkansson, Andreas
2018-01-01
Varying processing conditions can strongly affect the microstructure of mayonnaise, opening up new applications for the creation of products tailored to meet different consumer preferences. The aim of the study was to evaluate the effect of emulsification intensity on sensory and instrumental characteristics of full-fat mayonnaise. Mayonnaise, based on a standard recipe, was processed at low and high emulsification intensities, with selected sensory and instrumental properties then evaluated using an analytical panel and a back extrusion method. The evaluation also included a commercial reference mayonnaise. The overall effects of a higher emulsification intensity on the sensory and instrumental characteristics of full-fat mayonnaise were limited. However, texture was affected, with a more intense emulsification resulting in a firmer mayonnaise according to both back extrusion data and the analytical sensory panel. Appearance, taste and flavor attributes were not affected by processing. PMID:29342128
Feasibility of modern airships - Preliminary assessment
NASA Technical Reports Server (NTRS)
Ardema, M. D.
1977-01-01
Attention is given to the NASA program, Feasibility Study of Modern Airships, initiated to investigate potential research and technology programs associated with airship development. A historical survey of the program is presented, including the development of past airship concepts, aerodynamical and design improvements, structure and material concepts, and research in controls, avionics, instrumentation, flight operations, and ground handling. A mission analysis was carried out which considered passenger and cargo transportation, heavy-lift, short-haul applications, surveillance missions, and the transportation of natural gas. A vehicle parametric analysis examined the entire range of airship concepts, discussing both conventional airships and hybrids. Various design options were evaluated, such as choice of structural materials, use of boundary-layer control, and choice of lifting gas.
Applications of everyday IT and communications devices in modern analytical chemistry: A review.
Grudpan, Kate; Kolev, Spas D; Lapanantnopakhun, Somchai; McKelvie, Ian D; Wongwilai, Wasin
2015-05-01
This paper reviews the development and recent use of everyday communications and IT equipment (mobile phones, digital cameras, scanners, webcams, etc.) as detection devices for colorimetric chemistries. Such devices can readily be applied to visible detection using reaction formats such as microfluidic paper-based analytical devices (µPADs), indicator papers, and well-plate reaction vessels. Their use is highly advantageous with respect to cost, simplicity, and portability, and offers many opportunities in the areas of point-of-care diagnosis and at-site monitoring of environmental, agricultural, food, and beverage parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
Proposed techniques for launching instrumented balloons into tornadoes
NASA Technical Reports Server (NTRS)
Grant, F. C.
1971-01-01
A method is proposed for introducing instrumented balloons into tornadoes by means of the radial pressure gradient, which supplies a buoyancy force that drives them toward the center. Analytical expressions, verified by computer calculations, are presented which show the possibility of introducing instrumented balloons into tornadoes at or below the cloud base. The times required to reach the center are small enough that a large fraction of tornadoes are suitable for the technique. An experimental procedure is outlined in which a research airplane places an instrumented, self-inflating balloon on the track ahead of the tornado. The uninflated balloon waits until the tornado closes to, typically, 750 meters; it then quickly inflates and spirals up and into the core, taking roughly 3 minutes. Since the drive to the center is produced automatically by the radial pressure gradient, a proper launch radius is the only guidance requirement.
Delre, Antonio; Mønster, Jacob; Samuelsson, Jerker; Fredenslund, Anders M; Scheutz, Charlotte
2018-09-01
The tracer gas dispersion method (TDM) is a remote sensing method for quantifying fugitive emissions, relying on the controlled release of a tracer gas at the source combined with concentration measurements of the tracer and target gas plumes. The TDM was tested at a wastewater treatment plant for plant-integrated methane emission quantification, using four analytical instruments and four different tracer gases simultaneously. Measurements performed using a combination of an analytical instrument and a tracer gas with a high ratio between the tracer gas release rate and instrument precision (a high release-precision ratio) resulted in well-defined plumes with a high signal-to-noise ratio and a high methane-to-tracer gas correlation factor. Measured methane emission rates differed by up to 18% from the mean value when measurements were performed using seven different instrument and tracer gas combinations. Analytical instruments with a high detection frequency and good precision proved the most suitable for successful TDM application. The application of an instrument with poor precision could only partly be overcome by applying a higher tracer gas release rate. A sideward misplacement of the tracer gas release point of about 250 m resulted in an emission rate comparable to those obtained using a tracer gas correctly simulating the methane emission. Conversely, an upwind misplacement of about 150 m resulted in an emission rate overestimation of almost 50%, showing the importance of proper emission source simulation when applying the TDM. Copyright © 2018 Elsevier B.V. All rights reserved.
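The plant-integrated emission rate in the TDM follows from a simple mass balance: the target-gas emission equals the known tracer release rate scaled by the plume-integrated target-to-tracer mole ratio and the molar mass ratio. A minimal sketch of that calculation, with all numbers hypothetical and acetylene chosen here only as an illustrative tracer:

```python
M_CH4 = 16.04   # molar mass of methane, g/mol
M_C2H2 = 26.04  # molar mass of acetylene (illustrative tracer), g/mol

def tdm_emission_rate(q_tracer_kg_h, mole_ratio, m_target, m_tracer):
    """Target-gas emission rate (kg/h) from the controlled tracer
    release rate (kg/h) and the plume-integrated target-to-tracer
    mole ratio, converted to mass via the molar mass ratio."""
    return q_tracer_kg_h * mole_ratio * (m_target / m_tracer)

# Hypothetical survey: 1.5 kg/h of acetylene released at the plant;
# downwind plume transects give an integrated CH4-to-C2H2 mole
# ratio of 25, yielding a plant-integrated methane emission rate.
print(tdm_emission_rate(1.5, 25.0, M_CH4, M_C2H2))  # ~23.1 kg/h
```

The abstract's point about release-precision ratio maps directly onto this: a noisier instrument blurs the measured mole ratio, which a larger `q_tracer_kg_h` can only partly compensate.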
Automation, consolidation, and integration in autoimmune diagnostics.
Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola
2015-08-01
Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.
Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.
Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok
2015-01-01
Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases and discussed strategies pertinent to our settings to minimize their occurrence. We described the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures including quality assessment programs for staff involved in pre-analytical processes should be intensified.
The 1985 pittsburgh conference: a special instrumentation report.
1985-03-29
For the first time in its 36 years of operation, the Pittsburgh Conference and Exposition on Analytical Chemistry and Applied Spectroscopy had a sharp drop in attendance, down 16 percent to 20,731. That loss was attributed to the fact that the meeting was held in New Orleans for the first time, and most of the lost attendees were students and young professionals who had previously come for only 1 day. The number of exhibitors and the number of booths, however, were both up about 15 percent, to 730 and 1856, respectively. A large proportion of that increase was contributed by foreign companies exhibiting for the first time, but there were also some well-known names, such as General Electric and Xerox, making first forays into analytical chemistry. There was also a sharp increase in the number and type of instruments displayed. "The key skill now in analytical chemistry," says Perkin-Elmer president Horace McDonell, Jr., "may be simply finding the right tool to obtain the answers you need." The predominant theme of the show, as it has been for the past few years, was automation of both laboratories and instruments. That trend is having major effects in chemical laboratories, but it is also affecting the instrument companies themselves. At large companies such as Varian, Beckman, and Perkin-Elmer, as much as 50 percent of the research and development budget is now going toward development of software, a much higher percentage than it was even 5 years ago. Another trend in automation also seemed clear at the show. As recently as 2 or 3 years ago, much of the available software for chemistry was designed for Apple and similar computers. Now, the laboratory standard is the IBM PC. As a representative of another company that manufactures computers noted with only slight exaggeration, "There's probably not a booth on the floor that doesn't have one."
Grace Sun; Rebecca Ibach; Marek Gnatowski; Jessie Glaeser; Mathew Leung; John Haight
2014-01-01
Various instrumental techniques were used to study the fungal decay process in wood plastic composite (WPC) boards. Commercial boards exposed near Hilo, Hawaii (HI) for eight years in both sun and shadow locations were inspected and tested periodically. After eight years of exposure, both boards were evaluated using magnetic resonance imaging (MRI), while a selected...
NASA Astrophysics Data System (ADS)
Kus, Orcun; Kocaman, Ibrahim; Topcu, Yucel; Karaca, Volkan
2012-05-01
The problem of defending a specific airspace is among the main issues a military commander must solve. Proper protection of one's own airspace is crucial for mission success on the battlefield. The military doctrines of most of the world's armed forces involve two main options for defending airspace. One is utilizing formations of fighter aircraft, which is a flexible choice. The second is deploying modern SAM (Surface-to-Air Missile) systems, which is more expensive. On the other hand, decision makers must cope with miscellaneous restrictions such as budgeting problems. This study defines the air defense concept according to modern air warfare doctrine. It considers an air defense scenario over an arbitrary airspace and compares the performance and cost-effectiveness of employing fighter aircraft and SAM systems. It also presents SWOT (Strengths, Weaknesses, Opportunities, Threats) analyses of air defense by fighter aircraft and by modern SAMs and tries to point out which option is better. We conclude that deploying SAMs has important advantages over using fighter aircraft in terms of interception capacity within a given time period and is cost-effective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Keppens, R.; Xia, C.
2016-09-10
We report our implementation of the magneto-frictional method in the Message Passing Interface Adaptive Mesh Refinement Versatile Advection Code (MPI-AMRVAC). The method aims at applications where local adaptive mesh refinement (AMR) is essential to make follow-up dynamical modeling affordable. We quantify its performance in both domain-decomposed uniform grids and block-adaptive AMR computations, using all frequently employed force-free, divergence-free, and other vector comparison metrics. As test cases, we revisit the semi-analytic solution of Low and Lou in both Cartesian and spherical geometries, along with the topologically challenging Titov–Démoulin model. We compare different combinations of spatial and temporal discretizations, and find that the fourth-order central difference with a local Lax–Friedrichs dissipation term in a single-step marching scheme is an optimal combination. The initial condition is provided by the potential field, which is the potential field source surface model in spherical geometry. Various boundary conditions are adopted, ranging from fully prescribed cases where all boundaries are assigned with the semi-analytic models, to solar-like cases where only the magnetic field at the bottom is known. Our results demonstrate that all the metrics compare favorably to previous works in both Cartesian and spherical coordinates. Cases with several AMR levels perform in accordance with their effective resolutions. The magneto-frictional method in MPI-AMRVAC allows us to model a region of interest with high spatial resolution and large field of view simultaneously, as required by observation-constrained extrapolations using vector data provided with modern instruments. The applications of the magneto-frictional method to observations are shown in an accompanying paper.
The Use and Abuse of Limits of Detection in Environmental Analytical Chemistry
Brown, Richard J. C.
2008-01-01
The limit of detection (LoD) serves as an important method performance measure that is useful for the comparison of measurement techniques and the assessment of likely signal-to-noise performance, especially in environmental analytical chemistry. However, the LoD is only truly related to the precision characteristics of the analytical instrument employed for the analysis and the content of analyte in the blank sample. This article discusses how other criteria, such as sampling volume, can serve to distort the quoted LoD artificially and make comparison between various analytical methods inequitable. In order to compare LoDs between methods properly, it is necessary to state clearly all of the input parameters relating to the measurements that have been used in the calculation of the LoD. Additionally, the article discusses how using LoDs in contexts other than the comparison of analytical methods, in particular when reporting analytical results, may be confusing, less informative than quoting the actual result with an accompanying statement of uncertainty, and may act to bias descriptive statistics. PMID:18690384
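The dependence of the quoted LoD on blank precision, and its distortion by sampling volume, can be sketched with the common blank-based definition (mean of the blank plus three times its standard deviation). All readings and the calibration slope below are hypothetical:

```python
from statistics import mean, stdev

# Hypothetical replicate instrument responses for a blank sample.
blank_signals = [0.41, 0.43, 0.40, 0.44, 0.42, 0.41, 0.43]

def limit_of_detection(blanks, k=3.0):
    """LoD in signal units: blank mean plus k times the blank SD,
    i.e. a quantity set by instrument precision and blank content."""
    return mean(blanks) + k * stdev(blanks)

lod_signal = limit_of_detection(blank_signals)

# The article's caution in action: doubling the sampled volume doubles
# the effective sensitivity and halves the concentration LoD with no
# change whatsoever to the instrument's precision.
sensitivity = 0.05  # signal per (ng/L), hypothetical calibration slope
for volume_factor in (1.0, 2.0):
    conc_lod = (lod_signal - mean(blank_signals)) / (sensitivity * volume_factor)
    print(f"volume x{volume_factor}: LoD = {conc_lod:.2f} ng/L")
```

This is why the article insists that all input parameters (here `k`, the blank data, and the sampling volume) be stated before LoDs from different methods are compared.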
Dubé, Laurette; Labban, Alice; Moubarac, Jean-Claude; Heslop, Gabriela; Ma, Yu; Paquet, Catherine
2014-12-01
Building greater reciprocity between traditional and modern food systems and better convergence of human and economic development outcomes may enable the production and consumption of accessible, affordable, and appealing nutritious food for all. Information being key to such transformations, this roadmap paper offers a strategy that capitalizes on Big Data and advanced analytics, setting the foundation for an integrative intersectoral knowledge platform to better inform and monitor behavioral change and ecosystem transformation. Building upon the four P's of marketing (product, price, promotion, placement), we examine digital commercial marketing data through the lenses of the four A's of food security (availability, accessibility, affordability, appeal) using advanced consumer choice analytics for archetypal traditional (fresh fruits and vegetables) and modern (soft drinks) product categories. We demonstrate that business practices typically associated with the latter also have an important, if not more important, impact on purchases of the former category. Implications and limitations of the approach are discussed. © 2014 New York Academy of Sciences.
Instrumental images: the visual rhetoric of self-presentation in Hevelius's Machina Coelestis.
Vertesi, Janet
2010-06-01
This article places the famous images of Johannes Hevelius's instruments in his Machina Coelestis (1673) in the context of Hevelius's contested cometary observations and his debate with Hooke over telescopic sights. Seen thus, the images promote a crafted vision of Hevelius's astronomical practice and skills, constituting a careful self-presentation to his distant professional network and a claim as to which instrumental techniques guarantee accurate observations. Reviewing the reception of the images, the article explores how visual rhetoric may be invoked and challenged in the context of controversy, and suggests renewed analytical attention to the role of laboratory imagery in instrumental cultures in the history of science.
Goesmann, Fred; Brinckerhoff, William B.; Raulin, François; Danell, Ryan M.; Getty, Stephanie A.; Siljeström, Sandra; Mißbach, Helge; Steininger, Harald; Arevalo, Ricardo D.; Buch, Arnaud; Freissinet, Caroline; Grubisic, Andrej; Meierhenrich, Uwe J.; Pinnick, Veronica T.; Stalport, Fabien; Szopa, Cyril; Vago, Jorge L.; Lindner, Robert; Schulte, Mitchell D.; Brucato, John Robert; Glavin, Daniel P.; Grand, Noel; Li, Xiang; van Amerom, Friso H. W.
2017-01-01
The Mars Organic Molecule Analyzer (MOMA) instrument onboard the ESA/Roscosmos ExoMars rover (to launch in July, 2020) will analyze volatile and refractory organic compounds in martian surface and subsurface sediments. In this study, we describe the design, current status of development, and analytical capabilities of the instrument. Data acquired on preliminary MOMA flight-like hardware and experimental setups are also presented, illustrating their contribution to the overall science return of the mission. Key Words: Mars, mass spectrometry, life detection, planetary instrumentation. Astrobiology 17, 655–685.
Analytical balance-based Faraday magnetometer
NASA Astrophysics Data System (ADS)
Riminucci, Alberto; Uhlarz, Marc; De Santis, Roberto; Herrmannsdörfer, Thomas
2017-03-01
We introduce a Faraday magnetometer based on an analytical balance in which we were able to apply magnetic fields up to 0.14 T. We calibrated it with a 1 mm Ni sphere previously characterized in a superconducting quantum interference device (SQUID) magnetometer. The proposed magnetometer reached a theoretical sensitivity of 3 × 10⁻⁸ A m². We demonstrated its operation on magnetic composite scaffolds made of poly(ɛ-caprolactone)/iron-doped hydroxyapatite. To confirm the validity of the method, we measured the same scaffold properties in a SQUID magnetometer. The agreement between the two measurements was within 5% at 0.127 T and 12% at 24 mT. With the addition, at small cost, of a permanent magnet and computer-controlled linear translators, we were thus able to assemble a Faraday magnetometer based on an analytical balance, a virtually ubiquitous instrument. This will make simple but effective magnetometry easily accessible to most laboratories, in particular life-science laboratories, which are increasingly interested in magnetic materials.
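The working principle of a balance-based Faraday magnetometer reduces to one conversion: the field gradient exerts a force F = m·(dB/dz) on the sample's moment m, which the balance registers as an apparent mass change. A minimal sketch, with the mass change and gradient values purely hypothetical (not taken from the paper):

```python
g = 9.81  # gravitational acceleration, m/s^2

def magnetic_moment(delta_mass_kg, field_gradient_T_per_m):
    """Magnetic moment (A m^2) from the apparent mass change read on
    the analytical balance and the known field gradient at the sample:
    m = F / (dB/dz), with F = delta_mass * g."""
    force = delta_mass_kg * g
    return force / field_gradient_T_per_m

# A hypothetical 10 ug apparent mass change in a 10 T/m gradient
# corresponds to a moment of about 1e-8 A m^2, the same order as the
# sensitivity quoted in the abstract.
print(magnetic_moment(1e-8, 10.0))
```

A microgram-resolution balance in a tesla-per-meter-scale gradient thus plausibly resolves moments around 10⁻⁸ A m², which is consistent with the quoted 3 × 10⁻⁸ A m² figure.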
NASA Technical Reports Server (NTRS)
Blacic, James D.
1992-01-01
A Teleoperated Lunar Explorer, or TOPLEX, consisting of a lunar lander payload in which a small, instrument-carrying lunar surface rover is robotically landed and teleoperated from Earth to perform extended lunar geoscience and resource evaluation traverses is proposed. The rover vehicle would mass about 100 kg and carry approximately 100 kg of analytic instruments. Four instruments are envisioned: (1) a Laser-Induced Breakdown Spectrometer (LIBS) for geochemical analysis at ranges up to 100 m, capable of operating in three different modes; (2) a combined x-ray fluorescence and x-ray diffraction (XRF/XRD) instrument for elemental and mineralogic analysis of acquired samples; (3) a mass spectrometer system for stepwise heating analysis of gases released from acquired samples; and (4) a geophysical instrument package for subsurface mapping of structures such as lava tubes.
MEMS-Based Micro Instruments for In-Situ Planetary Exploration
NASA Technical Reports Server (NTRS)
George, Thomas; Urgiles, Eduardo R; Toda, Risaku; Wilcox, Jaroslava Z.; Douglas, Susanne; Lee, C-S.; Son, Kyung-Ah; Miller, D.; Myung, N.; Madsen, L.;
2005-01-01
NASA's planetary exploration strategy is primarily targeted to the detection of extant or extinct signs of life. Thus, the agency is moving towards more in-situ landed missions as evidenced by the recent, successful demonstration of twin Mars Exploration Rovers. Also, future robotic exploration platforms are expected to evolve towards sophisticated analytical laboratories composed of multi-instrument suites. MEMS technology is very attractive for in-situ planetary exploration because of the promise of a diverse and capable set of advanced, low mass and low-power devices and instruments. At JPL, we are exploiting this diversity of MEMS for the development of a new class of miniaturized instruments for planetary exploration. In particular, two examples of this approach are the development of an Electron Luminescence X-ray Spectrometer (ELXS), and a Force-Detected Nuclear Magnetic Resonance (FDNMR) Spectrometer.
Isolation by ion-exchange methods. In Sarker S.D. (ed) Natural Products Isolation, 3rd edition
USDA-ARS's Scientific Manuscript database
The primary goal of many natural products chemists is to extract, isolate, and characterize specific analytes from complex plant, animal, microbial, and food matrices. To achieve this goal, they rely considerably on highly sophisticated and highly hyphenated modern instrumentation. Yet, the vast maj...
Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System
NASA Astrophysics Data System (ADS)
Lee, Chang Jae; Yun, Jae Hee
2017-06-01
Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of the physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the Advanced Power Reactor 1400 (APR1400) and Optimized Power Reactor 1000 (OPR1000) nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy the corresponding ART requirements.
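The core of such a response-time evaluation is a budget check: the delays along the protection channel, from sensor to actuation, must sum to no more than the ART allocated by the safety analysis. A minimal sketch with entirely hypothetical delay allocations (the paper does not publish these numbers):

```python
# Hypothetical ART for one trip function, from the safety analysis.
art_limit_s = 1.2

# Hypothetical delay allocations along the protection channel.
channel_delays_s = {
    "sensor/transmitter": 0.35,
    "signal conditioning": 0.10,
    "trip logic": 0.25,
    "actuation (breaker/valve)": 0.40,
}

total = sum(channel_delays_s.values())
margin = art_limit_s - total
print(f"total response time {total:.2f} s, margin {margin:.2f} s")
assert total <= art_limit_s, "channel violates the analytical response time"
```

The integrated methodology described above amounts to carrying this check through every stage: analysis-stage allocations, design-stage estimates, and finally measured response times from tests, each of which must stay within the same ART.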
Walking towards Instrumental Appropriation of Mobile Devices. A Comparison of Studies
ERIC Educational Resources Information Center
Hernandez Serrano, Maria José; Yang, Lingling
2013-01-01
The study of instrumental appropriation is considered a relevant, outstanding, and productive perspective in the arena of mobile ICT and learning. This paper seeks to consolidate this perspective at a theoretical and analytical level. Regarding the theoretical level, two characteristics of mobile devices--flexibility and mobility--are…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...
Diana, Esther
2008-01-01
Around the second half of the nineteenth century, the collection of physics-mathematical instruments that Vincenzo Viviani (1622-1703) had bequeathed to the Santa Maria Nuova Hospital of Florence stirred new interest. The process of modernising the hospital was indeed to lead to the progressive alienation of the institution's rich historical patrimony, including the scientific collections. In tracing back the negotiations that led to the sale of the Viviani collection, archive documents have also brought to light the collection inventory, which is now proposed anew to help recount the history of how scientific instruments became museum collectibles in Florence.
DOT National Transportation Integrated Search
1974-01-01
This report contains the results of an experimental and analytical evaluation of instruments and techniques designed to prevent an intoxicated driver from operating his automobile. The prototype 'Alcohol Safety Interlock Systems' tested were develope...
Pure-rotational spectrometry: a vintage analytical method applied to modern breath analysis.
Hrubesh, Lawrence W; Droege, Michael W
2013-09-01
Pure-rotational spectrometry (PRS) is an established method, typically used to study structures and properties of polar gas-phase molecules, including isotopic and isomeric varieties. PRS has also been used as an analytical tool where it is particularly well suited for detecting or monitoring low-molecular-weight species that are found in exhaled breath. PRS is principally notable for its ultra-high spectral resolution which leads to exceptional specificity to identify molecular compounds in complex mixtures. Recent developments using carbon aerogel for pre-concentrating polar molecules from air samples have extended the sensitivity of PRS into the part-per-billion range. In this paper we describe the principles of PRS and show how it may be configured in several different modes for breath analysis. We discuss the pre-concentration concept and demonstrate its use with the PRS analyzer for alcohols and ammonia sampled directly from the breath.
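The "ultra-high spectral resolution" specificity of PRS comes from the fact that, for a (near-)rigid linear rotor, absorption lines fall at frequencies 2B(J+1): a molecule's rotational constant B fixes a unique comb of lines. A sketch using carbon monoxide (itself a low-molecular-weight breath analyte) with its approximate rotational constant; the rigid-rotor model here neglects centrifugal distortion:

```python
B_CO_GHZ = 57.636  # approximate rotational constant of CO, GHz

def line_frequencies(b_ghz, j_max):
    """Frequencies (GHz) of the J -> J+1 pure-rotational transitions
    of a rigid linear rotor: nu = 2 * B * (J + 1)."""
    return [2 * b_ghz * (j + 1) for j in range(j_max)]

# First three CO lines: ~115.3, 230.5, 345.8 GHz
print(line_frequencies(B_CO_GHZ, 3))
```

Because no two polar molecules share the same B, matching even a few lines of this comb identifies a compound unambiguously in a complex mixture such as breath.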
Modern U-Pb chronometry of meteorites: advancing to higher time resolution reveals new problems
Amelin, Y.; Connelly, J.; Zartman, R.E.; Chen, J.-H.; Gopel, C.; Neymark, L.A.
2009-01-01
In this paper, we evaluate the factors that influence the accuracy of lead (Pb)-isotopic ages of meteorites, and may possibly be responsible for inconsistencies between Pb-isotopic and extinct nuclide timescales of the early Solar System: instrumental mass fractionation and other possible analytical sources of error, presence of more than one component of non-radiogenic Pb, migration of ancient radiogenic Pb by diffusion and other mechanisms, possible heterogeneity of the isotopic composition of uranium (U), uncertainties in the decay constants of uranium isotopes, possible presence of "freshly synthesized" actinides with short half-life (e.g. ²³⁴U) in the early Solar System, possible initial disequilibrium in the uranium decay chains, and potential fractionation of radiogenic Pb isotopes and U isotopes caused by alpha-recoil and subsequent laboratory treatment. We review the use of ²³²Th/²³⁸U values to assist in making accurate interpretations of the U-Pb ages of meteorite components. We discuss recently published U-Pb dates of calcium-aluminum-rich inclusions (CAIs), and their apparent disagreement with the extinct nuclide dates, in the context of capability and common pitfalls in modern meteorite chronology. Finally, we discuss the requirements of meteorites that are intended to be used as the reference points in building a consistent time scale of the early Solar System, based on the combined use of the U-Pb system and extinct nuclide chronometers.
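The Pb-isotopic ages in question rest on the ²⁰⁷Pb*/²⁰⁶Pb* age equation, in which the radiogenic ratio equals (1/(²³⁸U/²³⁵U)) · (e^(λ₂₃₅t) − 1)/(e^(λ₂₃₈t) − 1), solved for t numerically. A sketch using standard decay constants and the conventional ²³⁸U/²³⁵U = 137.88, whose possible variability is precisely one of the uncertainties the paper discusses; the input ratio is illustrative:

```python
import math

L238 = 1.55125e-10  # decay constant of 238U, 1/yr
L235 = 9.8485e-10   # decay constant of 235U, 1/yr
U_RATIO = 137.88    # conventional present-day 238U/235U

def pb_ratio(t_yr):
    """Radiogenic 207Pb*/206Pb* ratio as a function of age (years)."""
    return (math.expm1(L235 * t_yr) / math.expm1(L238 * t_yr)) / U_RATIO

def pb_pb_age(ratio, lo=1e6, hi=5e9, tol=1e3):
    """Invert pb_ratio(t) = ratio by bisection; the ratio increases
    monotonically with age over this bracket."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if pb_ratio(mid) < ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# An illustrative radiogenic 207Pb/206Pb of ~0.625 corresponds to an
# early Solar System age near 4.57 Gyr.
print(pb_pb_age(0.625) / 1e9)
```

The steep slope of `pb_ratio` at ~4.57 Gyr is what makes Pb-Pb chronometry so precise, and also why the uncertainties in λ₂₃₅, λ₂₃₈, and `U_RATIO` enumerated above matter at the sub-million-year level.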
Evaluation of Analytical Errors in a Clinical Chemistry Laboratory: A 3 Year Experience
Sakyi, AS; Laing, EF; Ephraim, RK; Asibey, OF; Sadique, OK
2015-01-01
Background: Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. Aim: We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases and discussed strategies pertinent to our settings to minimize their occurrence. Materials and Methods: We described the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). Results: A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests reduced significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) along with the years. Conclusion: Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified. PMID:25745569
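The per-phase bookkeeping behind figures like these can be sketched as follows; the counts below are hypothetical placeholders, not the study's data:

```python
# Sketch of per-phase error-rate bookkeeping over a laboratory testing cycle.
# All counts are hypothetical, chosen only to illustrate the calculation.
def error_rates(total_tests, errors_by_phase):
    """Return per-phase and overall error rates as percentages of total tests."""
    rates = {phase: 100 * n / total_tests for phase, n in errors_by_phase.items()}
    rates["overall"] = 100 * sum(errors_by_phase.values()) / total_tests
    return rates

rates = error_rates(
    total_tests=50_000,
    errors_by_phase={"pre-analytical": 1850, "analytical": 50, "post-analytical": 450},
)
for phase, pct in sorted(rates.items()):
    print(f"{phase}: {pct:.2f}%")
```

The overall rate is simply the sum of the phase counts over the same denominator, which makes the phase contributions directly comparable.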
Generic and Automated Data Evaluation in Analytical Measurement.
Adam, Martin; Fleischer, Heidi; Thurow, Kerstin
2017-04-01
In recent years, automation has become more and more important in the field of elemental and structural chemical analysis, both to reduce the high degree of manual operation and processing time and to avoid human errors. A large number of data points is thus generated, which requires fast and automated data evaluation. To handle the preprocessed export data from analytical devices and software from various vendors, a standardized solution that requires no programming knowledge is preferable. In modern laboratories, multiple users will run this software on multiple personal computers with different operating systems (e.g., Windows, Macintosh, Linux), and mobile devices such as smartphones and tablets have gained growing importance. The developed software, Project Analytical Data Evaluation (ADE), is therefore implemented as a web application. To transmit the preevaluated data from the device software to the Project ADE, the exported XML report files are detected and the included data are imported into the entities database using the Data Upload software. Different calculation types of a sample within one measurement series (e.g., method validation) are identified using information tags inside the sample name. The results are presented in tables and diagrams on different information levels (general, detailed for one analyte or sample).
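The import path described, detecting exported XML reports and reading calculation types from information tags inside sample names, can be sketched as follows; the report schema and the CAL/VAL tag convention are assumptions for illustration, not the actual ADE format:

```python
# Hypothetical sketch: read an exported XML report and pull per-sample results,
# deriving the calculation type from an information tag in the sample name.
# The schema and naming convention below are illustrative assumptions only.
import xml.etree.ElementTree as ET

REPORT = """<report>
  <sample name="CAL_Std_1"><analyte name="Pb" value="0.52" unit="mg/L"/></sample>
  <sample name="VAL_Blank_1"><analyte name="Pb" value="0.01" unit="mg/L"/></sample>
</report>"""

def import_report(xml_text):
    rows = []
    for sample in ET.fromstring(xml_text).iter("sample"):
        name = sample.get("name")
        # The leading token of the sample name acts as the information tag.
        calc_type = {"CAL": "calibration", "VAL": "validation"}.get(
            name.split("_")[0], "unknown")
        for analyte in sample.iter("analyte"):
            rows.append({"sample": name, "type": calc_type,
                         "analyte": analyte.get("name"),
                         "value": float(analyte.get("value")),
                         "unit": analyte.get("unit")})
    return rows

rows = import_report(REPORT)
```

Each row is then ready to be stored in a database table and aggregated per calculation type.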
Instrument for Real-Time Digital Nucleic Acid Amplification on Custom Microfluidic Devices
Selck, David A.
2016-01-01
Nucleic acid amplification tests that are coupled with a digital readout enable the absolute quantification of single molecules, even at ultralow concentrations. Digital methods are robust, versatile and compatible with many amplification chemistries including isothermal amplification, making them particularly invaluable to assays that require sensitive detection, such as the quantification of viral load in occult infections or detection of sparse amounts of DNA from forensic samples. A number of microfluidic platforms are being developed for carrying out digital amplification. However, the mechanistic investigation and optimization of digital assays has been limited by the lack of real-time kinetic information about which factors affect the digital efficiency and analytical sensitivity of a reaction. Commercially available instruments that are capable of tracking digital reactions in real-time are restricted to only a small number of device types and sample-preparation strategies. Thus, most researchers who wish to develop, study, or optimize digital assays rely on the rate of the amplification reaction when performed in a bulk experiment, which is now recognized as an unreliable predictor of digital efficiency. To expand our ability to study how digital reactions proceed in real-time and enable us to optimize both the digital efficiency and analytical sensitivity of digital assays, we built a custom large-format digital real-time amplification instrument that can accommodate a wide variety of devices, amplification chemistries and sample-handling conditions. Herein, we validate this instrument, we provide detailed schematics that will enable others to build their own custom instruments, and we include a complete custom software suite to collect and analyze the data retrieved from the instrument. We believe assay optimizations enabled by this instrument will improve the current limits of nucleic acid detection and quantification, improving our fundamental
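Absolute quantification in digital assays rests on counting positive partitions and applying a Poisson correction; a sketch of that standard calculation (general statistics, not the instrument's own software):

```python
# Standard Poisson estimate used in digital amplification assays.
# This is textbook statistics, not code from the instrument described above.
import math

def mean_copies_per_partition(n_positive, n_total):
    """Estimate mean template copies per partition from the positive fraction.

    With randomly distributed templates, the fraction of negative partitions
    is exp(-lambda), so lambda = -ln(1 - p) where p is the positive fraction.
    """
    p = n_positive / n_total
    return -math.log(1.0 - p)

def copies_per_microliter(n_positive, n_total, partition_volume_ul):
    """Convert the per-partition estimate to a concentration."""
    return mean_copies_per_partition(n_positive, n_total) / partition_volume_ul

lam = mean_copies_per_partition(200, 1000)  # 20% positive partitions
```

The correction matters most at high occupancy, where many partitions contain more than one template and raw counting would underestimate the concentration.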
Feedback control of acoustic musical instruments: collocated control using physical analogs.
Berdahl, Edgar; Smith, Julius O; Niemeyer, Günter
2012-01-01
Traditionally, the average professional musician has owned numerous acoustic musical instruments, many of them having distinctive acoustic qualities. However, a modern musician could prefer to have a single musical instrument whose acoustics are programmable by feedback control, where acoustic variables are estimated from sensor measurements in real time and then fed back in order to influence the controlled variables. In this paper, theory is presented that describes stable feedback control of an acoustic musical instrument. The presentation should be accessible to members of the musical acoustics community who may have limited or no experience with feedback control. First, the only control strategy guaranteed to be stable subject to any musical instrument mobility is described: the sensors and actuators must be collocated, and the controller must emulate a physical analog system. Next, the most fundamental feedback controllers and the corresponding physical analog systems are presented. The effects that these controllers have on acoustic musical instruments are described. Finally, practical design challenges are discussed. A proof explains why changing the resonance frequency of a musical resonance requires much more control power than changing the decay time of the resonance. © 2012 Acoustical Society of America.
Digital signal conditioning for flight test instrumentation
NASA Technical Reports Server (NTRS)
Bever, Glenn A.
1991-01-01
An introduction to digital measurement processes on aircraft is provided. Flight test instrumentation systems are rapidly evolving from analog-intensive to digital-intensive systems, including the use of onboard digital computers. The topics include measurements that are digital in origin, as well as sampling, encoding, transmitting, and storing data. Particular emphasis is placed on modern avionic data bus architectures and what to be aware of when extracting data from them. Examples of data extraction techniques are given. Tradeoffs between digital logic families, trends in digital development, and design testing techniques are discussed. An introduction to digital filtering is also covered.
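The report closes with an introduction to digital filtering; a minimal sketch of the kind of first-order low-pass filter such signal conditioning commonly uses (a generic textbook example, not taken from the report):

```python
# First-order low-pass IIR filter, the simplest digital smoothing filter
# used in signal conditioning. Generic example, not from the NASA report.
def lowpass(samples, alpha):
    """y[n] = alpha*x[n] + (1 - alpha)*y[n-1], with 0 < alpha <= 1.

    Smaller alpha gives heavier smoothing (lower cutoff frequency).
    """
    y, out = 0.0, []
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        out.append(y)
    return out

# Step input: the output rises exponentially toward the input level.
smoothed = lowpass([0.0, 1.0, 1.0, 1.0], alpha=0.5)
```

A single multiply-accumulate per sample makes this filter cheap enough for the onboard processors of the era the report describes.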
Instrumental Aid by Japanese Official Development Assistance for Astronomy in Developing Countries
NASA Astrophysics Data System (ADS)
Kitamura, Masatoshi
In order to promote education and research in developing countries, the Japanese Government has been providing developing countries with high-grade equipment under the framework of the Official Development Assistance (ODA) cooperation programme since 1982. Under this successful cooperation programme, 24 astronomical instruments have been donated to 19 developing countries up to the end of the Japanese fiscal year 2003. The instruments donated included university-level reflecting telescopes, as well as modern planetaria used for educational purposes, together with various accessories. This paper describes a continuation of the previous ODA donations (Astronomical Herald 1997) and the subsequent follow-up programmes provided with the assistance of Japan International Cooperation Agency (JICA).
Vallefuoco, L; Sorrentino, R; Spalletti Cernia, D; Colucci, G; Portella, G
2012-12-01
The cobas p 630, a fully automated pre-analytical instrument for primary tube handling recently introduced to complete the Cobas® TaqMan systems portfolio, was evaluated in conjunction with: the COBAS® AmpliPrep/COBAS® TaqMan HBV Test, v2.0, COBAS® AmpliPrep/COBAS® TaqMan HCV Test, v1.0 and COBAS® AmpliPrep/COBAS® TaqMan HIV Test, v2.0. The instrument performance in transferring samples from primary to secondary tubes, its impact in improving COBAS® AmpliPrep/COBAS® TaqMan workflow and hands-on reduction and the risk of possible cross-contamination were assessed. Samples from 42 HBsAg positive, 42 HCV and 42 HIV antibody (Ab) positive patients as well as 21 healthy blood donors were processed with or without automated primary tubes. HIV, HCV and HBsAg positive samples showed a correlation index of 0.999, 0.987 and of 0.994, respectively. To assess for cross-contamination, high titer HBV DNA positive samples, HCV RNA and HIV RNA positive samples were distributed in the cobas p 630 in alternate tube positions, adjacent to negative control samples within the same rack. None of the healthy donor samples showed any reactivity. Based on these results, the cobas p 630 can improve workflow and sample tracing in laboratories performing molecular tests, and reduce turnaround time, errors, and risks. Copyright © 2012 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Todd, Amber; Romine, William L.; Cook Whitt, Katahdin
2017-01-01
We describe the development, validation, and use of the "Learning Progression-Based Assessment of Modern Genetics" (LPA-MG) in a high school biology context. Items were constructed based on a current learning progression framework for genetics (Shea & Duncan, 2013; Todd & Kenyon, 2015). The 34-item instrument, which was tied to…
Dialogue on Modernity and Modern Education in Dispute
ERIC Educational Resources Information Center
Baker, Michael; Peters, Michael A.
2012-01-01
This is a dialogue or conversation between Michael Baker (MB) and Michael A. Peters (MP) on the concept of modernity and its significance for educational theory. The dialogue took place originally as a conversation about a symposium on modernity held at the American Educational Studies Association meeting 2010. It was later developed for…
Development of a canopy Solar-induced chlorophyll fluorescence measurement instrument
NASA Astrophysics Data System (ADS)
Sun, G.; Wang, X.; Niu, Zh; Chen, F.
2014-02-01
A portable solar-induced chlorophyll fluorescence detecting instrument based on the Fraunhofer line principle was designed and tested. The instrument has a valid survey area of 1.3 × 1.3 m when its height is fixed at 1.3 m, and uses sunlight as its light source. It is equipped with two sets of special photoelectric detectors with centre wavelengths at 760 nm and 771 nm respectively and bandwidths of less than 1 nm. Each set of detectors is composed of an upper detector, used for detecting incident sunlight, and a bottom detector, used for detecting light reflected from the crop canopy. The instrument includes a photoelectric detector module, a signal processing module, an A/D conversion module, a data storage and upload module, and a human-machine interface module. The microprocessor calculates the solar-induced fluorescence value from the A/D values obtained from the detectors. The value can be displayed on the instrument's LCD, stored in the instrument's flash memory, and uploaded to a PC through the PC's serial interface. The prototype was tested in a crop field, and the results demonstrate that the instrument measures the solar-induced chlorophyll fluorescence value accurately, with a correlation coefficient of 0.9 against values obtained from an Analytical Spectral Devices FieldSpec Pro spectrometer. The instrument can diagnose plant growth status from the acquired spectral response.
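The two-wavelength design, one channel inside the O2-A absorption band (760 nm) and one just outside it (771 nm), follows the standard Fraunhofer line discrimination (FLD) approach; a sketch of the textbook FLD retrieval, which may differ from the instrument's exact on-board algorithm:

```python
# Textbook Fraunhofer line discrimination (FLD) retrieval of solar-induced
# fluorescence. This is the standard two-band formula, not necessarily the
# exact algorithm implemented in the instrument described above.
def fld_fluorescence(e_in, l_in, e_out, l_out):
    """Solar-induced fluorescence from two-band measurements.

    e_in, e_out : incident sunlight (upper detectors), inside/outside the line
    l_in, l_out : canopy signal (bottom detectors), inside/outside the line

    Inside the absorption line the incident light is strongly attenuated, so
    any excess canopy signal there is attributed to fluorescence:
        F = (e_out * l_in - e_in * l_out) / (e_out - e_in)
    """
    return (e_out * l_in - e_in * l_out) / (e_out - e_in)

# Illustrative readings in arbitrary A/D units (hypothetical values).
f = fld_fluorescence(e_in=30.0, l_in=12.0, e_out=100.0, l_out=36.0)
```

With narrow (sub-nanometre) bands, the reflectance can be assumed equal at the two wavelengths, which is what makes the two-equation retrieval solvable.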
Rodushkin, I; Bergman, T; Douglas, G; Engström, E; Sörlin, D; Baxter, D C
2007-02-05
Different analytical approaches for origin differentiation between vendace and whitefish caviars from brackish- and freshwaters were tested using inductively coupled plasma double focusing sector field mass spectrometry (ICP-SFMS) and multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). These approaches involve identifying differences in elemental concentrations or sample-specific isotopic composition (Sr and Os) variations. Concentrations of 72 elements were determined by ICP-SFMS following microwave-assisted digestion in vendace and whitefish caviar samples from Sweden (from both brackish and freshwater), Finland and USA, as well as in unprocessed vendace roe and salt used in caviar production. This data set allows identification of elements whose contents in caviar can be affected by salt addition as well as by contamination during production and packaging. Long-term method reproducibility was assessed for all analytes based on replicate caviar preparations/analyses and variations in element concentrations in caviar from different harvests were evaluated. The greatest utility for differentiation was demonstrated for elements with varying concentrations between brackish and freshwaters (e.g. As, Br, Sr). Elemental ratios, specifically Sr/Ca, Sr/Mg and Sr/Ba, are especially useful for authentication of vendace caviar processed from brackish water roe, due to the significant differences between caviar from different sources, limited between-harvest variations and relatively high concentrations in samples, allowing precise determination by modern analytical instrumentation. Variations in the 87Sr/86Sr ratio for vendace caviar from different harvests (on the order of 0.05-0.1%) is at least 10-fold less than differences between caviar processed from brackish and freshwater roe. Hence, Sr isotope ratio measurements (either by ICP-SFMS or by MC-ICP-MS) have great potential for origin differentiation. On the contrary, it was impossible to
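The elemental ratios used above for authentication are straightforward to compute once concentrations are in hand; a sketch with hypothetical placeholder concentrations:

```python
# Compute the Sr/Ca, Sr/Mg and Sr/Ba ratios used for origin differentiation.
# The concentrations below are hypothetical placeholders, not data from the study.
def element_ratios(conc):
    """Return Sr-based ratios from a dict of elemental concentrations."""
    return {f"Sr/{x}": conc["Sr"] / conc[x] for x in ("Ca", "Mg", "Ba")}

sample = {"Sr": 4.0, "Ca": 800.0, "Mg": 400.0, "Ba": 0.5}  # e.g. mg/kg, illustrative
ratios = element_ratios(sample)
```

Ratios of this kind cancel dilution effects (such as variable salt addition during processing), which is why they discriminate sources better than raw concentrations.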
The Development of Proofs in Analytical Mathematics for Undergraduate Students
NASA Astrophysics Data System (ADS)
Ali, Maselan; Sufahani, Suliadi; Hasim, Nurnazifa; Saifullah Rusiman, Mohd; Roslan, Rozaini; Mohamad, Mahathir; Khalid, Kamil
2018-04-01
Proofs in analytical mathematics are essential parts of mathematics but are difficult to learn because their underlying concepts are not visible. This research consists of problems involving logic and proofs. In this study, a short overview is provided of how proofs in analytical mathematics are used by university students. From the results obtained, excellent students achieved better scores than average and poor students. The research instruments used in this study consisted of two parts: a test and an interview. In this way, an analysis of students' actual performances could be obtained. The results of this study showed that the less able students have fragile conceptual and cognitive linkages, whereas the more able students use their strong conceptual linkages to produce effective solutions.
Assessing Proposals for New Global Health Treaties: An Analytic Framework.
Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio
2015-08-01
We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.
NASA Astrophysics Data System (ADS)
Marshall, T. C.; Stolzenburg, M.
2006-12-01
One of Benjamin Franklin's most famous experiments was the kite experiment, which showed that thunderstorms are electrically charged. It is less commonly noted that the kite experiment was also one of the first attempts to make an in situ measurement of any storm parameter. Franklin realized the importance of making measurements close to and within storms, a realization shared by later atmospheric scientists. In this presentation we focus on a modern version of Franklin's kite, instrumented balloons, used for in situ measurements of electric field and other storm parameters. In particular, most of our knowledge of the charge structure inside thunderstorms is based on balloon soundings of electric field. Balloon measurements of storm electricity began with the work of Simpson and colleagues in the 1930s and 1940s. The next major instrumentation advances were made by Winn and colleagues in the 1970s and 1980s. Today's instruments are digital versions of the Winn design. We review the main instrument techniques that have allowed balloons to be the worthy successors to kites. We also discuss some of the key advances in our understanding of thunderstorm electrification made with in situ balloon-borne instruments.
ERIC Educational Resources Information Center
Economou, A.; Papargyris, D.; Stratis, J.
2004-01-01
The development of a flow-injection (FI) analyzer for chemiluminescence detection using a low-cost photodiode is presented. The experiment clearly demonstrates, in a single interdisciplinary project, the way in which different aspects of chemical instrumentation fit together to produce a working analytical system.
NASA Astrophysics Data System (ADS)
Parro, Víctor; Fernández-Calvo, Patricia; Rodríguez Manfredi, José A.; Moreno-Paz, Mercedes; Rivas, Luis A.; García-Villadangos, Miriam; Bonaccorsi, Rosalba; González-Pastor, José Eduardo; Prieto-Ballesteros, Olga; Schuerger, Andrew C.; Davidson, Mark; Gómez-Elvira, Javier; Stoker, Carol R.
2008-10-01
A field prototype of an antibody array-based life-detector instrument, Signs Of LIfe Detector (SOLID2), has been tested in a Mars drilling mission simulation called MARTE (Mars Astrobiology Research and Technology Experiment). As one of the analytical instruments on the MARTE robotic drilling rig, SOLID2 performed automatic sample processing and analysis of ground core samples (0.5 g) with protein microarrays that contained 157 different antibodies. Core samples from different depths (down to 5.5 m) were analyzed, and positive reactions were obtained in antibodies raised against the Gram-negative bacterium Leptospirillum ferrooxidans, a species of the genus Acidithiobacillus (both common microorganisms in the Río Tinto area), and extracts from biofilms and other natural samples from the Río Tinto area. These positive reactions were absent when the samples were previously subjected to a high-temperature treatment, which indicates the biological origin and structural dependency of the antibody-antigen reactions. We conclude that an antibody array-based life-detector instrument like SOLID2 can detect complex biological material, and it should be considered as a potential analytical instrument for future planetary missions that search for life.
Generic System for Remote Testing and Calibration of Measuring Instruments: Security Architecture
NASA Astrophysics Data System (ADS)
Jurčević, M.; Hegeduš, H.; Golub, M.
2010-01-01
Testing and calibration of laboratory instruments and reference standards is a routine but resource- and time-consuming activity. Since many modern instruments include communication interfaces, it is possible to create a remote calibration system. This approach addresses a wide range of possible applications and permits driving a number of different devices. On the other hand, the remote calibration process raises a number of security issues with respect to the requirements of standard ISO/IEC 17025, since it is not under the total control of the calibration laboratory personnel who will sign the calibration certificate. The traceability and integrity of the calibration process thus depend directly on the collected measurement data. Reliable and secure remote control and monitoring of instruments is therefore a crucial aspect of an internet-enabled calibration procedure.
Computational methods in the pricing and risk management of modern financial derivatives
NASA Astrophysics Data System (ADS)
Deutsch, Hans-Peter
1999-09-01
In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets, which need very advanced methods to measure and/or model them. The financial instruments invented by market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management, such as variance-covariance, historical simulation, Monte Carlo, and “Greek” ratios, including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing, such as present value, Black-Scholes, binomial trees, and Monte Carlo. In summary, this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: the fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
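Of the valuation methods named, Black-Scholes is the most compact to state; a minimal sketch of the standard European-call formula (textbook form, not specific to this talk):

```python
# Black-Scholes price of a European call option (standard textbook formula).
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution function via erf."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, rate, vol, t):
    """European call value under Black-Scholes assumptions.

    spot, strike : current price and exercise price
    rate         : continuously compounded risk-free rate
    vol          : annualized volatility
    t            : time to expiry in years
    """
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

# At-the-money call, 5% rate, 20% volatility, one year to expiry.
price = bs_call(spot=100.0, strike=100.0, rate=0.05, vol=0.2, t=1.0)
```

Binomial trees and Monte Carlo converge to this closed form for plain-vanilla payoffs, which makes it the usual benchmark when validating those numerical methods.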
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2014-01-01
Monitoring the misuse of drugs and the abuse of substances and methods potentially or evidently improving athletic performance by analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to the continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, many of which present a temptation for sportsmen and women due to assumed/attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing that was published between October 2012 and September 2013 is summarized and reviewed with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.
Career Decision Statuses among Portuguese Secondary School Students: A Cluster Analytical Approach
ERIC Educational Resources Information Center
Santos, Paulo Jorge; Ferreira, Joaquim Armando
2012-01-01
Career indecision is a complex phenomenon and an increasing number of authors have proposed that undecided individuals do not form a group with homogeneous characteristics. This study examines career decision statuses among a sample of 362 12th-grade Portuguese students. A cluster-analytical procedure, based on a battery of instruments designed to…
Validating multiplexes for use in conjunction with modern interpretation strategies.
Taylor, Duncan; Bright, Jo-Anne; McGoven, Catherine; Hefford, Christopher; Kalafut, Tim; Buckleton, John
2016-01-01
In response to requests from the forensic community, commercial companies are generating larger, more sensitive, and more discriminating STR multiplexes. These multiplexes are now applied to a wider range of samples including complex multi-person mixtures. In parallel there is an overdue reappraisal of profile interpretation methodology. Aspects of this reappraisal include: (1) the need for a quantitative understanding of allele and stutter peak heights and their variability; (2) an interest in reassessing the utility of smaller peaks below the often-used analytical threshold; (3) a need to understand not just the occurrence of peak drop-in but also the height distribution of such peaks; and (4) a need to understand the limitations of the multiplex-interpretation strategy pair implemented. In this work we present a full scheme for validation of a new multiplex that is suitable for informing modern interpretation practice. We predominantly use GlobalFiler™ as an example multiplex but we suggest that the aspects investigated here are fundamental to introducing any multiplex in the modern interpretation environment. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ivanov, Yu. A.
2007-12-01
An analytical review is given of Russian and foreign measurement instruments employed in a system for automatically monitoring the water chemistry of the reactor coolant circuit and used in the development of projects of nuclear power stations equipped with VVER-1000 reactors and the nuclear station project AES 2006. The results of experience gained from the use of such measurement instruments at nuclear power stations operating in Russia and abroad are presented.
Elemental analyses of modern dust in southern Nevada and California
Reheis, M.C.; Budahn, J.R.; Lamothe, P.J.
1999-01-01
Selected samples of modern dust collected in marble traps at sites in southern Nevada and California (Reheis and Kihl, 1995; Reheis, 1997) have been analyzed for elemental composition using instrumental neutron activation analysis (INAA) and inductively coupled plasma atomic emission spectroscopy (ICP-AES) and inductively coupled plasma mass spectroscopy (ICP-MS). For information on these analytical techniques and their levels of precision and accuracy, refer to Baedecker and McKown (1987) for INAA, to Briggs (1996) for ICP-AES, and to Briggs and Meier (1999) for ICP-MS. This report presents the elemental compositions obtained using these techniques on dust samples collected from 1991 through 1997.The dust-trap sites were established at varying times; some have been maintained since 1984, others since 1991. For details on site location, dust-trap construction, and collection techniques, see Reheis and Kihl (1995) and Reheis (1997). Briefly, the trap consists of a coated angel-food cake pan painted black on the outside and mounted on a post about 2 m above the ground. Glass marbles rest on a circular piece of galvanized hardware cloth (now replaced by stainless-steel mesh), which is fitted into the pan so that it rests 3-4 cm below the rim. The 2-m height eliminates most saltating sand-sized particles. The marbles simulate the effect of a gravelly fan surface and prevent dust that has filtered or washed into the bottom of the pan from being blown back out. The dust traps are fitted with two metal straps looped in an inverted basket shape; the top surfaces of the straps are coated with a sticky material that effectively discourages birds from roosting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hallbert, Bruce Perry; Thomas, Kenneth David
2015-10-01
Reliable instrumentation, information, and control (II&C) systems technologies are essential to ensuring safe and efficient operation of the U.S. light water reactor (LWR) fleet. These technologies affect every aspect of nuclear power plant (NPP) and balance-of-plant operations. In 1997, the National Research Council conducted a study concerning the challenges involved in modernization of digital instrumentation and control systems in NPPs. Their findings identified the need for new II&C technology integration.
ERIC Educational Resources Information Center
Gaševic, Dragan; Jovanovic, Jelena; Pardo, Abelardo; Dawson, Shane
2017-01-01
The use of analytic methods for extracting learning strategies from trace data has attracted considerable attention in the literature. However, there is a paucity of research examining any association between learning strategies extracted from trace data and responses to well-established self-report instruments and performance scores. This paper…
Nuclear weapons modernizations
NASA Astrophysics Data System (ADS)
Kristensen, Hans M.
2014-05-01
This article reviews the nuclear weapons modernization programs underway in the world's nine nuclear weapons states. It concludes that despite significant reductions in overall weapons inventories since the end of the Cold War, the pace of reductions is slowing - four of the nuclear weapons states are even increasing their arsenals, and all the nuclear weapons states are busy modernizing their remaining arsenals in what appears to be a dynamic and counterproductive nuclear competition. The author questions whether perpetual modernization combined with no specific plan for the elimination of nuclear weapons is consistent with the nuclear Non-Proliferation Treaty and concludes that new limits on nuclear modernizations are needed.
ERIC Educational Resources Information Center
Shaban, Zakariyya Shaban
2015-01-01
This study aimed to investigate the extent to which communication-skills textbooks include modernism and contemporary values, whether those values follow an experiential sequence, and the orientation behind this emphasis. A list of 10 modernism and contemporary values was compiled. Content analysis was used as a tool in collecting data,…
There have been a number of revolutionary developments during the past decade that have led to a much more comprehensive understanding of per- and polyfluoroalkyl substances (PFASs) in the environment. Improvements in analytical instrumentation have made liquid chromatography tri...
ERIC Educational Resources Information Center
Feng, Z. Vivian; Buchman, Joseph T.
2012-01-01
The potential of replacing petroleum fuels with renewable biofuels has drawn significant public interest. Many states have imposed biodiesel mandates or incentives to use commercial biodiesel blends. We present an inquiry-driven experiment where students are given the tasks to gather samples, develop analytical methods using various instrumental…
MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.
Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang
2005-01-01
MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.
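The precision and recovery criteria quoted above (between-day CV under 3%, recoveries within 5% of assigned values) can be made concrete with a short sketch; the functions are standard definitions, and the numbers below are illustrative, not values from the study.

```python
import statistics

def percent_cv(values):
    """Coefficient of variation as a percentage: 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def percent_recovery(measured, assigned):
    """Recovery of a reference material relative to its assigned value."""
    return 100.0 * measured / assigned

# Hypothetical between-day results (mmol/L) for one control level
daily_means = [5.02, 5.10, 4.97, 5.05, 5.08]
cv = percent_cv(daily_means)        # checked against the <3% criterion
rec = percent_recovery(5.05, 5.00)  # checked against the +/-5% criterion
print(round(cv, 2), round(rec, 1))
```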
Model of personal consumption under conditions of modern economy
NASA Astrophysics Data System (ADS)
Rakhmatullina, D. K.; Akhmetshina, E. R.; Ignatjeva, O. A.
2017-12-01
In the conditions of the modern economy, in connection with the development of production, the expansion of the market for goods and services, its differentiation, and the active use of marketing tools in the sphere of sales, changes occur in the system of values and consumer needs. The motives that drive consumers are transformed, stimulating them to act. The article presents a model of personal consumption that takes into account modern trends in consumer behavior. The consumer, in making a choice, seeks to maximize the overall utility from consumption, both physiological and socio-psychological satisfaction, in accordance with his expectations, preferences and conditions of consumption. The system of his preferences is formed under the influence of factors of a different nature. It is also shown that the structure of consumer spending allows us to characterize and predict further consumer behavior in the market. Based on the proposed model and analysis of current trends in consumer behavior, conclusions and recommendations have been made that can be used by legislative and executive government bodies, business organizations, research centres and other structures to form a methodological and analytical tool for preparing a forecast model of consumption.
Immersion versus interactivity and analytic field.
Civitarese, Giuseppe
2008-04-01
Losing oneself in a story, a film or a picture is nothing but another step in the suspension of disbelief that permits one to become immersed in the 'novel' of reality. It is not by chance that the text-world metaphor informs classical aesthetics, which, more than anything else, emphasizes emotional involvement. On the contrary, as in much of modern art, self-reflexivity and metafictional attention to the rhetoric of the real, to the framework, to the conventions and to the processes of meaning production all involve a disenchanted, detached and sceptical vision--in short, an aesthetics of the text as game. By analogy, any analytic style or model that aims to produce a transformative experience must satisfactorily resolve the conflict between immersion (the analyst's emotional participation and sticking to the dreamlike or fictional climate of the session, dreaming knowing it's a dream) and interactivity (for the most part, interpretation as an anti-immersive device that 'wakes' one from fiction and demystifies consciousness). In analytic field theory the setting can be defined--because of the weight given to the performativity of language, to the sensory matrix of the transference and to the transparency of the medium--as the place where an ideal balance is sought between immersion and interaction.
State-of-the-art Instruments for Detecting Extraterrestrial Life
NASA Technical Reports Server (NTRS)
Bada, Jeffrey L.
2003-01-01
In the coming decades, state-of-the-art spacecraft-based instruments that can detect key components associated with life as we know it on Earth will directly search for extinct or extant extraterrestrial life in our solar system. Advances in our analytical and detection capabilities, especially those based on microscale technologies, will be important in enhancing the abilities of these instruments. Remote sensing investigations of the atmospheres of extrasolar planets could provide evidence of photosynthetic-based life outside our solar system, although less advanced life will remain undetectable by these methods. Finding evidence of extraterrestrial life would have profound consequences both with respect to our understanding of chemical and biological evolution, and whether the biochemistry on Earth is unique in the universe.
Mikhnevo: from seismic station no. 1 to a modern geophysical observatory
NASA Astrophysics Data System (ADS)
Adushkin, V. V.; Ovchinnikov, V. M.; Sanina, I. A.; Riznichenko, O. Yu.
2016-01-01
The Mikhnevo seismic station was founded in accordance with directive no. 1134 RS of the Council of Ministers of the Soviet Union of February 6, 1954. The station, installed south of Moscow, began operations monitoring nuclear tests in the United States and England in 1954. For decades this station was the leading experimental base for elaborating new technical solutions and methods for monitoring nuclear explosions, equipped with modern seismological instruments. At present, the focus of its activities has moved from military applications to fundamental geophysical research. The station preserves its leading position in seismological observations owing to the development of national high-performance digital instruments and the creation of a small-aperture seismic array, the only one in the central part of European Russia, which is capable of recording weak seismic events with ML ≥ 1.5 within a distance of 100 km.
Low-Level Analytical Methodology Updates to Support Decontaminant Performance Evaluations
2011-06-01
from EPDM and tire rubber coupon materials that were spiked with a known amount of the chemical agent VX, treated with bleach decontaminant, and...to evaluate the performance of bleach decontaminant on EPDM and tire rubber coupons. Dose-confirmation or Tool samples were collected by delivering...components • An aging or damaged analytical column • Dirty detector • Other factors related to general instrument and/or sample analysis performance
Applications of the Analytical Electron Microscope to Materials Science
NASA Technical Reports Server (NTRS)
Goldstein, J. I.
1992-01-01
In the last 20 years, the analytical electron microscope (AEM) has allowed investigators to obtain chemical and structural information from regions less than 50 nanometers in diameter in thin samples of materials and to explore problems where reactions occur at boundaries and interfaces or within small particles or phases in bulk samples. Examples of the application of the AEM to materials science problems are presented in this paper and demonstrate the usefulness and the future potential of this instrument.
Boggess, Andrew; Crump, Stephen; Gregory, Clint; ...
2017-12-06
Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automation and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties for these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and a mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curvature. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.
Fonseca, Alexandre Brasil; de Souza, Thaís Salema Nogueira; Frozi, Daniela Sanches; Pereira, Rosangela Alves
2011-09-01
The scope of this work was to illustrate what dietary modernity represents for sociology and anthropology, based on the bibliographic review discussed in this article. Initially, the presence of the theme of food and nutrition in social science studies was assessed, focusing on approaches related to dietary modernity, especially as found in the works of Claude Fischler. The main subjects of discussion were related to food and nutrition and changes in the work environment, the expansion of commerce, the feminization of society and the question of identity. By understanding the food phenomenon and its consumption through a more qualitative approach, it is possible to make progress in configuring the nutritional sciences, adopting a comprehensive approach to food and nutrition in this day and age. Future studies should be dedicated to investigating food consumption as a social phenomenon in order to aggregate new analytical components to the biomedically oriented body of results.
A Review of Calibration Transfer Practices and Instrument Differences in Spectroscopy.
Workman, Jerome J
2018-03-01
Calibration transfer for use with spectroscopic instruments, particularly for near-infrared, infrared, and Raman analysis, has been the subject of multiple articles, research papers, book chapters, and technical reviews. A myriad of approaches have been published and claims made for resolving the problems associated with transferring calibrations; however, the capability of attaining identical results over time from two or more instruments using an identical calibration still eludes technologists. Calibration transfer, in a precise definition, refers to a series of analytical approaches or chemometric techniques used to apply a single spectral database, and the calibration model developed using that database, to two or more instruments with statistically retained accuracy and precision. Ideally, one would develop a single calibration for any particular application, move it indiscriminately across instruments, and achieve identical analysis or prediction results. There are many technical aspects involved in such precision calibration transfer, related to the measuring instrument reproducibility and repeatability, the reference chemical values used for the calibration, the multivariate mathematics used for calibration, and sample presentation repeatability and reproducibility. Ideally, a multivariate model developed on a single instrument would provide a statistically identical analysis when used on other instruments following transfer. This paper reviews common calibration transfer techniques, mostly related to instrument differences, and the mathematics of the uncertainty between instruments when making spectroscopic measurements of identical samples. It does not specifically address calibration maintenance or reference laboratory differences.
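A minimal sketch of one classical transfer approach, univariate slope/bias correction between a master and a secondary instrument, is shown below on simulated data. Real transfers typically use multivariate methods such as piecewise direct standardization; all values here are hypothetical.

```python
import numpy as np

# Simulated predictions of the same samples on two instruments:
# the secondary instrument shows a small gain and offset error.
master = np.array([10.1, 12.4, 15.0, 18.2, 21.5])
secondary = 1.05 * master + 0.8

# Fit a slope/bias correction mapping secondary readings onto the master scale.
slope, bias = np.polyfit(secondary, master, 1)

corrected = slope * secondary + bias
print(float(np.max(np.abs(corrected - master))))  # residuals near zero
```

Because the simulated mismatch is exactly linear, the correction recovers the master values to floating-point precision; with real spectra, residual error remains and motivates the multivariate techniques the review surveys.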
Salinas-Castillo, Alfonso; Morales, Diego P; Lapresta-Fernández, Alejandro; Ariza-Avidad, María; Castillo, Encarnación; Martínez-Olmos, Antonio; Palma, Alberto J; Capitan-Vallvey, Luis Fermin
2016-04-01
A portable reconfigurable platform for copper (Cu(II)) determination based on luminescent carbon dot (Cdots) quenching is described. The electronic setup consists of a light-emitting diode (LED) as the carbon dot optical exciter and a photodiode as a light-to-current converter integrated in the same instrument. Moreover, the overall analog conditioning is simply performed with one integrated solution, a field-programmable analog array (FPAA), which makes it possible to reconfigure the filter and gain stages in real time. This feature provides adaptability to use the platform as an analytical probe for carbon dots coming from different batches with some variations in luminescence characteristics. Calibration functions fitting a modified Stern-Volmer equation were obtained using luminescence signals from Cdots quenching by Cu(II). The analytical applicability of the reconfigurable portable instrument for Cu(II) using Cdots has been successfully demonstrated in tap water analysis.
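The quenching calibration described above follows Stern-Volmer behavior. A minimal sketch using the classical form F0/F = 1 + Ksv[Q] (the paper fits a modified variant) with hypothetical concentrations and a hypothetical quenching constant:

```python
import numpy as np

# Hypothetical Cu(II) concentrations (uM) and quenched luminescence ratios.
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
K_sv_true = 0.12                     # assumed quenching constant, 1/uM
f0_over_f = 1.0 + K_sv_true * conc   # classical Stern-Volmer: F0/F = 1 + Ksv[Q]

# Linear least-squares estimate of Ksv from the calibration points.
K_sv_fit = np.polyfit(conc, f0_over_f - 1.0, 1)[0]
print(round(float(K_sv_fit), 3))
```

In practice the ratios carry noise and batch-to-batch variation, which is exactly what the FPAA-based gain reconfiguration in the instrument is meant to accommodate.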
Total organic carbon (TOC) and dissolved organic carbon (DOC) have long been used to estimate the amount of natural organic matter (NOM) found in raw and finished drinking water. In recent years, computer automation and improved instrumental analysis technologies have created a ...
Gyroscopic Instruments for Instrument Flying
NASA Technical Reports Server (NTRS)
Brombacher, W G; Trent, W C
1938-01-01
The gyroscopic instruments commonly used in instrument flying in the United States are the turn indicator, the directional gyro, the gyromagnetic compass, the gyroscopic horizon, and the automatic pilot. These instruments are described. Performance data and the method of testing in the laboratory are given for the turn indicator, the directional gyro, and the gyroscopic horizon. Apparatus for driving the instruments is discussed.
NASA Astrophysics Data System (ADS)
Black, S.; Hynek, B. M.; Kierein-Young, K. S.; Avard, G.; Alvarado-Induni, G.
2015-12-01
Proper characterization of mineralogy is an essential part of geologic interpretation. This process becomes even more critical when attempting to interpret the history of a region remotely, via satellites and/or landed spacecraft. Orbiters and landed missions to Mars carry with them a wide range of analytical tools to aid in the interpretation of Mars' geologic history. However, many instruments make a single type of measurement (e.g., APXS: elemental chemistry; XRD: mineralogy), and multiple data sets must be utilized to develop a comprehensive understanding of a sample. Hydrothermal alteration products often exist in intimate mixtures, and vary widely across a site due to changing pH, temperature, and fluid/gas chemistries. These characteristics require that we develop a detailed understanding regarding the possible mineral mixtures that may exist, and their detectability in different instrument data sets. This comparative analysis study utilized several analytical methods on existing or planned Mars rovers (XRD, Raman, LIBS, Mössbauer, and APXS) combined with additional characterization (thin section, VNIR, XRF, SEM-EMP) to develop a comprehensive suite of data for hydrothermal alteration products collected from Poás and Turrialba volcanoes in Costa Rica. Analyzing the same samples across a wide range of instruments allows for direct comparisons of results, and identification of instrumentation "blind spots." This provides insight into the ability of in-situ analyses to comprehensively characterize sites on Mars exhibiting putative hydrothermal characteristics, such as the silica and sulfate deposits at Gusev crater [e.g., Squyres et al., 2008], as well as valuable information for future mission planning and data interpretation. References: Squyres et al. (2008), Detection of Silica-Rich Deposits on Mars, Science, 320, 1063-1067, doi:10.1126/science.1155429.
Meta-analytic guidelines for evaluating single-item reliabilities of personality instruments.
Spörrle, Matthias; Bekk, Magdalena
2014-06-01
Personality is an important predictor of various outcomes in many social science disciplines. However, when personality traits are not the principal focus of research, for example, in global comparative surveys, it is often not possible to assess them extensively. In this article, we first provide an overview of the advantages and challenges of single-item measures of personality, a rationale for their construction, and a summary of alternative ways of assessing their reliability. Second, using seven diverse samples (Ntotal = 4,263) we develop the SIMP-G, the German adaptation of the Single-Item Measures of Personality, an instrument assessing the Big Five with one item per trait, and evaluate its validity and reliability. Third, we integrate previous research and our data into a first meta-analysis of single-item reliabilities of personality measures, and provide researchers with guidelines and recommendations for the evaluation of single-item reliabilities. © The Author(s) 2013.
NASA Astrophysics Data System (ADS)
Gerontas, Apostolos
2014-08-01
Chromatographic instrumentation has been highly influential in shaping modern chemical practice, and yet it has been largely overlooked by the history of science. Gas chromatography in the 1960s was considered the analytical technique closest to becoming dominant, and, as the first automated chromatography, it set the standards that all subsequent chromatographic instrumentation had to fulfill. Networks of specialists, groups of actors, corporate strategies and the analytical practice itself were all affected, in many ways, by the entrance of gas chromatography into the chemical laboratory and the instrumentation market. This paper gives a view of the early history of gas chromatography instrumentation, relates it to the broader research-technology phenomenon, and discusses issues of education and group reproduction among the technologist groups of the era. The chaotic elements of knowledge transfer during the instrumentation revolution in chemistry are highlighted and connected to the observable radical innovation of the period.
NASA Astrophysics Data System (ADS)
Heyd, R. S.; McArthur, G. A.; Leis, R.; Fennema, A.; Wolf, N.; Schaller, C. J.; Sutton, S.; Plassmann, J.; Forrester, T.; Fine, K.
2018-04-01
The HiRISE ground data system is a mature data processing system in operation for over 12 years. The experience gained from this system will be applied to developing a new and more modern GDS to process data from the CaSSIS instrument.
Paper-based analytical devices for environmental analysis.
Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S
2016-03-21
The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; the theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.
2013-12-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produce societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to
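The MapReduce-style analytics mentioned above can be sketched generically as a map step producing per-chunk partials and a reduce step combining them into a global statistic. This is an illustration of the pattern only, not the MERRA/AS API; the chunked values are hypothetical.

```python
from functools import reduce

# Map: each "chunk" of a gridded variable yields a (sum, count) partial.
def map_chunk(chunk):
    return (sum(chunk), len(chunk))

# Reduce: combine two partials into one.
def combine(a, b):
    return (a[0] + b[0], a[1] + b[1])

# Hypothetical temperature values (K) split across three data chunks.
chunks = [[280.1, 281.3], [279.8, 280.6, 281.0], [280.4]]
total, count = reduce(combine, map(map_chunk, chunks))
print(round(total / count, 2))  # global mean computed without centralizing the data
```

The appeal of the pattern for data-proximal analytics is that the map step can run where each chunk is stored, and only the small partials travel over the network.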
Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd
2015-01-01
In biomonitoring, the ability to quantify levels of target analytes in biological samples accurately and precisely involves the use of highly sensitive and selective instrumentation, such as tandem mass spectrometers, and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
Ottaway, Josh; Farrell, Jeremy A; Kalivas, John H
2013-02-05
An essential part of calibration is establishing the analyte calibration reference samples. These samples must characterize the sample matrix and measurement conditions (chemical, physical, instrumental, and environmental) of any sample to be predicted. Calibration usually requires measuring spectra for numerous reference samples in addition to determining the corresponding analyte reference values. Both tasks are typically time-consuming and costly. This paper reports on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory-prepared or determined reference values. Instead, an analyte pure component spectrum is used in conjunction with nonanalyte spectra for calibration. Nonanalyte spectra can be from different sources, including pure component interference samples, blanks, and constant analyte samples. The approach is also applicable to calibration maintenance when the analyte pure component spectrum is measured in one set of conditions and nonanalyte spectra are measured in new conditions. The PCTR method balances the trade-offs between calibration model shrinkage and the degree of orthogonality to the nonanalyte content (model direction) in order to obtain accurate predictions. Using visible and near-infrared (NIR) spectral data sets, the PCTR results are comparable to those obtained using ridge regression (RR) with reference calibration sets. The flexibility of PCTR also allows including reference samples if such samples are available.
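The abstract benchmarks PCTR against ridge regression (RR) with reference calibration sets. A minimal RR sketch under the usual Tikhonov formulation is shown below; the spectra matrix X, reference values y, and penalty lam are all illustrative, not data from the paper.

```python
import numpy as np

# Ridge-regression (Tikhonov-regularized) calibration sketch.
# Rows of X are reference spectra, y holds analyte reference values,
# lam is the ridge penalty. All numbers are synthetic for illustration.

def ridge_calibrate(X, y, lam):
    n_channels = X.shape[1]
    # Regularized normal equations: b = (X'X + lam*I)^(-1) X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ y)

rng = np.random.default_rng(0)
true_b = np.array([0.5, -0.2, 0.1])            # "true" regression vector
X = rng.normal(size=(20, 3))                   # 20 reference spectra, 3 channels
y = X @ true_b + 0.01 * rng.normal(size=20)    # reference values with small noise
b_hat = ridge_calibrate(X, y, lam=1e-3)
print(np.round(b_hat, 2))  # close to [0.5, -0.2, 0.1]
```

PCTR replaces the reference set (X, y) with a pure-component spectrum plus nonanalyte spectra, but the shrinkage-versus-fit trade-off controlled by lam is the same mechanism.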
Modern Climate Analogues of Late-Quaternary Paleoclimates for the Western United States.
NASA Astrophysics Data System (ADS)
Mock, Cary Jeffrey
mountainous areas as suggested by paleoclimatic evidence. Modern analogues of past climates supplement modeling approaches by providing information below the resolution of model simulations. Analogues can be used to examine the controls of spatial paleoclimatic variation if sufficient instrumental data and paleoclimatic evidence are available, and if one carefully exercises uniformitarianism when extrapolating modern relationships to the past.
Mineralogy: a modern approach to teaching a traditional discipline
NASA Astrophysics Data System (ADS)
Cook, G. W.
2011-12-01
Mineralogy has traditionally been a primary component of the undergraduate geoscience curriculum. In recent years, there has been a trend in which mineralogy and petrology have been combined into Earth Materials courses. This is unfortunate, as these disciplines each have much to offer students, and content once considered essential is eliminated out of necessity. Mineralogy is still fundamental to students' understanding of the Earth and Earth processes. Using a modern approach to time-honored concepts, I teach a quarter-long Introductory Mineralogy class offered through the Scripps Institution of Oceanography at the University of California, San Diego. Student evaluations of this course unequivocally indicate a high degree of learning and interest in the material, confirming that mineralogy continues to be a valuable class into the 21st century. While much of the content remains similar to what has been taught over the last century, my strategy involves a well-balanced approach to old and new. The first third of the course is background, including the relevance of mineralogy, crystal chemistry, and crystallography; the second third of the course is systematic mineralogy using the Dana system; the last third of the course is devoted to understanding optical mineralogy, using modern analytical equipment such as XRD and SEM, and learning to use the petrographic microscope. Throughout the quarter, a strong emphasis is placed on the importance of hand-sample identification. Field work, traditionally not emphasized in mineralogy courses, has been re-introduced to the curriculum. I use modern technology to facilitate and support student learning. A lecture-based approach is employed with carefully crafted and organized PowerPoint presentations. PowerPoint lectures can be effective and highly engaging. The key is to ensure that the lectures are not overly reliant on text, instead relying on diagrams, charts, photos, and embedded media such as 3-D animations (e.g., to teach
Current advances in synchrotron radiation instrumentation for MX experiments
Owen, Robin L.; Juanhuix, Jordi; Fuchs, Martin
2017-01-01
Following pioneering work 40 years ago, synchrotron beamlines dedicated to macromolecular crystallography (MX) have improved in almost every aspect as instrumentation has evolved. Beam sizes and crystal dimensions are now on the single-micron scale, while data can be collected from proteins with molecular weights over 10 MDa and from crystals with unit cell dimensions over 1000 Å. Furthermore, it is possible to collect a complete data set in seconds and obtain the resulting structure in minutes. The impact of MX synchrotron beamlines and their evolution is reflected in their scientific output, and MX is now the method of choice for a variety of aims from ligand binding to structure determination of membrane proteins, viruses and ribosomes, resulting in a much deeper understanding of the machinery of life. A main driving force of beamline evolution has been advances in almost every aspect of the instrumentation comprising a synchrotron beamline. In this review we aim to provide an overview of the current status of instrumentation at modern MX experiments. The most critical optical components are discussed, as are aspects of endstation design, sample delivery, visualization and positioning, the sample environment, beam shaping, detectors, and data acquisition and processing. PMID:27046341
NASA Astrophysics Data System (ADS)
Behling, Hermann; da Costa, Marcondes Lima
2004-12-01
A coastal environment has been interpreted from 110 cm thick mudstone deposits found at the base of a 10 m immature laterite profile, which forms the modern coastal cliff on Mosqueiro Island in northeastern Pará state, northern Brazil. The late Tertiary sediment deposits of the Barreiras Formation are studied by multi-element geochemistry and pollen analyses. The mineralogical and geochemical results show that the gray, organic-rich deposits are composed of kaolinite, quartz, and illite/muscovite, as well as pyrite and anatase. They are rich in SiO2, Al2O3, and some FeO. The composition is homogeneous, indicating that the detritus source area is formed of lateritic soils derived from rocks of acid composition. Their chemical composition, including trace elements, is somewhat comparable to continental shale, and the values are below the upper continental Earth crust composition. The pollen analytical data document that the mudstone deposits were formed by an ancient mangrove ecosystem. Mineralogical, geochemical, and pollen analytical data obtained from late Tertiary mangrove deposits are compared with modern mangrove deposits from the Bragança Peninsula of the northeastern coast of Pará state. Although the pollen composition of the deposits is very similar to the modern one, the geochemical and mineralogical composition is different. Smectite was only found in the modern deposit; illite/mica occurs in the ancient deposit, along with Mg, K, and Na. The pollen signature and detrital minerals (kaolinite, quartz and anatase) found in both mangrove deposits show that during the Miocene, humid tropical climate conditions prevailed, similar to modern conditions.
ERIC Educational Resources Information Center
Gerontas, Apostolos
2014-01-01
Chromatographic instrumentation has been highly influential in shaping modern chemical practice, and yet it has been largely overlooked by the history of science. Gas chromatography in the 1960s was considered the analytical technique closest to becoming dominant and, being the first automated chromatography, set the standards that all the subsequent…
Temperature Dependence of Viscosities of Common Carrier Gases
ERIC Educational Resources Information Center
Sommers, Trent S.; Nahir, Tal M.
2005-01-01
Theoretical and experimental evidence for the temperature dependence of the viscosities of real gases is described, suggesting that this dependence is greater than that predicted by the kinetic theory of gases. The experimental results were obtained using common modern instrumentation and could be reproduced by students in analytical or…
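A standard empirical way to capture this stronger-than-√T rise is Sutherland's formula (not necessarily the relation used in the article). The reference viscosity and Sutherland constant below are commonly tabulated nominal values for nitrogen and should be treated as illustrative.

```python
# Sutherland's formula for gas viscosity versus temperature, compared with
# the hard-sphere kinetic-theory prediction eta ~ sqrt(T). Defaults are
# nominal values for nitrogen (illustrative, not from the article).

def sutherland_viscosity(T, T0=300.55, eta0=17.81e-6, S=111.0):
    """eta(T) = eta0 * (T0 + S)/(T + S) * (T/T0)**1.5, in Pa*s."""
    return eta0 * (T0 + S) / (T + S) * (T / T0) ** 1.5

ratio_sutherland = sutherland_viscosity(400.0) / sutherland_viscosity(300.0)
ratio_sqrt = (400.0 / 300.0) ** 0.5  # hard-sphere kinetic theory
print(ratio_sutherland > ratio_sqrt)  # True: viscosity rises faster than sqrt(T)
```

The comparison between the two ratios is the quantitative version of the abstract's claim: over 300-400 K the measured rise exceeds the √T prediction.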
GMI Instrument Spin Balance Method, Optimization, Calibration, and Test
NASA Technical Reports Server (NTRS)
Ayari, Laoucet; Kubitschek, Michael; Ashton, Gunnar; Johnston, Steve; Debevec, Dave; Newell, David; Pellicciotti, Joseph
2014-01-01
The Global Microwave Imager (GMI) instrument must spin at a constant rate of 32 rpm continuously for the 3-year mission life. Therefore, GMI must be very precisely balanced about the spin axis and center of gravity (CG) to maintain stable scan pointing and to minimize disturbances imparted to the spacecraft and attitude control on-orbit. The GMI instrument is part of the core Global Precipitation Measurement (GPM) spacecraft and is used to make calibrated radiometric measurements at multiple microwave frequencies and polarizations. The GPM mission is an international effort managed by the National Aeronautics and Space Administration (NASA) to improve climate, weather, and hydro-meteorological predictions through more accurate and frequent precipitation measurements. Ball Aerospace and Technologies Corporation (BATC) was selected by NASA Goddard Space Flight Center to design, build, and test the GMI instrument. The GMI design had to meet a challenging set of spin balance requirements and had to be brought into simultaneous static and dynamic spin balance after the entire instrument was already assembled and before environmental tests began. The focus of this contribution is on the analytical and test activities undertaken to meet the challenging spin balance requirements of the GMI instrument. The novel process of measuring the residual static and dynamic imbalances with a very high level of accuracy and precision is presented together with the prediction of the optimal balance masses and their locations.
Analytic integrable systems: Analytic normalization and embedding flows
NASA Astrophysics Data System (ADS)
Zhang, Xiang
In this paper we mainly study the existence of analytic normalizations and the normal forms of finite-dimensional complete analytic integrable dynamical systems. More precisely, we prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0), with B having eigenvalues of modulus different from 1 and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ=Ax+f(x) in (C^n,0), with A having nonzero eigenvalues and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve those in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.
Impulsive-Analytic Disposition in Mathematical Problem Solving: A Survey and a Mathematics Test
ERIC Educational Resources Information Center
Lim, Kien H.; Wagler, Amy
2012-01-01
The Likelihood-to-Act (LtA) survey and a mathematics test were used in this study to assess students' impulsive-analytic disposition in the context of mathematical problem solving. The results obtained from these two instruments were compared to those obtained using two widely-used scales: Need for Cognition (NFC) and Barratt Impulsivity Scale…
Astronomical Optical Interferometry. I. Methods and Instrumentation
NASA Astrophysics Data System (ADS)
Jankov, S.
2010-12-01
The previous decade has seen the achievement of large interferometric projects, including 8-10 m telescopes and 100 m class baselines. Modern computer and control technology has enabled the interferometric combination of light from separate telescopes in the visible and infrared regimes as well. Imaging with milli-arcsecond (mas) resolution and astrometry with micro-arcsecond (muas) precision have thus become reality. Here, I review the methods and instrumentation corresponding to the current state of the field of astronomical optical interferometry. First, this review summarizes the development from the pioneering works of Fizeau and Michelson. Next, the fundamental observables are described, followed by a discussion of the basic design principles of modern interferometers. The basic interferometric techniques, such as speckle and aperture masking interferometry, aperture synthesis, and nulling interferometry, are discussed as well. Using the experience of past and existing facilities to illustrate important points, I consider particularly the new generation of large interferometers that has recently been commissioned (most notably, the CHARA, Keck, VLT and LBT Interferometers). Finally, I discuss the longer-term future of optical interferometry, including the possibilities of new large-scale ground-based projects and prospects for space interferometry.
Islamic Modernism and Architectural Modernism of Muhammadiyah’s Lio Mosque
NASA Astrophysics Data System (ADS)
Prajawisastra, A. F.; Aryanti, T.
2017-03-01
The Muhammadiyah’s Lio Mosque is one of the masterpieces of Achmad Noe’man, the great Indonesian mosque architect. The mosque was built as a community mosque at the center of Muhammadiyah’s quarter in Garut, West Java, in conjunction with the construction of the district’s Muhammadiyah branch. Departing from prevailing forms, the mosque has neither a dome nor a tajug tumpang tiga (three-tiered pyramidal roof) like other mosques nearby, but instead uses a gable roof and towering minarets. This article aims to analyze the architecture of the Lio Mosque and to examine Achmad Noe’man’s interpretation of modernism, both Islamic modernism and architectural modernism, as reflected in the mosque design. Employing a qualitative approach, this study used observation and interviews with the mosque’s stakeholders. This article argues that the ideology of modernism, held by Achmad Noe’man and the Muhammadiyah organization, was embodied in the Lio Mosque architecture.
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
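The linear-algebra formulation can be illustrated with breadth-first search, the canonical GraphBLAS example: each BFS step is a matrix-vector product over a Boolean-like semiring. In this sketch a small dense NumPy array stands in for a real sparse backend, and the graph is a toy example.

```python
import numpy as np

# BFS expressed as repeated matrix-vector products: the frontier vector is
# multiplied by the adjacency matrix to reach the next level, masking out
# already-visited vertices. A[j, i] = 1 encodes a directed edge j -> i.

def bfs_levels(A, source):
    n = A.shape[0]
    levels = np.full(n, -1)              # -1 means "not reached"
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    level = 0
    while frontier.any():
        levels[frontier] = level
        # SpMV step: vertices reachable from the frontier, minus visited
        reached = (A.T @ frontier.astype(int)) > 0
        frontier = reached & (levels == -1)
        level += 1
    return levels

# Toy directed graph: 0 -> 1 -> 2, 0 -> 3
A = np.array([[0, 1, 0, 1],
              [0, 0, 1, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
print(bfs_levels(A, 0))  # [0 1 2 1]
```

A GraphBLAS backend performs exactly this loop, but with the matrix stored sparsely and the multiply-mask step fused into one accelerated semiring operation.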
ERIC Educational Resources Information Center
Edmonstone, Alastair Graham
2012-01-01
This dissertation will deal with the issue of incorporating modern piano music into the repertoire of piano students at the undergraduate degree level. For the purposes of this paper, it is assumed the students will be pursuing a major in music at a conservatory or university with piano as their instrument. In no way does this paper serve as an…
NASA Astrophysics Data System (ADS)
Xu, Y.; Pearson, S. P.; Kilbourne, K.
2013-12-01
Tropical sea surface temperature (SST) has been implicated as a driver of climate changes during the Medieval Climate Anomaly (MCA, 950-1300 A.D.) but little data exists from the tropical oceans during this time period. We collected three modern and seven sub-fossil Diploria strigosa coral colonies from an overwash deposit on Anegada, British Virgin Islands (18.73 °N, 63.33 °W) in order to reconstruct climate in the northeastern Caribbean and Tropical North Atlantic during the MCA. The first step in our reconstruction was to verify the climate signal from this species at this site. We sub-sampled the modern corals along thecal walls with an average sampling resolution of 11-13 samples per year. Sr/Ca ratios measured in the sub-samples were calibrated to temperature using three different calibration techniques (ordinary least squares, reduced major axis, and weighted least squares (WLS)) on the monthly data that includes the seasonal cycles and on the monthly anomaly data. WLS regression accounts for unequal errors in the x and y terms, so we consider it the most robust technique. The WLS regression slope between gridded SST and coral Sr/Ca is similar to the previous two calibrations of this species. Mean Sr/Ca for each of the three modern corals is 8.993 ± 0.004 mmol/mol, 9.127 ± 0.003 mmol/mol, and 8.960 ± 0.007 mmol/mol. These straddle the mean Diploria strigosa Sr/Ca found by Giry et al. (2010), 9.080 mmol/mol, at a site with nearly the same mean SST as Anegada (27.4 °C vs. 27.5 °C). The climatological seasonal cycles for SST derived from the modern corals are statistically indistinguishable from the seasonal cycles in the instrumental SST data. The coral-based seasonal cycles have ranges of 2.70 ± 0.31 °C, 2.65 ± 0.08 °C and 2.71 ± 0.53 °C. These results indicate that this calibration can be applied to our sub-fossil coral data. We applied the WLS calibration to monthly-resolution Sr/Ca data from multiple sub-fossil corals dating to the medieval
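A minimal weighted-least-squares (WLS) fit of the kind used for a Sr/Ca-SST calibration line can be sketched as follows. This simplified version weights each point by its inverse variance in y only (the study's WLS also accounts for errors in x), and all numbers are illustrative, not the coral data.

```python
import numpy as np

# Weighted least squares for a straight-line calibration Sr/Ca = a + b*SST.
# Each observation is weighted by 1/sigma^2. Values are synthetic.

def wls_fit(x, y, sigma_y):
    w = 1.0 / sigma_y**2
    W = np.diag(w)
    X = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta                                     # (intercept, slope)

sst = np.array([25.0, 26.0, 27.0, 28.0, 29.0])                       # deg C
sr_ca = 10.5 - 0.06 * sst + np.array([0.002, -0.001, 0.0, 0.001, -0.002])
sigma = np.array([0.003, 0.003, 0.002, 0.002, 0.004])                # mmol/mol
intercept, slope = wls_fit(sst, sr_ca, sigma)
print(round(slope, 3))  # about -0.06 mmol/mol per deg C
```

Applying the fitted intercept and slope in reverse, SST = (Sr/Ca − a)/b, is how a calibration like this converts sub-fossil Sr/Ca measurements into paleotemperatures.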
Utility perspective on USEPA analytical methods program redirection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, B.; Davis, M.K.; Krasner, S.W.
1996-11-01
The Metropolitan Water District of Southern California (Metropolitan) is a public, municipal corporation, created by the State of California, which wholesales supplemental water through 27 member agencies (cities and water districts). Metropolitan serves nearly 16 million people in an area along the coastal plain of Southern California that covers approximately 5200 square miles. Water deliveries have averaged up to 2.5 million acre-feet per year. Metropolitan's Water Quality Laboratory (WQL) conducts compliance monitoring of its source and finished drinking waters for chemical and microbial constituents. The laboratory maintains certification of a large number and variety of analytical procedures. The WQL operates in a 17,000-square-foot facility. The equipment is state-of-the-art analytical instrumentation. The staff consists of 40 professional chemists and microbiologists whose experience and expertise are extensive and often highly specialized. The staff turnover is very low, and the laboratory is consistently, efficiently, and expertly run.
Religion, modernity and foreign nurses in Iceland 1896-1930.
Björnsdóttir, Kristin; Malchau, Susanne
2004-09-01
This paper describes the influence of foreign nurses upon the development of modern healthcare services and the nursing profession in Iceland in the first three decades of the twentieth century. It represents a case study of how new ideas, traditions and practices migrated between countries and cultures in the twentieth century. Icelandic society was, at that time, still premodern in many ways. Healthcare institutions were almost nonexistent and the means of production were undeveloped. It was into this context that the idea of nursing as a professional activity was introduced. Groups of nurses, the Catholic Sisters of St Joseph of Chambéry and secular nurses, mainly from Denmark, came to the country to organize and provide healthcare services, of which nursing was of central importance. These groups were diasporas, in that they brought traditions and practices from other cultures. The Sisters of St Joseph built, owned and ran the first modern hospital in the country. The Danish nurses introduced nursing as a specialized field of work, in leprosy and tuberculosis nursing and by initiating public health nursing services. They were instrumental in promoting education as an important condition to becoming a nurse, and the development of an Icelandic nursing profession. These nurses were generally respected by the Icelandic people for their contributions and were received with interest and appreciation. The healthcare services introduced by these different groups of nurses reflected modern ways of living and a commitment to professionalism, which involved providing assistance to patients based on the best knowledge available and a philosophy of respect and care.
Guided-inquiry laboratory experiments to improve students' analytical thinking skills
NASA Astrophysics Data System (ADS)
Wahyuni, Tutik S.; Analita, Rizki N.
2017-12-01
This study aims to improve the quality of experiment implementation and the analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. The study was a classroom action research conducted in three cycles, carried out with 38 undergraduate students in the second semester of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets, and the undergraduate students' experimental procedures. Research data were analyzed using a quantitative-descriptive method. The increase in analytical thinking skills was measured using the normalized gain score and a statistical paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the quality of experiment implementation and analytical thinking skills. The N-gain score for analytical thinking skills increased, although at just 0.03 it fell in the low-gain category, as indicated by the experimental reports. Some undergraduate students had difficulties in detecting the relation of one part to another and to an overall structure. The findings suggest that giving feedback on procedural knowledge and experimental reports is important. Revising the experimental procedure, supplemented with some scaffolding questions, was also needed.
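The normalized gain reported above is conventionally computed as Hake's g = (post − pre) / (max − pre), the fraction of the possible improvement actually achieved. The scores below are illustrative, not the study's data.

```python
# Hake's normalized gain: how much of the available headroom between the
# pretest score and the maximum score was gained. Scores are illustrative.

def normalized_gain(pre, post, max_score=100.0):
    """Return (post - pre) / (max_score - pre)."""
    return (post - pre) / (max_score - pre)

print(round(normalized_gain(60.0, 61.2), 3))  # 0.03
```

A gain of 0.03 means only 3% of the possible improvement was realized, which is why the study classifies it as low despite being positive.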
ERIC Educational Resources Information Center
Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher
2013-01-01
A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…
Evolution of instruments for harvest of the skin grafts
Ameer, Faisal; Singh, Arun Kumar; Kumar, Sandeep
2013-01-01
Background: The harvest of autologous skin graft is considered to be a fundamental skill of the plastic surgeon. The objective of this article is to provide an interesting account of the development of skin grafting instruments as we use them today in various plastic surgical procedures. Materials and Methods: The authors present the chronological evolution and modifications of the skin grafting knife, including those contributions not often cited in the literature, using articles sourced from MEDLINE, ancient manuscripts, original quotes, techniques and illustrations. Results: This article traces the evolution of instrumentation for harvest of skin grafts from free hand techniques to precise modern automated methods. Conclusions: Although skin grafting is one of the basic techniques used in reconstructive surgery yet harvest of a uniform graft of desired thickness poses a challenge. This article is dedicated to innovators who have devoted their lives and work to the advancement of the field of plastic surgery. PMID:23960303
Contribution of Electrochemistry to the Biomedical and Pharmaceutical Analytical Sciences.
Kauffmann, Jean-Michel; Patris, Stephanie; Vandeput, Marie; Sarakbi, Ahmad; Sakira, Abdul Karim
2016-01-01
All analytical techniques have experienced major progress over the last ten years, and electroanalysis is part of this trend. The unique characteristics of the phenomena occurring at the electrode-solution interface, along with the variety of electrochemical methods currently available, allow for a broad spectrum of applications. Potentiometric, conductometric, voltammetric, and amperometric methods are briefly reviewed, with a critical view of the performance of the developed instrumentation and special emphasis on pharmaceutical and biomedical applications.
Modern Chinese: History and Sociolinguistics.
ERIC Educational Resources Information Center
Chen, Ping
This book presents a comprehensive and up-to-date account of the development of modern Chinese from the late 19th century up to the 1990s, concentrating on three major aspects: modern spoken Chinese, modern written Chinese, and the modern Chinese writing system. It describes and analyzes in detail, from historical and sociolinguistic perspectives,…
NASA Astrophysics Data System (ADS)
Shulyak, D.; Paladini, C.; Causi, G. Li; Perraut, K.; Kochukhov, O.
2014-09-01
By means of numerical experiments we explore the application of interferometry to the detection and characterization of abundance spots in chemically peculiar (CP) stars, using the brightest star ε UMa as a case study. We find that the best spectral regions to search for spots and stellar rotation signatures are in the visual domain. The spots can clearly be detected already at the first visibility lobe and their signatures can be uniquely disentangled from that of rotation. The spots and rotation signatures can also be detected in the near-infrared at low spectral resolution, but baselines longer than 180 m are needed for all potential CP candidates. According to our simulations, an instrument like VEGA (or its successor, e.g. the Fibered and spectrally Resolved Interferometric Equipment New Design) should be able to detect, in the visual, the effect of spots and spots+rotation, provided that the instrument is able to measure V2 ≈ 10-3, and/or closure phase. In the infrared, an instrument like AMBER but with longer baselines than the ones available so far would be able to measure rotation and spots. Our study provides necessary details about strategies of spot detection and the requirements for modern and planned interferometric facilities essential for CP star research.
ANALYTICAL CHEMISTRY DIVISION ANNUAL PROGRESS REPORT FOR PERIOD ENDING DECEMBER 31, 1961
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1962-02-01
Research and development progress is reported on analytical instrumentation, dissolver-solution analyses, special research problems, reactor projects analyses, x-ray and spectrochemical analyses, mass spectrometry, optical and electron microscopy, radiochemical analyses, nuclear analyses, inorganic preparations, organic preparations, ionic analyses, infrared spectral studies, anodization of sector coils for the Analog II Cyclotron, quality control, process analyses, and the Thermal Breeder Reactor Projects Analytical Chemistry Laboratory. (M.C.G.)
NASA Technical Reports Server (NTRS)
Hess, R. A.
1976-01-01
Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.
Non-conventional applications of a noninvasive portable X-ray diffraction/fluorescence instrument
NASA Astrophysics Data System (ADS)
Chiari, Giacomo; Sarrazin, Philippe; Heginbotham, Arlen
2016-11-01
Noninvasive techniques have become widespread in the cultural heritage analytical domain. The popular handheld X-ray fluorescence (XRF) devices give the elemental composition of all the layers that X-rays can penetrate, but no information on how atoms are bound together or at which depth they are located. A noninvasive portable X-ray powder diffraction/X-ray fluorescence (XRD/XRF) device may offer a solution to these limitations, since it can provide information on the composition of crystalline materials. This paper introduces applications of XRD beyond simple phase recognition. The two fundamental requirements for XRD are: (1) the crystallites should be randomly oriented, to ensure proper intensity for all the diffraction peaks, and (2) the material should be positioned exactly in the focal plane of the instrument, respecting its geometry, as any displacement of the sample would result in 2θ shifts of the diffraction peaks. In conventional XRD, the sample is ground and set on a properly positioned sample holder. Using a noninvasive portable instrument, these two requirements are seldom fulfilled. The position, size and orientation of a given crystallite within a layered structure depend on the object itself. An equation relating displacement (distance from the focal plane) to peak shift (angular difference in 2θ from the standard value) is derived and used to determine the depth at which a given substance is located. The quantitative composition of two binary Cu/Zn alloys, simultaneously present, was determined by measuring the cell volume and using Vegard's law. The analysis of the whole object gives information on the texture and possible preferred orientations of the crystallites, which influence the peak intensity. This allows for the distinction between clad and electroplated daguerreotypes in the case of silver, and between ancient and modern gilding for gold. Analyses of cross sections can be carried out successfully. Finally, beeswax, used in
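The two relations the abstract leans on can be sketched numerically. The displacement formula below is the standard Bragg-Brentano specimen-displacement relation, Δ(2θ) ≈ −2·s·cos θ / R, and the composition estimate is a plain Vegard-law linear interpolation. The goniometer radius, peak position, and end-member lattice parameters are illustrative assumptions, not values from the paper.

```python
import math

def displacement_from_shift(delta_two_theta_deg, theta_deg, radius_mm):
    """Invert the Bragg-Brentano displacement relation
    delta(2theta) = -2 * s * cos(theta) / R   (angles in radians)
    to recover the specimen displacement s from an observed peak shift."""
    d2t = math.radians(delta_two_theta_deg)
    return -d2t * radius_mm / (2.0 * math.cos(math.radians(theta_deg)))

def vegard_fraction(a_measured, a_end_a, a_end_b):
    """Mole fraction of end member B assuming a linear (Vegard) dependence
    of the lattice parameter on composition."""
    return (a_measured - a_end_a) / (a_end_b - a_end_a)

# Illustrative numbers: a -0.05 deg shift of a peak at 2theta = 40 deg
# on a hypothetical device with a 100 mm working radius
s_mm = displacement_from_shift(-0.05, 20.0, 100.0)

# Hypothetical alpha-brass: interpolate between Cu (a = 3.615 A) and a
# notional fcc-Zn end member (a = 3.93 A) -- both values are assumptions
x_zn = vegard_fraction(3.68, 3.615, 3.93)
```

The sign convention for the displacement (whether positive s means above or below the focal plane) depends on the instrument geometry, so it would have to be calibrated against the actual device.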
NASA Technical Reports Server (NTRS)
Oglebay, J. C.
1977-01-01
A thermal analytic model for a 30-cm engineering model mercury-ion thruster was developed and calibrated using experimental results from tests of a pre-engineering model 30-cm thruster. A series of tests, performed later, simulated a wide range of thermal environments on an operating 30-cm engineering model thruster, which was instrumented to measure the temperature distribution within it. The modified analytic model is described, and analytic and experimental results are compared for various operating conditions. Based on the comparisons, it is concluded that the analytic model can be used as a preliminary design tool to predict thruster steady-state temperature distributions for stage and mission studies and to define the thermal interface between the thruster and other elements of a spacecraft.
This paper employs analytical and numerical general equilibrium models to examine the significance of pre-existing factor taxes for the costs of pollution reduction under a wide range of environmental policy instruments. Pre-existing taxes imply significantly ...
Computational and analytical methods in nonlinear fluid dynamics
NASA Astrophysics Data System (ADS)
Walker, James
1993-09-01
The central focus of the program was on the application and development of modern analytical and computational methods to the solution of nonlinear problems in fluid dynamics and reactive gas dynamics. The research was carried out within the Division of Engineering Mathematics in the Department of Mechanical Engineering and Mechanics and principally involved Professors P.A. Blythe, E. Varley and J.D.A. Walker. In addition, the program involved various international collaborations. Professor Blythe completed work on reactive gas dynamics with Professor D. Crighton FRS of Cambridge University in the United Kingdom. Professor Walker and his students carried out joint work with Professor F.T. Smith, of University College London, on various problems in unsteady flow and turbulent boundary layers.
Comparing particle-size distributions in modern and ancient sand-bed rivers
NASA Astrophysics Data System (ADS)
Hajek, E. A.; Lynds, R. M.; Huzurbazar, S. V.
2011-12-01
Particle-size distributions yield valuable insight into processes controlling sediment supply, transport, and deposition in sedimentary systems. This is especially true in ancient deposits, where effects of changing boundary conditions and autogenic processes may be detected from deposited sediment. In order to improve interpretations in ancient deposits and constrain uncertainty associated with new methods for paleomorphodynamic reconstructions in ancient fluvial systems, we compare particle-size distributions in three active sand-bed rivers in central Nebraska (USA) to grain-size distributions from ancient sandy fluvial deposits. Within the modern rivers studied, particle-size distributions of active-layer, suspended-load, and slackwater deposits show consistent relationships despite some morphological and sediment-supply differences between the rivers. In particular, there is substantial and consistent overlap between bed-material and suspended-load distributions, and the coarsest material found in slackwater deposits is comparable to the coarse fraction of suspended-sediment samples. Proxy bed-load and slackwater-deposit samples from the Kayenta Formation (Lower Jurassic, Utah/Colorado, USA) show overlap similar to that seen in the modern rivers, suggesting that these deposits may be sampled for paleomorphodynamic reconstructions, including paleoslope estimation. We also compare grain-size distributions of channel, floodplain, and proximal-overbank deposits in the Willwood (Paleocene/Eocene, Bighorn Basin, Wyoming, USA), Wasatch (Paleocene/Eocene, Piceance Creek Basin, Colorado, USA), and Ferris (Cretaceous/Paleocene, Hanna Basin, Wyoming, USA) formations. Grain-size characteristics in these deposits reflect how suspended- and bed-load sediment is distributed across the floodplain during channel avulsion events. In order to constrain uncertainty inherent in such estimates, we evaluate uncertainty associated with sample collection, preparation, analytical
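One generic way to quantify the overlap the authors describe between bed-material, suspended-load, and slackwater grain-size samples is a two-sample Kolmogorov-Smirnov statistic on the raw size measurements. This is an illustrative sketch of distribution comparison in general, not necessarily the statistical treatment used in the study.

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical gap
    between the two empirical cumulative distribution functions."""
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# Hypothetical grain sizes (mm): heavily overlapping samples give a small
# statistic, disjoint samples give a statistic near 1
bed_load = [0.25, 0.30, 0.35, 0.40, 0.50]
suspended = [0.20, 0.28, 0.33, 0.38, 0.45]
print(ks_statistic(bed_load, suspended))
```

A small statistic (relative to the critical value for the sample sizes) is consistent with the kind of bed-material/suspended-load overlap reported for the Nebraska rivers.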
Semi-empirical and phenomenological instrument functions for the scanning tunneling microscope
NASA Astrophysics Data System (ADS)
Feuchtwang, T. E.; Cutler, P. H.; Notea, A.
1988-08-01
Recent progress in the development of a convenient algorithm for determining a quantitative local density of states (LDOS) of the sample from data measured in the STM is reviewed. It is argued that the sample LDOS strikes a good balance between the information content of a surface characteristic and the effort required to obtain it experimentally. Hence, procedures to determine the sample LDOS as directly, and as independently of any tip model, as possible are emphasized. The solution of the STM's "inverse" problem in terms of novel versions of the instrument (or Green) function technique is considered in preference to the well-known, more direct solutions. Two types of instrument functions are considered: approximations of the basic tip-instrument function obtained from the transfer Hamiltonian theory of the STM/STS, and phenomenological instrument functions devised as a systematic scheme for semi-empirical first-order corrections of "ideal" models. The instrument function, in this case, describes the corrections as the response of an independent component of the measuring apparatus inserted between the "ideal" instrument and the measured data. This linear response theory of measurement is reviewed and applied. A procedure for estimating the consistency of the model and the systematic errors due to the use of an approximate instrument function is presented. The independence of the instrument function techniques from explicit microscopic models of the tip is noted. The need for a semi-empirical, as opposed to strictly empirical or analytical, determination of the instrument function is discussed. The extension of the theory to the scanning tunneling spectrometer is noted, as is its use in a theory of resolution.
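The linear-response picture of measurement described above, where the measured data are the ideal signal passed through an instrument (Green) function, can be illustrated with a toy circular convolution and its regularized Fourier deconvolution. The signal and kernel below are made up purely for illustration and have nothing to do with the actual STM/STS formalism.

```python
import numpy as np

def apply_instrument(ideal, kernel):
    """Circular convolution: measured = instrument_function (*) ideal."""
    n = len(ideal)
    return np.real(np.fft.ifft(np.fft.fft(ideal) * np.fft.fft(kernel, n)))

def deconvolve(measured, kernel, eps=1e-9):
    """Tikhonov-regularized Fourier deconvolution: a first-order scheme for
    removing the instrument response from the measured data."""
    n = len(measured)
    K = np.fft.fft(kernel, n)
    M = np.fft.fft(measured)
    return np.real(np.fft.ifft(M * np.conj(K) / (np.abs(K) ** 2 + eps)))

n = 64
ideal = np.zeros(n)
ideal[20] = 1.0            # a sharp "true" feature in the sample signal
kernel = np.zeros(n)       # a 3-point smoothing response, centered at 0
kernel[[0, 1, -1]] = [0.6, 0.2, 0.2]

measured = apply_instrument(ideal, kernel)
recovered = deconvolve(measured, kernel)
```

The regularization constant `eps` plays the role of the consistency check mentioned in the abstract: for a poorly known or nearly singular instrument function, the recoverable detail is limited and `eps` controls how aggressively the inversion is attempted.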
Current advances in synchrotron radiation instrumentation for MX experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owen, Robin L.; Juanhuix, Jordi; Fuchs, Martin
2016-07-01
Following pioneering work 40 years ago, synchrotron beamlines dedicated to macromolecular crystallography (MX) have improved in almost every aspect as instrumentation has evolved. Beam sizes and crystal dimensions are now on the single-micron scale, while data can be collected from proteins with molecular weights over 10 MDa and from crystals with unit-cell dimensions over 1000 Å. Furthermore, it is possible to collect a complete data set in seconds and obtain the resulting structure in minutes. The impact of MX synchrotron beamlines and their evolution is reflected in their scientific output, and MX is now the method of choice for a variety of aims, from ligand binding to structure determination of membrane proteins, viruses and ribosomes, resulting in a much deeper understanding of the machinery of life. A main driving force of beamline evolution has been advances in almost every aspect of the instrumentation comprising a synchrotron beamline. In this review we aim to provide an overview of the current status of instrumentation at modern MX experiments. The most critical optical components are discussed, as are aspects of endstation design, sample delivery, visualisation and positioning, the sample environment, beam shaping, detectors and data acquisition and processing.
iPod touch-assisted instrumentation of the spine: a technical report.
Jost, Gregory F; Bisson, Erica F; Schmidt, Meic H
2013-12-01
Instrumentation of the spine depends on choosing the correct insertion angles to implant screws. Although modern image guidance facilitates precise instrumentation of the spine, the equipment is costly and availability is limited. Although most surgeons use lateral fluoroscopy to guide instrumentation in the sagittal plane, the lateromedial angulation is often chosen by estimation. To overcome the associated uncertainty, iPod touch-based applications for measuring angles can be used to assist with screw implantation. To evaluate the use of the iPod touch to adjust instruments to the optimal axial insertion angle for placement of pedicle screws in the lumbar spine, twenty lumbar pedicle screws in 5 consecutive patients were implanted using the iPod touch. The lateromedial angulation was measured on preoperative images and reproduced in the operative field with the iPod touch. The instruments to implant the screws were aligned with the side of the iPod for screw insertion. Actual screw angles were remeasured on postoperative imaging. We collected demographic, clinical, and operative data for each patient. In 16 of 20 screws, the accuracy of implantation was within 3 degrees of the ideal trajectory. The 4 screws with an angle mismatch of 7 to 13 degrees were all implanted at the caudal end of the exposure, where maintaining the planned angulation was impeded by strong muscles pushing medially. iPod touch-assisted instrumentation of the spine is a very simple technique which, in combination with lateral fluoroscopy, may guide placement of pedicle screws in the lumbar spine.
ERIC Educational Resources Information Center
JACKSON, R. GRAHAM
Choices and issues in selecting materials for modernization of school buildings are discussed in this report. Background information is introduced in terms of reasons for abandonment, the causes and effects of school building obsolescence, and problems in the modernization process. Interior partitions are discussed in terms of building materials,…
An alpha particle instrument with alpha, proton, and X-ray modes for planetary chemical analyses
NASA Technical Reports Server (NTRS)
Economou, T. E.; Turkevich, A. L.
1976-01-01
The interaction of alpha particles with matter is employed in a compact instrument that could provide rather complete in-situ chemical analyses of surfaces and thin atmospheres of extraterrestrial bodies. The instrument is a miniaturized and improved version of the Surveyor lunar instrument. The backscattering of alpha particles and (alpha, p) reactions provide analytical data on the light elements (carbon-iron). An X-ray mode that detects the photons produced by the alpha sources provides sensitivity and resolution for the chemical elements heavier than about silicon. The X-rays are detected by semiconductor detectors having a resolution between 150 and 250 eV at 5.9 keV. Such an instrument can identify and determine with good accuracy 99 percent of the atoms (except hydrogen) in rocks. For many trace elements, the detecting sensitivity is a few ppm. Auxiliary sources could be used to enhance the sensitivities for elements of special interest. The instrument could probably withstand the acceleration involved in semi-hard landings.
NASA Astrophysics Data System (ADS)
Pan, Jun-Yang; Xie, Yi
2015-02-01
With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends some previous works by including the effects of propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model might be used in an onboard computer because of its limited capability to perform calculations. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station and the spacecraft dominate the outcomes of the relativistic corrections to the model.
NASA Astrophysics Data System (ADS)
Schulte, Wolfgang; Thiele, Hans; Hofmann, Peter; Baglioni, Pietro
The ExoMars program will search for past and present life on Mars. ExoMars will address important scientific goals and demonstrate key in-situ enabling technologies. Among such technologies are the acquisition, preparation, distribution and analysis of samples from Mars surface rocks and from the subsurface. The 2018 mission will land an ESA rover on Mars which carries a sample preparation and distribution system (SPDS) and a suite of analytical instruments, the Pasteur Payload with its Analytical Laboratory Drawer (ALD). Kayser-Threde GmbH (Germany) will be responsible for the SPDS as a subcontractor under the mission prime Thales Alenia Space. The SPDS comprises a number of complex mechanisms and mechanical devices designed to transport drill core samples within the rover analytical laboratory, to crush them to powder with a fine grain size, to portion discrete amounts of powdered sample material, to distribute and fill the material into sample containers and to prepare flat sample surfaces for scientific analysis. Breadboards of the crushing mechanism, the dosing mechanism and a distribution carousel with sample containers and a powder sample surface flattening mechanism were built and tested. Kayser-Threde, as a member of the Spanish-led ExoMars Raman Instrument team, is also responsible for development of the Raman optical head, which will be mounted inside ALD and will inspect the crushed samples when they are presented to the instrument by the distribution carousel. Within this activity, which is performed under contract with the Institute of Physical Chemistry of the University of Jena (Germany) and funded by the German DLR, Kayser-Threde can demonstrate Raman measurements with the optical head and a COTS laser and spectrometer and thus simulate the full Raman instrument optical path. An autofocus system with actuator and feedback optics is also part of this activity, which allows focusing the 50 µm Raman spot on the surface of the powdered sample
Drugs as instruments: a new framework for non-addictive psychoactive drug use.
Müller, Christian P; Schumann, Gunter
2011-12-01
Most people who are regular consumers of psychoactive drugs are not drug addicts, nor will they ever become addicts. In neurobiological theories, non-addictive drug consumption is acknowledged only as a "necessary" prerequisite for addiction, but not as a stable and widespread behavior in its own right. This target article proposes a new neurobiological framework theory for non-addictive psychoactive drug consumption, introducing the concept of "drug instrumentalization." Psychoactive drugs are consumed for their effects on mental states. Humans are able to learn that mental states can be changed on purpose by drugs, in order to facilitate other, non-drug-related behaviors. We discuss specific "instrumentalization goals" and outline neurobiological mechanisms of how major classes of psychoactive drugs change mental states and serve non-drug-related behaviors. We argue that drug instrumentalization behavior may provide a functional adaptation to modern environments based on a historical selection for learning mechanisms that allow the dynamic modification of consummatory behavior. It is assumed that in order to effectively instrumentalize psychoactive drugs, the establishment of and retrieval from a drug memory is required. Here, we propose a new classification of different drug memory subtypes and discuss how they interact during drug instrumentalization learning and retrieval. Understanding the everyday utility and the learning mechanisms of non-addictive psychotropic drug use may help to prevent abuse and the transition to drug addiction in the future.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm
2016-01-01
The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.
Behavior of healthcare workers after injuries from sharp instruments.
Adib-Hajbaghery, Mohsen; Lotfi, Mohammad Sajjad
2013-09-01
Injuries with sharps are common occupational hazards for healthcare workers. Such injuries predispose the staff to dangerous infections such as hepatitis B, hepatitis C and HIV. The present study was conducted to investigate the behaviors of healthcare workers in Kashan healthcare centers after needle sticks and injuries with sharps in 2012. A cross-sectional study was conducted on 298 healthcare workers of medical centers governed by Kashan University of Medical Sciences. A questionnaire was used in this study. The first part included questions about demographic characteristics. The second part of the questionnaire consisted of 16 items related to sharp instrument injuries. For data analysis, descriptive and analytical statistics (chi-square, ANOVA and Pearson correlation coefficient) were computed using SPSS version 16.0. Of the 298 healthcare workers, 114 (38.3%) had a history of injury from needles and sharp instruments in the last six months. Most needle stick and sharp instrument injuries had occurred among operating room nurses and midwives; 32.5% of injuries from sharp instruments occurred in the morning shift. Needles were responsible for 46.5% of injuries. The most common actions taken after needle stick injuries were compression (27.2%) and washing the area with soap and water (15.8%). Only 44.6% of the injured personnel pursued follow-up measures after a needle stick or sharp instrument injury. More than half of the healthcare workers with a needle stick or sharp instrument injury had refused follow-up for various reasons. The authorities should implement education programs along with protocols to be followed after needle stick or sharp instrument injuries.
Learning about human population history from ancient and modern genomes.
Stoneking, Mark; Krause, Johannes
2011-08-18
Genome-wide data, both from SNP arrays and from complete genome sequencing, are becoming increasingly abundant and are now even available from extinct hominins. These data are providing new insights into population history; in particular, when combined with model-based analytical approaches, genome-wide data allow direct testing of hypotheses about population history. For example, genome-wide data from both contemporary populations and extinct hominins strongly support a single dispersal of modern humans from Africa, followed by two archaic admixture events: one with Neanderthals somewhere outside Africa and a second with Denisovans that (so far) has only been detected in New Guinea. These new developments promise to reveal new stories about human population history, without having to resort to storytelling.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
NASA Astrophysics Data System (ADS)
Sorensen, Ira Joseph
A primary objective of the effort reported here is to develop a radiometric instrument modeling environment to provide complete end-to-end numerical models of radiometric instruments, integrating the optical, electro-thermal, and electronic systems. The modeling environment consists of a Monte Carlo ray-trace (MCRT) model of the optical system coupled to a transient, three-dimensional finite-difference electrothermal model of the detector assembly with an analytic model of the signal-conditioning circuitry. The environment provides a complete simulation of the dynamic optical and electrothermal behavior of the instrument. The modeling environment is used to create an end-to-end model of the CERES scanning radiometer, and its performance is compared to the performance of an operational CERES total channel as a benchmark. A further objective of this effort is to formulate an efficient design environment for radiometric instruments. To this end, the modeling environment is then combined with evolutionary search algorithms known as genetic algorithms (GAs) to develop a methodology for optimal instrument design using high-level radiometric instrument models. GAs are applied to the design of the optical system and detector system separately, and to both as an aggregate function, with positive results.
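As a sketch of how a genetic algorithm can drive a high-level design search like the one described, here is a minimal real-valued GA minimizing a toy one-dimensional cost function. The selection, crossover, and mutation choices are generic textbook illustrations, not the CERES design procedure, and the cost function is a stand-in for an expensive instrument-model evaluation.

```python
import random

def genetic_minimize(cost, bounds, pop=30, gens=60, seed=0):
    """Minimize cost(x) over a scalar interval with a simple elitist GA."""
    rng = random.Random(seed)
    lo, hi = bounds
    popn = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=cost)
        parents = popn[:pop // 2]                 # truncation selection (elitist)
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)                 # arithmetic crossover
            child += rng.gauss(0.0, 0.1 * (hi - lo))  # Gaussian mutation
            children.append(min(hi, max(lo, child)))  # clamp to bounds
        popn = parents + children
    return min(popn, key=cost)

# Toy stand-in for an instrument figure of merit: best design at x = 3
best = genetic_minimize(lambda x: (x - 3.0) ** 2, (0.0, 10.0))
```

In a real instrument-design setting each candidate would be a vector of optical and detector parameters, and `cost` would invoke the end-to-end instrument model, which is exactly why a derivative-free population search is attractive there.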
Ab initio simulation of diffractometer instrumental function for high-resolution X-ray diffraction
Mikhalychev, Alexander; Benediktovitch, Andrei; Ulyanenkova, Tatjana; Ulyanenkov, Alex
2015-01-01
Modeling of the X-ray diffractometer instrumental function for a given optics configuration is important both for planning experiments and for the analysis of measured data. A fast and universal method for instrumental function simulation, suitable for fully automated computer realization and describing both coplanar and noncoplanar measurement geometries for any combination of X-ray optical elements, is proposed. The method can be identified as semi-analytical backward ray tracing and is based on the calculation of a detected signal as an integral of X-ray intensities for all the rays reaching the detector. The high speed of calculation is provided by the expressions for analytical integration over the spatial coordinates that describe the detection point. Consideration of the three-dimensional propagation of rays without restriction to the diffraction plane provides the applicability of the method for noncoplanar geometry and the accuracy for characterization of the signal from a two-dimensional detector. The correctness of the simulation algorithm is checked in the following two ways: by verifying the consistency of the calculated data with the patterns expected for certain simple limiting cases and by comparing measured reciprocal-space maps with the corresponding maps simulated by the proposed method for the same diffractometer configurations. Both kinds of tests demonstrate the agreement of the simulated instrumental function shape with the measured data. PMID:26089760
ERIC Educational Resources Information Center
Rasmussen, Casper Hvenegaard; Jochumsen, Henrik
2007-01-01
The public library is a product of modernity that follows in the wake of industrialization, urbanization, and popular movements, while at the same time the public library itself supports the building up and development of the modern. This article will examine the arrival of modernity and the prerequisites for the rise of public libraries, as well…
NASA Astrophysics Data System (ADS)
Van de Voorde, Lien; Vekemans, Bart; Verhaeven, Eddy; Tack, Pieter; De Wolf, Robin; Garrevoet, Jan; Vandenabeele, Peter; Vincze, Laszlo
2015-08-01
A new, commercially available, mobile system combining X-ray diffraction and X-ray fluorescence has been evaluated which enables both elemental analysis and phase identification simultaneously. The instrument makes use of a copper or molybdenum based miniature X-ray tube and a silicon PIN diode energy-dispersive detector to count the photons originating from the samples. The X-ray tube and detector are both mounted on an X-ray diffraction protractor in a Bragg-Brentano θ:θ geometry. The mobile instrument is one of the lightest and most compact instruments of its kind (3.5 kg) and it is thus very useful for in situ purposes such as the direct (non-destructive) analysis of cultural heritage objects which need to be analyzed on site without any displacement. The supplied software allows both the operation of the instrument for data collection and in-depth data analysis using the International Centre for Diffraction Data database. This paper focuses on the characterization of the instrument, combined with a case study on pigment identification and an illustrative example for the analysis of lead alloyed printing letters. The results show that this commercially available light-weight instrument is able to identify non-destructively the main crystalline phases present in a variety of samples, with a high degree of flexibility regarding sample size and position.
[Jan Evangelista Purkyně and his instruments for microscopic research].
Chvátal, A
2016-10-01
The results obtained by the famous 19th-century Czech scientist Jan Evangelista Purkyně during his studies of the microscopic structure of animal and human tissues have already been sufficiently described in a variety of older and newer publications. The present paper provides an overview of the microscopes and other tools and instruments that Purkyně and his assistants and pupils used for histological research and teaching, and in whose development they were directly involved. A brief overview of the development of the cutting engines suggests that the first microtome, from which all modern sliding microtomes are derived, originated under the supervision of Purkyně at the Institute of Physiology in Wroclaw. Purkyně and his assistants thus not only obtained priority results in the field of the structure of animal and human tissues, but also substantially contributed to the development of instruments and equipment for their study, which is often forgotten today.
Chemical Pollution from Combustion of Modern Spacecraft Materials
NASA Technical Reports Server (NTRS)
Mudgett, Paul D.
2013-01-01
Fire is one of the most critical contingencies in spacecraft and any closed environment, including submarines. Currently, NASA uses particle-based technology to detect fires and hand-held combustion product monitors to track the clean-up and restoration of a habitable cabin environment after the fire is extinguished. In the future, chemical detection could augment particle detection to eliminate frequent nuisance false alarms triggered by dust. In the interest of understanding combustion from both particulate and chemical generation, NASA Centers have been collaborating on combustion studies at White Sands Test Facility using modern spacecraft materials as fuels, and both old and new technology to measure the chemical and particulate products of combustion. The tests attempted to study smoldering pyrolysis at relatively low temperatures without ignition to flaming conditions. This paper summarizes the results of two 1-week-long tests undertaken in 2012, focusing on the chemical products of combustion. The results confirm the key chemical products are carbon monoxide (CO), hydrogen cyanide (HCN), hydrogen fluoride (HF) and hydrogen chloride (HCl), whose concentrations depend on the particular material and test conditions. For example, modern aerospace wire insulation produces significant concentrations of HF, which persists in the test chamber longer than anticipated. These compounds are the analytical targets identified for the development of new tunable diode laser based hand-held monitors, to replace the aging electrochemical sensor based devices currently in use on the International Space Station.
Update on Modern Management of Pheochromocytoma and Paraganglioma.
Lenders, Jacques W M; Eisenhofer, Graeme
2017-06-01
Despite all technical progress in modern diagnostic methods and treatment modalities for pheochromocytoma/paraganglioma, early consideration of the presence of these tumors remains the pivotal link towards the best possible outcome for patients. A timely diagnosis and proper treatment can prevent the wide variety of potentially catastrophic cardiovascular complications. Modern biochemical testing should include tests that offer the best available diagnostic performance: measurements of metanephrines and 3-methoxytyramine in plasma or urine. To minimize false-positive test results, particular attention should be paid to pre-analytical sampling conditions. In addition to anatomical imaging by computed tomography (CT) or magnetic resonance imaging, promising new functional imaging modalities of positron emission tomography/CT using somatostatin analogues such as ⁶⁸Ga-DOTATATE (⁶⁸Ga-labeled DOTA(0)-Tyr(3)-octreotate) will probably replace ¹²³I-MIBG (iodine-123-metaiodobenzylguanidine) in the near future. As nearly half of all pheochromocytoma patients harbor a mutation in one of the 14 tumor susceptibility genes, genetic testing and counseling should at least be considered in all patients with a proven tumor. Post-surgical annual follow-up of patients by measurements of plasma or urinary metanephrines should last for at least 10 years for timely detection of recurrent or metastatic disease. Patients with a high risk of recurrence or metastatic disease (paraganglioma, young age, multiple or large tumors, genetic background) should be followed up lifelong. Copyright © 2017 Korean Endocrine Society.
NASA Astrophysics Data System (ADS)
Javier Romualdez, Luis
Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work centers on the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT, coupled with its observational efficiency, image quality, and accessibility, rival state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision-pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight.
Development of flying qualities criteria for single pilot instrument flight operations
NASA Technical Reports Server (NTRS)
Bar-Gill, A.; Nixon, W. B.; Miller, G. E.
1982-01-01
Flying qualities criteria for Single Pilot Instrument Flight Rule (SPIFR) operations were investigated. The ARA aircraft was modified and adapted for SPIFR operations. Aircraft configurations to be flight-tested were chosen and matched on the ARA in-flight simulator, implementing modern control theory algorithms. Mission planning and experimental matrix design were completed. Microprocessor software for the onboard data acquisition system was debugged and flight-tested. Flight-path reconstruction procedure and the associated FORTRAN program were developed. Algorithms associated with the statistical analysis of flight test results and the SPIFR flying qualities criteria deduction are discussed.
Post-communism: postmodernity or modernity revisited?
Ray, L
1997-12-01
Coinciding with the popularity of postmodern theory, the fall of communism appeared to offer further evidence of the exhaustion of modernity. Such analysis is grounded in a view that the Soviet system was the epitome of modernity. An alternative approach regards post-communism as opening new terrains of struggle for modernity. Thus Habermas and others suggest that post-communist societies are rejoining the trajectory of western modernity whose problems they now recapitulate. This alternative view implies that Soviet systems were something other than 'modern', although their nature is not always clearly defined. However, even if post-communist societies do encounter problems of modernity, they do so in new circumstances where modernist notions of social development have become problematic. This article argues that, contrary to those who regard modernization or postmodernization as irresistible trends, core post-communist societies are likely to develop along an alternative path to that of western modernity. This is tentatively described as 'neo-mercantilist'.
Policies, economic incentives and the adoption of modern irrigation technology in China
NASA Astrophysics Data System (ADS)
Cremades, R.; Wang, J.; Morris, J.
2015-07-01
The challenges China faces in terms of water availability in the agricultural sector are exacerbated by the sector's low irrigation efficiency. To increase irrigation efficiency, promoting modern irrigation technology has been emphasized by policy makers in the country. The overall goal of this paper is to understand the effect of governmental support and economic incentives on the adoption of modern irrigation technology in China, with a focus on household-based irrigation technology and community-based irrigation technology. Based on a unique data set collected at household and village levels from seven provinces, the results indicated that household-based irrigation technology has become noticeable in almost every Chinese village. In contrast, only about half of Chinese villages have adopted community-based irrigation technology. Despite the relatively high adoption level of household-based irrigation technology at the village level, its actual adoption in crop sown areas was not high, and was even lower for community-based irrigation technology. The econometric analysis revealed that governmental support instruments such as subsidies and extension services have played an important role in promoting the adoption of modern irrigation technology. Strikingly, the present irrigation pricing policy has played a significant but contradictory role in promoting the adoption of different types of modern irrigation technology. Irrigation pricing showed a positive impact on household-based irrigation technology and a negative impact on community-based irrigation technology, possibly related to a substitution effect; that is, a higher rate of adoption of household-based irrigation technology leads to lower incentives for investment in community-based irrigation technology. The paper concludes with a discussion of some policy implications.
NASA Astrophysics Data System (ADS)
Pavolotsky, Alexey
2018-01-01
Modern and future heterodyne radio astronomy instrumentation critically depends on the availability of advanced fabrication technologies and components. In Part 1 of the poster, we present the thin film fabrication process for SIS mixer receivers, utilizing either AlOx or AlN barrier superconducting tunnel junctions, developed and supported by GARD. A summary of the process design rules is presented. It is well known that the performance of waveguide mixer components critically depends on the accuracy of their geometrical dimensions. At GARD, all critical mechanical parts are 3D-mapped with sub-μm accuracy. Further progress in heterodyne instrumentation requires new efficient and compact sources of LO signal. We present an SIS-based frequency multiplier, which could become a new option for the LO source. Future radio astronomy THz receivers will need waveguide components which, due to their tiny dimensions, cannot be fabricated by traditional mechanical machining. We present an alternative micromachining technique for fabricating waveguide components for bands up to 5 THz and probably beyond.
NASA Technical Reports Server (NTRS)
Woronowicz, Michael; Abel, Joshua; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin;
2014-01-01
The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lb-mass/yr. to about 1 lb-mass/day. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.
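The effusive-flow regime the abstract mentions has a standard kinetic-theory form: the number flux of molecules escaping through a thin orifice into vacuum is P/√(2πmkT) per unit area. A hedged sketch for an NH3 leak (the orifice size, pressure, and temperature below are illustrative assumptions, not ISS values):

```python
import math

K_B = 1.380649e-23                   # Boltzmann constant, J/K
NH3_MASS_KG = 17.031 * 1.66054e-27   # mass of one NH3 molecule, kg

def effusive_mass_flow(pressure_pa: float, area_m2: float, temp_k: float,
                       molecule_mass_kg: float = NH3_MASS_KG) -> float:
    """Mass flow (kg/s) of gas effusing through a thin orifice into vacuum.

    Kinetic theory: number flux per unit area = P / sqrt(2*pi*m*k*T).
    """
    number_flux = pressure_pa / math.sqrt(
        2.0 * math.pi * molecule_mass_kg * K_B * temp_k)
    return number_flux * area_m2 * molecule_mass_kg

# Illustrative: NH3 at 1e5 Pa behind a 1-micrometer-diameter pinhole at 300 K
area = math.pi * (0.5e-6) ** 2
mdot = effusive_mass_flow(1e5, area, 300.0)
```

Because the flux is linear in upstream pressure, a partial-pressure reading from the RGA constrains the product of leak area and coolant-loop pressure rather than either one alone.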
Psychometric Properties and Normative Data for a Swedish Version of the Modern Health Worries Scale.
Palmquist, Eva; Petrie, Keith J; Nordin, Steven
2017-02-01
The modern health worries (MHW) scale was developed to assess individuals' worries about aspects of modernity and technology affecting personal health. The aim of this study was to psychometrically evaluate a Swedish version of the MHW scale and to provide Swedish normative data. Data were collected as part of the Västerbotten Environmental Health Study, which has a random sample of 3406 Swedish adults (18-79 years). The Swedish version of the MHW scale showed excellent internal consistency and satisfactory convergent validity. A four-factor structure consistent with the original version was confirmed. The model showed invariance across age and sex. A slightly positively skewed and platykurtic distribution was found. Normative data for the general population and for combinations of specific age groups (young, middle aged, and elderly) and sex are presented. The psychometric properties of the Swedish version of the MHW scale suggest that use of this instrument is appropriate for assessing worries about modernity in Swedish-speaking and similar populations. The scale now has the advantage of good normative data being available. MHW may hold importance for understanding and predicting the development of functional disorders, such as idiopathic environmental intolerance and other medically unexplained conditions.
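The internal-consistency claim above is conventionally quantified with Cronbach's alpha, α = k/(k−1)·(1 − Σσᵢ²/σ²_total). A minimal sketch on made-up item scores (not the study's data, and not necessarily the exact estimator the authors used):

```python
def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for a list of item-score columns (one list per item).

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores),
    computed here with population variances.
    """
    k = len(items)       # number of items
    n = len(items[0])    # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1.0 - sum(var(item) for item in items) / var(totals))

# Hypothetical responses: 3 items answered by 4 respondents
scores = [[3, 4, 3, 5], [2, 4, 3, 5], [3, 5, 3, 4]]
alpha = cronbach_alpha(scores)
```

Values near 1 indicate that the items move together, which is what "excellent internal consistency" reports for the Swedish MHW scale.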
Predesign study for a modern 4-bladed rotor for the NASA rotor systems research aircraft
NASA Technical Reports Server (NTRS)
Bishop, H. E.; Burkam, J. E.; Heminway, R. C.; Keys, C. N.; Smith, K. E.; Smith, J. H.; Staley, J. A.
1981-01-01
Trade-off study results and the rationale for the final selection of an existing modern four-bladed rotor system that can be adapted for installation on the Rotor Systems Research Aircraft (RSRA) are reported. The results of the detailed integration studies, parameter change studies, and instrumentation studies, together with the recommended plan for development and qualification of the rotor system, are also given. Its parameter variants, integration on the RSRA, and support of ground and flight test programs are also discussed.
Perspectives on making big data analytics work for oncology.
El Naqa, Issam
2016-12-01
Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5V hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic document) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from
2016-01-01
The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, which can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882
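The two-dimensional observation cited above can be stated compactly: rotating a divergence-free plane field pointwise by 90° yields a curl-free field, hence a potential. In symbols (a standard identity, not notation taken from the paper):

```latex
% R_\perp rotates plane vectors by 90 degrees
R_\perp = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad
\nabla \cdot \mathbf{v} = 0
\;\Longrightarrow\;
\nabla \times \bigl( R_\perp \mathbf{v} \bigr)
  = \partial_x v_1 + \partial_y v_2 = 0
\;\Longrightarrow\;
R_\perp \mathbf{v} = \nabla \varphi
\quad \text{(on a simply connected domain).}
```

The check is one line: for v = (v₁, v₂), R⊥v = (−v₂, v₁), and its scalar curl ∂ₓv₁ − ∂ᵧ(−v₂) equals the divergence of v.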
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next-generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness, while also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
PREFACE: 4th International Symposium on Instrumentation Science and Technology (ISIST'2006)
NASA Astrophysics Data System (ADS)
Jiubin, Tan
2006-10-01
On behalf of the International Program Committee of ISIST'2006 and the symposium coordinators, I would like to thank all the participants for their presence at the 4th International Symposium on Instrumentation Science and Technology (ISIST'2006), a platform for scientists, researchers and experts from different parts of the world to present their achievements and to exchange their views on ways and means to further develop modern instrumentation science and technology. In the present information age, instrumentation science and technology is playing a more and more important role, not only in the acquisition and conversion of information at the very beginning of the information transformation chain, but also in the transfer, manipulation and utilization of information. It provides an analysis and test means for bioengineering, medical engineering, life science, environmental engineering and micro/nanometer technology, and integrates these disciplines to form new subdivisions of their own. The major subject of the symposium is crossover and fusion between instrumentation science and technology and other sciences and technologies. ISIST'2006 received more than 800 full papers from 12 countries and regions, from which 300 papers were finally selected by the international program committee for inclusion in the proceedings of ISIST'2006, published in 2 volumes. The major topics include instrumentation basic theory and methodology, sensors and conversion technology, signal and image processing, instruments and systems, laser and optical fiber instrumentation, advanced optical instrumentation, optoelectronics instrumentation, MEMS, nanotechnology and instrumentation, biomedical and environmental instrumentation, automatic test and control. The International Symposium on Instrumentation Science and Technology (ISIST) is sponsored by ICMI, NSFC, CSM, and CIS, and organized by ICMI, HIT and IC-CSM, and held every two years. The 1st symposium was held in LuoYang, China in
DCMS: A data analytics and management system for molecular simulation.
Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni
Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments track a very large number of atoms with the aim of observing their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data access, management, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because of the lack of a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
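The core DCMS idea — storing trajectory data in a relational DBMS and expressing analytical queries declaratively in SQL — can be sketched with SQLite (the table layout, column names, and values below are illustrative assumptions, not the actual DCMS schema):

```python
import sqlite3

# In-memory database standing in for the MS data store
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE atoms (
        frame   INTEGER,   -- simulation time step
        atom_id INTEGER,
        x REAL, y REAL, z REAL
    )
""")
conn.executemany(
    "INSERT INTO atoms VALUES (?, ?, ?, ?, ?)",
    [(0, 1, 0.0, 0.0, 0.0),
     (0, 2, 1.0, 0.0, 0.0),
     (0, 3, 8.0, 8.0, 8.0)],
)

# Declarative analytical query: pairs of atoms within distance 2.0 in frame 0
pairs = conn.execute("""
    SELECT a.atom_id, b.atom_id
    FROM atoms a JOIN atoms b
      ON a.frame = b.frame AND a.atom_id < b.atom_id
    WHERE a.frame = 0
      AND (a.x-b.x)*(a.x-b.x) + (a.y-b.y)*(a.y-b.y) + (a.z-b.z)*(a.z-b.z) <= 4.0
""").fetchall()
```

Such self-join distance queries are exactly the compute-intensive analytical workload for which DCMS adds custom indexing inside the DBMS.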
An optical instrument to test pesticide residues in agricultural products
NASA Astrophysics Data System (ADS)
Qiu, Zhengjun; Zheng, Wenzhong; Fang, Hui; He, Yong
2005-10-01
Pesticides are indispensable in modern agricultural management; however, their excessive use threatens the ecological environment and people's health. This paper introduces an optical instrument to test for pesticide residues in agricultural products based on the inhibition rate of organophosphates against acetylcholinesterase (AChE). The instrument consists mainly of a solid-state light source with a 410 nm wavelength, a sampling container, an optical sensor, a temperature sensor, and an MCU-based data acquisition board. The light is transmitted through the liquid in the sampling container, and the absorptivity is determined by the amount of pesticide residue in the liquid. The paper covers the design of the optical testing system, the data acquisition and calibration of the optical sensor, and the design of the microcontroller-based electrical board. Tests were done to reveal the effect of temperature and reaction time on AChE, and to establish the relationship between the amounts of methamidophos and dichlorvos and AChE activity. The results showed that the absorption rate was related to the pesticide residues, and it could be concluded that the residues exceeded the normal level when the inhibition rate was over 50 percent. The instrument has potential application in vegetable markets.
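The pass/fail rule described above reduces to a simple enzyme-inhibition calculation from two absorbance readings. A minimal sketch (the 50% threshold comes from the abstract; the absorbance values are made up):

```python
def inhibition_rate(absorbance_control: float, absorbance_sample: float) -> float:
    """Percent inhibition of AChE, from absorbance without and with the sample.

    A lower sample absorbance means the pesticide inhibited the enzyme more.
    """
    return 100.0 * (absorbance_control - absorbance_sample) / absorbance_control

def exceeds_residue_limit(absorbance_control: float, absorbance_sample: float,
                          threshold_pct: float = 50.0) -> bool:
    """Apply the abstract's decision rule: flag the sample if inhibition > threshold."""
    return inhibition_rate(absorbance_control, absorbance_sample) > threshold_pct

# Hypothetical 410 nm readings: 0.80 without sample, 0.30 with sample
flagged = exceeds_residue_limit(0.80, 0.30)  # 62.5% inhibition -> flagged
```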
Stoebenau, Kirsten; Nair, Rama C; Rambeloson, Valérie; Rakotoarison, Paul Ghislain; Razafintsalama, Violette; Labonté, Ronald
2013-03-19
Ethnographic evidence suggests that transactional sex is sometimes motivated by youth's interest in the consumption of modern goods as much as it is by basic survival. There are very few quantitative studies that examine the association between young people's interest in the consumption of modern goods and their sexual behaviour. We examined this association in two regions and four residence zones of Madagascar: urban, peri-urban and rural Antananarivo, and urban Antsiranana. We expected risky sexual behaviour would be associated with interest in consuming modern goods or lifestyles; urban residence; and socio-cultural characteristics. We administered a population-based survey to 2,255 youth ages 15-24 in all four residence zones. Focus group discussions guided the survey instrument, which assessed socio-demographic and economic characteristics, consumption of modern goods, preferred activities and sexual behaviour. Our outcome measures included: multiple sexual partners in the last year (for men and women); and ever practicing transactional sex (for women). Overall, 7.3% of women and 30.7% of men reported having had multiple partners in the last year, and 5.9% of women reported ever practicing transactional sex. Bivariate results suggested that, for both men and women, having multiple partners was associated with perceptions concerning the importance of fashion and a series of activities associated with modern lifestyles. A subset of lifestyle characteristics remained significant in multivariate models. For transactional sex, bivariate results suggested perceptions around fashion, nightclub attendance, and getting to know a foreigner were key determinants, all of which remained significant in multivariate analysis. We found peri-urban residence more associated with transactional sex than urban residence; and ethnic origin was the strongest predictor of both outcomes for women. While we found indication of an association between sexual behaviour and interest in modern
Instrumental neutron activation analysis for studying size-fractionated aerosols
NASA Astrophysics Data System (ADS)
Salma, Imre; Zemplén-Papp, Éva
1999-10-01
Instrumental neutron activation analysis (INAA) was utilized for studying aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in an amount of about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute (k0) standardization method. The calibration procedures, application protocol and data evaluation process are described and discussed. They now make it possible to analyse a considerable number of samples while assuring the quality of the results. As a means of demonstrating the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be better than the uncertainty introduced by the sampling techniques and sample variability.
Semantic Interaction for Visual Analytics: Toward Coupling Cognition and Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander
2014-07-01
The dissertation discussed in this article [1] was written in the midst of an era of digitization. The world is becoming increasingly instrumented with sensors, monitoring, and other methods for generating data describing social, physical, and natural phenomena. Thus, data exist with the potential of being analyzed to uncover, or discover, the phenomena from which they were created. However, as the analytic models leveraged to analyze these data continue to increase in complexity and computational capability, how can visualizations and user interaction methodologies adapt and evolve to continue to foster discovery and sensemaking?
Experimental performance and acoustic investigation of modern, counterrotating blade concepts
NASA Technical Reports Server (NTRS)
Hoff, G. E.
1990-01-01
The aerodynamic, acoustic, and aeromechanical performance of counterrotating blade concepts were evaluated both theoretically and experimentally. Analytical methods development and design are addressed. Utilizing the analytical methods which evolved during the conduct of this work, aerodynamic and aeroacoustic predictions were developed, which were compared to NASA and GE wind tunnel test results. The detailed mechanical design and fabrication of five different composite shell/titanium spar counterrotating blade set configurations are presented. Design philosophy, analyses methods, and material geometry are addressed, as well as the influence of aerodynamics, aeromechanics, and aeroacoustics on the design procedures. Blade fabrication and quality control procedures are detailed; bench testing procedures and results of blade integrity verification are presented; and instrumentation associated with the bench testing also is identified. Additional hardware to support specialized testing is described, as are operating blade instrumentation and the associated stress limits. The five counterrotating blade concepts were scaled to a tip diameter of 2 feet, so they could be incorporated into MPS (model propulsion simulators). Aerodynamic and aeroacoustic performance testing was conducted in the NASA Lewis 8 x 6 supersonic and 9 x 15 V/STOL (vertical or short takeoff and landing) wind tunnels and in the GE freejet anechoic test chamber (Cell 41) to generate an experimental data base for these counterrotating blade designs. Test facility and MPS vehicle matrices are provided, and test procedures are presented. Effects on performance of rotor-to-rotor spacing, angle-of-attack, pylon proximity, blade number, reduced-diameter aft blades, and mismatched rotor speeds are addressed. Counterrotating blade and specialized aeromechanical hub stability test results are also furnished.
Multimedia Analysis plus Visual Analytics = Multimedia Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinchor, Nancy; Thomas, James J.; Wong, Pak C.
2010-10-01
Multimedia analysis has focused on images, video, and to some extent audio, and has made progress on single channels while largely excluding text. Visual analytics has focused on user interaction with data during the analytic process, plus the fundamental mathematics, and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is combining multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.
Analysis of commercial equipment and instrumentation for Spacelab payloads, volume 2
NASA Technical Reports Server (NTRS)
1974-01-01
Technical results are presented of a study to investigate analytically the feasibility of using commercially available laboratory equipment and instrumentation in the spacelab in support of various experiments. The feasibility is demonstrated by the breadth of application of commercial, airborne, and military equipment to experiment equipment requirements in the spacelab, and the cost effectiveness of utilizing this class of equipment instead of custom-built aerospace equipment typical of past designs. Equipment design and specifications are discussed.
Human factor engineering based design and modernization of control rooms with new I and C systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larraz, J.; Rejas, L.; Ortega, F.
2012-07-01
Instrumentation and Control (I and C) systems of the latest nuclear power plants are based on the use of digital technology, distributed control systems and the integration of information in data networks (Distributed Control and Instrumentation Systems). This has a repercussion on Control Rooms (CRs), where the operations and monitoring interfaces correspond to these systems. These technologies are also used in modernizing I and C systems in currently operating nuclear power plants. The new interfaces provide additional capabilities for operation and supervision, as well as a high degree of flexibility, versatility and reliability. An example of this is the implementation of solutions such as compact stations, high-level supervision screens, overview displays, computerized procedures, new operational support systems or intelligent alarm-processing systems in the modernized Man-Machine Interface (MMI). These changes in the MMI are accompanied by newly added software (SW) controls and new solutions in automation. Tecnatom has been leading various projects in this area for several years, both in Asian countries and in the United States, using in all cases international standards from which Tecnatom's own methodologies have been developed and optimized. The experience acquired in applying this methodology to the design of new control rooms is to a large extent applicable also to the modernization of current control rooms. An adequate design of the interface between the operator and the systems will facilitate safe operation, contribute to the prompt identification of problems and help in the distribution of tasks and communications between the different members of the operating shift. Based on Tecnatom's experience in the field, this article presents the methodological approach used as well as the most relevant aspects of this kind of project. (authors)
Analytic Solution of the Electromagnetic Eigenvalues Problem in a Cylindrical Resonator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Checchin, Mattia; Martinello, Martina
Resonant accelerating cavities are key components in modern particle accelerator facilities. These take advantage of electromagnetic fields resonating at microwave frequencies to accelerate charged particles. Particles gain finite energy at each passage through a cavity if in phase with the resonating field, reaching energies on the order of a TeV when a cascade of accelerating resonators is present. In order to understand how a resonant accelerating cavity transfers energy to charged particles, it is important to determine how the electromagnetic modes are excited in such resonators. In this paper we present a complete analytical calculation of the resonating fields for a simple cylindrical cavity.
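For reference, the closed-form result such a derivation arrives at can be quoted; this is the textbook expression for an ideal pillbox cavity, not reproduced from the paper itself. For a cylinder of radius R and length L, the TM resonant frequencies are

```latex
f^{\mathrm{TM}}_{mnp} \;=\; \frac{c}{2\pi}
\sqrt{\left(\frac{x_{mn}}{R}\right)^{2} + \left(\frac{p\pi}{L}\right)^{2}},
\qquad m,p \ge 0,\; n \ge 1,
```

where x_mn is the n-th zero of the Bessel function J_m. The accelerating TM_010 mode gives f_010 = c x_01 / (2πR), with x_01 ≈ 2.405, independent of cavity length.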
NASA Astrophysics Data System (ADS)
Schlager, Kenneth J.; Ruchti, Timothy L.
1995-04-01
TAMM (Transcutaneous Analyte Measuring Method) is a near-infrared spectroscopic technique for the noninvasive measurement of human blood chemistry. A near-infrared indium gallium arsenide (InGaAs) photodiode array spectrometer has been developed and tested on over 1,000 patients as part of an SBIR program sponsored by the Naval Medical Research and Development Command. Nine (9) blood analytes have been measured and evaluated during pre-clinical testing: sodium, chloride, calcium, potassium, bicarbonate, BUN, glucose, hematocrit and hemoglobin. A reflective rather than a transmissive approach to measurement has been taken to avoid variations resulting from skin color and sensor positioning. The current status of the instrumentation, neural network pattern recognition algorithms and test results will be discussed.
Steam thermolysis of tire shreds: modernization in afterburning of accompanying gas with waste steam
NASA Astrophysics Data System (ADS)
Kalitko, V. A.
2010-03-01
On the basis of experience with the commercial operation of tire-shred steam thermolysis at EnresTec Inc. (Taiwan), which produces high-grade commercial carbon, liquid pyrolysis fuel, and accompanying fuel gas by this method, we have proposed a number of engineering solutions, with supporting analytical calculations, for modernizing and intensifying the process by afterburning the accompanying gas with waste steam that condenses in the scrubber of the water-based cleaning of the afterburning products. The condensate is completely freed of organic pyrolysis impurities, eliminating the need to separate it from the liquid fuel, as is required in the existing process.
Towards a standardized method to assess straylight in earth observing optical instruments
NASA Astrophysics Data System (ADS)
Caron, J.; Taccola, M.; Bézy, J.-L.
2017-09-01
Straylight is a spurious effect that can seriously degrade the radiometric accuracy achieved by Earth observing optical instruments, as a result of the high contrast in the observed Earth radiance scenes and spectra. It is considered critical for several ESA missions such as Sentinel-5, FLEX and potential successors to CarbonSat. Although it is traditionally evaluated by Monte-Carlo simulations performed with commercial software packages (e.g. ASAP, Zemax, LightTools), semi-analytical approximate methods [1,2] have drawn some interest in recent years due to their faster computing time and the greater insight they provide into straylight mechanisms. They cannot replace numerical simulations, but may be more advantageous in contexts where many iterations are needed, for instance during the early phases of an instrument design.
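Semi-analytical straylight models of the kind this abstract refers to often treat the detected signal as the true scene convolved with an instrument straylight kernel. The 1-D toy below (kernel weights and radiance values are illustrative assumptions, not from the paper) shows how a bright scene region contaminates an adjacent dark pixel, which is exactly the high-contrast problem described:

```python
# Toy semi-analytical straylight model: detected = scene (*) kernel.
# Kernel and scene values are illustrative assumptions only.
def convolve(scene, kernel):
    """Discrete convolution, same length as scene, zero-padded edges."""
    half = len(kernel) // 2
    out = []
    for i in range(len(scene)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(scene):
                acc += w * scene[j]
        out.append(acc)
    return out

kernel = [0.01, 0.03, 0.92, 0.03, 0.01]   # 8% of energy scattered out of core
scene = [100.0, 100.0, 100.0, 1.0, 1.0]   # bright/dark radiance edge
detected = convolve(scene, kernel)
dark_error = detected[3] - scene[3]        # straylight added to the dark pixel
```

Even this mild kernel multiplies the dark pixel's signal several-fold, illustrating why straylight dominates the radiometric error budget at high scene contrast.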
Depot Maintenance Modernization
1988-02-01
process. However, this mobilization-planning process has not been implemented in the shipyards. Instead, NAVSEA has announced that "National economic... illustrate the type of modernization planning that occurs at the ALCs, we draw upon Oklahoma City's Technology Enhancement and Modernization of Plant... National Product - he concluded that the inflation-adjusted rate of return in the private sector was about 10 percent in 1965 (12 percent nominal rate
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
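The core idea of partial results can be sketched in a few lines: a long-running computation yields intermediate answers that a UI could poll and render. This is a minimal illustration of the paradigm, not the Progressive Insights system; the running-mean analytic, chunk size, and synthetic data are assumptions for the sketch.

```python
import random

def progressive_mean(data, chunk_size=1000):
    """Yield (fraction_done, running_mean) after each chunk, so a UI can
    render partial results instead of waiting for the full computation."""
    total, count = 0.0, 0
    for start in range(0, len(data), chunk_size):
        chunk = data[start:start + chunk_size]
        total += sum(chunk)
        count += len(chunk)
        yield count / len(data), total / count

random.seed(42)
data = [random.gauss(5.0, 1.0) for _ in range(10000)]
partials = list(progressive_mean(data))      # a UI would consume these lazily
final_fraction, final_mean = partials[-1]
```

A real progressive analytic would additionally accept steering input (e.g. reprioritizing which subspace's chunks are processed next) between yields.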
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications
Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.
2018-01-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
Initiating an Online Reputation Monitoring System with Open Source Analytics Tools
NASA Astrophysics Data System (ADS)
Shuhud, Mohd Ilias M.; Alwi, Najwa Hayaati Md; Halim, Azni Haslizan Abd
2018-05-01
Online reputation is an invaluable asset for modern organizations, as it can help business performance, especially sales and profit. However, if we are not aware of our reputation, it is difficult to maintain it. Social media analytics is a new tool that can provide online reputation monitoring in various ways, such as sentiment analysis. As a result, numerous large-scale organizations have implemented Online Reputation Monitoring (ORM) systems. However, this solution should not be exclusive to high-income organizations, as many organizations, regardless of size and type, are now online. This research proposes an affordable and reliable ORM system built from a combination of open source analytics tools, for both novice practitioners and academicians. We also evaluate its prediction accuracy: the system provides acceptable predictions (sixty percent accuracy), and its majority-polarity predictions tally with human annotation. The proposed system can help support business decisions with flexible monitoring strategies, especially for organizations that want to initiate and administer ORM themselves at low cost.
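The sentiment-analysis step at the heart of such an ORM pipeline can be illustrated with a minimal lexicon-based polarity scorer evaluated against human labels. This is a sketch of the general technique, not the system in the paper; the word lists, sample posts, and labels are all invented for the example.

```python
# Minimal lexicon-based sentiment scorer; lexicon and posts are
# illustrative assumptions, not the paper's tools or data.
POSITIVE = {"good", "great", "excellent", "love", "recommend"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "avoid"}

def polarity(text):
    """Classify text by counting positive vs. negative lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# (post, human annotation) pairs, as an ORM evaluation would use.
posts = [
    ("great service, would recommend", "positive"),
    ("terrible support, avoid this shop", "negative"),
    ("delivery arrived on tuesday", "neutral"),
    ("love the product but packaging was poor", "neutral"),
    ("bad experience overall", "negative"),
]
correct = sum(polarity(text) == label for text, label in posts)
accuracy = correct / len(posts)
```

Production systems replace the hand lexicon with trained models, but the evaluation loop (predicted polarity vs. human annotation) is the same.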
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.
Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D
2017-04-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.
Investigation of Stability of Precise Geodetic Instruments Used in Deformation Monitoring
NASA Astrophysics Data System (ADS)
Woźniak, Marek; Odziemczyk, Waldemar
2017-12-01
Monitoring systems using automated electronic total stations are an important element of the safety control of many engineering objects. In order to ensure appropriate credibility of the acquired data, the instruments (total stations in most cases) used for measurements must meet requirements of measurement accuracy as well as stability of the instrument axis system geometry. With regard to the above, it is expedient to conduct quality control of data acquired using electronic total stations in the context of the performed measurement procedures. This paper presents results of research conducted at the Faculty of Geodesy and Cartography at Warsaw University of Technology investigating the stability of "basic" error values (collimation, zero location for the vertical circle, inclination) for two types of automatic total stations: TDA 5005 and TCRP 1201+. The research also provided information concerning the influence of temperature changes on the stability of the investigated instruments' optical parameters. Results are presented in graphical and analytical form. The final conclusions propose methods for avoiding the adverse effects of changes in instrument geometry during precise deformation monitoring measurements.
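The collimation error tracked in such stability studies is classically determined from a two-face observation: the face-II horizontal reading should differ from the face-I reading by exactly 180°, and half the residual is the collimation. The sketch below uses assumed readings, not the paper's data.

```python
# Two-face check for a total station's horizontal collimation error.
# Readings (degrees) are illustrative assumptions.
def collimation_error(face_left_deg, face_right_deg):
    """Collimation c from paired face-I/face-II horizontal readings:
    ideally face_right = face_left + 180 deg, so c is half the residual."""
    residual = (face_left_deg - face_right_deg + 180.0) % 360.0
    if residual > 180.0:          # map residual into [-180, 180)
        residual -= 360.0
    return residual / 2.0

# A stable instrument should show c drifting only a few arc-seconds
# between monitoring epochs.
c = collimation_error(10.0020, 190.0000)   # degrees
c_arcsec = c * 3600.0
```

Repeating this check across epochs and temperatures gives exactly the kind of stability time series the study reports.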
Cantrill, Richard C
2008-01-01
Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocs. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate the purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review. In fact, ISO 21572, the "protein standard", has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards, an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.
Instrumentation: Software-Driven Instrumentation: The New Wave.
ERIC Educational Resources Information Center
Salit, M. L.; Parsons, M. L.
1985-01-01
Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…
Analytical surveillance of emerging drugs of abuse and drug formulations
Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan
2012-01-01
Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keck, B D; Ognibene, T; Vogel, J S
2010-02-05
Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of 14C-labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS are constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the 14C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the 14C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one-tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents) and for a typical small molecule labeled at 10% incorporation with 14C corresponds to 30
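The precision figure quoted in this validation, coefficient of variation (CV) of the isotope ratio, is straightforward to compute from replicate measurements. The sketch below uses invented replicate 14C/C values (in units of Modern), not the study's data, to show the metric.

```python
import statistics

# Precision of replicate isotope-ratio measurements expressed as a
# coefficient of variation; replicate values are illustrative.
def coefficient_of_variation(values):
    """CV (%) = 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Six hypothetical replicate 14C/C ratios, in Modern.
replicates = [1.02, 0.99, 1.01, 1.00, 0.98, 1.00]
cv_percent = coefficient_of_variation(replicates)
# A CV of ~1.4% sits inside the 1-6% precision band the paper reports.
```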
Raman spectroscopy for in-line water quality monitoring--instrumentation and potential.
Li, Zhiyun; Deen, M Jamal; Kumar, Shiva; Selvaganapathy, P Ravi
2014-09-16
Worldwide, access to safe drinking water is a huge problem. In fact, the number of persons without safe drinking water is increasing, even though it is an essential ingredient for human health and development. The enormity of the problem also makes it a critical environmental and public health issue. Therefore, there is a critical need for easy-to-use, compact and sensitive techniques for water quality monitoring. Raman spectroscopy is a very powerful technique for characterizing chemical composition and has been applied to many areas, including chemistry, food, material science and pharmaceuticals. The development of advanced Raman techniques and improvements in instrumentation have significantly improved the performance of modern Raman spectrometers, so that they can now be used for detection of low concentrations of chemicals, such as in-line monitoring of chemical and pharmaceutical contaminants in water. This paper briefly introduces the fundamentals of Raman spectroscopy, reviews the development of Raman instrumentation and discusses advanced and potential Raman techniques for in-line water quality monitoring.
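The fundamental quantity in Raman spectroscopy is the Raman shift, the difference between excitation and scattered wavenumbers, conventionally in cm⁻¹. A quick sketch (the wavelengths below are illustrative, not from the paper):

```python
# Stokes Raman shift in wavenumbers (cm^-1) from excitation and
# scattered wavelengths; example values are illustrative.
def raman_shift_cm1(excitation_nm, scattered_nm):
    """shift = 1/lambda_exc - 1/lambda_scat, converted from nm to cm^-1."""
    return 1e7 / excitation_nm - 1e7 / scattered_nm

# With 785 nm excitation, light scattered near 1071 nm corresponds to
# a shift of roughly 3400 cm^-1 (around water's O-H stretch region).
shift = raman_shift_cm1(785.0, 1071.0)
```

Because the shift, not the absolute wavelength, identifies the vibrational mode, the same analyte gives the same shift regardless of which excitation laser the instrument uses.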
Nanomaterials in consumer products: a challenging analytical problem.
Contado, Catia
2015-01-01
Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To evaluate correctly the benefits vs. risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.
Nanomaterials in consumer products: a challenging analytical problem
NASA Astrophysics Data System (ADS)
Contado, Catia
2015-08-01
Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To evaluate correctly the benefits versus risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.
Nanomaterials in consumer products: a challenging analytical problem
Contado, Catia
2015-01-01
Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To evaluate correctly the benefits vs. risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices. PMID:26301216
Focus determination for the James Webb Space Telescope Science Instruments: A Survey of Methods
NASA Technical Reports Server (NTRS)
Davila, Pamela S.; Bolcar, Matthew R.; Boss, B.; Dean, B.; Hapogian, J.; Howard, J.; Unger, B.; Wilson, M.
2006-01-01
The James Webb Space Telescope (JWST) is a segmented deployable telescope that will require on-orbit alignment using the Near Infrared Camera as a wavefront sensor. The telescope will be aligned by adjusting seven degrees of freedom on each of 18 primary mirror segments and five degrees of freedom on the secondary mirror to optimize the performance of the telescope and camera at a wavelength of 2 microns. With the completion of these adjustments, the telescope focus is set and the optical performance of each of the other science instruments should then be optimal without making further telescope focus adjustments for each individual instrument. This alignment approach requires confocality of the instruments after integration and alignment to the composite metering structure, which will be verified during instrument level testing at Goddard Space Flight Center with a telescope optical simulator. In this paper, we present the results from a study of several analytical approaches to determine the focus for each instrument. The goal of the study is to compare the accuracies obtained for each method, and to select the most feasible for use during optical testing.
NASA Astrophysics Data System (ADS)
DeVorkin, David H.
2017-01-01
The National Air and Space Museum of the Smithsonian Institution is responsible for preserving the material heritage of modern astronomical history. We place emphasis on American accomplishments, on both airborne and spaceborne instrumentation, and on ground based instrumentation that stimulated and supported spaceborne efforts. At present the astronomical collection includes over 600 objects, of which approximately 40 relate to the history of infrared astronomy. This poster will provide a simple listing of our holdings in infrared and far-infrared astronomy, and will highlight particularly significant early objects, like Cashman and Ektron cells, Leighton and Neugebauer's Caltech 2.2 micron survey telescope, Low's Lear Jet Bolometer, Harwit's first Aerobee IR payload and Fazio's balloon-borne observatory. Elements from more recent missions will also be included, such as instruments from KAO, an IRAS focal plane instrument, FIRAS from COBE, the payload from Boomerang and Woody and Richards' balloonsonde payload. The poster author will invite AAS members to comment on these holdings, provide short stories of their experiences building and using them, and suggest candidates for possible collection.
Vidal, Fabien; Simon, Caroline; Cristini, Christelle; Arnaud, Catherine; Parant, Olivier
2013-01-01
To evaluate the immediate perineal and neonatal morbidity associated with instrumental rotations performed with Thierry's spatulas for the management of persistent occiput posterior (OP) positions. Retrospective study including all persistent occiput posterior positions with vaginal OP delivery, from August 2006 to September 2007. Occiput anterior deliveries following successful instrumental rotation were included as well. We compared maternal and neonatal immediate outcomes between spontaneous deliveries, rotational and non-rotational assisted deliveries, using χ² and ANOVA tests. 157 patients were enrolled, comprising 46 OP spontaneous deliveries, 58 assisted OP deliveries and 53 deliveries after a rotational procedure. Instrumental rotation failed in 9 cases. Mean age and parity were significantly higher in the spontaneous delivery group, while labor duration was shorter. There were no significant differences in the rates of severe perineal tears and neonatal adverse outcomes between the 3 groups. Instrumental rotation using Thierry's spatulas was not associated with a reduced risk of maternal and neonatal morbidity for persistent OP deliveries. Further studies are required to define the true value of such a procedure in modern obstetrics.
Vidal, Fabien; Simon, Caroline; Cristini, Christelle; Arnaud, Catherine; Parant, Olivier
2013-01-01
Objective To evaluate the immediate perineal and neonatal morbidity associated with instrumental rotations performed with Thierry's spatulas for the management of persistent occiput posterior (OP) positions. Methods Retrospective study including all persistent occiput posterior positions with vaginal OP delivery, from August 2006 to September 2007. Occiput anterior deliveries following successful instrumental rotation were included as well. We compared maternal and neonatal immediate outcomes between spontaneous deliveries, rotational and non-rotational assisted deliveries, using χ² and ANOVA tests. Results 157 patients were enrolled, comprising 46 OP spontaneous deliveries, 58 assisted OP deliveries and 53 deliveries after a rotational procedure. Instrumental rotation failed in 9 cases. Mean age and parity were significantly higher in the spontaneous delivery group, while labor duration was shorter. There were no significant differences in the rates of severe perineal tears and neonatal adverse outcomes between the 3 groups. Conclusion Instrumental rotation using Thierry's spatulas was not associated with a reduced risk of maternal and neonatal morbidity for persistent OP deliveries. Further studies are required to define the true value of such a procedure in modern obstetrics. PMID:24205122
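The χ² comparison used in this study (a categorical outcome across three delivery groups) can be reproduced with a hand-rolled Pearson statistic. The counts below are illustrative assumptions chosen to match the reported group sizes (46, 58, 53), not the study's actual tear counts.

```python
# Pearson chi-square test of independence for a 2 x 3 contingency
# table; the counts are illustrative, not the study's data.
def chi2_statistic(table):
    """Chi-square statistic for a contingency table given as rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Severe perineal tears (yes/no) across the three groups.
table = [[4, 6, 5],        # tears
         [42, 52, 48]]     # no tears
stat = chi2_statistic(table)
# With 2 degrees of freedom, the 5% critical value is 5.99; a statistic
# well below that means no significant group difference.
```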
NASA Astrophysics Data System (ADS)
Garell, P. C.; Granner, M. A.; Noh, M. D.; Howard, M. A.; Volkov, I. O.; Gillies, G. T.
1998-12-01
Scientific advancement is often spurred by the development of new instruments for investigation. Over the last several decades, many new instruments have been produced to further our understanding of the physiology of the human brain. We present a partial overview of some of these instruments, paying particular attention to those which record the electrical activity of the human brain. We preface the review with a brief primer on neuroanatomy and physiology, followed by a discussion of the latest types of apparatus used to investigate various properties of the central nervous system. A special focus is on microelectrode investigations that employ both intracellular and extracellular methods of recording the electrical activity of single neurons; another is on the modern electroencephalographic, electrocorticographic, and magnetoencephalographic methods used to study the spontaneous and evoked field potentials of the brain. Some examples of clinical applications are included, where appropriate.
ERIC Educational Resources Information Center
Oblinger, Diana G.
2012-01-01
Talk about analytics seems to be everywhere, yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woronowicz, Michael; Blackmon, Rebecca; Brown, Martin
2014-12-09
The International Space Station program is developing a robotically operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer (RGA) for partial pressure measurements and a full-range pressure gauge for total pressure measurements. The primary application is to demonstrate the ability to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full-range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lbm/yr to about 1 lbm/day. These data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument, taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.
ERIC Educational Resources Information Center
MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin
2014-01-01
This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…
Analytical Chemical Sensing in the Submillimeter/terahertz Spectral Range
NASA Astrophysics Data System (ADS)
Moran, Benjamin L.; Fosnight, Alyssa M.; Medvedev, Ivan R.; Neese, Christopher F.
2012-06-01
A highly sensitive and selective terahertz (THz) sensor used to quantitatively analyze a complex mixture of volatile organic compounds is reported. To demonstrate the analytical capabilities of THz chemical sensors, we performed quantitative analysis of a certified gas mixture using a novel prototype chemical sensor that couples a commercial preconcentration system (Entech 7100A) to a high-resolution THz spectrometer. We selected a Method TO-14A certified mixture of 39 volatile organic compounds (VOCs) diluted to 1 part per million (ppm) in nitrogen; 26 of the 39 chemicals were identified as suitable for THz spectroscopic detection. The Entech 7100A is designed and marketed as an inlet system for gas chromatography-mass spectrometry (GC-MS) instruments, with a specific focus on the TO-14 and TO-15 EPA sampling methods. Its preconcentration efficiency is high for the 39 chemicals in the mixture used for this study, and our preliminary results confirm this. Here we present the results of this study, which serves as the basis for our ongoing research in environmental sensing and analysis of exhaled human breath.
NASA Astrophysics Data System (ADS)
Parker, Tim; Devanney, Peter; Bainbridge, Geoff; Townsend, Bruce
2017-04-01
The march to make every type of seismometer, weak to strong motion, reliable and economically deployable in any terrestrial environment continues with the availability of three new sensors and seismic systems, including ones with over 200 dB of dynamic range. Until recently there were probably 100 pier-type broadband sensors for every observatory-type pier, which are not the kinds of deployments geoscientists need to advance science and monitoring capability. Deeper boreholes are now recognized as the quieter environments for the best observatory-class instruments, and these same instruments can now be deployed in direct-burial environments, which is unprecedented. The experience of facilities with large deployments of broadband seismometers in continental-scale rolling arrays proves the utility of packaging new sensors in corrosion-resistant casings and designing in the robustness needed to work reliably in temporary deployments. Integrating digitizers and other sensors decreases deployment complexity, decreases acquisition and deployment costs, and increases reliability and utility. We'll discuss the informed evolution of broadband pier instruments into modern integrated field tools that enable economic densification of monitoring arrays and support new ways to approach geoscience research in a field environment.
Military Health System Transformation Implications on Health Information Technology Modernization.
Khan, Saad
2018-03-01
With the recent passage of the National Defense Authorization Act for Fiscal Year 2017, Congress has triggered groundbreaking Military Health System organizational restructuring with the Defense Health Agency assuming responsibility for managing all hospitals and clinics owned by the Army, Navy, and Air Force. This is a major shift toward a modern value-based managed care system, which will require much greater military-civilian health care delivery integration to be in place by October 2018. Just before the National Defense Authorization Act for Fiscal Year 2017 passage, the Department of Defense had already begun a seismic shift and awarded a contract for the new Military Health System-wide electronic health record system. In this perspective, we discuss the implications of the intersection of two large-scope and large-scale initiatives, health system transformation, and information technology modernization, being rolled out in the largest and most complex federal agency and potential risk mitigating steps. The Military Health System will require an expanded unified clinical leadership to spearhead short-term transformation; furthermore, developing, organizing, and growing a cadre of informatics expertise to expand the use and diffusion of novel solutions such as health information exchanges, data analytics, and others to transcend organizational barriers are still needed to achieve the long-term aim of health system reform as envisioned by the National Defense Authorization Act for Fiscal Year 2017.
Stoel, Berend C; Borman, Terry M; de Jongh, Ronald
2012-01-01
Classical violins produced by makers such as Antonio Stradivari and Guarneri del Gesu have long been considered the epitome of the luthier's art and the expressive tool of choice for the most celebrated violinists. It has been speculated that these makers had access to wood that was unique in some way and that this was responsible for their acclaimed tonal characteristics. In an attempt to discern whether the above conjecture is true, we analyzed 17 modern and classical Dutch, German, Austrian and French violins by wood densitometry using computed tomography and correlated these results with our previous study of modern and Cremonese violins; in all, 30 instruments of the violin family were studied. In order to make this comparison possible we developed methods to cross-calibrate results from different CT manufacturers using calibration wood pieces. We found no significant differences in median densities between modern and classical violins, or between classical violins from different origins. These results suggest that it is unlikely that classical Cremonese makers had access to wood with significantly different density characteristics than that available to contemporaneous or modern makers.
The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research
ERIC Educational Resources Information Center
Siemens, George
2014-01-01
The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.
Modern psychometrics for assessing achievement goal orientation: a Rasch analysis.
Muis, Krista R; Winne, Philip H; Edwards, Ordene V
2009-09-01
A program of research is needed that assesses the psychometric properties of instruments designed to quantify students' achievement goal orientations to clarify inconsistencies across previous studies and to provide a stronger basis for future research. We conducted traditional psychometric and modern Rasch-model analyses of the Achievement Goals Questionnaire (AGQ, Elliot & McGregor, 2001) and the Patterns of Adaptive Learning Scale (PALS, Midgley et al., 2000) to provide an in-depth analysis of the two most popular instruments in educational psychology. For Study 1, 217 undergraduate students enrolled in educational psychology courses participated. Thirty-four were male and 181 were female (two did not respond). Participants completed the AGQ in the context of their educational psychology class. For Study 2, 126 undergraduate students enrolled in educational psychology courses participated. Thirty were male and 95 were female (one did not respond). Participants completed the PALS in the context of their educational psychology class. Traditional psychometric assessments of the AGQ and PALS replicated previous studies. For both, reliability estimates ranged from good to very good for raw subscale scores and fit for the models of goal orientations were good. Based on traditional psychometrics, the AGQ and PALS are valid and reliable indicators of achievement goals. Rasch analyses revealed that estimates of reliability for items were very good but respondent ability estimates varied from poor to good for both the AGQ and PALS. These findings indicate that items validly and reliably reflect a group's aggregate goal orientation, but using either instrument to characterize an individual's goal orientation is hazardous.
Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W
2017-09-19
Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.
Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics
Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce
2013-01-01
The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of mass spectrometry with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments-lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
NASA Astrophysics Data System (ADS)
Ryan, J. G.
2012-12-01
Bringing the use of cutting-edge research tools into student classroom experiences has long been a popular educational strategy in the geosciences and other STEM disciplines. The NSF CCLI and TUES programs have funded a large number of projects that placed research-grade instrumentation at educational institutions for instructional use and use in supporting undergraduate research activities. While student and faculty response to these activities has largely been positive, a range of challenges exist related to their educational effectiveness. Many of the obstacles these approaches have faced relate to "scaling up" of research mentoring experiences (e.g., providing training and time for use for an entire classroom of students, as opposed to one or two), and to time tradeoffs associated with providing technical training for effective instrument use versus course content coverage. The biggest challenge has often been simple logistics: a single instrument, housed in a different space, is difficult to integrate effectively into instructional activities. My CCLI-funded project sought primarily to knock down the logistical obstacles to research instrument use by taking advantage of remote instrument operation technologies, which allow the in-classroom use of networked analytical tools. Remote use of electron microprobe and SEM instruments of the Florida Center for Analytical Electron Microscopy (FCAEM) in Miami, FL was integrated into two geoscience courses at USF in Tampa, FL. Remote operation permitted the development of whole-class laboratory exercises to familiarize students with the tools, their function, and their capabilities; and it allowed students to collect high-quality chemical and image data on their own prepared samples in the classroom during laboratory periods. These activities improve student engagement in the course, appear to improve learning of key concepts in mineralogy and petrology, and have led to students pursuing independent research projects, as
ERIC Educational Resources Information Center
Sollervall, Håkan; Stadler, Erika
2015-01-01
The aim of the presented case study is to investigate how coherent analytical instruments may guide the a priori and a posteriori analyses of a didactical situation. In the a priori analysis we draw on the notion of affordances, as artefact-mediated opportunities for action, to construct hypothetical trajectories of goal-oriented actions that have…
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; DeLoach, Richard
2003-01-01
A wind tunnel experiment for characterizing the aerodynamic and propulsion forces and moments acting on a research model airplane is described. The model airplane called the Free-flying Airplane for Sub-scale Experimental Research (FASER), is a modified off-the-shelf radio-controlled model airplane, with 7 ft wingspan, a tractor propeller driven by an electric motor, and aerobatic capability. FASER was tested in the NASA Langley 12-foot Low-Speed Wind Tunnel, using a combination of traditional sweeps and modern experiment design. Power level was included as an independent variable in the wind tunnel test, to allow characterization of power effects on aerodynamic forces and moments. A modeling technique that employs multivariate orthogonal functions was used to develop accurate analytic models for the aerodynamic and propulsion force and moment coefficient dependencies from the wind tunnel data. Efficient methods for generating orthogonal modeling functions, expanding the orthogonal modeling functions in terms of ordinary polynomial functions, and analytical orthogonal blocking were developed and discussed. The resulting models comprise a set of smooth, differentiable functions for the non-dimensional aerodynamic force and moment coefficients in terms of ordinary polynomials in the independent variables, suitable for nonlinear aircraft simulation.
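The multivariate orthogonal-function technique mentioned above can be illustrated in one dimension: Gram-Schmidt orthogonalizing the polynomial regressors over the data points lets each coefficient be estimated by an independent projection. This is a minimal sketch of the general idea under those assumptions, not the actual procedure or code used in the FASER study.

```python
# Sketch of orthogonal-function modeling (1-D case): Gram-Schmidt
# orthogonalize polynomial regressors over the data points, then
# estimate each coefficient independently by projection.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def orthogonalize(columns):
    """Gram-Schmidt on a list of regressor columns (lists of floats)."""
    basis = []
    for col in columns:
        w = list(col)
        for q in basis:
            c = dot(w, q) / dot(q, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        basis.append(w)
    return basis

def fit(x, y, degree):
    """Least-squares polynomial fit via orthogonal regressors.
    Returns predictions at the data points."""
    cols = [[xi**d for xi in x] for d in range(degree + 1)]
    basis = orthogonalize(cols)
    coeffs = [dot(y, q) / dot(q, q) for q in basis]
    return [sum(c * q[i] for c, q in zip(coeffs, basis))
            for i in range(len(x))]

x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 + 3.0 * xi for xi in x]   # exactly linear test data
yhat = fit(x, y, degree=2)         # quadratic basis reproduces it
```

Because the orthogonalized regressors are mutually uncorrelated over the data, adding or dropping a term does not change the other coefficients, which is what makes this family of methods attractive for model-structure selection.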
Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen
2016-04-07
Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their ability to meet current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in sample pretreatment and extraction and in the different instrumental approaches developed to address this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research on this aspect has tended toward green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016
Nuclear analytical techniques in medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesareo, R.
1988-01-01
This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF analysis; the ability of the PIXE microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside the cells; and the potential of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements which will undoubtedly see great development in the immediate future.
Galileo's Discorsi as a Tool for the Analytical Art.
Raphael, Renee Jennifer
2015-01-01
A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant copies of the text (one which has received little attention in the historiography, the other apparently unknown) which are heavily annotated. It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains, and calibrates remote sensing instruments for the National Aeronautics and Space Administration (NASA). To perform system design trade-offs and analysis, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as signal-to-noise ratio (SNR), noise-equivalent radiance (NER), and noise-equivalent temperature difference (NETD). This paper describes the uses of the package and the physics that were used to derive the performance parameters.
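As a hedged sketch of one figure of merit such a package reports, the code below computes a noise-equivalent temperature difference as NETD = NER / (dL/dT), taking the radiance derivative numerically from the Planck function. The wavelength, temperature, and interface are illustrative assumptions, not taken from ATTIRE itself.

```python
import math

# NETD sketch: NETD = NER / (dL/dT), with dL/dT from a central
# difference of the Planck spectral radiance. Illustrative only.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance, W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0
    return a / b

def netd(ner, wavelength_m=10e-6, temp_k=300.0, dt=0.01):
    """NETD in kelvin for a given noise-equivalent radiance NER,
    evaluated at one wavelength and scene temperature."""
    dl_dt = (planck_radiance(wavelength_m, temp_k + dt)
             - planck_radiance(wavelength_m, temp_k - dt)) / (2.0 * dt)
    return ner / dl_dt

# a smaller NER (lower noise) gives a smaller, i.e. better, NETD
```

A real sensor model would integrate the radiance derivative over the instrument's spectral band and fold in optics and detector terms; the single-wavelength form above just shows the structure of the calculation.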
An Inexpensive, Open-Source USB Arduino Data Acquisition Device for Chemical Instrumentation.
Grinias, James P; Whitfield, Jason T; Guetschow, Erik D; Kennedy, Robert T
2016-07-12
Many research and teaching labs rely on USB data acquisition devices to collect voltage signals from instrumentation. However, these devices can be cost-prohibitive (especially when large numbers are needed for teaching labs) and require software to be developed for operation. In this article, we describe the development and use of an open-source USB data acquisition device (with 16-bit acquisition resolution) built using simple electronic components and an Arduino Uno that costs under $50. Additionally, open-source software written in Python is included so that data can be acquired using nearly any PC or Mac computer with a simple USB connection. Use of the device was demonstrated for a sophomore-level analytical experiment using GC and a CE-UV separation on an instrument used for research purposes.
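The host-side handling of such a device might look like the sketch below, which scales 16-bit ADC counts to volts and parses a hypothetical "timestamp,counts" serial line format. The reference voltage and line format are assumptions for illustration; the published design and its Python software may differ.

```python
# Host-side sketch for a 16-bit USB acquisition device: scale raw
# ADC counts to volts and parse one text line of the data stream.
# V_REF and the line format are assumed, not from the published design.

FULL_SCALE_COUNTS = 65535      # 16-bit acquisition resolution
V_REF = 5.0                    # assumed ADC reference voltage, volts

def counts_to_volts(counts):
    """Scale a raw ADC reading to a voltage."""
    if not 0 <= counts <= FULL_SCALE_COUNTS:
        raise ValueError("reading outside 16-bit range")
    return counts * V_REF / FULL_SCALE_COUNTS

def parse_line(line):
    """Parse one 'timestamp_ms,counts' line from the serial stream."""
    t_ms, counts = line.strip().split(",")
    return int(t_ms), counts_to_volts(int(counts))

t, v = parse_line("1250,32768\n")   # mid-scale reading, about 2.5 V
```

In a live setup the lines would arrive over a USB serial port (e.g. via pyserial) at the device's sampling rate; the parsing and scaling logic stays the same.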
Neurobiological differences in mental rotation and instrument interpretation in airline pilots.
Sladky, Ronald; Stepniczka, Irene; Boland, Edzard; Tik, Martin; Lamm, Claus; Hoffmann, André; Buch, Jan-Philipp; Niedermeier, Dominik; Field, Joris; Windischberger, Christian
2016-06-21
Airline pilots and similar professions require reliable spatial cognition abilities, such as mental imagery of static and moving three-dimensional objects in space. A well-known task to investigate these skills is the Shepard and Metzler mental rotation task (SMT), which is also frequently used during pre-assessment of pilot candidates. Despite the intuitive relationship between real-life spatial cognition and SMT, several studies have challenged its predictive value. Here we report on a novel instrument interpretation task (IIT) based on a realistic attitude indicator used in modern aircraft that was designed to bridge the gap between the abstract SMT and a cockpit environment. We investigated 18 professional airline pilots using fMRI. No significant correlation was found between SMT and IIT task accuracies. Contrasting both tasks revealed higher activation in the fusiform gyrus, angular gyrus, and medial precuneus for IIT, whereas SMT elicited significantly stronger activation in pre- and supplementary motor areas, as well as lateral precuneus and superior parietal lobe. Our results show that SMT skills per se are not sufficient to predict task accuracy during (close to) real-life instrument interpretation. While there is a substantial overlap of activation across the task conditions, we found that there are important differences between instrument interpretation and non-aviation based mental rotation.
LabVIEW-based control software for para-hydrogen induced polarization instrumentation.
Agraz, Jose; Grunfeld, Alexander; Li, Debiao; Cunningham, Karl; Willey, Cindy; Pozos, Robert; Wagner, Shawn
2014-04-01
The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10,000-fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes carbon-13 ((13)C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (B0), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, which precludes precise control of the factors listed above and leads to non-reproducible results. We discuss the design and implementation of a LabVIEW-based computer program that automatically and precisely controls the delivery and manipulation of gases and samples, monitoring gas pressures, environmental temperature, and RF sample irradiation. We show that this automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software provides for the fast prototyping of PHIP instrumentation for the evaluation of a myriad of (13)C-based endogenous contrast agents used in molecular imaging.
Analytical methods for human biomonitoring of pesticides. A review.
Yusa, Vicent; Millet, Maurice; Coscolla, Clara; Roca, Marta
2015-09-03
Biomonitoring of both currently-used and banned-persistent pesticides is a very useful tool for assessing human exposure to these chemicals. In this review, we present current approaches and recent advances in the analytical methods for determining the biomarkers of exposure to pesticides in the most commonly used specimens, such as blood, urine, and breast milk, and in emerging non-invasive matrices such as hair and meconium. We critically discuss the main applications for sample treatment, and the instrumental techniques currently used to determine the most relevant pesticide biomarkers. We finally look at the future trends in this field. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Hreniuc, V.; Hreniuc, A.; Pescaru, A.
2017-08-01
Solving a general strength problem of a ship hull may be done using analytical approaches, which are useful to deduce the distribution of buoyancy forces, the distribution of weight forces along the hull, and the geometrical characteristics of the sections. These data are used to draw the free-body diagrams and to compute the stresses. General strength problems require a large amount of calculation, so it is worth examining how a computer may be used to solve them. Using computer programming, an engineer may conceive software instruments based on analytical approaches. However, before developing the computer code, the research topic must be thoroughly analysed, thereby reaching a meta-level understanding of the problem. The following stage is to conceive an appropriate development strategy for the original software instruments useful for the rapid development of computer-aided analytical models. The geometrical characteristics of the sections may be computed using a Boolean algebra that operates on 'simple' geometrical shapes. By 'simple' we mean shapes for which direct calculus relations are available. The set of 'simple' shapes also includes geometrical entities bounded by curves approximated as spline functions or as polygons. To conclude, computer programming offers the necessary support to solve general strength ship hull problems using analytical methods.
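As a hedged sketch of the kind of direct calculus relation such software instruments build on (not the authors' code), the area and second moments of area of a polygon-bounded section can be computed with the standard Green's-theorem (shoelace-type) formulas:

```python
def polygon_section_props(pts):
    """Area and second moments of area (about the x- and y-axes of the
    given coordinate system) for a simple polygon with vertices pts,
    listed counter-clockwise. Standard Green's-theorem formulas."""
    a = ix = iy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        cross = x0 * y1 - x1 * y0   # twice the signed area of edge triangle
        a += cross
        ix += (y0 * y0 + y0 * y1 + y1 * y1) * cross
        iy += (x0 * x0 + x0 * x1 + x1 * x1) * cross
    return a / 2.0, ix / 12.0, iy / 12.0
```

For a unit square with one corner at the origin this gives area 1 and second moments 1/3 about the axes through that corner, matching the direct integrals.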
NASA Technical Reports Server (NTRS)
Wood, G. M.; Rayborn, G. H.; Ioup, J. W.; Ioup, G. E.; Upchurch, B. T.; Howard, S. J.
1981-01-01
Mathematical deconvolution of digitized analog signals from scientific measuring instruments is shown to be a means of extracting important information which is otherwise hidden due to time-constant and other broadening or distortion effects caused by the experiment. Three different approaches to deconvolution and their subsequent application to recorded data from three analytical instruments are considered. To demonstrate the efficacy of deconvolution, the use of these approaches to solve the convolution integral for the gas chromatograph, magnetic mass spectrometer, and the time-of-flight mass spectrometer are described. Other possible applications of these types of numerical treatment of data to yield superior results from analog signals of the physical parameters normally measured in aerospace simulation facilities are suggested and briefly discussed.
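As a toy illustration of the general idea (not the specific algorithms evaluated in the paper), a signal broadened by an instrument's exponential time-constant response can be restored by regularized Fourier-domain deconvolution; the kernel shape, time constant, and regularization value here are arbitrary choices:

```python
import numpy as np

n = 512
t = np.arange(n)
true = np.exp(-0.5 * ((t - 100) / 5.0) ** 2)   # narrow "true" peak
h = np.exp(-t / 20.0)                           # exponential broadening kernel
h /= h.sum()                                    # normalize to unit area

H = np.fft.fft(h)
observed = np.real(np.fft.ifft(np.fft.fft(true) * H))  # broadened, delayed peak

# Regularized (Wiener-style) deconvolution: divide the spectra, damping
# frequencies where the kernel response is near zero
eps = 1e-3
recovered = np.real(np.fft.ifft(np.fft.fft(observed) * np.conj(H)
                                / (np.abs(H) ** 2 + eps)))
```

The broadened peak is delayed and flattened; the deconvolved signal recovers the original peak position and most of its height.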
NASA Astrophysics Data System (ADS)
Kashansky, Vladislav V.; Kaftannikov, Igor L.
2018-02-01
Modern numerical modeling experiments and data analytics problems in various fields of science and technology impose a wide variety of demanding requirements on distributed computing systems. Many scientific computing projects exceed the limits of the available resource pool, requiring extra scalability and sustainability. In this paper we share our own experience and findings on combining the power of SLURM, BOINC and GlusterFS into a software system for scientific computing. In particular, we propose a complete architecture and highlight important aspects of systems integration.
A new innovative instrument for space plasma instrumentation
NASA Technical Reports Server (NTRS)
Torbert, Roy B.
1993-01-01
The Faraday Ring Ammeter was the subject of this grant for a new innovative instrument for space plasma instrumentation. This report summarizes our progress in this work. Briefly, we have conducted an intensive series of experiments and trials over three years, testing some five configurations of the instrument to measure currents, resulting in two Ph.D. theses, supported by this grant, and two flight configurations of the instrument. The first flight would have been on a NASA-Air Force collaborative sounding rocket, but was not flown because of instrumental difficulties. The second has been successfully integrated on the NASA Auroral Turbulence payload which is to be launched in February, 1994.
Jugessur, Astanand; Murray, Jeffrey C.; Moreno, Lina; Wilcox, Allen; Lie, Rolv T.
2011-01-01
This study uses instrumental variable (IV) models with genetic instruments to assess the effects of maternal smoking on the child’s risk of orofacial clefts (OFC), a common birth defect. The study uses genotypic variants in neurotransmitter and detoxification genes related to smoking as instruments for cigarette smoking before and during pregnancy. Conditional maximum likelihood and two-stage IV probit models are used to estimate the IV model. The data are from a population-level sample of affected and unaffected children in Norway. The selected genetic instruments generally fit the IV assumptions but may be considered “weak” in predicting cigarette smoking. We find that smoking before and during pregnancy increases OFC risk substantially under the IV model (by about 4–5 times at the sample average smoking rate). This effect is greater than that found with classical analytic models. This may be because the usual models are not able to consider self-selection into smoking based on unobserved confounders, or it may to some degree reflect limitations of the instruments. Inference based on weak-instrument robust confidence bounds is consistent with standard inference. Genetic instruments may provide a valuable approach to estimate the “causal” effects of risk behaviors with genetic-predisposing factors (such as smoking) on health and socioeconomic outcomes. PMID:22102793
Instrument Remote Control via the Astronomical Instrument Markup Language
NASA Technical Reports Server (NTRS)
Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard
1998-01-01
The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human readable manner, has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data, all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control apply to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.
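The core idea of describing an instrument's commands and telemetry in XML and processing the description with platform-independent tooling can be sketched with Python's standard-library parser. The element and attribute names below are invented for illustration; the real AIML/IML schema is not reproduced here:

```python
import xml.etree.ElementTree as ET

# Hypothetical instrument description in the spirit of IML/AIML;
# element and attribute names are assumptions, not the AIML schema.
doc = """<instrument name="spectrometer">
  <command id="setGain"><parameter name="level" type="int"/></command>
  <telemetry id="detectorTemp" units="K"/>
</instrument>"""

root = ET.fromstring(doc)
# A generic client can discover commands and telemetry without
# hard-coding anything about this particular instrument.
commands = [c.get("id") for c in root.findall("command")]
telemetry_units = {t.get("id"): t.get("units") for t in root.findall("telemetry")}
```

The point of the markup-driven approach is exactly this: the client code above works unchanged for any instrument whose description follows the same schema.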
A Modern Automatic Chamber Technique as a Powerful Tool for CH4 and CO2 Flux Monitoring
NASA Astrophysics Data System (ADS)
Mastepanov, M.; Christensen, T. R.; Lund, M.; Pirk, N.
2014-12-01
A number of similar systems were used for monitoring of CH4 and CO2 exchange by the automatic chamber method in a range of different ecosystems. The measurements were carried out in northern Sweden (mountain birch forest near Abisko, 68°N, 2004-2010), southern Sweden (forest bog near Hässleholm, 56°N, 2007-2014), northeastern Greenland (arctic fen in Zackenberg valley, 74°N, 2005-2014), southwestern Greenland (fen near Nuuk, 64°N, 2007-2014), and central Svalbard (arctic fen near Longyearbyen, 78°N, 2011-2014). In total, these 37 seasons of measurements delivered not only a large amount of valuable flux data, including a few novel findings (Mastepanov et al., Nature, 2008; Mastepanov et al., Biogeosciences, 2013), but also valuable experience with implementation of the automatic chamber technique using modern analytical instruments and computer technologies. A range of high-resolution CH4 analysers (DLT-100, FMA, FGGA - Los Gatos Research), CO2 analyzers (EGM-4, SBA-4 - PP Systems; Li-820 - Li-Cor Biosciences), as well as a Methane Carbon Isotope Analyzer (Los Gatos Research), have been shown to be suitable for precise measurements of fluxes, from as low as 0.1 mg CH4 m-1 d-1 (wintertime measurements at Zackenberg, unpublished) to as high as 2.4 g CH4 m-1 d-1 (autumn burst 2007 at Zackenberg, Mastepanov et al., Nature, 2008). Some of these instruments had to be customized to accommodate 24/7 operation in harsh arctic conditions. In this presentation we will explain some of these customizations. The high frequency of concentration measurements (1 Hz in most cases) provides a unique opportunity for quality control of flux calculations; on the other hand, this enormous amount of data can be analyzed only using highly automated algorithms. A specialized software package was developed and improved through the years of measurements and data processing. This software automates the data flow from raw concentration data of different instruments and sensors and various status records
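The flux calculation at the heart of such automated chamber processing can be sketched as follows. This is a simplified illustration, not the authors' software: it assumes a linear concentration rise over the closure period and an approximate air molar density, and ignores the temperature/pressure corrections and quality filters a real system applies:

```python
def chamber_flux(times_s, conc_ppm, volume_m3, area_m2, mol_per_m3=41.6):
    """Least-squares slope of headspace concentration over time, converted
    to a flux per unit ground area (mol m-2 s-1). mol_per_m3 approximates
    the molar density of air at ~20 degrees C and 1 atm (an assumption)."""
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_c = sum(conc_ppm) / n
    slope = (sum((t - mean_t) * (c - mean_c) for t, c in zip(times_s, conc_ppm))
             / sum((t - mean_t) ** 2 for t in times_s))   # ppm per second
    mol_per_s = slope * 1e-6 * mol_per_m3 * volume_m3     # ppm -> mole fraction
    return mol_per_s / area_m2
```

With 1 Hz concentration data, the goodness of this linear fit (and its residual pattern) is what automated quality-control algorithms typically inspect before accepting a flux value.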
Computer simulation of multiple pilots flying a modern high performance helicopter
NASA Technical Reports Server (NTRS)
Zipf, Mark E.; Vogt, William G.; Mickle, Marlin H.; Hoelzeman, Ronald G.; Kai, Fei; Mihaloew, James R.
1988-01-01
A computer simulation of a human response pilot mechanism within the flight control loop of a high-performance modern helicopter is presented. A human response mechanism, implemented by a low order, linear transfer function, is used in a decoupled single variable configuration that exploits the dominant vehicle characteristics by associating cockpit controls and instrumentation with specific vehicle dynamics. Low order helicopter models obtained from evaluations of the time and frequency domain responses of a nonlinear simulation model, provided by NASA Lewis Research Center, are presented and considered in the discussion of the pilot development. Pilot responses and reactions to test maneuvers are presented and discussed. Higher level implementation, using the pilot mechanisms, are discussed and considered for their use in a comprehensive control structure.
Insecticide ADME for support of early-phase discovery: combining classical and modern techniques.
David, Michael D
2017-04-01
The two factors that determine an insecticide's potency are its binding to a target site (intrinsic activity) and the ability of its active form to reach the target site (bioavailability). Bioavailability is dictated by the compound's stability and transport kinetics, which are determined by both physical and biochemical characteristics. At BASF Global Insecticide Research, we characterize bioavailability in early research with an ADME (Absorption, Distribution, Metabolism and Excretion) approach, combining classical and modern techniques. For biochemical assessment of metabolism, we purify native insect enzymes using classical techniques, and recombinantly express individual insect enzymes that are known to be relevant in insecticide metabolism and resistance. For analytical characterization of an experimental insecticide and its metabolites, we conduct classical radiotracer translocation studies when a radiolabel is available. In discovery, where typically no radiolabel has been synthesized, we utilize modern high-resolution mass spectrometry to probe complex systems for the test compounds and their metabolites. By using these combined approaches, we can rapidly compare the ADME properties of sets of new experimental insecticides and aid in the design of structures with an improved potential to advance in the research pipeline. © 2016 Society of Chemical Industry.
"Modern Portuguese" and The Narration of Brazil
ERIC Educational Resources Information Center
Milleret, Margo
2016-01-01
"Modern Portuguese: A Project of the Modern Language Association" was a package of film strips, prerecorded tapes, an instructor's manual, and a textbook first published by Knopf in 1971. It followed the model established by "Modern Spanish" that was also a project of the Modern Language Association (MLA) published in 1960. The…
Practical aspects of modern interferometry for optical manufacturing quality control: Part 2
NASA Astrophysics Data System (ADS)
Smythe, Robert
2012-07-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space based satellite imaging and DVD and Blu-Ray disks are all enabled by phase shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful towards the practical use of interferometers. An understanding of the parameters that drive system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Practical aspects of modern interferometry for optical manufacturing quality control, Part 3
NASA Astrophysics Data System (ADS)
Smythe, Robert A.
2012-09-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space-based satellite imaging, and DVD and Blu-Ray disks are all enabled by phase-shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful toward the practical use of interferometers. An understanding of the parameters that drive the system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and correlates it with product quality by design and pharmaceutical analytical technology (PAT).
Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics
Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and correlates it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723
ERIC Educational Resources Information Center
Smith, Robert L.; Popham, Ronald E.
1983-01-01
Presents an experiment in thermometric titration used in an analytic chemistry-chemical instrumentation course, consisting of two titrations, one a mixture of calcium and magnesium, the other of calcium, magnesium, and barium ions. Provides equipment and solutions list/specifications, graphs, and discussion of results. (JM)
Degano, Ilaria; La Nasa, Jacopo; Ghelardi, Elisa; Modugno, Francesca; Colombini, Maria Perla
2016-12-01
Lipid binders have traditionally been determined in paintings by using gas chromatography/mass spectrometry (GC/MS) to identify the characteristic profiles and ratios of fatty acids. However, the presence of mixtures in contemporary and modern oil paints makes the GC/MS determination of fatty acids insufficient to fully characterize the lipid binding media. In this study we show that triacylglycerol (TAG) profiling by high-performance liquid chromatography with high-resolution tandem mass spectrometry, using ESI in positive and negative ionization modes, is highly effective. We exploited this analytical approach to study the curing and degradation processes undergone by six plant oils used in the formulation of media in modern paints, using both natural and artificial ageing experiments. We believe this is the first time that a negative ionization mode has been applied for this purpose and that a survey with HPLC-ESI-Q-ToF has been carried out to study the ageing kinetics of plant oils. TAG profiling enabled us to study the evolution over time of the constituents of modern oils with respect to curing and ageing. The data analyzed in this study demonstrate that our approach is effective for studying the oxidation of TAGs during ageing. The data also improve current knowledge of the properties of vegetable oils, which could lead to the development of new paint materials and conservation treatments for modern and contemporary works of art. Copyright © 2016 Elsevier B.V. All rights reserved.
Prediction of sickness absence: development of a screening instrument
Duijts, S F A; Kant, IJ; Landeweerd, J A; Swaen, G M H
2006-01-01
Objectives To develop a concise screening instrument for early identification of employees at risk for sickness absence due to psychosocial health complaints. Methods Data from the Maastricht Cohort Study on “Fatigue at Work” were used to identify items associated with an increased risk of sickness absence. Univariate logistic regression, backward stepwise linear regression, and multiple logistic regression were applied in succession. For both men and women, sum scores were calculated, and sensitivity and specificity rates of different cut-off points on the screening instrument were defined. Results In women, results suggested that feeling depressed, having a burnout, being tired, being less interested in work, experiencing obligatory change in working days, and living alone were strong predictors of sickness absence due to psychosocial health complaints. In men, statistically significant predictors were having a history of sickness absence, compulsive thinking, being mentally fatigued, finding it hard to relax, lack of supervisor support, and having no hobbies. A potential cut-off point of 10 on the screening instrument resulted in a sensitivity score of 41.7% for women and 38.9% for men, and a specificity score of 91.3% for women and 90.6% for men. Conclusions This study shows that it is possible to identify predictive factors for sickness absence and to develop an instrument for early identification of employees at risk for sickness absence. The results of this study increase the possibility for both employers and policymakers to implement interventions directed at the prevention of sickness absence. PMID:16698807
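The sensitivity and specificity figures reported for a cut-off point can be computed directly from sum scores and outcomes. A minimal sketch with invented toy data (not the cohort data):

```python
def sensitivity_specificity(scores, outcomes, cutoff):
    """scores: sum scores on a screening instrument; outcomes: True if the
    employee later had sickness absence; a score >= cutoff counts as
    screen-positive. Returns (sensitivity, specificity)."""
    tp = sum(1 for s, o in zip(scores, outcomes) if o and s >= cutoff)
    fn = sum(1 for s, o in zip(scores, outcomes) if o and s < cutoff)
    tn = sum(1 for s, o in zip(scores, outcomes) if not o and s < cutoff)
    fp = sum(1 for s, o in zip(scores, outcomes) if not o and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping the cutoff over the observed score range and inspecting the resulting sensitivity/specificity pairs is how a cut-off point such as 10 would be chosen.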
Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.
Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel
2017-01-01
This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 till May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted desorption/ionization and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples illustrate also applications in the proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography, and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Application of Steinberg vibration fatigue model for structural verification of space instruments
NASA Astrophysics Data System (ADS)
García, Andrés; Sorribes-Palmer, Félix; Alonso, Gustavo
2018-01-01
Electronic components in spacecraft are subjected to vibration loads during the ascent phase of the launcher. It is important to verify by tests and analysis that all parts can survive the most severe load cases. The purpose of this paper is to present the methodology and results of the application of Steinberg's fatigue model to estimate the life of electronic components of the EPT-HET instrument for the Solar Orbiter space mission. A Nastran finite element model (FEM) of the EPT-HET instrument was created and used for the structural analysis. The methodology is based on the use of the FEM of the entire instrument to calculate the relative displacement RDSD and RMS values of the PCBs from random vibration analysis. These values are used to estimate the fatigue life of the most susceptible electronic components with Steinberg's fatigue damage equation and Miner's cumulative fatigue index. The estimations are calculated for two different configurations of the instrument and three different inputs in order to support the redesign process. Finally, these analytical results are contrasted with the inspections and the functional tests made after the vibration tests, concluding that this methodology can adequately predict the fatigue damage or survival of the electronic components.
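A minimal sketch of the three-band form of Steinberg's approach combined with Miner's rule: response cycles are assumed to occur at the 1-, 2-, and 3-sigma RMS displacement levels for 68.3%, 27.1%, and 4.33% of the time, and damage fractions are summed. The band fractions and fatigue exponent b ≈ 6.4 are Steinberg's commonly quoted values; the S-N anchor point (`z_ref`, `n_ref`) below is hypothetical:

```python
def miner_index(z_rms, f_n, duration_s, z_ref, n_ref, b=6.4):
    """Cumulative fatigue damage index under random vibration; failure is
    predicted when the index approaches ~1. z_rms is the RMS relative
    displacement, f_n the natural frequency (cycles/s), and (z_ref, n_ref)
    anchor the S-N curve: n_ref cycles to failure at displacement z_ref."""
    fractions = (0.683, 0.271, 0.0433)      # time spent at 1-, 2-, 3-sigma
    total_cycles = f_n * duration_s
    damage = 0.0
    for k, frac in enumerate(fractions, start=1):
        z_k = k * z_rms                     # displacement amplitude in band k
        n_fail = n_ref * (z_ref / z_k) ** b # cycles to failure at this level
        damage += frac * total_cycles / n_fail
    return damage
```

Because damage scales with total cycles, it grows linearly with test duration and very steeply (power b) with the RMS displacement, which is why small design changes to PCB response can dominate predicted life.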
A mathematical model for describing the mechanical behaviour of root canal instruments.
Zhang, E W; Cheung, G S P; Zheng, Y F
2011-01-01
The purpose of this study was to establish a general mathematical model for describing the mechanical behaviour of root canal instruments by combining a theoretical analytical approach with a numerical finite-element method. Mathematical formulas representing the longitudinal (taper, helical angle and pitch) and cross-sectional configurations and area, the bending and torsional inertia, the curvature of the boundary point and the (geometry of) loading condition were derived. Torsional and bending stresses and the resultant deformation were expressed mathematically as a function of these geometric parameters, the modulus of elasticity of the material and the applied load. As illustrations, three brands of NiTi endodontic files of different cross-sectional configurations (ProTaper, Hero 642, and Mani NRT) were analysed under pure torsion and pure bending situations by entering the model into a finite-element analysis package (ANSYS). Numerical results confirmed that the mathematical model is a feasible method to analyse the mechanical properties and predict the stress and deformation of root canal instruments during root canal preparation. Mathematical and numerical models can be a suitable way to examine mechanical behaviour as a criterion for instrument design and to predict the stress and strain experienced by endodontic instruments during root canal preparation. © 2010 International Endodontic Journal.
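As a hedged illustration of the kind of closed-form relation such a model builds on (using a solid circular cross-section for simplicity; the paper derives formulas for the files' actual non-circular cross-sections), the maximum bending and torsional stresses of a shaft segment follow from σ = Mc/I and τ = Tr/J:

```python
import math

def circular_section_props(d):
    """Area, bending inertia I, and torsional inertia J of a solid circular
    cross-section of diameter d. A simplification: real NiTi files have
    triangular or convex cross-sections, which the paper treats exactly."""
    r = d / 2.0
    return math.pi * r ** 2, math.pi * r ** 4 / 4.0, math.pi * r ** 4 / 2.0

def max_bending_stress(moment, d):
    _, inertia, _ = circular_section_props(d)
    return moment * (d / 2.0) / inertia        # sigma = M c / I

def max_torsional_stress(torque, d):
    _, _, polar = circular_section_props(d)
    return torque * (d / 2.0) / polar          # tau = T r / J
```

For a circular section J = 2I, so with equal moment and torque the peak bending stress is exactly twice the peak shear stress; for the files' real cross-sections these inertias must be obtained from the derived boundary formulas instead.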
2013-01-01
Background Ethnographic evidence suggests that transactional sex is sometimes motivated by youth’s interest in the consumption of modern goods as much as it is in basic survival. There are very few quantitative studies that examine the association between young people’s interests in the consumption of modern goods and their sexual behaviour. We examined this association in two regions and four residence zones of Madagascar: urban, peri-urban and rural Antananarivo, and urban Antsiranana. We expected risky sexual behaviour would be associated with interests in consuming modern goods or lifestyles; urban residence; and socio-cultural characteristics. Methods We administered a population-based survey to 2,255 youth ages 15–24 in all four residence zones. Focus group discussions guided the survey instrument, which assessed socio-demographic and economic characteristics, consumption of modern goods, preferred activities and sexual behaviour. Our outcome measures included: multiple sexual partners in the last year (for men and women); and ever practicing transactional sex (for women). Results Overall, 7.3% of women and 30.7% of men reported having had multiple partners in the last year; and 5.9% of women reported ever practicing transactional sex. Bivariate results suggested that for both men and women having multiple partners was associated with perceptions concerning the importance of fashion and a series of activities associated with modern lifestyles. A subset of lifestyle characteristics remained significant in multivariate models. For transactional sex, bivariate results suggested perceptions around fashion, nightclub attendance, and getting to know a foreigner were key determinants; and all remained significant in multivariate analysis. We found peri-urban residence more associated with transactional sex than urban residence; and ethnic origin was the strongest predictor of both outcomes for women. Conclusions While we found indication of an association
Lin, Wei; Feng, Rui; Li, Hongzhe
2014-01-01
In genetical genomics studies, it is important to jointly analyze gene expression data and genetic variants in exploring their associations with complex traits, where the dimensionality of gene expressions and genetic variants can both be much larger than the sample size. Motivated by such modern applications, we consider the problem of variable selection and estimation in high-dimensional sparse instrumental variables models. To overcome the difficulty of high dimensionality and unknown optimal instruments, we propose a two-stage regularization framework for identifying and estimating important covariate effects while selecting and estimating optimal instruments. The methodology extends the classical two-stage least squares estimator to high dimensions by exploiting sparsity using sparsity-inducing penalty functions in both stages. The resulting procedure is efficiently implemented by coordinate descent optimization. For the representative L1 regularization and a class of concave regularization methods, we establish estimation, prediction, and model selection properties of the two-stage regularized estimators in the high-dimensional setting where the dimensionalities of covariates and instruments are both allowed to grow exponentially with the sample size. The practical performance of the proposed method is evaluated by simulation studies and its usefulness is illustrated by an analysis of mouse obesity data. Supplementary materials for this article are available online. PMID:26392642
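The classical two-stage least squares estimator that this framework extends can be sketched in a low-dimensional simulation. This is only the unpenalized baseline (the paper's contribution is adding sparsity-inducing penalties in both stages when covariates and instruments are high-dimensional), and all data-generating values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=(n, 2))                    # instruments
u = rng.normal(size=n)                         # unobserved confounder
x = z @ np.array([1.0, -0.5]) + u + rng.normal(size=n)  # endogenous covariate
y = 2.0 * x + u + rng.normal(size=n)           # true causal effect is 2.0

# Stage 1: project the endogenous covariate on the instruments
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Stage 2: regress the outcome on the stage-1 fitted values
beta_iv = np.linalg.lstsq(np.column_stack([np.ones(n), x_hat]), y,
                          rcond=None)[0][1]

# Naive OLS for comparison: biased upward by the shared confounder u
beta_ols = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y,
                           rcond=None)[0][1]
```

Because u shifts x and y in the same direction, OLS overestimates the effect, while the IV estimate recovers it; the paper's two-stage regularized estimator replaces each least-squares fit with a penalized fit so the same logic survives when both stages are high-dimensional.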
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2017-01-01
There has been an immense amount of visibility of doping issues on the international stage over the past 12 months, with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods continuously being updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2015-01-01
Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation, as well as the most recent information about the physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping, are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). In reference to the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.
Developing an instrument for assessing students' concepts of the nature of technology
NASA Astrophysics Data System (ADS)
Liou, Pey-Yan
2015-05-01
Background: The nature of technology has been rarely discussed despite the fact that technology plays an essential role in modern society. It is important to discuss students' concepts of the nature of technology, and further to advance their technological literacy and adaptation to modern society. There is a need to assess high school students' concepts of the nature of technology. Purpose: This study aims to engage in discourse on students' concepts of the nature of technology based on a proposed theoretical framework. Moreover, another goal is to develop an instrument for measuring students' concepts of the nature of technology. Sample: Four hundred and fifty-five high school students' perceptions of technology were qualitatively analyzed. Furthermore, 530 students' responses to a newly developed questionnaire were quantitatively analyzed in the final test. Design and method: First, content analysis was utilized to discuss and categorize students' statements regarding technology and its related issues. The Student Concepts of the Nature of Technology Questionnaire was developed based on the proposed theoretical framework and was supported by the students' qualitative data. Finally, exploratory factor analysis and reliability analysis were applied to determine the structure of the items and the internal consistency of each scale. Results: Through a process of instrument development, the Student Concepts of the Nature of Technology Questionnaire was shown to be a valid and reliable tool for measuring students' concepts of the nature of technology. This newly developed questionnaire is composed of 29 items in six scales, namely 'technology as artifacts,' 'technology as an innovation change,' 'the current role of technology in society,' 'technology as a double-edged sword,' 'technology as a science-based form,' and 'history of technology.' Conclusions: The Student Concepts of the Nature of Technology Questionnaire has been confirmed as a reasonably valid and reliable
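The reliability analysis mentioned in the abstract typically reports each scale's internal consistency as Cronbach's alpha. A minimal sketch of that computation on hypothetical item responses (the questionnaire's actual data are not reproduced here):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of scale total)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Hypothetical scale: 5 items driven by one latent concept plus item noise,
# so the items should hang together and alpha should be high.
rng = np.random.default_rng(0)
latent = rng.standard_normal(500)
items = latent[:, None] + 0.5 * rng.standard_normal((500, 5))
alpha = cronbach_alpha(items)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.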
Diana, Esther
2008-01-01
The scientific collections of the Florentine Santa Maria Nuova Hospital stimulated new interest in the second half of the eighteenth century. Indeed, the modernization process of the Hospital led to a steadily increasing alienation of its rich historical heritage, including the scientific collections. Archival documents attest to the sale, or the museum valorization, of a number of collections, including the mathematical instruments and the anatomical, surgical, and obstetrical wax collections.
Teaching Modern Dance: A Conceptual Approach
ERIC Educational Resources Information Center
Enghauser, Rebecca Gose
2008-01-01
A conceptual approach to teaching modern dance can broaden the awareness and deepen the understanding of modern dance in the educational arena in general, and in dance education specifically. This article describes a unique program that dance teachers can use to introduce modern dance to novice dancers, as well as more experienced dancers,…
Cervical shaping in curved root canals: comparison of the efficiency of two endodontic instruments.
Busquim, Sandra Soares Kühne; dos Santos, Marcelo
2002-01-01
The aim of this study was to determine the removal of dentin produced by number 25 (0.08) Flare files (Quantec Flare Series, Analytic Endodontics, Glendora, California, USA) and numbers 1 and 2 Gates-Glidden burs (Dentsply - Maillefer, Ballaigues, Switzerland), in the mesio-buccal and mesio-lingual root canals, respectively, of extracted human permanent inferior molars, by measuring the width of the dentinal walls prior to and after instrumentation. The obtained values were compared. Because multiple analyses of the data were required, a nonparametric test, the Kruskal-Wallis test, was chosen. There was no significant difference between the instruments as to the removal of dentin in the 1st and 2nd millimeters. However, when comparing the performances of the instruments in the 3rd millimeter, Flare files promoted a greater removal than Gates-Glidden drills (p < 0.05). The analysis revealed no significant differences as to mesial wear, which demonstrates the similar behavior of both instruments. Gates-Glidden drills produced an expressive mesial deviation in the 2nd and 3rd millimeters, which was detected through a statistically significant difference in the wear of this region (p < 0.05). There was no statistically significant difference between mesial and lateral wear when Flare instruments were employed.
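The Kruskal-Wallis test used in the study above is computed directly from pooled ranks. A sketch with hypothetical wear measurements (not the study's data); for two groups, H is compared against the chi-square critical value 3.841 (df = 1, alpha = 0.05):

```python
import numpy as np

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; data assumed tie-free)."""
    data = np.concatenate(groups)
    n_total = data.size
    ranks = np.empty(n_total)
    ranks[np.argsort(data)] = np.arange(1, n_total + 1)
    h = 0.0
    start = 0
    for g in groups:
        r = ranks[start:start + g.size]
        h += g.size * (r.mean() - (n_total + 1) / 2) ** 2
        start += g.size
    return 12.0 / (n_total * (n_total + 1)) * h

# Hypothetical dentin-wear values (mm) for two instruments.
flare = np.array([0.42, 0.45, 0.39, 0.47, 0.44, 0.41])
gates = np.array([0.31, 0.29, 0.33, 0.35, 0.30, 0.28])
h = kruskal_h(flare, gates)
significant = h > 3.841   # chi-square critical value, df = 1, alpha = 0.05
```

With the two samples fully separated in rank, H works out to 108/13 ≈ 8.31, well above the critical value, so the illustrative difference would be declared significant.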
Modern monitoring with preventive role for a production capacity
NASA Astrophysics Data System (ADS)
Tomescu, Cristian; Lupu, Constantin; Szollosi-Mota, Andrei; Rădoi, Florin; Chiuzan, Emeric
2016-10-01
In the process of coal exploitation, the appearance of the phenomenon of spontaneous combustion represents a risk factor with both subjective and objective causes, which requires the development of appropriate prevention methods. In order to control the risk, early intervention solutions with a preventive function are drawn up, consisting of direct and indirect measurement of the working environment, of the temperature of the coal massif, and of the concentrations of the gases O2, CO2, and CO. The monitoring instruments that fall within the modern concept of proactive anticipation are thermography, applied in coal exploitation, and gas chromatography for the analysis of the collected air. The drawing up of thermal maps on the basis of the thermograms, together with the analysis of the resulting chromatograms, forms the basis for assessing and treating the spontaneous combustion risk, and is discussed in this work.
CTIO IR Instruments - Infrared Imaging: ANDICAM - Ohio State Visual/IR Imager (on SMARTS 1.3m Telescope); OSIRIS - The Ohio State
Khodarahimi, Siamak
2009-10-01
The significance of dreams has been explained in psychoanalysis, depth psychology and gestalt therapy. There are many guidelines in analytic psychology for dream interpretation and integration in clinical practice. The present study, based on the Jungian analytic model, incorporated dreams as an instrument for assessment of aetiology, the psychotherapy process and the outcome of treatment for social phobia within a clinical case study. This case study describes the use of dream analysis in treating a female youth with social phobia. The present findings supported the efficiency of the three-stage paradigm for dream work in the Jungian model within a clinical setting, i.e. written details, reassembly with amplification, and assimilation. It was indicated that childhood and infantile traumatic events, malfunctions of psychosexual development, and inefficient coping skills for solving current life events were expressed in the patient's dreams. Dreams can reflect a patient's aetiology, needs, illness prognosis and psychotherapy outcome. Dreams are an instrument for the diagnosis, research and treatment of mental disturbances in a clinical setting.
Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes
Nord, G.L.
1982-01-01
Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, the microanalytical techniques of X-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criterion"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV in the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ±1.5 mol% endmember. © 1982.
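Peak-ratio quantification of the kind described is commonly written in the Cliff-Lorimer thin-film form C_A/C_B = k_AB · (I_A/I_B); whether the USGS procedure used exactly this parameterization is not stated in the abstract, and the intensities and k-factor below are purely illustrative:

```python
def cliff_lorimer_ratio(i_a, i_b, k_ab):
    """Cliff-Lorimer thin-film approximation: C_A / C_B = k_AB * (I_A / I_B)."""
    return k_ab * i_a / i_b

# Hypothetical background-subtracted peak intensities and k-factor.
r = cliff_lorimer_ratio(i_a=1200.0, i_b=1500.0, k_ab=1.1)

# For a two-component analysis, normalize so that C_A + C_B = 1.
c_a = r / (1.0 + r)
c_b = 1.0 - c_a
```

The ratio form is what makes the "thin-foil criterion" matter: it is valid only while absorption corrections to both peaks can be neglected, which is why the abstract's maximum-thickness limits are tabulated per element ratio.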
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovesdi, C.; Joe, J.; Boring, R.
The primary objective of the United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to sustain operation of the existing commercial nuclear power plants (NPPs) through a multi-pathway approach in conducting research and development (R&D). The Advanced Instrumentation, Information, and Control (II&C) System Technologies pathway conducts targeted R&D to address aging and reliability concerns with legacy instrumentation and control (I&C) and other information systems in existing U.S. NPPs. Control room modernization is an important part following this pathway, and human factors experts at Idaho National Laboratory (INL) have been involved in conducting R&D to support migration of new digital main control room (MCR) technologies from legacy analog and legacy digital I&C. This paper describes a human factors engineering (HFE) process that supports human-system interface (HSI) design in MCR modernization activities, particularly with migration of old digital to new digital I&C. The process described in this work is an expansion from the LWRS Report INL/EXT-16-38576, and is a requirements-driven approach that aligns with NUREG-0711 requirements. The work described builds upon the existing literature by adding more detail around key tasks and decisions to make when transitioning from HSI Design into Verification and Validation (V&V). The overall objective of this process is to inform HSI design and elicit specific, measurable, and achievable human factors criteria for new digital technologies. Upon following this process, utilities should have greater confidence with transitioning from HSI design into V&V.
Basic principles of flight test instrumentation engineering, volume 1, issue 2
NASA Technical Reports Server (NTRS)
Borek, Robert W., Sr. (Editor); Pool, A. (Editor)
1994-01-01
Volume 1 of the AG 300 series on 'Flight Test Instrumentation' gives a general introduction to the basic principles of flight test instrumentation. The other volumes in the series provide more detailed treatments of selected topics on flight test instrumentation. Volume 1, first published in 1974, has been used extensively as an introduction for instrumentation courses and symposia, as well as being a reference work on the desk of most flight test and instrumentation engineers. It is hoped that this second edition, fully revised, will be used with as much enthusiasm as the first edition. In this edition a flight test system is considered to include both the data collection and data processing systems. In order to obtain an optimal data flow, the overall design of these two subsystems must be carefully matched; the detail development and the operation may have to be done by separate groups of specialists. The main emphasis is on the large automated instrumentation systems used for the initial flight testing of modern military and civil aircraft, because many of the problems discussed here are more critical there. It does not imply, however, that smaller systems with manual data processing are no longer used. In general, the systems should be designed to provide the required results at the lowest possible cost. For many tests which require only a few parameters, relatively simple systems are justified, especially if no complex equipment is available to the user. Although many of the aspects discussed in this volume apply to both small and large systems, aspects of the smaller systems are mentioned only when they are of special interest. The volume has been divided into three main parts. Part 1 defines the main starting points for the design of a flight test instrumentation system, as seen from the points of view of the flight test engineer and the instrumentation engineer. In Part 2 the discussion is concentrated on those aspects which apply
Longitudinal flying qualities criteria for single-pilot instrument flight operations
NASA Technical Reports Server (NTRS)
Stengel, R. F.; Bar-Gill, A.
1983-01-01
Modern estimation and control theory, flight testing, and statistical analysis were used to deduce flying qualities criteria for General Aviation Single Pilot Instrument Flight Rule (SPIFR) operations. The principal concern is that unsatisfactory aircraft dynamic response combined with high navigation/communication workload can produce problems of safety and efficiency. To alleviate these problems, the relative importance of these factors had to be determined. This objective was achieved by flying SPIFR tasks with different aircraft dynamic configurations and assessing the effects of such variations under these conditions. The experimental results yielded quantitative indicators of pilot performance and workload, and for each of them, multivariate regression was applied to evaluate several candidate flying qualities criteria.
Subminiaturization for ERAST instrumentation (Environmental Research Aircraft and Sensor Technology)
NASA Technical Reports Server (NTRS)
Madou, Marc; Lowenstein, Max; Wegener, Steven
1995-01-01
We are focusing on Argus as an example to demonstrate our philosophy on the miniaturization of airborne analytical instruments for the study of atmospheric chemistry. Argus is a two-channel, tunable-diode laser absorption spectrometer developed at NASA for the measurement of nitrous oxide (N2O) (4.5 micrometers) and methane (CH4) (3.3 micrometers) at the 0.1 parts per billion (ppb) level from the Perseus aircraft platform at altitudes up to 30 km. Although Argus' mass is down to 23 kg from the 197 kg of Atlas, its predecessor, our goal is to design a next-generation subminiaturized instrument weighing less than 1 kg, measuring a few cm(exp 3), and able to eliminate dewars for cooling. Current designs enable us to make a small, inexpensive, monolithic spectrometer, though not yet with the required sensitivity range. Further work is under way to increase sensitivity. We are continuing to zero-base the technical approach in terms of the specifications for the given instrument. We are establishing a checklist of questions to home in on the best micromachining approach, and superposing on the answers insights into scaling laws and flexible engineering designs to enable more relaxed tolerances for the smallest of the components.
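The sensitivity of an absorption spectrometer like Argus is governed by the Beer-Lambert law: the detected fractional absorption scales with the absorption cross-section, the absorber number density, and the optical path length. A sketch with illustrative numbers (the cross-section, air density, mixing ratio, and path length below are assumptions, not Argus specifications):

```python
import math

def transmittance(sigma_cm2, number_density_cm3, path_cm):
    """Beer-Lambert law: T = exp(-sigma * N * L)."""
    return math.exp(-sigma_cm2 * number_density_cm3 * path_cm)

# Illustrative trace-gas scenario: 100 ppb of absorber in air at a
# near-surface density of 2.5e19 molecules/cm^3, 10 m effective path.
n_air = 2.5e19                      # molecules/cm^3 (illustrative)
mixing_ratio = 100e-9               # 100 ppb (illustrative)
n_gas = n_air * mixing_ratio
t = transmittance(1e-19, n_gas, 1000.0)
absorption = 1.0 - t                # fractional absorption, ~2.5e-4
```

Fractional absorptions this small are why such instruments rely on long multipass cells and low-noise detection, and why shrinking the optical path in a subminiaturized design directly attacks the sensitivity budget.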
Trends & Controversies: Sociocultural Predictive Analytics and Terrorism Deterrence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; McGrath, Liam R.
2011-08-12
The use of predictive analytics to model terrorist rhetoric is highly instrumental in developing a strategy to deter terrorism. Traditional (e.g. Cold-War) deterrence methods are ineffective with terrorist groups such as al Qaida. Terrorists typically regard the prospect of death or loss of property as acceptable consequences of their struggle. Deterrence by threat of punishment is therefore fruitless. On the other hand, isolating terrorists from the community that may sympathize with their cause can have a decisive deterring outcome. Without the moral backing of a supportive audience, terrorism cannot be successfully framed as a justifiable political strategy and recruiting is curtailed. Ultimately, terrorism deterrence is more effectively enforced by exerting influence to neutralize the communicative reach of terrorists.
Understanding Business Analytics
2015-01-05
analytics have been used in organizations for a variety of reasons for quite some time; ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in
Two Approaches in the Lunar Libration Theory: Analytical vs. Numerical Methods
NASA Astrophysics Data System (ADS)
Petrova, Natalia; Zagidullin, Arthur; Nefediev, Yurii; Kosulin, Valerii
2016-10-01
Observation of the physical libration of the Moon and other celestial bodies is one of the astronomical methods for remotely evaluating the internal structure of a celestial body without resorting to expensive space experiments. A review of the results obtained from physical libration studies is presented in the report. The main emphasis is placed on the description of successful lunar laser ranging for libration determination and on methods of simulating the physical libration. As a result, estimates of the viscoelastic and dissipative properties of the lunar body and of the lunar core parameters were obtained. The core's existence was confirmed by the recent reprocessing of seismic data from the Apollo missions. Attention is paid to the physical interpretation of the phenomenon of free libration and to methods of its determination. A significant part of the report is devoted to describing the practical application of the most accurate analytical tables of lunar libration to date, built by comprehensive analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421 [1]. In general, the basic outline of the report reflects the effectiveness of the two approaches in libration theory - the numerical and the analytical solution. It is shown that the two approaches complement each other in studying the Moon in different aspects: the numerical approach provides the high accuracy of the theory necessary for adequate treatment of modern high-accuracy observations, while the analytic approach allows one to see the essence of the various manifestations in the lunar rotation, and to predict and interpret new effects in observations of the physical libration [2]. [1] Rambaux, N., J. G. Williams, 2011, The Moon's physical librations and determination of their free modes, Celest. Mech. Dyn. Astron., 109, 85-100. [2] Petrova N., A. Zagidullin, Yu. Nefediev. Analysis of long-periodic variations of lunar libration parameters on the basis of
LISA Pathfinder Instrument Data Analysis
NASA Technical Reports Server (NTRS)
Guzman, Felipe
2010-01-01
LISA Pathfinder (LPF) is an ESA-launched demonstration mission of key technologies required for the joint NASA-ESA gravitational wave observatory in space, LISA. As part of the LPF interferometry investigations, analytic models of noise sources and corresponding noise subtraction techniques have been developed to correct for effects like the coupling of test mass jitter into displacement readout, and fluctuations of the laser frequency or optical pathlength difference. Ground testing of pre-flight hardware of the Optical Metrology subsystem is currently ongoing at the Albert Einstein Institute Hannover. In collaboration with NASA Goddard Space Flight Center, the LPF mission data analysis tool LTPDA is being used to analyze the data product of these tests. Furthermore, the noise subtraction techniques and in-flight experiment runs for noise characterization are being defined as part of the mission experiment master plan. We will present the data analysis outcome of preflight hardware ground tests and possible noise subtraction strategies for in-flight instrument operations.
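One of the noise subtraction techniques described, correcting for the coupling of test mass jitter into the displacement readout, can be sketched as a linear coherent subtraction: estimate the coupling coefficient by least squares against the measured jitter channel, then remove the projected contribution. The coupling value, noise levels, and signal below are illustrative assumptions, not LPF parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
t = np.arange(n)
jitter = rng.standard_normal(n)                 # measured test-mass jitter channel
signal = 1e-3 * np.sin(2 * np.pi * 0.01 * t)    # displacement of interest
# Readout = signal + jitter coupling + sensing noise (all values illustrative).
readout = signal + 0.02 * jitter + 1e-4 * rng.standard_normal(n)

# Estimate the jitter-to-readout coupling coefficient by least squares,
# then subtract the projected jitter contribution from the readout.
k_hat = (jitter @ readout) / (jitter @ jitter)
cleaned = readout - k_hat * jitter
```

After subtraction, the residual is dominated by the signal and sensing noise rather than the jitter coupling, which is the point of characterizing such couplings with dedicated in-flight experiment runs.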
Analyticity without Differentiability
ERIC Educational Resources Information Center
Kirillova, Evgenia; Spindler, Karlheinz
2008-01-01
In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…