NHEXAS PHASE I REGION 5 STUDY--METALS IN DUST ANALYTICAL RESULTS
This data set includes analytical results for measurements of metals in 1,906 dust samples. Dust samples were collected to assess potential residential sources of dermal and inhalation exposures and to examine relationships between analyte levels in dust and in personal and bioma...
NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKES
This data set includes analytical results for measurements of metals in 49 field control samples (spikes). Measurements were made for up to 11 metals in samples of water, blood, and urine. Field controls were used to assess recovery of target analytes from a sample media during s...
High throughput liquid absorption preconcentrator sampling instrument
Zaromb, Solomon; Bozen, Ralph M.
1992-01-01
A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container comprising a wettable material. The wettable material is continuously wetted with an analyte-sorbing liquid which flows from one part of the container to a lower end. Sampled air flows through the container in contact with the wetted material with a swirling motion which results in efficient transfer of analyte vapors or aerosol particles to the sorbing liquid and preconcentration of traces of analyte in the liquid. The preconcentrated traces of analyte may be either detected within the container or removed therefrom for injection into a separate detection means or for subsequent analysis.
High-throughput liquid-absorption preconcentrator sampling methods
Zaromb, Solomon
1994-01-01
A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container comprising a wettable material. The wettable material is continuously wetted with an analyte-sorbing liquid which flows from one part of the container to a lower end. Sampled air flows through the container in contact with the wetted material with a swirling motion which results in efficient transfer of analyte vapors or aerosol particles to the sorbing liquid and preconcentration of traces of analyte in the liquid. The preconcentrated traces of analyte may be either detected within the container or removed therefrom for injection into a separate detection means or for subsequent analysis.
High throughput liquid absorption preconcentrator sampling instrument
Zaromb, S.; Bozen, R.M.
1992-12-22
A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container comprising a wettable material. The wettable material is continuously wetted with an analyte-sorbing liquid which flows from one part of the container to a lower end. Sampled air flows through the container in contact with the wetted material with a swirling motion which results in efficient transfer of analyte vapors or aerosol particles to the sorbing liquid and preconcentration of traces of analyte in the liquid. The preconcentrated traces of analyte may be either detected within the container or removed therefrom for injection into a separate detection means or for subsequent analysis. 12 figs.
High-throughput liquid-absorption preconcentrator sampling methods
Zaromb, S.
1994-07-12
A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container comprising a wettable material. The wettable material is continuously wetted with an analyte-sorbing liquid which flows from one part of the container to a lower end. Sampled air flows through the container in contact with the wetted material with a swirling motion which results in efficient transfer of analyte vapors or aerosol particles to the sorbing liquid and preconcentration of traces of analyte in the liquid. The preconcentrated traces of analyte may be either detected within the container or removed therefrom for injection into a separate detection means or for subsequent analysis. 12 figs.
A Modern Approach to College Analytical Chemistry.
ERIC Educational Resources Information Center
Neman, R. L.
1983-01-01
Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-Iaunching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and provide interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...
NHEXAS PHASE I REGION 5 STUDY--METALS IN BLOOD ANALYTICAL RESULTS
This data set includes analytical results for measurements of metals in 165 blood samples. These samples were collected to examine the relationships between personal exposure measurements, environmental measurements, and body burden. Venous blood samples were collected by venipun...
NHEXAS PHASE I REGION 5 STUDY--VOCS IN BLOOD ANALYTICAL RESULTS
This data set includes analytical results for measurements of VOCs (volatile organic compounds) in 145 blood samples. These samples were collected to examine the relationships between personal exposure measurements, environmental measurements, and body burden. Venous blood sample...
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This report provides a detailed summary of the activities carried out to sample groundwater at Waste Area Grouping (WAG) 6. The analytical results for samples collected during Phase 1, Activity 2 of the WAG 6 Resource Conservation and Recovery Act Facility Investigation (RFI) are also presented. In addition, analytical results for Phase 1, activity sampling events for which data were not previously reported are included in this TM. A summary of the groundwater sampling activities of WAG 6, to date, are given in the Introduction. The Methodology section describes the sampling procedures and analytical parameters. Six attachments are included. Attachmentsmore » 1 and 2 provide analytical results for selected RFI groundwater samples and ORNL sampling event. Attachment 3 provides a summary of the contaminants detected in each well sampled for all sampling events conducted at WAG 6. Bechtel National Inc. (BNI)/IT Corporation Contract Laboratory (IT) RFI analytical methods and detection limits are given in Attachment 4. Attachment 5 provides the Oak Ridge National Laboratory (ORNL)/Analytical Chemistry Division (ACD) analytical methods and detection limits and Resource Conservation and Recovery Act (RCRA) quarterly compliance monitoring (1988--1989). Attachment 6 provides ORNL/ACD groundwater analytical methods and detection limits (for the 1990 RCRA semi-annual compliance monitoring).« less
Casement, Ann
2014-02-01
The Jungian analyst Gerhard Adler left Berlin and re-settled in London in 1936. He was closely involved with the professionalization of analytical psychology internationally and in the UK, including the formation of the International Association for Analytical Psychology (IAAP) and The Society of Analytical Psychology (SAP).The tensions that arose within the latter organization led to a split that ended in the formation of the Association of Jungian Analysts (AJA). A further split at AJA resulted in the creation of another organization, the Independent Group of Analytical Psychologists (IGAP). Adler's extensive publications include his role as an editor of Jung's Collected Works and as editor of the C.G. Jung Letters. © 2014, The Society of Analytical Psychology.
NASA Astrophysics Data System (ADS)
Cucu, Daniela; Woods, Mike
2008-08-01
The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner. They shall include both quality control and quality assurance measures. When designing the quality assurance programme a laboratory should consider pre-analytical activities (like personnel training, selection and validation of test methods, qualifying equipment), analytical activities ranging from sampling, sample preparation, instrumental analysis and post-analytical activities (like decoding, calculation, use of statistical tests or packages, management of results). Designed on different levels (analyst, quality manager and technical manager), including a variety of measures, the programme shall ensure the validity and accuracy of test results, the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and last but not least show the comparability of test results. Laboratory management should establish performance targets and review periodically QC/QA results against them, implementing appropriate measures in case of non-compliance.
Ichihara, Kiyoshi; Ceriotti, Ferruccio; Tam, Tran Huu; Sueyoshi, Shigeo; Poon, Priscilla M K; Thong, Mee Ling; Higashiuesato, Yasushi; Wang, Xuejing; Kataoka, Hiromi; Matsubara, Akemi; Shiesh, Shu-Chu; Muliaty, Dewi; Kim, Jeong-Ho; Watanabe, Masakazu; Lam, Christopher W K; Siekmann, Lothar; Lopez, Joseph B; Panteghini, Mauro
2013-07-01
A multicenter study conducted in Southeast Asia to derive reference intervals (RIs) for 72 commonly measured analytes (general chemistry, inflammatory markers, hormones, etc.) featured centralized measurement to clearly detect regionality in test results. The results of 31 standardized analytes are reported, with the remaining analytes presented in the next report. The study included 63 clinical laboratories from South Korea, China, Vietnam, Malaysia, Indonesia, and seven areas in Japan. A total of 3541 healthy individuals aged 20-65 years (Japan 2082, others 1459) were recruited mostly from hospital workers using a well-defined common protocol. All serum specimens were transported to Tokyo at -80°C and collectively measured using reagents from four manufacturers. Three-level nested ANOVA was used to quantitate variation (SD) of test results due to region, sex, and age. A ratio of SD for a given factor over residual SD (representing net between-individual variations) (SDR) exceeding 0.3 was considered significant. Traceability of RIs was ensured by recalibration using value-assigned reference materials. RIs were derived parametrically. SDRs for sex and age were significant for 19 and 16 analytes, respectively. Regional difference was significant for 11 analytes, including high density lipoprotein (HDL)-cholesterol and inflammatory markers. However, when the data were limited to those from Japan, regionality was not observed in any of the analytes. Accordingly, RIs were derived with or without partition by sex and region. RIs applicable to a wide area in Asia were established for the majority of analytes with traceability to reference measuring systems, whereas regional partitioning was required for RIs of the other analytes.
Analytical and experimental study of axisymmetric truncated plug nozzle flow fields
NASA Technical Reports Server (NTRS)
Muller, T. J.; Sule, W. P.; Fanning, A. E.; Giel, T. V.; Galanga, F. L.
1972-01-01
Experimental and analytical investigation of the flow field and base pressure of internal-external-expansion truncated plug nozzles are discussed. Experimental results for two axisymmetric, conical plug-cylindrical shroud, truncated plug nozzles are presented for both open and closed wake operations. These results include extensive optical and pressure data covering nozzle flow field and base pressure characteristics, diffuser effects, lip shock strength, Mach disc behaviour, and the recompression and reverse flow regions. Transonic experiments for a special planar transonic section are presented. An extension of the analytical method of Hall and Mueller to include the internal shock wave from the shroud exit is presented for closed wake operation. Results of this analysis include effects on the flow field and base pressure of ambient pressure ratio, nozzle geometry, and the ratio of specific heats. Static thrust is presented as a function of ambient pressure ratio and nozzle geometry. A new transonic solution method is also presented.
The Case for Adopting Server-side Analytics
NASA Astrophysics Data System (ADS)
Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.
2017-12-01
The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher who in turn locally stores the data for analysis. The analyses tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier to transport to and store locally by the user and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is both often required with these data sets and which drives much of the throughput challenges. NASA's Big Data Task Force studied this issue. This paper will present the results of this study including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for future applications.
NHEXAS PHASE I ARIZONA STUDY--METALS IN SOIL ANALYTICAL RESULTS
The Metals in Soil data set contains analytical results for measurements of up to 11 metals in 551 soil samples over 392 households. Samples were taken by collecting surface soil in the yard and next to the foundation from each residence. The primary metals of interest include ...
NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR VOCS IN BLANKS
This data set includes analytical results for measurements of VOCs in 88 blank samples. Measurements were made for up to 23 VOCs in blank samples of air, water, and blood. Blank samples were used to assess the potential for sample contamination during collection, storage, shipmen...
NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR VOCS IN REPLICATES
This data set includes analytical results for measurements of VOCs in 204 duplicate (replicate) samples. Measurements were made for up to 23 VOCs in samples of air, water, and blood. Duplicate samples (samples collected along with or next to the original samples) were collected t...
NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN REPLICATES
This data set includes analytical results for measurements of metals in 490 duplicate (replicate) samples and for particles in 130 duplicate samples. Measurements were made for up to 11 metals in samples of air, dust, water, blood, and urine. Duplicate samples (samples collected ...
NASA Astrophysics Data System (ADS)
Wasser, L. A.; Gold, A. U.
2017-12-01
There is a deluge of earth systems data available to address cutting edge science problems yet specific skills are required to work with these data. The Earth analytics education program, a core component of Earth Lab at the University of Colorado - Boulder - is building a data intensive program that provides training in realms including 1) interdisciplinary communication and collaboration 2) earth science domain knowledge including geospatial science and remote sensing and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses and a professional certificate / degree program. All programs share the goals of preparing a STEM workforce for successful earth analytics driven careers. We are developing an program-wide evaluation framework that assesses the effectiveness of data intensive instruction combined with domain science learning to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted online search engine optimization (SEO) to increase visibility and in turn program reach. Finally our design targets longitudinal program impacts on participant career tracts over time.. Here we present results from evaluation of both an interdisciplinary undergrad / graduate level earth analytics course and and undergraduate internship. Early results suggest that a blended approach to learning and teaching that includes both synchronous in-person teaching and active classroom hands-on learning combined with asynchronous learning in the form of online materials lead to student success. Further we will present our model for longitudinal tracking of participant's career focus overtime to better understand long-term program impacts. We also demonstrate the impact of SEO optimization on online content reach and program visibility.
Analytical treatment of self-phase-modulation beyond the slowly varying envelope approximation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syrchin, M.S.; Zheltikov, A.M.; International Laser Center, M.V. Lomonosov Moscow State University, 119899 Moscow
Analytical treatment of the self-phase-modulation of an ultrashort light pulse is extended beyond the slowly varying envelope approximation. The resulting wave equation is modified to include corrections to self-phase-modulation due to higher-order spatial and temporal derivatives. Analytical solutions are found in the limiting regimes of high nonlinearities and very short pulses. Our results reveal features that can significantly impact both pulse shape and the evolution of the phase.
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion Big data analytics is able to address –omic and EHR data challenges for paradigm shift towards precision medicine. Significance Big data analytics makes sense of –omic and EHR data to improve healthcare outcome. It has long lasting societal impact. PMID:27740470
NASA Technical Reports Server (NTRS)
Sadunas, J. A.; French, E. P.; Sexton, H.
1973-01-01
A 1/25 scale model S-2 stage base region thermal environment test is presented. Analytical results are included which reflect the effect of engine operating conditions, model scale, turbo-pump exhaust gas injection on base region thermal environment. Comparisons are made between full scale flight data, model test data, and analytical results. The report is prepared in two volumes. The description of analytical predictions and comparisons with flight data are presented. Tabulation of the test data is provided.
ICDA: A Platform for Intelligent Care Delivery Analytics
Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei
2012-01-01
The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA’s architecture is provided. Descriptions of four use cases are included to illustrate ICDA’s application within two different data environments. These use cases showcase the system’s flexibility and exemplify the types of analytics it enables. PMID:23304296
Evaluation of the Effect of Exhausts from Liquid and Solid Rockets on Ozone Layer
NASA Astrophysics Data System (ADS)
Yamagiwa, Yoshiki; Ishimaki, Tetsuya
This paper reports the analytical results of the influences of solid rocket and liquid rocket exhausts on ozone layer. It is worried about that the exhausts from solid propellant rockets cause the ozone depletion in the ozone layer. Some researchers try to develop the analytical model of ozone depletion by rocket exhausts to understand its physical phenomena and to find the effective design of rocket to minimize its effect. However, these models do not include the exhausts from liquid rocket although there are many cases to use solid rocket boosters with a liquid rocket at the same time in practical situations. We constructed combined analytical model include the solid rocket exhausts and liquid rocket exhausts to analyze their effects. From the analytical results, we find that the exhausts from liquid rocket suppress the ozone depletion by solid rocket exhausts.
U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN SOIL ANALYTICAL RESULTS
The Metals in Soil data set contains analytical results for measurements of up to 11 metals in 91 soil samples over 91 households. Samples were taken by collecting surface soil in the yard of each residence. The primary metals of interest include lead (CAS# 7439-92-1), arsenic ...
NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN BLANKS
This data set includes analytical results for measurements of metals in 205 blank samples and for particles in 64 blank samples. Measurements were made for up to 12 metals in blank samples of air, dust, soil, water, food and beverages, blood, hair, and urine. Blank samples were u...
NASA Astrophysics Data System (ADS)
Yahya, W. A.; Falaye, B. J.; Oluwadare, O. J.; Oyewumi, K. J.
2013-08-01
By using the Nikiforov-Uvarov method, we give the approximate analytical solutions of the Dirac equation with the shifted Deng-Fan potential including the Yukawa-like tensor interaction under the spin and pseudospin symmetry conditions. After using an improved approximation scheme, we solved the resulting schr\\"{o}dinger-like equation analytically. Numerical results of the energy eigenvalues are also obtained, as expected, the tensor interaction removes degeneracies between spin and pseudospin doublets.
Accuracy of trace element determinations in alternate fuels
NASA Technical Reports Server (NTRS)
Greenbauer-Seng, L. A.
1980-01-01
A review of the techniques used at Lewis Research Center (LeRC) in trace metals analysis is presented, including the results of Atomic Absorption Spectrometry and DC Arc Emission Spectrometry of blank levels and recovery experiments for several metals. The design of an Interlaboratory Study conducted by LeRC is presented. Several factors were investigated, including: laboratory, analytical technique, fuel type, concentration, and ashing additive. Conclusions drawn from the statistical analysis will help direct research efforts toward those areas most responsible for the poor interlaboratory analytical results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, B.L.; Pool, K.H.; Evans, J.C.
1997-01-01
This report describes the analytical results of vapor samples taken from the headspace of waste storage tank 241-BY-108 (Tank BY-108) at the Hanford Site in Washington State. The results described in this report is the second in a series comparing vapor sampling of the tank headspace using the Vapor Sampling System (VSS) and In Situ Vapor Sampling (ISVS) system without high efficiency particulate air (HEPA) prefiltration. The results include air concentrations of water (H{sub 2}O) and ammonia (NH{sub 3}), permanent gases, total non-methane organic compounds (TO-12), and individual organic analytes collected in SUMMA{trademark} canisters and on triple sorbent traps (TSTs).more » Samples were collected by Westinghouse Hanford Company (WHC) and analyzed by Pacific Northwest National Laboratory (PNNL). Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Analyte concentrations were based on analytical results and, where appropriate, sample volume measurements provided by WHC.« less
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description on the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, in a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.
Zhang, Xindi; Warren, Jim; Corter, Arden; Goodyear-Smith, Felicity
2016-01-01
This paper describes development of a prototype data analytics portal for analysis of accumulated screening results from eCHAT (electronic Case-finding and Help Assessment Tool). eCHAT allows individuals to conduct a self-administered lifestyle and mental health screening assessment, with usage to date chiefly in the context of primary care waiting rooms. The intention is for wide roll-out to primary care clinics, including secondary school based clinics, resulting in the accumulation of population-level data. Data from a field trial of eCHAT with sexual health questions tailored to youth were used to support design of a data analytics portal for population-level data. The design process included user personas and scenarios, screen prototyping and a simulator for generating large-scale data sets. The prototype demonstrates the promise of wide-scale self-administered screening data to support a range of users including practice managers, clinical directors and health policy analysts.
Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek
2016-01-15
In this study we perform ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental, by application of different weights to decision making criteria. All three scenarios indicate capillary electrophoresis-based procedure as the most preferable. Apart from that the details of ranking results differ for these three scenarios. The second run of rankings was done for scenarios that include metrological, economic and environmental criteria only, neglecting others. These results show that green analytical chemistry-based selection correlates with economic, while there is no correlation with metrological ones. This is an implication that green analytical chemistry can be brought into laboratories without analytical performance costs and it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Fuller, C. R.
1986-01-01
A simplified analytical model of transmission of noise into the interior of propeller-driven aircraft has been developed. The analysis includes directivity and relative phase effects of the propeller noise sources, and leads to a closed form solution for the coupled motion between the interior and exterior fields via the shell (fuselage) vibrational response. Various situations commonly encountered in considering sound transmission into aircraft fuselages are investigated analytically and the results obtained are compared to measurements in real aircraft. In general the model has proved successful in identifying basic mechanisms behind noise transmission phenomena.
Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A
2017-05-10
Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.
Multi-center evaluation of analytical performance of the Beckman Coulter AU5822 chemistry analyzer.
Zimmerman, M K; Friesen, L R; Nice, A; Vollmer, P A; Dockery, E A; Rankin, J D; Zmuda, K; Wong, S H
2015-09-01
Our three academic institutions, Indiana University, Northwestern Memorial Hospital, and Wake Forest, were among the first in the United States to implement the Beckman Coulter AU5822 series chemistry analyzers. We undertook this post-hoc multi-center study by merging our data to determine performance characteristics and the impact of methodology changes on analyte measurement. We independently completed performance validation studies including precision, linearity/analytical measurement range, method comparison, and reference range verification. Complete data sets were available from at least one institution for 66 analytes with the following groups: 51 from all three institutions, and 15 from 1 or 2 institutions for a total sample size of 12,064. Precision was similar among institutions. Coefficients of variation (CV) were <10% for 97%. Analytes with CVs >10% included direct bilirubin and digoxin. All analytes exhibited linearity over the analytical measurement range. Method comparison data showed slopes between 0.900-1.100 for 87.9% of the analytes. Slopes for amylase, tobramycin and urine amylase were <0.8; the slope for lipase was >1.5, due to known methodology or standardization differences. Consequently, reference ranges of amylase, urine amylase and lipase required only minor or no modification. The four AU5822 analyzers independently evaluated at three sites showed consistent precision, linearity, and correlation results. Since installations, the test results had been well received by clinicians from all three institutions. Copyright © 2015. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Dubois, George B; Ocvirk, Fred W
1953-01-01
An approximate analytical solution including the effect of end leakage from the oil film of short plain bearings is presented because of the importance of endwise flow in sleeve bearings of the short lengths commonly used. The analytical approximation is supported by experimental data, resulting in charts which facilitate analysis of short plain bearings. The analytical approximation includes the endwise flow and that part of the circumferential flow which is related to surface velocity and film thickness but neglects the effect of film pressure on the circumferential flow. In practical use, this approximation applies best to bearings having a length-diameter ratio up to 1, and the effects of elastic deflection, inlet oil pressure, and changes of clearance with temperature minimize the relative importance of the neglected term. The analytical approximation was found to be an extension of a little-known pressure-distribution function originally proposed by Michell and Cardullo.
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination ofmore » hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to -determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination ofmore » hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to -determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.« less
Magnetic fields for transporting charged beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parzen, G.
1976-01-01
The transport of charged particle beams requires magnetic fields that must be shaped correctly and very accurately. During the last 20 years or so, many studies have been made, both analytically and through the use of computer programs, of various magnetic shapes that have proved to be useful. Many of the results for magnetic field shapes can be applied equally well to electric field shapes. A report is given which gathers together the results that have more general significance and would be useful in designing a configuration to produce a desired magnetic field shape. The field shapes studied include themore » fields in dipoles, quadrupoles, sextupoles, octupoles, septum magnets, combined-function magnets, and electrostatic septums. Where possible, empirical formulas are proposed, based on computer and analytical studies and on magnetic field measurements. These empirical formulas are often easier to use than analytical formulas and often include effects that are difficult to compute analytically. In addition, results given in the form of tables and graphs serve as illustrative examples. The field shapes studied include uniform fields produced by window-frame magnets, C-magnets, H-magnets, and cosine magnets; linear fields produced by various types of quadrupoles; quadratic and cubic fields produced by sextupoles and octupoles; combinations of uniform and linear fields; and septum fields with sharp boundaries.« less
CREATE-IP and CREATE-V: Data and Services Update
NASA Astrophysics Data System (ADS)
Carriere, L.; Potter, G. L.; Hertz, J.; Peters, J.; Maxwell, T. P.; Strong, S.; Shute, J.; Shen, Y.; Duffy, D.
2017-12-01
The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center and the Earth System Grid Federation (ESGF) are working together to build a uniform environment for the comparative study and use of a group of reanalysis datasets of particular importance to the research community. This effort is called the Collaborative REAnalysis Technical Environment (CREATE) and it contains two components: the CREATE-Intercomparison Project (CREATE-IP) and CREATE-V. This year's efforts included generating and publishing an atmospheric reanalysis ensemble mean and spread and improving the analytics available through CREATE-V. Related activities included adding access to subsets of the reanalysis data through ArcGIS and expanding the visualization tool to GMAO forecast data. This poster will present the access mechanisms to this data and use cases including example Jupyter Notebook code. The reanalysis ensemble was generated using two methods, first using standard Python tools for regridding, extracting levels and creating the ensemble mean and spread on a virtual server in the NCCS environment. The second was using a new analytics software suite, the Earth Data Analytics Services (EDAS), coupled with a high-performance Data Analytics and Storage System (DASS) developed at the NCCS. Results were compared to validate the EDAS methodologies, and the results, including time to process, will be presented. The ensemble includes selected 6 hourly and monthly variables, regridded to 1.25 degrees, with 24 common levels used for the 3D variables. Use cases for the new data and services will be presented, including the use of EDAS for the backend analytics on CREATE-V, the use of the GMAO forecast aerosol and cloud data in CREATE-V, and the ability to connect CREATE-V data to NCCS ArcGIS services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, IIftekhar; Husain, Tausif; Uddin, Md Wasi
2015-08-24
This paper presents a nonlinear analytical model of a novel double-sided flux concentrating Transverse Flux Machine (TFM) based on the Magnetic Equivalent Circuit (MEC) model. The analytical model uses a series-parallel combination of flux tubes to predict the flux paths through different parts of the machine including air gaps, permanent magnets, stator, and rotor. The two-dimensional MEC model approximates the complex three-dimensional flux paths of the TFM and includes the effects of magnetic saturation. The model is capable of adapting to any geometry that makes it a good alternative for evaluating prospective designs of TFM compared to finite element solversmore » that are numerically intensive and require more computation time. A single-phase, 1-kW, 400-rpm machine is analytically modeled, and its resulting flux distribution, no-load EMF, and torque are verified with finite element analysis. The results are found to be in agreement, with less than 5% error, while reducing the computation time by 25 times.« less
Analytical Modeling of a Novel Transverse Flux Machine for Direct Drive Wind Turbine Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, IIftekhar; Husain, Tausif; Uddin, Md Wasi
2015-09-02
This paper presents a nonlinear analytical model of a novel double sided flux concentrating Transverse Flux Machine (TFM) based on the Magnetic Equivalent Circuit (MEC) model. The analytical model uses a series-parallel combination of flux tubes to predict the flux paths through different parts of the machine including air gaps, permanent magnets (PM), stator, and rotor. The two-dimensional MEC model approximates the complex three-dimensional flux paths of the TFM and includes the effects of magnetic saturation. The model is capable of adapting to any geometry which makes it a good alternative for evaluating prospective designs of TFM as compared tomore » finite element solvers which are numerically intensive and require more computation time. A single phase, 1 kW, 400 rpm machine is analytically modeled and its resulting flux distribution, no-load EMF and torque, verified with Finite Element Analysis (FEA). The results are found to be in agreement with less than 5% error, while reducing the computation time by 25 times.« less
NASA Astrophysics Data System (ADS)
Zhu, Ting-Lei; Zhao, Chang-Yin; Zhang, Ming-Jiang
2017-04-01
This paper aims to obtain an analytic approximation to the evolution of circular orbits governed by the Earth's J2 and the luni-solar gravitational perturbations. Assuming that the lunar orbital plane coincides with the ecliptic plane, Allan and Cook (Proc. R. Soc. A, Math. Phys. Eng. Sci. 280(1380):97, 1964) derived an analytic solution to the orbital plane evolution of circular orbits. Using their result as an intermediate solution, we establish an approximate analytic model with lunar orbital inclination and its node regression be taken into account. Finally, an approximate analytic expression is derived, which is accurate compared to the numerical results except for the resonant cases when the period of the reference orbit approximately equals the integer multiples (especially 1 or 2 times) of lunar node regression period.
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1989-01-01
Chapter Al of the laboratory manual contains methods used by the U.S. Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, the total recoverable and total of constituents in water-suspended sediment samples, and the recoverable and total concentrations of constituents in samples of bottom material. The introduction to the manual includes essential definitions and a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including the accuracy and precision of analyses, the use of standard-reference water samples, and the operation of an effective quality-assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods of these techniques are arranged alphabetically by constituent. For each method, the general topics covered are the application, the principle of the method, the interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 126 methods are given for the determination of 70 inorganic constituents and physical properties of water, suspended sediment, and bottom material.
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1985-01-01
Chapter Al of the laboratory manual contains methods used by the Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, total recoverable and total of constituents in water-suspended sediment samples, and recoverable and total concentrations of constituents in samples of bottom material. Essential definitions are included in the introduction to the manual, along with a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including accuracy and precision of analyses, the use of standard reference water samples, and the operation of an effective quality assurance program. Methods for sample preparation and pretreatment are given also.A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods involving these techniques are arranged alphabetically according to constituent. For each method given, the general topics covered are application, principle of the method, interferences, apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 125 methods are given for the determination of 70 different inorganic constituents and physical properties of water, suspended sediment, and bottom material.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics
2016-01-01
Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
The National Shipbuilding Research Program. Environmental Studies and Testing (Phase V)
2000-11-20
development of an analytical procedure for toxic organic compounds, including TBT ( tributyltin ), whose turnaround time would be in the order of minutes...Cost of the Subtask was $20,000. Subtask #33 - Turnaround Analytical Method for TBT This Subtask performed a preliminary investigation leading to the...34Quick TBT Analytical Method" that will yield reliable results in 15 minutes, a veritable breakthrough in sampling technology. The Subtask was managed by
Analytical performance of a bronchial genomic classifier.
Hu, Zhanzhi; Whitney, Duncan; Anderson, Jessica R; Cao, Manqiu; Ho, Christine; Choi, Yoonha; Huang, Jing; Frink, Robert; Smith, Kate Porta; Monroe, Robert; Kennedy, Giulia C; Walsh, P Sean
2016-02-26
The current standard practice of lung lesion diagnosis often leads to inconclusive results, requiring additional diagnostic follow up procedures that are invasive and often unnecessary due to the high benign rate in such lesions (Chest 143:e78S-e92, 2013). The Percepta bronchial genomic classifier was developed and clinically validated to provide more accurate classification of lung nodules and lesions that are inconclusive by bronchoscopy, using bronchial brushing specimens (N Engl J Med 373:243-51, 2015, BMC Med Genomics 8:18, 2015). The analytical performance of the Percepta test is reported here. Analytical performance studies were designed to characterize the stability of RNA in bronchial brushing specimens during collection and shipment; analytical sensitivity defined as input RNA mass; analytical specificity (i.e. potentially interfering substances) as tested on blood and genomic DNA; and assay performance studies including intra-run, inter-run, and inter-laboratory reproducibility. RNA content within bronchial brushing specimens preserved in RNAprotect is stable for up to 20 days at 4 °C with no changes in RNA yield or integrity. Analytical sensitivity studies demonstrated tolerance to variation in RNA input (157 ng to 243 ng). Analytical specificity studies utilizing cancer positive and cancer negative samples mixed with either blood (up to 10 % input mass) or genomic DNA (up to 10 % input mass) demonstrated no assay interference. The test is reproducible from RNA extraction through to Percepta test result, including variation across operators, runs, reagent lots, and laboratories (standard deviation of 0.26 for scores on > 6 unit scale). Analytical sensitivity, analytical specificity and robustness of the Percepta test were successfully verified, supporting its suitability for clinical use.
Analytic barrage attack model. Final report, January 1986-January 1989
DOE Office of Scientific and Technical Information (OSTI.GOV)
St Ledger, J.W.; Naegeli, R.E.; Dowden, N.A.
An analytic model is developed for a nuclear barrage attack, assuming weapons with no aiming error and a cookie-cutter damage function. The model is then extended with approximations for the effects of aiming error and distance damage sigma. The final result is a fast running model which calculates probability of damage for a barrage attack. The probability of damage is accurate to within seven percent or better, for weapon reliabilities of 50 to 100 percent, distance damage sigmas of 0.5 or less, and zero to very large circular error probabilities. FORTRAN 77 coding is included in the report for the analytic model and for a numerical model used to check the analytic results.
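The report's FORTRAN 77 listings are not reproduced in this abstract. As a purely illustrative stand-in for the kind of numerical cross-check mentioned, the following is a minimal Monte Carlo sketch of barrage damage probability under a cookie-cutter damage function, with aiming error specified through a circular error probable (CEP); all function names and parameter values are hypothetical.

```python
import numpy as np

def barrage_damage_probability(aim_points, lethal_radius, cep=0.0,
                               reliability=1.0, n_trials=50_000, seed=0):
    """Monte Carlo estimate of the probability that a point target at the origin
    is damaged by a barrage with a cookie-cutter damage function.

    aim_points    : (n, 2) intended impact coordinates of the n weapons
    lethal_radius : target is damaged if any functioning weapon lands within it
    cep           : circular error probable of each weapon (0 => no aiming error)
    reliability   : probability that an individual weapon functions
    """
    rng = np.random.default_rng(seed)
    aim = np.asarray(aim_points, dtype=float)                     # (n, 2)
    sigma = cep / 1.1774                                          # CEP = 1.1774 * sigma
    impacts = aim + rng.normal(0.0, sigma, size=(n_trials,) + aim.shape)
    works = rng.random((n_trials, len(aim))) < reliability
    hit = np.hypot(impacts[..., 0], impacts[..., 1]) <= lethal_radius
    return np.mean(np.any(works & hit, axis=1))

# Example: a 3 x 3 barrage pattern on a unit grid centered on the target.
grid = [(x, y) for x in (-1, 0, 1) for y in (-1, 0, 1)]
print(barrage_damage_probability(grid, lethal_radius=0.6, cep=0.5, reliability=0.8))
```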
The Savannah River Site's Groundwater Monitoring Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes the Savannah River Site (SRS) groundwater monitoring program conducted during the first quarter of 1992. It includes the analytical data, field data, data review, quality control, and other documentation for this program; provides a record of the program's activities; and serves as an official document of the analytical results.
State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups
Plata, María Reyes; Contento, Ana María; Ríos, Angel
2010-01-01
(Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors in routine analytical work has been truncated to some extent, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field world-wide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the more representative achievements carried out for these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric or electro-mechanical devices, used for laboratory or field analyses. The capabilities to be used in different applied areas are also critically discussed. PMID:22319260
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1991-01-01
The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons are developed. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine sensitivity coefficients via the finite-difference approach for comparison purposes; and (5) a graphics package.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA has embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Analytical Ultrasonics in Materials Research and Testing
NASA Technical Reports Server (NTRS)
Vary, A.
1986-01-01
Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.
NASA Astrophysics Data System (ADS)
Mathias, Simon A.; Gluyas, Jon G.; GonzáLez MartíNez de Miguel, Gerardo J.; Hosseini, Seyyed A.
2011-12-01
This work extends an existing analytical solution for pressure buildup due to CO2 injection in brine aquifers by incorporating effects associated with partial miscibility. These include evaporation of water into the CO2-rich phase, dissolution of CO2 into brine, and salt precipitation. The resulting equations are closed-form, including the locations of the associated leading and trailing shock fronts. Derivation of the analytical solution involves making a number of simplifying assumptions, including vertical pressure equilibrium, negligible capillary pressure, and constant fluid properties. The analytical solution is compared to results from TOUGH2 and found to accurately approximate the extent of the dry-out zone around the well, the resulting permeability enhancement due to residual brine evaporation, the volumetric saturation of precipitated salt, and the vertically averaged pressure distribution in both space and time for the four scenarios studied. While brine evaporation is found to have a considerable effect on pressure, the effect of CO2 dissolution is found to be small. The resulting equations remain simple to evaluate in spreadsheet software and represent a significant improvement on current methods for estimating pressure-limited CO2 storage capacity.
Life cycle management of analytical methods.
Parr, Maria Kristina; Schmidt, Alexander H
2018-01-05
In modern process management, the life cycle concept gains more and more importance. It focuses on the total costs of the process from investment to operation and finally retirement. In recent years, an increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification, and method transfer) and finally retirement of the method. It appears that regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. Thereby, less effort is needed for method performance verification and post-approval changes, and the risk of method-related out-of-specification results is minimized. This strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.
Shaikh, M S; Moiz, B
2016-04-01
Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure provision of accurate and precise results. Six sigma is a statistical tool that provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps and hence areas of improvement in patient care. Twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. Acceptable sigma values of ≥3 were obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg achieved values of <3 for the level 1 (low abnormal) control. PT performed poorly on both level 1 and level 2 controls, with sigma values of <3. Despite acceptable conventional QC tools, application of sigma metrics can identify analytical deficits and hence prospects for improvement in clinical laboratories. © 2016 John Wiley & Sons Ltd.
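The abstract does not state the sigma formula used; the sketch below assumes the standard sigma-metric calculation from allowable total error (TEa), bias (from EQA surveys), and imprecision (CV from internal QC). The analyte values shown are hypothetical and not taken from the study.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric commonly computed from QC data: (TEa - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical values for illustration only (not from the study).
analytes = {
    "Hb":  {"tea": 7.0,  "bias": 1.2, "cv": 1.5},
    "Plt": {"tea": 25.0, "bias": 4.0, "cv": 8.0},
}
for name, v in analytes.items():
    s = sigma_metric(v["tea"], v["bias"], v["cv"])
    flag = "acceptable" if s >= 3 else "needs improvement"
    print(f"{name}: sigma = {s:.1f} ({flag})")
```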
Evaluation of Analytical Errors in a Clinical Chemistry Laboratory: A 3 Year Experience
Sakyi, AS; Laing, EF; Ephraim, RK; Asibey, OF; Sadique, OK
2015-01-01
Background: Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. Aim: We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including pre-, intra-, and post-analytical phases, and strategies pertinent to our settings to minimize their occurrence are discussed. Materials and Methods: We described the occurrence of pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). Results: A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests reduced significantly over the 3-year period, but this did not correspond to a reduction in the overall error rate (P = 0.90) over the years. Conclusion: Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified. PMID:25745569
NASA Technical Reports Server (NTRS)
Ambur, Damodar R.; Starnes, James H., Jr.; Prasad, Chunchu B.
1993-01-01
An analytical procedure is presented for determining the transient response of simply supported, rectangular laminated composite plates subjected to impact loads from airgun-propelled or dropped-weight impactors. A first-order shear-deformation theory is included in the analysis to represent properly any local short-wave-length transient bending response. The impact force is modeled as a locally distributed load with a cosine-cosine distribution. A double Fourier series expansion and the Timoshenko small-increment method are used to determine the contact force, out-of-plane deflections, and in-plane strains and stresses at any plate location due to an impact force at any plate location. The results of experimental and analytical studies are compared for quasi-isotropic laminates. The results indicate that using the appropriate local force distribution for the locally loaded area and including transverse-shear-deformation effects in the laminated plate response analysis are important. The applicability of the present analytical procedure based on small deformation theory is investigated by comparing analytical and experimental results for combinations of quasi-isotropic laminate thicknesses and impact energy levels. The results of this study indicate that large-deformation effects influence the response of both 24- and 32-ply laminated plates, and that a geometrically nonlinear analysis is required for predicting the response accurately.
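For reference, the double Fourier series mentioned is, in its simplest (Navier) form for a simply supported rectangular plate of planform a x b, an expansion of the deflection and the load in double sine series; the paper's shear-deformable, impact-loaded formulation builds on this form:

```latex
w(x,y,t) = \sum_{m=1}^{\infty}\sum_{n=1}^{\infty} W_{mn}(t)\,
           \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b},
\qquad
q(x,y,t) = \sum_{m=1}^{\infty}\sum_{n=1}^{\infty} Q_{mn}(t)\,
           \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b}.
```

The locally distributed cosine-cosine contact force is expanded into the load coefficients Q_mn, and each modal amplitude W_mn(t) is then advanced in time by the small-increment method.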
Transport Phenomena in Thin Rotating Liquid Films Including: Nucleate Boiling
NASA Technical Reports Server (NTRS)
Faghri, Amir
2005-01-01
In this grant, experimental, numerical and analytical studies of heat transfer in a thin liquid film flowing over a rotating disk have been conducted. Heat transfer coefficients were measured experimentally in a rotating disk heat transfer apparatus where the disk was heated from below with electrical resistance heaters. The heat transfer measurements were supplemented by experimental characterization of the liquid film thickness using a novel laser based technique. The heat transfer measurements show that the disk rotation plays an important role on enhancement of heat transfer primarily through the thinning of the liquid film. Experiments covered both momentum and rotation dominated regimes of the flow and heat transfer in this apparatus. Heat transfer measurements have been extended to include evaporation and nucleate boiling and these experiments are continuing in our laboratory. Empirical correlations have also been developed to provide useful information for design of compact high efficiency heat transfer devices. The experimental work has been supplemented by numerical and analytical analyses of the same problem. Both numerical and analytical results have been found to agree reasonably well with the experimental results on liquid film thickness and heat transfer coefficients/Nusselt numbers. The numerical simulations include the free surface liquid film flow and heat transfer under disk rotation including the conjugate effects. The analytical analysis utilizes an integral boundary layer approach from which
An analytical study on groundwater flow in drainage basins with horizontal wells
NASA Astrophysics Data System (ADS)
Wang, Jun-Zhi; Jiang, Xiao-Wei; Wan, Li; Wang, Xu-Sheng; Li, Hailong
2014-06-01
Analytical studies on release/capture zones are often limited to a uniform background groundwater flow. In fact, for basin-scale problems, the undulating water table would lead to the development of hierarchically nested flow systems, which are more complex than a uniform flow. Under the premise that the water table is a replica of undulating topography and hardly influenced by wells, an analytical solution of hydraulic head is derived for a two-dimensional cross section of a drainage basin with horizontal injection/pumping wells. Based on the analytical solution, distributions of hydraulic head, stagnation points and flow systems (including release/capture zones) are explored. The superposition of injection/pumping wells onto the background flow field leads to the development of new internal stagnation points and new flow systems (including release/capture zones). Generally speaking, the existence of n injection/pumping wells would result in up to n new internal stagnation points and up to 2n new flow systems (including release/capture zones). The analytical study presented, which integrates traditional well hydraulics with the theory of regional groundwater flow, is useful in understanding basin-scale groundwater flow influenced by human activities.
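The paper's closed-form cross-sectional solution is not reproduced in the abstract. As an illustration of the superposition idea only, an undulating background head plus logarithmic well terms, with stagnation points located where the head gradient vanishes, is sketched below; the head expression, well positions, and constants are hypothetical and are not the authors' solution.

```python
import numpy as np

T = 100.0                        # illustrative transmissivity-like constant
wells = [(400.0, 30.0,  80.0),   # (x, z, Q): Q > 0 extraction (pumping) well
         (900.0, 60.0, -50.0)]   #            Q < 0 injection well

def head(x, z):
    """Undulating background head plus logarithmic well terms (illustrative)."""
    background = 100.0 + 5.0 * np.sin(2 * np.pi * x / 1000.0) - 0.01 * x
    well_terms = sum(q / (2 * np.pi * T) * np.log(np.hypot(x - xw, z - zw) + 1e-9)
                     for xw, zw, q in wells)
    return background + well_terms

# Evaluate the head on a grid and locate a candidate stagnation point,
# i.e. the grid node where the magnitude of the head gradient is smallest.
x = np.linspace(0.0, 1200.0, 601)
z = np.linspace(1.0, 100.0, 200)
X, Z = np.meshgrid(x, z)
H = head(X, Z)
dHdz, dHdx = np.gradient(H, z, x)
speed = np.hypot(dHdx, dHdz)
iz, ix = np.unravel_index(np.argmin(speed), speed.shape)
print(f"candidate stagnation point near x = {x[ix]:.0f}, z = {z[iz]:.0f}")
```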
NASA Astrophysics Data System (ADS)
Jitomirskaya, S.; Marx, C. A.
2012-11-01
We show how to extend (and with what limitations) Avila's global theory of analytic SL(2,C) cocycles to families of cocycles with singularities. This allows us to develop a strategy to determine the Lyapunov exponent for the extended Harper's model, for all values of parameters and all irrational frequencies. In particular, this includes the self-dual regime for which even heuristic results did not previously exist in physics literature. The extension of Avila's global theory is also shown to imply continuous behavior of the LE on the space of analytic M_2(C)-cocycles. This includes rational approximation of the frequency, which so far has not been available.
Assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry
Taylor, Howard E.; Garbarino, John R.
1988-01-01
A thorough assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry was conducted for selected analytes of importance in water quality applications and hydrologic research. A multielement calibration curve technique was designed to produce accurate and precise results in analysis times of approximately one minute. The suite of elements included Al, As, B, Ba, Be, Cd, Co, Cr, Cu, Hg, Li, Mn, Mo, Ni, Pb, Se, Sr, V, and Zn. The effects of sample matrix composition on the accuracy of the determinations showed that matrix elements (such as Na, Ca, Mg, and K) that may be present in natural water samples at concentration levels greater than 50 mg/L resulted in as much as a 10% suppression in ion current for analyte elements. Operational detection limits are presented.
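The multielement calibration curve technique is not detailed in the abstract; a minimal sketch of per-element linear calibration and back-calculation of unknowns is shown below, with hypothetical standards and count rates (weighted or internal-standard-corrected fits are common in practice).

```python
import numpy as np

# Hypothetical calibration standards (ug/L) and ICP-MS ion-count responses.
standards = np.array([0.0, 1.0, 5.0, 10.0, 50.0])
counts = {
    "Pb": np.array([120.0, 1.05e3, 5.2e3, 1.03e4, 5.1e4]),
    "Zn": np.array([310.0, 2.10e3, 9.8e3, 1.98e4, 9.9e4]),
}

# Fit an ordinary least-squares line per element (slope, intercept).
calibrations = {el: np.polyfit(standards, y, 1) for el, y in counts.items()}

def concentration(element, signal):
    """Invert the linear calibration to estimate an unknown concentration."""
    slope, intercept = calibrations[element]
    return (signal - intercept) / slope

print(f"Pb sample: {concentration('Pb', 2.6e3):.2f} ug/L")
```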
Flexible aircraft dynamic modeling for dynamic analysis and control synthesis
NASA Technical Reports Server (NTRS)
Schmidt, David K.
1989-01-01
The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory, is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.
Verstraeten, Ingrid M.; Steele, G.V.; Cannia, J.C.; Bohlke, J.K.; Kraemer, T.E.; Hitch, D.E.; Wilson, K.E.; Carnes, A.E.
2001-01-01
A study of the water resources of the Dutch Flats area in the western part of the North Platte Natural Resources District, western Nebraska, was conducted from 1995 through 1999 to describe the surface water and hydrogeology, the spatial distribution of selected water-quality constituents in surface and ground water, and the surface-water/ground-water interaction in selected areas. This report describes the selected field and analytical methods used in the study and selected analytical results from the study not previously published. Specifically, dissolved gases, age-dating data, and other isotopes collected as part of an intensive sampling effort in August and November 1998 and all uranium and uranium isotope data collected through the course of this study are included in the report.
The Effects of Parent Participation on Child Psychotherapy Outcome: A Meta-Analytic Review
ERIC Educational Resources Information Center
Dowell, Kathy A.; Ogles, Benjamin M.
2010-01-01
Forty-eight child psychotherapy outcome studies offering direct comparisons of an individual child treatment group to a combined parent-child/family therapy treatment group were included in this meta-analytic review. Results indicate that combined treatments produced a moderate effect beyond the outcomes achieved by individual child treatments,…
A Model for Axial Magnetic Bearings Including Eddy Currents
NASA Technical Reports Server (NTRS)
Kucera, Ladislav; Ahrens, Markus
1996-01-01
This paper presents an analytical method of modelling eddy currents inside axial bearings. The problem is solved by dividing an axial bearing into elementary geometric forms, solving the Maxwell equations for these simplified geometries, defining boundary conditions and combining the geometries. The final result is an analytical solution for the flux, from which the impedance and the force of an axial bearing can be derived. Several impedance measurements have shown that the analytical solution can fit the measured data with a precision of approximately 5%.
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho, during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle packer sampling tool in a previously hydrologically well characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analysis of waters from a large distilled water tank (utilized for all field laboratory purposes as "pure" stock water), of water which passed through a steamer used to clean the packer, and of rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
Hydrocarbon-Fueled Rocket Engine Plume Diagnostics: Analytical Developments and Experimental Results
NASA Technical Reports Server (NTRS)
Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; St. Cyr, William W.
2006-01-01
A viewgraph presentation describing experimental results and analytical developments about plume diagnostics for hydrocarbon-fueled rocket engines is shown. The topics include: 1) SSC Plume Diagnostics Background; 2) Engine Health Monitoring Approach; 3) Rocket Plume Spectroscopy Simulation Code; 4) Spectral Simulation for 10 Atomic Species and for 11 Diatomic Molecular Electronic Bands; 5) "Best" Lines for Plume Diagnostics for Hydrocarbon-Fueled Rocket Engines; 6) Experimental Set Up for the Methane Thruster Test Program and Experimental Results; and 7) Summary and Recommendations.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.
Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita
2016-10-11
We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba
2014-11-01
Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.
An analytical and experimental study of crack extension in center-notched composites
NASA Technical Reports Server (NTRS)
Beuth, Jack L., Jr.; Herakovich, Carl T.
1987-01-01
The normal stress ratio theory for crack extension in anisotropic materials is studied analytically and experimentally. The theory is applied within a microscopic-level analysis of a single center notch of arbitrary orientation in a unidirectional composite material. The bulk of the analytical work of this study applies an elasticity solution for an infinite plate with a center line crack to obtain critical stress and crack growth direction predictions. An elasticity solution for an infinite plate with a center elliptical flaw is also used to obtain qualitative predictions of the location of crack initiation on the border of a rounded notch tip. The analytical portion of the study includes the formulation of a new crack growth theory that includes local shear stress. Normal stress ratio theory predictions are obtained for notched unidirectional tensile coupons and unidirectional Iosipescu shear specimens. These predictions are subsequently compared to experimental results.
Comparison of analysis and flight test data for a drone aircraft with active flutter suppression
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Pototzky, A. S.
1981-01-01
A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.
A Meta-Analytic Review of Components Associated with Parent Training Program Effectiveness
ERIC Educational Resources Information Center
Kaminski, Jennifer Wyatt; Valle, Linda Anne; Filene, Jill H.; Boyle, Cynthia L.
2008-01-01
This component analysis used meta-analytic techniques to synthesize the results of 77 published evaluations of parent training programs (i.e., programs that included the active acquisition of parenting skills) to enhance behavior and adjustment in children aged 0-7. Characteristics of program content and delivery method were used to predict effect…
ERIC Educational Resources Information Center
McGill, Ryan J.; Canivez, Gary L.
2016-01-01
As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…
The Savannah River Site's groundwater monitoring program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes the Savannah River Site (SRS) groundwater monitoring program conducted by EPD/EMS in the first quarter of 1991. It includes the analytical data, field data, data review, quality control, and other documentation for this program; provides a record of the program's activities and rationale; and serves as an official document of the analytical results.
The Savannah River Site's Groundwater Monitoring Program. First quarter 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes the Savannah River Site (SRS) groundwater monitoring program conducted during the first quarter of 1992. It includes the analytical data, field data, data review, quality control, and other documentation for this program; provides a record of the program's activities; and serves as an official document of the analytical results.
NASA Astrophysics Data System (ADS)
Ravi, J. T.; Nidhan, S.; Muthu, N.; Maiti, S. K.
2018-02-01
An analytical method for determination of the dimensions of a longitudinal crack in monolithic beams, based on frequency measurements, has been extended to model L and inverted-T cracks. Such cracks, including longitudinal cracks, arise in beams made of layered isotropic or composite materials. A new formulation for modelling cracks in bi-material beams is presented. Longitudinal crack segment sizes, for L and inverted-T cracks, varying from 2.7% to 13.6% of the length of Euler-Bernoulli beams are considered. Both forward and inverse problems have been examined. In the forward problems, the analytical results are compared with finite element (FE) solutions. In the inverse problems, the accuracy of prediction of crack dimensions is verified using FE results as input for virtual testing. The analytical results show good agreement with the actual crack dimensions. Further, experimental studies have been done to verify the accuracy of the analytical method for prediction of the dimensions of three types of crack in isotropic and bi-material beams. The results show that the proposed formulation is reliable and can be employed for crack detection in slender beam-like structures in practice.
NASA Astrophysics Data System (ADS)
Avitabile, Daniele; Bridges, Thomas J.
2010-06-01
Numerical integration of complex linear systems of ODEs depending analytically on an eigenvalue parameter is considered. Complex orthogonalization, which is required to stabilize the numerical integration, results in non-analytic systems. It is shown that properties of eigenvalues are still efficiently recoverable by extracting information from a non-analytic characteristic function. The orthonormal systems are constructed using the geometry of Stiefel bundles. Different forms of continuous orthogonalization in the literature are shown to correspond to different choices of connection one-form on the Stiefel bundle. For the numerical integration, Gauss-Legendre Runge-Kutta (GLRK) algorithms are the principal choice for preserving orthogonality, and performance results are shown for a range of GLRK methods. The theory and methods are tested by application to example boundary value problems including the Orr-Sommerfeld equation in hydrodynamic stability.
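As background on why Gauss-Legendre Runge-Kutta methods are the natural choice for preserving orthogonality: they conserve quadratic invariants, so for a linear system Y' = A(t)Y with skew-Hermitian A the one-stage member (the implicit midpoint rule) keeps the columns of Y orthonormal to round-off, each step being a Cayley transform. The sketch below is an illustrative example, not the paper's boundary-value solver.

```python
import numpy as np

def implicit_midpoint_step(Y, A_mid, h):
    """One step of the 1-stage Gauss-Legendre RK (implicit midpoint) for Y' = A(t) Y.
    For a linear system the implicit stage can be solved exactly:
    Y_{n+1} = (I - h/2 A)^{-1} (I + h/2 A) Y_n, a Cayley transform."""
    I = np.eye(Y.shape[0], dtype=complex)
    return np.linalg.solve(I - 0.5 * h * A_mid, (I + 0.5 * h * A_mid) @ Y)

def A(t):
    # Skew-Hermitian test matrix, so the exact flow is unitary.
    M = np.array([[1j * np.cos(t), 2.0 + 1j * t],
                  [-2.0 + 1j * t, -1j]])
    return 0.5 * (M - M.conj().T)

Y = np.eye(2, dtype=complex)        # orthonormal initial columns
h, t = 0.01, 0.0
for _ in range(1000):
    Y = implicit_midpoint_step(Y, A(t + 0.5 * h), h)
    t += h

# Orthonormality error stays at round-off level.
print(np.linalg.norm(Y.conj().T @ Y - np.eye(2)))
```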
Comparative Kinetic Analysis of Closed-Ended and Open-Ended Porous Sensors
NASA Astrophysics Data System (ADS)
Zhao, Yiliang; Gaur, Girija; Mernaugh, Raymond L.; Laibinis, Paul E.; Weiss, Sharon M.
2016-09-01
Efficient mass transport through porous networks is essential for achieving rapid response times in sensing applications utilizing porous materials. In this work, we show that open-ended porous membranes can overcome diffusion challenges experienced by closed-ended porous materials in a microfluidic environment. A theoretical model including both transport and reaction kinetics is employed to study the influence of flow velocity, bulk analyte concentration, analyte diffusivity, and adsorption rate on the performance of open-ended and closed-ended porous sensors integrated with flow cells. The analysis shows that open-ended pores enable analyte flow through the pores and greatly reduce the response time and analyte consumption for detecting large molecules with slow diffusivities compared with closed-ended pores for which analytes largely flow over the pores. Experimental confirmation of the results was carried out with open- and closed-ended porous silicon (PSi) microcavities fabricated in flow-through and flow-over sensor configurations, respectively. The adsorption behavior of small analytes onto the inner surfaces of closed-ended and open-ended PSi membrane microcavities was similar. However, for large analytes, PSi membranes in a flow-through scheme showed significant improvement in response times due to more efficient convective transport of analytes. The experimental results and theoretical analysis provide quantitative estimates of the benefits offered by open-ended porous membranes for different analyte systems.
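The abstract summarizes a coupled transport-reaction model without giving it explicitly; the sketch below shows only the surface-reaction part, first-order Langmuir adsorption driven by the analyte concentration at the pore wall, with hypothetical rate constants and concentration (the full model additionally couples convection and diffusion through the pore network).

```python
from scipy.integrate import solve_ivp

k_on = 1.0e5      # association rate constant (1/(M*s)), illustrative
k_off = 1.0e-3    # dissociation rate constant (1/s), illustrative
b_max = 1.0       # surface binding capacity (normalized)
c_wall = 1.0e-9   # analyte concentration at the pore surface (M), illustrative

def langmuir(t, b):
    # db/dt = k_on * C * (b_max - b) - k_off * b
    return k_on * c_wall * (b_max - b) - k_off * b[0]

sol = solve_ivp(langmuir, (0.0, 5000.0), [0.0])
b_eq = k_on * c_wall * b_max / (k_on * c_wall + k_off)
print(f"fractional coverage after 5000 s: {sol.y[0, -1]:.3f} (equilibrium {b_eq:.3f})")
```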
Method Development in Forensic Toxicology.
Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona
2017-01-01
In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high quality analytical methods is a thorough method development. The presented article will provide an overview on the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems as well as establishing a versatile sample preparation. Method development is concluded by an optimization process after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György
2018-01-01
Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
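A minimal sketch of the second method's idea, using scipy's norm.ppf/cdf as the Python counterparts of Excel's NORMINV/NORMDIST: for a given combination of normalized bias and added analytical imprecision, compute the fraction of reference individuals falling outside fixed common reference limits. The limits and the bias/imprecision values below are illustrative; the 4.4% figure quoted in the results is specific to the paper's IFCC-based criterion.

```python
from scipy.stats import norm

# Reference limits for a standardized Gaussian population; norm.ppf is the
# Python counterpart of Excel's NORMINV(probability, mean, standard_dev).
LOWER = norm.ppf(0.025)   # -1.96
UPPER = norm.ppf(0.975)   # +1.96

def fraction_outside(bias, imprecision):
    """Fraction of results outside the fixed reference limits when the measured
    distribution is shifted by `bias` and widened by an added analytical SD
    `imprecision` (both normalized to the reference-population SD)."""
    s = (1.0 + imprecision ** 2) ** 0.5
    return norm.cdf((LOWER - bias) / s) + norm.sf((UPPER - bias) / s)

# Bias and imprecision both push additional results outside the nominal 5% tails.
for bias, sd_a in [(0.0, 0.0), (0.2, 0.0), (0.0, 0.3), (0.2, 0.3)]:
    print(f"bias={bias:.1f}, imprecision={sd_a:.1f}: "
          f"{100 * fraction_outside(bias, sd_a):.2f}% outside")
```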
Analysis of Environmental Contamination resulting from ...
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to safe levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illu
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Scott D.; Eckberg, Alison D.; Thallapally, Praveen K.
2011-09-01
The metal-organic framework Cu-BTC was evaluated for its ability to selectively interact with Lewis-base analytes, including explosives, by examining retention on GC columns packed with Chromosorb W HP that contained 3.0% SE-30 along with various loadings of Cu-BTC. SEM images of the support material showed the characteristic Cu-BTC crystals embedded in the SE-30 coating on the diatomaceous support. Results indicated that the Cu-BTC-containing stationary phase had limited thermal stability (220°C) and strong general retention for analytes. Kováts index calculations showed selective retention (amounting to about 300 Kováts units) relative to n-alkanes for many small Lewis-base analytes on a column that contained 0.75% Cu-BTC compared to an SE-30 control. Short columns that contained lower loadings of Cu-BTC (0.10%) were necessary to elute explosives and related analytes; however, selectivity was not observed for aromatic compounds (including nitroaromatics) or nitroalkanes. Observed retention characteristics are discussed.
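The Kováts retention index calculation referred to is standard; a minimal sketch of the isothermal form, using adjusted retention times of the analyte bracketed by the n- and (n+1)-alkanes, is shown below with illustrative values.

```python
import math

def kovats_index(t_analyte, t_n, t_n1, n):
    """Isothermal Kovats retention index from adjusted retention times:
    the analyte elutes between the n-alkane (t_n) and the (n+1)-alkane (t_n1)."""
    return 100.0 * (n + (math.log(t_analyte) - math.log(t_n))
                    / (math.log(t_n1) - math.log(t_n)))

# Illustrative adjusted retention times (minutes).
print(kovats_index(t_analyte=6.2, t_n=4.8, t_n1=8.1, n=9))   # ~949 on this column
```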
Yan, Yifei; Zhang, Lisong; Yan, Xiangzhen
2016-01-01
In this paper, a single-slope tunnel pipeline was analysed considering the effects of vertical earth pressure, horizontal soil pressure, inner pressure, thermal expansion force and pipeline-soil friction. The concept of a stagnation point for the pipeline was proposed. Considering the deformation compatibility condition of the pipeline elbow, the push force on the anchor blocks of a single-slope tunnel pipeline was derived based on an energy method, and the theoretical formula for this force was thus obtained. Using the analytical equation, the push force of the anchor block of an X80 large-diameter pipeline from the West-East Gas Transmission Project was determined. Meanwhile, to verify the results of the analytical method against the finite element method, four finite element codes were used to calculate the push force: CAESARII, ANSYS, AutoPIPE and ALGOR. The results show that the analytical results agree well with the numerical results, and the maximum relative error is only 4.1%. Therefore, the results obtained with the analytical method can satisfy engineering requirements. PMID:26963097
Experimental and analytical characterization of triaxially braided textile composites
NASA Technical Reports Server (NTRS)
Masters, John E.; Fedro, Mark J.; Ifju, Peter G.
1993-01-01
There were two components, experimental and analytical, to this investigation of triaxially braided textile composite materials. The experimental portion of the study centered on measuring the materials' longitudinal and transverse tensile moduli, Poisson's ratio, and strengths. The identification of the damage mechanisms exhibited by these materials was also a prime objective of the experimental investigation. The analytical portion of the investigation utilized the Textile Composites Analysis (TECA) model to predict modulus and strength. The analytical and experimental results were compared to assess the effectiveness of the analysis. The figures contained in this paper reflect the presentation made at the conference. They may be divided into four sections: a definition of the material system tested; a series of figures summarizing the experimental results, including results of a Moire interferometry study of the strain distribution in the material, examples and descriptions of the types of damage encountered in these materials, and a summary of the measured properties; a description of the TECA model, including a series of predicted results and a comparison with measured values; and a brief summary that completes the paper.
Applications of reversible covalent chemistry in analytical sample preparation.
Siegel, David
2012-12-07
Reversible covalent chemistry (RCC) adds another dimension to commonly used sample preparation techniques like solid-phase extraction (SPE), solid-phase microextraction (SPME), molecular imprinted polymers (MIPs) or immuno-affinity cleanup (IAC): chemical selectivity. By selecting analytes according to their covalent reactivity, sample complexity can be reduced significantly, resulting in enhanced analytical performance for low-abundance target analytes. This review gives a comprehensive overview of the applications of RCC in analytical sample preparation. The major reactions covered include reversible boronic ester formation, thiol-disulfide exchange and reversible hydrazone formation, targeting analyte groups like diols (sugars, glycoproteins and glycopeptides, catechols), thiols (cysteinyl-proteins and cysteinyl-peptides) and carbonyls (carbonylated proteins, mycotoxins). Their applications range from low abundance proteomics to reversible protein/peptide labelling to antibody chromatography to quantitative and qualitative food analysis. In discussing the potential of RCC, a special focus is on the conditions and restrictions of the utilized reaction chemistry.
The Savannah River Site's groundwater monitoring program. First quarter 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes the Savannah River Site (SRS) groundwater monitoring program conducted by EPD/EMS in the first quarter of 1991. It includes the analytical data, field data, data review, quality control, and other documentation for this program; provides a record of the program's activities and rationale; and serves as an official document of the analytical results.
True, Lawrence D
2014-03-01
Paralleling the growth of ever more cost-efficient methods to sequence the whole genome in minute fragments of tissue has been the identification of increasingly numerous molecular abnormalities in cancers--mutations, amplifications, insertions and deletions of genes, and patterns of differential gene expression, i.e., overexpression of growth factors and underexpression of tumor suppressor genes. These abnormalities can be translated into assays to be used in clinical decision making. In general terms, the result of such an assay is subject to a large number of variables regarding the characteristics of the available sample, particularities of the assay used, and the interpretation of the results. This review discusses the effects of these variables on assays of tissue-based biomarkers, classified by macromolecule--DNA, RNA (including micro RNA, messenger RNA, and long noncoding RNA), protein, and phosphoprotein. Since the majority of clinically applicable biomarkers are immunohistochemically detectable proteins, this review focuses on protein biomarkers. However, the principles outlined are mostly applicable to any other analyte. A variety of preanalytical variables impacts the results obtained, including analyte stability (which is different for different analytes, i.e., DNA, RNA, or protein), period of warm and of cold ischemia, fixation time, tissue processing, sample storage time, and storage conditions. In addition, assay variables play an important role, including reagent specificity (notably but not uniquely an issue concerning antibodies used in immunohistochemistry), technical components of the assay, quantitation, and assay interpretation. Finally, appropriateness of an assay for clinical application is an important issue. Reference is made to publicly available guidelines to improve on biomarker development in general and requirements for clinical use in particular. Strategic goals are formulated in order to improve the quality of biomarker reporting, including issues of analyte quality, experimental detail, assay efficiency and precision, and assay appropriateness.
Geochemical and isotopic water results, Barrow, Alaska, 2012-2013
Heikoop, Jeff; Wilson, Cathy; Newman, Brent
2012-07-18
Data include a large suite of analytes (geochemical and isotopic) for samples collected in Barrow, Alaska (2012-2013). Sample types are indicated, and include soil pore waters, drainage waters, snowmelt, precipitation, and permafrost samples.
NASA Technical Reports Server (NTRS)
Weller, W. H.
1983-01-01
A program of experimental and analytical research was performed to demonstrate the degree of correlation achieved between measured and computed rotor inplane stability characteristics. The experimental data were obtained from hover and wind tunnel tests of a scaled bearingless main rotor model. Both isolated rotor and free-hub conditions were tested. Test parameters included blade built-in cone and sweep angles; rotor inplane structural stiffness and damping; pitch link stiffness and location; and fuselage damping, inertia, and natural frequency. Analytical results for many test conditions were obtained. In addition, the analytical and experimental results were examined to ascertain the effects of the test parameters on rotor ground and air resonance stability. The results from this program are presented herein in tabular and graphical form.
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.
Watts, R R; Langone, J J; Knight, G J; Lewtas, J
1990-01-01
A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812
Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah
2017-09-03
The growing popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in the field of analytical chemistry has resulted in an exponential increase in their exploitation and production in this field. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4) linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals or molecules in the absence of covalent bonds. Conversely, ILs are ionic fluids comprising only cations and anions, often with immeasurably low vapor pressure, making them green or 'designer' solvents. The cooperative effect between CDs and ILs, owing to their fascinating properties, has contributed to considerable progress in analytical chemistry. This comprehensive review gives an overview of recent studies and trends in the application of CDs in combination with ILs, which have shown beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques such as solid phase extraction, magnetic solid phase extraction, cloud point extraction and microextraction, and in separation techniques including gas chromatography, high-performance liquid chromatography and capillary electrophoresis, as well as their application as electrode modifiers in electrochemical sensors, with references to recent applications. This review highlights the nature of the interactions and synergic effects between CDs, ILs, and analytes. It is hoped that this review will stimulate further research in analytical chemistry.
Krleza, Jasna Lenicek; Dorotic, Adrijana; Grzunov, Ana
2017-02-15
Proper standardization of laboratory testing requires assessment of performance after the tests are performed, known as the post-analytical phase. A nationwide external quality assessment (EQA) scheme implemented in Croatia in 2014 includes a questionnaire on post-analytical practices, and the present study examined laboratory responses in order to identify current post-analytical phase practices and areas for improvement. In four EQA exercises between September 2014 and December 2015, 145-174 medical laboratories across Croatia were surveyed using the Module 11 questionnaire on the post-analytical phase of testing. Based on their responses, the laboratories were evaluated on four quality indicators: turnaround time (TAT), critical values, interpretative comments, and procedures in the event of abnormal results. Results were presented as absolute numbers and percentages. Just over half of the laboratories (56.3%) monitored TAT. Laboratories varied substantially in how they dealt with critical values. Most laboratories (65-97%) issued interpretative comments with test results. One third of the medical laboratories (30.6-33.3%) issued abnormal test results without confirming them in additional testing. Our results suggest that the nationwide post-analytical EQA scheme launched in 2014 in Croatia has yet to be implemented in full. To close the gaps between existing recommendations and laboratory practice, laboratory professionals should focus on ensuring that TAT is monitored and that lists of critical values are established within laboratories. Professional bodies and institutions should focus on clarifying and harmonizing the rules for adding interpretative comments to laboratory test results and for dealing with abnormal test results, so that these practices become standardized.
Contamination of dried blood spots - an underestimated risk in newborn screening.
Winter, Theresa; Lange, Anja; Hannemann, Anke; Nauck, Matthias; Müller, Cornelia
2018-01-26
Newborn screening (NBS) is an established screening procedure in many countries worldwide, aiming at the early detection of inborn errors of metabolism. For decades, dried blood spots have been the standard specimen for NBS. The procedure of blood collection is well described and standardized and includes many critical pre-analytical steps. We examined the impact of contamination with some common, anticipated substances on NBS results obtained from dried blood spot samples. This possible pre-analytical source of uncertainty has been poorly examined in the past. Capillary blood was obtained from 15 adult volunteers and applied to 10 screening filter papers per volunteer. Nine filter papers were contaminated without visible trace. The contaminants were baby diaper rash cream, baby wet wipes, disinfectant, liquid infant formula, liquid infant formula hypoallergenic (HA), ultrasonic gel, breast milk, feces, and urine. The differences between control and contaminated samples were evaluated for 45 NBS quantities. We assessed whether the contaminations might lead to false-positive NBS results. Eight of the nine investigated contaminants significantly altered NBS analyte concentrations and potentially caused false-positive screening outcomes. Contamination with feces was most influential, affecting 24 of the 45 tested analytes, followed by liquid infant formula (HA) and urine, affecting 19 and 13 of the 45 analytes, respectively. Contamination of filter paper samples can have a substantial effect on NBS results. Our results underline the importance of good pre-analytical training to make staff aware of the threat and to ensure reliable screening results.
Sensitive glow discharge ion source for aerosol and gas analysis
Reilly, Peter T. A. (Knoxville, TN)
2007-08-14
A high sensitivity glow discharge ion source system for analyzing particles includes an aerodynamic lens having a plurality of constrictions for receiving an aerosol including at least one analyte particle in a carrier gas and focusing the analyte particles into a collimated particle beam. A separator separates the carrier gas from the analyte particle beam, wherein the analyte particle beam or vapors derived from the analyte particle beam are selectively transmitted out of the separator. A glow discharge ionization source includes a discharge chamber having an entrance orifice for receiving the analyte particle beam or analyte vapors, and a target electrode and discharge electrode therein. An electric field applied between the target electrode and discharge electrode generates an analyte ion stream from the analyte vapors, which is directed out of the discharge chamber through an exit orifice, such as to a mass spectrometer. High analyte sensitivity is obtained by pumping the discharge chamber exclusively through the exit orifice and the entrance orifice.
A system of three-dimensional complex variables
NASA Technical Reports Server (NTRS)
Martin, E. Dale
1986-01-01
Some results of a new theory of multidimensional complex variables are reported, including analytic functions of a three-dimensional (3-D) complex variable. Three-dimensional complex numbers are defined, including vector properties and rules of multiplication. The necessary conditions for a function of a 3-D variable to be analytic are given and shown to be analogous to the 2-D Cauchy-Riemann equations. A simple example also demonstrates the analogy between the newly defined 3-D complex velocity and 3-D complex potential and the corresponding ordinary complex velocity and complex potential in two dimensions.
NASA Technical Reports Server (NTRS)
Meyer, Tom; Zubrin, Robert
1997-01-01
The first phase of the research includes a comprehensive analytical study examining the potential applications for engineering subsystems and mission strategies made possible by such RWGS-based subsystems, and will include an actual experimental demonstration and performance characterization of a full-scale brassboard RWGS working unit. By the time of this presentation the laboratory demonstration unit will not yet be operational, but we will present the results of our analytical studies to date and plans for the ongoing work.
High temperature ion channels and pores
NASA Technical Reports Server (NTRS)
Cheley, Stephen (Inventor); Gu, Li Qun (Inventor); Bayley, Hagan (Inventor); Kang, Xiaofeng (Inventor)
2011-01-01
The present invention includes an apparatus, system and method for stochastic sensing of an analyte via a protein pore. The protein pore may be an engineered protein pore, such as an ion channel operating at temperatures above 55 °C and even as high as near 100 °C. The analyte may be any reactive analyte, including chemical weapons, environmental toxins and pharmaceuticals. The analyte covalently bonds to the sensor element to produce a detectable electrical current signal. Possible signals include a change in electrical current. Detection of the signal allows identification of the analyte and determination of its concentration in a sample solution. Multiple analytes present in the same solution may also be detected.
Ottaway, Josh; Farrell, Jeremy A; Kalivas, John H
2013-02-05
An essential part to calibration is establishing the analyte calibration reference samples. These samples must characterize the sample matrix and measurement conditions (chemical, physical, instrumental, and environmental) of any sample to be predicted. Calibration usually requires measuring spectra for numerous reference samples in addition to determining the corresponding analyte reference values. Both tasks are typically time-consuming and costly. This paper reports on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory prepared or determined reference values. Instead, an analyte pure component spectrum is used in conjunction with nonanalyte spectra for calibration. Nonanalyte spectra can be from different sources including pure component interference samples, blanks, and constant analyte samples. The approach is also applicable to calibration maintenance when the analyte pure component spectrum is measured in one set of conditions and nonanalyte spectra are measured in new conditions. The PCTR method balances the trade-offs between calibration model shrinkage and the degree of orthogonality to the nonanalyte content (model direction) in order to obtain accurate predictions. Using visible and near-infrared (NIR) spectral data sets, the PCTR results are comparable to those obtained using ridge regression (RR) with reference calibration sets. The flexibility of PCTR also allows including reference samples if such samples are available.
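The calibration idea above can be illustrated numerically. Below is a minimal Python sketch of a PCTR-style calibration, assuming synthetic Gaussian spectra and a simple closed-form Tikhonov solution; the regularization parameter lam, the spectra, and the scaling convention are illustrative assumptions, not the authors' actual algorithm or data.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 200                                   # number of wavelengths (hypothetical)
wl = np.linspace(0, 1, p)

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

s_analyte = gauss(0.35, 0.04)             # pure analyte spectrum (assumed known)
s_interf = gauss(0.60, 0.08)              # an interferent present in nonanalyte spectra

# Nonanalyte spectra: blanks/interference samples with noise, no analyte signal
N = np.outer(rng.uniform(0.5, 2.0, 30), s_interf) + 0.01 * rng.standard_normal((30, p))

lam = 1e-3                                # Tikhonov regularization parameter (assumed)
A = N.T @ N + lam * np.eye(p)             # penalize response to nonanalyte variation
b = np.linalg.solve(A, s_analyte)
b /= s_analyte @ b                        # scale so the pure analyte spectrum predicts 1

# Test on synthetic mixtures with known analyte "concentrations"
c_true = rng.uniform(0, 1, 10)
X = np.outer(c_true, s_analyte) + np.outer(rng.uniform(0.5, 2.0, 10), s_interf) \
    + 0.01 * rng.standard_normal((10, p))
print(np.round(X @ b, 3))                 # predictions
print(np.round(c_true, 3))                # true values for comparison
```

The regression vector is pushed toward directions orthogonal to the nonanalyte spectra while keeping a unit response to the pure analyte spectrum, which is the trade-off the abstract describes.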
NASA Astrophysics Data System (ADS)
Penin, A. A.; Pivovarov, A. A.
2001-02-01
We present an analytical description of top-antitop pair production near the threshold in $e^+e^-$ annihilation and $\gamma\gamma$ collisions. A set of basic observables considered includes the total cross sections, forward-backward asymmetry and top quark polarization. The threshold effects relevant for the basic observables are described by three universal functions related to S-wave production, P-wave production and S-P interference. These functions are computed analytically up to the next-to-next-to-leading order of NRQCD. The total $e^+e^- \to t\bar{t}$ cross section near the threshold is obtained in the next-to-next-to-leading order in closed form, including the contribution due to the axial coupling of the top quark mediated by the Z boson. The effects of the running of the strong coupling constant and of the finite top quark width are taken into account analytically for the P-wave production and S-P wave interference.
Borodkina, I.; Borodin, D.; Brezinsek, S.; ...
2017-04-12
For simulation of plasma-facing component erosion in fusion experiments, an analytical expression for the ion velocity just before the surface impact including the local electric field and an optional surface biasing effect is suggested. Energy and angular impact distributions and the resulting effective sputtering yields were produced for several experimental scenarios at JET ILW mostly involving PFCs exposed to an oblique magnetic field. The analytic solution has been applied as an improvement to earlier ERO modelling of localized, Be outer limiter, RF-enhanced erosion, modulated by toggling of a remote, however magnetically connected ICRH antenna. The effective W sputtering yields due to D and Be ion impact in Type-I and Type-III ELMs and inter-ELM conditions were also estimated using the analytical approach and benchmarked by spectroscopy. The intra-ELM W sputtering flux increases almost 10 times in comparison to the inter-ELM flux.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, Neal B.; Blake, Thomas A.; Gassman, Paul L.
2006-07-01
Multivariate curve resolution (MCR) is a powerful technique for extracting chemical information from measured spectra on complex mixtures. The difficulty with applying MCR to soil reflectance measurements is that light scattering artifacts can contribute much more variance to the measurements than the analyte(s) of interest. Two methods were integrated into a MCR decomposition to account for light scattering effects. Firstly, an extended mixture model using pure analyte spectra augmented with scattering ‘spectra’ was used for the measured spectra. And secondly, second derivative preprocessed spectra, which have higher selectivity than the unprocessed spectra, were included in a second block as a part of the decomposition. The conventional alternating least squares (ALS) algorithm was modified to simultaneously decompose the measured and second derivative spectra in a two-block decomposition. Equality constraints were also included to incorporate information about sampling conditions. The result was an MCR decomposition that provided interpretable spectra from soil reflectance measurements.
NASA Astrophysics Data System (ADS)
Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi
This paper proposes a classification method that uses Bayesian analysis to classify time series data from an agent-based simulation of the international emissions trading market, and compares it with a discrete Fourier transform (DFT) based analytical method. The purpose is to demonstrate analytical methods that map time series data such as market prices. These analytical methods revealed the following results: (1) the classification methods express the time series data as distances in a mapping, which makes understanding and inference easier than working with the raw time series; (2) the methods can analyze uncertain time series data, including stationary and non-stationary processes, using these distances obtained via agent-based simulation; and (3) the Bayesian analytical method can distinguish a 1% difference in the agents' emission reduction targets.
Analytical aspects of hydrogen exchange mass spectrometry
Engen, John R.; Wales, Thomas E.
2016-01-01
The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
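As a concrete illustration of quantitation by mass difference rather than by peak intensity or area, here is a minimal Python sketch; the function name, the centroid m/z values and the charge state are hypothetical, and real workflows also correct for back-exchange, which is omitted here.

```python
def deuterium_uptake(mz_centroid, mz_undeut, mz_maxd, charge):
    """Deuterium uptake for one peptide from centroid m/z values.

    mz_centroid : measured centroid m/z of the labeled peptide
    mz_undeut   : centroid m/z of the undeuterated control
    mz_maxd     : centroid m/z of the fully deuterated control
    charge      : peptide charge state
    Returns (absolute uptake in Da, fractional uptake relative to the
    fully deuterated control).
    """
    uptake = (mz_centroid - mz_undeut) * charge       # mass shift in Da
    max_uptake = (mz_maxd - mz_undeut) * charge       # maximum observable shift
    return uptake, uptake / max_uptake

# hypothetical numbers for a 2+ peptide
print(deuterium_uptake(mz_centroid=542.95, mz_undeut=541.77, mz_maxd=545.31, charge=2))
```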
A new frequency approach for light flicker evaluation in electric power systems
NASA Astrophysics Data System (ADS)
Feola, Luigi; Langella, Roberto; Testa, Alfredo
2015-12-01
In this paper, a new analytical estimator for light flicker in the frequency domain is proposed; it is able to take into account the frequency components neglected by the classical methods proposed in the literature. The analytical solutions proposed apply to any generic stationary signal affected by interharmonic distortion. The proposed light flicker analytical estimator is applied to numerous numerical case studies with the goal of showing (i) the correctness and the improvements of the proposed analytical approach with respect to the other methods in the literature and (ii) the accuracy of the results compared to those obtained by means of the classical International Electrotechnical Commission (IEC) flickermeter. The usefulness of the proposed analytical approach is that it can be included in signal processing tools for interharmonic penetration studies for the integration of renewable energy sources in future smart grids.
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentration (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answer the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
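To make the QA/QC bookkeeping concrete, a minimal Python sketch of two routine checks (matrix-spike recovery and duplicate relative percent difference) follows; the function names, example values, and acceptance limits are illustrative assumptions, not values from the lecture series.

```python
def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Matrix-spike recovery in percent."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

def relative_percent_difference(dup1, dup2):
    """Relative percent difference (RPD) between a sample and its duplicate."""
    return 100.0 * abs(dup1 - dup2) / ((dup1 + dup2) / 2.0)

# hypothetical results; typical (illustrative) acceptance limits might be
# 80-120% recovery and <20% RPD, but the QA/QC plan sets the real limits
rec = percent_recovery(spiked_result=12.4, unspiked_result=2.1, spike_added=10.0)
rpd = relative_percent_difference(2.1, 2.3)
print(f"spike recovery {rec:.0f}%, duplicate RPD {rpd:.1f}%")
```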
NASA Astrophysics Data System (ADS)
Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.
2017-12-01
Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.
Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van
2018-04-01
In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and the two codes showed good agreement. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received 'Accepted' status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.
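For illustration, a minimal Python sketch of how a coincidence summing correction factor might enter a peak-based activity calculation is given below; it assumes a CSF defined as a multiplicative correction to the net peak area, and all numerical values are hypothetical, not results from the paper.

```python
def activity_bq(net_counts, efficiency, emission_prob, live_time_s, csf):
    """Radionuclide activity (Bq) from a gamma peak, corrected for true
    coincidence summing.

    net_counts    : net peak area (counts)
    efficiency    : full-energy peak efficiency at the gamma energy
    emission_prob : gamma emission probability per decay
    live_time_s   : acquisition live time (s)
    csf           : coincidence summing correction factor (e.g. from MCNP-CP or ETNA)
    """
    return net_counts * csf / (efficiency * emission_prob * live_time_s)

# hypothetical values for illustration only
print(activity_bq(net_counts=15400, efficiency=0.032, emission_prob=0.851,
                  live_time_s=3600, csf=1.12))
```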
Limitations and Tolerances in Optical Devices
NASA Astrophysics Data System (ADS)
Jackman, Neil Allan
The performance of optical systems is limited by the imperfections of their components. Many of the devices in optical systems including optical fiber amplifiers, multimode transmission lines and multilayered media such as mirrors, windows and filters, are modeled by coupled line equations. This investigation includes: (i) a study of the limitations imposed on a wavelength multiplexed unidirectional ring by the non-uniformities of the gain spectra of Erbium-doped optical fiber amplifiers. We find numerical solutions for non-linear coupled power differential equations and use these solutions to compare the signal-to-noise ratios and signal levels at different nodes. (ii) An analytical study of the tolerances of imperfect multimode media which support forward traveling modes. The complex mode amplitudes are related by linear coupled differential equations. We use analytical methods to derive extended equations for the expected mode powers and give heuristic limits for their regions of validity. These results compare favorably to exact solutions found for a special case. (iii) A study of the tolerances of multilayered media in the presence of optical thickness imperfections. We use analytical methods, including Kronecker products, to calculate the reflection and transmission statistics of the media. Monte Carlo simulations compare well to our analytical method.
$ANBA; a rapid, combined data acquisition and correction program for the SEMQ electron microprobe
McGee, James J.
1983-01-01
$ANBA is a program developed for rapid data acquisition and correction on an automated SEMQ electron microprobe. The program provides increased analytical speed and reduced disk read/write operations compared with the manufacturer's software, resulting in a doubling of analytical throughput. In addition, the program provides enhanced analytical features such as averaging, rapid and compact data storage, and on-line plotting. The program is described with design philosophy, flow charts, variable names, a complete program listing, and system requirements. A complete operating example and notes to assist in running the program are included.
Numerical and analytical bounds on threshold error rates for hypergraph-product codes
NASA Astrophysics Data System (ADS)
Kovalev, Alexey A.; Prabhakar, Sanjay; Dumer, Ilya; Pryadko, Leonid P.
2018-06-01
We study analytically and numerically the decoding properties of finite-rate hypergraph-product quantum low density parity-check codes obtained from random (3,4)-regular Gallager codes, with a simple model of independent X and Z errors. Several nontrivial lower and upper bounds for the decodable region are constructed analytically by analyzing the properties of the homological difference, equal to minus the logarithm of the maximum-likelihood decoding probability for a given syndrome. Numerical results include an upper bound for the decodable region from specific heat calculations in associated Ising models and a minimum-weight decoding threshold of approximately 7%.
A numerical test of the topographic bias
NASA Astrophysics Data System (ADS)
Sjöberg, L. E.; Joud, M. S. S.
2018-02-01
In 1962 A. Bjerhammar introduced the method of analytical continuation in physical geodesy, implying that surface gravity anomalies are downward continued into the topographic masses down to an internal sphere (the Bjerhammar sphere). The method also includes analytical upward continuation of the potential to the surface of the Earth to obtain the quasigeoid. One can show that the common remove-compute-restore technique for geoid determination also includes an analytical continuation as long as the complete density distribution of the topography is not known. The analytical continuation implies that the downward continued gravity anomaly and/or potential are/is in error by the so-called topographic bias, which was postulated in a simple formula by L. E. Sjöberg in 2007. Here we numerically test the postulated formula by comparing it with the bias obtained by analytical downward continuation of the external potential of a homogeneous ellipsoid to an inner sphere. The result shows that the postulated formula holds: at the equator of the ellipsoid, where the external potential is downward continued 21 km, the computed and postulated topographic biases agree to less than a millimetre (when the potential is scaled to the unit of metre).
Molecular modeling of interactions in electronic nose sensors for environmental monitoring
NASA Technical Reports Server (NTRS)
Shevade, A. V.; Ryan, M. A.; Homer, M. L.; Manfreda, A. M.; Yen, S. -P. S.; Zhou, H.; Manatt, K.
2002-01-01
We report a study aimed at understanding analyte interactions with sensors made from polymer-carbon black composite films. The sensors are used in an Electronic Nose (ENose) which is used for monitoring the breathing air quality in human habitats. The model mimics the experimental conditions of the composite film deposition and formation and was developed using molecular modeling and simulation tools. The Dreiding 2.21 Force Field was used for the polymer and analyte molecules while graphite parameters were assigned to the carbon black atoms. The polymer considered for this work is methyl vinyl ether/maleic acid copolymer. The target analytes include both inorganic (NH3) and organic (methanol) types of compounds. Results indicate different composite-analyte interaction behavior.
This data set contains the method performance results. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.
The Children’s Total Exposure to Persistent Pesticides and Other Persistent Pollutant (...
Jaskolla, Thorsten W; Karas, Michael
2011-06-01
This work experimentally verifies and proves the two long since postulated matrix-assisted laser desorption/ionization (MALDI) analyte protonation pathways known as the Lucky Survivor and the gas phase protonation model. Experimental differentiation between the predicted mechanisms becomes possible by the use of deuterated matrix esters as MALDI matrices, which are stable under typical sample preparation conditions and generate deuteronated reagent ions, including the deuterated and deuteronated free matrix acid, only upon laser irradiation in the MALDI process. While the generation of deuteronated analyte ions proves the gas phase protonation model, the detection of protonated analytes by application of deuterated matrix compounds without acidic hydrogens proves the survival of analytes precharged from solution in accordance with the predictions from the Lucky Survivor model. The observed ratio of the two analyte ionization processes depends on the applied experimental parameters as well as the nature of analyte and matrix. Increasing laser fluences and lower matrix proton affinities favor gas phase protonation, whereas more quantitative analyte protonation in solution and intramolecular ion stabilization leads to more Lucky Survivors. The presented results allow for a deeper understanding of the fundamental processes causing analyte ionization in MALDI and may alleviate future efforts for increasing the analyte ion yield.
Amplitudes of doping striations: comparison of numerical calculations and analytical approaches
NASA Astrophysics Data System (ADS)
Jung, T.; Müller, G.
1997-02-01
Transient, axisymmetric numerical calculations of the heat and species transport including convection were performed for a simplified vertical gradient freeze (Bridgman) process with bottom seeding for GaAs. Periodical oscillations were superimposed onto the transient heater temperature profile. The amplitudes of the resulting oscillations of the growth rate and the dopant concentration (striations) in the growing crystals are compared with the predictions of analytical models.
Mihura, Joni L; Meyer, Gregory J; Dumitrascu, Nicolae; Bombel, George
2016-01-01
We respond to Tibon Czopp and Zeligman's (2016) critique of our systematic reviews and meta-analyses of 65 Rorschach Comprehensive System (CS) variables published in Psychological Bulletin (2013). The authors endorsed our supportive findings but critiqued the same methodology when used for the 13 unsupported variables. Unfortunately, their commentary was based on significant misunderstandings of our meta-analytic method and results, such as thinking we used introspectively assessed criteria in classifying levels of support and reporting only a subset of our externally assessed criteria. We systematically address their arguments that our construct label and criterion variable choices were inaccurate and, therefore, meta-analytic validity for these 13 CS variables was artificially low. For example, the authors created new construct labels for these variables that they called "the customary CS interpretation," but did not describe their methodology nor provide evidence that their labels would result in better validity than ours. They cite studies they believe we should have included; we explain how these studies did not fit our inclusion criteria and that including them would have actually reduced the relevant CS variables' meta-analytic validity. Ultimately, criticisms alone cannot change meta-analytic support from negative to positive; Tibon Czopp and Zeligman would need to conduct their own construct validity meta-analyses.
Generation and analysis of chemical compound libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregoire, John M.; Jin, Jian; Kan, Kevin S.
2017-10-03
Various samples are generated on a substrate. The samples each include or consist of one or more analytes. In some instances, the samples are generated through the use of gels or through vapor deposition techniques. The samples are used in an instrument for screening large numbers of analytes by locating the samples between a working electrode and a counter electrode assembly. The instrument also includes one or more light sources for illuminating each of the samples. The instrument is configured to measure the photocurrent formed through a sample as a result of the illumination of the sample.
NASA Astrophysics Data System (ADS)
Russo, Thomas V.; Martin, Richard L.; Hay, P. Jeffrey; Rappé, Anthony K.
1995-06-01
The application of analytic second derivative techniques to quantum chemical calculations using effective core potentials is discussed. Using a recent implementation of these techniques, the vibrational frequencies of transition metal compounds are calculated, including the chlorides TiCl4, ZrCl4, and HfCl4, the oxochlorides CrO2Cl2, MoO2Cl2, WO2Cl2, and VOCl3, and the oxide OsO4. Results are compared with previous calculations and with experimental results.
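The abstract concerns analytic second derivatives with effective core potentials; the Python sketch below illustrates only the generic final step of such a calculation, converting a Cartesian Hessian into harmonic frequencies via the mass-weighted Hessian. The artificial diatomic test case and its force constant are assumptions for illustration, not the implementation discussed in the paper.

```python
import numpy as np

def harmonic_frequencies_cm1(hessian_au, masses_amu):
    """Harmonic frequencies (cm^-1) from a Cartesian Hessian in atomic units
    (Hartree/Bohr^2) and atomic masses in amu. Near-zero eigenvalues correspond
    to translations/rotations; negative ones to imaginary modes."""
    amu_to_me = 1822.888486            # electron masses per amu
    au_to_cm1 = 219474.6313705         # Hartree -> cm^-1
    m = np.repeat(np.asarray(masses_amu) * amu_to_me, 3)
    mw_hess = hessian_au / np.sqrt(np.outer(m, m))   # mass-weighted Hessian
    eigvals = np.linalg.eigvalsh(mw_hess)
    return np.sign(eigvals) * np.sqrt(np.abs(eigvals)) * au_to_cm1

# quick check with a diatomic stretched along z (force constant ~1 Hartree/Bohr^2,
# carbon and oxygen masses); only one mode should be non-zero
k = 1.0
H = np.zeros((6, 6))
for i, j, v in [(2, 2, k), (5, 5, k), (2, 5, -k), (5, 2, -k)]:
    H[i, j] = v
print(harmonic_frequencies_cm1(H, [12.0, 15.995]).round(1))
```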
Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review.
Lambe, Kathryn Ann; O'Reilly, Gary; Kelly, Brendan D; Curristan, Sarah
2016-10-01
Diagnostic error incurs enormous human and economic costs. The dual-process model of reasoning provides a framework for understanding the diagnostic process and attributes certain errors to faulty cognitive shortcuts (heuristics). The literature contains many suggestions to counteract these and to enhance analytical and non-analytical modes of reasoning. To identify, describe and appraise studies that have empirically investigated interventions to enhance analytical and non-analytical reasoning among medical trainees and doctors, and to assess their effectiveness. Systematic searches of five databases were carried out (Medline, PsycInfo, Embase, Education Resource Information Centre (ERIC) and Cochrane Database of Controlled Trials), supplemented with searches of bibliographies and relevant journals. Included studies evaluated an intervention to enhance analytical and/or non-analytical reasoning among medical trainees or doctors. Twenty-eight studies were included under five categories: educational interventions, checklists, cognitive forcing strategies, guided reflection, instructions at test and other interventions. While many of the studies found some effect of interventions, guided reflection interventions emerged as the most consistently successful across five studies, and cognitive forcing strategies improved accuracy and confidence judgements. Significant heterogeneity of measurement approaches was observed, and existing studies are largely limited to early-career doctors. Results to date are promising and this relatively young field is now close to a point where these kinds of cognitive interventions can be recommended to educators. Further research with refined methodology and more diverse samples is required before firm recommendations may be made for medical education and policy; however, these results suggest that such interventions hold promise, with much current enthusiasm for new research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
This data set contains the method performance results for CTEPP-OH. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.
The Children’s Total Exposure to Persistent Pesticides and Other Persisten...
Aquatic concentrations of chemical analytes compared to ...
We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting more detailed characterization of these analytes. Purpose: to provide sc
Aquatic concentrations of chemical analytes compared to ecotoxicity estimates
Kostich, Mitchell S.; Flick, Robert W.; Batt, Angela L.; Mash, Heath E.; Boone, J. Scott; Furlong, Edward T.; Kolpin, Dana W.; Glassmeyer, Susan T.
2017-01-01
We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting more detailed characterization of these analytes.
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study aims at providing recommendations concerning the validation of analytical protocols by using routine samples. It is intended to provide a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work has performed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also addresses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach. However, for the sediment matrix, the estimation of proportional/constant bias is also included because of its inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analyzing routine samples is rarely applied to assess the trueness of novel analytical methods, and up to now this approach has not been focused on organochlorine compounds in environmental matrices.
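A minimal Python sketch of one common top-down way to combine intermediate precision with a recovery-bias contribution into a combined and expanded uncertainty follows; the function and the numerical inputs are illustrative assumptions and may differ from the uncertainty budget actually used in the study.

```python
import math

def combined_uncertainty(rel_precision, rel_bias, rel_u_bias, k=2):
    """Relative combined and expanded uncertainty from intermediate precision
    and an (uncorrected) recovery bias, top-down style.

    rel_precision : relative standard deviation under intermediate conditions
    rel_bias      : relative bias estimated from recovery experiments
    rel_u_bias    : relative standard uncertainty of that bias estimate
    k             : coverage factor for the expanded uncertainty
    """
    u_c = math.sqrt(rel_precision**2 + rel_bias**2 + rel_u_bias**2)
    return u_c, k * u_c

# hypothetical figures broadly in line with the reported 20-30% range for water
print(combined_uncertainty(rel_precision=0.10, rel_bias=0.05, rel_u_bias=0.03))
```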
Aquatic concentrations of chemical analytes compared to ecotoxicity estimates.
Kostich, Mitchell S; Flick, Robert W; Batt, Angela L; Mash, Heath E; Boone, J Scott; Furlong, Edward T; Kolpin, Dana W; Glassmeyer, Susan T
2017-02-01
We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting more detailed characterization of these analytes. Published by Elsevier B.V.
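The screening logic described above (comparing measured concentrations to effect-concentration estimates and flagging exceedances of the EC and of EC/10) can be sketched in a few lines of Python; the analyte names and all concentration values below are hypothetical placeholders, not data from the study.

```python
measurements = {            # hypothetical measured concentrations (ug/L)
    "copper": 12.0,
    "atrazine": 0.35,
    "triclosan": 0.08,
}
effect_concentrations = {   # hypothetical lowest effect-concentration estimates (ug/L)
    "copper": 9.0,
    "atrazine": 3.0,
    "triclosan": 0.5,
}

for analyte, conc in measurements.items():
    ec = effect_concentrations[analyte]
    hq = conc / ec                      # screening hazard quotient
    if hq >= 1.0:
        flag = "exceeds EC"
    elif hq >= 0.1:
        flag = "within 1/10 of EC"
    else:
        flag = "below screening concern"
    print(f"{analyte}: HQ = {hq:.2f} ({flag})")
```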
Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D
2012-12-01
The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.
International Space Station Potable Water Characterization for 2013
NASA Technical Reports Server (NTRS)
Straub, John E. II; Plumlee, Debrah K.; Schultz, John R.; Mudgett, Paul D.
2014-01-01
In this post-construction, operational phase of International Space Station (ISS) with an ever-increasing emphasis on its use as a test-bed for future exploration missions, the ISS crews continue to rely on water reclamation systems for the majority of their water needs. The onboard water supplies include US Segment potable water from humidity condensate and urine, Russian Segment potable water from condensate, and ground-supplied potable water, as reserve. In 2013, the cargo returned on the Soyuz 32-35 flights included archival potable water samples collected from Expeditions 34-37. The Water and Food Analytical Laboratory at the NASA Johnson Space Center continued its long-standing role of performing chemical analyses on ISS return water samples to verify compliance with potable water quality specifications. This paper presents and discusses the analytical results for potable water samples returned from Expeditions 34-37, including a comparison to ISS quality standards. During the summer of 2013, the U.S. Segment potable water experienced an anticipated temporary rise and fall in total organic carbon (TOC) content, as the result of organic contamination breaking through the water system's treatment process. Analytical results for the Expedition 36 archival samples returned on Soyuz 34 confirmed that dimethylsilanediol was once again the responsible contaminant, just as it was for comparable TOC rises in 2010 and 2012. Discussion herein includes the use of the in-flight Total Organic Carbon Analyzer (TOCA) as a key monitoring tool for tracking these TOC rises and scheduling appropriate remediation action.
Sierra/SolidMechanics 4.48 Verification Tests Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose
2018-03-01
Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.
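The verification pattern described here, comparing code output against an analytic solution and checking mesh convergence, can be illustrated with a small Python sketch; the tolerance, error values, and mesh sizes are hypothetical, and this is not the Sierra / SM test harness itself.

```python
import math

def check_against_analytic(computed, analytic, rel_tol=1e-3):
    """Pass/fail check of a code result against the exact analytic value."""
    err = abs(computed - analytic) / abs(analytic)
    return err <= rel_tol, err

def observed_convergence_rate(errors, h):
    """Convergence rate p from errors on successively refined meshes,
    assuming error ~ C * h**p."""
    return [math.log(errors[i] / errors[i + 1]) / math.log(h[i] / h[i + 1])
            for i in range(len(errors) - 1)]

# hypothetical tip-displacement results on three meshes
print(check_against_analytic(computed=1.0012, analytic=1.0000))
print(observed_convergence_rate(errors=[4.0e-2, 1.0e-2, 2.5e-3], h=[0.4, 0.2, 0.1]))
```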
Sierra/SolidMechanics 4.48 Verification Tests Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose
Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.
Comparison of three multiplex cytokine analysis systems: Luminex, SearchLight and FAST Quant.
Lash, Gendie E; Scaife, Paula J; Innes, Barbara A; Otun, Harry A; Robson, Steven C; Searle, Roger F; Bulmer, Judith N
2006-02-20
Multiplex cytokine analysis technologies have become readily available in the last five years. Two main formats exist: multiplex sandwich ELISA and bead based assays. While these have each been compared to individual ELISAs, there has been no direct comparison between the two formats. We report here the comparison of two multiplex sandwich ELISA procedures (FAST Quant and SearchLight) and a bead based assay (UpState Luminex). All three kits differed from each other for different analytes and there was no clear pattern of one system giving systematically different results than another for any analyte studied. We suggest that each system has merits and that several factors, including the range of analytes available, the prospect of development of new analytes, the dynamic range of the assay, the sensitivity of the assay, the cost of equipment, the cost of consumables, ease of use and ease of data analysis, need to be considered when choosing a system for use. We also suggest that results obtained from different systems cannot be combined.
Analytical Solution for the Anisotropic Rabi Model: Effects of Counter-Rotating Terms
NASA Astrophysics Data System (ADS)
Zhang, Guofeng; Zhu, Hanjie
2015-03-01
The anisotropic Rabi model, which was proposed recently, differs from the original Rabi model: the rotating and counter-rotating terms are governed by two different coupling constants. This feature allows us to vary the counter-rotating interaction independently and explore its effects on some quantum properties. In this paper, we eliminate the counter-rotating terms approximately and obtain the analytical energy spectra and wavefunctions. These analytical results agree well with the numerical calculations over a wide range of the parameters, including the ultrastrong coupling regime. In the weak counter-rotating coupling limit we find that the counter-rotating terms can be considered as shifts to the parameters of the Jaynes-Cummings model. This modification shows the validity of the rotating-wave approximation under the assumption of near-resonance and relatively weak coupling. Moreover, the analytical expressions of several physical quantities are also derived, and the results show the breakdown of the U(1) symmetry and the deviation from the Jaynes-Cummings model.
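Since the abstract does not reproduce the Hamiltonian, the Python sketch below assumes the usual form H = omega*a†a + (delta/2)*sigma_z + g1*(a sigma_+ + a† sigma_-) + g2*(a† sigma_+ + a sigma_-) and diagonalizes it numerically in a truncated Fock basis, so the rotating-only (g2 = 0) Jaynes-Cummings limit can be compared with a finite counter-rotating coupling; all parameter values are illustrative, and this is a numerical cross-check rather than the analytical treatment of the paper.

```python
import numpy as np

def anisotropic_rabi_spectrum(omega, delta, g1, g2, n_fock=40, n_levels=6):
    """Lowest eigenvalues of the (assumed) anisotropic Rabi Hamiltonian
    H = omega*a^dag a + (delta/2)*sigma_z
        + g1*(a*sigma_+ + a^dag*sigma_-) + g2*(a^dag*sigma_+ + a*sigma_-)
    in a truncated Fock basis; g2 = 0 recovers the Jaynes-Cummings model."""
    a = np.diag(np.sqrt(np.arange(1, n_fock)), 1)        # boson annihilation operator
    ad = a.T
    I_f = np.eye(n_fock)
    sz = np.diag([1.0, -1.0])
    sp = np.array([[0.0, 1.0], [0.0, 0.0]])              # sigma_+
    sm = sp.T                                            # sigma_-
    I_s = np.eye(2)

    H = (omega * np.kron(ad @ a, I_s)
         + 0.5 * delta * np.kron(I_f, sz)
         + g1 * (np.kron(a, sp) + np.kron(ad, sm))       # rotating terms
         + g2 * (np.kron(ad, sp) + np.kron(a, sm)))      # counter-rotating terms
    return np.linalg.eigvalsh(H)[:n_levels]

# rotating term only (JC limit) vs. equal counter-rotating coupling, near resonance
print(anisotropic_rabi_spectrum(omega=1.0, delta=1.0, g1=0.1, g2=0.0).round(4))
print(anisotropic_rabi_spectrum(omega=1.0, delta=1.0, g1=0.1, g2=0.1).round(4))
```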
Analytical solution for the anisotropic Rabi model: effects of counter-rotating terms.
Zhang, Guofeng; Zhu, Hanjie
2015-03-04
The anisotropic Rabi model, which was proposed recently, differs from the original Rabi model: the rotating and counter-rotating terms are governed by two different coupling constants. This feature allows us to vary the counter-rotating interaction independently and explore its effects on some quantum properties. In this paper, we eliminate the counter-rotating terms approximately and obtain the analytical energy spectra and wavefunctions. These analytical results agree well with the numerical calculations over a wide range of the parameters, including the ultrastrong coupling regime. In the weak counter-rotating coupling limit we find that the counter-rotating terms can be considered as shifts to the parameters of the Jaynes-Cummings model. This modification shows the validity of the rotating-wave approximation under the assumption of near-resonance and relatively weak coupling. Moreover, the analytical expressions of several physical quantities are also derived, and the results show the breakdown of the U(1) symmetry and the deviation from the Jaynes-Cummings model.
Effect of crash pulse shape on seat stroke requirements for limiting loads on occupants of aircraft
NASA Technical Reports Server (NTRS)
Carden, Huey D.
1992-01-01
An analytical study was made to provide comparative information on various crash pulse shapes that potentially could be used to test seats under conditions included in Federal Regulations Part 23 Paragraph 23.562(b)(1) for dynamic testing of general aviation seats, show the effects that crash pulse shape can have on the seat stroke requirements necessary to maintain a specified limit loading on the seat/occupant during crash pulse loadings, compare results from certain analytical model pulses with approximations of actual crash pulses, and compare analytical seat results with experimental airplane crash data. Structural and seat/occupant displacement equations in terms of the maximum deceleration, velocity change, limit seat pan load, and pulse time for five potentially useful pulse shapes were derived; from these, analytical seat stroke data were obtained for conditions as specified in Federal Regulations Part 23 Paragraph 23.562(b)(1) for dynamic testing of general aviation seats.
Analytical Solution for the Anisotropic Rabi Model: Effects of Counter-Rotating Terms
Zhang, Guofeng; Zhu, Hanjie
2015-01-01
The anisotropic Rabi model, which was proposed recently, differs from the original Rabi model: the rotating and counter-rotating terms are governed by two different coupling constants. This feature allows us to vary the counter-rotating interaction independently and explore its effects on some quantum properties. In this paper, we eliminate the counter-rotating terms approximately and obtain the analytical energy spectra and wavefunctions. These analytical results agree well with the numerical calculations over a wide range of the parameters, including the ultrastrong coupling regime. In the weak counter-rotating coupling limit we find that the counter-rotating terms can be considered as shifts to the parameters of the Jaynes-Cummings model. This modification shows the validity of the rotating-wave approximation under the assumption of near-resonance and relatively weak coupling. Moreover, the analytical expressions of several physical quantities are also derived, and the results show the breakdown of the U(1) symmetry and the deviation from the Jaynes-Cummings model. PMID:25736827
Measuring solids concentration in stormwater runoff: comparison of analytical methods.
Clark, Shirley E; Siu, Christina Y S
2008-01-15
Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.
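A minimal Python sketch contrasts the two calculations: SSC divides the dried solids mass of the entire sample by the whole-sample volume, while TSS uses only an aliquot, so a settling-biased aliquot underestimates coarse solids. All masses and volumes below are hypothetical.

```python
def ssc_mg_per_l(dried_mass_mg, whole_sample_volume_ml):
    """Suspended-sediment concentration: the entire sample is filtered and dried."""
    return dried_mass_mg / (whole_sample_volume_ml / 1000.0)

def tss_mg_per_l(aliquot_dried_mass_mg, aliquot_volume_ml):
    """Total suspended solids: only a subsample (aliquot) is filtered and dried."""
    return aliquot_dried_mass_mg / (aliquot_volume_ml / 1000.0)

# hypothetical sample: 500 mL containing 50 mg of solids (true value 100 mg/L);
# rapid settling of coarse particles leaves only 3.5 mg in a 50 mL aliquot
print(ssc_mg_per_l(50.0, 500.0))      # 100 mg/L
print(tss_mg_per_l(3.5, 50.0))        # 70 mg/L -> underestimate from aliquot bias
```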
Duri, Simon; Tran, Chieu D.
2013-01-01
We have successfully developed a simple and one step method to prepare high performance supramolecular polysaccharide composites from cellulose (CEL), chitosan (CS) and (2,3,6-tri-O-acetyl)-α-, β- and γ-cyclodextrin (α-, β- and γ-TCD). In this method, [BMIm+Cl−], an ionic liquid (IL), was used as a solvent to dissolve and prepare the composites. Since the majority (>88%) of the IL used was recovered for reuse, the method is recyclable. XRD, FT-IR, NIR and SEM were used to monitor the dissolution process and to confirm that the polysaccharides were regenerated without any chemical modifications. It was found that the unique properties of each component, including superior mechanical properties (from CEL), excellent adsorption of pollutants and toxins (from CS) and size/structure selectivity through inclusion complex formation (from TCDs), remain intact in the composites. Specifically, results from kinetics and adsorption isotherms show that while CS-based composites can effectively adsorb the endocrine disruptors (polychlorophenols, bisphenol-A), their adsorption is independent of the size and structure of the analytes. Conversely, the adsorption by γ-TCD-based composites exhibits strong dependence on the size and structure of the analytes. For example, while all three TCD-based composites (i.e., α-, β- and γ-TCD) can effectively adsorb 2-, 3- and 4-chlorophenol, only the γ-TCD-based composite can adsorb analytes with bulky groups including 3,4-dichloro- and 2,4,5-trichlorophenol. Furthermore, equilibrium sorption capacities for the analytes with bulky groups by the γ-TCD-based composite are much higher than those by CS-based composites. Together, these results indicate that the γ-TCD-based composite, with its relatively larger cavity size, can readily form inclusion complexes with analytes bearing bulky groups, and through inclusion complex formation it can strongly adsorb much more of these analytes and with size/structure selectivity, compared to CS-based composites which can adsorb the analyte only by surface adsorption. PMID:23517477
Applying an analytical method to study neutron behavior for dosimetry
NASA Astrophysics Data System (ADS)
Shirazi, S. A. Mousavi
2016-12-01
In this investigation, a new dosimetry process is studied by applying an analytical method. This novel process is applied to human liver tissue, whose composition includes water, glycogen and other constituents. In this study, the organic compounds of the liver are decomposed into their constituent elements based upon the mass percentage and density of each element. The absorbed doses are computed by the analytical method for all constituent elements of the liver tissue. This analytical method is formulated using mathematical equations based on neutron behavior and neutron collision rules. The results show that the absorbed doses converge for neutron energies below 15 MeV. This method can be applied to study the interaction of neutrons in other tissues and to estimate the absorbed dose for a wide range of neutron energies.
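The element-wise bookkeeping described above can be sketched as a kerma-style sum over elemental mass fractions; in the Python sketch below, the mass fractions are approximate soft-tissue values and the fluence-to-kerma coefficients are placeholders, not the equations or data of the paper.

```python
# Approximate mass fractions of a liver-like soft tissue (for illustration only)
mass_fractions = {"H": 0.102, "C": 0.139, "N": 0.030, "O": 0.716, "others": 0.013}

# Placeholder fluence-to-kerma coefficients k_i(E) in Gy*cm^2 per element at one
# neutron energy; real values must come from tabulated nuclear data.
kerma_coeff = {"H": 5.0e-11, "C": 2.0e-12, "N": 2.5e-12, "O": 1.5e-12, "others": 2.0e-12}

fluence = 1.0e7   # neutrons per cm^2 (hypothetical)

# Kerma-approximation dose: D ~ fluence * sum_i w_i * k_i(E)
dose_gy = fluence * sum(mass_fractions[el] * kerma_coeff[el] for el in mass_fractions)
print(f"estimated absorbed dose: {dose_gy:.3e} Gy")
```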
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2013 CFR
2013-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2014 CFR
2014-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2010 CFR
2010-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
Ungerer, Jacobus P J; Pretorius, Carel J
2014-04-01
Highly-sensitive cardiac troponin (cTn) assays are being introduced into the market. In this study we argue that the classification of cTn assays into sensitive and highly-sensitive is flawed and recommend a more appropriate way to characterize analytical sensitivity of cTn assays. The raw data of 2252 cardiac troponin I (cTnI) tests done in duplicate with a 'sensitive' assay was extracted and used to calculate the cTnI levels in all, including those below the 'limit of detection' (LoD) that were censored. Duplicate results were used to determine analytical imprecision. We show that cTnI can be quantified in all samples including those with levels below the LoD and that the actual margins of error decrease as concentrations approach zero. The dichotomous classification of cTn assays into sensitive and highly-sensitive is theoretically flawed and characterizing analytical sensitivity as a continuous variable based on imprecision at 0 and the 99th percentile cut-off would be more appropriate.
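One simple way to characterize imprecision from duplicate results, as described above, is the within-pair standard deviation (Dahlberg formula); the Python sketch below uses hypothetical low-level cTnI duplicates and is not the authors' exact statistical treatment.

```python
import math

def duplicate_sd(pairs):
    """Within-pair analytical standard deviation from duplicate measurements
    (Dahlberg formula): sqrt(sum(d_i^2) / (2*n))."""
    n = len(pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))

# hypothetical low-level cTnI duplicates in ng/L, including results below the LoD
duplicates = [(1.2, 1.5), (0.4, 0.2), (2.8, 2.5), (0.1, 0.3), (5.0, 4.6)]
sd = duplicate_sd(duplicates)
mean = sum(a + b for a, b in duplicates) / (2 * len(duplicates))
print(f"within-duplicate SD = {sd:.2f} ng/L, CV at mean {mean:.1f} ng/L = {100*sd/mean:.0f}%")
```

Reporting the standard deviation directly, rather than the coefficient of variation, avoids the divergence of relative error estimates as concentrations approach zero.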
Analytical modeling of circuit aerodynamics in the new NASA Lewis wind tunnel
NASA Technical Reports Server (NTRS)
Towne, C. E.; Povinelli, L. A.; Kunik, W. G.; Muramoto, K. K.; Hughes, C. E.; Levy, R.
1985-01-01
Rehabilitation and extension of the capability of the altitude wind tunnel (AWT) were analyzed. The analytical modeling program involves the use of advanced axisymmetric and three dimensional viscous analyses to compute the flow through the various AWT components. Results for the analytical modeling of the high speed leg aerodynamics are presented; these include: an evaluation of the flow quality at the entrance to the test section, an investigation of the effects of test section bleed for different model blockages, and an examination of three dimensional effects in the diffuser due to reentry flow and due to the change in cross sectional shape of the exhaust scoop.
Urban Space Explorer: A Visual Analytics System for Urban Planning.
Karduni, Alireza; Cho, Isaac; Wessel, Ginette; Ribarsky, William; Sauda, Eric; Dou, Wenwen
2017-01-01
Understanding people's behavior is fundamental to many planning professions (including transportation, community development, economic development, and urban design) that rely on data about frequently traveled routes, places, and social and cultural practices. Based on the results of a practitioner survey, the authors designed Urban Space Explorer, a visual analytics system that utilizes mobile social media to enable interactive exploration of public-space-related activity along spatial, temporal, and semantic dimensions.
NASA Technical Reports Server (NTRS)
Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.
1974-01-01
The NASA Structural Analysis System (NASTRAN) Model 1 finite element idealization, input data, and detailed analytical results are presented. The data presented include: substructuring analysis for normal modes, plots of member data, plots of symmetric free-free modes, plots of antisymmetric free-free modes, analysis of the wing, analysis of the cargo doors, analysis of the payload, and analysis of the orbiter.
Processing plutonium-contaminated soil on Johnston Atoll
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moroney, K.; Moroney, J. III; Turney, J.
1994-07-01
This article describes a cleanup project to process plutonium- and americium-contaminated soil on Johnston Atoll for volume reduction. Thermo Analytical's (TMA's) segmented gate system (SGS) for this remedial operation has been in successful on-site operation since 1992. Topics covered include the basis for development; a description of the Johnston Atoll; the significance of results; the benefits of the technology; and applicability to other radiologically contaminated sites. 7 figs., 1 tab.
Efficient Band-to-Trap Tunneling Model Including Heterojunction Band Offset
Gao, Xujiao; Huang, Andy; Kerr, Bert
2017-10-25
In this paper, we present an efficient band-to-trap tunneling model based on the Schenk approach, in which an analytic density-of-states (DOS) model is developed based on the open boundary scattering method. The new model explicitly includes the effect of heterojunction band offset, in addition to the well-known field effect. Its analytic form enables straightforward implementation into TCAD device simulators. It is applicable to all one-dimensional potentials, which can be approximated to a good degree such that the approximated potentials lead to piecewise analytic wave functions with open boundary conditions. The model allows for simulating both the electric-field-enhanced and band-offset-enhanced carrier recombination due to the band-to-trap tunneling near the heterojunction in a heterojunction bipolar transistor (HBT). Simulation results of an InGaP/GaAs/GaAs NPN HBT show that the proposed model predicts significantly increased base currents, due to the hole-to-trap tunneling enhanced by the emitter-base junction band offset. Finally, the results compare favorably with experimental observation.
Efficient Band-to-Trap Tunneling Model Including Heterojunction Band Offset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Xujiao; Huang, Andy; Kerr, Bert
In this paper, we present an efficient band-to-trap tunneling model based on the Schenk approach, in which an analytic density-of-states (DOS) model is developed based on the open boundary scattering method. The new model explicitly includes the effect of heterojunction band offset, in addition to the well-known field effect. Its analytic form enables straightforward implementation into TCAD device simulators. It is applicable to all one-dimensional potentials, which can be approximated to a good degree such that the approximated potentials lead to piecewise analytic wave functions with open boundary conditions. The model allows for simulating both the electric-field-enhanced and band-offset-enhanced carrier recombination due to the band-to-trap tunneling near the heterojunction in a heterojunction bipolar transistor (HBT). Simulation results of an InGaP/GaAs/GaAs NPN HBT show that the proposed model predicts significantly increased base currents, due to the hole-to-trap tunneling enhanced by the emitter-base junction band offset. Finally, the results compare favorably with experimental observation.
STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.
2010-09-02
Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
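The report describes the UCL95% as a function of the number of samples, the average, and the standard deviation of the analytical results. A common one-sided Student-t form consistent with that description is sketched below; the formula choice and the concentrations are assumptions for illustration, not taken from the report.

```python
import numpy as np
from scipy import stats

def ucl95(values):
    """One-sided upper 95% confidence limit on the mean:
    mean + t(0.95, n-1) * s / sqrt(n)."""
    x = np.asarray(values, dtype=float)
    n = x.size
    return x.mean() + stats.t.ppf(0.95, n - 1) * x.std(ddof=1) / np.sqrt(n)

# Hypothetical analyte concentrations from six scrape samples (arbitrary units).
samples = [1.8, 2.1, 1.6, 2.4, 1.9, 2.0]
print(f"UCL95% = {ucl95(samples):.2f}")
```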
STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.
2010-09-02
Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
Understanding resistant effect of mosquito on fumigation strategy in dengue control program
NASA Astrophysics Data System (ADS)
Aldila, D.; Situngkir, N.; Nareswari, K.
2018-01-01
A mathematical model of dengue disease transmission that incorporates a fumigation intervention in the mosquito population is introduced in this talk. The worsening effect of uncontrolled fumigation, in the form of mosquito resistance to the fumigation chemicals, is also included in the model to capture conditions in the field. A deterministic approach based on a 9-dimensional system of ordinary differential equations is used. Analytical results on the existence and local stability of the equilibrium points, together with the basic reproduction number, are discussed. Numerical results are presented for several scenarios to support the interpretation of the analytical results.
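As a rough illustration of the deterministic ODE approach described above, the sketch below integrates a deliberately simplified host-vector system with a fumigation-resistant mosquito class. The compartments, parameter names, and values are invented for illustration and are not the authors' 9-dimensional model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rates (assumptions, not the authors' parameters).
beta_h, beta_v, gamma = 0.4, 0.3, 0.1   # host infection, vector infection, recovery
u, rho = 0.2, 0.05                      # fumigation rate, resistance emergence

def rhs(t, y):
    Sh, Ih, Sv, Iv, Rv = y              # hosts S/I, vectors S/I, resistant infected vectors
    dSh = -beta_h * Sh * (Iv + Rv)
    dIh = beta_h * Sh * (Iv + Rv) - gamma * Ih
    dSv = -beta_v * Sv * Ih - u * Sv
    dIv = beta_v * Sv * Ih - u * Iv - rho * Iv
    dRv = rho * Iv                      # infected vectors that fumigation no longer removes
    return [dSh, dIh, dSv, dIv, dRv]

sol = solve_ivp(rhs, (0, 200), [0.99, 0.01, 0.95, 0.05, 0.0])
print(f"infected hosts at t=200: {sol.y[1, -1]:.4f}")
```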
NASA Technical Reports Server (NTRS)
Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.
1985-01-01
Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.
Irregular analytical errors in diagnostic testing - a novel concept.
Vogeser, Michael; Seger, Christoph
2018-02-23
In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a process for a quality-control-sample-based approach of quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to incorrect pipetting volume due to air bubbles in a sample), which can both lead to inaccurate results and risks for patients.
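The operational test implied by this definition can be sketched as follows; the coverage factor k and the numeric values are assumptions, since the article states the criterion qualitatively (a deviation larger than the linear combination of process measurement uncertainty and method bias).

```python
def is_irregular_error(routine_result, reference_result, u_process, bias, k=2.0):
    """Flag a result whose deviation from the reference measurement procedure
    exceeds what routine process uncertainty plus method bias can explain.
    The coverage factor k = 2 is an illustrative assumption."""
    allowed = k * u_process + abs(bias)
    return abs(routine_result - reference_result) > allowed

# Illustrative values only.
print(is_irregular_error(routine_result=5.8, reference_result=4.0,
                         u_process=0.3, bias=0.2))  # True -> irregular error suspected
```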
Prioritizing pesticide compounds for analytical methods development
Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.
2012-01-01
The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1 compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. 
More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1 by adding as many additional Tier 1 compounds as are analytically compatible. About 35 percent of the Tier 1 compounds for sediment are high priority on the basis of measured occurrence. A total of 74 compounds, or 42 percent, are high priority on the basis of predicted likelihood of occurrence according to physical-chemical properties, and either have potential toxicity to aquatic life, high pesticide usage, or both. The remaining 22 percent of Tier 1 pesticide compounds were either degradates of Tier 1 parent compounds or included for other reasons. As with water, the Tier 1 pesticide compounds for sediment are distributed across the major pesticide-use groups; insecticides and their degradates are the largest fraction, making up 45 percent of Tier 1. In contrast to water, organochlorines, at 17 percent, are the largest chemical class for Tier 1 in sediment, which is to be expected because there is continued widespread detection in sediments of persistent organochlorine pesticides and their degradates at concentrations high enough for potential effects on aquatic life. Compared to water, there are fewer available benchmarks with which to compare contaminant concentrations in sediment, but a total of 19 Tier 1 compounds have at least one sediment benchmark or screening value for aquatic organisms. Of the 175 compounds in Tier 1, 77 percent have high aquatic-life toxicity, as defined for this process. This evaluation of pesticides and degradates resulted in two lists of compounds that are priorities for USGS analytical methods development, one for water and one for sediment. These lists will be used as the basis for redesigning and enhancing USGS analytical capabilities for pesticides in order to capture as many high-priority pesticide compounds as possible using an economically feasible approach.
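The tiering logic described above can be caricatured as a simple rule: any high-priority criterion places a compound in Tier 1, weaker evidence in Tier 2, and the rest in Tier 3. The sketch below is a toy illustration with invented flags; the actual USGS screening procedure weighs many more inputs.

```python
def classify_pesticide(measured_occurrence_high, predicted_concern,
                       toxicity_concern, degradate_of_tier1_parent):
    """Toy tiering rule loosely patterned on the criteria described above."""
    if measured_occurrence_high or degradate_of_tier1_parent or \
            (predicted_concern and toxicity_concern):
        return "Tier 1"
    if predicted_concern or toxicity_concern:
        return "Tier 2"
    return "Tier 3"

print(classify_pesticide(False, True, True, False))  # -> Tier 1
```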
Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit
2016-03-01
Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
Bidny, Sergei; Gago, Kim; Chung, Phuong; Albertyn, Desdemona; Pasin, Daniel
2017-04-01
An analytical method using ultra performance liquid chromatography (UPLC) quadrupole time-of-flight mass spectrometry (QTOF-MS) was developed and validated for the targeted toxicological screening and quantification of commonly used pharmaceuticals and drugs of abuse in postmortem blood using a 100 µL sample. It screens for more than 185 drugs and metabolites and quantifies more than 90 drugs. The selected compounds include classes of pharmaceuticals and drugs of abuse such as: antidepressants, antipsychotics, analgesics (including narcotic analgesics), anti-inflammatory drugs, benzodiazepines, beta-blockers, amphetamines, new psychoactive substances (NPS), cocaine and metabolites. Compounds were extracted into acetonitrile using a salting-out assisted liquid-liquid extraction (SALLE) procedure. The extracts were analyzed using a Waters ACQUITY UPLC coupled with a XEVO QTOF mass spectrometer. Separation of the analytes was achieved by gradient elution using a Waters ACQUITY HSS C18 column (2.1 mm x 150 mm, 1.8 μm). The mass spectrometer was operated in both positive and negative electrospray ionization modes. The high-resolution mass spectrometry (HRMS) data were acquired using a patented Waters MSE acquisition mode which collected low and high energy spectra alternately during the same acquisition. Positive identification of target analytes was based on accurate mass measurements of the molecular ion, product ion, peak area ratio and retention times. Calibration curves were linear over the concentration range 0.05-2 mg/L for basic and neutral analytes and 0.1-6 mg/L for acidic analytes with the correlation coefficients (r2) > 0.96 for most analytes. The limits of detection (LOD) were between 0.001-0.05 mg/L for all analytes. Good recoveries were achieved ranging from 80% to 100% for most analytes using the SALLE method. The method was validated for sensitivity, selectivity, accuracy, precision, stability, carryover and matrix effects. The developed method was tested on a number of authentic forensic samples producing consistent results that correlated with results obtained from other validated methods. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
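The calibration acceptance criterion above (linearity with r2 > 0.96 over the working range) can be checked with an ordinary least-squares fit. The points below are invented for illustration; only the acceptance threshold comes from the abstract.

```python
import numpy as np

# Hypothetical calibration points for one basic analyte (mg/L vs. peak area);
# the published working range for basic/neutral analytes was 0.05-2 mg/L.
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0])
area = np.array([520, 1015, 2480, 5100, 10250, 20300])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
print(f"slope={slope:.1f}, intercept={intercept:.1f}, r2={r2:.4f}")  # accept if r2 > 0.96
```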
Spatially resolved thermal desorption/ionization coupled with mass spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jesse, Stephen; Van Berkel, Gary J; Ovchinnikova, Olga S
2013-02-26
A system and method for sub-micron analysis of a chemical composition of a specimen are described. The method includes providing a specimen for evaluation and a thermal desorption probe, thermally desorbing an analyte from a target site of said specimen using the thermally active tip to form a gaseous analyte, ionizing the gaseous analyte to form an ionized analyte, and analyzing a chemical composition of the ionized analyte. The thermally desorbing step can include heating said thermally active tip to above 200 °C, and positioning the target site and the thermally active tip such that the heating step forms the gaseous analyte. The thermal desorption probe can include a thermally active tip extending from a cantilever body and an apex of the thermally active tip can have a radius of 250 nm or less.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
2017-01-01
Background Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Materials and Methods Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician’s request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. Results The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani hospitals focus only on the analytical phase, and none of the pre-analytical errors were recorded. Interestingly, none of the labs were internationally accredited; therefore, corrective actions are needed at these hospitals to ensure better health outcomes. Internal and External Quality Assessment Schemes (EQAS) for the pre-analytical phase at Sulaimani clinical laboratories should be implemented at public hospitals. Furthermore, lab personnel, particularly phlebotomists, need continuous training on the importance of sample quality to obtain accurate test results. PMID:28107395
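A proportional (two-proportion) Z test of the kind mentioned above compares rejection rates between two laboratories using a pooled variance. The counts in the sketch are invented; only the choice of test comes from the study.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion Z test (pooled-variance form) comparing error rates
    x1/n1 and x2/n2 between two labs."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g. 49 hemolyzed samples of 550 in lab A vs. 30 of 550 in lab B (illustrative).
print(f"z = {two_proportion_z(49, 550, 30, 550):.2f}")
```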
NASA Technical Reports Server (NTRS)
Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.
1985-01-01
An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.
Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.
Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F
2016-01-01
Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process and is a major component of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or un-interpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placing, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of unsuitable samples for quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests as these are often considered as "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence factors. This review is a summary of the most important recommendations regarding the importance of pre-analytical factors for coagulation testing and should be a tool to increase awareness about the importance of pre-analytical factors for coagulation testing.
What information on measurement uncertainty should be communicated to clinicians, and how?
Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea
2018-02-02
The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase in order to assure the right interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratories accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result which expresses the lack of knowledge of the true value and quantifies the uncertainty of a result, incorporating the factors known to influence it. Even if the MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as information that should facilitate an appropriate interpretation of quantitative results (quantity values). Therefore, MU has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of compliance with analytical performance characteristics; for physicians (and patients) it may help in the interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, providing objective information. Here we describe how MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results, connecting the efforts made within the laboratory to provide more accurate and reliable results with a more objective tool for their interpretation by physicians. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
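One common way to attach MU to a reported quantity value is as an expanded uncertainty U = k·u_c appended to the result. The sketch below is a generic formatting example under that assumption; it is not the specific reporting format proposed by the authors, and the values and unit are invented.

```python
def report_with_mu(value, u_combined, k=2, unit="mmol/L"):
    """Format a quantity value with expanded uncertainty U = k * u_c
    (k = 2 gives roughly 95% coverage). A generic sketch, not the paper's format."""
    U = k * u_combined
    return f"{value:.2f} {unit} (expanded uncertainty ± {U:.2f} {unit}, k={k})"

print(report_with_mu(5.30, 0.08))  # illustrative result only
```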
Wexler, Eliezer J.
1992-01-01
Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems having uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of selected solutions, source codes for the computer programs, and samples of program input and output also are included.
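As one example of the family of solutions such compilations typically contain, the classic one-dimensional continuous-source solution for uniform flow (Ogata-Banks) can be evaluated directly. The sketch below uses illustrative parameter values and is not necessarily one of the report's specific program implementations.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    """Relative concentration C/C0 for 1-D advective-dispersive transport from a
    continuous source at x = 0 (Ogata-Banks solution).
    x: distance, t: time, v: seepage velocity, D: longitudinal dispersion coefficient."""
    a = (x - v * t) / (2.0 * np.sqrt(D * t))
    b = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

# Relative concentration 50 m downgradient after 100 days (illustrative values).
print(f"C/C0 = {ogata_banks(x=50.0, t=100.0, v=0.5, D=1.0):.3f}")
```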
Nonlinear pulse propagation and phase velocity of laser-driven plasma waves
NASA Astrophysics Data System (ADS)
Benedetti, Carlo; Rossi, Francesco; Schroeder, Carl; Esarey, Eric; Leemans, Wim
2014-10-01
We investigate and characterize the laser evolution and plasma wave excitation by a relativistically intense, short-pulse laser propagating in a preformed parabolic plasma channel, including the effects of pulse steepening, frequency redshifting, and energy depletion. We derived in 3D, and in the weakly relativistic intensity regime, analytical expressions for the laser energy depletion, the pulse self-steepening rate, the laser intensity centroid velocity, and the phase velocity of the plasma wave. Analytical results have been validated numerically using the 2D-cylindrical, ponderomotive code INF&RNO. We also discuss the extension of these results to the nonlinear regime, where an analytical theory of the nonlinear wake phase velocity is lacking. Work supported by the Office of Science, Office of High Energy Physics, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.
Analyses of ACPL thermal/fluid conditioning system
NASA Technical Reports Server (NTRS)
Stephen, L. A.; Usher, L. H.
1976-01-01
Results of engineering analyses are reported. Initial computations were made using a modified control transfer function where the systems performance was characterized parametrically using an analytical model. The analytical model was revised to represent the latest expansion chamber fluid manifold design, and systems performance predictions were made. Parameters which were independently varied in these computations are listed. Systems predictions which were used to characterize performance are primarily transient computer plots comparing the deviation between average chamber temperature and the chamber temperature requirement. Additional computer plots were prepared. Results of parametric computations with the latest fluid manifold design are included.
Progressive damage, fracture predictions and post mortem correlations for fiber composites
NASA Technical Reports Server (NTRS)
1985-01-01
Lewis Research Center is involved in the development of computational mechanics methods for predicting the structural behavior and response of composite structures. In conjunction with the analytical methods development, experimental programs including post failure examination are conducted to study various factors affecting composite fracture such as laminate thickness effects, ply configuration, and notch sensitivity. Results indicate that the analytical capabilities incorporated in the CODSTRAN computer code are effective in predicting the progressive damage and fracture of composite structures. In addition, the results being generated are establishing a data base which will aid in the characterization of composite fracture.
The case for visual analytics of arsenic concentrations in foods.
Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R
2010-05-01
Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods making it impractical and impossible to provide regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.
The Case for Visual Analytics of Arsenic Concentrations in Foods
Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.
2010-01-01
Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods making it impractical and impossible to provide regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005
Warren, Alexander D; Conway, Ulric; Arthur, Christopher J; Gates, Paul J
2016-07-01
The analysis of low molecular weight compounds by matrix-assisted laser desorption/ionisation mass spectrometry is problematic due to the interference and suppression of analyte ionisation by the matrices typically employed - which are themselves low molecular weight compounds. The application of colloidal graphite is demonstrated here as an easy to use matrix that can promote the ionisation of a wide range of analytes including low molecular weight organic compounds, complex natural products and inorganic complexes. Analyte ionisation with colloidal graphite is compared with traditional organic matrices along with various other sources of graphite (e.g. graphite rods and charcoal pencils). Factors such as ease of application, spectra reproducibility, spot longevity, spot-to-spot reproducibility and spot homogeneity (through single spot imaging) are explored. For some analytes, considerable matrix suppression effects are observed resulting in spectra completely devoid of matrix ions. We also report the observation of radical molecular ions [M(-●)] in the negative ion mode, particularly with some aromatic analytes. Copyright © 2016 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R
Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic, metallic and non metallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives including highest accuracy and precision with well defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME) and several other inter-laboratory round robin exercises to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities, and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also comparative data between LANL and peer groups for Pu assay and isotopic analysis.
Data Analytics of Hydraulic Fracturing Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jovan Yang; Viswanathan, Hari; Hyman, Jeffery
These are a set of slides on the data analytics of hydraulic fracturing data. The conclusions from this research are the following: they proposed a permeability evolution as a new mechanism to explain hydraulic fracturing trends; they created a model to include this mechanism and it showed promising results; the paper from this research is ready for submission; they devised a way to identify and sort refractures in order to study their effects, and this paper is currently being written.
The Top 10 Challenges in Extreme-Scale Visual Analytics
Wong, Pak Chung; Shen, Han-Wei; Johnson, Christopher R.; Chen, Chaomei; Ross, Robert B.
2013-01-01
In this issue of CG&A, researchers share their R&D findings and results on applying visual analytics (VA) to extreme-scale data. Having surveyed these articles and other R&D in this field, we’ve identified what we consider the top challenges of extreme-scale VA. To cater to the magazine’s diverse readership, our discussion evaluates challenges in all areas of the field, including algorithms, hardware, software, engineering, and social issues. PMID:24489426
NASA Astrophysics Data System (ADS)
Hosseini-Hashemi, Shahrokh; Sepahi-Boroujeni, Amin; Sepahi-Boroujeni, Saeid
2018-04-01
Normal impact performance of a system including a fullerene molecule and a single-layered graphene sheet is studied in the present paper. Firstly, through a mathematical approach, a new contact law is derived to describe the overall non-bonding interaction forces of the "hollow indenter-target" system. Preliminary verifications show that the derived contact law gives a reliable picture of the force field of the system, in good agreement with the results of molecular dynamics (MD) simulations. Afterwards, the equation of the transversal motion of the graphene sheet is formulated on the basis of both the nonlocal theory of elasticity and the assumptions of classical plate theory. Then, to derive the dynamic behavior of the system, a set comprising the proposed contact law and the equations of motion of both the graphene sheet and the fullerene molecule is solved numerically. In order to evaluate the outcomes of this method, the problem is also modeled by MD simulation. Despite intrinsic differences between the analytical and MD methods, as well as various errors arising from the transient nature of the problem, acceptable agreement is established between the analytical and MD outcomes. As a result, the proposed analytical method can be reliably used to address similar impact problems. Furthermore, it is found that a single-layered graphene sheet is capable of trapping fullerenes approaching with low velocities. Otherwise, in case of rebound, the sheet effectively absorbs a predominant portion of the fullerene energy.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-17
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER10-2541-000] Maple Analytics, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... proceeding of Maple Analytics, LLC's application for market-based rate authority, with an accompanying rate...
Han, Thomas Yong-Jin; Valdez, Carlos A; Olson, Tammy Y; Kim, Sung Ho; Satcher, Jr., Joe H
2015-04-21
In one embodiment, a system includes a plurality of metal nanoparticles functionalized with a plurality of organic molecules tethered thereto, wherein the plurality of organic molecules preferentially interact with one or more analytes when placed in proximity therewith. According to another embodiment, a method for detecting analytes includes contacting a fluid having one or more analytes of interest therein with a plurality of metal nanoparticles, each metal nanoparticle having a plurality of organic molecules tethered thereto, and detecting Raman scattering from an analyte of interest from the fluid, the analyte interacting with one or more of the plurality of organic molecules. In another embodiment, a method includes chemically modifying a plurality of cyclodextrin molecules at a primary hydroxyl moiety to create a chemical handle, and tethering the plurality of cyclodextrin molecules to a metal nanoparticle using the chemical handle. Other systems and methods for detecting analytes are also described.
International Space Station Potable Water Characterization for 2013
NASA Technical Reports Server (NTRS)
Straub, John E., II; Plumlee, Debrah K.; Schultz, John R.; Mudgett, Paul D.
2014-01-01
In this post-construction, operational phase of the International Space Station (ISS), with an ever-increasing emphasis on its use as a test-bed for future exploration missions, the ISS crews continue to rely on water reclamation systems for the majority of their water needs. The onboard water supplies include U.S. Segment potable water from humidity condensate and urine, Russian Segment potable water from condensate, and ground-supplied potable water, as reserve. In 2013, the cargo returned on the Soyuz 32-35 flights included archival potable water samples collected from Expeditions 34-37. The former Water and Food Analytical Laboratory (now Toxicology and Environmental Chemistry Laboratory) at the NASA Johnson Space Center continued its long-standing role of performing chemical analyses on ISS return water samples to verify compliance with potable water quality specifications. This paper presents and discusses the analytical results for potable water samples returned from Expeditions 34-37, including a comparison to ISS quality standards. During the summer of 2013, the U.S. Segment potable water experienced a third temporary rise and fall in total organic carbon (TOC) content, as the result of organic contamination breaking through the water system's treatment process. Analytical results for the Expedition 36 archival samples returned on Soyuz 34 confirmed that dimethylsilanediol was once again the responsible contaminant, just as it was for the previous comparable TOC rises in 2010 and 2012. Discussion herein includes the use of the in-flight total organic carbon analyzer (TOCA) as a key monitoring tool for tracking these TOC rises and scheduling appropriate remediation.
NASA Technical Reports Server (NTRS)
Mctiernan, James M.; Petrosian, Vahe
1989-01-01
For many astrophysical situations, such as solar flares or cosmic gamma-ray bursts, continuum gamma rays with energies up to hundreds of MeV have been observed and can be interpreted as bremsstrahlung radiation by relativistic electrons. The region of acceleration for these particles is not necessarily the same as the region in which the radiation is produced, and the effects of the transport of the electrons must be included in the general problem. Hence it is necessary to solve the kinetic equation for relativistic electrons, including all the interactions and loss mechanisms relevant at such energies. The resulting kinetic equation for non-thermal electrons, including the effects of Coulomb collisions and losses due to synchrotron emission, was solved analytically in some simple limiting cases, and numerically for the general cases including constant and varying background plasma density and magnetic field. New approximate analytic solutions are presented for collision-dominated cases, for small pitch angles and all energies; for synchrotron-dominated cases, both steady-state and time dependent, for all pitch angles and energies; and for cases when both synchrotron and collisional energy losses are important, but for relativistic electrons. These analytic solutions are compared to the full numerical results in the proper limits. These results will be useful for calculating the spectra and angular distribution of the radiation (X-rays, gamma rays, and microwaves) emitted via synchrotron or bremsstrahlung processes by the electrons. These properties and their relevance to observations will be discussed in subsequent papers.
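For orientation, the textbook energy-loss rates that enter such a kinetic equation (standard forms in Gaussian units; not necessarily the exact expressions used in the paper) are, for an electron of Lorentz factor γ, speed βc, and pitch angle α in a plasma of density n and magnetic field B:

```latex
-\left(\frac{dE}{dt}\right)_{\mathrm{Coul}} \simeq \frac{4\pi r_e^{2} m_e c^{3}\, n \ln\Lambda}{\beta},
\qquad
-\left(\frac{dE}{dt}\right)_{\mathrm{syn}} = \frac{2}{3}\, r_e^{2} c\, B^{2}\, \beta^{2}\gamma^{2}\sin^{2}\alpha ,
```

where r_e is the classical electron radius and ln Λ the Coulomb logarithm; the kinetic equation balances transport and pitch-angle scattering against these losses.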
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindberg, Michael J.
2010-09-28
Between October 14, 2009 and February 22, 2010 sediment samples were received from 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100 BC 5 OU. The analyses for this project were performed at the 325 building located in the 300 Area of the Hanford Site. The analyses were performed according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.
Engineering of a miniaturized, robotic clinical laboratory
Nourse, Marilyn B.; Engel, Kate; Anekal, Samartha G.; Bailey, Jocelyn A.; Bhatta, Pradeep; Bhave, Devayani P.; Chandrasekaran, Shekar; Chen, Yutao; Chow, Steven; Das, Ushati; Galil, Erez; Gong, Xinwei; Gessert, Steven F.; Ha, Kevin D.; Hu, Ran; Hyland, Laura; Jammalamadaka, Arvind; Jayasurya, Karthik; Kemp, Timothy M.; Kim, Andrew N.; Lee, Lucie S.; Liu, Yang Lily; Nguyen, Alphonso; O'Leary, Jared; Pangarkar, Chinmay H.; Patel, Paul J.; Quon, Ken; Ramachandran, Pradeep L.; Rappaport, Amy R.; Roy, Joy; Sapida, Jerald F.; Sergeev, Nikolay V.; Shee, Chandan; Shenoy, Renuka; Sivaraman, Sharada; Sosa‐Padilla, Bernardo; Tran, Lorraine; Trent, Amanda; Waggoner, Thomas C.; Wodziak, Dariusz; Yuan, Amy; Zhao, Peter; Holmes, Elizabeth A.
2018-01-01
Abstract The ability to perform laboratory testing near the patient and with smaller blood volumes would benefit patients and physicians alike. We describe our design of a miniaturized clinical laboratory system with three components: a hardware platform (ie, the miniLab) that performs preanalytical and analytical processing steps using miniaturized sample manipulation and detection modules, an assay‐configurable cartridge that provides consumable materials and assay reagents, and a server that communicates bidirectionally with the miniLab to manage assay‐specific protocols and analyze, store, and report results (i.e., the virtual analyzer). The miniLab can detect analytes in blood using multiple methods, including molecular diagnostics, immunoassays, clinical chemistry, and hematology. Analytical performance results show that our qualitative Zika virus assay has a limit of detection of 55 genomic copies/ml. For our anti‐herpes simplex virus type 2 immunoglobulin G, lipid panel, and lymphocyte subset panel assays, the miniLab has low imprecision, and method comparison results agree well with those from the United States Food and Drug Administration‐cleared devices. With its small footprint and versatility, the miniLab has the potential to provide testing of a range of analytes in decentralized locations. PMID:29376134
Engineering of a miniaturized, robotic clinical laboratory.
Nourse, Marilyn B; Engel, Kate; Anekal, Samartha G; Bailey, Jocelyn A; Bhatta, Pradeep; Bhave, Devayani P; Chandrasekaran, Shekar; Chen, Yutao; Chow, Steven; Das, Ushati; Galil, Erez; Gong, Xinwei; Gessert, Steven F; Ha, Kevin D; Hu, Ran; Hyland, Laura; Jammalamadaka, Arvind; Jayasurya, Karthik; Kemp, Timothy M; Kim, Andrew N; Lee, Lucie S; Liu, Yang Lily; Nguyen, Alphonso; O'Leary, Jared; Pangarkar, Chinmay H; Patel, Paul J; Quon, Ken; Ramachandran, Pradeep L; Rappaport, Amy R; Roy, Joy; Sapida, Jerald F; Sergeev, Nikolay V; Shee, Chandan; Shenoy, Renuka; Sivaraman, Sharada; Sosa-Padilla, Bernardo; Tran, Lorraine; Trent, Amanda; Waggoner, Thomas C; Wodziak, Dariusz; Yuan, Amy; Zhao, Peter; Young, Daniel L; Robertson, Channing R; Holmes, Elizabeth A
2018-01-01
The ability to perform laboratory testing near the patient and with smaller blood volumes would benefit patients and physicians alike. We describe our design of a miniaturized clinical laboratory system with three components: a hardware platform (ie, the miniLab) that performs preanalytical and analytical processing steps using miniaturized sample manipulation and detection modules, an assay-configurable cartridge that provides consumable materials and assay reagents, and a server that communicates bidirectionally with the miniLab to manage assay-specific protocols and analyze, store, and report results (i.e., the virtual analyzer). The miniLab can detect analytes in blood using multiple methods, including molecular diagnostics, immunoassays, clinical chemistry, and hematology. Analytical performance results show that our qualitative Zika virus assay has a limit of detection of 55 genomic copies/ml. For our anti-herpes simplex virus type 2 immunoglobulin G, lipid panel, and lymphocyte subset panel assays, the miniLab has low imprecision, and method comparison results agree well with those from the United States Food and Drug Administration-cleared devices. With its small footprint and versatility, the miniLab has the potential to provide testing of a range of analytes in decentralized locations.
The pitfalls of hair analysis for toxicants in clinical practice: three case reports.
Frisch, Melissa; Schwartz, Brian S
2002-01-01
Hair analysis is used to assess exposure to heavy metals in patients presenting with nonspecific symptoms and is a commonly used procedure in patients referred to our clinic. We are frequently called on to evaluate patients who have health-related concerns as a result of hair analysis. Three patients first presented to outside physicians with nonspecific, multisystemic symptoms. A panel of analytes was measured in hair, and one or more values were interpreted as elevated. As a result of the hair analysis and other unconventional diagnostic tests, the patients presented to us believing they suffered from metal toxicity. In this paper we review the clinical efficacy of this procedure within the context of a patient population with somatic disorders and no clear risk factors for metal intoxication. We also review limitations of hair analysis in this setting; these limitations include patient factors such as low pretest probability of disease and test factors such as the lack of validation of analytic techniques, the inability to discern between exogenous contaminants and endogenous toxicants in hair, the variability of analytic procedures, low interlaboratory reliability, and the increased likelihood of false positive test results in the measurement of panels of analytes. PMID:11940463
NASA Astrophysics Data System (ADS)
Ghorbani, A.; Farahani, M. Mahmoodi; Rabbani, M.; Aflaki, F.; Waqifhosain, Syed
2008-01-01
In this paper we propose an uncertainty estimate for the analytical results obtained from the determination of Ni, Pb and Al by solid-phase extraction and inductively coupled plasma optical emission spectrometry (SPE-ICP-OES). The procedure is based on the retention of analytes in the form of 8-hydroxyquinoline (8-HQ) complexes on a mini column of XAD-4 resin and subsequent elution with nitric acid. The influence of various analytical parameters, including the amount of solid phase, pH, elution factors (concentration and volume of eluting solution), volume of sample solution, and amount of ligand, on the extraction efficiency of the analytes was investigated. To estimate the uncertainty of the analytical results obtained, we propose assessing trueness by employing spiked samples. Two types of bias are calculated in the assessment of trueness: a proportional bias and a constant bias. We applied a nested design to calculate the proportional bias and the Youden method to calculate the constant bias. The proportional bias is calculated from spiked samples: the concentration found is plotted against the concentration added, and the slope of the standard addition curve is an estimate of the method recovery. The estimated average method recovery in Karaj river water is: (1.004±0.0085) for Ni, (0.999±0.010) for Pb and (0.987±0.008) for Al.
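The proportional-bias (recovery) estimate described above is the slope of a found-versus-added regression on spiked samples. A minimal sketch with invented numbers follows; the nested design and the Youden calculation of the constant bias are not reproduced here.

```python
import numpy as np

# Illustrative spiked-sample data for one analyte: concentration added vs. found (ug/L).
added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
found = np.array([1.1, 6.0, 11.2, 21.0, 40.8])

slope, intercept = np.polyfit(added, found, 1)
# The slope estimates the proportional recovery; the intercept mixes native analyte
# with any constant bias, which is why a separate Youden-type estimate is needed.
print(f"estimated recovery (slope) = {slope:.3f}")
```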
42 CFR 493.1289 - Standard: Analytic systems quality assessment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Analytic systems quality assessment. 493... Nonwaived Testing Analytic Systems § 493.1289 Standard: Analytic systems quality assessment. (a) The... through 493.1283. (b) The analytic systems quality assessment must include a review of the effectiveness...
Analytical Applications of Transport Through Bulk Liquid Membranes.
Diaconu, Ioana; Ruse, Elena; Aboul-Enein, Hassan Y; Bunaciu, Andrei A
2016-07-03
This review discusses the results of research on the use of bulk liquid membranes in separation and preconcentration processes for analytical purposes. It includes some theoretical aspects, definitions, types of liquid membranes, and transport mechanisms, as well as the advantages of using liquid membranes in laboratory studies. These concepts are necessary to understand the fundamental principles of liquid membrane transport. Owing to the multiple advantages of liquid membranes, several studies present analytical applications of transport through liquid membranes to the separation or preconcentration of metallic cations and some organic compounds, such as phenol and phenolic derivatives, organic acids, amino acids, carbohydrates, and drugs. This review also presents coupled techniques, such as separation through a liquid membrane coupled with flow injection analysis.
Mechanical and Electronic Approaches to Improve the Sensitivity of Microcantilever Sensors
Mutyala, Madhu Santosh Ku; Bandhanadham, Deepika; Pan, Liu; Pendyala, Vijaya Rohini; Ji, Hai-Feng
2010-01-01
Advances in the field of Micro Electro Mechanical Systems (MEMS) and their uses now offer unique opportunities in the design of ultrasensitive analytical tools. The analytical community continues to search for cost-effective, reliable, and even portable analytical techniques that can give fast and reliable results for a variety of chemicals and biomolecules. Microcantilevers (MCLs) have emerged as a unique platform for label-free biosensors and bioassays. Several electronic designs, including piezoresistive, piezoelectric, and capacitive approaches, have been applied to measure the bending or frequency change of MCLs upon exposure to chemicals. This review summarizes mechanical, fabrication, and electronic approaches to increasing the sensitivity of microcantilever (MCL) sensors. PMID:20975987
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide their creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
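As an illustration of the two atomic operators named above (selection and aggregation), the following sketch applies them to a small attributed graph using networkx; it is not the authors' framework or API, and the attribute names are invented:

```python
# Illustrative selection and aggregation on an attributed graph (not the paper's API).
import networkx as nx

G = nx.Graph()
G.add_nodes_from([(1, {"dept": "A"}), (2, {"dept": "A"}),
                  (3, {"dept": "B"}), (4, {"dept": "B"})])
G.add_edges_from([(1, 2), (2, 3), (3, 4), (1, 4)])

# Selection: keep only nodes satisfying a predicate on their attributes.
selected = G.subgraph(n for n, d in G.nodes(data=True) if d["dept"] == "A")

# Aggregation: collapse nodes sharing an attribute value into super-nodes,
# with an edge between super-nodes if any of their members were connected.
agg = nx.Graph()
for u, v in G.edges():
    du, dv = G.nodes[u]["dept"], G.nodes[v]["dept"]
    if du != dv:
        agg.add_edge(du, dv)

print(sorted(selected.nodes()), list(agg.edges()))
```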
Analytic cognitive style predicts religious and paranormal belief.
Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J; Fugelsang, Jonathan A
2012-06-01
An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracles, etc.) and paranormal beliefs (extrasensory perception, levitation, etc.) with performance measures of cognitive ability and analytic cognitive style. An analytic cognitive style negatively predicted both religious and paranormal beliefs when controlling for cognitive ability as well as religious engagement, sex, age, political ideology, and education. Participants more willing to engage in analytic reasoning were less likely to endorse supernatural beliefs. Further, an association between analytic cognitive style and religious engagement was mediated by religious beliefs, suggesting that an analytic cognitive style negatively affects religious engagement via lower acceptance of conventional religious beliefs. Results for types of God belief indicate that the association between an analytic cognitive style and God beliefs is more nuanced than mere acceptance and rejection, but also includes adopting less conventional God beliefs, such as Pantheism or Deism. Our data are consistent with the idea that two people who share the same cognitive ability, education, political ideology, sex, age and level of religious engagement can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically. Copyright © 2012 Elsevier B.V. All rights reserved.
Azzouz, Abdelmonaim; Ballesteros, Evaristo
2014-09-19
A novel analytical method using a continuous solid-phase extraction system in combination with gas chromatography-mass spectrometry for the simultaneous separation and determination of endocrine disrupting compounds (EDCs) is reported. The method was applied to major EDCs of various types, including parabens, alkylphenols, phenylphenols, bisphenol A and triclosan, in water. Samples were preconcentrated by using an automatic solid-phase extraction module containing a sorbent column, and the retained analytes were eluted with acetonitrile for derivatization with a mixture of N,O-bis(trimethylsilyl)trifluoroacetamide and trimethylchlorosilane. A number of variables potentially influencing recovery of the target compounds, such as the type of SPE sorbent (silica gel, Florisil, RP-C18, Amberlite XAD-2 and XAD-4, Oasis HLB and LiChrolut EN), the eluent, and properties of the water including pH and ionic strength, were examined. LiChrolut EN was found to be the most efficient sorbent for retaining the analytes, with ∼100% efficiency. The ensuing method was validated with good analytical results, including low limits of detection (0.01-0.08 ng/L for 100 mL of sample) and good linearity (r² > 0.997) throughout the studied concentration ranges. The method exhibited good accuracy (recoveries of 90-101%) and precision (relative standard deviations less than 7%) in the determination of EDCs in drinking, river, pond, well, swimming pool and waste water. Waste water samples were found to contain the largest number and highest concentrations of analytes (3.2-390 ng/L). Copyright © 2014 Elsevier B.V. All rights reserved.
Warwick, Peter D.; Breland, F. Clayton; Hackley, Paul C.; Dulong, Frank T.; Nichols, Douglas J.; Karlsen, Alexander W.; Bustin, R. Marc; Barker, Charles E.; Willett, Jason C.; Trippi, Michael H.
2006-01-01
In 2001 and 2002, the U.S. Geological Survey (USGS) and the Louisiana Geological Survey (LGS), through a Cooperative Research and Development Agreement (CRADA) with Devon SFS Operating, Inc. (Devon), participated in an exploratory drilling and coring program for coal-bed methane in north-central Louisiana. The USGS and LGS collected 25 coal core and cuttings samples from two coal-bed methane test wells that were drilled in west-central Caldwell Parish, Louisiana. The purpose of this report is to provide the results of the analytical program conducted on the USGS/LGS samples. The data generated from this project are summarized in various topical sections that include: 1. molecular and isotopic data from coal gas samples; 2. results of low-temperature ashing and X-ray analysis; 3. palynological data; 4. down-hole temperature data; 5. detailed core descriptions and selected core photographs; 6. coal physical and chemical analytical data; 7. coal gas desorption results; 8. methane and carbon dioxide coal sorption data; 9. coal petrographic results; and 10. geophysical logs.
Effect of risk perception on epidemic spreading in temporal networks
NASA Astrophysics Data System (ADS)
Moinet, Antoine; Pastor-Satorras, Romualdo; Barrat, Alain
2018-01-01
Much progress in the understanding of epidemic spreading models has been made thanks to numerous modeling efforts and analytical and numerical studies, considering host populations with very different structures and properties, including complex and temporal interaction networks. Moreover, a number of recent studies have started to go beyond the assumption of an absence of coupling between the spread of a disease and the structure of the contacts on which it unfolds. Models including awareness of the spread have been proposed, to mimic possible precautionary measures taken by individuals that decrease their risk of infection, but they have mostly considered static networks. Here, we adapt such a framework to the more realistic case of temporal networks of interactions between individuals. We study the resulting model by analytical and numerical means on both simple models of temporal networks and empirical time-resolved contact data. Analytical results show that the epidemic threshold is not affected by the awareness but that the prevalence can be significantly decreased. Numerical studies on synthetic temporal networks highlight, however, the presence of very strong finite-size effects, resulting in a significant shift of the effective epidemic threshold in the presence of risk awareness. For empirical contact networks, the awareness mechanism leads as well to a shift in the effective threshold and to a strong reduction of the epidemic prevalence.
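To make the mechanism concrete, here is a deliberately crude toy simulation, not the paper's model: an SIS-like process on a synthetic temporal contact list in which susceptibles who recently met an infectious contact reduce their infection probability. All parameters are invented.

```python
# Crude toy: SIS-like spreading on a temporal contact list with risk awareness.
import random

random.seed(1)
N = 50
beta, mu = 0.3, 0.02                 # per-contact infection prob., per-contact recovery prob.
awareness_factor, memory = 0.5, 10   # reduction when aware, and how long awareness lasts

# Synthetic temporal network: ten random contacts per time step, ordered in time.
contacts = [(t, random.randrange(N), random.randrange(N))
            for t in range(200) for _ in range(10)]

infected = {0}      # index case
last_risky = {}     # node -> last time it met an infectious contact

for t, i, j in contacts:
    for s, x in ((i, j), (j, i)):
        if x in infected and s not in infected:
            aware = t - last_risky.get(s, -10**9) <= memory
            if random.random() < beta * (awareness_factor if aware else 1.0):
                infected.add(s)
            last_risky[s] = t
    if i in infected and random.random() < mu:   # crude per-contact recovery step
        infected.discard(i)

print("final prevalence:", len(infected) / N)
```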
Acute recreational drug toxicity
Liakoni, Evangelia; Yates, Christopher; Dines, Alison M.; Dargan, Paul I.; Heyerdahl, Fridtjof; Hovda, Knut Erik; Wood, David M.; Eyer, Florian; Liechti, Matthias E.
2018-01-01
Abstract The aim of the study was to compare self-reported and analytically confirmed substance use in cases of acute recreational drug toxicity. We performed a retrospective analysis of emergency department presentations of acute recreational drug toxicity over 2 years (October 2013 to September 2015) within the European Drug Emergencies Network Plus project. Among the 10,956 cases of acute recreational drug toxicity during the study period, 831 could be included. Between the self-reported substance use and the toxicological results, the highest agreement was found for heroin (86.1%) and cocaine (74.1%), whereas inhalants, poppers, and magic mushrooms were self-reported but not analytically detected. Cathinones and other new psychoactive substances (NPS) could be detected using additional analytical methods. Among cases with both immunoassay (IA) and confirmation with mass spectrometry (MS), the results were consistent for methadone (100%) and cocaine (95.5%) and less consistent for amphetamines (81.8%). In cases with a positive IA for amphetamines (n = 54), MS confirmed the presence of 3,4-methylenedioxymethamphetamine (MDMA), amphetamine, methamphetamine, and NPS in 37, 20, 10, and 6 cases, respectively, also revealing use of more than 1 substance in some cases. MS yielded positive results in 21 cases with a negative IA for amphetamines, including amphetamine, MDMA, methamphetamine, and NPS, in 14, 7, 2, and 2 cases, respectively. In conclusion, the highest agreement was found between self-reports and analytical findings for heroin and cocaine. The diagnosis of NPS use was mainly based on self-report. The IAs accurately identified methadone and cocaine, and MS had advantages for the detection of NPS and amphetamine derivatives. PMID:29384873
Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.
Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G
2018-06-01
This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.
NASA Astrophysics Data System (ADS)
Zhou, Xuhong; Cao, Liang; Chen, Y. Frank; Liu, Jiepeng; Li, Jiang
2016-01-01
The developed pre-stressed cable reinforced concrete truss (PCT) floor system is a relatively new floor structure, which can be applied to various long-span structures such as buildings, stadiums, and bridges. Due to its lighter mass and longer span, floor vibration is a serviceability concern for such systems. In this paper, field testing and theoretical analysis of the PCT floor system were conducted. Specifically, heel-drop impact and walking tests were performed on the PCT floor system to capture the dynamic properties, including natural frequencies, mode shapes, damping ratios, and acceleration response. The PCT floor system was found to be a low-frequency (<10 Hz) and low-damping (damping ratio < 2 percent) structural system. The comparison of the experimental results with the AISC's limiting values nevertheless indicates that the investigated PCT system exhibits satisfactory vibration perceptibility. The analytical solution obtained from the weighted residual method agrees well with the experimental results and thus validates the proposed analytical expression. Sensitivity studies using the analytical solution were also conducted to investigate the vibration performance of the PCT floor system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muljadi, Eduard; Hasan, Iftekhar; Husain, Tausif
In this paper, a nonlinear analytical model based on the Magnetic Equivalent Circuit (MEC) method is developed for a double-sided E-Core Transverse Flux Machine (TFM). The proposed TFM has a cylindrical rotor, sandwiched between E-core stators on both sides. Ferrite magnets are used in the rotor with a flux-concentrating design to attain high airgap flux density, better magnet utilization, and higher torque density. The MEC model was developed using a series-parallel combination of flux tubes to estimate the reluctance network for different parts of the machine, including air gaps, permanent magnets, and the stator and rotor ferromagnetic materials, in a two-dimensional (2-D) frame. An iterative Gauss-Seidel method is integrated with the MEC model to capture the effects of magnetic saturation. A single-phase, 1 kW, 400 rpm E-Core TFM is analytically modeled and its results for flux linkage, no-load EMF, and generated torque are verified with Finite Element Analysis (FEA). The analytical model significantly reduces the computation time while estimating results with less than 10 percent error.
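As a toy illustration of the kind of iteration described above (it is not the paper's model), the following sketch solves a single-loop magnetic circuit with an air gap and a saturating iron path by damped successive substitution; the dimensions, MMF, and saturation curve are all invented:

```python
# Toy nonlinear magnetic circuit: MMF source, air gap, and saturating iron path.
import math

mu0 = 4e-7 * math.pi
MMF = 1200.0                 # ampere-turns (illustrative)
A = 4e-4                     # cross-section, m^2
l_gap, l_iron = 1e-3, 0.30   # path lengths, m

def mu_r_iron(B):
    """Crude saturation curve: high permeability at low B, dropping near ~1.8 T."""
    return 1.0 + 4000.0 / (1.0 + (B / 1.8) ** 8)

phi = 1e-4                   # initial guess for the loop flux (Wb)
for it in range(100):
    B = phi / A
    R_iron = l_iron / (mu0 * mu_r_iron(B) * A)   # flux-dependent iron reluctance
    R_gap = l_gap / (mu0 * A)                    # linear air-gap reluctance
    phi_new = MMF / (R_iron + R_gap)
    if abs(phi_new - phi) < 1e-9:
        break
    phi = 0.5 * phi + 0.5 * phi_new              # damped update for stable convergence

print(f"converged after {it} iterations: B = {phi / A:.3f} T")
```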
Analytical investigation of thermal barrier coatings on advanced power generation gas turbines
NASA Technical Reports Server (NTRS)
Amos, D. J.
1977-01-01
An analytical investigation of present and advanced gas turbine power generation cycles incorporating thermal barrier turbine component coatings was performed. Approximately 50 parametric points considering simple, recuperated, and combined cycles (including gasification) with gas turbine inlet temperatures from current levels through 1644 K (2500 F) were evaluated. The results indicated that thermal barriers would be an attractive means to improve performance and reduce cost of electricity for these cycles. A recommended thermal barrier development program has been defined.
Application of differential transformation method for solving dengue transmission mathematical model
NASA Astrophysics Data System (ADS)
Ndii, Meksianis Z.; Anggriani, Nursanti; Supriatna, Asep K.
2018-03-01
The differential transformation method (DTM) is a semi-analytical numerical technique based on the Taylor series, with applications in many areas including biomathematics. The aim of this paper is to employ the differential transformation method (DTM) to solve a system of non-linear differential equations for a dengue transmission mathematical model. Analytical and numerical solutions are determined, and the results are compared to those of the Runge-Kutta method. We found good agreement between the DTM and the Runge-Kutta method.
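For concreteness, here is a minimal sketch of the DTM recurrence for a single logistic equation y' = y - y², standing in for one equation of the dengue system; the equation, number of terms, and evaluation point are illustrative, not taken from the paper:

```python
# DTM for y' = y - y^2: the transform of y' is (k+1)Y[k+1], of y is Y[k],
# and of y^2 is the convolution sum of the transform coefficients.
import math

y0, K, t = 0.1, 25, 0.5          # initial value, number of DTM terms, evaluation time

Y = [0.0] * (K + 1)
Y[0] = y0
for k in range(K):
    conv = sum(Y[m] * Y[k - m] for m in range(k + 1))   # transform of y^2
    Y[k + 1] = (Y[k] - conv) / (k + 1)                  # from (k+1) Y[k+1] = Y[k] - conv

y_dtm = sum(Y[k] * t**k for k in range(K + 1))          # truncated Taylor series at t
y_exact = y0 * math.exp(t) / (1 + y0 * (math.exp(t) - 1))

print(f"DTM:   {y_dtm:.8f}")
print(f"exact: {y_exact:.8f}")
```

The truncated series matches the exact logistic solution to many digits at this evaluation point, which is the kind of agreement with Runge-Kutta reported above.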
NASA Technical Reports Server (NTRS)
Giles, G. L.; Rogers, J. L., Jr.
1982-01-01
The implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of the system are also discussed.
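To illustrate the comparison being made (analytic response derivatives versus finite differences), here is a trivial sketch for a bar's axial stress with the section width as the design variable; it is not the system described above, and all values are invented:

```python
# Analytic derivative vs. central finite difference for sigma = P / (b*h),
# with the width b treated as the design variable.
P, b, h = 10_000.0, 0.05, 0.02     # load (N) and cross-section dimensions (m)

def stress(b_):
    return P / (b_ * h)

d_analytic = -P / (b**2 * h)       # d(sigma)/db, exact

db = 1e-6 * b                      # finite-difference step
d_fd = (stress(b + db) - stress(b - db)) / (2 * db)

print(f"analytic:           {d_analytic:.6e}")
print(f"central difference: {d_fd:.6e}")
```

The analytic derivative costs one expression evaluation and is exact, whereas the finite-difference estimate needs extra analyses and a well-chosen step size, which is the advantage the abstract refers to.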
Critical Elements in Produced Fluids from Nevada and Utah
Simmons, Stuart
2017-07-27
Critical elements and related analytical data for produced fluids from geothermal fields in Nevada and Utah, Sevier thermal belt hot springs, Utah, and Uinta basin oil-gas wells, Utah are reported. Analytical results include pH, major species, trace elements, transition metals, other metals, metalloids and REEs. Gas samples were collected and analyzed from Beowawe, Dixie Valley, Roosevelt Hot Springs, and Thermo. Helium gases and helium isotopes were analyzed on samples collected at Patua, San Emido and two wells in the Uinta basin.
Partially Coherent Scattering in Stellar Chromospheres. Part 4; Analytic Wing Approximations
NASA Technical Reports Server (NTRS)
Gayley, K. G.
1993-01-01
Simple analytic expressions are derived to understand resonance-line wings in stellar chromospheres and similar astrophysical plasmas. The results are approximate, but compare well with accurate numerical simulations. The redistribution is modeled using an extension of the partially coherent scattering approximation (PCS) which we term the comoving-frame partially coherent scattering approximation (CPCS). The distinction is made here because Doppler diffusion is included in the coherent/noncoherent decomposition, in a form slightly improved from the earlier papers in this series.
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or a hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-08-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or a hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists.
Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)
NASA Astrophysics Data System (ADS)
Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.
2018-05-01
The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications, which include electromagnetic, thermal, and stress analyses, are highlighted in this paper. We discuss here the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results demonstrate the stable operation of BETA at 1 T. These results are compared to both analytical design and finite element calculations. Experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are explored in this paper.
Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA).
Bates, E M; Birmingham, W J; Romero-Talamás, C A
2018-05-01
The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications, which include electromagnetic, thermal, and stress analyses, are highlighted in this paper. We discuss here the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results demonstrate the stable operation of BETA at 1 T. These results are compared to both analytical design and finite element calculations. Experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are explored in this paper.
Enhanced spot preparation for liquid extractive sampling and analysis
Van Berkel, Gary J.; King, Richard C.
2015-09-22
A method for performing surface sampling of an analyte includes the step of placing the analyte on a stage with a material in molar excess to the analyte, such that analyte-analyte interactions are prevented and the analyte can be solubilized for further analysis. The material can be a matrix material that is mixed with the analyte. The material can be provided on a sample support. The analyte can then be contacted with a solvent to extract the analyte for further processing, such as by electrospray mass spectrometry.
MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.
Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang
2005-01-01
MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.
MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory
Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang
2005-01-01
MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721
Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian
2014-01-01
Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly, in segmental hair analysis pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with a focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3- to 7-fold larger than the analytical variation (7-13%) and hence the dominant component of the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95% uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
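A minimal numeric sketch of the uncertainty-budget logic described above, with invented values rather than the study's data: the duplicate-bundle CV gives the combined variation, the analytical component is subtracted to estimate the pre-analytical CV, and the total CV sets the approximately ±2·CV_T 95% interval.

```python
# Duplicate-based uncertainty budget (illustrative values, not the paper's data).
import math

# Relative differences between duplicate bundle results for one analyte.
rel_diff = [0.35, 0.60, 0.20, 0.45, 0.55]

# CV estimated from duplicates: variance of a single result = mean(d^2) / 2.
cv_duplicates = math.sqrt(sum(d**2 for d in rel_diff) / (2 * len(rel_diff)))

cv_analytical = 0.10                                    # from method validation
cv_pre = math.sqrt(max(cv_duplicates**2 - cv_analytical**2, 0.0))
cv_total = math.sqrt(cv_pre**2 + cv_analytical**2)

result = 1.2                                            # reported concentration, ng/mg
low, high = result * (1 - 2 * cv_total), result * (1 + 2 * cv_total)
print(f"CV_pre = {cv_pre:.2f}, CV_T = {cv_total:.2f}, "
      f"95% interval ~ [{low:.2f}, {high:.2f}] ng/mg")
```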
Wall, Mark J.
2016-01-01
Microelectrode amperometric biosensors are widely used to measure concentrations of analytes in solution and tissue including acetylcholine, adenosine, glucose, and glutamate. A great deal of experimental and modeling effort has been directed at quantifying the response of the biosensors themselves; however, the influence that the macroscopic tissue environment has on biosensor response has not been subjected to the same level of scrutiny. Here we identify an important issue in the way microelectrode biosensors are calibrated that is likely to have led to underestimations of analyte tissue concentrations. Concentration in tissue is typically determined by comparing the biosensor signal to that measured in free-flow calibration conditions. In a free-flow environment the concentration of the analyte at the outer surface of the biosensor can be considered constant. However, in tissue the analyte reaches the biosensor surface by diffusion through the extracellular space. Because the enzymes in the biosensor break down the analyte, a density gradient is set up resulting in a significantly lower concentration of analyte near the biosensor surface. This effect is compounded by the diminished volume fraction (porosity) and reduction in the diffusion coefficient due to obstructions (tortuosity) in tissue. We demonstrate this effect through modeling and experimentally verify our predictions in diffusive environments. NEW & NOTEWORTHY Microelectrode biosensors are typically calibrated in a free-flow environment where the concentrations at the biosensor surface are constant. However, when in tissue, the analyte reaches the biosensor via diffusion and so analyte breakdown by the biosensor results in a concentration gradient and consequently a lower concentration around the biosensor. This effect means that naive free-flow calibration will underestimate tissue concentration. We develop mathematical models to better quantify the discrepancy between the calibration and tissue environment and experimentally verify our key predictions. PMID:27927788
Newton, Adam J H; Wall, Mark J; Richardson, Magnus J E
2017-03-01
Microelectrode amperometric biosensors are widely used to measure concentrations of analytes in solution and tissue including acetylcholine, adenosine, glucose, and glutamate. A great deal of experimental and modeling effort has been directed at quantifying the response of the biosensors themselves; however, the influence that the macroscopic tissue environment has on biosensor response has not been subjected to the same level of scrutiny. Here we identify an important issue in the way microelectrode biosensors are calibrated that is likely to have led to underestimations of analyte tissue concentrations. Concentration in tissue is typically determined by comparing the biosensor signal to that measured in free-flow calibration conditions. In a free-flow environment the concentration of the analyte at the outer surface of the biosensor can be considered constant. However, in tissue the analyte reaches the biosensor surface by diffusion through the extracellular space. Because the enzymes in the biosensor break down the analyte, a density gradient is set up resulting in a significantly lower concentration of analyte near the biosensor surface. This effect is compounded by the diminished volume fraction (porosity) and reduction in the diffusion coefficient due to obstructions (tortuosity) in tissue. We demonstrate this effect through modeling and experimentally verify our predictions in diffusive environments. NEW & NOTEWORTHY Microelectrode biosensors are typically calibrated in a free-flow environment where the concentrations at the biosensor surface are constant. However, when in tissue, the analyte reaches the biosensor via diffusion and so analyte breakdown by the biosensor results in a concentration gradient and consequently a lower concentration around the biosensor. This effect means that naive free-flow calibration will underestimate tissue concentration. We develop mathematical models to better quantify the discrepancy between the calibration and tissue environment and experimentally verify our key predictions. Copyright © 2017 the American Physiological Society.
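The following is a toy steady-state film calculation, not the authors' model, that captures the argument: when the sensor consumes the analyte, diffusion through the extracellular space must resupply it, so the surface concentration (and hence the calibrated estimate) falls below the true tissue level. Every number is illustrative.

```python
# Toy film model: diffusion through a depleted tissue layer feeding a consuming sensor.
D_free = 7.6e-10        # free-solution diffusion coefficient, m^2/s (illustrative)
porosity = 0.2          # extracellular volume fraction
tortuosity = 1.6        # hindrance to diffusion in tissue
D_eff = D_free * porosity / tortuosity**2

k_s = 2e-6              # effective consumption velocity at the sensor surface, m/s
delta = 50e-6           # thickness of the depleted tissue layer, m

# Steady state: D_eff * (C_bulk - C_surf) / delta = k_s * C_surf
ratio = 1.0 / (1.0 + k_s * delta / D_eff)
print(f"surface/bulk concentration ratio in tissue: {ratio:.2f}")
print(f"free-flow calibration would underestimate the tissue level by ~{100*(1-ratio):.0f}%")
```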
NASA Astrophysics Data System (ADS)
Cvetkovic, V.; Molin, S.
2012-02-01
We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising the colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. It is shown that spatial variability of the attachment rate may be significant; however, it appears to affect risk differently depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers on the field scale, the methodology presented here may be useful for screening purposes, and may also serve as a basis for future studies that would include greater complexity.
Generalized bipartite quantum state discrimination problems with sequential measurements
NASA Astrophysics Data System (ADS)
Nakahira, Kenji; Kato, Kentaro; Usuda, Tsuyoshi Sasaki
2018-02-01
We investigate an optimization problem of finding quantum sequential measurements, which forms a wide class of state discrimination problems with the restriction that only local operations and one-way classical communication are allowed. Sequential measurements from Alice to Bob on a bipartite system are considered. Using the fact that the optimization problem can be formulated as a problem with only Alice's measurement and is convex programming, we derive its dual problem and necessary and sufficient conditions for an optimal solution. Our results are applicable to various practical optimization criteria, including the Bayes criterion, the Neyman-Pearson criterion, and the minimax criterion. In the setting of the problem of finding an optimal global measurement, its dual problem and necessary and sufficient conditions for an optimal solution have been widely used to obtain analytical and numerical expressions for optimal solutions. Similarly, our results are useful to obtain analytical and numerical expressions for optimal sequential measurements. Examples in which our results can be used to obtain an analytical expression for an optimal sequential measurement are provided.
Electron scattering from excited states of hydrogen: Implications for the ionization threshold law
NASA Astrophysics Data System (ADS)
Temkin, A.; Shertzer, J.
2013-05-01
The elastic scattering wave function for electrons scattered from the Nth excited state of hydrogen is the final state of the matrix element for excitation of that state. This paper deals with the solution of that problem primarily in the context of the Temkin-Poet (TP) model [A. Temkin, Phys. Rev. 126, 130 (1962); R. Poet, J. Phys. B 11, 3081 (1978)], wherein only the radial parts of the interaction are included. The relevant potential for the outer electron is dominated by the Hartree potential, VNH(r). In the first part of the paper, VNH(r) is approximated by a potential WN(r), for which the scattering equation can be analytically solved. The results allow formal analytical continuation of N into the continuum, so that the ionization threshold law can be deduced. Because the analytic continuation involves going from N to an imaginary function of the momentum of the inner electron, the threshold law turns out to be an exponentially damped function of the available energy E, in qualitative accord with the result of Macek and Ihra [J. H. Macek and W. Ihra, Phys. Rev. A 55, 2024 (1997)] for the TP model. Thereafter, the scattering equation for the Hartree potential VNH(r) is solved numerically. The numerical aspects of these calculations have proven to be challenging and required several developments for the difficulties to be overcome. The results for VNH(r) show only a simple energy-dependent shift from the approximate potential WN(r), which therefore does not change the analytic continuation and the form of the threshold law. It is concluded that the relevant optical potential must be included in order to compare directly with the analytic result of Macek and Ihra. The paper concludes with discussions of (a) a quantum mechanical interpretation of the result, and (b) the outlook of this approach for the complete problem.
Synthesized airfoil data method for prediction of dynamic stall and unsteady airloads
NASA Technical Reports Server (NTRS)
Gangwani, S. T.
1983-01-01
A detailed analysis of dynamic stall experiments has led to a set of relatively compact analytical expressions, called synthesized unsteady airfoil data, which accurately describe in the time-domain the unsteady aerodynamic characteristics of stalled airfoils. An analytical research program was conducted to expand and improve this synthesized unsteady airfoil data method using additional available sets of unsteady airfoil data. The primary objectives were to reduce these data to synthesized form for use in rotor airload prediction analyses and to generalize the results. Unsteady drag data were synthesized which provided the basis for successful expansion of the formulation to include computation of the unsteady pressure drag of airfoils and rotor blades. Also, an improved prediction model for airfoil flow reattachment was incorporated in the method. Application of this improved unsteady aerodynamics model has resulted in an improved correlation between analytic predictions and measured full scale helicopter blade loads and stress data.
Wexler, Eliezer J.
1989-01-01
Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented in this report for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems with uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of select solutions, source codes for the computer programs, and samples of program input and output also are included.
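As a sketch of the kind of solution compiled in such reports, the classic one-dimensional Ogata-Banks solution for a constant-concentration inlet in uniform flow can be evaluated in a few lines; the parameters below are illustrative, not values from the report:

```python
# One-dimensional Ogata-Banks solution, C/C0 for a constant-concentration inlet.
import math

def c_ratio(x, t, v, D):
    """C/C0 at distance x and time t, for seepage velocity v and dispersion coefficient D."""
    a = (x - v * t) / (2.0 * math.sqrt(D * t))
    b = (x + v * t) / (2.0 * math.sqrt(D * t))
    return 0.5 * (math.erfc(a) + math.exp(v * x / D) * math.erfc(b))

v, D = 0.5, 1.0          # m/day, m^2/day (illustrative)
for t in (10.0, 50.0, 200.0):
    print(f"t = {t:6.0f} d  C/C0 at x = 25 m: {c_ratio(25.0, t, v, D):.4f}")
```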
Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle
2014-12-01
This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. Combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), a course was designed that would place doctoral students in the midst of multiparadigmatic discussions while learning fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Using content analysis and interpretive phenomenological approaches, together with one of the students, data were analyzed from field notes recorded in real time over the period the course was offered. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage nurse educators of doctoral nursing students to focus educational practice on multiple methodological perspectives. Copyright 2014, SLACK Incorporated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loizu, J., E-mail: joaquim.loizu@ipp.mpg.de; Princeton Plasma Physics Laboratory, P.O. Box 451, Princeton New Jersey 08543; Hudson, S.
2015-02-15
Using the recently developed multiregion, relaxed MHD (MRxMHD) theory, which bridges the gap between Taylor's relaxation theory and ideal MHD, we provide a thorough analytical and numerical proof of the formation of singular currents at rational surfaces in non-axisymmetric ideal MHD equilibria. These include the force-free singular current density represented by a Dirac δ-function, which presumably prevents the formation of islands, and the Pfirsch-Schlüter 1/x singular current, which arises as a result of finite pressure gradient. An analytical model based on linearized MRxMHD is derived that can accurately (1) describe the formation of magnetic islands at resonant rational surfaces, (2) retrieve the ideal MHD limit where magnetic islands are shielded, and (3) compute the subsequent formation of singular currents. The analytical results are benchmarked against numerical simulations carried out with a fully nonlinear implementation of MRxMHD.
NASA Astrophysics Data System (ADS)
Vvedenskii, N. V.; Kostin, V. A.; Laryushin, I. D.; Silaev, A. A.
2016-05-01
We have studied the processes of excitation of low-frequency residual currents in a plasma produced through ionisation of gases by two-colour laser pulses in laser-plasma schemes for THz generation. We have developed an analytical approach that allows one to find the residual currents in the case where one of the components of a two-colour pulse is sufficiently weak. The derived analytical expressions show that effective generation of the residual current (and hence effective THz generation) is possible if the ratio of the frequencies in the two-colour laser pulse is close to a rational fraction whose numerator and denominator have a small odd sum. The results of numerical calculations (including those based on the solution of the three-dimensional time-dependent Schrödinger equation) agree well with the analytical results.
Structural Design Optimization of Doubly-Fed Induction Generators Using GeneratorSE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sethuraman, Latha; Fingersh, Lee J; Dykes, Katherine L
2017-11-13
A wind turbine with a larger rotor swept area can generate more electricity; however, this increases costs disproportionately for manufacturing, transportation, and installation. This poster presents analytical models for optimizing doubly-fed induction generators (DFIGs), with the objective of reducing the costs and mass of wind turbine drivetrains. The structural design for the induction machine includes models for the casing, stator, rotor, and high-speed shaft developed within the DFIG module in the National Renewable Energy Laboratory's wind turbine sizing tool, GeneratorSE. The mechanical integrity of the machine is verified by examining stresses, structural deflections, and modal properties. The optimization results are then validated using finite element analysis (FEA). The results suggest that our analytical model correlates with the FEA in some areas, such as radial deflection, differing by less than 20 percent, but requires further development for axial deflections, torsional deflections, and stress calculations.
Stochastic sensing through covalent interactions
Bayley, Hagan; Shin, Seong-Ho; Luchian, Tudor; Cheley, Stephen
2013-03-26
A system and method for stochastic sensing in which the analyte covalently bonds to the sensor element or an adaptor element. If such bonding is irreversible, the bond may be broken by a chemical reagent. The sensor element may be a protein, such as the engineered P_SH type or αHL protein pore. The analyte may be any reactive analyte, including chemical weapons, environmental toxins and pharmaceuticals. The analyte covalently bonds to the sensor element to produce a detectable signal. Possible signals include change in electrical current, change in force, and change in fluorescence. Detection of the signal allows identification of the analyte and determination of its concentration in a sample solution. Multiple analytes present in the same solution may be detected.
Gas and isotope chemistry of thermal features in Yellowstone National Park, Wyoming
Bergfeld, D.; Lowenstern, Jacob B.; Hunt, Andrew G.; Shanks, W.C. Pat; Evans, William
2011-01-01
This report presents 130 gas analyses and 31 related water analyses on samples collected from thermal features at Yellowstone between 2003 and 2009. An overview of previous studies of gas emissions at Yellowstone is also given. The analytical results from the present study include bulk chemistry of gases and waters and isotope values for water and steam (δ18O, δD), carbon dioxide (δ13C only), methane (δ13C only), helium, neon, and argon. We include appendixes containing photos of sample sites, geographic information system (GIS) files including shape and kml formats, and analytical results in spreadsheets. In addition, we provide a lengthy discussion of previous work on gas chemistry at Yellowstone and a general discussion of the implications of our results. We demonstrate that gases collected from different thermal areas often have distinct chemical signatures, and that differences across the thermal areas are not a simple function of surface temperatures or the type of feature. Instead, gas chemistry and isotopic composition are linked to subsurface lithologies and varying contributions from magmatic, crustal, and meteoric sources.
Methods for determination of radioactive substances in water and fluvial sediments
Thatcher, Leland Lincoln; Janzer, Victor J.; Edwards, Kenneth W.
1977-01-01
Analytical methods for the determination of some of the more important components of fission or neutron activation product radioactivity and of natural radioactivity found in water are reported. The report for each analytical method includes conditions for application of the method, a summary of the method, interferences, required apparatus and reagents, analytical procedures, calculations, reporting of results, and estimation of precision. The fission product isotopes considered are cesium-137, strontium-90, and ruthenium-106. The natural radioelements and isotopes considered are uranium, lead-210, radium-226, radium-228, tritium, and carbon-14. A gross radioactivity survey method and a uranium isotope ratio method are given. When two analytical methods are in routine use for an individual isotope, both methods are reported with identification of the specific areas of application of each. Techniques for the collection and preservation of water samples to be analyzed for radioactivity are discussed.
Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.
Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim
2016-04-01
Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well-suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validation of FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF. It also demonstrates that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs, and represents a distinctive groundwork that is able to sustain future refinements. This paper suggests important features that may be included to improve model realism.
Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra
2017-12-01
The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as in beverages and infant milk products, demands tools that can sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has been developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study giving an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, with a focus on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability, owing to recession of the gold electrode. Through an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach that takes the inevitable and analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.
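A minimal sketch of how such an analyte-specific one-phase decay could be fitted and used for normalization; the response values, injection indices, and starting guesses below are invented, not the study's data:

```python
# Fit a one-phase decay to the relative PAD response of a check standard and
# use it to normalize a later measurement back to the start of the sequence.
import numpy as np
from scipy.optimize import curve_fit

def one_phase_decay(t, plateau, span, k):
    """Relative PAD response vs. injection index: plateau + span * exp(-k*t)."""
    return plateau + span * np.exp(-k * t)

t_obs = np.array([0, 5, 10, 20, 40, 80], dtype=float)   # injection indices (illustrative)
resp = np.array([1.00, 0.93, 0.88, 0.81, 0.74, 0.70])   # relative responses (illustrative)

popt, _ = curve_fit(one_phase_decay, t_obs, resp, p0=[0.7, 0.3, 0.05])

# Correct a sample peak area measured at injection 60 back to the t = 0 response.
area_measured, t_sample = 152.3, 60.0
drift_factor = one_phase_decay(t_sample, *popt) / one_phase_decay(0.0, *popt)
area_corrected = area_measured / drift_factor
print("fitted plateau/span/k:", popt)
print(f"corrected area: {area_corrected:.1f}")
```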
MIT CSAIL and Lincoln Laboratory Task Force Report
2016-08-01
The projects have been very diverse, spanning several areas of CSAIL concentration, including robotics, big data analytics, wireless communications, and computing architectures, as well as machine learning systems and algorithms, such as recommender systems, and "Big Data" analytics. Advanced computing architectures broadly refer to ...
NASA Astrophysics Data System (ADS)
Mieles, John; Zhan, Hongbin
2012-06-01
The permeable reactive barrier (PRB) remediation technology has proven to be more cost-effective than conventional pump-and-treat systems, and has demonstrated the ability to rapidly reduce the concentrations of specific chemicals of concern (COCs) by up to several orders of magnitude in some scenarios. This study derives new steady-state analytical solutions to multispecies reactive transport in a PRB-aquifer (dual domain) system. The advantage of the dual domain model is that it can account for the potential existence of natural degradation in the aquifer when designing the required PRB thickness. The study focuses primarily on the steady-state analytical solutions of the tetrachloroethene (PCE) serial degradation pathway and secondarily on the analytical solutions of the parallel degradation pathway. The solutions in this study can also be applied to other types of dual domain systems with distinct flow and transport properties. The steady-state analytical solutions are shown to be accurate, and the numerical program RT3D is selected for comparison. The results of this study are novel in that the solutions provide improved modeling flexibility, including: 1) every species can have unique first-order reaction rates and unique retardation factors, and 2) daughter species can be modeled with their individual input concentrations or solely as byproducts of the parent species. The steady-state analytical solutions exhibit a limitation when interspecies reaction rate factors equal each other, which results in undefined solutions. Excel spreadsheet programs were created to facilitate prompt application of the steady-state analytical solutions for both the serial and parallel degradation pathways.
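For orientation only, the sketch below evaluates a much simpler special case of the serial (PCE → TCE → DCE → VC) chain: steady-state first-order decay under pure plug-flow advection (the Bateman solution in travel time), neglecting dispersion, retardation, yield factors, and the PRB/aquifer dual domain that the paper's solutions handle. All rates and the velocity are invented, and distinct rate constants are assumed, mirroring the limitation noted above.

```python
# Steady-state plug-flow Bateman chain for a serial first-order degradation pathway.
import math

k = [0.05, 0.03, 0.02, 0.01]     # first-order rate constants for PCE, TCE, DCE, VC (1/day)
c0 = 100.0                       # inlet PCE concentration (ug/L); daughters enter at zero
v, x = 0.1, 20.0                 # seepage velocity (m/day) and travel distance (m)
tau = x / v                      # advective travel time (days)

def chain_conc(n, tau):
    """Concentration of species n (0-based) at travel time tau; distinct rates, unit yields."""
    total = sum(math.exp(-k[i] * tau) /
                math.prod(k[j] - k[i] for j in range(n + 1) if j != i)
                for i in range(n + 1))
    return c0 * math.prod(k[:n]) * total

for name, n in zip(("PCE", "TCE", "DCE", "VC"), range(4)):
    print(f"{name}: {chain_conc(n, tau):9.3f} ug/L at x = {x} m")
```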
Analytical model for screening potential CO2 repositories
Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.
2011-01-01
Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. ?? 2011 Springer Science+Business Media B.V.
Guise, Andy; Horyniak, Danielle; Melo, Jason; McNeil, Ryan; Werb, Dan
2017-12-01
Understanding the experience of initiating injection drug use and its social contexts is crucial to inform efforts to prevent transitions into this mode of drug consumption and support harm reduction. We reviewed and synthesized existing qualitative scientific literature systematically to identify the socio-structural contexts for, and experiences of, the initiation of injection drug use. We searched six databases (Medline, Embase, PsychINFO, CINAHL, IBSS and SSCI) systematically, along with a manual search, including key journals and subject experts. Peer-reviewed studies were included if they qualitatively explored experiences of or socio-structural contexts for injection drug use initiation. A thematic synthesis approach was used to identify descriptive and analytical themes throughout studies. From 1731 initial results, 41 studies reporting data from 1996 participants were included. We developed eight descriptive themes and two analytical (higher-order) themes. The first analytical theme focused on injecting initiation resulting from a social process enabled and constrained by socio-structural factors: social networks and individual interactions, socialization into drug-using identities and choices enabled and constrained by social context all combine to produce processes of injection initiation. The second analytical theme addressed pathways that explore varying meanings attached to injection initiation and how they link to social context: seeking pleasure, responses to increasing tolerance to drugs, securing belonging and identity and coping with pain and trauma. Qualitative research shows that injection drug use initiation has varying and distinct meanings for individuals involved and is a dynamic process shaped by social and structural factors. Interventions should therefore respond to the socio-structural influences on injecting drug use initiation by seeking to modify the contexts for initiation, rather than solely prioritizing the reduction of individual harms through behavior change. © 2017 Society for the Study of Addiction.
Using Computer Graphics in Statistics.
ERIC Educational Resources Information Center
Kerley, Lyndell M.
1990-01-01
Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and test the normality distributions of the sample means. (KR)
Boukazouha, F; Poulin-Vittrant, G; Tran-Huu-Hue, L P; Bavencoffe, M; Boubenider, F; Rguiti, M; Lethiecq, M
2015-07-01
This article is dedicated to the study of Piezoelectric Transformers (PTs), which offer promising solutions to the increasing need for integrated power electronics modules within autonomous systems. The advantages offered by such transformers include: immunity to electromagnetic disturbances; ease of miniaturisation for example, using conventional micro fabrication processes; and enhanced performance in terms of voltage gain and power efficiency. Central to the adequate description of such transformers is the need for complex analytical modeling tools, especially if one is attempting to include combined contributions due to (i) mechanical phenomena owing to the different propagation modes which differ at the primary and secondary sides of the PT; and (ii) electrical phenomena such as the voltage gain and power efficiency, which depend on the electrical load. The present work demonstrates an original one-dimensional (1D) analytical model, dedicated to a Rosen-type PT and simulation results are successively compared against that of a three-dimensional (3D) Finite Element Analysis (COMSOL Multiphysics software) and experimental results. The Rosen-type PT studied here is based on a single layer soft PZT (P191) with corresponding dimensions 18 mm × 3 mm × 1.5 mm, which operated at the second harmonic of 176 kHz. Detailed simulational and experimental results show that the presented 1D model predicts experimental measurements to within less than 10% error of the voltage gain at the second and third resonance frequency modes. Adjustment of the analytical model parameters is found to decrease errors relative to experimental voltage gain to within 1%, whilst a 2.5% error on the output admittance magnitude at the second resonance mode were obtained. Relying on the unique assumption of one-dimensionality, the present analytical model appears as a useful tool for Rosen-type PT design and behavior understanding. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dutta, Ivy; Chowdhury, Anirban Roy; Kumbhakar, Dharmadas
2013-03-01
Using Chebyshev power series approach, accurate description for the first higher order (LP11) mode of graded index fibers having three different profile shape functions are presented in this paper and applied to predict their propagation characteristics. These characteristics include fractional power guided through the core, excitation efficiency and Petermann I and II spot sizes with their approximate analytic formulations. We have shown that where two and three Chebyshev points in LP11 mode approximation present fairly accurate results, the values based on our calculations involving four Chebyshev points match excellently with available exact numerical results.
Crew appliance computer program manual, volume 1
NASA Technical Reports Server (NTRS)
Russell, D. J.
1975-01-01
Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.
Ficklin, W.H.; Nowlan, G.A.; Preston, D.J.
1983-01-01
Water samples were collected in the vicinity of Jackman, Maine as a part of the study of the relationship of dissolved constituents in water to the sediments subjacent to the water. Each sample was analyzed for specific conductance, alkalinity, acidity, pH, fluoride, chloride, sulfate, phosphate, nitrate, sodium, potassium, calcium, magnesium, and silica. Trace elements determined were copper, zinc, molybdenum, lead, iron, manganese, arsenic, cobalt, nickel, and strontium. The longitude and latitude of each sample location and a sample site map are included in the report as well as a table of the analytical results.
An analytical and experimental evaluation of shadow shields and their support members
NASA Technical Reports Server (NTRS)
Stochl, R. J.; Boyle, R. J.
1972-01-01
Experimental tests were performed on a model shadow shield thermal protection system to examine the effect of certain configuration variables. The experimental results were used to verify the ability of an analytical program to predict the shadow shield performance including the shield-support interaction. In general, the analysis (assuming diffuse surfaces) agreed well with the experimental support temperature profiles. The agreement for the shield profiles was not as good. The results demonstrated: (1) shadow shields can be effective in reducing the heat transfer into cryogenic propellant tanks, and (2) the conductive heat transfer through supports can be reduced by selective surface coatings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... (IND) Application, any information obtained during the inspection of an extramural facility having a... Administration does not consider results of validation studies of analytical and assay methods and control...
Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements
NASA Astrophysics Data System (ADS)
Bakker, M.
2017-12-01
Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping are needed. Analytic element formulations exist for steady state and transient flow in layered aquifers and unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage, horizontal wells of poly-line shape including skin effect, sharp changes in subsurface properties, and surface water features with leaky beds. Input files for analytic element models are simple, short and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic element in parts of grid-based models where additional detail is needed. This presentation will give an overview of advanced flow features that can be modeled, many of which are implemented in free and open-source software.
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation
Yang, Yong; Liu, Yongzhong; Yu, Bo; Ding, Tian
2016-06-01
Volatile contaminants may migrate with carbon dioxide (CO2) injection or leakage in subsurface formations, which leads to the risk of the CO2 storage and the ecological environment. This study aims to develop an analytical model that could predict the contaminant migration process induced by CO2 storage. The analytical model with two moving boundaries is obtained through the simplification of the fully coupled model for the CO2-aqueous phase -stagnant phase displacement system. The analytical solutions are confirmed and assessed through the comparison with the numerical simulations of the fully coupled model. Then, some key variables in the analytical solutions, including the critical time, the locations of the dual moving boundaries and the advance velocity, are discussed to present the characteristics of contaminant migration in the multi-phase displacement system. The results show that these key variables are determined by four dimensionless numbers, Pe, RD, Sh and RF, which represent the effects of the convection, the dispersion, the interphase mass transfer and the retention factor of contaminant, respectively. The proposed analytical solutions could be used for tracking the migration of the injected CO2 and the contaminants in subsurface formations, and also provide an analytical tool for other solute transport in multi-phase displacement system. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Parvathi, S. P.; Ramanan, R. V.
2018-06-01
An iterative analytical trajectory design technique that includes perturbations in the departure phase of the interplanetary orbiter missions is proposed. The perturbations such as non-spherical gravity of Earth and the third body perturbations due to Sun and Moon are included in the analytical design process. In the design process, first the design is obtained using the iterative patched conic technique without including the perturbations and then modified to include the perturbations. The modification is based on, (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence by including the perturbations, and (ii) quantification of deviations in the orbital elements at periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named as biased iterative patched conic technique, does not depend upon numerical integration and all computations are carried out using closed form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
Stark, Peter C [Los Alamos, NM; Zurek, Eduardo [Barranquilla, CO; Wheat, Jeffrey V [Fort Walton Beach, FL; Dunbar, John M [Santa Fe, NM; Olivares, Jose A [Los Alamos, NM; Garcia-Rubio, Luis H [Temple Terrace, FL; Ward, Michael D [Los Alamos, NM
2011-07-26
There is provided a method and device for remote sampling, preparation and optical interrogation of a sample using light scattering and light absorption methods. The portable device is a filtration-based device that removes interfering background particle material from the sample matrix by segregating or filtering the chosen analyte from the sample solution or matrix while allowing the interfering background particles to be pumped out of the device. The segregated analyte is then suspended in a diluent for analysis. The device is capable of calculating an initial concentration of the analyte, as well as diluting the analyte such that reliable optical measurements can be made. Suitable analytes include cells, microorganisms, bioparticles, pathogens and diseases. Sample matrixes include biological fluids such as blood and urine, as well as environmental samples including waste water.
Calculating cost savings in utilization management.
MacMillan, Donna
2014-01-01
A major motivation for managing the utilization of laboratory testing is to reduce the cost of medical care. For this reason it is important to understand the basic principles of cost accounting in the clinical laboratory. The process of laboratory testing includes three distinct components termed the pre-analytic, analytic and post-analytic phases. Utilization management efforts may impact the cost structure of these three phases in different ways depending on the specific details of the initiative. Estimates of cost savings resulting from utilization management programs reported in the literature have often been fundamentally flawed due to a failure to understand basic concepts such as the difference between laboratory costs versus charges and the impact of reducing laboratory test volumes on the average versus marginal cost structure in the laboratory. This article will provide an overview of basic cost accounting principles in the clinical laboratory including both job order and process cost accounting. Specific examples will be presented to illustrate these concepts in various different scenarios. © 2013.
NASA Astrophysics Data System (ADS)
Mehta, Ajit Kumar; Mishra, Chandra Kant; Varma, Vijay; Ajith, Parameswaran
2017-12-01
We present an analytical waveform family describing gravitational waves (GWs) from the inspiral, merger, and ringdown of nonspinning black-hole binaries including the effect of several nonquadrupole modes [(ℓ=2 ,m =±1 ),(ℓ=3 ,m =±3 ),(ℓ=4 ,m =±4 ) apart from (ℓ=2 ,m =±2 )]. We first construct spin-weighted spherical harmonics modes of hybrid waveforms by matching numerical-relativity simulations (with mass ratio 1-10) describing the late inspiral, merger, and ringdown of the binary with post-Newtonian/effective-one-body waveforms describing the early inspiral. An analytical waveform family is constructed in frequency domain by modeling the Fourier transform of the hybrid waveforms making use of analytical functions inspired by perturbative calculations. The resulting highly accurate, ready-to-use waveforms are highly faithful (unfaithfulness ≃10-4- 10-2 ) for observation of GWs from nonspinning black-hole binaries and are extremely inexpensive to generate.
2013-01-01
Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures: • reviewing existing systematic review methods and our own prior experience of applying these • clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing • holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing • attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
An overview of key technology thrusts at Bell Helicopter Textron
NASA Technical Reports Server (NTRS)
Harse, James H.; Yen, Jing G.; Taylor, Rodney S.
1988-01-01
Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion on advanced rotors highlight developments on the composite, bearingless rotor, including the development and testing of full scale flight hardware as well as some of the design support analyses and verification testing. The discussion on methodology development concentrates on analytical development in aeromechanics, including correlation studies and design application. New configurations, presents the results of some advanced configuration studies including hardware development.
Analyticity without Differentiability
ERIC Educational Resources Information Center
Kirillova, Evgenia; Spindler, Karlheinz
2008-01-01
In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL
The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. he WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
NASA Astrophysics Data System (ADS)
Nunes, Josane C.
1991-02-01
This work quantifies the changes effected in electron absorbed dose to a soft-tissue equivalent medium when part of this medium is replaced by a material that is not soft -tissue equivalent. That is, heterogeneous dosimetry is addressed. Radionuclides which emit beta particles are the electron sources of primary interest. They are used in brachytherapy and in nuclear medicine: for example, beta -ray applicators made with strontium-90 are employed in certain ophthalmic treatments and iodine-131 is used to test thyroid function. More recent medical procedures under development and which involve beta radionuclides include radioimmunotherapy and radiation synovectomy; the first is a cancer modality and the second deals with the treatment of rheumatoid arthritis. In addition, the possibility of skin surface contamination exists whenever there is handling of radioactive material. Determination of absorbed doses in the examples of the preceding paragraph requires considering boundaries of interfaces. Whilst the Monte Carlo method can be applied to boundary calculations, for routine work such as in clinical situations, or in other circumstances where doses need to be determined quickly, analytical dosimetry would be invaluable. Unfortunately, few analytical methods for boundary beta dosimetry exist. Furthermore, the accuracy of results from both Monte Carlo and analytical methods has to be assessed. Although restricted to one radionuclide, phosphorus -32, the experimental data obtained in this work serve several purposes, one of which is to provide standards against which calculated results can be tested. The experimental data also contribute to the relatively sparse set of published boundary dosimetry data. At the same time, they may be useful in developing analytical boundary dosimetry methodology. The first application of the experimental data is demonstrated. Results from two Monte Carlo codes and two analytical methods, which were developed elsewhere, are compared with experimental data. Monte Carlo results compare satisfactory with experimental results for the boundaries considered. The agreement with experimental results for air interfaces is of particular interest because of discrepancies reported previously by another investigator who used data obtained from a different experimental technique. Results from one of the analytical methods differ significantly from the experimental data obtained here. The second analytical method provided data which approximate experimental results to within 30%. This is encouraging but it remains to be determined whether this method performs equally well for other source energies.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
An analytic cosmology solution of Poincaré gauge gravity
NASA Astrophysics Data System (ADS)
Lu, Jianbo; Chee, Guoying
2016-06-01
A cosmology of Poincaré gauge theory is developed. An analytic solution is obtained. The calculation results agree with observation data and can be compared with the ΛCDM model. The cosmological constant puzzle is the coincidence and fine tuning problem are solved naturally at the same time. The cosmological constant turns out to be the intrinsic torsion and curvature of the vacuum universe, and is derived from the theory naturally rather than added artificially. The dark energy originates from geometry, includes the cosmological constant but differs from it. The analytic expression of the state equations of the dark energy and the density parameters of the matter and the geometric dark energy are derived. The full equations of linear cosmological perturbations and the solutions are obtained.
Quantum decay model with exact explicit analytical solution
NASA Astrophysics Data System (ADS)
Marchewka, Avi; Granot, Er'El
2009-01-01
A simple decay model is introduced. The model comprises a point potential well, which experiences an abrupt change. Due to the temporal variation, the initial quantum state can either escape from the well or stay localized as a new bound state. The model allows for an exact analytical solution while having the necessary features of a decay process. The results show that the decay is never exponential, as classical dynamics predicts. Moreover, at short times the decay has a fractional power law, which differs from perturbation quantum method predictions. At long times the decay includes oscillations with an envelope that decays algebraically. This is a model where the final state can be either continuous or localized, and that has an exact analytical solution.
Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.
Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif
2014-12-01
A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed in Uludag University in Bursa using Abbott reagents and analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among seven regions were significant in none of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by non-parametric method were wider than those by parametric method especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
Multiaxis sensing using metal organic frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois
2017-01-17
A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.
Chen, Jun; Quan, Wenting; Cui, Tingwei
2015-01-01
In this study, two sample semi-analytical algorithms and one new unified multi-band semi-analytical algorithm (UMSA) for estimating chlorophyll-a (Chla) concentration were constructed by specifying optimal wavelengths. The three sample semi-analytical algorithms, including the three-band semi-analytical algorithm (TSA), four-band semi-analytical algorithm (FSA), and UMSA algorithm, were calibrated and validated by the dataset collected in the Yellow River Estuary between September 1 and 10, 2009. By comparing of the accuracy of assessment of TSA, FSA, and UMSA algorithms, it was found that the UMSA algorithm had a superior performance in comparison with the two other algorithms, TSA and FSA. Using the UMSA algorithm in retrieving Chla concentration in the Yellow River Estuary decreased by 25.54% NRMSE (normalized root mean square error) when compared with the FSA algorithm, and 29.66% NRMSE in comparison with the TSA algorithm. These are very significant improvements upon previous methods. Additionally, the study revealed that the TSA and FSA algorithms are merely more specific forms of the UMSA algorithm. Owing to the special form of the UMSA algorithm, if the same bands were used for both the TSA and UMSA algorithms or FSA and UMSA algorithms, the UMSA algorithm would theoretically produce superior results in comparison with the TSA and FSA algorithms. Thus, good results may also be produced if the UMSA algorithm were to be applied for predicting Chla concentration for datasets of Gitelson et al. (2008) and Le et al. (2009).
The "Anatomy" of a Performance-Enhancing Drug Test in Sports
ERIC Educational Resources Information Center
Werner, T. C.
2012-01-01
The components of a performance-enhancing drug (PED) test in sports include sample selection, collection, establishing sample integrity, sample pretreatment, analyte detection, data evaluation, reporting results, and action taken based on the result. Undergraduate curricula generally focus on the detection and evaluation steps of an analytical…
Novel immunoassay formats for integrated microfluidic circuits: diffusion immunoassays (DIA)
NASA Astrophysics Data System (ADS)
Weigl, Bernhard H.; Hatch, Anson; Kamholz, Andrew E.; Yager, Paul
2000-03-01
Novel designs of integrated fluidic microchips allow separations, chemical reactions, and calibration-free analytical measurements to be performed directly in very small quantities of complex samples such as whole blood and contaminated environmental samples. This technology lends itself to applications such as clinical diagnostics, including tumor marker screening, and environmental sensing in remote locations. Lab-on-a-Chip based systems offer many *advantages over traditional analytical devices: They consume extremely low volumes of both samples and reagents. Each chip is inexpensive and small. The sampling-to-result time is extremely short. They perform all analytical functions, including sampling, sample pretreatment, separation, dilution, and mixing steps, chemical reactions, and detection in an integrated microfluidic circuit. Lab-on-a-Chip systems enable the design of small, portable, rugged, low-cost, easy to use, yet extremely versatile and capable diagnostic instruments. In addition, fluids flowing in microchannels exhibit unique characteristics ('microfluidics'), which allow the design of analytical devices and assay formats that would not function on a macroscale. Existing Lab-on-a-chip technologies work very well for highly predictable and homogeneous samples common in genetic testing and drug discovery processes. One of the biggest challenges for current Labs-on-a-chip, however, is to perform analysis in the presence of the complexity and heterogeneity of actual samples such as whole blood or contaminated environmental samples. Micronics has developed a variety of Lab-on-a-Chip assays that can overcome those shortcomings. We will now present various types of novel Lab- on-a-Chip-based immunoassays, including the so-called Diffusion Immunoassays (DIA) that are based on the competitive laminar diffusion of analyte molecules and tracer molecules into a region of the chip containing antibodies that target the analyte molecules. Advantages of this technique are a reduction in reagents, higher sensitivity, minimal preparation of complex samples such as blood, real-time calibration, and extremely rapid analysis.
Cross reactive arrays of three-way junction sensors for steroid determination
NASA Technical Reports Server (NTRS)
Stojanovic, Milan N. (Inventor); Nikic, Dragan B. (Inventor); Landry, Donald (Inventor)
2008-01-01
This invention provides analyte sensitive oligonucleotide compositions for detecting and analyzing analytes in solution, including complex solutions using cross reactive arrays of analyte sensitive oligonucleotide compositions.
Promoting clinical and laboratory interaction by harmonization.
Plebani, Mario; Panteghini, Mauro
2014-05-15
The lack of interchangeable results in current practice among clinical laboratories has underpinned greater attention to standardization and harmonization projects. Although the focus was mainly on the standardization and harmonization of measurement procedures and their results, the scope of harmonization goes beyond method and analytical results: it includes all other aspects of laboratory testing, including terminology and units, report formats, reference limits and decision thresholds, as well as test profiles and criteria for the interpretation of results. In particular, as evidence collected in last decades demonstrates that pre-pre- and post-post-analytical steps are more vulnerable to errors, harmonization initiatives should be performed to improve procedures and processes at the laboratory-clinical interface. Managing upstream demand, down-stream interpretation of laboratory results, and subsequent appropriate action through close relationships between laboratorians and clinicians remains a crucial issue of the laboratory testing process. Therefore, initiatives to improve test demand management from one hand and to harmonize procedures to improve physicians' acknowledgment of laboratory data and their interpretation from the other hand are needed in order to assure quality and safety in the total testing process. © 2013.
Study of solid state photomultiplier
NASA Technical Reports Server (NTRS)
Hays, K. M.; Laviolette, R. A.
1987-01-01
Available solid state photomultiplier (SSPM) detectors were tested under low-background, low temperature conditions to determine the conditions producing optimal sensitivity in a space-based astronomy system such as a liquid cooled helium telescope in orbit. Detector temperatures varied between 6 and 9 K, with background flux ranging from 10 to the 13th power to less than 10 to the 6th power photons/square cm-s. Measured parameters included quantum efficiency, noise, dark current, and spectral response. Experimental data were reduced, analyzed, and combined with existing data to build the SSPM data base included herein. The results were compared to analytical models of SSPM performance where appropriate models existed. Analytical models presented here were developed to be as consistent with the data base as practicable. Significant differences between the theory and data are described. Some models were developed or updated as a result of this study.
Non-linear effects in finite amplitude wave propagation through ducts and nozzles
NASA Technical Reports Server (NTRS)
Salikuddin, M.; Brown, W. H.
1986-01-01
In this paper an extensive study of non-linear effects in finite amplitude wave propagation through ducts and nozzles is summarized. Some results from earlier studies are included to illustrate the non-linear effects on the transmission characteristics of duct and nozzle terminations. Investigaiations, both experimental and analytical, were carried out to determine the magnitudes of the effects for high intensity pulse propagation. The results derived from these investigations are presented in this paper. They include the effect of the sound intensity on the acoustic characteristics of duct and nozzle terminations, the extent of the non-linearities in the propagation of high intensity impulsive sound inside the duct and out into free field, the acoustic energy dissipation mechanism at a termination as shown by flow visualizations, and quantitative evaluations by experimental and analytical means of the influence of the intensity of a sound pulse on the dissipation of its acoustic power.
An accurate analytic description of neutrino oscillations in matter
NASA Astrophysics Data System (ADS)
Akhmedov, E. Kh.; Niro, Viviana
2008-12-01
A simple closed-form analytic expression for the probability of two-flavour neutrino oscillations in a matter with an arbitrary density profile is derived. Our formula is based on a perturbative expansion and allows an easy calculation of higher order corrections. The expansion parameter is small when the density changes relatively slowly along the neutrino path and/or neutrino energy is not very close to the Mikheyev-Smirnov-Wolfenstein (MSW) resonance energy. Our approximation is not equivalent to the adiabatic approximation and actually goes beyond it. We demonstrate the validity of our results using a few model density profiles, including the PREM density profile of the Earth. It is shown that by combining the results obtained from the expansions valid below and above the MSW resonance one can obtain a very good description of neutrino oscillations in matter in the entire energy range, including the resonance region.
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
Cunningham, Virginia L; D'Aco, Vincent J; Pfeiffer, Danielle; Anderson, Paul D; Buzby, Mary E; Hannah, Robert E; Jahnke, James; Parke, Neil J
2012-07-01
This article presents the capability expansion of the PhATE™ (pharmaceutical assessment and transport evaluation) model to predict concentrations of trace organics in sludges and biosolids from municipal wastewater treatment plants (WWTPs). PhATE was originally developed as an empirical model to estimate potential concentrations of active pharmaceutical ingredients (APIs) in US surface and drinking waters that could result from patient use of medicines. However, many compounds, including pharmaceuticals, are not completely transformed in WWTPs and remain in biosolids that may be applied to land as a soil amendment. This practice leads to concerns about potential exposures of people who may come into contact with amended soils and also about potential effects to plants and animals living in or contacting such soils. The model estimates the mass of API in WWTP influent based on the population served, the API per capita use, and the potential loss of the compound associated with human use (e.g., metabolism). The mass of API on the treated biosolids is then estimated based on partitioning to primary and secondary solids, potential loss due to biodegradation in secondary treatment (e.g., activated sludge), and potential loss during sludge treatment (e.g., aerobic digestion, anaerobic digestion, composting). Simulations using 2 surrogate compounds show that predicted environmental concentrations (PECs) generated by PhATE are in very good agreement with measured concentrations, i.e., well within 1 order of magnitude. Model simulations were then carried out for 18 APIs representing a broad range of chemical and use characteristics. These simulations yielded 4 categories of results: 1) PECs are in good agreement with measured data for 9 compounds with high analytical detection frequencies, 2) PECs are greater than measured data for 3 compounds with high analytical detection frequencies, possibly as a result of as yet unidentified depletion mechanisms, 3) PECs are less than analytical reporting limits for 5 compounds with low analytical detection frequencies, and 4) the PEC is greater than the analytical method reporting limit for 1 compound with a low analytical detection frequency, possibly again as a result of insufficient depletion data. Overall, these results demonstrate that PhATE has the potential to be a very useful tool in the evaluation of APIs in biosolids. Possible applications include: prioritizing APIs for assessment even in the absence of analytical methods; evaluating sludge processing scenarios to explore potential mitigation approaches; using in risk assessments; and developing realistic nationwide concentrations, because PECs can be represented as a cumulative probability distribution. Finally, comparison of PECs to measured concentrations can also be used to identify the need for fate studies of compounds of interest in biosolids. Copyright © 2011 SETAC.
Strategic, Analytic and Operational Domains of Information Management.
ERIC Educational Resources Information Center
Diener, Richard AV
1992-01-01
Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…
Updates to Selected Analytical Methods for Environmental Remediation and Recovery (SAM)
View information on the latest updates to methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), including the newest recommended methods and publications.
Cooper, Jason D.; Tomasik, Jakub; Bahn, Sabine; Aerts, Joeri L.; Osterhaus, Albert D. M. E.; Gruters, Rob A.; Andeweg, Arno C.
2018-01-01
Objectives To characterize the host response to dendritic cell-based immunotherapy and subsequent combined antiretroviral therapy (cART) interruption in HIV-1-infected individuals at the plasma protein level. Design An autologous dendritic cell (DC) therapeutic vaccine was administered to HIV-infected individuals, stable on cART. The effect of vaccination was evaluated at the plasma protein level during the period preceding cART interruption, during analytical therapy interruption and at viral reactivation. Healthy controls and post-exposure prophylactically treated healthy individuals were included as controls. Methods Plasma marker (‘analyte’) levels including cytokines, chemokines, growth factors, and hormones were measured in trial participants and control plasma samples using a multiplex immunoassay. Analyte levels were analysed using principle component analysis, cluster analysis and limma. Blood neutrophil counts were analysed using linear regression. Results Plasma analyte levels of HIV-infected individuals are markedly different from those of healthy controls and HIV-negative individuals receiving post-exposure prophylaxis. Viral reactivation following cART interruption also affects multiple analytes, but cART interruption itself only has only a minor effect. We find that Thyroxine-Binding Globulin (TBG) levels and late-stage neutrophil numbers correlate with the time off cART after DC vaccination. Furthermore, analysis shows that cART alters several regulators of blood glucose levels, including C-peptide, chromogranin-A and leptin. HIV reactivation is associated with the upregulation of CXCR3 ligands. Conclusions Chronic HIV infection leads to a change in multiple plasma analyte levels, as does virus reactivation after cART interruption. Furthermore, we find evidence for the involvement of TBG and neutrophils in the response to DC-vaccination in the setting of HIV-infection. PMID:29389978
Kling, Maximilian; Seyring, Nicole; Tzanova, Polia
2016-09-01
Economic instruments provide significant potential for countries with low municipal waste management performance in decreasing landfill rates and increasing recycling rates for municipal waste. In this research, strengths and weaknesses of landfill tax, pay-as-you-throw charging systems, deposit-refund systems and extended producer responsibility schemes are compared, focusing on conditions in countries with low waste management performance. In order to prioritise instruments for implementation in these countries, the analytic hierarchy process is applied using results of a literature review as input for the comparison. The assessment reveals that pay-as-you-throw is the most preferable instrument when utility-related criteria are regarded (wb = 0.35; analytic hierarchy process distributive mode; absolute comparison) mainly owing to its waste prevention effect, closely followed by landfill tax (wb = 0.32). Deposit-refund systems (wb = 0.17) and extended producer responsibility (wb = 0.16) rank third and fourth, with marginal differences owing to their similar nature. When cost-related criteria are additionally included in the comparison, landfill tax seems to provide the highest utility-cost ratio. Data from literature concerning cost (contrary to utility-related criteria) is currently not sufficiently available for a robust ranking according to the utility-cost ratio. In general, the analytic hierarchy process is seen as a suitable method for assessing economic instruments in waste management. Independent from the chosen analytic hierarchy process mode, results provide valuable indications for policy-makers on the application of economic instruments, as well as on their specific strengths and weaknesses. Nevertheless, the instruments need to be put in the country-specific context along with the results of this analytic hierarchy process application before practical decisions are made. © The Author(s) 2016.
Hey, Jody; Nielsen, Rasmus
2007-01-01
In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
Application of conformal transformation to elliptic geometry for electric impedance tomography.
Yilmaz, Atila; Akdoğan, Kurtuluş E; Saka, Birsen
2008-03-01
Electrical impedance tomography (EIT) is a medical imaging modality that is used to compute the conductivity distribution through measurements on the cross-section of a body part. An elliptic geometry model, which defines a more general frame, ensures more accurate results in reconstruction and assessment of inhomogeneities inside. This study provides a link between the analytical solutions defined in circular and elliptical geometries on the basis of the computation of conformal mapping. The results defined as voltage distributions for the homogeneous case in elliptic and circular geometries have been compared with those obtained by the use of conformal transformation between elliptical and well-known circular geometry. The study also includes the results of the finite element method (FEM) as another approach for more complex geometries for the comparison of performance in other complex scenarios for eccentric inhomogeneities. The study emphasizes that for the elliptic case the analytical solution with conformal transformation is a reliable and useful tool for developing insight into more complex forms including eccentric inhomogeneities.
Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.
2013-01-01
Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.
Sensor for detecting and differentiating chemical analytes
Yi, Dechang [Metuchen, NJ; Senesac, Lawrence R [Knoxville, TN; Thundat, Thomas G [Knoxville, TN
2011-07-05
A sensor for detecting and differentiating chemical analytes includes a microscale body having a first end and a second end and a surface between the ends for adsorbing a chemical analyte. The surface includes at least one conductive heating track for heating the chemical analyte and also a conductive response track, which is electrically isolated from the heating track, for producing a thermal response signal from the chemical analyte. The heating track is electrically connected with a voltage source and the response track is electrically connected with a signal recorder. The microscale body is restrained at the first end and the second end and is substantially isolated from its surroundings therebetween, thus having a bridge configuration.
Lab-on-chip systems for integrated bioanalyses
Madaboosi, Narayanan; Soares, Ruben R.G.; Fernandes, João Tiago S.; Novo, Pedro; Moulas, Geraud; Chu, Virginia
2016-01-01
Biomolecular detection systems based on microfluidics are often called lab-on-chip systems. To fully benefit from the miniaturization resulting from microfluidics, one aims to develop ‘from sample-to-answer’ analytical systems, in which the input is a raw or minimally processed biological, food/feed or environmental sample and the output is a quantitative or qualitative assessment of one or more analytes of interest. In general, such systems will require the integration of several steps or operations to perform their function. This review will discuss these stages of operation, including fluidic handling, which assures that the desired fluid arrives at a specific location at the right time and under the appropriate flow conditions; molecular recognition, which allows the capture of specific analytes at precise locations on the chip; transduction of the molecular recognition event into a measurable signal; sample preparation upstream from analyte capture; and signal amplification procedures to increase sensitivity. Seamless integration of the different stages is required to achieve a point-of-care/point-of-use lab-on-chip device that allows analyte detection at the relevant sensitivity ranges, with a competitive analysis time and cost. PMID:27365042
NASA Astrophysics Data System (ADS)
Boss, Alan P.
2009-03-01
The disk instability mechanism for giant planet formation is based on the formation of clumps in a marginally gravitationally unstable protoplanetary disk, which must lose thermal energy through a combination of convection and radiative cooling if they are to survive and contract to become giant protoplanets. While there is good observational support for forming at least some giant planets by disk instability, the mechanism has become theoretically contentious, with different three-dimensional radiative hydrodynamics codes often yielding different results. Rigorous code testing is required to make further progress. Here we present two new analytical solutions for radiative transfer in spherical coordinates, suitable for testing the code employed in all of the Boss disk instability calculations. The testing shows that the Boss code radiative transfer routines do an excellent job of relaxing to and maintaining the analytical results for the radial temperature and radiative flux profiles for a spherical cloud with high or moderate optical depths, including the transition from optically thick to optically thin regions. These radial test results are independent of whether the Eddington approximation, diffusion approximation, or flux-limited diffusion approximation routines are employed. The Boss code does an equally excellent job of relaxing to and maintaining the analytical results for the vertical (θ) temperature and radiative flux profiles for a disk with a height proportional to the radial distance. These tests strongly support the disk instability mechanism for forming giant planets.
High heating rate thermal desorption for molecular surface sampling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovchinnikova, Olga S.; Van Berkel, Gary J.
2016-03-29
A method for analyzing a sample having at least one analyte includes the step of heating the sample at a rate of at least 10^6 K/s to thermally desorb at least one analyte from the sample. The desorbed analyte is collected. The analyte can then be analyzed.
Park, Yu Jin; Rim, John Hoon; Yim, Jisook; Lee, Sang-Guk; Kim, Jeong-Ho
2017-08-01
The use of iodinated contrast media has grown in popularity in the past two decades, but relatively little attention has been paid to the possible interferential effects of contrast media on laboratory test results. Herein, we investigate medical contrast media interference with routine chemistry results obtained by three automated chemistry analyzers. Ten levels of pooled serum were used in the study. Two types of medical contrast media [Iopamiro (iopamidol) and Omnipaque (iohexol)] were evaluated. To evaluate the dose-dependent effects of the contrast media, iopamidol and iohexol were spiked separately into aliquots of serum for final concentrations of 1.8%, 3.6%, 5.5%, 7.3%, and 9.1%. The 28 analytes included in the routine chemistry panel were measured by using Hitachi 7600, AU5800, and Cobas c702 analyzers. We calculated the delta percentage difference (DPD) between the samples and the control, and examined dose-dependent trends. When the mean DPD values were compared with the reference cut-off criteria, the only uniformly interferential effect observed for all analyzers was in total protein with iopamidol. Two additional analytes that showed trends toward interferential effects in only a few analyzers and exceeded the limits of allowable error were serum iron and total CO2. The other combinations of analyzer and contrast medium showed no consistent dose-dependent propensity for change in any analyte level. Our study suggests that many of the analytes included in routine chemistry results, except total protein and serum iron, are not significantly affected by iopamidol and iohexol. These results suggest that it would be beneficial to apply a flexible medical evaluation process for patients requiring both laboratory tests and imaging studies, minimizing the need for strict regulations for sequential tests. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
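To make the screening arithmetic concrete, here is a minimal sketch, assuming DPD is defined as the percent change of a spiked aliquot relative to its unspiked control and assuming a 3% allowable-error cut-off; the serum values are invented for illustration and are not the study's data.

```python
# Minimal sketch (not the authors' code) of a delta percentage difference (DPD)
# interference check; the DPD definition, values, and cut-off are assumptions.

def dpd(spiked, control):
    """Delta percentage difference between a spiked sample and its control."""
    return 100.0 * (spiked - control) / control

# Hypothetical total-protein results (g/dL) for one serum pool at increasing
# iopamidol concentrations; values are illustrative only.
control = 7.0
spiked_series = {1.8: 6.9, 3.6: 6.7, 5.5: 6.5, 7.3: 6.3, 9.1: 6.1}
allowable_error_pct = 3.0   # assumed reference cut-off criterion

for pct_contrast, result in spiked_series.items():
    d = dpd(result, control)
    flag = "interference" if abs(d) > allowable_error_pct else "acceptable"
    print(f"{pct_contrast:>4}% iopamidol: DPD = {d:+.1f}% ({flag})")
```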
He, Qili; Su, Guoming; Liu, Keliang; Zhang, Fangcheng; Jiang, Yong; Gao, Jun; Liu, Lida; Jiang, Zhongren; Jin, Minwu; Xie, Huiping
2017-01-01
Hematologic and biochemical analytes of Sprague-Dawley rats are commonly used to determine effects induced by treatment and to evaluate organ dysfunction in toxicological safety assessments, but reference intervals have not been well established for these analytes. Reference intervals as presently defined for these analytes in Sprague-Dawley rats have not used internationally recommended statistical methods nor been stratified by sex. Thus, we aimed to establish sex-specific reference intervals for hematologic and biochemical parameters in Sprague-Dawley rats according to the Clinical and Laboratory Standards Institute C28-A3 and American Society for Veterinary Clinical Pathology guidelines. Hematology and biochemistry blood samples were collected from 500 healthy Sprague-Dawley rats (250 males and 250 females) in the control groups. We measured 24 hematologic analytes with the Sysmex XT-2100i analyzer and 9 biochemical analytes with the Olympus AU400 analyzer. We then determined statistically relevant sex partitions and calculated reference intervals, including corresponding 90% confidence intervals, using the nonparametric rank percentile method. We observed that most hematologic and biochemical analytes of Sprague-Dawley rats were significantly influenced by sex. Males had higher hemoglobin, hematocrit, red blood cell count, red cell distribution width, mean corpuscular volume, mean corpuscular hemoglobin, white blood cell count, neutrophils, lymphocytes, monocytes, percentage of neutrophils, percentage of monocytes, alanine aminotransferase, aspartate aminotransferase, and triglycerides compared to females. Females had higher mean corpuscular hemoglobin concentration, plateletcrit, platelet count, eosinophils, percentage of lymphocytes, percentage of eosinophils, creatinine, glucose, total cholesterol and urea compared to males. Sex partitioning was required for most hematologic and biochemical analytes in Sprague-Dawley rats. We established sex-specific reference intervals, including corresponding 90% confidence intervals, for Sprague-Dawley rats. Understanding the significant discrepancies in hematologic and biochemical analytes between male and female Sprague-Dawley rats provides important insight into physiological effects in test rats. Establishment of locally derived, sex-specific reference intervals allows a more precise evaluation of animal quality and experimental results of Sprague-Dawley rats in our toxicology safety assessment.
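A minimal sketch of the nonparametric rank percentile calculation follows; it uses hypothetical hemoglobin values rather than the study's data, and a bootstrap stands in for the order-statistic confidence intervals described in CLSI C28-A3.

```python
# Sketch of a 95% reference interval (2.5th-97.5th percentiles) with 90% CIs on
# each limit; not the authors' code, and the data below are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def reference_interval(values, ci=0.90, n_boot=2000):
    values = np.sort(np.asarray(values, dtype=float))
    lower, upper = np.percentile(values, [2.5, 97.5])
    # Bootstrap confidence intervals for each reference limit.
    boots = rng.choice(values, size=(n_boot, values.size), replace=True)
    lo_b, hi_b = np.percentile(boots, [2.5, 97.5], axis=1)
    alpha = (1 - ci) / 2 * 100
    return (lower, np.percentile(lo_b, [alpha, 100 - alpha])), \
           (upper, np.percentile(hi_b, [alpha, 100 - alpha]))

# Hypothetical hemoglobin values (g/dL) for 250 male control rats.
males = rng.normal(15.2, 0.8, 250)
(lo, lo_ci), (hi, hi_ci) = reference_interval(males)
print(f"RI: {lo:.1f}-{hi:.1f}; 90% CIs {lo_ci.round(1)} / {hi_ci.round(1)}")
```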
A meta-analytic review of the effects of mindfulness meditation on telomerase activity.
Schutte, Nicola S; Malouff, John M
2014-04-01
The enzyme telomerase, through its influence on telomere length, is associated with health and mortality. Four pioneering randomized control trials, including a total of 190 participants, provided information on the effect of mindfulness meditation on telomerase. A meta-analytic effect size of d=0.46 indicated that mindfulness meditation leads to increased telomerase activity in peripheral blood mononuclear cells. These results suggest the need for further large-scale trials investigating optimal implementation of mindfulness meditation to facilitate telomerase functioning. Copyright © 2014 Elsevier Ltd. All rights reserved.
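For readers unfamiliar with the pooling step, the sketch below shows a generic inverse-variance (fixed-effect) combination of study-level Cohen's d values; the per-study effect sizes and group sizes are placeholders, not the values from the four trials.

```python
# Generic fixed-effect meta-analysis sketch; all numbers below are hypothetical.
import numpy as np

d = np.array([0.30, 0.55, 0.40, 0.60])      # per-study Cohen's d (assumed)
n1 = np.array([15, 25, 24, 31])             # treatment group sizes (assumed)
n2 = np.array([15, 25, 24, 31])             # control group sizes (assumed)

# Approximate sampling variance of d for two independent groups.
var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
w = 1.0 / var_d                             # inverse-variance weights
d_pooled = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled d = {d_pooled:.2f} "
      f"(95% CI {d_pooled - 1.96*se:.2f} to {d_pooled + 1.96*se:.2f})")
```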
Parametric study of minimum converter loss in an energy-storage dc-to-dc converter
NASA Technical Reports Server (NTRS)
Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.
1982-01-01
Through a combination of analytical and numerical minimization procedures, a converter design that results in the minimum total converter loss (including core loss, winding loss, capacitor and energy-storage-reactor loss, and various losses in the semiconductor switches) is obtained. Because the initial phase involves analytical minimization, the computation time required by the subsequent phase of numerical minimization is considerably reduced in this combination approach. The effects of various loss parameters on the optimum values of the design variables are also examined.
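The flavor of the numerical phase can be illustrated with a toy loss model; the loss expressions, coefficients, design variables, and bounds below are assumptions for illustration only and are not the converter model used in the paper.

```python
# Toy illustration of numerically minimizing a composite total-loss model over
# two design variables; all terms and constants are assumed.
from scipy.optimize import minimize

def total_loss(x):
    B, f = x                                   # flux-density swing (T), switching frequency (Hz)
    core = 8.0 * B**2.6 * (f / 1e5)**1.3       # Steinmetz-like core loss term (assumed)
    winding = 0.01 / (B**2 * (f / 1e5))        # winding/copper loss term (assumed)
    switching = 1.2 * (f / 1e5)                # semiconductor switching loss term (assumed)
    return core + winding + switching          # nominal total loss in watts

res = minimize(total_loss, x0=[0.2, 5e4],
               bounds=[(0.05, 0.3), (1e4, 2e5)], method="L-BFGS-B")
B_opt, f_opt = res.x
print(f"optimum: B = {B_opt:.3f} T, f = {f_opt/1e3:.1f} kHz, loss = {res.fun:.3f} W")
```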
Distribution analysis for F100(3) engine
NASA Technical Reports Server (NTRS)
Walter, W. A.; Shaw, M.
1980-01-01
The F100(3) compression system response to inlet circumferential distortion was investigated using an analytical compressor flow model. Compression system response to several types of distortion, including pressure, temperature, and combined pressure/temperature distortions, was investigated. The predicted response trends were used in planning future F100(3) distortion tests. Results show that compression system response to combined temperature and pressure distortions depends upon the relative orientation, as well as the individual amplitudes and circumferential extents of the distortions. Also the usefulness of the analytical predictions in planning engine distortion tests is indicated.
Anisotropic cosmological solutions in R + R^2 gravity
NASA Astrophysics Data System (ADS)
Müller, Daniel; Ricciardone, Angelo; Starobinsky, Alexei A.; Toporensky, Aleksey
2018-04-01
In this paper we investigate the past evolution of an anisotropic Bianchi I universe in R+R^2 gravity. Using the dynamical system approach we show that there exists a new two-parameter set of solutions that includes both an isotropic "false radiation" solution and an anisotropic generalized Kasner solution, which is stable. We derive the analytic behavior of the shear from a specific property of f(R) gravity and the analytic asymptotic form of the Ricci scalar when approaching the initial singularity. Finally, we numerically check our results.
Optical sensors and multisensor arrays containing thin film electroluminescent devices
Aylott, Jonathan W.; Chen-Esterlit, Zoe; Friedl, Jon H.; Kopelman, Raoul; Savvateev, Vadim N.; Shinar, Joseph
2001-12-18
Optical sensor, probe and array devices for detecting chemical, biological, and physical analytes. The devices include an analyte-sensitive layer optically coupled to a thin film electroluminescent layer which activates the analyte-sensitive layer to provide an optical response. The optical response varies depending upon the presence of an analyte and is detected by a photodetector and analyzed to determine the properties of the analyte.
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, and it has a long-lasting societal impact.
AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent
Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes that are currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials; this is necessary given that there are no suitable AmO2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.
Fishman, M. J.
1993-01-01
Methods to be used to analyze samples of water, suspended sediment and bottom material for their content of inorganic and organic constituents are presented. Technology continually changes, and so this laboratory manual includes new and revised methods for determining the concentration of dissolved constituents in water, whole water recoverable constituents in water-suspended sediment samples, and recoverable concentrations of constituents in bottom material. For each method, the general topics covered are the application, the principle of the method, interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data. Included in this manual are 30 methods.
NASA Astrophysics Data System (ADS)
Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili
2012-04-01
In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified by using the time-stepping finite element method, and the performance of the PMV machine is quantitatively compared with the analytical results. The analytical results agree well with the finite element method results. Finally, the experimental results are given to further show the validity of the analysis.
NASA Astrophysics Data System (ADS)
Calderone, G. M.
2006-12-01
A long-term monitoring program was initiated in 1995 at 6 sites at NAS Brunswick, including 3 National Priorities List (Superfund) sites. Primary contaminants of concern include chlorinated volatile organic compounds, including tetrachloroethane, trichloroethene, and vinyl chloride, in addition to metals. More than 80 submersible pumping systems were installed to facilitate sample collection utilizing the low-flow sampling technique. Long-term monitoring of the groundwater is conducted to assess the effectiveness of remedial measures and to monitor changes in contaminant concentrations in the Eastern Plume Operable Unit. Long-term monitoring program activities include quarterly groundwater sampling and analysis at more than 90 wells across 6 sites; surface water, sediment, seep, and leachate sampling and analysis at 3 sites; landfill gas monitoring; well maintenance; engineering inspections of landfill covers and other sites for evidence of stressed vegetation; water level gauging; and treatment plant sampling and analysis. Significant cost savings were achieved by optimizing the sampling network and reducing sampling frequency from quarterly to semiannual or annual sampling. As part of an ongoing optimization effort, a geostatistical assessment of the Eastern Plume was conducted at the Naval Air Station, Brunswick, Maine. The geostatistical assessment used 40 monitoring points and analytical data collected over 3 years. For this geostatistical assessment, EA developed and utilized a database of analytical results generated during 3 years of long-term monitoring, which was linked to a Geographic Information System to enhance data visualization capacity. The Geographic Information System included themes for groundwater volatile organic compound concentrations, groundwater flow directions, shallow and deep wells, and immediate access to point-specific analytical results. This statistical analysis has been used by the site decision-maker, and its conclusions supported a significant reduction in the Long-Term Monitoring Program.
Joyner, Katherine; Wang, Weizhen; Yu, Yihua Bruce
2011-01-01
The effect of column and eluent fluorination on the retention and separation of non-fluorinated amino acids and proteins in HPLC is investigated. A side-by-side comparison of a fluorocarbon column and eluents (F-column and F-eluents) with their hydrocarbon counterparts (H-column and H-eluents) in the separation of a group of 33 analytes, including 30 amino acids and 3 proteins, is conducted. The H-column and the F-column contain the n-C8H17 group and n-C8F17 group, respectively, in their stationary phases. The H-eluents include ethanol (EtOH) and isopropanol (ISP) while the F-eluents include trifluoroethanol (TFE) and hexafluoroisopropanol (HFIP). The 2 columns and 4 eluents generated 8 (column, eluent) pairs that produced 264 retention time data points for the 33 analytes. A statistical analysis of the retention time data reveals that although the H-column is better than the F-column in analyte separation and H-eluents are better than F-eluents in analyte retention, the more critical factor is the proper pairing of column with eluent. Among the conditions explored in this project, optimal retention and separation are achieved when the fluorocarbon column is paired with ethanol, even though TFE is the most polar of the 4 eluents. This result shows that fluorocarbon columns have much potential in chromatographic analysis and separation of non-fluorinated amino acids and proteins. PMID:21318121
Use of airborne hyperspectral imagery to map soil parameters in tilled agricultural fields
Hively, W. Dean; McCarty, Gregory W.; Reeves, James B.; Lang, Megan W.; Oesterling, Robert A.; Delwiche, Stephen R.
2011-01-01
Soil hyperspectral reflectance imagery was obtained for six tilled (soil) agricultural fields using an airborne imaging spectrometer (400–2450 nm, ~10 nm resolution, 2.5 m spatial resolution). Surface soil samples (n = 315) were analyzed for carbon content, particle size distribution, and 15 agronomically important elements (Mehlich-III extraction). When partial least squares (PLS) regression of imagery-derived reflectance spectra was used to predict analyte concentrations, 13 of the 19 analytes were predicted with R2 > 0.50, including carbon (0.65), aluminum (0.76), iron (0.75), and silt content (0.79). Comparison of 15 spectral math preprocessing treatments showed that a simple first derivative worked well for nearly all analytes. The resulting PLS factors were exported as a vector of coefficients and used to calculate predicted maps of soil properties for each field. Image smoothing with a 3 × 3 low-pass filter prior to spectral data extraction improved prediction accuracy. The resulting raster maps showed variation associated with topographic factors, indicating the effect of soil redistribution and moisture regime on in-field spatial variability. High-resolution maps of soil analyte concentrations can be used to improve precision environmental management of farmlands.
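A minimal sketch of the modeling chain (first-derivative pretreatment, PLS regression, export of coefficients) is shown below using placeholder arrays; the array shapes and the number of PLS factors are assumptions, not the study's settings.

```python
# Sketch only: predicting a soil analyte from reflectance spectra with a simple
# first-derivative pretreatment and PLS regression; data are random placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_bands = 315, 210          # e.g. 400-2450 nm at ~10 nm spacing (assumed)
X = rng.random((n_samples, n_bands))   # placeholder reflectance spectra
y = rng.random(n_samples)              # placeholder carbon content (%)

X_deriv = np.gradient(X, axis=1)       # first-derivative spectral pretreatment

pls = PLSRegression(n_components=10)
r2 = cross_val_score(pls, X_deriv, y, cv=5, scoring="r2").mean()
pls.fit(X_deriv, y)
coefficients = pls.coef_.ravel()       # exportable vector for per-pixel map calculation
print(f"cross-validated R^2 on placeholder data: {r2:.2f}")
```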
Quality-control materials in the USDA National Food and Nutrient Analysis Program (NFNAP).
Phillips, Katherine M; Patterson, Kristine Y; Rasor, Amy S; Exler, Jacob; Haytowitz, David B; Holden, Joanne M; Pehrsson, Pamela R
2006-03-01
The US Department of Agriculture (USDA) Nutrient Data Laboratory (NDL) develops and maintains the USDA National Nutrient Databank System (NDBS). Data are released from the NDBS for scientific and public use through the USDA National Nutrient Database for Standard Reference (SR) ( http://www.ars.usda.gov/ba/bhnrc/ndl ). In 1997 the NDL initiated the National Food and Nutrient Analysis Program (NFNAP) to update and expand its food-composition data. The program included: 1) nationwide probability-based sampling of foods; 2) central processing and archiving of food samples; 3) analysis of food components at commercial, government, and university laboratories; 4) incorporation of new analytical data into the NDBS; and 5) dissemination of these data to the scientific community. A key feature and strength of the NFNAP was a rigorous quality-control program that enabled independent verification of the accuracy and precision of analytical results. Custom-made food-control composites and/or commercially available certified reference materials were sent to the laboratories, blinded, with the samples. Data for these materials were essential to ongoing monitoring of analytical work, to identifying and resolving suspected analytical problems, and to ensuring the accuracy and precision of results for the NFNAP food samples.
Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.
Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min
2013-12-01
Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical since videos may not be semantically annotated sufficiently, there may be a lack of suitable training data, and the search requirements of the user may frequently change for different tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing the watching of videos, and visualizing aggregated information of the search results. We demonstrate the system for searching spatiotemporal attributes in sports video to identify key instances of team and player performance.
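The active-learning component can be illustrated generically with an uncertainty-sampling loop; the sketch below is not the authors' system, and the synthetic feature vectors merely stand in for sketch/video descriptors.

```python
# Generic uncertainty-sampling active-learning loop; everything here is a toy
# illustration of the idea, not the visual analytics system described above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(500, 16))                      # unlabeled video-segment features
y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)   # hidden ground truth (toy)

# Seed the labeled set with a few examples of each class.
labeled = list(np.where(y_pool == 0)[0][:5]) + list(np.where(y_pool == 1)[0][:5])
model = LogisticRegression()

for _ in range(20):                                  # 20 simulated user-labeling rounds
    model.fit(X_pool[labeled], y_pool[labeled])
    proba = model.predict_proba(X_pool)[:, 1]
    uncertainty = np.abs(proba - 0.5)                # closest to 0.5 means least certain
    uncertainty[labeled] = np.inf                    # do not re-query labeled items
    query = int(np.argmin(uncertainty))              # the item a user would label next
    labeled.append(query)

print("accuracy on pool:", model.score(X_pool, y_pool))
```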
Modeling of a rotary motor driven by an anisotropic piezoelectric composite laminate.
Zhu, M L; Lee, S R; Zhang, T Y; Tong, P
2000-01-01
This paper proposes an analytical model of a rotary motor driven by an anisotropic piezoelectric composite laminate. The driving element of the motor is a three-layer laminated plate. A piezoelectric layer is sandwiched between two anti-symmetric composite laminae. Because of the material anisotropy and the anti-symmetric configuration, torsional vibration can be induced through the in-plane strain actuated by the piezoelectric layer. The advantages of the motor are its magnetic field immunity, simple structure, easy maintenance, low cost, and good low-speed performance. In this paper, the motor is considered to be a coupled dynamic system. The analytical model includes the longitudinal and torsional vibrations of the laminate and the rotating motion of the rotor under the action of contact forces. The analytical model can predict the overall characteristics of the motor, including the modal frequency and the response of motion of the laminate, the rotating speed of the rotor, the input power, the output power, and the efficiency of the motor. The effects of the initial compressive force, the applied voltage, the moment of rotor inertia, and the frictional coefficient of the contact interface on the characteristics of the motor are simulated and discussed. A selection of the numerical results from the analytical model is confirmed by experimental data.
NASA Astrophysics Data System (ADS)
Bastianello, Alvise; Piroli, Lorenzo; Calabrese, Pasquale
2018-05-01
We derive exact analytic expressions for the n-body local correlations in the one-dimensional Bose gas with contact repulsive interactions (Lieb-Liniger model) in the thermodynamic limit. Our results are valid for arbitrary states of the model, including ground and thermal states, stationary states after a quantum quench, and nonequilibrium steady states arising in transport settings. Calculations for these states are explicitly presented and physical consequences are critically discussed. We also show that the n-body local correlations are directly related to the full counting statistics for the particle-number fluctuations in a short interval, for which we provide an explicit analytic result.
Semi-Analytic Reconstruction of Flux in Finite Volume Formulations
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2006-01-01
Semi-analytic reconstruction uses the analytic solution to a second-order, steady, ordinary differential equation (ODE) to simultaneously evaluate the convective and diffusive flux at all interfaces of a finite volume formulation. The second-order ODE is itself a linearized approximation to the governing first- and second-order partial differential equation conservation laws. Thus, semi-analytic reconstruction defines a family of formulations for finite volume interface fluxes using analytic solutions to approximating equations. Limiters are not applied in a conventional sense; rather, diffusivity is adjusted in the vicinity of changes in sign of eigenvalues in order to achieve a sufficiently small cell Reynolds number in the analytic formulation across critical points. Several approaches for application of semi-analytic reconstruction for the solution of one-dimensional scalar equations are introduced. Results are compared with exact analytic solutions to Burgers' equation as well as a conventional, upwind discretization using Roe's method. One approach, the end-point wave speed (EPWS) approximation, is further developed for more complex applications. One-dimensional vector equations are tested on a quasi-one-dimensional nozzle application. The EPWS algorithm has a more compact difference stencil than Roe's algorithm, but reconstruction time is approximately a factor of four larger than for Roe. Though both are second-order accurate schemes, Roe's method approaches a grid-converged solution with fewer grid points. Reconstruction of flux in the context of multi-dimensional, vector conservation laws including effects of thermochemical nonequilibrium in the Navier-Stokes equations is developed.
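For intuition, a one-dimensional model problem (not Gnoffo's exact formulation) shows how an analytic ODE solution yields an interface flux that blends upwind and central behavior through the cell Peclet/Reynolds number; the notation below is assumed for illustration.

```latex
% Illustrative one-dimensional model problem only; u, nu, and the face interval
% [0,h] are assumed notation, not the paper's linearized system.
\[
  u\,\frac{dq}{dx} \;-\; \nu\,\frac{d^{2}q}{dx^{2}} \;=\; 0,
  \qquad q(0)=q_L,\quad q(h)=q_R ,
\]
\[
  F \;=\; u\,q \;-\; \nu\,\frac{dq}{dx}
    \;=\; u\,\frac{q_L\,e^{\mathrm{Pe}} - q_R}{e^{\mathrm{Pe}} - 1},
  \qquad \mathrm{Pe} \;=\; \frac{u\,h}{\nu}.
\]
% As Pe -> infinity the flux tends to the upwind value u q_L; as Pe -> 0 it tends
% to the central convective flux u(q_L+q_R)/2 minus the two-point diffusive flux
% nu (q_R - q_L)/h, so the blend is controlled by the cell Peclet/Reynolds number.
```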
Trace level detection of analytes using artificial olfactometry
NASA Technical Reports Server (NTRS)
Lewis, Nathan S. (Inventor); Severin, Erik J. (Inventor); Wong, Bernard (Inventor)
2002-01-01
The present invention provides a device for detecting the presence of an analyte, such as, for example, a lightweight device, including: a sample chamber having a fluid inlet port for the influx of the analyte; a fluid concentrator in flow communication with the sample chamber, wherein the fluid concentrator has an absorbent material capable of absorbing the analyte and capable of desorbing a concentrated analyte; and an array of sensors in fluid communication with the concentrated analyte to be released from the fluid concentrator.
Closing the brain-to-brain loop in laboratory testing.
Plebani, Mario; Lippi, Giuseppe
2011-07-01
The delivery of laboratory services was described 40 years ago and defined with the foremost concept of the "brain-to-brain turnaround time loop". This concept consists of several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more risky, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phases of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and it should be better defined as "total analytical error" to avoid any confusion and misinterpretation. According to the hierarchical approach to classifying strategies to set analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is at the top and therefore should be applied as much as possible to address analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN and Six Sigma, since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.
Assessing analytical comparability of biosimilars: GCSF as a case study.
Nupur, Neh; Singh, Sumit Kumar; Narula, Gunjan; Rathore, Anurag S
2016-10-01
The biosimilar industry is witnessing unprecedented growth, with the newer therapeutics increasing in complexity over time. A key step towards development of a biosimilar is to establish analytical comparability with the innovator product, which would otherwise affect the safety/efficacy profile of the product. Choosing appropriate analytical tools that can fulfil this objective by qualitatively and/or quantitatively assessing the critical quality attributes (CQAs) of the product is highly critical for establishing equivalence. These CQAs cover the primary and higher order structures of the product, product related variants and impurities, as well as process related impurities and host cell related impurities. In the present work, we use such an analytical platform for assessing comparability of five approved Granulocyte Colony Stimulating Factor (GCSF) biosimilars (Emgrast, Lupifil, Colstim, Neukine and Grafeel) to the innovator product, Neupogen(®). The comparability studies involve assessing structural homogeneity, identity, secondary structure, and product related modifications. Physicochemical analytical tools, including peptide mapping with mass determination, circular dichroism (CD) spectroscopy, reverse phase chromatography (RPC) and size exclusion chromatography (SEC), have been used in this exercise. Bioactivity assessment includes comparison of relative potency through in vitro cell proliferation assays. The results from extensive analytical examination offer robust evidence of structural and biological similarity of the products under consideration with the pertinent innovator product. For the most part, the biosimilar drugs were found to be comparable to the innovator drug. One anomaly that was identified was that three of the biosimilars had an atypical variant, which was reported as an oxidized species in the literature. However, upon further investigation using RPC-FLD and ESI-MS, we found that this is likely a conformational variant of the biotherapeutic being studied. Copyright © 2016 Elsevier B.V. All rights reserved.
Analysis of structural dynamic data from Skylab. Volume 1: Technical discussion
NASA Technical Reports Server (NTRS)
Demchak, L.; Harcrow, H.
1976-01-01
The results of a study to analyze data and document dynamic program highlights of the Skylab Program are presented. Included are structural model sources, illustration of the analytical models, utilization of models and the resultant derived data, data supplied to organization and subsequent utilization, and specifications of model cycles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jo, J.
This document is a report of the analytical results for samples collected from the radioactive wastes in Tank 241-U-202 at the Hanford Reservation. Core samples were collected from the solid wastes in the tank and underwent safety screening analyses including differential scanning calorimetry, thermogravimetric analysis, and total alpha analysis. Results indicate that no safety screening notification limits were exceeded.
The Effects of Measurement Error on Statistical Models for Analyzing Change. Final Report.
ERIC Educational Resources Information Center
Dunivant, Noel
The results of six major projects are discussed including a comprehensive mathematical and statistical analysis of the problems caused by errors of measurement in linear models for assessing change. In a general matrix representation of the problem, several new analytic results are proved concerning the parameters which affect bias in…
Walker, M J; Burns, D T; Elliott, C T; Gowland, M H; Mills, E N Clare
2016-01-07
Food allergy is an increasing problem for those affected, their families or carers, the food industry and for regulators. The food supply chain is highly vulnerable to fraud involving food allergens, risking fatalities and severe reputational damage to the food industry. Many facets are being pursued to ameliorate the difficulties including better food labelling and the concept of thresholds of elicitation of allergy symptoms as risk management tools. These efforts depend to a high degree on the ability reliably to detect and quantify food allergens; yet all current analytical approaches exhibit severe deficiencies that jeopardise accurate results being produced, particularly in terms of the risks of false positive and false negative reporting. If we fail to realise the promise of current risk assessment and risk management of food allergens through lack of the ability to measure food allergens reproducibly and with traceability to an international unit of measurement, the analytical community will have failed a significant societal challenge. Three distinct but interrelated areas of analytical work are urgently needed to address the substantial gaps identified: (a) a coordinated international programme for the production of properly characterised clinically relevant reference materials and calibrants for food allergen analysis; (b) an international programme to widen the scope of proteomics and genomics bioinformatics for the genera containing the major allergens to address problems in ELISA, MS and DNA methods; (c) the initiation of a coordinated international programme leading to reference methods for allergen proteins that provide results traceable to the SI. This article describes in more detail food allergy, the risks of inapplicable or flawed allergen analyses with examples and a proposed framework, including clinically relevant incurred allergen concentrations, to address the currently unmet and urgently required analytical requirements. Support for the above recommendations from food authorities, business organisations and National Measurement Institutes is important; however, transparent international coordination is essential. Thus our recommendations are primarily addressed to the European Commission, the Health and Food Safety Directorate, DG Santé. A global multidisciplinary consortium is required to provide a curated suite of data including genomic and proteomic data on key allergenic food sources, made publicly available online.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions in relation to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion about their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
The HVT technique and the 'uncertainty' relation for central potentials
NASA Astrophysics Data System (ADS)
Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th
2004-08-01
The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Poeschl-Teller and the Gaussian one. It is shown that this technique is quite suitable for deriving an approximate analytic expression, in the form of a truncated power series expansion, for the dimensionless product P_nl ≡ ⟨r^2⟩_nl ⟨p^2⟩_nl / ħ^2, for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for P_nl can be derived. Finally, numerical results are given and discussed.
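For context, the quantity studied and the lower bound it is usually compared against can be written as below; this is the standard statement for bound states with vanishing mean position and momentum, and the paper's truncated HVT expansion is not reproduced here.

```latex
\[
  P_{nl} \;\equiv\; \frac{\langle r^{2}\rangle_{nl}\,\langle p^{2}\rangle_{nl}}{\hbar^{2}}
  \;\ge\; \frac{9}{4},
\]
% which holds for bound states with vanishing mean position and momentum, as for
% eigenstates of a central potential well; the HVT technique approximates P_nl by
% a truncated power series in the small dimensionless parameter s.
```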
Analytical spectrum for a Hamiltonian of quantum dots with Rashba spin-orbit coupling
NASA Astrophysics Data System (ADS)
Dossa, Anselme F.; Avossevou, Gabriel Y. H.
2014-12-01
We determine the analytical solution for a Hamiltonian describing a confined charged particle in a quantum dot, including Rashba spin-orbit coupling and Zeeman splitting terms. The approach followed in this paper is straightforward and uses the symmetrization of the wave function's components. The eigenvalue problem for the Hamiltonian in Bargmann's Hilbert space reduces to a system of coupled first-order differential equations. Then we exploit the symmetry in the system to obtain uncoupled second-order differential equations, which are found to be the Whittaker-Ince limit of the confluent Heun equations. Analytical expressions as well as numerical results are obtained for the spectrum. One of the main features of such models, namely, the level splitting, is present through the spectrum obtained in this paper.
A Family of Vortices to Study Axisymmetric Vortex Breakdown and Reconnection
NASA Technical Reports Server (NTRS)
Young, Larry A.
2007-01-01
A new analytic model describing a family of vortices has been developed to study some of the axisymmetric vortex breakdown and reconnection fluid dynamic processes underlying body-vortex interactions that are frequently manifested in rotorcraft and propeller-driven fixed-wing aircraft wakes. The family of vortices incorporates a wide range of prescribed initial vorticity distributions -- including single or dual-core vorticity distributions. The result is analytical solutions for the vorticity and velocities for each member of the family of vortices. This model is of sufficient generality to further illustrate the dependence of vortex reconnection and breakdown on initial vorticity distribution as was suggested by earlier analytical work. This family of vortices, though laminar in nature, is anticipated to provide valuable insight into the vortical evolution of large-scale rotor and propeller wakes.
Liu, Ken H.; Walker, Douglas I.; Uppal, Karan; Tran, ViLinh; Rohrbeck, Patricia; Mallon, Timothy M.; Jones, Dean P.
2016-01-01
Objective: To maximize detection of serum metabolites with high-resolution metabolomics (HRM). Methods: Department of Defense Serum Repository (DoDSR) samples were analyzed using ultra-high resolution mass spectrometry with three complementary chromatographic phases and four ionization modes. Chemical coverage was evaluated by the number of ions detected and accurate mass matches to a human metabolomics database. Results: Individual HRM platforms provided accurate mass matches for up to 58% of the KEGG metabolite database. Combining two analytical methods increased matches to 72%, and included metabolites in most major human metabolic pathways and chemical classes. Detection and feature quality varied by analytical configuration. Conclusions: Dual chromatography HRM with positive and negative electrospray ionization provides an effective generalized method for metabolic assessment of military personnel. PMID:27501105
Comazzi, S; Cozzi, M; Bernardi, S; Zanella, D R; Aresu, L; Stefanello, D; Marconato, L; Martini, V
2018-02-01
Flow cytometry (FC) is increasingly being used for immunophenotyping and staging of canine lymphoma. The aim of this retrospective study was to assess pre-analytical variables that might influence the diagnostic utility of FC of lymph node (LN) fine needle aspirate (FNA) specimens from dogs with lymphoproliferative diseases. The study included 987 cases with LN FNA specimens sent for immunophenotyping that were submitted to a diagnostic laboratory in Italy from 2009 to 2015. Cases were grouped into 'diagnostic' and 'non-diagnostic'. Pre-analytical factors analysed by univariate and multivariate analyses were animal-related factors (breed, age, sex, size), operator-related factors (year, season, shipping method, submitting veterinarian) and sample-related factors (type of sample material, cellular concentration, cytological smears, artefacts). The submitting veterinarian, sample material, sample cellularity and artefacts affected the likelihood of having a diagnostic sample. The availability of specimens from different sites and of cytological smears increased the odds of obtaining a diagnostic result. Major artefacts affecting diagnostic utility included poor cellularity and the presence of dead cells. Flow cytometry on LN FNA samples yielded conclusive results in more than 90% of cases with adequate sample quality and sampling conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila
2015-03-10
We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using an MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
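A bare-bones PSO loop in the spirit of the calibration is sketched below; the two-parameter objective is a toy stand-in for the comparison of SAM output against observed galaxy properties, and the swarm settings are assumptions.

```python
# Minimal particle swarm optimization sketch; not the SAG/PSO pipeline itself.
import numpy as np

rng = np.random.default_rng(42)

def neg_log_like(theta):
    # Toy objective with optimum at (1.5, -0.7); a real run would call the SAM here.
    return (theta[0] - 1.5) ** 2 + 2.0 * (theta[1] + 0.7) ** 2

n_particles, n_iter, dim = 20, 100, 2
w, c1, c2 = 0.7, 1.5, 1.5                         # inertia and acceleration weights (assumed)
x = rng.uniform(-5, 5, (n_particles, dim))        # particle positions
v = np.zeros_like(x)                              # particle velocities
pbest = x.copy()
pbest_val = np.array([neg_log_like(p) for p in x])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = np.array([neg_log_like(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best parameters:", gbest, "objective:", pbest_val.min())
```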
Aptamer- and nucleic acid enzyme-based systems for simultaneous detection of multiple analytes
Lu, Yi [Champaign, IL; Liu, Juewen [Albuquerque, NM
2011-11-15
The present invention provides aptamer- and nucleic acid enzyme-based systems for simultaneously determining the presence and optionally the concentration of multiple analytes in a sample. Methods of utilizing the system and kits that include the sensor components are also provided. The system includes a first reactive polynucleotide that reacts to a first analyte; a second reactive polynucleotide that reacts to a second analyte; a third polynucleotide; a fourth polynucleotide; a first particle, coupled to the third polynucleotide; a second particle, coupled to the fourth polynucleotide; and at least one quencher, for quenching emissions of the first and second quantum dots, coupled to the first and second reactive polynucleotides. The first particle includes a quantum dot having a first emission wavelength. The second particle includes a second quantum dot having a second emission wavelength different from the first emission wavelength. The third polynucleotide and the fourth polynucleotide are different.
NASA Astrophysics Data System (ADS)
Fischer, Ulrich; Celia, Michael A.
1999-04-01
Functional relationships for unsaturated flow in soils, including those between capillary pressure, saturation, and relative permeabilities, are often described using analytical models based on the bundle-of-tubes concept. These models are often limited by, for example, inherent difficulties in prediction of absolute permeabilities, and in incorporation of a discontinuous nonwetting phase. To overcome these difficulties, an alternative approach may be formulated using pore-scale network models. In this approach, the pore space of the network model is adjusted to match retention data, and absolute and relative permeabilities are then calculated. A new approach that allows more general assignments of pore sizes within the network model provides for greater flexibility to match measured data. This additional flexibility is especially important for simultaneous modeling of main imbibition and drainage branches. Through comparisons between the network model results, analytical model results, and measured data for a variety of both undisturbed and repacked soils, the network model is seen to match capillary pressure-saturation data nearly as well as the analytical model, to predict water phase relative permeabilities equally well, and to predict gas phase relative permeabilities significantly better than the analytical model. The network model also provides very good estimates for intrinsic permeability and thus for absolute permeabilities. Both the network model and the analytical model lost accuracy in predicting relative water permeabilities for soils characterized by a van Genuchten exponent n≲3. Overall, the computational results indicate that reliable predictions of both relative and absolute permeabilities are obtained with the network model when the model matches the capillary pressure-saturation data well. The results also indicate that measured imbibition data are crucial to good predictions of the complete hysteresis loop.
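For reference, the analytical model referred to above is typically the van Genuchten-Mualem family; the sketch below evaluates the standard closed forms with illustrative parameter values, which are assumptions rather than values from the study.

```python
# Standard van Genuchten retention curve and Mualem relative permeability;
# parameter values are illustrative only.
import numpy as np

def van_genuchten(h, alpha=0.02, n=2.0, theta_r=0.05, theta_s=0.40):
    """Water content theta(h) and relative permeability k_r(h), h = suction head (cm)."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)                  # effective saturation
    theta = theta_r + (theta_s - theta_r) * se                     # retention curve
    kr = np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2   # Mualem model
    return theta, kr

h = np.array([10.0, 50.0, 100.0, 500.0])
theta, kr = van_genuchten(h)
for head, th, k in zip(h, theta, kr):
    print(f"h = {head:6.0f} cm  theta = {th:.3f}  k_r = {k:.3e}")
```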
A research program to reduce interior noise in general aviation airplanes. [test methods and results
NASA Technical Reports Server (NTRS)
Roskam, J.; Muirhead, V. U.; Smith, H. W.; Peschier, T. D.; Durenberger, D.; Vandam, K.; Shu, T. C.
1977-01-01
Analytical and semi-empirical methods for determining the transmission of sound through isolated panels and predicting panel transmission loss are described. Test results presented include the influence of plate stiffness and mass and the effects of pressurization and vibration damping materials on sound transmission characteristics. Measured and predicted results are presented in tables and graphs.
Improved explosive collection and detection with rationally assembled surface sampling materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chouyyok, Wilaiwan; Bays, J. Timothy; Gerasimenko, Aleksandr A.
Sampling and detection of trace explosives is a key analytical process in modern transportation safety. In this work we have explored some of the fundamental analytical processes for collection and detection of trace-level explosives on surfaces with the most widely utilized system, thermal desorption IMS. The performance of the standard muslin swipe material was compared with chemically modified fiberglass cloth. The fiberglass surface was modified to include phenyl functional groups. When compared to standard muslin, the phenyl-functionalized fiberglass sampling material showed better analyte release from the sampling material as well as improved response and repeatability from multiple uses of the same swipe. The improved sample release of the functionalized fiberglass swipes resulted in a significant increase in sensitivity. Various physical and chemical properties were systematically explored to determine optimal performance. The results herein have relevance to improving the detection of other explosive compounds and potentially to a wide range of other chemical sampling and field detection challenges.
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies have a strong association with the process of KM for attaining competitive advantage. KM is strongly influenced by human and social factors and turns them into the most valuable assets when an efficient system is run under BI tactics and technologies. However, the term predictive analytics is rooted in the field of BI. Extracting tacit knowledge is a big challenge when it is to be used as a new source for BI analysis. Advanced analytic methods that address the diversity of the data corpus, structured and unstructured, require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive and prescriptive results. This is a major challenge, and this paper aims to elaborate on it in this initial work.
NASA Technical Reports Server (NTRS)
Everett, L.
1992-01-01
This report documents the performance characteristics of a Targeting Reflective Alignment Concept (TRAC) sensor. The performance is documented for both short and long ranges. For long ranges, the sensor is used without the flat mirror attached to the target. To better understand the capabilities of TRAC-based sensors, an engineering model is required. The model can be used to better design the system for a particular application. This is necessary because there are many interrelated design variables in an application. These include lens parameters, camera configuration, and target configuration. The report presents first an analytical development of the performance, and second an experimental verification of the equations. In the analytical presentation it is assumed that the best vision resolution is a single pixel element. The experimental results suggest, however, that the resolution is better than 1 pixel. Hence the analytical results should be considered worst-case estimates. The report also discusses advantages and limitations of the TRAC sensor in light of the performance estimates. Finally, the report discusses potential improvements.
Analytical Model of Large Data Transactions in CoAP Networks
Ludovici, Alessandro; Di Marco, Piergiuseppe; Calveras, Anna; Johansson, Karl H.
2014-01-01
We propose a novel analytical model to study fragmentation methods in wireless sensor networks adopting the Constrained Application Protocol (CoAP) and the IEEE 802.15.4 standard for medium access control (MAC). The blockwise transfer technique proposed in CoAP and 6LoWPAN fragmentation are included in the analysis. The two techniques are compared in terms of reliability and delay, depending on the traffic, the number of nodes and the parameters of the IEEE 802.15.4 MAC. The results are validated through Monte Carlo simulations. To the best of our knowledge this is the first study that analytically evaluates and compares the performance of CoAP blockwise transfer and 6LoWPAN fragmentation. A major contribution is the possibility to understand the behavior of both techniques under different network conditions. Our results show that 6LoWPAN fragmentation is preferable for delay-constrained applications. For highly congested networks, blockwise transfer slightly outperforms 6LoWPAN fragmentation in terms of reliability. PMID:25153143
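A back-of-the-envelope sketch of why the two techniques differ is given below: it counts link-layer frames for a payload under 6LoWPAN fragmentation versus CoAP blockwise transfer. The per-frame payload budget and the two-frames-per-block assumption are rough illustrations, not parameters from the analytical model.

```python
# Rough frame-count comparison; the overhead figures below are assumptions.
import math

FRAME_PAYLOAD = 80        # assumed usable bytes per IEEE 802.15.4 frame after headers
BLOCK_SIZE = 64           # CoAP blockwise block size (valid sizes are 16..1024, powers of 2)

def frames_6lowpan(payload_bytes):
    """One IP packet fragmented at the adaptation layer; no per-fragment CoAP exchange."""
    return math.ceil(payload_bytes / FRAME_PAYLOAD)

def frames_blockwise(payload_bytes):
    """Each block is its own CoAP request/response exchange (2 frames per block, assumed)."""
    n_blocks = math.ceil(payload_bytes / BLOCK_SIZE)
    return 2 * n_blocks

for size in (256, 512, 1024):
    print(f"{size:5d} B payload: 6LoWPAN {frames_6lowpan(size):3d} frames, "
          f"blockwise {frames_blockwise(size):3d} frames")
```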
Sakaguchi, Yohei; Yoshida, Hideyuki; Todoroki, Kenichiro; Nohta, Hitoshi; Yamaguchi, Masatoshi
2009-06-15
We have developed a new and simple method based on "fluorous derivatization" for LC of native fluorescent compounds. This method involves the use of a column with a fluorous stationary phase. Native fluorescent analytes with target functional groups are precolumn derivatized with a nonfluorescent fluorous tag, and the fluorous-labeled analytes are retained in the column, whereas underivatized substances are not. Only the retained fluorescent analytes are detected fluorometrically at appropriate retention times, and retained substrates without fluorophores are not detected. In this study, biologically important carboxylic acids (homovanillic acid, vanillylmandelic acid, and 5-hydroxyindoleacetic acid) and drugs (naproxen, felbinac, flurbiprofen, and etodolac) were used as model native fluorescent compounds. Experimental results indicate that the fluorous-phase column can selectively retain fluorous compounds including fluorous-labeled analytes on the basis of fluorous separation. We believe that separation-oriented derivatization presented here is the first step toward the introduction of fluorous derivatization in quantitative LC analysis.
Kurylyk, Barret L.; McKenzie, Jeffrey M; MacQuarrie, Kerry T. B.; Voss, Clifford I.
2014-01-01
Numerous cold regions water flow and energy transport models have emerged in recent years. Dissimilarities often exist in their mathematical formulations and/or numerical solution techniques, but few analytical solutions exist for benchmarking flow and energy transport models that include pore water phase change. This paper presents a detailed derivation of the Lunardini solution, an approximate analytical solution for predicting soil thawing subject to conduction, advection, and phase change. Fifteen thawing scenarios are examined by considering differences in porosity, surface temperature, Darcy velocity, and initial temperature. The accuracy of the Lunardini solution is shown to be proportional to the Stefan number. The analytical solution results obtained for soil thawing scenarios with water flow and advection are compared to those obtained from the finite element model SUTRA. Three problems, two involving the Lunardini solution and one involving the classic Neumann solution, are recommended as standard benchmarks for future model development and testing.
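For reference, the Stefan number mentioned above is conventionally defined as the ratio of sensible to latent heat in the thawed zone; the notation below is assumed and may differ from the paper's symbols.

```latex
\[
  \mathrm{St} \;=\; \frac{c\,(T_s - T_f)}{L_f},
\]
% c: volumetric heat capacity of the thawed soil, T_s: applied surface temperature,
% T_f: freezing temperature of the pore water, L_f: volumetric latent heat of fusion.
% The abstract notes that the accuracy of the approximate Lunardini solution is
% proportional to this number.
```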
Smartphone-based portable wireless optical system for the detection of target analytes.
Gautam, Shreedhar; Batule, Bhagwan S; Kim, Hyo Yong; Park, Ki Soo; Park, Hyun Gyu
2017-02-01
Rapid and accurate on-site wireless measurement of hazardous molecules or biomarkers is one of the biggest challenges in nanobiotechnology. A novel smartphone-based Portable and Wireless Optical System (PAWS) for rapid, quantitative, and on-site analysis of target analytes is described. As a proof-of-concept, we employed gold nanoparticles (GNP) and an enzyme, horseradish peroxidase (HRP), to generate colorimetric signals in response to two model target molecules, melamine and hydrogen peroxide, respectively. The colorimetric signal produced by the presence of the target molecules is converted to an electrical signal by the inbuilt electronic circuit of the device. The converted electrical signal is then measured wirelessly via a multimeter connected to the smartphone, which processes the data and displays the results, including the concentration of the analytes and its significance. This handheld device has great potential as a programmable and miniaturized platform to achieve rapid and on-site detection of various analytes in a point-of-care testing (POCT) manner. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Risk analysis by FMEA as an element of analytical validation.
van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M
2009-12-05
We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D) and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
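The ranking step is simple enough to show directly; the failure modes and scores below are invented examples rather than those identified for the NIR procedure.

```python
# Minimal sketch of the RPN ranking step; the entries are hypothetical.
failure_modes = [
    # (description, occurrence O, detectability D, severity S), each scored 1-10
    ("sample presented in wrong orientation", 4, 6, 7),
    ("spectral library outdated",             3, 8, 9),
    ("operator skips reference scan",         5, 4, 6),
]

ranked = sorted(((o * d * s, name) for name, o, d, s in failure_modes), reverse=True)
for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
# Failure modes with the highest RPN are the first candidates for corrective action.
```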
NASA Technical Reports Server (NTRS)
Phatak, A. V.
1980-01-01
A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-01-01
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
NASA Astrophysics Data System (ADS)
Irwanto, Rohaeti, Eli; LFX, Endang Widjajanti; Suyanta
2017-05-01
This research aims to develop an integrated assessment instrument and determine its characteristics. The research uses the 4-D model, which includes define, design, develop, and disseminate. The primary product was validated by expert judgment, tested for readability by students, and assessed for feasibility by chemistry teachers. The research involved 246 students of grade XI from four senior high schools in Yogyakarta, Indonesia. Data collection techniques included interviews, questionnaires, and tests; data collection instruments included an interview guideline, an item validation sheet, a users' response questionnaire, an instrument readability questionnaire, and an essay test. The results show that the integrated assessment instrument has an Aiken validity value of 0.95, an item reliability of 0.99, and a person reliability of 0.69. Teachers' responses to the integrated assessment instrument were very good. The integrated assessment instrument is therefore feasible for measuring students' analytical thinking and science process skills.
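For reference, the Aiken validity coefficient cited above is conventionally computed from expert ratings with the standard formula below; the study's rating scale and number of experts are not restated here.

```latex
% Aiken's V for one item rated by n experts on a c-point scale whose lowest category is l
V = \frac{\sum_{i=1}^{n} (r_i - l)}{n\,(c - 1)}, \qquad 0 \le V \le 1
```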
Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M
2008-01-01
Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163
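One plausible way to compute a per-county reliability score of the kind described above is the fraction of parameter-varied SaTScan runs in which the county falls inside a statistically significant cluster. The sketch below assumes that formulation and uses hypothetical inputs; it is not the authors' Visual Inquiry Toolkit implementation.

```python
# reliability[county] = fraction of runs in which the county lies in a significant cluster
def reliability_scores(runs, all_counties):
    """runs: list of sets of county FIPS codes flagged as significant in each SaTScan run."""
    n_runs = len(runs)
    return {c: sum(c in run for run in runs) / n_runs for c in all_counties}

# Hypothetical example: 3 parameterized runs over 4 counties
runs = [{"42001", "42003"}, {"42001"}, {"42001", "42005"}]
print(reliability_scores(runs, {"42001", "42003", "42005", "42007"}))
```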
Saraji, Mohammad; Ghambari, Hoda
2018-06-21
In this work we seek clues to select the appropriate dispersive liquid-liquid microextraction mode for extracting three categories of compounds. For this purpose, three common dispersive liquid-liquid microextraction modes were compared under optimized conditions. Traditional dispersive liquid-liquid microextraction, in situ ionic liquid dispersive liquid-liquid microextraction and conventional ionic liquid dispersive liquid-liquid microextraction using chloroform, 1-butyl-3-methylimidazolium tetrafluoroborate, and 1-hexyl-3-methylimidazolium hexafluorophosphate as the extraction solvent, respectively, were considered in this work. Phenolic, neutral aromatic and amino compounds (each category included six members) were studied as analytes. The analytes in the extracts were determined by high-performance liquid chromatography with UV detection. For the analytes with polar functionalities, the in situ ionic liquid dispersive liquid-liquid microextraction mode mostly led to better results. In contrast, for neutral hydrocarbons without polar functionalities, traditional dispersive liquid-liquid microextraction using chloroform produced better results. In this case, where dispersion forces were the dominant interactions in the extraction, the refractive index of solvent and analyte predicted the extraction performance better than the octanol-water partition coefficient. It was also revealed that none of the methods were successful in extracting very hydrophilic analytes (compounds with the log octanol-water partition coefficient < 2). The results of this study could be helpful in selecting a dispersive liquid-liquid microextraction mode for the extraction of various groups of compounds. This article is protected by copyright. All rights reserved.
Han, Lijun; Matarrita, Jessie; Sapozhnikova, Yelena; Lehotay, Steven J
2016-06-03
This study demonstrates the application of a novel lipid removal product to the residue analysis of 65 pesticides and 52 environmental contaminants in kale, pork, salmon, and avocado by fast, low pressure gas chromatography - tandem mass spectrometry (LPGC-MS/MS). Sample preparation involves QuEChERS extraction followed by use of EMR-Lipid ("enhanced matrix removal of lipids") and an additional salting out step for cleanup. The optimal amount of EMR-Lipid was determined to be 500mg for 2.5mL extracts for most of the analytes. The co-extractive removal efficiency by the EMR-Lipid cleanup step was 83-98% for fatty samples and 79% for kale, including 76% removal of chlorophyll. Matrix effects were typically less than ±20%, in part because analyte protectants were used in the LPGC-MS/MS analysis. The recoveries of polycyclic aromatic hydrocarbons and diverse pesticides were mostly 70-120%, whereas recoveries of nonpolar polybrominated diphenyl ethers and polychlorinated biphenyls were mostly lower than 70% through the cleanup procedure. With the use of internal standards, method validation results showed that 76-85 of the 117 analytes achieved satisfactory results (recoveries of 70-120% and RSD≤20%) in pork, avocado, and kale, while 53 analytes had satisfactory results in salmon. Detection limits were 5-10ng/g for all but a few analytes. EMR-Lipid is a new sample preparation tool that serves as another useful option for cleanup in multiresidue analysis, particularly of fatty foods. Published by Elsevier B.V.
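The satisfactory-result criterion used above (recovery of 70-120% and RSD ≤ 20%) is easy to apply programmatically; the snippet below is a generic illustration with made-up numbers, not the authors' validation workflow.

```python
# Flag analytes meeting the stated validation criteria (values are hypothetical)
results = {  # analyte: (mean recovery %, RSD %)
    "chlorpyrifos": (95.2, 8.1),
    "BDE-209":      (58.4, 14.7),
    "pyrene":       (101.6, 22.3),
}
satisfactory = {a for a, (rec, rsd) in results.items() if 70 <= rec <= 120 and rsd <= 20}
print(satisfactory)  # -> {'chlorpyrifos'}
```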
Sleno, Lekha; Volmer, Dietrich A
2006-01-01
Growing interest in the ability to conduct quantitative assays for small molecules by matrix-assisted laser desorption/ionization (MALDI) has been the driving force for several recent studies. This present work includes the investigation of internal standards for these analyses using a high-repetition rate MALDI triple quadrupole instrument. Certain physicochemical properties are assessed for predicting possible matches for internal standards for different small molecules. The importance of similar molecular weight of an internal standard to its analyte is seen through experiments with a series of acylcarnitines, having a fixed charge site and growing alkyl chain length. Both acetyl- and hexanoyl-carnitine were systematically assessed with several other acylcarnitine compounds as internal standards. The results clearly demonstrate that closely matched molecular weights between analyte and internal standard are essential for acceptable quantitation results. Using alpha-cyano-4-hydroxycinnamic acid as the organic matrix, the similarities between analyte and internal standard remain the most important parameter and not necessarily their even distribution within the solid sample spot. Several 4-quinolone antibiotics as well as a diverse group of pharmaceutical drugs were tested as internal standards for the 4-quinolone, ciprofloxacin. Quantitative results were shown using the solution-phase properties, log D and pKa, of these molecules. Their distribution coefficients, log D, are demonstrated as a fundamental parameter for similar crystallization patterns of analyte and internal standard. In the end, it was also possible to quantify ciprofloxacin using a drug from a different compound class, namely quinidine, having a similar log D value as the analyte. Copyright 2006 John Wiley & Sons, Ltd.
21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Restrictions on the sale, distribution and use of... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte... include the statement for class I exempt ASR's: “Analyte Specific Reagent. Analytical and performance...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hennebert, Pierre, E-mail: pierre.hennebert@ineris.fr; Papin, Arnaud; Padox, Jean-Marie
Highlights: • Knowledge of wastes in substances will be necessary to assess HP1–HP15 hazard properties. • A new analytical protocol is proposed for this and tested by two service laboratories on 32 samples. • Sixty-three percent of the samples have a satisfactory analytical balance between 90% and 110%. • Eighty-four percent of the samples were classified identically (Seveso Directive) for their hazardousness by the two laboratories. • The method, in progress, is being normalized in France and will be proposed to CEN. - Abstract: The classification of waste as hazardous could soon be assessed in Europe largely using the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC–MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of ‘pools’ of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved ‘mass’ during the chromatography of volatile and semi-volatile compounds. The concentrations of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved ‘pools’) should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results. Regular use of this protocol (which is now included in the French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested, and most of the liquid samples (difficulties were caused in some samples by polymers in solution and vegetable oil). The protocol has been submitted to the French and European normalization bodies (AFNOR and CEN) and further improvements are awaited.
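The consistency check described above (the sum of measured constituents, including the unresolved 'pools', falling between 90% and 110% of the sample mass) can be expressed as a short calculation. The sketch below uses hypothetical constituent data and only illustrates the balance criterion, not the protocol itself.

```python
# Analytical (mass) balance check: sum of constituent concentrations expressed in % of sample mass
constituents = {  # hypothetical values
    "minerals/metals": 34.0,
    "organo-halogens": 0.4,
    "semi-volatile organics": 12.5,
    "unresolved organic pools": 46.0,
}
total = sum(constituents.values())
print(f"balance = {total:.1f}%  ->  {'OK' if 90 <= total <= 110 else 'investigate'}")
```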
The Relation of Empathy and Defending in Bullying: A Meta-Analytic Investigation
ERIC Educational Resources Information Center
Nickerson, Amanda B.; Aloe, Ariel M.; Werth, Jilynn M.
2015-01-01
This meta-analysis synthesized results about the association between empathy and defending in bullying. A total of 20 studies were included in the analysis: 22 effect sizes came from 6 studies that separated findings by the defender's gender, and 31 effect sizes came from 18 studies that provided effects for the total sample. The weighted…
Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko
2014-01-01
Technical progress has simplified tasks in lab diagnosis and improved quality of test results. Errors occurring during the pre-analytical phase have more negative impact on the quality of test results than errors encountered during the total analytical process. Different infrastructures of sampling sites can highly influence the quality of samples and therewith of analytical results. Annually the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples of 120–150 volunteers each on four different sampling sites in Germany. Overarching goal is to investigate the exposure to environmental pollutants of non-occupational exposed young adults combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility to realize a highly standardized sampling procedure on a mobile platform in order to increase the required quality of the pre-analytical phase. The results lead to the development of a mobile epidemiologic laboratory (epiLab) in the project “Labor der Zukunft” (future’s lab technology). This laboratory includes a 14.7 m2 reception area to record medical history and exposure-relevant behavior, a 21.1 m2 examination room to record dental fillings and for blood withdrawal, a 15.5 m2 biological safety level 2 laboratory to process and analyze samples on site including a 2.8 m2 personnel lock and a 3.6 m2 cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, we herewith describe for the first time the implementation of a biological safety laboratory (BSL) 2 lab and an epidemiologic unit on a single mobile platform. Since 2013 we have been collecting up to 15.000 individual human samples annually under highly standardized conditions using the mobile laboratory. Characterized and free of alterations they are kept ready for retrospective analyses in their final archive, the German ESB. PMID:25141120
Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko
2014-01-01
Technical progress has simplified tasks in lab diagnosis and improved quality of test results. Errors occurring during the pre-analytical phase have more negative impact on the quality of test results than errors encountered during the total analytical process. Different infrastructures of sampling sites can highly influence the quality of samples and therewith of analytical results. Annually the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples of 120-150 volunteers each on four different sampling sites in Germany. Overarching goal is to investigate the exposure to environmental pollutants of non-occupational exposed young adults combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility to realize a highly standardized sampling procedure on a mobile platform in order to increase the required quality of the pre-analytical phase. The results lead to the development of a mobile epidemiologic laboratory (epiLab) in the project "Labor der Zukunft" (future's lab technology). This laboratory includes a 14.7 m(2) reception area to record medical history and exposure-relevant behavior, a 21.1 m(2) examination room to record dental fillings and for blood withdrawal, a 15.5 m(2) biological safety level 2 laboratory to process and analyze samples on site including a 2.8 m(2) personnel lock and a 3.6 m2 cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, we herewith describe for the first time the implementation of a biological safety laboratory (BSL) 2 lab and an epidemiologic unit on a single mobile platform. Since 2013 we have been collecting up to 15.000 individual human samples annually under highly standardized conditions using the mobile laboratory. Characterized and free of alterations they are kept ready for retrospective analyses in their final archive, the German ESB.
Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.
Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok
2015-01-01
Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including the pre-, intra-, and post-analytical phases, and we discuss strategies pertinent to our setting to minimize their occurrence. We described the occurrence of pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Analytical errors are embedded within our total process setup, especially in the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.
Selected Analytical Methods for Environmental Remediation and Recovery (SAM) - Home
The SAM Home page provides access to all information provided in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), and includes a query function allowing users to search methods by analyte, sample type and instrumentation.
On trying something new: effort and practice in psychoanalytic change.
Power, D G
2000-07-01
This paper describes one of the ingredients of successful psychoanalytic change: the necessity for the analysand to actively attempt altered patterns of thinking, behaving, feeling, and relating outside of the analytic relationship. When successful, such self-initiated attempts at change are founded on insight and experience gained in the transference and constitute a crucial step in the consolidation and transfer of therapeutic gains. The analytic literature related to this aspect of therapeutic action is reviewed, including the work of Freud, Bader, Rangell, Renik, Valenstein, and Wheelis. Recent interest in the complex and complementary relationship between action and increased self-understanding as it unfolds in the analytic setting is extended beyond the consulting room to include the analysand's extra-analytic attempts to initiate change. Contemporary views of the relationship between praxis and self-knowledge are discussed and offered as theoretical support for broadening analytic technique to include greater attention to the analysand's efforts at implementing therapeutic gains. Case vignettes are presented.
NASA Astrophysics Data System (ADS)
Danesh Yazdi, M.; Klaus, J.; Condon, L. E.; Maxwell, R. M.
2017-12-01
Recent advancements in analytical solutions to quantify water and solute time-variant travel time distributions (TTDs) and the related StorAge Selection (SAS) functions synthesize catchment complexity into a simplified, lumped representation. While these analytical approaches are easy and efficient in application, they require high frequency hydrochemical data for parameter estimation. Alternatively, integrated hydrologic models coupled to Lagrangian particle-tracking approaches can directly simulate age under different catchment geometries and complexity at a greater computational expense. Here, we compare and contrast the two approaches by exploring the influence of the spatial distribution of subsurface heterogeneity, interactions between distinct flow domains, diversity of flow pathways, and recharge rate on the shape of TTDs and the relating SAS functions. To this end, we use a parallel three-dimensional variably saturated groundwater model, ParFlow, to solve for the velocity fields in the subsurface. A particle-tracking model, SLIM, is then implemented to determine the age distributions at every real time and domain location, facilitating a direct characterization of the SAS functions as opposed to analytical approaches requiring calibration of such functions. Steady-state results reveal that the assumption of random age sampling scheme might only hold in the saturated region of homogeneous catchments resulting in an exponential TTD. This assumption is however violated when the vadose zone is included as the underlying SAS function gives a higher preference to older ages. The dynamical variability of the true SAS functions is also shown to be largely masked by the smooth analytical SAS functions. As the variability of subsurface spatial heterogeneity increases, the shape of TTD approaches a power-law distribution function, including a broader distribution of shorter and longer travel times. We further found that larger (smaller) magnitude of effective precipitation shifts the scale of TTD towards younger (older) travel times, while the shape of the TTD remains untouched. This work constitutes a first step in linking a numerical transport model and analytical solutions of TTD to study their assumptions and limitations, providing physical inferences for empirical parameters.
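For reference, the "random age sampling" benchmark invoked above corresponds to the textbook well-mixed case: if outflow Q samples storage S uniformly by age, the steady-state travel time distribution of discharge is exponential with the turnover time as its scale. This is a standard result, not a finding specific to the study summarized here.

```latex
p_Q(T) = \frac{1}{\tau}\, e^{-T/\tau}, \qquad \tau = \frac{S}{Q}
```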
Analytic institutes: A guide to training in the United States
NASA Astrophysics Data System (ADS)
Blanken, Terry G.
This investigation was inspired by the researcher's desire to pursue psychoanalytic training subsequent to completion of her PhD in clinical psychology and the discovery that no comprehensive resource existed to assist prospective psychoanalytic candidates with identifying or evaluating psychoanalytic training opportunities. This dissertation therefore aspires to provide a comprehensive guide to analytic training in the United States today. The researcher presents the expanding horizons of depth-oriented training leading to certification as an analyst, including training based on those schools of thought that resulted from early splits with Freud (Adlerian and Jungian) as well as training based on thought that has remained within the Freudian theoretical umbrella (e.g., classical, object relations, self psychology, etc.). Employing a heuristic approach and using hermeneutics and systems theory methodologies, the study situates analytic training in its historical context, explores contemporary issues, and considers its future. The study reviews the various analytic schools of thought and traces the history of psychoanalytic theory from its origins with Freud through its many permutations. It then discusses the history of psychoanalytic training and describes political, social, and economic factors influencing the development of training in this country. The centerpiece of the dissertation is a guidebook offering detailed information on each of 107 training institutes in the United States. Tables provide contact data and information which differentiate the institutes in terms of such parameters as size; length of program, theoretical orientation, and accreditation. A narrative of each institute summarizes the unique aspects of the program, including its admissions policy, the requirements for the training analysis and supervised clinical work, and the didactic curriculum, along with lists of courses offered. Child and adolescent psychoanalytic training is also discussed for institutes offering this option. A discussion of the contemporary world of analytic training emerges from the results of the analysis of individual institutes. Both the variations and convergences among institutes are explored. Current problems and issues in training, accreditation, and licensing are addressed. Finally, the future of psychoanalytic training is considered; concluding with an assessment of needed reforms and presentation of a model for the ideal analytic training institute of the future.
Analytic model of a multi-electron atom
NASA Astrophysics Data System (ADS)
Skoromnik, O. D.; Feranchuk, I. D.; Leonau, A. U.; Keitel, C. H.
2017-12-01
A fully analytical approximation for the observable characteristics of many-electron atoms is developed via a complete and orthonormal hydrogen-like basis with a single-effective charge parameter for all electrons of a given atom. The basis completeness allows us to employ the secondary-quantized representation for the construction of regular perturbation theory, which includes in a natural way correlation effects, converges fast and enables an effective calculation of the subsequent corrections. The hydrogen-like basis set provides a possibility to perform all summations over intermediate states in closed form, including both the discrete and continuous spectra. This is achieved with the help of the decomposition of the multi-particle Green function in a convolution of single-electronic Coulomb Green functions. We demonstrate that our fully analytical zeroth-order approximation describes the whole spectrum of the system, provides accuracy, which is independent of the number of electrons and is important for applications where the Thomas-Fermi model is still utilized. In addition already in second-order perturbation theory our results become comparable with those via a multi-configuration Hartree-Fock approach.
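For orientation, a hydrogen-like basis with a single effective charge, as described above, has the familiar closed-form level energies below (a standard textbook result in atomic units; Z* here stands only schematically for the paper's single effective-charge parameter).

```latex
E_{n} = -\frac{Z^{*\,2}}{2\,n^{2}}\ \text{hartree}, \qquad n = 1, 2, \dots
```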
Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita
2015-08-28
Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using Ion Torrent Personal Genome Machine (PGM) (Life Technologies), a NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin fixed and paraffin embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.
Generalized Subset Designs in Analytical Chemistry.
Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan
2017-06-20
Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type that would be capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. It also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
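As a point of comparison for the generalized designs above, a conventional two-level fractional factorial is generated from a defining relation. The minimal sketch below builds a 2^(3-1) design with generator C = AB; it does not reproduce the article's multilevel subset algorithm.

```python
from itertools import product

# 2^(3-1) two-level fractional factorial, levels coded -1/+1, generator C = A*B
runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]
for a, b, c in runs:
    print(f"A={a:+d}  B={b:+d}  C={c:+d}")
```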
Analytically tractable climate-carbon cycle feedbacks under 21st century anthropogenic forcing
NASA Astrophysics Data System (ADS)
Lade, Steven J.; Donges, Jonathan F.; Fetzer, Ingo; Anderies, John M.; Beer, Christian; Cornell, Sarah E.; Gasser, Thomas; Norberg, Jon; Richardson, Katherine; Rockström, Johan; Steffen, Will
2018-05-01
Changes to climate-carbon cycle feedbacks may significantly affect the Earth system's response to greenhouse gas emissions. These feedbacks are usually analysed from numerical output of complex and arguably opaque Earth system models. Here, we construct a stylised global climate-carbon cycle model, test its output against comprehensive Earth system models, and investigate the strengths of its climate-carbon cycle feedbacks analytically. The analytical expressions we obtain aid understanding of carbon cycle feedbacks and the operation of the carbon cycle. Specific results include that different feedback formalisms measure fundamentally the same climate-carbon cycle processes; temperature dependence of the solubility pump, biological pump, and CO2 solubility all contribute approximately equally to the ocean climate-carbon feedback; and concentration-carbon feedbacks may be more sensitive to future climate change than climate-carbon feedbacks. Simple models such as that developed here also provide workbenches for simple but mechanistically based explorations of Earth system processes, such as interactions and feedbacks between the planetary boundaries, that are currently too uncertain to be included in comprehensive Earth system models.
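The concentration-carbon and climate-carbon feedback terminology used above is commonly quantified with the linear sensitivity parameters shown below. This is one widely used formalism given for reference; the paper itself compares several feedback definitions.

```latex
\Delta C_{L} = \beta_{L}\,\Delta C_{a} + \gamma_{L}\,\Delta T, \qquad
\Delta C_{O} = \beta_{O}\,\Delta C_{a} + \gamma_{O}\,\Delta T
```

Here β (carbon per unit atmospheric CO2 change) captures the concentration-carbon response and γ (carbon per unit warming) the climate-carbon response, for land (L) and ocean (O) respectively.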
Collective relaxation dynamics of small-world networks
NASA Astrophysics Data System (ADS)
Grabow, Carsten; Grosskinsky, Stefan; Kurths, Jürgen; Timme, Marc
2015-05-01
Complex networks exhibit a wide range of collective dynamic phenomena, including synchronization, diffusion, relaxation, and coordination processes. Their asymptotic dynamics is generically characterized by the local Jacobian, graph Laplacian, or a similar linear operator. The structure of networks with regular, small-world, and random connectivities are reasonably well understood, but their collective dynamical properties remain largely unknown. Here we present a two-stage mean-field theory to derive analytic expressions for network spectra. A single formula covers the spectrum from regular via small-world to strongly randomized topologies in Watts-Strogatz networks, explaining the simultaneous dependencies on network size N , average degree k , and topological randomness q . We present simplified analytic predictions for the second-largest and smallest eigenvalue, and numerical checks confirm our theoretical predictions for zero, small, and moderate topological randomness q , including the entire small-world regime. For large q of the order of one, we apply standard random matrix theory, thereby overarching the full range from regular to randomized network topologies. These results may contribute to our analytic and mechanistic understanding of collective relaxation phenomena of network dynamical systems.
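A numerical check of the kind of spectral predictions described above can be made on a single Watts-Strogatz realization. The sketch below, which assumes the networkx and numpy libraries, computes the graph-Laplacian spectrum whose extreme nonzero eigenvalues bound the relaxation timescales; it is a one-off numerical experiment, not the authors' mean-field calculation.

```python
import networkx as nx
import numpy as np

# One Watts-Strogatz small-world realization: N nodes, average degree k, rewiring probability q
N, k, q = 200, 10, 0.1
G = nx.watts_strogatz_graph(N, k, q)

L = nx.laplacian_matrix(G).toarray().astype(float)
eig = np.sort(np.linalg.eigvalsh(L))        # real and non-negative for an undirected graph
print("second-smallest eigenvalue (slowest relaxing mode):", eig[1])
print("largest eigenvalue (fastest decaying mode):        ", eig[-1])
```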
Collective relaxation dynamics of small-world networks.
Grabow, Carsten; Grosskinsky, Stefan; Kurths, Jürgen; Timme, Marc
2015-05-01
Complex networks exhibit a wide range of collective dynamic phenomena, including synchronization, diffusion, relaxation, and coordination processes. Their asymptotic dynamics is generically characterized by the local Jacobian, graph Laplacian, or a similar linear operator. The structure of networks with regular, small-world, and random connectivities are reasonably well understood, but their collective dynamical properties remain largely unknown. Here we present a two-stage mean-field theory to derive analytic expressions for network spectra. A single formula covers the spectrum from regular via small-world to strongly randomized topologies in Watts-Strogatz networks, explaining the simultaneous dependencies on network size N, average degree k, and topological randomness q. We present simplified analytic predictions for the second-largest and smallest eigenvalue, and numerical checks confirm our theoretical predictions for zero, small, and moderate topological randomness q, including the entire small-world regime. For large q of the order of one, we apply standard random matrix theory, thereby overarching the full range from regular to randomized network topologies. These results may contribute to our analytic and mechanistic understanding of collective relaxation phenomena of network dynamical systems.
Blasco, Antonio Javier; Barrigas, Inés; González, María Cristina; Escarpa, Alberto
2005-12-01
This paper examines for the first time the analytical possibilities of fast and simultaneous detection of prominent natural antioxidants including examples of flavonoids and vitamins using a CE microchip with electrochemical detection (ED). Unpinched injection conditions, zone electrophoretic separation and amperometric detection were carefully assayed and optimised. Analysis involved the zone electrophoretic separation of arbutin, (+)-catechin and ascorbic acid in less than 4 min using a borate buffer (pH 9.0, 50 mM), employing 2 kV as the separation voltage and +1.0 V as the detection potential. In addition, the separation of different 'couples' of natural antioxidants of food significance including (+)-catechin and ascorbic acid, (+)-catechin and rutin, as well as arbutin and phloridzin is proposed. To demonstrate the potential and future role of CE microsystems, analytical possibilities and a new route in the raw sample analysis are presented. The preliminary results obtained allow the proposal of CE-ED microchips as a real gateway to microanalysis in foods.
NASA Technical Reports Server (NTRS)
Hollis, Brian R.
1995-01-01
A FORTRAN computer code for the reduction and analysis of experimental heat transfer data has been developed. This code can be utilized to determine heat transfer rates from surface temperature measurements made using either thin-film resistance gages or coaxial surface thermocouples. Both an analytical and a numerical finite-volume heat transfer model are implemented in this code. The analytical solution is based on a one-dimensional, semi-infinite wall thickness model with the approximation of constant substrate thermal properties, which is empirically corrected for the effects of variable thermal properties. The finite-volume solution is based on a one-dimensional, implicit discretization. The finite-volume model directly incorporates the effects of variable substrate thermal properties and does not require the semi-infinite wall thickness approximation used in the analytical model. This model also includes the option of a multiple-layer substrate. Fast, accurate results can be obtained using either method. This code has been used to reduce several sets of aerodynamic heating data, of which samples are included in this report.
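The one-dimensional semi-infinite, constant-property solution mentioned above is classically evaluated from a sampled surface-temperature history with the Cook-Felderman summation. The Python sketch below shows only that baseline form; it omits the report's empirical variable-property correction and its finite-volume option.

```python
import numpy as np

def heat_flux_semi_infinite(t, T, rho_c_k):
    """Surface heat flux from a surface-temperature history.

    Classical constant-property, semi-infinite-substrate discretization (Cook-Felderman form).
    t: sample times [s]; T: surface temperatures [K]; rho_c_k: substrate rho*c*k product.
    """
    t, T = np.asarray(t, float), np.asarray(T, float)
    q = np.zeros_like(T)
    coef = 2.0 * np.sqrt(rho_c_k / np.pi)
    for n in range(1, len(t)):
        terms = (T[1:n + 1] - T[:n]) / (np.sqrt(t[n] - t[1:n + 1]) + np.sqrt(t[n] - t[:n]))
        q[n] = coef * terms.sum()
    return q
```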
2015-01-01
Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M
2008-11-07
Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit.
Insight solutions are correct more often than analytic solutions
Salvi, Carola; Bricolo, Emanuela; Kounios, John; Bowden, Edward; Beeman, Mark
2016-01-01
How accurate are insights compared to analytical solutions? In four experiments, we investigated how participants’ solving strategies influenced their solution accuracies across different types of problems, including one that was linguistic, one that was visual and two that were mixed visual-linguistic. In each experiment, participants’ self-judged insight solutions were, on average, more accurate than their analytic ones. We hypothesised that insight solutions have superior accuracy because they emerge into consciousness in an all-or-nothing fashion when the unconscious solving process is complete, whereas analytic solutions can be guesses based on conscious, prematurely terminated, processing. This hypothesis is supported by the finding that participants’ analytic solutions included relatively more incorrect responses (i.e., errors of commission) than timeouts (i.e., errors of omission) compared to their insight responses. PMID:27667960
Rapid detection of nicotine from breath using desorption ionisation on porous silicon.
Guinan, T M; Abdelmaksoud, H; Voelcker, N H
2017-05-04
Desorption ionisation on porous silicon (DIOS) was used for the detection of nicotine from exhaled breath. This result represents proof-of-principle of the ability of DIOS to detect small molecular analytes in breath including biomarkers and illicit drugs.
CTEPP NC DATA SUPPLEMENTAL INFORMATION ON FIELD AND LABORATORY SAMPLES
This data set contains supplemental data related to the final core analytical results table. This includes sample collection data for example sample weight, air volume, creatinine, specific gravity etc.
The Children’s Total Exposure to Persistent Pesticides and Other Persistent...
40 CFR 141.24 - Organic chemicals, sampling and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... available from U.S. Environmental Protection Agency, National Exposure Research Laboratory (NERL)-Cincinnati... for which the laboratory desires certification. (B) Achieve the quantitative acceptance limits under... contaminants included in the PE sample. (C) Achieve quantitative results on the analyses performed under...
40 CFR 141.24 - Organic chemicals, sampling and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... available from U.S. Environmental Protection Agency, National Exposure Research Laboratory (NERL)-Cincinnati... for which the laboratory desires certification. (B) Achieve the quantitative acceptance limits under... contaminants included in the PE sample. (C) Achieve quantitative results on the analyses performed under...
40 CFR 141.24 - Organic chemicals, sampling and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... available from U.S. Environmental Protection Agency, National Exposure Research Laboratory (NERL)-Cincinnati... for which the laboratory desires certification. (B) Achieve the quantitative acceptance limits under... contaminants included in the PE sample. (C) Achieve quantitative results on the analyses performed under...
40 CFR 141.24 - Organic chemicals, sampling and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... available from U.S. Environmental Protection Agency, National Exposure Research Laboratory (NERL)-Cincinnati... for which the laboratory desires certification. (B) Achieve the quantitative acceptance limits under... contaminants included in the PE sample. (C) Achieve quantitative results on the analyses performed under...
40 CFR 141.24 - Organic chemicals, sampling and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... available from U.S. Environmental Protection Agency, National Exposure Research Laboratory (NERL)-Cincinnati... for which the laboratory desires certification. (B) Achieve the quantitative acceptance limits under... contaminants included in the PE sample. (C) Achieve quantitative results on the analyses performed under...
Wu, Zheyang; Zhao, Hongyu
2012-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.
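A minimal version of the single-marker power calculation underlying the framework above uses the noncentral chi-square distribution of a 1-df score test with a Bonferroni-adjusted threshold. The function below, which assumes scipy, is a generic sketch with the noncentrality parameter left as an input; it does not reproduce the article's multi-marker and model-search derivations.

```python
from scipy.stats import chi2, ncx2

def single_marker_power(ncp, n_tests, alpha=0.05):
    """Power of a 1-df score test at a Bonferroni-corrected significance level.

    ncp: noncentrality parameter (grows with sample size and marker effect size).
    """
    crit = chi2.isf(alpha / n_tests, 1)   # per-test threshold after Bonferroni correction
    return ncx2.sf(crit, 1, ncp)          # survival function of the noncentral chi-square

print(single_marker_power(ncp=30.0, n_tests=500_000))
```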
Wu, Zheyang; Zhao, Hongyu
2013-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pool, K.H.; Evans, J.C.; Olsen, K.B.
1997-08-01
This report presents the results from analyses of samples taken from the headspace of waste storage tank 241-S-102 (Tank S-102) at the Hanford Site in Washington State. Tank headspace samples collected by SGN Eurisys Service Corporation (SESC) were analyzed by Pacific Northwest National Laboratory (PNNL) to determine headspace concentrations of selected non-radioactive analytes. Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Vapor concentrations from sorbent trap samples are based on measured sample volumes provided by SESC. Ammonia was determined to be above the immediate notification limit of 150 ppm as specified by the sampling and analysis plan (SAP). Hydrogen was the principal flammable constituent of the Tank S-102 headspace, determined to be present at approximately 2.410% of its lower flammability limit (LFL). Total headspace flammability was estimated to be <2.973% of the LFL. Average measured concentrations of targeted gases, inorganic vapors, and selected organic vapors are provided in Table S.1. A summary of experimental methods, including sampling methodology, analytical procedures, and quality assurance and control methods, is presented in Section 2.0. Detailed descriptions of the analytical results are provided in Section 3.0.
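When a mixture flammability figure like the one above is needed, the customary approach is the Le Chatelier mixing rule, which sums each constituent's concentration as a fraction of that constituent's own LFL. The sketch below illustrates the rule with hypothetical values and is not necessarily the procedure used in the report.

```python
def percent_of_lfl(concentrations):
    """concentrations: list of (vol% in headspace, LFL in vol%) pairs; returns % of mixture LFL."""
    return 100.0 * sum(c / lfl for c, lfl in concentrations)

# Hypothetical check: hydrogen near 0.096 vol% against an LFL of roughly 4 vol%
# contributes about 2.4% of the LFL, consistent in magnitude with the value quoted above.
print(percent_of_lfl([(0.096, 4.0)]))
```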
Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C
2017-05-01
An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables were studied and optimized on the analytical response of each OPP. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube and 0.5 mL sodium citrate 0.1 M, pH 4; 0.08 mL Al2(SO4)3 0.1 M; and 0.7 mL SDS 0.1 M were added and homogenized. After centrifugation the supernatant was discarded. A 700 μL aliquot of the coacervate-rich phase obtained was dissolved with 300 μL of methanol and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied for the analysis of five OPPs in water samples for human consumption of different locations of Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Field comparison of analytical results from discrete-depth ground water samplers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zemo, D.A.; Delfino, T.A.; Gallinatti, J.D.
1995-07-01
Discrete-depth ground water samplers are used during environmental screening investigations to collect ground water samples in lieu of installing and sampling monitoring wells. Two of the most commonly used samplers are the BAT Enviroprobe and the QED HydroPunch I, which rely on differing sample collection mechanics. Although these devices have been on the market for several years, it was unknown what, if any, effect the differences would have on analytical results for ground water samples containing low to moderate concentrations of chlorinated volatile organic compounds (VOCs). This study investigated whether the discrete-depth ground water sampler used introduces statistically significant differences in analytical results. The goal was to provide a technical basis for allowing the two devices to be used interchangeably during screening investigations. Because this study was based on field samples, it included several sources of potential variability. It was necessary to separate differences due to sampler type from variability due to sampling location, sample handling, and laboratory analytical error. To statistically evaluate these sources of variability, the experiment was arranged in a nested design. Sixteen ground water samples were collected from eight random locations within a 15-foot by 15-foot grid. The grid was located in an area where shallow ground water was believed to be uniformly affected by VOCs. The data were evaluated using analysis of variance.
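Separating a sampler-type effect from location-to-location variability, as described above, can be illustrated with a simple two-factor analysis of variance. The sketch below, which assumes the pandas and statsmodels libraries and uses fabricated data, is a simplified stand-in for the study's nested design rather than its actual analysis.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Fabricated example: log VOC concentration by sampler type and sampling location
df = pd.DataFrame({
    "sampler":  ["BAT", "HydroPunch"] * 4,
    "location": ["1", "1", "2", "2", "3", "3", "4", "4"],
    "log_conc": [1.10, 1.05, 0.82, 0.88, 1.40, 1.37, 0.95, 1.02],
})
model = ols("log_conc ~ C(sampler) + C(location)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # is the sampler effect significant beyond location variation?
```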
Recent applications of carbon-based nanomaterials in analytical chemistry: critical review.
Scida, Karen; Stege, Patricia W; Haby, Gabrielle; Messina, Germán A; García, Carlos D
2011-04-08
The objective of this review is to provide a broad overview of the advantages and limitations of carbon-based nanomaterials with respect to analytical chemistry. Aiming to illustrate the impact of nanomaterials on the development of novel analytical applications, developments reported in the 2005-2010 period have been included and divided into sample preparation, separation, and detection. Within each section, fullerenes, carbon nanotubes, graphene, and composite materials will be addressed specifically. Although only briefly discussed, included is a section highlighting nanomaterials with interesting catalytic properties that can be used in the design of future devices for analytical chemistry. Copyright © 2011 Elsevier B.V. All rights reserved.
Recent Applications of Carbon-Based Nanomaterials in Analytical Chemistry: Critical Review
Scida, Karen; Stege, Patricia W.; Haby, Gabrielle; Messina, Germán A.; García, Carlos D.
2011-01-01
The objective of this review is to provide a broad overview of the advantages and limitations of carbon-based nanomaterials with respect to analytical chemistry. Aiming to illustrate the impact of nanomaterials on the development of novel analytical applications, developments reported in the 2005–2010 period have been included and divided into sample preparation, separation, and detection. Within each section, fullerenes, carbon nanotubes, graphene, and composite materials will be addressed specifically. Although only briefly discussed, included is a section highlighting nanomaterials with interesting catalytic properties that can be used in the design of future devices for analytical chemistry. PMID:21458626
Algorithms and software for U-Pb geochronology by LA-ICPMS
NASA Astrophysics Data System (ADS)
McLean, Noah M.; Bowring, James F.; Gehrels, George
2016-07-01
The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with 10s of micrometer-scale spatial resolution. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
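For orientation, the core age equation that any U-Pb data-reduction package ultimately evaluates is the textbook relation t = ln(1 + 206Pb*/238U)/λ238. The sketch below shows that relation only; it is not ET_Redux's internal implementation.

```python
# Worked example of the basic 206Pb/238U age equation underlying LA-ICPMS
# U-Pb data reduction (textbook relation, not ET_Redux's internal code).
import math

LAMBDA_238U = 1.55125e-10   # decay constant of 238U, 1/yr (Jaffey et al. value)

def pb206_u238_age(ratio_206_238):
    """Date in Ma from a radiogenic 206Pb*/238U ratio."""
    return math.log(1.0 + ratio_206_238) / LAMBDA_238U / 1e6

print(f"{pb206_u238_age(0.016):.1f} Ma")  # ~102 Ma for a ratio of 0.016
```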
Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.
Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S
2016-04-07
Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
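A minimal sketch of the recommended approach, weighted least squares with a power-model variance function, is given below; the calibration data and variance parameters are illustrative assumptions.

```python
# Minimal sketch of a weighted calibration using the power model of variance
# (var ~ a * x**b) recommended above. Data and parameter values are illustrative.
import numpy as np

x = np.array([1, 2, 5, 10, 20, 50.0])           # standard concentrations
y = np.array([0.9, 2.1, 5.2, 9.7, 20.6, 49.1])  # instrument responses

# In practice the variance function is estimated from replicate standards;
# here var(y) = a * x**b with a = 0.01, b = 1.5 is simply assumed.
a, b = 0.01, 1.5
weights = 1.0 / (a * x**b)

# Weighted least squares for y = m*x + c
W = np.diag(weights)
X = np.column_stack([x, np.ones_like(x)])
m, c = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(f"weighted fit: slope={m:.4f}, intercept={c:.4f}")
```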
Contains basic information on the role and origins of the Selected Analytical Methods including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events
Najat, Dereen
2017-01-01
Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The rate of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani hospitals focus only on the analytical phase, and none of the pre-analytical errors were recorded. Interestingly, none of the labs were internationally accredited; therefore, corrective actions are needed at these hospitals to ensure better health outcomes. Internal and External Quality Assessment Schemes (EQAS) for the pre-analytical phase at Sulaimani clinical laboratories should be implemented at public hospitals. Furthermore, lab personnel, particularly phlebotomists, need continuous training on the importance of sample quality to obtain accurate test results.
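A minimal sketch of the proportional Z test mentioned above, comparing rejection rates between two labs, is shown below; the counts are hypothetical, not the study's data.

```python
# Sketch of the proportional Z test mentioned above, comparing rejection rates
# between two labs; counts are hypothetical, not the study's data.
from statsmodels.stats.proportion import proportions_ztest

rejected = [52, 31]      # rejected samples in lab A and lab B
observed = [550, 550]    # samples observed in each lab

z, p = proportions_ztest(count=rejected, nobs=observed)
print(f"z = {z:.2f}, p = {p:.4f}")
```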
Zacharis, Constantinos K; Vastardi, Elli
2018-02-20
In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g-1 in the sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
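A minimal sketch of the desirability-function step is shown below: each critical method attribute is rescaled to the interval [0, 1] and the individual desirabilities are combined as a geometric mean. The target ranges and predicted responses are illustrative assumptions, not the paper's values.

```python
# Sketch of a Derringer-type desirability calculation for picking optimal
# chromatographic conditions; targets and responses are illustrative only.
import numpy as np

def d_larger_is_better(y, low, high):
    """Desirability for a response to be maximized (e.g., resolution)."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def d_smaller_is_better(y, low, high):
    """Desirability for a response to be minimized (e.g., analysis time)."""
    return np.clip((high - y) / (high - low), 0.0, 1.0)

# Hypothetical predicted critical method attributes at one candidate condition
resolution   = 2.4     # separation of the critical peak pair
run_time_min = 18.0    # total analysis time
plates       = 9500.0  # peak efficiency

D = (d_larger_is_better(resolution, 1.5, 3.0)
     * d_smaller_is_better(run_time_min, 10.0, 25.0)
     * d_larger_is_better(plates, 5000.0, 12000.0)) ** (1.0 / 3.0)
print(f"overall desirability D = {D:.2f}")
```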
Explosive shock damage potential in space structures
NASA Technical Reports Server (NTRS)
Mortimer, R. W.
1972-01-01
The effects of a pulse shape on the transient response of a cylindrical shell are presented. Uniaxial, membrane, and bending theories for isotropic shells were used in this study. In addition to the results of the analytical study, the preliminary results of an experimental study into the generation and measurement of shear waves in a cylindrical shell are included.
The exact solution of the monoenergetic transport equation for critical cylinders
NASA Technical Reports Server (NTRS)
Westfall, R. M.; Metcalf, D. R.
1972-01-01
An analytic solution for the critical, monoenergetic, bare, infinite cylinder is presented. The solution is obtained by modifying a previous development based on a neutron density transform and Case's singular eigenfunction method. Numerical results for critical radii and the neutron density as a function of position are included and compared with the results of other methods.
Spin dynamics in storage rings and linear accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irwin, J.
1994-12-01
The purpose of these lectures is to survey the subject of spin dynamics in accelerators: to give a sense of the underlying physics, the typical analytic and numeric methods used, and an overview of results achieved. Consideration will be limited to electrons and protons. Examples of experimental and theoretical results in both linear and circular machines are included.
Solid state electro-optic color filter and iris
NASA Technical Reports Server (NTRS)
1974-01-01
Test results obtained have confirmed the practicality of the solid state electro-optic filters as an optical control element in a television system. Neutral-density control range in excess of 1000:1 has been obtained on sample filters. Test results, measurements in a complete camera system, discussions of problem areas, analytical comparisons, and recommendations for future investigations are included.
High speed operation of permanent magnet machines
NASA Astrophysics Data System (ADS)
El-Refaie, Ayman M.
This work proposes methods to extend the high-speed operating capabilities of both the interior PM (IPM) and surface PM (SPM) machines. For interior PM machines, this research has developed and presented the first thorough analysis of how a new bi-state magnetic material can be usefully applied to the design of IPM machines. Key elements of this contribution include identifying how the unique properties of the bi-state magnetic material can be applied most effectively in the rotor design of an IPM machine by "unmagnetizing" the magnet cavity center posts rather than the outer bridges. The importance of elevated rotor speed in making the best use of the bi-state magnetic material while recognizing its limitations has been identified. For surface PM machines, this research has provided, for the first time, a clear explanation of how fractional-slot concentrated windings can be applied to SPM machines in order to achieve the necessary conditions for optimal flux weakening. A closed-form analytical procedure for analyzing SPM machines designed with concentrated windings has been developed. Guidelines for designing SPM machines using concentrated windings in order to achieve optimum flux weakening are provided. Analytical and numerical finite element analysis (FEA) results have provided promising evidence of the scalability of the concentrated winding technique with respect to the number of poles, machine aspect ratio, and output power rating. Useful comparisons between the predicted performance characteristics of SPM machines equipped with concentrated windings and both SPM and IPM machines designed with distributed windings are included. Analytical techniques have been used to evaluate the impact of the high pole number on various converter performance metrics. Both analytical techniques and FEA have been used for evaluating the eddy-current losses in the surface magnets due to the stator winding subharmonics. Techniques for reducing these losses have been investigated. A 6 kW, 36-slot/30-pole prototype SPM machine has been designed and built. Experimental measurements have been used to verify the analytical and FEA results. These test results have demonstrated that a wide constant-power speed range can be achieved. Other important machine features such as the near-sinusoidal back-emf, high efficiency, and low cogging torque have also been demonstrated.
Mixing of two co-directional Rayleigh surface waves in a nonlinear elastic material.
Morlock, Merlin B; Kim, Jin-Yeon; Jacobs, Laurence J; Qu, Jianmin
2015-01-01
The mixing of two co-directional, initially monochromatic Rayleigh surface waves in an isotropic, homogeneous, and nonlinear elastic solid is investigated using analytical, finite element method, and experimental approaches. The analytical investigations show that while the horizontal velocity component can form a shock wave, the vertical velocity component can form a pulse independent of the specific ratios of the fundamental frequencies and amplitudes that are mixed. This analytical model is then used to simulate the development of the fundamentals, second harmonics, and the sum and difference frequency components over the propagation distance. The analytical model is further extended to include diffraction effects in the parabolic approximation. Finally, the frequency and amplitude ratios of the fundamentals are identified which provide maximum amplitudes of the second harmonics as well as of the sum and difference frequency components, to help guide effective material characterization; this approach should make it possible to measure the acoustic nonlinearity of a solid not only with the second harmonics, but also with the sum and difference frequency components. Results of the analytical investigations are then confirmed using the finite element method and the experimental feasibility of the proposed technique is validated for an aluminum specimen.
Ultrasonic analyte concentration and application in flow cytometry
Kaduchak, Gregory; Goddard, Greg; Salzman, Gary; Sinha, Dipen; Martin, John C.; Kwiatkowski, Christopher; Graves, Steven
2014-07-22
The present invention includes an apparatus and corresponding method for concentrating analytes within a fluid flowing through a tube using acoustic radiation pressure. The apparatus includes a function generator that outputs a radio frequency electrical signal to a transducer that transforms the radio frequency electric signal to an acoustic signal and couples the acoustic signal to the tube. The acoustic signal is converted within the tube to acoustic pressure that concentrates the analytes within the fluid.
Ultrasonic analyte concentration and application in flow cytometry
Kaduchak, Gregory [Los Alamos, NM; Goddard, Greg [Los Alamos, NM; Salzman, Gary [White Rock, NM; Sinha, Dipen [Los Alamos, NM; Martin, John C [Los Alamos, NM; Kwiatkowski, Christopher [Los Alamos, NM; Graves, Steven [San Juan Pueblo, NM
2008-03-11
The present invention includes an apparatus and corresponding method for concentrating analytes within a fluid flowing through a tube using acoustic radiation pressure. The apparatus includes a function generator that outputs a radio frequency electrical signal to a transducer that transforms the radio frequency electric signal to an acoustic signal and couples the acoustic signal to the tube. The acoustic signal is converted within the tube to acoustic pressure that concentrates the analytes within the fluid.
Ultrasonic analyte concentration and application in flow cytometry
Kaduchak, Gregory; Goddard, Greg; Salzman, Gary; Sinha, Dipen; Martin, John C.; Kwiatkowski, Christopher; Graves, Steven
2015-07-07
The present invention includes an apparatus and corresponding method for concentrating analytes within a fluid flowing through a tube using acoustic radiation pressure. The apparatus includes a function generator that outputs a radio frequency electrical signal to a transducer that transforms the radio frequency electric signal to an acoustic signal and couples the acoustic signal to the tube. The acoustic signal is converted within the tube to acoustic pressure that concentrates the analytes within the fluid.
Analytical Study on Flight Performance of a RP Laser Launcher
NASA Astrophysics Data System (ADS)
Katsurayama, H.; Ushio, M.; Komurasaki, K.; Arakawa, Y.
2005-04-01
An air-breathing RP Laser Launcher has been proposed as an alternative to conventional chemical launch systems. This paper analytically examines the feasibility of an SSTO system powered by RP lasers. The trajectory from the ground to the geosynchronous orbit is computed and the launch cost, including laser-base development, is estimated. The engine performance is evaluated by CFD computations and a cycle analysis. The results show that a beam power of 2.3 MW per unit initial vehicle mass is optimum to reach a geosynchronous transfer orbit, and 3,000 launches are necessary to redeem the cost of the laser transmitter.
Laboratory Methods for the Measurement of Pollutants in Water and Waste Effluents
NASA Technical Reports Server (NTRS)
Ballinger, Dwight G.
1971-01-01
The need for accurate, precise, and rapid analytical procedures for the examination of water and waste samples requires the use of a variety of instruments. The instrumentation in water laboratories includes atomic absorption, UV-visible, and infrared spectrophotometers, automatic colorimetric analyzers, gas chromatographs and mass spectrometers. Because of the emphasis on regulatory action, attention is being directed toward quality control of analytical results. Among the challenging problems are the differentiation of metallic species in water at nanogram concentrations, rapid measurement of free cyanide and free ammonia, more sensitive methods for arsenic and selenium and improved characterization of organic contaminants.
Small-World Network Spectra in Mean-Field Theory
NASA Astrophysics Data System (ADS)
Grabow, Carsten; Grosskinsky, Stefan; Timme, Marc
2012-05-01
Collective dynamics on small-world networks emerge in a broad range of systems with their spectra characterizing fundamental asymptotic features. Here we derive analytic mean-field predictions for the spectra of small-world models that systematically interpolate between regular and random topologies by varying their randomness. These theoretical predictions agree well with the actual spectra (obtained by numerical diagonalization) for undirected and directed networks and from fully regular to strongly random topologies. These results may provide analytical insights to empirically found features of dynamics on small-world networks from various research fields, including biology, physics, engineering, and social science.
An Improved Analytic Model for Microdosimeter Response
NASA Technical Reports Server (NTRS)
Shinn, Judy L.; Wilson, John W.; Xapsos, Michael A.
2001-01-01
An analytic model used to predict energy deposition fluctuations in a microvolume by ions through direct events is improved to include indirect delta ray events. The new model can now account for the increase in flux at low lineal energy when the ions are of very high energy. Good agreement is obtained between the calculated results and available data for laboratory ion beams. Comparison of GCR (galactic cosmic ray) flux between Shuttle TEPC (tissue equivalent proportional counter) flight data and current calculations draws a different assessment of developmental work required for the GCR transport code (HZETRN) than previously concluded.
Analytical studies of the Space Shuttle orbiter nose-gear tire
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Tanner, John A.; Peters, Jeanne M.; Robinson, Martha P.
1991-01-01
A computational procedure is presented for evaluating the analytic sensitivity derivatives of the tire response with respect to material and geometrical properties of the tire. The tire is modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The computational procedure is applied to the case of the Space Shuttle orbiter nose-gear tire subjected to uniform inflation pressure. Numerical results are presented which show the sensitivity of the different tire response quantities to variations in the material characteristics of both the cord and rubber.
Holographic Lifshitz superconductors: Analytic solution
NASA Astrophysics Data System (ADS)
Natsuume, Makoto; Okamura, Takashi
2018-03-01
We construct an analytic solution for a one-parameter family of holographic superconductors in asymptotically Lifshitz spacetimes. We utilize this solution to explore various properties of the systems such as (1) the superfluid phase background and the grand canonical potential, (2) the order parameter response function or the susceptibility, (3) the London equation, and (4) the background with a superfluid flow or a magnetic field. From these results, we identify the dual Ginzburg-Landau theory including numerical coefficients. Also, the dynamic critical exponent z_D associated with the critical point is given by z_D = 2 irrespective of the value of the Lifshitz exponent z.
Comparative study of solar optics for paraboloidal concentrators
NASA Technical Reports Server (NTRS)
Wen, L.; Poon, P.; Carley, W.; Huang, L.
1979-01-01
Different analytical methods for computing the flux distribution on the focal plane of a paraboloidal solar concentrator are reviewed. An analytical solution in algebraic form is also derived for an idealized model. The effects resulting from using different assumptions in the definition of optical parameters used in these methodologies are compared and discussed in detail. These parameters include solar irradiance distribution (limb darkening and circumsolar), reflector surface specular spreading, surface slope error, and concentrator pointing inaccuracy. The type of computational method selected for use depends on the maturity of the design and the data available at the time the analysis is made.
[The taking and transport of biological samples].
Kerwat, Klaus; Kerwat, Martina; Eberhart, Leopold; Wulf, Hinnerk
2011-05-01
The results of microbiological tests are the foundation for a targeted therapy and the basis for monitoring infections. The quality of each and every laboratory finding depends not only on an error-free analytical process; the pre-analytical handling procedures are of particular importance as well. They encompass all factors and influences prior to the actual analysis. These include the correct timepoint for sample taking, the packaging and the rapid transport of the material to be investigated. Errors in the pre-analytical processing are the most frequent reasons for inappropriate findings. © Georg Thieme Verlag Stuttgart · New York.
NASA Astrophysics Data System (ADS)
Burin, Alexander L.
2015-03-01
Many-body localization in a disordered system of interacting spins coupled by the long-range interaction 1/R^α is investigated by combining an analytical theory considering resonant interactions and a finite-size scaling of exact numerical solutions with the number of spins N. The numerical results for a one-dimensional system are consistent with the general expectations of the analytical theory for a d-dimensional system, including the absence of localization in the infinite system at α < 2d and a universal scaling of the critical energy disordering W_c ∝ N^{2-α/d}.
KC-135 winglet program overview
NASA Technical Reports Server (NTRS)
Barber, M. R.; Selegan, D.
1982-01-01
A joint NASA/USAF program was conducted to accomplish the following objectives: (1) evaluate the benefits that could be achieved from the application of winglets to KC-135 aircraft; and (2) determine the ability of wind tunnel tests and analytical analysis to predict winglet characteristics. The program included wind-tunnel development of a test winglet configuration; analytical predictions of the changes to the aircraft resulting from the application of the test winglet; and finally, flight tests of the developed configuration. Pressure distribution, loads, stability and control, buffet, fuel mileage, and flutter data were obtained to fulfill the objectives of the program.
Analytical fitting model for rough-surface BRDF.
Renhorn, Ingmar G E; Boreman, Glenn D
2008-08-18
A physics-based model is developed for rough surface BRDF, taking into account angles of incidence and scattering, effective index, surface autocovariance, and correlation length. Shadowing is introduced on surface correlation length and reflectance. Separate terms are included for surface scatter, bulk scatter and retroreflection. Using the FindFit function in Mathematica, the functional form is fitted to BRDF measurements over a wide range of incident angles. The model has fourteen fitting parameters; once these are fixed, the model accurately describes scattering data over two orders of magnitude in BRDF without further adjustment. The resulting analytical model is convenient for numerical computations.
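As an illustration of the fitting step (the Python analogue of Mathematica's FindFit), the sketch below fits a deliberately simplified two-lobe BRDF form by nonlinear least squares; it is not the paper's fourteen-parameter model.

```python
# Minimal sketch of fitting an analytic BRDF form to measured data with a
# nonlinear least-squares routine. The two-lobe model below is a simplification
# for illustration, not the paper's 14-parameter rough-surface model.
import numpy as np
from scipy.optimize import curve_fit

def brdf_model(theta_s, kd, ks, m):
    """Diffuse term plus a specular lobe of angular width m (radians)."""
    return kd * np.cos(theta_s) + ks * np.exp(-(theta_s**2) / (2.0 * m**2))

theta = np.radians(np.arange(0, 80, 5))
measured = brdf_model(theta, 0.05, 1.2, 0.12) * (1 + 0.05 * np.random.randn(theta.size))

popt, pcov = curve_fit(brdf_model, theta, measured, p0=[0.1, 1.0, 0.1])
print("fitted kd, ks, m:", popt)
```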
Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V
2014-01-01
The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte by the determination of a solution containing the analyte and a standard. Therefore, the standard must meet the requirements of chemical compatibility and lack of resonance interference with the analyte as well as a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard with a known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating the matter of chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of NMR tubes, must be kept the same. Any deviations will compromise the accuracy of the results. An innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance must only be chemically compatible and be free of resonance-interference with the analyte or external standard whereas the external standard must only be of a known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method reduces the burden of searching for an appropriate standard for each analyte significantly. Therefore the efficiency of the qNMR purity assay increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
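A sketch of the ratio-of-ratios arithmetic implied by the abstract is given below, assuming the classical qNMR proportionality between signal integral, proton count, molar mass, and weighed mass; the symbols and bookkeeping are assumptions, not the paper's equations.

```python
# Sketch of the ratio-of-ratios arithmetic implied above: the internal reference
# signal in each tube normalizes away tube/instrument factors, and the external
# standard of known purity anchors the scale. Based on the classical qNMR
# relation; symbols and bookkeeping are assumptions, not the paper's equations.
def qnmr_purity(I_a, I_ref_a, n_a, M_a, m_a,
                I_s, I_ref_s, n_s, M_s, m_s, P_s):
    """I = integral, n = protons in the integrated signal, M = molar mass
    (g/mol), m = weighed mass (mg), P = purity (mass fraction)."""
    signal_ratio = (I_a / I_ref_a) / (I_s / I_ref_s)
    return signal_ratio * (n_s / n_a) * (M_a / M_s) * (m_s / m_a) * P_s

# Hypothetical numbers for an analyte measured against a benzoic-acid-like standard
p = qnmr_purity(I_a=1.550, I_ref_a=1.000, n_a=2, M_a=204.2, m_a=12.0,
                I_s=0.820, I_ref_s=1.000, n_s=1, M_s=122.1, m_s=7.5, P_s=0.9998)
print(f"estimated purity = {p:.3f}")
```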
Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S
2013-01-01
Aim To develop analytical models and analyse the stress distribution and flexibility of nickel–titanium (NiTi) instruments subjected to bending forces. Methodology The analytical method was used to analyse the behaviours of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived using Euler–Bernoulli nonlinear differential equations that took into account the screw pitch variation of these NiTi instruments. In addition, a nonlinear deformation analysis based on the analytical model and a nonlinear finite element analysis were carried out; the numerical results were obtained using the finite element method. Results According to the analytical results, the maximum curvature of the instrument occurs near the instrument tip. Results of the finite element analysis revealed that the position of maximum von Mises stress was near the instrument tip. Therefore, the proposed analytical model can be used to predict the position of maximum curvature in the instrument where fracture may occur. Finally, the results of the analytical and numerical models were compatible. Conclusion The proposed analytical model was validated by numerical results in analysing the bending deformation of NiTi instruments. The analytical model is useful in the design and analysis of instruments. The proposed theoretical model is effective in studying the flexibility of NiTi instruments. Compared with the finite element method, the analytical model can deal conveniently and effectively with the subject of bending behaviour of rotary NiTi endodontic instruments. PMID:23173762
Method and apparatus for optimized sampling of volatilizable target substances
Lindgren, Eric R.; Phelan, James M.
2004-10-12
An apparatus for capturing, from gases such as soil gas, target analytes. Target analytes may include emanations from explosive materials or from residues of explosive materials. The apparatus employs principles of sorption common to solid phase microextraction, and is best used in conjunction with analysis means such as a gas chromatograph. To sorb target analytes, the apparatus uses various sorptive structures to capture the target analytes. Depending upon the embodiment, those structures may include a capillary tube including an interior surface on which sorptive material (similar to that on the surface of a SPME fiber) is supported (along with means for moving gases through the capillary tube so that the gases come into close proximity to the sorptive material). In one disclosed embodiment, at least one such sorptive structure is associated with an enclosure including an opening in communication with the surface of a soil region potentially contaminated with buried explosive material such as unexploded ordnance. Emanations from explosive materials can pass into and accumulate in the enclosure where they are sorbed by the sorptive structures. Also disclosed is the use of heating means such as microwave horns to drive target analytes into the soil gas from solid and liquid phase components of the soil.
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
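One widely used test/analysis correlation metric that such a report could tabulate is the modal assurance criterion (MAC) between NASTRAN mode shapes and measured mode shapes; the sketch below computes a MAC matrix on dummy data and is illustrative only, since the report's own statistical correlation measure may differ.

```python
# Minimal sketch of the modal assurance criterion (MAC) between analytical and
# test mode shapes. Illustrative only; the report's statistical correlation
# measure may be defined differently.
import numpy as np

def mac(phi_a, phi_t):
    """MAC matrix between analysis modes (columns of phi_a) and test modes."""
    num = np.abs(phi_a.T @ phi_t) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0), np.sum(phi_t * phi_t, axis=0))
    return num / den

phi_analysis = np.random.rand(20, 3)                       # 20 DOFs, 3 modes (dummy)
phi_test     = phi_analysis + 0.05 * np.random.rand(20, 3) # "measured" modes
print(np.round(mac(phi_analysis, phi_test), 3))
```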
Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D
2017-11-01
Amniotic fluid is the substantial factor in the development of an embryo and fetus due to the fact that water and solutes contained in it penetrate the fetal membranes in a hydrostatic and osmotic way as well as being swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; therefore, analysis of amniotic fluid is important because the results would indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for determination of trace and ultra-trace level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace and ultra-trace level analysis, detailed characteristics of the analytical procedure as well as properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was involved in the experiment to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness, expressed as recovery, ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The obtained expanded uncertainty (U) results for CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. The standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty than the trueness factor. Copyright © 2017 Elsevier B.V. All rights reserved.
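A sketch of the uncertainty bookkeeping described above is given below: the precision and trueness components are combined in quadrature and expanded with a coverage factor of k = 2. The component values are illustrative, not the paper's.

```python
# Sketch of a single-laboratory validation uncertainty budget: precision and
# trueness components combined in quadrature and expanded with k = 2.
# The numbers are illustrative, not values from the paper.
import math

u_precision = 0.035   # relative standard uncertainty from intermediate precision
u_trueness  = 0.020   # relative standard uncertainty of the recovery/bias term

u_combined = math.sqrt(u_precision**2 + u_trueness**2)
U_expanded = 2.0 * u_combined          # coverage factor k = 2 (~95% confidence)
print(f"U = {100 * U_expanded:.1f}% of the measured concentration")
```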
Effects of space environment on composites: An analytical study of critical experimental parameters
NASA Technical Reports Server (NTRS)
Gupta, A.; Carroll, W. F.; Moacanin, J.
1979-01-01
A generalized methodology currently employed at JPL was used to develop an analytical model for the effects of high-energy electrons and for interactions between electron and ultraviolet effects. Chemical kinetic concepts were applied in defining quantifiable parameters; the need for determining short-lived transient species and their concentrations was demonstrated. The results demonstrate a systematic and cost-effective means of addressing the issues and show qualitative and quantitative, applicable relationships between space radiation and simulation parameters. An equally important result is the identification of critical initial experiments necessary to further clarify the relationships. Topics discussed include facility and test design; rastered vs. diffuse continuous e-beam; valid acceleration level; simultaneous vs. sequential exposure to different types of radiation; and interruption of test continuity.
An overview of the crash dynamics failure behavior of metal and composite aircraft structures
NASA Technical Reports Server (NTRS)
Carden, Huey D.; Boitnott, Richard L.; Fasanella, Edwin L.; Jones, Lisa E.
1991-01-01
An overview of failure behavior results is presented from some of the crash dynamics research conducted with concepts of aircraft elements and substructure not necessarily designed or optimized for energy absorption or crash loading considerations. Experimental and analytical data are presented that indicate some general trends in the failure behavior of a class of composite structures that includes fuselage panels, individual fuselage sections, fuselage frames, skeleton subfloors with stringers and floor beams without skin covering, and subfloors with skin added to the frame stringer structure. Although the behavior is complex, a strong similarity in the static/dynamic failure behavior among these structures is illustrated through photographs of the experimental results and through analytical data of generic composite structural models.
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2015-01-01
Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology demonstration via flight-testing. Hypersonic Inflatable Aerodynamic Decelerator (HIAD) architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. This publication summarizes results comparing analytical results with test data for two concepts subjected to representative entry, static loading. The level of agreement and ability to predict the load distribution is considered sufficient to enable analytical predictions to be used in the design process.
Interacting steps with finite-range interactions: Analytical approximation and numerical results
NASA Astrophysics Data System (ADS)
Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.
2013-05-01
We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.
IMPROVED METHOD FOR THE STORAGE OF GROUND WATER SAMPLES CONTAINING VOLATILE ORGANIC ANALYTES
The sorption of volatile organic analytes from water samples by the Teflon septum surface used with standard glass 40-ml sample collection vials was investigated. Analytes tested included alkanes, isoalkanes, olefins, cycloalkanes, a cycloalkene, monoaromatics, a polynuclear arom...
Selvi, Emine Kılıçkaya; Şahin, Uğur; Şahan, Serkan
2017-01-01
This method was developed for the determination of trace amounts of aluminum(III) in dialysis concentrates using atomic absorption spectrometry after coprecipitation with lanthanum phosphate. The analytical parameters that influence the quantitative coprecipitation of the analyte, including the amount of lanthanum, the amount of phosphate, pH, and duration time, were optimized. Recoveries of the analyte ion were in the range of 95-105%, with a limit of detection (3s) of 0.5 µg L-1. The preconcentration factor was found to be 1000, and the relative standard deviation (RSD) obtained from model solutions was 2.5% at 0.02 mg L-1. The accuracy of the method was evaluated with a standard reference material (CWW-TMD Waste Water). The method was also applied to the most concentrated acidic and basic dialysis concentrates with satisfactory results.
Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results
NASA Astrophysics Data System (ADS)
GonzáLez, Diego Luis; Jaramillo, Diego Felipe; TéLlez, Gabriel; Einstein, T. L.
2013-03-01
While most Monte Carlo simulations assume only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest neighbor (NNN) interactions and, more generally, interactions out to the q'th nearest neighbor alter the form of the terrace-width distribution and of pair correlation functions (i.e., the sum over n'th-neighbor distribution functions), which we investigated recently [2]. For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC Equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, K.A.; Mitchell, M.M.; Jean, D.
1997-09-01
This report contains the Appendices A-L including Voluntary Corrective Measure Plans, Waste Management Plans, Task-Specific Health and Safety Plan, Analytical Laboratory Procedures, Soil Sample Results, In-Situ Gamma Spectroscopy Results, Radionuclide Activity Summary, TCLP Soil Sample Results, Waste Characterization Memoranda, Waste Drum Inventory Data, Radiological Risk Assessment, and Summary of Site-Specific Recommendations.
Effective diffusion coefficient including the Marangoni effect
NASA Astrophysics Data System (ADS)
Kitahata, Hiroyuki; Yoshinaga, Natsuhiko
2018-04-01
Surface-active molecules supplied from a particle fixed at the water surface create a spatial gradient of the molecule concentration, resulting in Marangoni convection. Convective flow transports the molecules far from the particle, enhancing diffusion. We analytically derive the effective diffusion coefficient associated with the Marangoni convection rolls. The resulting estimated effective diffusion coefficient is consistent with our numerical results and the apparent diffusion coefficient measured in experiments.
ERIC Educational Resources Information Center
Behn, Robert D.; Vaupel, James W.
1976-01-01
Description of the philosophy and general nature of a course at Drake University that emphasizes basic concepts of analytical thinking, including think, decompose, simplify, specify, and rethink problems. Some sample homework exercises are included. The journal is available from University of California Press, Berkeley, California 94720.…
NASA Astrophysics Data System (ADS)
Raksharam; Dutta, Aloke K.
2017-04-01
In this paper, a unified analytical model for the drain current of a symmetric Double-Gate Junctionless Field-Effect Transistor (DG-JLFET) is presented. The operation of the device has been classified into four modes: subthreshold, semi-depleted, accumulation, and hybrid; with the main focus of this work being on the accumulation mode, which has not been dealt with in detail so far in the literature. A physics-based model, using a simplified one-dimensional approach, has been developed for this mode, and it has been successfully integrated with the model for the hybrid mode. It also includes the effect of carrier mobility degradation due to the transverse electric field, which was hitherto missing in the earlier models reported in the literature. The piece-wise models have been unified using suitable interpolation functions. In addition, the model includes two most important short-channel effects pertaining to DG-JLFETs, namely the Drain Induced Barrier Lowering (DIBL) and the Subthreshold Swing (SS) degradation. The model is completely analytical, and is thus computationally highly efficient. The results of our model have shown an excellent match with those obtained from TCAD simulations for both long- and short-channel devices, as well as with the experimental data reported in the literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganapol, B.D.; Kornreich, D.E.
Because of the requirement of accountability and quality control in the scientific world, a demand for high-quality analytical benchmark calculations has arisen in the neutron transport community. The intent of these benchmarks is to provide a numerical standard to which production neutron transport codes may be compared in order to verify proper operation. The overall investigation as modified in the second year renewal application includes the following three primary tasks. Task 1 on two-dimensional neutron transport is divided into (a) single medium searchlight problem (SLP) and (b) two-adjacent half-space SLP. Task 2 on three-dimensional neutron transport covers (a) point source in arbitrary geometry, (b) single medium SLP, and (c) two-adjacent half-space SLP. Task 3 on code verification includes deterministic and probabilistic codes. The primary aim of the proposed investigation was to provide a suite of comprehensive two- and three-dimensional analytical benchmarks for neutron transport theory applications. This objective has been achieved. The suite of benchmarks in infinite media and the three-dimensional SLP are a relatively comprehensive set of one-group benchmarks for isotropically scattering media. Because of time and resource limitations, the extensions of the benchmarks to include multi-group and anisotropic scattering are not included here. Presently, however, enormous advances in the solution for the planar Green's function in an anisotropically scattering medium have been made and will eventually be implemented in the two- and three-dimensional solutions considered under this grant. Of particular note in this work are the numerical results for the three-dimensional SLP, which have never before been presented. The results presented were made possible only because of the tremendous advances in computing power that have occurred during the past decade.
Association of Biotin Ingestion With Performance of Hormone and Nonhormone Assays in Healthy Adults.
Li, Danni; Radulescu, Angela; Shrestha, Rupendra T; Root, Matthew; Karger, Amy B; Killeen, Anthony A; Hodges, James S; Fan, Shu-Ling; Ferguson, Angela; Garg, Uttam; Sokoll, Lori J; Burmeister, Lynn A
2017-09-26
Biotinylated antibodies and analogues, with their strong binding to streptavidin, are used in many clinical laboratory tests. Excess biotin in blood due to supplemental biotin ingestion may affect biotin-streptavidin binding, leading to potential clinical misinterpretation. However, the degree of interference remains undefined in healthy adults. To assess performance of specific biotinylated immunoassays after 7 days of ingesting 10 mg/d of biotin, a dose common in over-the-counter supplements for healthy adults. Nonrandomized crossover trial involving 6 healthy adults who were treated at an academic medical center research laboratory. Administration of 10 mg/d of biotin supplementation for 7 days. Analyte concentrations were compared with baseline (day 0) measures on the seventh day of biotin treatment and 7 days after treatment had stopped (day 14). The 11 analytes included 9 hormones (ie, thyroid-stimulating hormone, total thyroxine, total triiodothyronine, free thyroxine, free triiodothyronine, parathyroid hormone, prolactin, N-terminal pro-brain natriuretic peptide, 25-hydroxyvitamin D) and 2 nonhormones (prostate-specific antigen and ferritin). A total of 37 immunoassays for the 11 analytes were evaluated on 4 diagnostic systems, including 23 assays that incorporated biotin and streptavidin components and 14 assays that did not include biotin and streptavidin components and served as negative controls. Among the 2 women and 4 men (mean age, 38 years [range, 31-45 years]) who took 10 mg/d of biotin for 7 days, biotin ingestion-associated interference was found in 9 of the 23 (39%) biotinylated assays compared with none of the 14 nonbiotinylated assays (P = .007). Results from 5 of 8 biotinylated (63%) competitive immunoassays tested falsely high and results from 4 out of 15 (27%) biotinylated sandwich immunoassays tested falsely low. In this preliminary study of 6 healthy adult participants and 11 hormone and nonhormone analytes measured by 37 immunoassays, ingesting 10 mg/d of biotin for 1 week was associated with potentially clinically important assay interference in some but not all biotinylated assays studied. These findings should be considered for patients taking biotin supplements before ordering blood tests or when interpreting results. clinicaltrials.gov Identifier: NCT03034707.
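The abstract does not state which test produced P = .007 for 9 of 23 biotinylated versus 0 of 14 nonbiotinylated assays; assuming a Fisher exact test on that 2x2 table, the sketch below reproduces a P value in the same neighborhood.

```python
# Sketch of one way to compare the interference rates reported above
# (9 of 23 biotinylated assays vs. 0 of 14 nonbiotinylated assays). The paper
# does not state which test produced P = .007; Fisher's exact test is assumed here.
from scipy.stats import fisher_exact

table = [[9, 23 - 9],    # biotinylated: affected, unaffected
         [0, 14 - 0]]    # nonbiotinylated: affected, unaffected
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"two-sided P = {p_value:.3f}")   # close to the reported P = .007
```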
Analytical Chemistry Laboratory Progress Report for FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaption of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.
Quality control for federal clean water act and safe drinking water act regulatory compliance.
Askew, Ed
2013-01-01
QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
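A sketch of one of the listed QC measures, the replicate-spike method detection limit (MDL = t x s), is shown below; the replicate results are hypothetical.

```python
# Sketch of the method detection limit (MDL) calculation referenced above,
# following the classic replicate-spike approach (MDL = t * s); the replicate
# results are hypothetical.
import numpy as np
from scipy.stats import t

replicates = np.array([0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49])  # ug/L spikes
n = replicates.size
s = replicates.std(ddof=1)
t_99 = t.ppf(0.99, df=n - 1)   # one-sided 99th percentile Student's t
print(f"MDL = {t_99 * s:.3f} ug/L (n={n}, s={s:.3f}, t={t_99:.3f})")
```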
BETA (Bitter Electromagnet Testing Apparatus)
NASA Astrophysics Data System (ADS)
Bates, Evan M.; Birmingham, William J.; Rivera, William F.; Romero-Talamas, Carlos A.
2017-10-01
The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) prototype of the 10-T Adjustable Long Pulse High-Field Apparatus (ALPHA). These water-cooled resistive magnets use high DC currents to produce strong uniform magnetic fields. Presented here is the successful completion of the BETA project and experimental results validating analytical magnet designing methods developed at the Dusty Plasma Laboratory (DPL). BETA's final design specifications will be highlighted which include electromagnetic, thermal and stress analyses. The magnet core design will be explained which include: Bitter Arcs, helix starters, and clamping annuli. The final version of the magnet's vessel and cooling system are also presented, as well as the electrical system of BETA, which is composed of a unique solid-state breaker circuit. Experimental results presented will show the operation of BETA at 1 T. The results are compared to both analytical design methods and finite element analysis calculations. We also explore the steady state maximums and theoretical limits of BETA's design. The completion of BETA validates the design and manufacturing techniques that will be used in the succeeding magnet, ALPHA.
Tank 241-B-108, cores 172 and 173 analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L., Fluoro Daniel Hanford
1997-03-04
The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples that were prepared for analysis by an acid adjustment of the direct subsample are indicated by a `D` in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a fusion digest are indicated by an `F` in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a water digest are indicated by a `W` or an `I` in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, fusion and water digests were performed for a second time. Precision and accuracy improved with the repreparation of Core 173 Composite. Analyses with the repreparation of Lower Half Segment 2 of Core 173 did not show improvement and suggest sample heterogeneity. Results from both preparations are included in Table 3.
NASA Astrophysics Data System (ADS)
Yi, Dake; Wang, TzuChiang
2018-06-01
In this paper, a new procedure is proposed to investigate three-dimensional fracture problems of a thin elastic plate with a long through-the-thickness crack under remote uniform tensile loading. The new procedure combines a new analytical method with highly accurate finite element simulations. In the theoretical analysis, three-dimensional Maxwell stress functions are employed to derive the three-dimensional crack-tip fields. Based on this analysis, an equation is first derived that relates the three-dimensional J-integral J(z), the stress intensity factor K(z), and the tri-axial stress constraint level Tz(z). In the finite element simulations, a fine mesh of 153,360 elements is constructed to compute the stress field near the crack front, J(z), and Tz(z). Numerical results show that in the plane very close to the free surface, the K-field solution is still valid for the in-plane stresses. Comparison with the numerical results shows that the analytical results are valid.
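For readers unfamiliar with the quantities involved, the sketch below evaluates only the standard two-dimensional limits of the J-K relation (plane stress and plane strain). It is a hedged illustration of the textbook result and does not reproduce the Tz(z)-dependent three-dimensional relation derived in the paper; the material constants are assumed values for a generic steel.

```python
# Minimal sketch: textbook 2D J-K relations, J = K^2 / E'.
# E' = E for plane stress, E' = E / (1 - nu^2) for plane strain.
# Values are assumed for illustration only.
E = 200e9      # Young's modulus, Pa (assumed)
nu = 0.3       # Poisson's ratio (assumed)
K = 30e6       # stress intensity factor, Pa*sqrt(m) (assumed)

J_plane_stress = K**2 / E
J_plane_strain = K**2 * (1 - nu**2) / E
print(f"J (plane stress) = {J_plane_stress:.1f} J/m^2")
print(f"J (plane strain) = {J_plane_strain:.1f} J/m^2")
```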
Exhaled breath condensate – from an analytical point of view
Dodig, Slavica; Čepelak, Ivana
2013-01-01
Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, and storage of EBC), analytical (sensitivity of the applied methods, standardization), and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical, and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297
Application of DNA Machineries for the Barcode Patterned Detection of Genes or Proteins.
Zhou, Zhixin; Luo, Guofeng; Wulf, Verena; Willner, Itamar
2018-06-05
The study introduces an analytical platform for the detection of genes or aptamer-ligand complexes by nucleic acid barcode patterns generated by DNA machineries. The DNA machineries consist of nucleic acid scaffolds that include specific recognition sites for the different genes or aptamer-ligand analytes. The binding of the analytes to the scaffolds initiates, in the presence of the nucleotide mixture, a cyclic polymerization/nicking machinery that yields displaced strands of variable lengths. The electrophoretic separation of the resulting strands provides barcode patterns for the specific detection of the different analytes. Mixtures of DNA machineries that yield, upon sensing of different genes (or aptamer ligands), one-, two-, or three-band barcode patterns are described. The combination of nucleic acid scaffolds acting, in the presence of the polymerase/nicking enzyme and nucleotide mixture, as DNA machineries that generate multiband barcode patterns provides an analytical platform for the detection of an individual gene out of many possible genes. The diversity of genes (or other analytes) that can be analyzed by the DNA machineries and the barcode-patterned imaging is given by Pascal's triangle. As a proof of concept, the detection of one of six genes, that is, TP53, Werner syndrome, Tay-Sachs normal gene, BRCA1, Tay-Sachs mutant gene, and the cystic fibrosis disorder gene, by six two-band barcode patterns is demonstrated. The advantages and limitations of the detection of analytes by polymerase/nicking DNA machineries that yield barcode patterns as imaging readout signals are discussed.
(Bio)Sensing Using Nanoparticle Arrays: On the Effect of Analyte Transport on Sensitivity.
Lynn, N Scott; Homola, Jiří
2016-12-20
There has recently been an extensive amount of work on the development of optical, electrical, and mechanical (bio)sensors employing planar arrays of surface-bound nanoparticles. The sensor output for these systems depends on the rate at which analyte is transported to, and interacts with, each nanoparticle in the array. There has so far been little discussion of the relationship between the design parameters of an array and the interplay of convection, diffusion, and reaction. Moreover, current methods providing such information require extensive computational simulation. Here we demonstrate that the rate of analyte transport to a nanoparticle array can be quantified analytically. We show that such rates are bounded by the rate to a single nanoparticle and the rate to a planar surface of the same size as the array, with the specific rate determined by the fill fraction: the ratio of the total surface area used for biomolecular capture to the entire sensing area. We characterize analyte transport to arrays with respect to changes in numerous experimentally relevant parameters, including variation of the nanoparticle shape and size, packing density, flow conditions, and analyte diffusivity. We also explore how analyte capture depends on the kinetic parameters of an affinity-based biosensor, and we classify the conditions under which the array might be diffusion- or reaction-limited. The results obtained herein are applicable to the design and optimization of all (bio)sensors based on nanoparticle arrays.
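As a rough aid to intuition for the fill-fraction argument above, the sketch below computes the fill fraction of a square nanoparticle array and brackets the transport rate between the two limits with a naive linear interpolation. This is emphatically not the analytical result of the study; the geometry and the two bounding rates are hypothetical placeholders.

```python
# Minimal sketch: fill fraction of a nanoparticle array and a naive
# interpolation between the two transport-rate bounds discussed above.
# NOT the cited analytical result; all numbers are hypothetical placeholders.
import math

d_np = 100e-9          # nanoparticle diameter, m (assumed)
pitch = 400e-9         # center-to-center spacing of the square array, m (assumed)
area_np = math.pi * (d_np / 2) ** 2
fill_fraction = area_np / pitch**2     # capture area / total sensing area

rate_single_np = 1.0   # arbitrary units: transport rate to one isolated particle
rate_planar = 20.0     # arbitrary units: transport rate to a fully coated surface

# Naive linear bracketing between the two limits, weighted by fill fraction.
rate_estimate = rate_single_np + fill_fraction * (rate_planar - rate_single_np)
print(f"fill fraction = {fill_fraction:.3f}")
print(f"bracketed rate estimate = {rate_estimate:.2f} (arb. units)")
```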
Results of the 2007 national roadside survey of alcohol and drug use by drivers
DOT National Transportation Integrated Search
2009-07-01
The 2007 NRS included, for the first time, measures to estimate the use of other potentially impairing drugs by drivers. Prior roadside surveys had collected breath samples to determine blood alcohol concentration (BAC). Due to developments in analyt...
42 CFR 493.1445 - Standard; Laboratory director responsibilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... quality laboratory services for all aspects of test performance, which includes the preanalytic, analytic... result is found to be unacceptable or unsatisfactory; (5) Ensure that the quality control and quality assessment programs are established and maintained to assure the quality of laboratory services provided and...
CTEPP-OH DATA SUPPLEMENTAL INFORMATION ON FIELD AND LABORATORY SAMPLES
This data set contains supplemental data related to the final core analytical results table for CTEPP-OH. This includes sample collection data such as sample weight, air volume, creatinine, and specific gravity.
The Children’s Total Exposure to Persistent Pesticides and Oth...
Orion Orbit Control Design and Analysis
NASA Technical Reports Server (NTRS)
Jackson, Mark; Gonzalez, Rodolfo; Sims, Christopher
2007-01-01
The analysis of candidate thruster configurations for the Crew Exploration Vehicle (CEV) is presented. Six candidate configurations were considered for the prime contractor baseline design. The analysis included analytical assessments of control authority, control precision, efficiency, and robustness, as well as simulation assessments of control performance. The principles used in the analytic assessments of controllability, robustness, and fuel performance are covered, and results are provided for the configurations assessed. Simulation analysis was conducted using a pulse-width-modulated, 6-DOF reaction control system law with a simplex-based thruster selection algorithm. Control laws were automatically derived from hardware configuration parameters, including thruster locations, directions, magnitude, and specific impulse, as well as vehicle mass properties. This parameterized controller allowed rapid assessment of multiple candidate layouts. Simulation results are presented for final-phase rendezvous and docking, as well as low lunar orbit attitude hold. Finally, ongoing analysis to consider alternate Service Module designs and to assess the pilotability of the baseline design is discussed to provide a status of orbit control design work to date.
NASA Astrophysics Data System (ADS)
Sharma, Dinesh Kumar; Sharma, Anurag; Tripathi, Saurabh Mani
2017-11-01
The excellent propagation properties of square-lattice microstructured optical fibers (MOFs) have been widely recognized. We generalize our recently developed analytical field model (Sharma and Sharma, 2016) to index-guiding MOFs with a square lattice of circular air-holes in the photonic crystal cladding. Using the field model, we study the propagation properties of the fundamental mode of index-guiding square-lattice MOFs for different hole-to-hole spacings and air-hole diameters. Results for the modal effective index, the near- and far-field patterns, and the group-velocity dispersion are included. The evolution of the mode shape in the transition from the near-field to the far-field domain is investigated. We also study the splice losses between two identical square-lattice MOFs and between an MOF and a traditional step-index single-mode fiber. Comparisons with available numerical simulation results, e.g., those based on the full-vector finite element method, are also included.
Variable Refractive Index Effects on Radiation in Semitransparent Scattering Multilayered Regions
NASA Technical Reports Server (NTRS)
Siegel, R.; Spuckler, C. M.
1993-01-01
A simple set of equations is derived for predicting the temperature distribution and radiative energy flow in a semitransparent layer consisting of an arbitrary number of laminated sublayers that absorb, emit, and scatter radiation. Each sublayer can have a different refractive index and optical thickness. The plane composite region is heated on each exterior side by a different amount of incident radiation. The results are for the limiting case where heat conduction within the layers is very small relative to radiative transfer, and is neglected. The interfaces are assumed diffuse, and all interface reflections are included in the analysis. The thermal behavior is readily calculated from the analytical expressions that are obtained. By using many sublayers, the analytical expressions provide the temperature distribution and heat flow for a diffusing medium with a continuously varying refractive index, including internal reflection effects caused by refractive index gradients. Temperature and heat flux results are given to show the effect of variations in refractive index and optical thickness through the multilayer laminate.
Park, In-Sun; Park, Jae-Woo
2011-01-30
Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22, and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. A stepwise ultrasonication-based analytical process was established to measure the TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies for TPH and the aliphatic and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL contamination. Copyright © 2010 Elsevier B.V. All rights reserved.
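To illustrate the kind of fraction-by-fraction calculation a tiered noncancer risk assessment performs, the sketch below sums hazard quotients (average daily dose divided by reference dose) over a few TPH fractions for a soil-ingestion pathway. All concentrations, reference doses, and exposure assumptions are hypothetical placeholders and are not values from the cited study.

```python
# Minimal sketch: noncancer hazard index for soil ingestion of TPH fractions.
# HQ = average daily dose / reference dose; HI = sum of HQs.
# All inputs are hypothetical placeholders, not values from the cited study.
fractions = {
    # name: (soil concentration mg/kg, oral reference dose mg/kg-day)
    "aliphatic EC8-10": (120.0, 0.1),
    "aliphatic EC10-12": (300.0, 0.1),
    "aromatic EC10-12": (80.0, 0.04),
    "aromatic EC12-16": (150.0, 0.04),
}

ingestion_rate = 100e-6          # kg soil/day (assumed)
body_weight = 70.0               # kg (assumed)
exposure_factor = 350.0 / 365.0  # fraction of the year exposed (assumed)

hazard_index = 0.0
for name, (conc, rfd) in fractions.items():
    add = conc * ingestion_rate * exposure_factor / body_weight  # mg/kg-day
    hq = add / rfd
    hazard_index += hq
    print(f"{name:>18s}: HQ = {hq:.4f}")
print(f"Hazard index = {hazard_index:.4f}")
```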
Aerodynamic and hydrodynamic model tests of the Enserch Garden Banks floating production facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, E.W.; Bauer, T.C.; Kelly, P.J.
1995-12-01
This paper presents the results of aerodynamic and hydrodynamic model tests of the Enserch Garden Banks semisubmersible Floating Production Facility (FPF), moored in 2,190-ft water depth. During the wind tunnel tests, the steady components of the wind and current forces/moments about various skew and heel axes were measured. The results were compared and calibrated against analytical calculations using techniques recommended by ABS and API. During the wave basin tests, the mooring line tensions and vessel motions, including the effects of dynamic wind and current, were measured. Analytical calculations of the airgap, vessel motions, and mooring line loads were compared with the wave basin model test results. This paper discusses the test objectives, setups, and agendas for wind tunnel and wave basin testing of a deepwater permanently moored floating production system. The experience from these tests and the comparison of measured test results with analytical calculations will be of value to designers and operators contemplating the use of a semisubmersible-based floating production system. The analysis procedures are aimed at estimating (1) vessel motions, (2) airgap, and (3) mooring line tensions with reasonable accuracy. Finally, this paper demonstrates how the model test results were interpolated and adapted in the design loop.
Installation Restoration Program, Phase II (Stage 2-1). Volume 2.
1985-05-01
Worthington OH. PAGES: 89. SOURCE: Radian Library. COMMENTS: General manual for well drilling, sample collection. Includes good elementary background. FIRST ... Collected from the Base Industrial and Domestic Waste Water Treatment Plants and Field Sampling Data. PAGES: 75 p. SOURCE: Capt. Mario Ierarti. COMMENTS: ... Analytical results for samples collected from the base industrial and domestic wastewater treatment plants have been included. RADIAN. FIRST AUTHOR
NASA Astrophysics Data System (ADS)
Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min
2017-09-01
The Halbach-type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-conserving field source, which has attracted intense interest in many practical applications. Here, using the complex variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in the HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
Sampling probe for microarray read out using electrospray mass spectrometry
Van Berkel, Gary J.
2004-10-12
An automated electrospray-based sampling system and method for analysis obtains samples from surface array spots having analytes. The system includes at least one probe, the probe including an inlet for flowing at least one eluting solvent to respective ones of a plurality of spots and an outlet for directing the analyte away from the spots. An automatic positioning system is provided for translating the probe relative to the spots to permit sampling of any spot. An electrospray ion source having an input fluidically connected to the probe receives the analyte and generates ions from the analyte. The ion source provides the generated ions to a structure, preferably a mass spectrometer, for analysis to identify the analyte. The probe can be a surface contact probe, where the probe forms an enclosing seal along the periphery of the array spot surface.
Teachable, high-content analytics for live-cell, phase contrast movies.
Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J
2010-09-01
CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denysenko, I. B.; Azarenkov, N. A.; Kersten, H.
2016-05-15
Analytical expressions describing the variation of the electron energy distribution function (EEDF) in a plasma afterglow are obtained, in particular for the case when the electron energy loss is mainly due to momentum-transfer electron-neutral collisions. The study is carried out for different steady-state EEDFs, including Maxwellian and Druyvesteyn distributions. The analytical results are obtained not only for the case when the rate for momentum-transfer electron-neutral collisions is independent of electron energy, but also for the case when the rate is a power function of electron energy. Using the analytical expressions for the EEDF, the effective electron temperature and the charge of the dust particles, which are assumed to be present in the plasma, are calculated for different afterglow durations. An analytical expression is also derived for the rate describing the collection of electrons by dust particles in the case when the rate for momentum-transfer electron-neutral collisions is independent of electron energy. The EEDF profile and, as a result, the effective electron temperature and dust charge differ substantially between the case when the rate for momentum-transfer electron-neutral collisions is independent of electron energy and the case when the rate is a power function of electron energy.
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
Learning Analytics: Challenges and Limitations
ERIC Educational Resources Information Center
Wilson, Anna; Watson, Cate; Thompson, Terrie Lynn; Drew, Valerie; Doyle, Sarah
2017-01-01
Learning analytics implementations are increasingly being included in learning management systems in higher education. We lay out some concerns with the way learning analytics--both data and algorithms--are often presented within an unproblematized Big Data discourse. We describe some potential problems with the often implicit assumptions about…
Reading Multimodal Texts: Perceptual, Structural and Ideological Perspectives
ERIC Educational Resources Information Center
Serafini, Frank
2010-01-01
This article presents a tripartite framework for analyzing multimodal texts. The three analytical perspectives presented include: (1) perceptual, (2) structural, and (3) ideological analytical processes. Using Anthony Browne's picturebook "Piggybook" as an example, assertions are made regarding what each analytical perspective brings to the…
Accurate mass measurements and their appropriate use for reliable analyte identification.
Godfrey, A Ruth; Brenton, A Gareth
2012-09-01
Accurate mass instrumentation is becoming increasingly available to non-expert users, and these data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification is described, together with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing, and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and ChemSpider.
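To make the formula-assignment step concrete, the sketch below brute-forces CHNO elemental compositions whose monoisotopic mass falls within a ppm tolerance of a target accurate mass. It is a hedged toy example, not the best-practice workflow described in the cited paper: the target mass and atom ranges are arbitrary, and no chemical-plausibility filtering is applied.

```python
# Minimal sketch: brute-force assignment of CHNO elemental formulae to an
# accurate mass within a ppm tolerance. Monoisotopic masses are standard
# values; the example target and ranges are arbitrary, and no heuristic
# filtering (rings-plus-double-bonds, isotope patterns, etc.) is applied.
MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def candidate_formulae(target_mz, tol_ppm=5.0, max_atoms=(30, 60, 5, 10)):
    max_c, max_h, max_n, max_o = max_atoms
    hits = []
    for c in range(1, max_c + 1):
        for h in range(0, max_h + 1):
            for n in range(0, max_n + 1):
                for o in range(0, max_o + 1):
                    m = c * MASS["C"] + h * MASS["H"] + n * MASS["N"] + o * MASS["O"]
                    ppm = (m - target_mz) / target_mz * 1e6
                    if abs(ppm) <= tol_ppm:
                        hits.append((f"C{c}H{h}N{n}O{o}", m, ppm))
    return sorted(hits, key=lambda x: abs(x[2]))

# Example: the neutral monoisotopic mass of caffeine (C8H10N4O2) is ~194.0804 Da.
for formula, mass, ppm in candidate_formulae(194.0804):
    print(f"{formula:12s} {mass:.4f} Da  {ppm:+.2f} ppm")
```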
Sensor arrays for detecting analytes in fluids
NASA Technical Reports Server (NTRS)
Freund, Michael S. (Inventor); Lewis, Nathan S. (Inventor)
2000-01-01
A sensor array for detecting an analyte in a fluid, comprising at least first and second chemically sensitive resistors electrically connected to an electrical measuring apparatus, wherein each of the chemically sensitive resistors comprises a mixture of nonconductive material and a conductive material. Each resistor provides an electrical path through the mixture of nonconductive and conductive material. The resistors also exhibit a different resistance when contacted with a fluid comprising an analyte at a first concentration than when contacted with the analyte at a second, different concentration. A broad range of analytes can be detected using the sensors of the present invention. Examples of such analytes include, but are not limited to, alkanes, alkenes, alkynes, dienes, alicyclic hydrocarbons, arenes, alcohols, ethers, ketones, aldehydes, carbonyls, carbanions, polynuclear aromatics, organic derivatives, biomolecules, sugars, isoprenes, isoprenoids and fatty acids. Moreover, applications for the sensors of the present invention include, but are not limited to, environmental toxicology, remediation, biomedicine, material quality control, food monitoring and agricultural monitoring.
Liquid-absorption preconcentrator sampling instrument
Zaromb, Solomon
1990-01-01
A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container in which is disposed a wettable material extending substantially the entire length of the container. One end of the wettable material is continuously wetted with an analyte-sorbing liquid, which flows to the other end of the container. Sample air is flowed through the container in contact with the wetted material for trapping and preconcentrating the traces of analyte in the sorbing liquid, which is then collected at the other end of the container and discharged to the detector. The wetted material may be a wick comprising a bundle of fibers, one end of which is immersed in a reservoir of the analyte-sorbing liquid, or may be a liner disposed on the inner surface of the container, with the sorbing liquid being centrifugally dispersed onto the liner at one end thereof. The container is preferably vertically oriented so that gravity effects the liquid flow.
Liquid-absorption preconcentrator sampling instrument
Zaromb, S.
1990-12-11
A system is described for detecting trace concentrations of an analyte in air and includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container in which is disposed a wettable material extending substantially the entire length of the container. One end of the wettable material is continuously wetted with an analyte-sorbing liquid, which flows to the other end of the container. Sample air is flowed through the container in contact with the wetted material for trapping and preconcentrating the traces of analyte in the sorbing liquid, which is then collected at the other end of the container and discharged to the detector. The wetted material may be a wick comprising a bundle of fibers, one end of which is immersed in a reservoir of the analyte-sorbing liquid, or may be a liner disposed on the inner surface of the container, with the sorbing liquid being centrifugally dispersed onto the liner at one end thereof. The container is preferably vertically oriented so that gravity effects the liquid flow. 4 figs.
NASA Astrophysics Data System (ADS)
Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin
2018-04-01
Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of driven current required for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on several parameters, including the bootstrap current density, the radial width of the driven current, the radial deviation of the driven current from the resonant surface, and the island width when ECCD is applied, is studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results show the same trend as the numerical ones, although a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.
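To illustrate the generic structure of an MRE-type island-width evolution (a classical tearing term, a destabilizing bootstrap term that saturates below a marginal island width, and a stabilizing term proportional to the driven current), the sketch below integrates a schematic equation in arbitrary units. The functional form is a common schematic and all coefficients are illustrative placeholders; none of it is taken from the cited simulations or their fitted expressions.

```python
# Minimal sketch: schematic island-width evolution with the generic structure
# of a modified Rutherford equation. All coefficients are illustrative
# placeholders in arbitrary units, not values from the cited work.
def dw_dt(w, j_cd, delta_prime=-2.0, a_bs=5.0, w_d=1.0, a_cd=8.0):
    """Schematic dw/dt:
    delta_prime : classical (here stabilizing) tearing-index term
    a_bs        : destabilizing bootstrap term, saturating below w_d
    a_cd        : stabilizing term proportional to the driven current j_cd
    """
    return delta_prime + a_bs * w / (w**2 + w_d**2) - a_cd * j_cd / w

def evolve(w0, j_cd, dt=1e-3, steps=20000):
    w = w0
    for _ in range(steps):
        w = max(w + dt * dw_dt(w, j_cd), 1e-3)  # floor keeps the island width positive
    return w

for j_cd in (0.0, 0.05, 0.15):
    print(f"j_cd = {j_cd:.2f} -> final island width ~ {evolve(2.0, j_cd):.3f}")
```

With these placeholder coefficients the island saturates without drive, shrinks partially at a small driven current, and collapses to the numerical floor once the drive exceeds a threshold, which is the qualitative behavior the text describes.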
Contemporary Privacy Theory Contributions to Learning Analytics
ERIC Educational Resources Information Center
Heath, Jennifer
2014-01-01
With the continued adoption of learning analytics in higher education institutions, vast volumes of data are generated and "big data" related issues, including privacy, emerge. Privacy is an ill-defined concept and subject to various interpretations and perspectives, including those of philosophers, lawyers, and information systems…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, David S.
Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, where known patterns can be mined for. With a human in the loop, it can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
The Analytical Limits of Modeling Short Diffusion Timescales
NASA Astrophysics Data System (ADS)
Bradshaw, R. W.; Kent, A. J.
2016-12-01
Chemical and isotopic zoning in minerals is widely used to constrain the timescales of magmatic processes such as magma mixing and crystal residence via diffusion modeling. Forward modeling of diffusion relies on fitting diffusion profiles to measured compositional gradients. However, an individual measurement is essentially an average composition for a segment of the gradient defined by the spatial resolution of the analysis. Thus the analytical spatial resolution can limit the timescales that can be determined for an element of given diffusivity, particularly where the scale of the gradient approaches that of the measurement. Here we use a probabilistic modeling approach to investigate the effect of analytical spatial resolution on timescales estimated from diffusion modeling. Our method investigates how accurately the age of a synthetic diffusion profile can be obtained by modeling an "unknown" profile derived from discrete sampling of the synthetic compositional gradient at a given spatial resolution. We also include the effects of analytical uncertainty and the position of measurements relative to the diffusion gradient. We apply this method to the spatial resolutions of common microanalytical techniques (LA-ICP-MS, SIMS, EMP, NanoSIMS). Our results confirm that for a given diffusivity, higher spatial resolution gives access to shorter timescales, and that each analytical spacing has a minimum timescale, below which it overestimates the timescale. For example, for Ba diffusion in plagioclase at 750 °C, timescales are accurate (within 20%) above 10, 100, 2,600, and 71,000 years at 0.3, 1, 5, and 25 µm spatial resolution, respectively. For Sr diffusion in plagioclase at 750 °C, timescales are accurate above 0.02, 0.2, 4, and 120 years at the same spatial resolutions. Our results highlight the importance of selecting appropriate analytical techniques to estimate accurate diffusion-based timescales.
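The resolution effect described above can be reproduced in a few lines. The sketch below box-averages a synthetic error-function diffusion profile over an assumed analytical spot size and then refits the timescale; when the spot approaches or exceeds the width of the true gradient, the fitted timescale is overestimated. The diffusivity, true timescale, and spot sizes are arbitrary illustrative values, not the calibrations or probabilistic workflow of the cited study.

```python
# Minimal sketch: bias in a fitted diffusion timescale when a profile is
# measured with a finite analytical spot size. Diffusivity, timescale, and
# spot sizes are arbitrary illustrative values.
import numpy as np
from scipy.special import erf
from scipy.ndimage import uniform_filter1d
from scipy.optimize import curve_fit

D = 1.0e-3            # diffusivity, um^2 / yr (assumed)
t_true = 50.0         # true timescale, yr (assumed)
x = np.linspace(-100, 100, 4001)          # distance across the boundary, um

def profile(x, t):
    return 0.5 * (1.0 + erf(x / (2.0 * np.sqrt(D * t))))

true_profile = profile(x, t_true)

def measure(spot_um):
    """Box-average the true profile over the analytical spot size.
    Only the averaging effect of the finite spot is modeled; the sampling
    positions stay fine so the subsequent fit remains well conditioned."""
    n = max(1, int(round(spot_um / (x[1] - x[0]))))
    return uniform_filter1d(true_profile, size=n, mode="nearest")

for spot in (1.0, 5.0, 25.0):             # assumed spot sizes, um
    ym = measure(spot)
    # crude starting guess from the apparent ~2-sigma width of the profile
    width = x[np.searchsorted(ym, 0.84)] - x[np.searchsorted(ym, 0.16)]
    t0 = max((width / 2) ** 2 / (2 * D), 1.0)
    (t_fit,), _ = curve_fit(profile, x, ym, p0=[t0], bounds=(1e-3, np.inf))
    print(f"spot {spot:5.1f} um -> fitted t = {t_fit:9.1f} yr (true {t_true})")
```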
Systems and methods for laser assisted sample transfer to solution for chemical analysis
Van Berkel, Gary J.; Kertesz, Vilmos; Ovchinnikova, Olga S.
2014-06-03
Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.
Systems and methods for laser assisted sample transfer to solution for chemical analysis
Van Berkel, Gary J.; Kertesz, Vilmos; Ovchinnikova, Olga S.
2015-09-29
Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.
Systems and methods for laser assisted sample transfer to solution for chemical analysis
Van Berkel, Gary J; Kertesz, Vilmos; Ovchinnikova, Olga S
2013-08-27
Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.
Means of introducing an analyte into liquid sampling atmospheric pressure glow discharge
Marcus, R. Kenneth; Quarles, Jr., Charles Derrick; Russo, Richard E.; Koppenaal, David W.; Barinaga, Charles J.; Carado, Anthony J.
2017-01-03
A liquid sampling, atmospheric pressure, glow discharge (LS-APGD) device as well as systems that incorporate the device and methods for using the device and systems are described. The LS-APGD includes a hollow capillary for delivering an electrolyte solution to a glow discharge space. The device also includes a counter electrode in the form of a second hollow capillary that can deliver the analyte into the glow discharge space. A voltage across the electrolyte solution and the counter electrode creates the microplasma within the glow discharge space that interacts with the analyte to move it to a higher energy state (vaporization, excitation, and/or ionization of the analyte).
Toxicologic evaluation of analytes from Tank 241-C-103
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahlum, D.D.; Young, J.Y.; Weller, R.E.
1994-11-01
Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives would be to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propane nitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.
Stochastic differential equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sobczyk, K.
1990-01-01
This book provides a unified treatment of both regular (or random) and Ito stochastic differential equations. It focuses on solution methods, including some developed only recently. Applications are discussed, and in particular insight is given into both the mathematical structure and the most efficient solution methods (analytical as well as numerical). Starting from basic notions and results of the theory of stochastic processes and stochastic calculus (including Ito's stochastic integral), many principal mathematical problems and results related to stochastic differential equations are expounded here for the first time. Applications treated include those relating to road vehicles, earthquake excitations, and offshore structures.
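As a hedged illustration of the kind of numerical solution method such a treatment covers (not an example taken from the book), the sketch below applies the standard Euler-Maruyama scheme to an Ito SDE, the Ornstein-Uhlenbeck process, and checks the sample statistics against the known stationary variance.

```python
# Minimal sketch: Euler-Maruyama integration of the Ito SDE
#   dX = -theta * X dt + sigma dW   (Ornstein-Uhlenbeck process).
# Parameters are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 1.5, 0.4
dt, n_steps, n_paths = 1e-3, 20000, 2000

x = np.zeros(n_paths)
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)   # Wiener increments
    x += -theta * x * dt + sigma * dw

# The analytical stationary variance of the OU process is sigma^2 / (2 theta).
print(f"sample variance     : {x.var():.5f}")
print(f"analytical variance : {sigma**2 / (2 * theta):.5f}")
```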
An analytical study of M2 tidal waves in the Taiwan Strait using an extended Taylor method
NASA Astrophysics Data System (ADS)
Wu, Di; Fang, Guohong; Cui, Xinmei; Teng, Fei
2018-02-01
The tides in the Taiwan Strait (TS) feature large semidiurnal lunar (M2) amplitudes. An extended Taylor method is employed in this study to provide an analytical model for the M2 tide in the TS. The strait is idealized as a rectangular basin with a uniform depth, and the Coriolis force and bottom friction are retained in the governing equations. The observed tides at the northern and southern openings are used as open boundary conditions. The obtained analytical solution, which consists of a stronger southward-propagating Kelvin wave, a weaker northward-propagating Kelvin wave, and two families of Poincaré modes trapped at the northern and southern openings, agrees well with the observations in the strait. The superposition of the two Kelvin waves basically reproduces the observed tidal pattern, including an anti-nodal band in the central strait and the cross-strait asymmetry of the anti-nodal band (greater amplitudes in the west and smaller in the east). Inclusion of the Poincaré modes further improves the model result in that the cross-strait asymmetry is better reproduced. To explore the formation mechanism of the northward-propagating wave in the TS, three experiments are carried out in which the deep basin south of the strait is included. The results show that the southward incident wave is reflected by the abruptly deepened topography south of the strait to form a northward wave, but the reflected wave is slightly weaker than the northward wave obtained from the above analytical solution, in which the southern open boundary condition is specified from observations. Inclusion of the forcing at the Luzon Strait strengthens the northward Kelvin wave in the TS, and this forcing is thus of some (but lesser) importance to the M2 tide in the TS.
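The two-Kelvin-wave picture above can be illustrated with a highly idealized calculation. The sketch below superposes a stronger southward and a weaker northward Kelvin wave in a uniform-depth channel and locates the amplitude maximum; it omits the Poincaré modes, bottom friction, and boundary fitting of the extended Taylor solution, and all channel parameters and wave amplitudes are assumed round numbers rather than values from the study.

```python
# Minimal sketch: amplitude pattern from two counter-propagating Kelvin waves
# in an idealized uniform-depth channel (x increases northward, y = 0 is the
# western coast). Parameters are assumed round numbers; Poincare modes and
# friction are omitted.
import numpy as np

g = 9.81
f = 2 * 7.2921e-5 * np.sin(np.radians(24))   # Coriolis parameter near 24 N
H = 60.0                                     # uniform depth, m (assumed)
c = np.sqrt(g * H)                           # Kelvin wave phase speed
R = c / f                                    # Rossby radius of deformation
omega = 2 * np.pi / (12.42 * 3600.0)         # M2 frequency
k = omega / c

W = 180e3                                    # channel width, m (assumed)
x = np.linspace(0, 350e3, 200)               # along-strait coordinate
y = np.linspace(0, W, 100)                   # cross-strait coordinate
X, Y = np.meshgrid(x, y, indexing="ij")

A_south, A_north = 1.0, 0.6                  # wave amplitudes (assumed)
# Southward wave hugs the western coast, northward wave the eastern coast
# (Northern Hemisphere), with an implicit exp(-i*omega*t) time factor.
eta = (A_south * np.exp(-Y / R) * np.exp(-1j * k * X)
       + A_north * np.exp(-(W - Y) / R) * np.exp(1j * k * X))

amp = np.abs(eta)
i_max = np.unravel_index(np.argmax(amp), amp.shape)
print(f"Rossby radius ~ {R / 1e3:.0f} km")
print(f"max amplitude {amp[i_max]:.2f} at x = {X[i_max] / 1e3:.0f} km, "
      f"y = {Y[i_max] / 1e3:.0f} km from the western coast")
```

With the stronger wave assigned to the southward branch, the maximum lands on the western side of the channel, consistent with the west-east asymmetry described above.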
Analytic Result for the Two-loop Six-point NMHV Amplitude in N = 4 Super Yang-Mills Theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dixon, Lance J.; /SLAC; Drummond, James M.
2012-02-15
We provide a simple analytic formula for the two-loop six-point ratio function of planar N = 4 super Yang-Mills theory. This result extends the analytic knowledge of multi-loop six-point amplitudes beyond those with maximal helicity violation. We make a natural ansatz for the symbols of the relevant functions appearing in the two-loop amplitude, and impose various consistency conditions, including symmetry, the absence of spurious poles, the correct collinear behavior, and agreement with the operator product expansion for light-like (super) Wilson loops. This information reduces the ansatz to a small number of relatively simple functions. In order to fix these parameters uniquely, we utilize an explicit representation of the amplitude in terms of loop integrals that can be evaluated analytically in various kinematic limits. The final compact analytic result is expressed in terms of classical polylogarithms, whose arguments are rational functions of the dual conformal cross-ratios, plus precisely two functions that are not of this type. One of the functions, the loop integral Ω^(2), also plays a key role in a new representation of the remainder function R_6^(2) in the maximally helicity violating sector. Another interesting feature at two loops is the appearance of a new (parity odd) x (parity odd) sector of the amplitude, which is absent at one loop, and which is uniquely determined in a natural way in terms of the more familiar (parity even) x (parity even) part. The second non-polylogarithmic function, the loop integral Ω̃^(2), characterizes this sector. Both Ω^(2) and Ω̃^(2) can be expressed as one-dimensional integrals over classical polylogarithms with rational arguments.
Relativistic electron kinetic effects on laser diagnostics in burning plasmas
NASA Astrophysics Data System (ADS)
Mirnov, V. V.; Den Hartog, D. J.
2018-02-01
Toroidal interferometry/polarimetry (TIP), poloidal polarimetry (PoPola), and Thomson scattering (TS) systems are major optical diagnostics being designed and developed for ITER. Each of them relies upon a sophisticated quantitative understanding of the electron response to laser light propagating through a burning plasma. A review of the theoretical results for two different applications is presented: interferometry/polarimetry (I/P) and polarization of Thomson scattered light, unified by the importance of relativistic (quadratic in v_Te/c) electron kinetic effects. For I/P applications, rigorous analytical results are obtained perturbatively by expansion in powers of the small parameter τ = T_e/(m_e c^2), where T_e is the electron temperature and m_e is the electron rest mass. Experimental validation of the analytical models has been made by analyzing data from more than 1200 pulses collected from high-T_e JET discharges. Based on this validation, the relativistic analytical expressions are included in the error analysis and design projects of the ITER TIP and PoPola systems. The polarization properties of incoherent Thomson scattered light are being examined as a method of T_e measurement relevant to ITER operational regimes. The theory is based on the Stokes vector transformation and Mueller matrix formalism. The general approach is subdivided into frequency-integrated and frequency-resolved cases. For each of them, the exact analytical relativistic solutions are presented in the form of Mueller matrix elements averaged over the relativistic Maxwellian distribution function. New results related to the detailed verification of the frequency-resolved solutions are reported. The precise analytic expressions provide output much more rapidly than relativistic kinetic numerical codes, allowing for direct real-time feedback control of ITER device operation.
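A quick arithmetic note on why these corrections matter: the expansion parameter τ = T_e/(m_e c^2) reaches a few percent at burning-plasma temperatures, which is comparable to the accuracy demanded of the ITER diagnostics. The sketch below simply evaluates τ over a range of temperatures; it does not reproduce any diagnostic-specific correction formula from the cited work.

```python
# Minimal sketch: size of the relativistic expansion parameter tau = Te / (me c^2)
# at temperatures relevant to burning plasmas. Purely arithmetic; no
# diagnostic-specific correction formulas are reproduced here.
ME_C2_KEV = 510.999  # electron rest energy, keV

for te_kev in (1, 5, 10, 20, 25):
    tau = te_kev / ME_C2_KEV
    print(f"Te = {te_kev:>2d} keV -> tau = {tau:.4f} ({100 * tau:.2f} %)")
```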
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
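To make the kind of pairing described in the example concrete, the toy sketch below uses standard-library regular expressions to pull a measurement phrase and a nearby accuracy statement out of a sample sentence. It is only an illustration of the idea and does not use or represent the project's actual Tika/Solr/DeepDive pipeline.

```python
# Minimal sketch: toy extraction pairing a resolution phrase with a nearby
# accuracy statement, in the spirit of the example above. Plain regex over an
# example string; not the cited Tika/Solr/DeepDive pipeline.
import re

text = ("In this study, hyperspectral images with high spatial resolution (1 m) "
        "were analyzed to detect cutleaf teasel in two areas. Classification of "
        "cutleaf teasel reached a users accuracy of 82 to 84%.")

resolution = re.search(r"spatial resolution \((\d+(?:\.\d+)?)\s*m\)", text)
accuracy = re.search(r"accuracy of (\d+)\s*to\s*(\d+)%", text)

if resolution and accuracy:
    print(f"spatial resolution: {resolution.group(1)} m")
    print(f"user's accuracy   : {accuracy.group(1)}-{accuracy.group(2)}%")
```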
NASA Astrophysics Data System (ADS)
Aghakhani, Amirreza; Basdogan, Ipek; Erturk, Alper
2016-04-01
Plate-like components are widely used in numerous automotive, marine, and aerospace applications, where they can serve as host structures for vibration-based energy harvesting. Piezoelectric patch harvesters can be easily attached to these structures to convert vibrational energy to electrical energy. Power output investigations of these harvesters require accurate models for energy harvesting performance evaluation and optimization. Equivalent circuit modeling of cantilever-based vibration energy harvesters for estimation of the electrical response has been proposed in recent years. However, equivalent circuit formulation and analytical modeling of multiple piezo-patch energy harvesters integrated into thin plates, including nonlinear circuits, has not been studied. In this study, an equivalent circuit model of multiple parallel piezoelectric patch harvesters together with a resistive load is built in the electronic circuit simulation software SPICE, and the voltage frequency response functions (FRFs) are validated using the analytical distributed-parameter model. An analytical formulation of the piezoelectric patches in parallel configuration for the DC voltage output is derived for patches connected to a standard AC-DC circuit. The analytical model is based on the equivalent load impedance approach for the piezoelectric capacitance and AC-DC circuit elements. The analytic results are validated numerically via SPICE simulations. Finally, the DC power outputs of the harvesters are computed and compared with the peak power amplitudes in the AC output case.
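As a hedged, much-simplified companion to the modeling described above, the sketch below treats a single harvester as a current source in parallel with its clamped capacitance and a resistive load, sweeps the load resistance, and recovers the familiar optimum near 1/(ωCp). It is not the multi-patch plate model or the SPICE circuit of the study; all component values are assumed placeholders.

```python
# Minimal sketch: AC power delivered to a resistive load by a single
# piezoelectric harvester modeled as a current source in parallel with its
# clamped capacitance Cp. Values are assumed placeholders.
import numpy as np

f_exc = 120.0                  # excitation frequency, Hz (assumed)
omega = 2 * np.pi * f_exc
Cp = 40e-9                     # piezo clamped capacitance, F (assumed)
I0 = 1e-4                      # current-source amplitude, A (assumed)

R = np.logspace(3, 7, 400)                          # load resistance sweep, ohm
V = I0 * R / np.sqrt(1.0 + (omega * R * Cp) ** 2)   # load voltage amplitude
P = V**2 / (2.0 * R)                                # average power into R

R_opt_theory = 1.0 / (omega * Cp)
print(f"optimum load (sweep)  : {R[np.argmax(P)]:.3e} ohm")
print(f"optimum load (1/w*Cp) : {R_opt_theory:.3e} ohm")
print(f"peak average power    : {P.max() * 1e3:.3f} mW")
```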
Comparison and validation of point spread models for imaging in natural waters.
Hou, Weilin; Gray, Deric J; Weidemann, Alan D; Arnone, Robert A
2008-06-23
It is known that scattering by particulates within natural waters is the main cause of blur in underwater images. Underwater images can be better restored or enhanced with knowledge of the point spread function (PSF) of the water. This will extend the performance range as well as the information retrieval from underwater electro-optical systems, which is critical in many civilian and military applications, including target and especially mine detection, search and rescue, and diver visibility. A better understanding of the physical process involved also helps to predict system performance and simulate it accurately on demand. The present effort first reviews several PSF models, including the introduction of a semi-analytical PSF given the optical properties of the medium: scattering albedo, mean scattering angle, and optical range. The models under comparison include the empirical model of Duntley, a modified PSF model by Dolin et al., and the numerical integration of analytical forms from Wells, as a benchmark of theoretical results. For experimental results, in addition to that of Duntley, we validate the above models against point spread functions measured by applying field-measured scattering properties in Monte Carlo simulations. Results from these comparisons suggest that the three parameters listed above are both sufficient and necessary to model PSFs. The simplified approach introduced also provides adequate accuracy and flexibility for imaging applications, as shown by examples of restored underwater images.
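To illustrate how a known PSF is actually used in restoration (the point of the comparison above), the sketch below blurs a synthetic image with an assumed Gaussian PSF and restores it with standard Wiener deconvolution. The PSF here is only a placeholder; it is not one of the oceanic PSF models compared in the paper.

```python
# Minimal sketch: Wiener deconvolution of a synthetic image blurred with an
# assumed Gaussian PSF. Illustrates the use of a known PSF for restoration;
# not one of the oceanic PSF models compared in the cited work.
import numpy as np

rng = np.random.default_rng(1)
n = 128
img = np.zeros((n, n))
img[40:90, 50:60] = 1.0            # a simple synthetic "target"

yy, xx = np.mgrid[0:n, 0:n]
sigma = 3.0                        # assumed PSF width, pixels
psf = np.exp(-(((xx - n // 2) ** 2 + (yy - n // 2) ** 2) / (2 * sigma**2)))
psf /= psf.sum()
psf = np.fft.ifftshift(psf)        # move the PSF peak to the origin

H = np.fft.fft2(psf)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
blurred += rng.normal(0, 0.01, blurred.shape)     # a little sensor noise

k = 0.01                           # Wiener regularization constant (assumed)
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) * np.conj(H) /
                                (np.abs(H) ** 2 + k)))

def rms_error(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print(f"RMS error, blurred  vs truth: {rms_error(blurred, img):.4f}")
print(f"RMS error, restored vs truth: {rms_error(restored, img):.4f}")
```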
Analyte detection using an active assay
Morozov, Victor; Bailey, Charles L.; Evanskey, Melissa R.
2010-11-02
Analytes may be detected using an active assay by introducing an analyte solution containing a plurality of analytes to a lacquered membrane. The lacquered membrane may be a membrane having at least one surface treated with a layer of polymers. The lacquered membrane may be semi-permeable to nonanalytes. The layer of polymers may include cross-linked polymers. A plurality of probe molecules may be arrayed and immobilized on the lacquered membrane. An external force may be applied to the analyte solution to move the analytes towards the lacquered membrane. Movement may cause some or all of the analytes to bind to the lacquered membrane. In cases where probe molecules are present, some or all of the analytes may bind to the probe molecules. The direction of the external force may be reversed to remove unbound or weakly bound analytes. Bound analytes may be detected using known detection methods.
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
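For clinicians unfamiliar with the modeling approaches listed above, the sketch below runs a deliberately simple three-state Markov cohort model (progression-free, progressed, dead) and accumulates discounted life-years and QALYs. It only illustrates the mechanics of state-transition modeling; the transition probabilities, utilities, and discount rate are hypothetical placeholders, not multiple-myeloma evidence from the reviewed studies.

```python
# Minimal sketch: three-state Markov cohort model computing discounted
# life-years and QALYs. All inputs are hypothetical placeholders.
import numpy as np

P = np.array([               # annual transition probability matrix (rows sum to 1)
    [0.75, 0.15, 0.10],      # from progression-free
    [0.00, 0.70, 0.30],      # from progressed
    [0.00, 0.00, 1.00],      # death is absorbing
])
utility = np.array([0.80, 0.55, 0.0])   # QALY weight per state (assumed)
discount = 0.03
horizon_years = 30

state = np.array([1.0, 0.0, 0.0])       # whole cohort starts progression-free
life_years = qalys = 0.0
for year in range(horizon_years):
    df = 1.0 / (1.0 + discount) ** year
    life_years += df * (state[0] + state[1])
    qalys += df * float(state @ utility)
    state = state @ P                   # advance the cohort one cycle

print(f"discounted life-years per patient: {life_years:.2f}")
print(f"discounted QALYs per patient     : {qalys:.2f}")
```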
Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.
ERIC Educational Resources Information Center
Borman, Stuart A.
1982-01-01
Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…
Quantitative and Qualitative Relations between Motivation and Critical-Analytic Thinking
ERIC Educational Resources Information Center
Miele, David B.; Wigfield, Allan
2014-01-01
The authors examine two kinds of factors that affect students' motivation to engage in critical-analytic thinking. The first, which includes ability beliefs, achievement values, and achievement goal orientations, influences the "quantitative" relation between motivation and critical-analytic thinking; that is, whether students are…
RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.
Brown, Lawrence J
2015-10-01
This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for painful aspects projected by either the analyst or the patient or both, affects that are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, the object-relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.
Doping control analysis at the Rio 2016 Olympic and Paralympic Games.
Pereira, Henrique Marcelo Gualberto; Sardela, Vinicius Figueiredo; Padilha, Monica Costa; Mirotti, Luciana; Casilli, Alessandro; de Oliveira, Fabio Azamor; de Albuquerque Cavalcanti, Gustavo; Rodrigues, Lucas Martins Lisandro; de Araujo, Amanda Lessa Dutra; Levy, Rachel Santos; Teixeira, Pedro Antonio Castelo; de Oliveira, Felipe Alves Gomes; Duarte, Ana Carolina Giordani; Carneiro, Ana Carolina Dudenhoeffer; Evaristo, Joseph Albert Medeiros; Dos Santos, Gustavo Ramalho Cardoso; da Costa, Giovanni Carlo Verissimo; de Lima Castro, Fernando; Nogueira, Fabio Cesar Sousa; Scalco, Fernanda Bertão; Pizzatti, Luciana; de Aquino Neto, Francisco Radler
2017-11-01
This paper summarises the results obtained from the doping control analyses performed during the Summer XXXI Olympic Games (August 3-21, 2016) and the XV Paralympic Games (September 7-18, 2016). The analyses of all doping control samples were performed at the Brazilian Doping Control Laboratory (LBCD), a World Anti-Doping Agency (WADA)-accredited laboratory located in Rio de Janeiro, Brazil. A new facility at Rio de Janeiro Federal University (UFRJ) was built and fully operated by over 700 professionals, including Brazilian and international scientists, administrative staff, and volunteers. For the Olympic Games, 4913 samples were analysed. In 29 specimens, the presence of a prohibited substance was confirmed, resulting in adverse analytical findings (AAFs). For the Paralympic Games, 1687 samples were analysed, 12 of which were reported as AAFs. For both events, 82.8% of the samples were urine, and 17.2% were blood samples. In total, more than 31 000 analytical procedures were conducted. New WADA technical documents were fully implemented; consequently, state-of-the-art analytical toxicology instrumentation and strategies were applied during the Games, including different types of mass spectrometry (MS) analysers, peptide and protein detection strategies, endogenous steroid profile measurements, and blood analysis. This enormous investment yielded one of the largest Olympic legacies in Brazil and South America. Copyright © 2017 John Wiley & Sons, Ltd.
Bio-analytical applications of mid-infrared spectroscopy using silver halide fiber-optic probes
NASA Astrophysics Data System (ADS)
Heise, H. M.; Küpper, L.; Butvina, L. N.
2002-10-01
Infrared spectroscopy has proved to be a powerful method for the study of various biomedical samples, in particular for in-vitro analysis in the clinical laboratory and for non-invasive diagnostics. In general, the analysis of biofluids such as whole blood, urine, microdialysates and bioreactor broth media takes advantage of the fact that a multitude of analytes can be quantified simultaneously and rapidly without the need for reagents. Progress in the quality of infrared silver halide fibers enabled us to construct several flexible fiber-optic probes of different geometries, which are particularly suitable for the measurement of small biosamples. Recent trends show that dry film measurements by mid-infrared spectroscopy could revolutionize analytical tools in the clinical chemistry laboratory, and an example is given. Infrared diagnostic tools show a promising potential for patients, and minimally invasive blood glucose assays and skin tissue pathology are particularly promising applications for mid-infrared fiber-based probes. Other applications include the measurement of skin samples including penetration studies of vitamins and constituents of cosmetic cream formulations. A further field is the micro-domain analysis of biopsy samples from bog-mummified corpses, and recent results on the chemistry of dermis and hair samples are reported. Another field of application, for which results are reported, is food analysis and bioreactor monitoring.
NASA Astrophysics Data System (ADS)
Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.
1991-08-01
Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground based facilities, definitive experiments in internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover the more subtle mechanisms such as rotor strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far field observation point have been formulated. Three dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.
A Simulation Investigation of Principal Component Regression.
ERIC Educational Resources Information Center
Allen, David E.
Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
Identifying Evidence of Reflective Ability in Preservice Teacher Electronic Portfolios
ERIC Educational Resources Information Center
Sulzen, James
2011-01-01
Results of this study identified "evidence markers" that characterize reflection in preservice teacher electronic portfolios. Examples of such markers include openness to self-learning, willingness to self-critique, analytical detail of reflections, and taking responsibility for pupil learning challenges. To identify the markers, school…
Management Reviewing Literature: An Evaluation of Selected Characteristics.
ERIC Educational Resources Information Center
Rehman, Sajjad ur
1987-01-01
Reports results of a study which compared the treatment of selected characteristics of the reviewing literature of management in professional and trade journals. The characteristics examined included lag time, review length, descriptive or analytic nature of reviews, positive or negative evaluations, and affiliation of the reviewer. (CLB)
Noise limitations in optical linear algebra processors.
Batsell, S G; Jong, T L; Walkup, J F; Krile, T F
1990-05-10
A general statistical noise model is presented for optical linear algebra processors. A statistical analysis which includes device noise, the multiplication process, and the addition operation is undertaken. We focus on those processes which are architecturally independent. Finally, experimental results which verify the analytical predictions are also presented.
Leveraging Web-Based Environments for Mass Atrocity Prevention
ERIC Educational Resources Information Center
Harding, Tucker B.; Whitlock, Mark A.
2013-01-01
A growing literature exploring large-scale, identity-based political violence, including mass killing and genocide, debates the plausibility of, and prospects for, early warning and prevention. An extension of the debate involves the prospects for creating educational experiences that result in more sophisticated analytical products that enhance…
Small, high pressure liquid hydrogen turbopump
NASA Technical Reports Server (NTRS)
Csomor, A.; Warren, D. J.
1980-01-01
A high pressure, low capacity, liquid hydrogen turbopump was designed, fabricated, and tested. The design configuration of the turbopump is summarized and the results of the analytical and test efforts are presented. Approaches used to pinpoint the cause of poor suction performance with the original design are described, and performance data are included for an axial inlet design that results in excellent suction capability.
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer
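For context, the spiked-recovery comparison described above is conventionally expressed as the fraction of added analyte that is actually measured; the standard formulation below is a general convention, not a formula quoted from the study itself:

```latex
\mathrm{Recovery}\,(\%) \;=\; \frac{C_{\text{spiked sample}} - C_{\text{unspiked sample}}}{C_{\text{added}}} \times 100
```

Expressed this way, method performance for the organic chemical CECs can be compared across reagent, source, and treated water on a common percentage scale.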
Bolann, B J; Asberg, A
2004-01-01
The deviation of test results from patients' homeostatic set points in steady-state conditions may complicate interpretation of the results and the comparison of results with clinical decision limits. In this study the total deviation from the homeostatic set point is defined as the maximum absolute deviation for 95% of measurements, and we present analytical quality requirements that prevent analytical error from increasing this deviation to more than about 12% above the value caused by biology alone. These quality requirements are: 1) The stable systematic error should be approximately 0, and 2) a systematic error that will be detected by the control program with 90% probability, should not be larger than half the value of the combined analytical and intra-individual standard deviation. As a result, when the most common control rules are used, the analytical standard deviation may be up to 0.15 times the intra-individual standard deviation. Analytical improvements beyond these requirements have little impact on the interpretability of measurement results.
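One illustrative way to see where a figure of roughly 12% can arise (assuming, for this reading only, that the total deviation scales with the combined analytical and intra-individual standard deviation) is:

```latex
\sigma_{\text{total}} \;=\; \sqrt{\sigma_{I}^{2} + \sigma_{A}^{2}}
\quad\Rightarrow\quad
\frac{\sigma_{\text{total}}}{\sigma_{I}} \;=\; \sqrt{1 + 0.5^{2}} \;\approx\; 1.12
\qquad \text{when } \sigma_{A} \le 0.5\,\sigma_{I},
```

i.e., an analytical standard deviation of at most half the intra-individual standard deviation inflates the purely biological deviation by about 12%.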
Using Google Analytics to evaluate the impact of the CyberTraining project.
McGuckin, Conor; Crowley, Niall
2012-11-01
A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics ) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook ( www.cybertraining-project.org ). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.
Modern analytical chemistry in the contemporary world
NASA Astrophysics Data System (ADS)
Šíma, Jan
2016-12-01
Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this view is improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character and the necessity of collaboration between analytical chemists and other experts in order to solve the actual problems of human society and the environment effectively are emphasized. The importance of analytical method validation in obtaining accurate and precise results is highlighted. Invalid results are not only useless; they can often even be fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed; it should be much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. In any case, the teaching of analytical chemistry should closely connect theory and practice.
Association of Biotin Ingestion With Performance of Hormone and Nonhormone Assays in Healthy Adults
Radulescu, Angela; Shrestha, Rupendra T.; Root, Matthew; Karger, Amy B.; Killeen, Anthony A.; Hodges, James S.; Fan, Shu-Ling; Ferguson, Angela; Garg, Uttam; Sokoll, Lori J.; Burmeister, Lynn A.
2017-01-01
Importance: Biotinylated antibodies and analogues, with their strong binding to streptavidin, are used in many clinical laboratory tests. Excess biotin in blood due to supplemental biotin ingestion may affect biotin-streptavidin binding, leading to potential clinical misinterpretation. However, the degree of interference remains undefined in healthy adults. Objective: To assess performance of specific biotinylated immunoassays after 7 days of ingesting 10 mg/d of biotin, a dose common in over-the-counter supplements for healthy adults. Design, Setting, and Participants: Nonrandomized crossover trial involving 6 healthy adults who were treated at an academic medical center research laboratory. Exposure: Administration of 10 mg/d of biotin supplementation for 7 days. Main Outcomes and Measures: Analyte concentrations were compared with baseline (day 0) measures on the seventh day of biotin treatment and 7 days after treatment had stopped (day 14). The 11 analytes included 9 hormones (ie, thyroid-stimulating hormone, total thyroxine, total triiodothyronine, free thyroxine, free triiodothyronine, parathyroid hormone, prolactin, N-terminal pro-brain natriuretic peptide, 25-hydroxyvitamin D) and 2 nonhormones (prostate-specific antigen and ferritin). A total of 37 immunoassays for the 11 analytes were evaluated on 4 diagnostic systems, including 23 assays that incorporated biotin and streptavidin components and 14 assays that did not include biotin and streptavidin components and served as negative controls. Results: Among the 2 women and 4 men (mean age, 38 years [range, 31-45 years]) who took 10 mg/d of biotin for 7 days, biotin ingestion-associated interference was found in 9 of the 23 (39%) biotinylated assays compared with none of the 14 nonbiotinylated assays (P = .007). Results from 5 of 8 biotinylated (63%) competitive immunoassays tested falsely high and results from 4 out of 15 (27%) biotinylated sandwich immunoassays tested falsely low. Conclusions and Relevance: In this preliminary study of 6 healthy adult participants and 11 hormone and nonhormone analytes measured by 37 immunoassays, ingesting 10 mg/d of biotin for 1 week was associated with potentially clinically important assay interference in some but not all biotinylated assays studied. These findings should be considered for patients taking biotin supplements before ordering blood tests or when interpreting results. Trial Registration: clinicaltrials.gov Identifier: NCT03034707 PMID:28973622
Dervieux, Thierry; Conklin, John; Ligayon, Jo-Anne; Wolover, Leilani; O'Malley, Tyler; Alexander, Roberta Vezza; Weinstein, Arthur; Ibarra, Claudia A
2017-07-01
We describe the analytical validation of an assay panel intended to assist clinicians with the diagnosis of systemic lupus erythematosus (SLE). The multi-analyte panel includes quantitative assessment of complement activation and measurement of autoantibodies. The levels of the complement split product C4d bound to erythrocytes (EC4d) and B-lymphocytes (BC4d) (expressed as mean fluorescence intensity [MFI]) are measured by quantitative flow cytometry, while autoantibodies (inclusive of antinuclear and anti-double stranded DNA antibodies) are determined by immunoassays. Results of the multi-analyte panel are reported as positive or negative based on a 2-tiered index score. Post-phlebotomy stability of EC4d and BC4d in EDTA-anticoagulated blood is determined using specimens collected from patients with SLE and normal donors. Three-level C4 coated positive beads are run daily as controls. Analytical validity is reported using intra-day and inter-day coefficient of variation (CV). EC4d and BC4d are stable for 2 days at ambient temperature and for 4 days at 4°C post-phlebotomy. Median intra-day and inter-day CV range from 2.9% to 7.8% (n=30) and 7.3% to 12.4% (n=66), respectively. The 2-tiered index score is reproducible over 4 consecutive days upon storage of blood at 4°C. A total of 2,888 three-level quality control data were collected from 6 flow cytometers with an overall failure rate below 3%. Median EC4d level is 6 net MFI (Interquartile [IQ] range 4-9 net MFI) and median BC4d is 18 net MFI (IQ range 13-27 net MFI) among 86,852 specimens submitted for testing. The incidence of 2-tiered positive test results is 13.4%. We have established the analytical validity of a multi-analyte assay panel for SLE. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near infrared spectroscopy (NIRS) has been considered as an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of related technologies was also presented, which contained the establishment of the monitoring methods for the elution of polyamide resin and macroporous resin chromatography processes, as well as the rapid analysis method for finished products. Based on the authors' research and working experience, several issues in the application of NIRS to process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospect for the application of NIRS in the TCM industry was put forward. Copyright© by the Chinese Pharmaceutical Association.
NASA Technical Reports Server (NTRS)
Dring, R. P.; Blair, M. F.; Joslyn, H. D.; Power, G. D.; Verdon, J. M.
1987-01-01
A combined experimental and analytical program was conducted to examine the effects of inlet turbulence on airfoil heat transfer. Heat transfer measurements were obtained using low conductivity airfoils with miniature thermocouples welded to a thin, electrically heated surface skin. Heat transfer data were acquired for various combinations of low or high inlet turbulence intensity, flow coefficient (incidence), first-stator/rotor axial spacing, Reynolds number, and relative circumferential position of the first and second stators. Aerodynamic measurements include distributions of the mean and fluctuating velocities at the turbine inlet and, for each airfoil row, midspan airfoil surface pressures and circumferential distributions of the downstream steady state pressures and fluctuating velocities. Analytical results include airfoil heat transfer predictions and an examination of solutions of the unsteady boundary layer equations.
Epp: A C++ EGSnrc user code for x-ray imaging and scattering simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lippuner, Jonas; Elbakri, Idris A.; Cui Congwu
2011-03-15
Purpose: Easy particle propagation (Epp) is a user code for the EGSnrc code package based on the C++ class library egspp. A main feature of egspp (and Epp) is the ability to use analytical objects to construct simulation geometries. The authors developed Epp to facilitate the simulation of x-ray imaging geometries, especially in the case of scatter studies. While direct use of egspp requires knowledge of C++, Epp requires no programming experience. Methods: Epp's features include calculation of dose deposited in a voxelized phantom and photon propagation to a user-defined imaging plane. Projection images of primary, single Rayleigh scattered, single Compton scattered, and multiple scattered photons may be generated. Epp input files can be nested, allowing for the construction of complex simulation geometries from more basic components. To demonstrate the imaging features of Epp, the authors simulate 38 keV x rays from a point source propagating through a water cylinder 12 cm in diameter, using both analytical and voxelized representations of the cylinder. The simulation generates projection images of primary and scattered photons at a user-defined imaging plane. The authors also simulate dose scoring in the voxelized version of the phantom in both Epp and DOSXYZnrc and examine the accuracy of Epp using the Kawrakow-Fippel test. Results: The results of the imaging simulations with Epp using voxelized and analytical descriptions of the water cylinder agree within 1%. The results of the Kawrakow-Fippel test suggest good agreement between Epp and DOSXYZnrc. Conclusions: Epp provides the user with useful features, including the ability to build complex geometries from simpler ones and the ability to generate images of scattered and primary photons. There is no inherent computational time saving arising from Epp, except for those arising from egspp's ability to use analytical representations of simulation geometries. Epp agrees with DOSXYZnrc in dose calculation, since they are both based on the well-validated standard EGSnrc radiation transport physics model.
Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas
2014-11-01
In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM) with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW, however CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except for MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
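For reference, ion suppression figures such as those quoted above are usually derived from a post-extraction spiking comparison; the definition below is the commonly used convention, not necessarily the exact formula applied in this study:

```latex
\mathrm{ME}\,(\%) \;=\; \left( \frac{A_{\text{post-extraction spiked matrix}}}{A_{\text{neat standard}}} - 1 \right) \times 100
```

where A denotes the analyte peak area; negative values correspond to ion suppression, such as the >30% suppression reported for benzoylecgonine and several opioids at QC LOW.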
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.
Code of Federal Regulations, 2010 CFR
2010-04-01
... other than providing diagnostic information to patients and practitioners, e.g., forensic, academic... include the statement for class I exempt ASR's: “Analyte Specific Reagent. Analytical and performance... and performance characteristics are not established”; and (4) Shall not make any statement regarding...
betaFIT: A computer program to fit pointwise potentials to selected analytic functions
NASA Astrophysics Data System (ADS)
Le Roy, Robert J.; Pashov, Asen
2017-01-01
This paper describes program betaFIT, which performs least-squares fits of sets of one-dimensional (or radial) potential function values to four different types of sophisticated analytic potential energy functional forms. These families of potential energy functions are: the Expanded Morse Oscillator (EMO) potential [J Mol Spectrosc 1999;194:197], the Morse/Long-Range (MLR) potential [Mol Phys 2007;105:663], the Double Exponential/Long-Range (DELR) potential [J Chem Phys 2003;119:7398], and the "Generalized Potential Energy Function (GPEF)" form introduced by Šurkus et al. [Chem Phys Lett 1984;105:291], which includes a wide variety of polynomial potentials, such as the Dunham [Phys Rev 1932;41:713], Simons-Parr-Finlan [J Chem Phys 1973;59:3229], and Ogilvie-Tipping [Proc R Soc A 1991;378:287] polynomials, as special cases. This code will be useful for providing the realistic sets of potential function shape parameters that are required to initiate direct fits of selected analytic potential functions to experimental data, and for providing better analytical representations of sets of ab initio results.
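To make the fitting task concrete, the sketch below is a generic Python illustration rather than the betaFIT code itself; the data points and parameter values are invented. It performs a least-squares fit of pointwise potential values to the simplest of the listed forms, an EMO potential with a constant exponent coefficient (i.e., an ordinary Morse function):

```python
import numpy as np
from scipy.optimize import curve_fit

def emo_potential(r, De, re, beta):
    """EMO form with a constant exponent coefficient (ordinary Morse):
    V(r) = De * (1 - exp(-beta * (r - re)))**2."""
    return De * (1.0 - np.exp(-beta * (r - re)))**2

# Invented "ab initio" points (r in Angstrom, V in cm^-1), for illustration only
rng = np.random.default_rng(1)
r_pts = np.linspace(0.8, 3.0, 12)
v_pts = emo_potential(r_pts, De=40000.0, re=1.2, beta=1.8) + rng.normal(0.0, 50.0, r_pts.size)

# Least-squares fit starting from rough initial guesses
popt, pcov = curve_fit(emo_potential, r_pts, v_pts, p0=[30000.0, 1.0, 1.5])
De_fit, re_fit, beta_fit = popt
print(f"De = {De_fit:.1f} cm^-1, re = {re_fit:.3f} A, beta = {beta_fit:.3f} 1/A")
```

betaFIT generalizes this idea by expanding the exponent coefficient as a polynomial in a chosen radial variable and by supporting the MLR, DELR, and GPEF families, but the underlying least-squares structure is the same.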
Brasca, Milena; Morandi, Stefano; Silvetti, Tiziana; Rosi, Veronica; Cattaneo, Stefano; Pellegrino, Luisa
2013-05-21
Hen egg-white lysozyme (LSZ) is currently used in the food industry to limit the proliferation of lactic acid bacteria spoilage in the production of wine and beer, and to inhibit butyric acid fermentation in hard and extra hard cheeses (late blowing) caused by the outgrowth of clostridial spores. The aim of this work was to evaluate how the enzyme activity in commercial preparations correlates to the enzyme concentration and can be affected by the presence of process-related impurities. Different analytical approaches, including turbidimetric assay, SDS-PAGE and HPLC were used to analyse 17 commercial preparations of LSZ marketed in different countries. The HPLC method adopted by ISO allowed the true LSZ concentration to be determined with accuracy. The turbidimetric assay was the most suitable method to evaluate LSZ activity, whereas SDS-PAGE allowed the presence of other egg proteins, which are potential allergens, to be detected. The analytical results showed that the purity of commercially available enzyme preparations can vary significantly, and evidenced the effectiveness of combining different analytical approaches in this type of control.
Local Spatial Obesity Analysis and Estimation Using Online Social Network Sensors.
Sun, Qindong; Wang, Nan; Li, Shancang; Zhou, Hongyi
2018-03-15
Recently, online social networks (OSNs) have received considerable attention as a revolutionary platform that offers massive social interaction among users and enables them to be more involved in their own healthcare. OSNs have also promoted increasing interest in the generation of analytical data models in health informatics. This paper aims at developing an obesity identification, analysis, and estimation model, in which each individual user is regarded as an online social network 'sensor' that can provide valuable health information. The OSN-based obesity analytic model requires each sensor node in an OSN to provide associated features, including dietary habit, physical activity, integral/incidental emotions, and self-consciousness. Based on detailed measurements of the correlation between obesity and the proposed features, the OSN obesity analytic model is able to estimate the obesity rate in certain urban areas, and the experimental results demonstrate a high estimation success rate. The measurement and estimation findings produced by the proposed obesity analytic model show that online social networks can be used to analyze local spatial obesity problems effectively. Copyright © 2018. Published by Elsevier Inc.
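A minimal sketch of the kind of estimation step such a model implies is given below. It is entirely hypothetical: the feature names, the toy data, and the choice of logistic regression are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-user features extracted from OSN "sensor" activity:
# [dietary_habit_score, physical_activity_score, negative_emotion_rate, self_consciousness_score]
X = np.array([
    [0.8, 0.2, 0.6, 0.3],
    [0.3, 0.7, 0.2, 0.6],
    [0.9, 0.1, 0.7, 0.2],
    [0.4, 0.6, 0.3, 0.7],
    [0.7, 0.3, 0.5, 0.4],
    [0.2, 0.8, 0.1, 0.8],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = obese, 0 = not obese (labelled training users)

model = LogisticRegression().fit(X, y)

# Estimate an area's obesity rate as the mean predicted probability over its users
area_users = np.array([[0.6, 0.4, 0.5, 0.5],
                       [0.5, 0.5, 0.4, 0.6]])
print("Estimated area obesity rate:", model.predict_proba(area_users)[:, 1].mean())
```

Any real deployment would of course require far richer feature extraction from posts and a properly labelled cohort; the sketch only shows how per-user predictions can be aggregated into a local spatial estimate.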
NASA Technical Reports Server (NTRS)
Shevade, A. V.; Ryan, M. A.; Homer, M. L.; Manfreda, A. M.; Zhou, H.; Manatt, K.
2002-01-01
We report a molecular modeling study to investigate the polymer-carbon black (CB) composite-analyte interactions in resistive sensors. These sensors comprise the JPL Electronic Nose (ENose) sensing array developed for monitoring breathing air in human habitats. The polymer in the composite is modeled based on its stereoisomerism and sequence isomerism, while the CB is modeled as uncharged naphthalene rings (with no hydrogens). The Dreiding 2.21 force field is used for the polymer and solvent molecules, and graphite parameters are assigned to the carbon black atoms. A combination of molecular mechanics (MM) and molecular dynamics (NPT-MD and NVT-MD) techniques is used to obtain the equilibrium composite structure by inserting naphthalene rings in the polymer matrix. Polymers considered for this work include poly(4-vinylphenol), polyethylene oxide, and ethyl cellulose. Analytes studied are representative of both inorganic (ammonia) and organic (methanol, toluene, hydrazine) compounds. The results are analyzed for the composite microstructure by calculating the radial distribution profiles as well as for the sensor response by predicting the interaction energies of the analytes with the composites.
Molecular modeling of polymer composite-analyte interactions in electronic nose sensors
NASA Technical Reports Server (NTRS)
Shevade, A. V.; Ryan, M. A.; Homer, M. L.; Manfreda, A. M.; Zhou, H.; Manatt, K. S.
2003-01-01
We report a molecular modeling study to investigate the polymer-carbon black (CB) composite-analyte interactions in resistive sensors. These sensors comprise the JPL electronic nose (ENose) sensing array developed for monitoring breathing air in human habitats. The polymer in the composite is modeled based on its stereoisomerism and sequence isomerism, while the CB is modeled as uncharged naphthalene rings with no hydrogens. The Dreiding 2.21 force field is used for the polymer and solvent molecules, and graphite parameters are assigned to the carbon black atoms. A combination of molecular mechanics (MM) and molecular dynamics (NPT-MD and NVT-MD) techniques is used to obtain the equilibrium composite structure by inserting naphthalene rings in the polymer matrix. Polymers considered for this work include poly(4-vinylphenol), polyethylene oxide, and ethyl cellulose. Analytes studied are representative of both inorganic and organic compounds. The results are analyzed for the composite microstructure by calculating the radial distribution profiles as well as for the sensor response by predicting the interaction energies of the analytes with the composites. © 2003 Elsevier Science B.V. All rights reserved.
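As a rough reminder of how such interaction energies are typically evaluated in force-field studies (the abstract does not specify the authors' exact protocol, so this is the standard convention rather than their formula), the binding of an analyte to the composite is commonly computed as:

```latex
E_{\text{int}} \;=\; E_{\text{composite+analyte}} \;-\; \left( E_{\text{composite}} + E_{\text{analyte}} \right)
```

where each term is an energy-minimized (MM) or ensemble-averaged (MD) total force-field energy; a more negative value indicates stronger sorption of the analyte into the polymer-CB matrix and, by extension, a larger expected sensor response.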
Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine
2015-01-01
Background: In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods: Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results: With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion: A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allows for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. PMID:25767402
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm
2016-01-01
The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.
Dunand, Marielle; Donzelli, Massimiliano; Rickli, Anna; Hysek, Cédric M; Liechti, Matthias E; Grouzmann, Eric
2014-08-01
The diagnosis of pheochromocytoma relies on the plasma free metanephrines assay, whose reliability has been considerably improved by ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Here we report an analytical interference occurring between 4-hydroxy-3-methoxymethamphetamine (HMMA), a metabolite of 3,4-methylenedioxymethamphetamine (MDMA, "Ecstasy"), and normetanephrine (NMN), since they share a common pharmacophore resulting in the same product ion after fragmentation. Synthetic HMMA was spiked into plasma samples containing various concentrations of NMN, and the intensity of the interference was determined by UHPLC-MS/MS before and after improvement of the analytical method. Using a careful adjustment of chromatographic conditions, including a change of the UHPLC analytical column, we were able to distinguish both compounds. HMMA interference in NMN determination should be seriously considered, since MDMA activates the sympathetic nervous system and, if confounded with NMN, may lead to false-positive tests in the differential diagnosis of pheochromocytoma. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Buffer gas cooling and mixture analysis
Patterson, David S.; Doyle, John M.
2018-03-06
An apparatus for spectroscopy of a gas mixture is described. Such an apparatus includes a gas mixing system configured to mix a hot analyte gas that includes at least one analyte species in a gas phase into a cold buffer gas, thereby forming a supersaturated mixture to be provided for spectroscopic analysis.
Analysis of structural dynamic data from Skylab. Volume 2: Skylab analytical and test model data
NASA Technical Reports Server (NTRS)
Demchak, L.; Harcrow, H.
1976-01-01
The orbital configuration test modal data, analytical test correlation modal data, and analytical flight configuration modal data are presented. Tables showing the generalized mass contributions (GMCs) for each of the thirty test modes are given along with the two-dimensional mode shape plots and tables of GMCs for the test-correlated analytical modes. The two-dimensional mode shape plots for the analytical modes and uncoupled and coupled modes of the orbital flight configuration at three development phases of the model are included.
Lou, Ping; Lee, Jin Yong
2009-04-14
For a simple modified Poisson-Boltzmann (SMPB) theory, taking into account the finite ionic size, we have derived the exact analytic expression for the contact values of the difference profile of the counterion and co-ion, as well as of the sum (density) and product profiles, near a charged planar electrode that is immersed in a binary symmetric electrolyte. In the zero ionic size or dilute limit, these contact values reduce to the contact values of the Poisson-Boltzmann (PB) theory. The analytic results of the SMPB theory, for the difference, sum, and product profiles were compared with the results of the Monte-Carlo (MC) simulations [ Bhuiyan, L. B.; Outhwaite, C. W.; Henderson, D. J. Electroanal. Chem. 2007, 607, 54 ; Bhuiyan, L. B.; Henderson, D. J. Chem. Phys. 2008, 128, 117101 ], as well as of the PB theory. In general, the analytic expression of the SMPB theory gives better agreement with the MC data than the PB theory does. For the difference profile, as the electrode charge increases, the result of the PB theory departs from the MC data, but the SMPB theory still reproduces the MC data quite well, which indicates the importance of including steric effects in modeling diffuse layer properties. As for the product profile, (i) it drops to zero as the electrode charge approaches infinity; (ii) the speed of the drop increases with the ionic size, and these behaviors are in contrast with the predictions of the PB theory, where the product is identically 1.
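For orientation, the kind of contact-value relation being tested is exemplified by the classical planar result for the total (sum) density at a charged electrode; the expression below is the standard textbook form, not the specific SMPB expressions derived in the paper:

```latex
\sum_i \rho_i(\text{contact}) \;=\; \sum_i \rho_i(\text{bulk}) \;+\; \frac{\sigma^{2}}{2\,\varepsilon_r \varepsilon_0\, k_B T}
```

where σ is the electrode surface charge density. The SMPB analysis modifies relations of this type through the finite ionic size, which is why the product profile at high electrode charge departs from the PB prediction while still tracking the MC data.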
Physics of thermo-acoustic sound generation
NASA Astrophysics Data System (ADS)
Daschewski, M.; Boehm, R.; Prager, J.; Kreutzbruck, M.; Harrer, A.
2013-09-01
We present a generalized analytical model of thermo-acoustic sound generation based on the analysis of thermally induced energy density fluctuations and their propagation into the adjacent matter. The model provides exact analytical prediction of the sound pressure generated in fluids and solids; consequently, it can be applied to arbitrary thermal power sources such as thermophones, plasma firings, laser beams, and chemical reactions. Unlike existing approaches, our description also includes acoustic near-field effects and sound-field attenuation. Analytical results are compared with measurements of sound pressures generated by thermo-acoustic transducers in air for frequencies up to 1 MHz. The tested transducers consist of titanium and indium tin oxide coatings on quartz glass and polycarbonate substrates. The model reveals that thermo-acoustic efficiency increases linearly with the supplied thermal power and quadratically with thermal excitation frequency. Comparison of the efficiency of our thermo-acoustic transducers with those of piezoelectric-based airborne ultrasound transducers using impulse excitation showed comparable sound pressure values. The present results show that thermo-acoustic transducers can be applied as broadband, non-resonant, high-performance ultrasound sources.
Sánchez, C; Ortega, B; Wei, J L; Tang, J; Capmany, J
2013-03-25
We provide an analytical study of the propagation effects of a directly modulated OOFDM signal through a dispersive fiber and subsequent photo-detection. The analysis includes the effects of the laser operation point and the interplay between chromatic dispersion and laser chirp. The final expression allows one to understand the physics behind the transmission of a multi-carrier signal in the presence of residual frequency modulation, and the description of the induced intermodulation distortion gives a detailed insight into the different intermodulation products which impair the recovered signal at the receiver end. Numerical comparisons between transmission simulation results and those provided by evaluating the expression obtained are carried out for different laser operation points. Results obtained by changing the fiber length and laser parameters and by using single-mode fiber with negative and positive dispersion are calculated in order to demonstrate the validity and versatility of the theory provided in this paper. Therefore, a novel analytical formulation is presented as a versatile tool for the description and study of IM/DD OOFDM systems with variable design parameters.
Energy analysis in the elliptic restricted three-body problem
NASA Astrophysics Data System (ADS)
Qi, Yi; de Ruiter, Anton
2018-07-01
The gravity assist or flyby is investigated by analysing the inertial energy of a test particle in the elliptic restricted three-body problem (ERTBP), where two primary bodies are moving in elliptic orbits. First, the expression of the derivation of energy is obtained and discussed. Then, the approximate expressions of energy change in a circular neighbourhood of the smaller primary are derived. Numerical computation indicates that the obtained expressions can be applied to study the flyby problem of the nine planets and the Moon in the Solar system. Parameters related to the flyby are discussed analytically and numerically. The optimal conditions, including the position and time of the periapsis, for a flyby orbit are found to make a maximum energy gain or loss. Finally, the mechanical process of a flyby orbit is uncovered by an approximate expression in the ERTBP. Numerical computations testify that our analytical results well approximate the mechanical process of flyby orbits obtained by the numerical simulation in the ERTBP. Compared with the previous research established in the patched-conic method and numerical calculation, our analytical investigations based on a more elaborate derivation get more original results.
Energy Analysis in the Elliptic Restricted Three-body Problem
NASA Astrophysics Data System (ADS)
Qi, Yi; de Ruiter, Anton
2018-05-01
The gravity assist or flyby is investigated by analyzing the inertial energy of a test particle in the elliptic restricted three-body problem (ERTBP), where two primary bodies are moving in elliptic orbits. Firstly, the expression of the derivation of energy is obtained and discussed. Then, the approximate expressions of energy change in a circular neighborhood of the smaller primary are derived. Numerical computation indicates that the obtained expressions can be applied to study the flyby problem of the nine planets and the Moon in the solar system. Parameters related to the flyby are discussed analytically and numerically. The optimal conditions, including the position and time of the periapsis, for a flyby orbit are found to make a maximum energy gain or loss. Finally, the mechanical process of a flyby orbit is uncovered by an approximate expression in the ERTBP. Numerical computations testify that our analytical results well approximate the mechanical process of flyby orbits obtained by the numerical simulation in the ERTBP. Compared with the previous research established in the patched-conic method and numerical calculation, our analytical investigations based on a more elaborate derivation get more original results.
Perturbations of the Richardson number field by gravity waves
NASA Technical Reports Server (NTRS)
Wurtele, M. G.; Sharman, R. D.
1985-01-01
An analytic solution is presented for a stratified fluid of arbitrary constant Richardson number. By computer-aided analysis the perturbation fields, including that of the Richardson number, can be calculated. The results of the linear analytic model were compared with nonlinear simulations, leading to the following conclusions: (1) the perturbations in the Richardson number field, when small, are produced primarily by the perturbations of the shear; (2) perturbations in the Richardson number field, even when small, are not symmetric, the increase being significantly larger than the decrease (the linear analytic solution and the nonlinear simulations both confirm this result); (3) as the perturbations grow, this asymmetry increases, but more so in the nonlinear simulations than in the linear analysis; (4) for large perturbations of the shear flow, the static stability, as represented by N², is the dominating mechanism, becoming zero or negative, and producing convective overturning; and (5) the conventional measure of linearity in lee wave theory, NH/U, is no longer the critical parameter (it is suggested that (H/u₀)(du₀/dz) takes on this role in a shearing flow).
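For reference, the gradient Richardson number discussed above is conventionally defined from the buoyancy frequency and the vertical shear of the background flow; the standard form below is quoted as background rather than from the abstract itself:

```latex
\mathrm{Ri} \;=\; \frac{N^{2}}{\left( \partial U / \partial z \right)^{2}},
\qquad
N^{2} \;=\; \frac{g}{\theta_{0}}\,\frac{\partial \theta}{\partial z}
```

Wave-induced perturbations can therefore lower Ri either by reducing N² or by enhancing the shear, which is consistent with conclusions (1) and (4) above.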
Analytical solutions for efficient interpretation of single-well push-pull tracer tests
NASA Astrophysics Data System (ADS)
Huang, Junqi; Christ, John A.; Goltz, Mark N.
2010-08-01
Single-well push-pull tracer tests have been used to characterize the extent, fate, and transport of subsurface contamination. Analytical solutions provide one alternative for interpreting test results. In this work, an exact analytical solution to two-dimensional equations describing the governing processes acting on a dissolved compound during a modified push-pull test (advection, longitudinal and transverse dispersion, first-order decay, and rate-limited sorption/partitioning in steady, divergent, and convergent flow fields) is developed. The coupling of this solution with inverse modeling to estimate aquifer parameters provides an efficient methodology for subsurface characterization. Synthetic data for single-well push-pull tests are employed to demonstrate the utility of the solution for determining (1) estimates of aquifer longitudinal and transverse dispersivities, (2) sorption distribution coefficients and rate constants, and (3) non-aqueous phase liquid (NAPL) saturations. Employment of the solution to estimate NAPL saturations based on partitioning and non-partitioning tracers is designed to overcome limitations of previous efforts by including rate-limited mass transfer. This solution provides a new tool for use by practitioners when interpreting single-well push-pull test results.
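As a reminder of the governing processes listed above, a generic form of the transport equation with first-order decay and rate-limited (first-order kinetic) sorption is given below; this is the standard textbook statement, not the exact radial-coordinate equations solved in the paper:

```latex
\frac{\partial C}{\partial t} + \frac{\rho_b}{\theta}\frac{\partial S}{\partial t}
\;=\; \nabla \cdot \left( \mathbf{D}\,\nabla C \right) \;-\; \mathbf{v}\cdot\nabla C \;-\; \lambda C,
\qquad
\frac{\partial S}{\partial t} \;=\; \alpha \left( K_d\, C - S \right)
```

Here C is the aqueous concentration, S the sorbed concentration, ρ_b the bulk density, θ the porosity, λ the decay constant, α the mass-transfer rate coefficient, and K_d the distribution coefficient; the push-pull solution couples these processes with the steady divergent (injection) and convergent (extraction) flow fields around the well.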
Molecular motion in cell membranes: Analytic study of fence-hindered random walks
NASA Astrophysics Data System (ADS)
Kenkre, V. M.; Giuggioli, L.; Kalay, Z.
2008-05-01
A theoretical calculation is presented to describe the confined motion of transmembrane molecules in cell membranes. The study is analytic, based on Master equations for the probability of the molecules moving as random walkers, and leads to explicit usable solutions including expressions for the molecular mean square displacement and effective diffusion constants. One outcome is a detailed understanding of the dependence of the time variation of the mean square displacement on the initial placement of the molecule within the confined region. How to use the calculations is illustrated by extracting (confinement) compartment sizes from experimentally reported published observations from single particle tracking experiments on the diffusion of gold-tagged G-protein-coupled μ-opioid receptors in the normal rat kidney cell membrane, and by further comparing the analytical results to observations on the diffusion of phospholipids, also in normal rat kidney cells.
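As a minimal sketch of the kind of Master equation involved (the generic nearest-neighbour form, not the specific fence-hindered version derived in the paper), the occupation probability P_m(t) of lattice site m evolves as:

```latex
\frac{dP_m}{dt} \;=\; F \left[ P_{m+1}(t) + P_{m-1}(t) - 2\,P_m(t) \right]
```

with hopping rate F and lattice constant a. In the unconfined limit this yields the familiar mean square displacement ⟨x²(t)⟩ = 2Fa²t, whereas fences cause ⟨x²(t)⟩ to saturate at a value set by the compartment size, which is how compartment sizes can be extracted from single-particle tracking data.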
Ding, Chenghua; Qu, Kang; Li, Yongbo; Hu, Kai; Liu, Hongxia; Ye, Baoxian; Wu, Yangjie; Zhang, Shusheng
2007-11-02
Six calixarene-bonded silica gel stationary phases were prepared and characterized by elemental analysis, infrared spectroscopy and thermal analysis. Their chromatographic performance was investigated by using PAHs, aromatic positional isomers and E- and Z-ethyl 3-(4-acetylphenyl) acrylate isomers as probes. Separation mechanisms based on the different interactions between calixarenes and analytes were discussed. The chromatographic behaviors of those analytes on the calixarene columns were influenced by supramolecular interactions, including pi-pi interaction, steric hindrance and hydrogen bonding between calixarenes and analytes. Notably, the presence of polar groups (-OH, -NO2 and -NH2) in the aromatic isomers could improve their separation selectivity on calixarene phase columns. The results from quantum chemistry calculations using the DFT-B3LYP/STO-3G* basis set were consistent with the retention behaviors of PAHs on the calix[4]arene column.
Parametric Modeling of the Safety Effects of NextGen Terminal Maneuvering Area Conflict Scenarios
NASA Technical Reports Server (NTRS)
Rogers, William H.; Waldron, Timothy P.; Stroiney, Steven R.
2011-01-01
The goal of this work was to analytically identify and quantify the issues, challenges, technical hurdles, and pilot-vehicle interface issues associated with conflict detection and resolution (CD&R) in emerging operational concepts for a NextGen terminal maneuvering area, including surface operations. To this end, the work entailed analytical and trade studies focused on modeling the achievable safety benefits of different CD&R strategies and concepts in the current and future airport environment. In addition, crew-vehicle interface and pilot performance enhancements and potential issues were analyzed based on review of envisioned NextGen operations, expected equipage advances, and human factors expertise. The results of perturbation analysis, which quantify the high-level performance impact of changes to key parameters such as median response time and surveillance position error, show that the analytical model developed could be useful in making technology investment decisions.
Potvin, Christopher M; Zhou, Hongde
2011-11-01
The objective of this study was to demonstrate the complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP), including proteins, humics, carbohydrates, and polysaccharides, in activated sludge samples. Emphasis was placed on comparison of the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA provided a great improvement in compensating for SMP recovery and thus improving measurement accuracy by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed due to extraction techniques, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1 day. Copyright © 2011 Elsevier Ltd. All rights reserved.
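To illustrate the standard-addition calculation referred to above (a generic sketch rather than the authors' own procedure; the response values are invented), the unknown concentration is obtained by regressing the measured response against the spiked concentration and extrapolating to the concentration axis:

```python
import numpy as np

def standard_addition(spike_conc, response):
    """Estimate the analyte concentration already present in the (diluted) sample
    by the standard-addition method: fit response = slope*spike + intercept and
    take the magnitude of the x-intercept, i.e. intercept/slope."""
    slope, intercept = np.polyfit(spike_conc, response, 1)
    return intercept / slope

# Hypothetical instrument responses for 0, 5, 10 and 20 mg/L protein spikes
spikes = np.array([0.0, 5.0, 10.0, 20.0])      # mg/L added
signal = np.array([0.12, 0.20, 0.28, 0.44])    # response (arbitrary units)
print(f"Estimated SMP concentration: {standard_addition(spikes, signal):.2f} mg/L")
```

The value returned refers to the sample as measured; multiplying by the dilution factor recovers the concentration in the original activated sludge extract, which is why recovery in this approach is tied to the chosen dilution.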