Sample records for target analytes included

  1. 100-B/C Target Analyte List Development for Soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  2. Portable apparatus for separating sample and detecting target analytes

    DOEpatents

    Renzi, Ronald F.; Wally, Karl; Crocker, Robert W.; Stamps, James F.; Griffiths, Stewart K.; Fruetel, Julia A.; Horn, Brent A.; Shokair, Isaac R.; Yee, Daniel D.; VanderNoot, Victoria A.; Wiedenman, Boyd J.; West, Jason A. A.; Ferko, Scott M.

    2008-11-18

    Portable devices and methods for determining the presence of a target analyte using a portable device are provided. The portable device is preferably hand-held. A sample is injected into the portable device. A microfluidic separation is performed within the portable device and at least one separated component is detected by a detection module within the portable device, in embodiments of the invention. A target analyte is identified, based on the separated component, and the presence of the target analyte is indicated on an output interface of the portable device, in accordance with embodiments of the invention.

  3. Smartphone-based portable wireless optical system for the detection of target analytes.

    PubMed

    Gautam, Shreedhar; Batule, Bhagwan S; Kim, Hyo Yong; Park, Ki Soo; Park, Hyun Gyu

    2017-02-01

    Rapid and accurate on-site wireless measurement of hazardous molecules or biomarkers is one of the biggest challenges in nanobiotechnology. A novel smartphone-based Portable and Wireless Optical System (PAWS) for rapid, quantitative, on-site analysis of target analytes is described. As a proof of concept, we employed gold nanoparticles (GNP) and the enzyme horseradish peroxidase (HRP) to generate colorimetric signals in response to two model target molecules, melamine and hydrogen peroxide, respectively. The colorimetric signal produced in the presence of the target molecules is converted to an electrical signal by the inbuilt electronic circuit of the device. The converted electrical signal is then measured wirelessly via a multimeter by the smartphone, which processes the data and displays the results, including the analyte concentration and its significance. This handheld device has great potential as a programmable and miniaturized platform for rapid, on-site detection of various analytes in a point-of-care testing (POCT) manner.

  4. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, raising its concentration above the noise level so that the improved sensitivity can be exploited to detect the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated, through a quantitative method, to be approximately equal to the original LOD for each target. We also showed that TAD achieves LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
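
The spiking arithmetic behind TAD is that of classical standard addition. A minimal sketch in Python (hypothetical signal values, not data from the paper): fit signal versus spiked concentration with a line and read the endogenous concentration off the magnitude of the x-intercept.

```python
import numpy as np

def standard_addition_estimate(spiked_concs, signals):
    """Fit signal vs. spiked concentration with a line; the magnitude of
    the x-intercept of the fit estimates the endogenous concentration."""
    slope, intercept = np.polyfit(spiked_concs, signals, 1)
    return intercept / slope

# Hypothetical linear response: endogenous concentration 2.0 (arbitrary
# units), response slope 5.0 signal units per concentration unit.
spikes = np.array([0.0, 1.0, 2.0, 4.0])
signals = 5.0 * (spikes + 2.0)
print(round(standard_addition_estimate(spikes, signals), 3))  # → 2.0
```

In TAD the spike is additionally tuned to sit near the original LOD, which is what lifts the combined peak out of the chemical noise.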

  5. Analytical dose modeling for preclinical proton irradiation of millimetric targets.

    PubMed

    Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David

    2018-01-01

    Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated against experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three example treatment plans, using phantoms made of water targets and a bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment plan was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse ...
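
Analytical Bragg-peak formulations of this kind usually rest on a power-law range-energy relation. As a hedged illustration, the Bragg-Kleeman rule with commonly quoted fit constants for protons in water (generic textbook values, not the experimental range database the paper uses):

```python
# Bragg-Kleeman rule: R = alpha * E**p relates proton energy to range in water.
# alpha and p are commonly quoted fit values, NOT the paper's range database.
ALPHA = 0.0022  # cm / MeV**p
P = 1.77

def proton_range_cm(energy_mev):
    """Approximate range in water for a proton of the given energy."""
    return ALPHA * energy_mev ** P

def energy_for_range_mev(range_cm):
    """Invert the rule to choose a beam energy for a desired depth."""
    return (range_cm / ALPHA) ** (1.0 / P)

# A 24-MeV beam, as in the paper's example, has a range of roughly 6 mm
# in water, i.e. enough to cover a millimetric target.
print(round(proton_range_cm(24.0), 2))  # ≈ 0.6 cm
```

Misestimating the stopping power shifts this range directly, which is why the abstract stresses using the appropriate range database.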

  6. 100-F Target Analyte List Development for Soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  7. 100-K Target Analyte List Development for Soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  8. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive for analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates a technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each of which may be modularly formulated by differing departments and solved by modular analytical services. The result demonstrates that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.
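
ATC's coordination loop can be sketched on a toy problem. Below, two quadratic subproblems stand in for independently formulated department-level services (a minimal illustration, not the paper's formulation): each minimizes its local objective plus a quadratic penalty toward a shared target, and the top level updates the target from the subproblem responses.

```python
def atc_coordinate(w=50.0, iters=500):
    """Minimal ATC sketch: subproblem 1 minimizes (x-1)^2, subproblem 2
    minimizes (y-3)^2, and both are coupled through a shared variable.
    Each subproblem solves local_objective + w*(local - target)^2 in
    closed form; the top level sets the target to the mean response."""
    t = 0.0
    for _ in range(iters):
        x = (1.0 + w * t) / (1.0 + w)  # argmin of (x-1)^2 + w*(x-t)^2
        y = (3.0 + w * t) / (1.0 + w)  # argmin of (y-3)^2 + w*(y-t)^2
        t = 0.5 * (x + y)              # top-level target update
    return x, y, t

# The coupled system optimum is x = y = 2; with a large penalty weight the
# coordinated subproblems converge near it without ever being solved jointly.
x, y, t = atc_coordinate()
print(round(t, 3))
```

Larger penalty weights tighten the consensus gap between the subproblem responses, at the cost of slower coordination, which mirrors the scaling trade-off the abstract describes.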

  9. Universal surface-enhanced Raman scattering amplification detector for ultrasensitive detection of multiple target analytes.

    PubMed

    Zheng, Jing; Hu, Yaping; Bai, Junhui; Ma, Cheng; Li, Jishan; Li, Yinhui; Shi, Muling; Tan, Weihong; Yang, Ronghua

    2014-02-18

    Up to now, the successful fabrication of efficient hot-spot substrates for surface-enhanced Raman scattering (SERS) remains an unsolved problem. To address this issue, we describe herein a universal aptamer-based SERS biodetection approach that uses a single-stranded DNA as a universal trigger (UT) to induce SERS-active hot-spot formation, allowing, in turn, detection of a broad range of targets. More specifically, interaction between the aptamer probe and its target perturbs a triple-helix aptamer/UT structure in a manner that activates a hybridization chain reaction (HCR) among three short DNA building blocks that self-assemble into a long DNA polymer. The SERS-active hot-spots are formed by conjugating 4-aminobenzenethiol (4-ABT)-encoded gold nanoparticles with the DNA polymer through a specific Au-S bond. As proof-of-principle, we used this approach to quantify multiple target analytes, including thrombin, adenosine, and CEM cancer cells, achieving lowest limit of detection values of 18 pM, 1.5 nM, and 10 cells/mL, respectively. As a universal SERS detector, this prototype can be applied to many other target analytes through the use of suitable DNA-functional partners, thus inspiring new designs and applications of SERS for bioanalysis.

  10. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, raising its concentration above the noise level so that the improved sensitivity can be exploited to detect the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated, through a quantitative method, to be approximately equal to the original LOD for each target. We also showed that TAD achieves LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.

  11. Analytical model for release calculations in solid thin-foils ISOL targets

    NASA Astrophysics Data System (ADS)

    Egoriti, L.; Boeckx, S.; Ghys, L.; Houngbo, D.; Popescu, L.

    2016-10-01

    A detailed analytical model has been developed to simulate isotope-release curves from thin-foil ISOL targets. It involves separate modeling of diffusion and effusion inside the target. The former has been modeled using both Fick's first and second laws. The latter, effusion from the surface of the target material to the end of the ionizer, was simulated with the Monte Carlo code MolFlow+. The calculated delay-time distribution for this process was then fitted using a double-exponential function. The release curve obtained from the convolution of diffusion and effusion shows good agreement with experimental data from two different target geometries used at ISOLDE. Moreover, the experimental yields are well reproduced when combining the release fraction with calculated in-target production.
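
The convolution step described above can be sketched numerically. Assuming, for illustration only, a single-exponential diffusion release and a double-exponential effusion delay-time distribution (all rate constants below are hypothetical, not fitted values from the paper):

```python
import numpy as np

def release_curve(t, lam_d=1.0, a1=0.7, lam1=5.0, lam2=0.5):
    """Discrete convolution, on the uniform time grid t, of a
    single-exponential diffusion release lam_d*exp(-lam_d*t) with a
    double-exponential effusion delay-time distribution
    a1*lam1*exp(-lam1*t) + (1-a1)*lam2*exp(-lam2*t)."""
    dt = t[1] - t[0]
    diffusion = lam_d * np.exp(-lam_d * t)
    effusion = a1 * lam1 * np.exp(-lam1 * t) + (1.0 - a1) * lam2 * np.exp(-lam2 * t)
    return np.convolve(diffusion, effusion)[: len(t)] * dt

# Both inputs are normalized densities, so on a long enough grid the
# released fraction integrates to ~1, and the curve peaks at t > 0.
t = np.linspace(0.0, 50.0, 5001)
curve = release_curve(t)
```

Multiplying such a release fraction by a calculated in-target production rate then yields the observable isotope yield, as in the abstract.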

  12. 100-N Area Decision Unit Target Analyte List Development for Soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-N Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  13. Target-responsive DNA hydrogel mediated "stop-flow" microfluidic paper-based analytic device for rapid, portable and visual detection of multiple targets.

    PubMed

    Wei, Xiaofeng; Tian, Tian; Jia, Shasha; Zhu, Zhi; Ma, Yanli; Sun, Jianjun; Lin, Zhenyu; Yang, Chaoyong James

    2015-04-21

    A versatile point-of-care assay platform was developed for simultaneous detection of multiple targets based on a microfluidic paper-based analytic device (μPAD) using a target-responsive hydrogel to mediate fluidic flow and signal readout. An aptamer-cross-linked hydrogel was used as a target-responsive flow regulator in the μPAD. In the absence of a target, the hydrogel is formed in the flow channel, stopping the flow in the μPAD and preventing the colored indicator from traveling to the final observation spot, thus yielding a "signal off" readout. In contrast, in the presence of a target, no hydrogel is formed because of the preferential interaction of target and aptamer. This allows free fluidic flow in the μPAD, carrying the indicator to the observation spot and producing a "signal on" readout. The device is inexpensive to fabricate, easy to use, and disposable after detection. Testing results can be obtained within 6 min by the naked eye via a simple loading operation without the need for any auxiliary equipment. Multiple targets, including cocaine, adenosine, and Pb(2+), can be detected simultaneously, even in complex biological matrices such as urine. The reported method offers simple, low cost, rapid, user-friendly, point-of-care testing, which will be useful in many applications.

  14. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of deviations in the orbital elements at periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration; all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at four distinct design options for a given opportunity.

  15. An Analytic Model for the Success Rate of a Robotic Actuator System in Hitting Random Targets.

    PubMed

    Bradley, Stuart

    2015-11-20

    Autonomous robotic systems are increasingly being used in a wide range of applications such as precision agriculture, medicine, and the military. These systems have common features, which often include an action by an "actuator" interacting with a target. While simulations and measurements exist for the success rate of hitting targets by some systems, there is a dearth of analytic models which can give insight into, and guidance on the optimization of, new robotic systems. The present paper develops a simple model for estimating the success rate for hitting random targets from a moving platform. The model has two main dimensionless parameters: the ratio of actuator spacing to target diameter, and the ratio of platform distance moved (between actuator "firings") to the target diameter. It is found that regions of parameter space having a specified high success rate are described by simple equations, providing guidance on design. The role of a "cost function" is introduced which, when minimized, optimizes design, operating, and risk mitigation costs.
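
The model's two dimensionless parameters lend themselves to a quick Monte Carlo check. The sketch below is a simplified stand-in for the analytic model (my own geometric assumptions, not the paper's): it estimates the probability that a unit-diameter circular target contains at least one firing point of a rectangular lattice with lateral actuator spacing s and along-track firing step m, both expressed in target diameters.

```python
import random

def hit_probability(spacing_ratio, step_ratio, n=20000, seed=1):
    """Monte Carlo estimate: place the target center uniformly in one
    lattice cell (s x m, in units of the target diameter) and count how
    often the nearest firing point falls inside the unit-diameter disc."""
    rng = random.Random(seed)
    s, m, r = spacing_ratio, step_ratio, 0.5  # r = target radius
    hits = 0
    for _ in range(n):
        cx, cy = rng.uniform(0.0, s), rng.uniform(0.0, m)
        dx = min(cx, s - cx)  # lateral distance to nearest actuator line
        dy = min(cy, m - cy)  # along-track distance to nearest firing
        if dx * dx + dy * dy <= r * r:
            hits += 1
    return hits / n

# Dense firing grid (half a diameter each way) guarantees a hit; a sparse
# 2x2-diameter grid succeeds only when the disc happens to cover a point.
print(hit_probability(0.5, 0.5), hit_probability(2.0, 2.0))
```

For sparse grids the estimate approaches the disc-area fraction pi*r^2/(s*m), which is the kind of closed-form region boundary the analytic model exploits.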

  16. Analytical calculation of proton linear energy transfer in voxelized geometries including secondary protons.

    PubMed

    Sanchez-Parcerisa, D; Cortés-Giraldo, M A; Dolney, D; Kondrla, M; Fager, M; Carabe, A

    2016-02-21

    In order to integrate radiobiological modelling with clinical treatment planning for proton radiotherapy, we extended our in-house treatment planning system FoCa with a 3D analytical algorithm to calculate linear energy transfer (LET) in voxelized patient geometries. Both active scanning and passive scattering delivery modalities are supported. The analytical calculation is much faster than the Monte-Carlo (MC) method and it can be implemented in the inverse treatment planning optimization suite, allowing us to create LET-based objectives in inverse planning. The LET was calculated by combining a 1D analytical approach including a novel correction for secondary protons with pencil-beam type LET-kernels. Then, these LET kernels were inserted into the proton-convolution-superposition algorithm in FoCa. The analytical LET distributions were benchmarked against MC simulations carried out in Geant4. A cohort of simple phantom and patient plans representing a wide variety of sites (prostate, lung, brain, head and neck) was selected. The calculation algorithm was able to reproduce the MC LET to within 6% (1 standard deviation) for low-LET areas (under 1.7 keV μm(-1)) and within 22% for the high-LET areas above that threshold. The dose and LET distributions can be further extended, using radiobiological models, to include radiobiological effectiveness (RBE) calculations in the treatment planning system. This implementation also allows for radiobiological optimization of treatments by including RBE-weighted dose constraints in the inverse treatment planning process.
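
The quantity being benchmarked above is the voxel-wise dose-averaged LET. Its standard definition is a dose-weighted mean over the contributions to a voxel, which the paper's pencil-beam kernels evaluate analytically rather than by Monte Carlo scoring:

```python
def dose_averaged_let(doses, lets):
    """Dose-averaged LET of a voxel: sum(d_i * L_i) / sum(d_i), where
    d_i and L_i are the dose and LET contributions of the i-th pencil
    beam (or track segment) to that voxel."""
    return sum(d * l for d, l in zip(doses, lets)) / sum(doses)

# Two equal dose contributions at 2 and 4 keV/um average to 3 keV/um;
# weighting by dose pulls the average toward the higher-dose component.
print(dose_averaged_let([1.0, 1.0], [2.0, 4.0]))  # → 3.0
```

An RBE model can then map the per-voxel (dose, LET) pair to an RBE-weighted dose, enabling the LET-based planning objectives the abstract mentions.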

  17. Analytical calculation of proton linear energy transfer in voxelized geometries including secondary protons

    NASA Astrophysics Data System (ADS)

    Sanchez-Parcerisa, D.; Cortés-Giraldo, M. A.; Dolney, D.; Kondrla, M.; Fager, M.; Carabe, A.

    2016-02-01

    In order to integrate radiobiological modelling with clinical treatment planning for proton radiotherapy, we extended our in-house treatment planning system FoCa with a 3D analytical algorithm to calculate linear energy transfer (LET) in voxelized patient geometries. Both active scanning and passive scattering delivery modalities are supported. The analytical calculation is much faster than the Monte-Carlo (MC) method and it can be implemented in the inverse treatment planning optimization suite, allowing us to create LET-based objectives in inverse planning. The LET was calculated by combining a 1D analytical approach including a novel correction for secondary protons with pencil-beam type LET-kernels. Then, these LET kernels were inserted into the proton-convolution-superposition algorithm in FoCa. The analytical LET distributions were benchmarked against MC simulations carried out in Geant4. A cohort of simple phantom and patient plans representing a wide variety of sites (prostate, lung, brain, head and neck) was selected. The calculation algorithm was able to reproduce the MC LET to within 6% (1 standard deviation) for low-LET areas (under 1.7 keV μm-1) and within 22% for the high-LET areas above that threshold. The dose and LET distributions can be further extended, using radiobiological models, to include radiobiological effectiveness (RBE) calculations in the treatment planning system. This implementation also allows for radiobiological optimization of treatments by including RBE-weighted dose constraints in the inverse treatment planning process.

  18. Sigma metrics used to assess analytical quality of clinical chemistry assays: importance of the allowable total error (TEa) target.

    PubMed

    Hens, Koen; Berth, Mario; Armbruster, Dave; Westgard, Sten

    2014-07-01

    Six Sigma metrics were used to assess the analytical quality of automated clinical chemistry and immunoassay tests in a large Belgian clinical laboratory and to explore the importance of the source used for estimation of the allowable total error. Clinical laboratories are continually challenged to maintain analytical quality. However, it is difficult to measure assay quality objectively and quantitatively. The Sigma metric is a single number that estimates quality based on the traditional parameters used in the clinical laboratory: allowable total error (TEa), precision, and bias. In this study, Sigma metrics were calculated for 41 clinical chemistry assays for serum and urine on five ARCHITECT c16000 chemistry analyzers. Controls at two analyte concentrations were tested and Sigma metrics were calculated using three different TEa targets (Ricos biological variability, CLIA, and RiliBÄK). Sigma metrics varied with analyte concentration, with the TEa target, and among analyzers. Sigma values identified those assays that are analytically robust and require minimal quality control rules and those that exhibit more variability and require more complex rules. Analyzer-to-analyzer variability was assessed on the basis of Sigma metrics. Six Sigma is a more efficient way to control quality, but the lack of TEa targets for many analytes and the sometimes inconsistent TEa targets from different sources are important variables for the interpretation and the application of Sigma metrics in a routine clinical laboratory. Sigma metrics are a valuable means of comparing the analytical quality of two or more analyzers to ensure the comparability of patient test results.
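
The Sigma metric combines the three traditional parameters in a single number, Sigma = (TEa - |bias|) / CV, all on the percent scale, which is why the choice of TEa source moves the result directly. A minimal sketch with hypothetical assay figures:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Six Sigma metric on the percent scale: (TEa - |bias|) / CV."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay with 1% bias and 1.5% CV: a 10% TEa target yields
# 6-sigma performance, while a tighter 7% TEa drops the same assay to 4.
print(sigma_metric(10.0, 1.0, 1.5))  # → 6.0
print(sigma_metric(7.0, 1.0, 1.5))   # → 4.0
```

The same precision and bias can therefore look "world class" under one TEa source and merely adequate under another, the interpretation caveat the abstract raises.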

  19. 75 FR 57016 - Maple Analytics, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER10-2541-000] Maple Analytics, LLC; Supplemental Notice That Initial Market- Based Rate Filing Includes Request for Blanket... proceeding of Maple Analytics, LLC's application for market-based rate authority, with an accompanying rate...

  20. An analytical approach of thermodynamic behavior in a gas target system on a medical cyclotron.

    PubMed

    Jahangiri, Pouyan; Zacchia, Nicholas A; Buckley, Ken; Bénard, François; Schaffer, Paul; Martinez, D Mark; Hoehr, Cornelia

    2016-01-01

    An analytical model has been developed to study the thermo-mechanical behavior of gas targets used to produce medical isotopes, assuming that the system reaches steady-state. It is based on an integral analysis of the mass and energy balance of the gas-target system, the ideal gas law, and the deformation of the foil. The heat transfer coefficients for different target bodies and gases have been calculated. Excellent agreement is observed between experiments performed at TRIUMF's 13 MeV cyclotron and the model.
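
The steady-state balance described above can be sketched for a sealed, fixed-volume target. Assuming, for illustration only, that all deposited beam power leaves through a lumped heat-transfer coefficient hA and that the gas is ideal (the paper's model additionally accounts for foil deformation, which this sketch omits):

```python
def steady_state_pressure(p0_kpa, t0_k, beam_power_w, hA_w_per_k):
    """Sealed ideal-gas target at fixed volume: the energy balance
    beam_power = hA * (T - T0) sets the steady-state gas temperature T,
    and the ideal gas law then gives P / P0 = T / T0."""
    t = t0_k + beam_power_w / hA_w_per_k
    return p0_kpa * t / t0_k

# Hypothetical numbers: 100 kPa fill at 300 K, 60 W deposited, hA = 0.2 W/K
# -> the gas reaches 600 K and the pressure doubles to 200 kPa.
print(steady_state_pressure(100.0, 300.0, 60.0, 0.2))  # → 200.0
```

The calculated heat-transfer coefficients for different target bodies and gases play the role of hA here, linking beam current to the pressure the foil must withstand.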

  1. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the ratio of the target analyte peak area to the peak area of a corresponding stable-isotope-labelled internal standard compound [direct isotope dilution analysis (DIDA)] may not be inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed by LC-MS/MS in a comparative validation protocol with cortisol as an exemplary analyte. Accuracy and reproducibility were compared between quantification involving a six-point external calibration function and result calculation based merely on the peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series, is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory, where short turnaround times often have high priority.
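
The DIDA calculation itself is a one-line ratio. A minimal sketch with hypothetical peak areas; it assumes an equimolar area-response factor of 1 between labelled and unlabelled analyte, which in practice would be verified during validation:

```python
def dida_concentration(area_analyte, area_is, conc_is, response_factor=1.0):
    """Direct isotope dilution analysis: analyte concentration from the
    analyte / internal-standard peak-area ratio times the known IS
    concentration, optionally corrected by an empirical response factor."""
    return (area_analyte / area_is) * conc_is / response_factor

# Hypothetical run: analyte peak area 2.0e5, labelled-IS peak area 1.0e5,
# IS spiked at 50 nmol/L  ->  analyte at 100 nmol/L.
print(dida_concentration(2.0e5, 1.0e5, 50.0))  # → 100.0
```

Because the IS is spiked at a known concentration into every sample, this replaces the six-point external calibration curve with a single per-sample ratio.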

  2. Multi-analyte validation in heterogeneous solution by ELISA.

    PubMed

    Lakshmipriya, Thangavel; Gopinath, Subash C B; Hashim, Uda; Murugaiyah, Vikneswaran

    2017-12-01

    Enzyme Linked Immunosorbent Assay (ELISA) is a standard assay that has been used widely to validate the presence of an analyte in solution. As ELISA has advanced, different strategies have been demonstrated, making it a suitable immunoassay for a wide range of analytes. Herein, we attempted to provide additional evidence with ELISA to show its suitability for multi-analyte detection. To demonstrate this, three clinically relevant targets were chosen: the 16 kDa protein from Mycobacterium tuberculosis, human blood-clotting Factor IXa, and the tumour marker Squamous Cell Carcinoma antigen. We adapted the routine steps of conventional ELISA to validate the occurrence of the analytes in both homogeneous and heterogeneous solutions. With the homogeneous and heterogeneous solutions, we attained sensitivities of 2, 8, and 1 nM for the 16 kDa protein, FIXa, and SCC antigen, respectively. Further, the specific multi-analyte validations were evidenced with similar sensitivities in the presence of human serum. The ELISA in this study has proven its applicability for genuine multi-target validation in heterogeneous solution and can be followed for other target validations.

  3. Analytic model of a laser-accelerated composite plasma target and its stability

    NASA Astrophysics Data System (ADS)

    Khudik, Vladimir; Shvets, Gennady

    2013-10-01

    A self-consistent analytical model of monoenergetic acceleration of one- and two-species ultrathin targets irradiated by a circularly polarized laser pulse is developed. In the accelerated reference frame, the bulk plasma in the target is neutral and its parameters are assumed to be stationary. It is found that the structure of the target depends strongly on the temperatures of electrons and ions, which are both strongly influenced by the laser pulse pedestal. When the electron temperature is large, the hot electrons bounce back and forth inside the potential well formed by the ponderomotive and electrostatic potentials, while the heavy and light ions are force-balanced by the electrostatic and non-inertial fields, forming two separate layers. In the opposite limiting case, when the ion temperature is large, the hot ions are trapped in the potential well formed by the ion sheath's electric and non-inertial potentials, while the cold electrons are force-balanced by the electrostatic and ponderomotive fields. Using PIC simulations, we have determined which scenario is realized in practice depending on the initial target structure and laser intensity. Target stability with respect to the Rayleigh-Taylor instability will also be discussed. This work is supported by the US DOE grants DE-FG02-04ER41321 and DE-FG02-07ER54945.

  4. Optical measurements and analytical modeling of magnetic field generated in a dielectric target

    NASA Astrophysics Data System (ADS)

    Bai, Yafeng; Zhou, Shiyi; Zeng, Yushan; Liang, Yihan; Qi, Rong; Li, Wentao; Tian, Ye; Li, Xiaoya; Liu, Jiansheng

    2018-01-01

    Polarization rotation of a probe pulse by the target is observed with the Faraday rotation method in the interaction of an intense laser pulse with a solid target. The rotation of the polarization plane of the probe pulse may result from a combined action of fused silica and diffused electrons. After the irradiation of the main pulse, the rotation angle changed significantly and lasted ∼2 ps. These phenomena may imply a persistent magnetic field inside the target. An analytical model is developed to explain the experimental observation. The model indicates that a strong toroidal magnetic field is induced by an energetic electron beam. Meanwhile, an ionization channel is observed in the shadowgraph and extends at the speed of light after the irradiation of the main beam. The formation of this ionization channel is complex, and a simple explanation is given.

  5. Method and apparatus for optimized sampling of volatilizable target substances

    DOEpatents

    Lindgren, Eric R.; Phelan, James M.

    2004-10-12

    An apparatus for capturing target analytes from gases such as soil gas. Target analytes may include emanations from explosive materials or from residues of explosive materials. The apparatus employs principles of sorption common to solid-phase microextraction and is best used in conjunction with an analysis means such as a gas chromatograph. The apparatus captures target analytes using various sorptive structures. Depending upon the embodiment, those structures may include a capillary tube with an interior surface on which sorptive material (similar to that on the surface of an SPME fiber) is supported, along with means for moving gases through the capillary tube so that the gases come into close proximity to the sorptive material. In one disclosed embodiment, at least one such sorptive structure is associated with an enclosure including an opening in communication with the surface of a soil region potentially contaminated with buried explosive material such as unexploded ordnance. Emanations from explosive materials can pass into and accumulate in the enclosure, where they are sorbed by the sorptive structures. Also disclosed is the use of heating means, such as microwave horns, to drive target analytes into the soil gas from solid- and liquid-phase components of the soil.

  6. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR POLAR TARGET ANALYTES (SOP-5.26)

    EPA Science Inventory

    This SOP describes the method used for preparing surrogate recovery standard and internal standard solutions for the analysis of polar target analytes. It also describes the method for preparing calibration standard solutions for polar analytes used for gas chromatography/mass sp...

  7. Ultrasensitive detection of target analyte-induced aggregation of gold nanoparticles using laser-induced nanoparticle Rayleigh scattering.

    PubMed

    Lin, Jia-Hui; Tseng, Wei-Lung

    2015-01-01

    Detection of salt- and analyte-induced aggregation of gold nanoparticles (AuNPs) mostly relies on costly and bulky analytical instruments. To address this drawback, a portable, miniaturized, sensitive, and cost-effective detection technique is urgently required for rapid field detection and monitoring of target analytes via AuNP-based sensors. This study combined a miniaturized spectrometer with a 532-nm laser to develop a laser-induced Rayleigh scattering technique, allowing the sensitive and selective detection of Rayleigh scattering from aggregated AuNPs. Three AuNP-based sensing systems, including salt-, thiol-, and metal ion-induced aggregation of the AuNPs, were used to examine the sensitivity of the laser-induced Rayleigh scattering technique. Salt-, thiol-, and metal ion-promoted NP aggregation were exemplified by the use of aptamer-adsorbed, fluorosurfactant-stabilized, and gallic acid-capped AuNPs for probing K(+), S-adenosylhomocysteine hydrolase-induced hydrolysis of S-adenosylhomocysteine, and Pb(2+), respectively. Compared to reported methods for monitoring aggregated AuNPs, the proposed system provided distinct sensitivity advantages. The laser-induced Rayleigh scattering technique was made more convenient, inexpensive, and portable by replacing the diode laser and miniaturized spectrometer with a laser pointer and a smartphone. Using this smartphone-based detection platform, one can determine whether the Pb(2+) concentration exceeds the maximum allowable level of Pb(2+) in drinking water. Copyright © 2014 Elsevier B.V. All rights reserved.
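    A quantitative claim like "determine whether the Pb(2+) concentration exceeds the maximum allowable level" rests on a calibration curve plus a detection limit. A minimal sketch of that generic workflow in Python; the data points are invented for illustration, and the 3-sigma rule is the common IUPAC convention, not a figure taken from this study:

```python
def fit_line(x, y):
    # Ordinary least-squares fit of y = slope*x + intercept,
    # e.g. scattering intensity vs. analyte concentration
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def limit_of_detection(sd_blank, slope):
    # IUPAC 3-sigma convention: LOD = 3 * (blank std. deviation) / slope
    return 3 * sd_blank / slope

# Hypothetical calibration points: concentration vs. scattering signal
conc = [0.0, 1.0, 2.0, 3.0]
signal = [0.1, 1.1, 2.1, 3.1]
slope, intercept = fit_line(conc, signal)
lod = limit_of_detection(0.05, slope)   # 0.05 = assumed blank noise
```

    A sample reading above the signal predicted for the regulatory threshold concentration would then flag the water sample; the study's own sensitivity figures come from its own calibration data.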

  8. Analytical model for investigation of interior noise characteristics in aircraft with multiple propellers including synchrophasing

    NASA Technical Reports Server (NTRS)

    Fuller, C. R.

    1986-01-01

    A simplified analytical model of transmission of noise into the interior of propeller-driven aircraft has been developed. The analysis includes directivity and relative phase effects of the propeller noise sources, and leads to a closed form solution for the coupled motion between the interior and exterior fields via the shell (fuselage) vibrational response. Various situations commonly encountered in considering sound transmission into aircraft fuselages are investigated analytically and the results obtained are compared to measurements in real aircraft. In general the model has proved successful in identifying basic mechanisms behind noise transmission phenomena.

  9. Challenges of Sustaining the International Space Station through 2020 and Beyond: Including Epistemic Uncertainty in Reassessing Confidence Targets

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Carter-Journet, Katrina; Box, Neil; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    This paper introduces an analytical approach, Probability and Confidence Trade-space (PACT), which can be used to assess uncertainty in the International Space Station (ISS) hardware sparing necessary to extend the life of the vehicle. There are several key areas under consideration in this research. We investigate what sparing confidence targets may be reasonable to ensure vehicle survivability and completion of science on the ISS. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. An ongoing annual analysis currently compares the probability of existing spares exceeding the total expected unit demand of each Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where a functional hierarchy's availability does not meet subsystem confidence targets, the current sparing analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty (uncertainty in the knowledge of the true hardware failure rate), which does not currently factor into the annual sparing analysis. The existing confidence targets may be conservative. This paper will also discuss how confidence targets may be relaxed based on the inclusion of epistemic uncertainty for each ORU. The paper will conclude with strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
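    The effect of epistemic uncertainty on sparing confidence can be illustrated with a textbook Poisson-gamma model: with a known failure rate, demand for spares is Poisson; if the rate itself is gamma-distributed (epistemic uncertainty), marginal demand becomes negative binomial, and the same spares level covers demand with lower confidence. This is a hedged sketch of the general idea, not the PACT methodology, and all numbers are illustrative:

```python
import math

def poisson_cdf(s, lam):
    # P(demand <= s spares) when the failure rate lam is known exactly
    return sum(math.exp(-lam) * lam ** n / math.factorial(n)
               for n in range(s + 1))

def neg_binomial_cdf(s, k, theta):
    # Same probability when lam ~ Gamma(shape=k, scale=theta); marginalizing
    # over the uncertain rate gives negative-binomial demand
    p = 1.0 / (1.0 + theta)
    return sum(math.comb(n + k - 1, n) * (1 - p) ** n * p ** k
               for n in range(s + 1))

# Illustrative case: 3 spares on hand against a mean demand of 2 units
known = poisson_cdf(3, 2.0)              # rate treated as certain
uncertain = neg_binomial_cdf(3, 2, 1.0)  # Gamma(2, 1): same mean, extra spread
```

    Here the same mean demand gives about 86% confidence with a known rate but about 81% once rate uncertainty is folded in, which is the mechanism by which fixed confidence targets can be mis-calibrated when epistemic uncertainty is ignored.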

  10. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges, including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states, and the absence of established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  11. Applications of reversible covalent chemistry in analytical sample preparation.

    PubMed

    Siegel, David

    2012-12-07

    Reversible covalent chemistry (RCC) adds another dimension to commonly used sample preparation techniques like solid-phase extraction (SPE), solid-phase microextraction (SPME), molecular imprinted polymers (MIPs) or immuno-affinity cleanup (IAC): chemical selectivity. By selecting analytes according to their covalent reactivity, sample complexity can be reduced significantly, resulting in enhanced analytical performance for low-abundance target analytes. This review gives a comprehensive overview of the applications of RCC in analytical sample preparation. The major reactions covered include reversible boronic ester formation, thiol-disulfide exchange and reversible hydrazone formation, targeting analyte groups like diols (sugars, glycoproteins and glycopeptides, catechols), thiols (cysteinyl-proteins and cysteinyl-peptides) and carbonyls (carbonylated proteins, mycotoxins). Their applications range from low abundance proteomics to reversible protein/peptide labelling to antibody chromatography to quantitative and qualitative food analysis. In discussing the potential of RCC, a special focus is on the conditions and restrictions of the utilized reaction chemistry.

  12. Method and apparatus for optimized sampling of volatilizable target substances

    DOEpatents

    Lindgren, Eric R.; Phelan, James M.

    2002-01-01

    An apparatus for capturing target analytes from gases such as soil gas. Target analytes may include emanations from explosive materials or from residues of explosive materials. The apparatus employs principles of sorption common to solid phase microextraction (SPME) and is best used in conjunction with an analysis means such as a gas chromatograph. To sorb target analytes, the apparatus uses various sorptive structures. Depending upon the embodiment, those structures may include 1) a conventional SPME fiber, 2) a SPME fiber suspended in a capillary tube (with means provided for moving gases through the capillary tube so that the gases come into close proximity to the suspended fiber), and 3) a capillary tube having an interior surface on which sorptive material (similar to that on the surface of a SPME fiber) is supported (along with means for moving gases through the capillary tube so that the gases come into close proximity to the sorptive material). In one disclosed embodiment, at least one such sorptive structure is associated with an enclosure having an opening in communication with the surface of a soil region potentially contaminated with buried explosive material such as unexploded ordnance. Emanations from explosive materials can pass into and accumulate in the enclosure, where they are sorbed by the sorptive structures. Also disclosed is the use of heating means, such as microwave horns, to drive target analytes into the soil gas from solid- and liquid-phase components of the soil.

  13. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed, as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps, such as identifying article relevance to biosurveillance (e.g., a relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
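    The relevance step can be approximated by any text-similarity score. A minimal sketch scoring articles against a biosurveillance seed query with TF-IDF cosine similarity, in pure Python; this is illustrative only, as the chapter's actual pipeline relies on NLP, semantic concept matching, and machine learning:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    # docs: list of token lists -> one {term: tf-idf weight} dict per doc
    df = Counter(t for d in docs for t in set(d))
    n = len(docs)
    idf = {t: math.log(n / c) + 1 for t, c in df.items()}
    return [{t: c * idf[t] for t, c in Counter(d).items()} for d in docs]

def cosine(u, v):
    # Cosine similarity between two sparse term-weight dicts
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy corpus: two news snippets plus a biosurveillance seed query
articles = [["flu", "outbreak", "reported", "hospital"],
            ["stock", "market", "rally", "continues"]]
query = ["flu", "outbreak", "disease"]
vecs = tfidf_vectors(articles + [query])
scores = [cosine(v, vecs[-1]) for v in vecs[:-1]]
```

    Articles scoring above a tuned threshold would pass downstream to feature extraction; a production relevance algorithm would replace this with a trained classifier.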

  14. Toxicologic evaluation of analytes from Tank 241-C-103

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company (WHC) requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives were to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propanenitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.

  15. Measuring and Reducing Off-Target Activities of Programmable Nucleases Including CRISPR-Cas9

    PubMed Central

    Koo, Taeyoung; Lee, Jungjoon; Kim, Jin-Soo

    2015-01-01

    Programmable nucleases, which include zinc-finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and RNA-guided engineered nucleases (RGENs) repurposed from the type II clustered, regularly interspaced short palindromic repeats (CRISPR)-CRISPR-associated protein 9 (Cas9) system are now widely used for genome editing in higher eukaryotic cells and whole organisms, revolutionising almost every discipline in biological research, medicine, and biotechnology. All of these nucleases, however, induce off-target mutations at sites homologous in sequence with on-target sites, limiting their utility in many applications including gene or cell therapy. In this review, we compare methods for detecting nuclease off-target mutations. We also review methods for profiling genome-wide off-target effects and discuss how to reduce or avoid off-target mutations. PMID:25985872

  16. Utility of the summation chromatographic peak integration function to avoid manual reintegrations in the analysis of targeted analytes

    USDA-ARS?s Scientific Manuscript database

    As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...

  17. Visualization of the membrane engineering concept: evidence for the specific orientation of electroinserted antibodies and selective binding of target analytes.

    PubMed

    Kokla, Anna; Blouchos, Petros; Livaniou, Evangelia; Zikos, Christos; Kakabakos, Sotiris E; Petrou, Panagiota S; Kintzios, Spyridon

    2013-12-01

    Membrane engineering is a generic methodology for increasing the selectivity of a cell biosensor against a target molecule, by electroinserting target-specific receptor-like molecules on the cell surface. Previous studies have elucidated the biochemical aspects of the interaction between various analytes (including viruses) and their homologous membrane-engineered cells. In the present study, purified anti-biotin antibodies from a rabbit antiserum along with in-house prepared biotinylated bovine serum albumin (BSA) were used as a model antibody-antigen pair of molecules for facilitating membrane engineering experiments. It was proven, with the aid of fluorescence microscopy, that (i) membrane-engineered cells incorporated the specific antibodies in the correct orientation and that (ii) the inserted antibodies selectively interact with the homologous target molecules. This is the first time the actual working concept of membrane engineering has been visualized, thus providing a final proof of the concept behind this innovative process. In addition, the fluorescence microscopy measurements were highly correlated with bioelectric measurements made with the aid of a bioelectric recognition assay. Copyright © 2013 John Wiley & Sons, Ltd.

  18. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    NASA Astrophysics Data System (ADS)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF), and the one-sigma depth function (OSDF) (Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster (ID #2493668, "Flux-level transit injection experiments with the NASA Pleiades Supercomputer") for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets, and “shallow” FLTI experiments with ~2000 injection realizations on each of many targets. From the results of these experiments, we identify anomalous targets, model their behavior, and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM.
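    Under a white-noise, uniform-coverage idealization, the OSDF has a simple closed form: the depth at which the summed transit train reaches SNR = 1 is the per-point scatter divided by the square root of the number of in-transit points. A hedged sketch of that idealization; the real OSDF also accounts for time-varying noise and missing or deweighted data, which this ignores, and all input values below are assumptions:

```python
import math

def one_sigma_depth(period_days, t_obs_days, duty_cycle, cadence_min, sigma_ppm):
    # Idealized OSDF: transit depth (ppm) giving SNR = 1 over the full
    # transit train, assuming white noise and no data gaps
    n_transits = t_obs_days // period_days
    in_transit_min = duty_cycle * period_days * 24 * 60   # minutes per transit
    pts_per_transit = in_transit_min / cadence_min
    n_points = n_transits * pts_per_transit
    return sigma_ppm / math.sqrt(n_points)

# Illustrative values: 10-day period, ~4-yr baseline, 1% transit duty cycle,
# 30-min cadence, 500 ppm per-point scatter
depth = one_sigma_depth(10, 1400, 0.01, 30, 500)
```

    Longer periods mean fewer transits and fewer in-transit points, so this quantity grows with period, which is exactly the long-period sensitivity falloff that the WF and OSDF jointly encode.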

  19. Combining CBT and Behavior-Analytic Approaches to Target Severe Emotion Dysregulation in Verbal Youth with ASD and ID

    ERIC Educational Resources Information Center

    Parent, Veronique; Birtwell, Kirstin B.; Lambright, Nathan; DuBard, Melanie

    2016-01-01

    This article presents an individual intervention combining cognitive-behavioral and behavior-analytic approaches to target severe emotion dysregulation in verbal youth with autism spectrum disorder (ASD) concurrent with intellectual disability (ID). The article focuses on two specific individuals who received the treatment within a therapeutic…

  20. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, quantifying analytes that are related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate specific antigen level at the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumor, it is required to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes, down to a few thousand molecules (~zmol).
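    The jump from pg/mL concentrations to molecule counts follows directly from Avogadro's number; a quick sanity-check sketch, where the ~21 kDa molar mass used for IL-6 is an assumption for illustration:

```python
AVOGADRO = 6.022e23  # molecules per mole

def molecules_per_ml(conc_g_per_ml, molar_mass_g_per_mol):
    # Convert a mass concentration (g/mL) to a molecule count per mL
    return conc_g_per_ml / molar_mass_g_per_mol * AVOGADRO

il6_count = molecules_per_ml(1e-12, 21_000)  # 1 pg/mL of ~21 kDa IL-6
```

    About 3x10^7 molecules per mL at 1 pg/mL; "a few thousand molecules" is therefore roughly the zeptomole scale (1 zmol is about 602 molecules), orders of magnitude below ELISA's reach.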

  1. NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKES

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 49 field control samples (spikes). Measurements were made for up to 11 metals in samples of water, blood, and urine. Field controls were used to assess recovery of target analytes from a sample media during s...

  2. CTEPP STANDARD OPERATING PROCEDURE FOR DETECTION AND QUANTIFICATION OF TARGET ANALYTES BY GAS CHROMATOGRAPHY/MASS SPECTROMETRY (GC/MS) (SOP-5.24)

    EPA Science Inventory

    This standard operating procedure describes the method used for the determination of target analytes in sample extracts and related quality assurance/quality control sample extracts generated in the CTEPP study.

  3. Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks

    NASA Technical Reports Server (NTRS)

    Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.

    2000-01-01

    Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte-Carlo simulations, a method that makes fitting the model's performance quantitatively to human data more computationally time consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
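    The closed-form accuracy such search models reduce to is the standard M-alternative signal-detection integral: the probability that the (Gaussian) response at the target location exceeds all M-1 distractor responses. A sketch of that generic form, not the authors' exact parameterization:

```python
import math

def phi(x):
    # Standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    # Standard normal CDF via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def p_correct_localization(d_prime, m, lo=-8.0, hi=8.0, steps=4000):
    # P(correct) = integral of phi(x - d') * Phi(x)**(m - 1) dx,
    # evaluated numerically with the midpoint rule
    dx = (hi - lo) / steps
    return sum(phi(x - d_prime) * Phi(x) ** (m - 1) * dx
               for x in (lo + (i + 0.5) * dx for i in range(steps)))
```

    Two useful checks when fitting such a model to accuracy data: at d' = 0 it returns chance performance (1/m), and for m = 2 it reproduces Phi(d'/sqrt(2)), the familiar two-alternative forced-choice result.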

  4. Analytic validation and real-time clinical application of an amplicon-based targeted gene panel for advanced cancer

    PubMed Central

    Wing, Michele R.; Reeser, Julie W.; Smith, Amy M.; Reeder, Matthew; Martin, Dorrelyn; Jewell, Benjamin M.; Datta, Jharna; Miya, Jharna; Monk, J. Paul; Mortazavi, Amir; Otterson, Gregory A.; Goldberg, Richard M.; VanDeusen, Jeffrey B.; Cole, Sharon; Dittmar, Kristin; Jaiswal, Sunny; Kinzie, Matthew; Waikhom, Suraj; Freud, Aharon G.; Zhou, Xiao-Ping; Chen, Wei; Bhatt, Darshna; Roychowdhury, Sameek

    2017-01-01

    Multiplex somatic testing has emerged as a strategy to test patients with advanced cancer. We demonstrate our analytic validation approach for a gene hotspot panel and real-time prospective clinical application for any cancer type. The TruSight Tumor 26 assay amplifies 85 somatic hotspot regions across 26 genes. Using cell line and tumor mixes, we observed that 100% of the 14,715 targeted bases had at least 1000x raw coverage. We determined the sensitivity (100%, 95% CI: 96-100%), positive predictive value (100%, 95% CI: 96-100%), reproducibility (100% concordance), and limit of detection (3% variant allele frequency at 1000x read depth) of this assay to detect single nucleotide variants and small insertions and deletions. Next, we applied the assay prospectively in a clinical tumor sequencing study to evaluate 174 patients with metastatic or advanced cancer, including frozen tumors, formalin-fixed tumors, and enriched peripheral blood mononuclear cells in hematologic cancers. We reported one or more somatic mutations in 89 (53%) of the sequenced tumors (167 passing quality filters). Forty-three of these patients (26%) had mutations that would enable eligibility for targeted therapies. This study demonstrates the validity and feasibility of applying TruSight Tumor 26 for pan-cancer testing using multiple specimen types. PMID:29100271
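    When every one of n validation calls is concordant, the exact (Clopper-Pearson) 95% lower confidence limit has a closed form, which is how a statement like "100% (95% CI: 96-100%)" arises. A sketch; the n = 90 below is an assumption chosen to reproduce a ~96% lower bound, not the study's actual sample size:

```python
def exact_lower_bound(n, alpha=0.05):
    # Clopper-Pearson lower limit when all n trials succeed (x = n):
    # the lower limit p solves p**n = alpha/2
    return (alpha / 2) ** (1.0 / n)

lower = exact_lower_bound(90)  # ~0.96 with 90/90 concordant calls
```

    The bound tightens toward 1 as n grows, which is why validation sets need on the order of a hundred concordant calls before a "96-100%" interval can be claimed.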

  5. Magnetic ionic liquids in analytical chemistry: A review.

    PubMed

    Clark, Kevin D; Nacham, Omprakash; Purslow, Jeffrey A; Pierson, Stephen A; Anderson, Jared L

    2016-08-31

    Magnetic ionic liquids (MILs) have recently generated a cascade of innovative applications in numerous areas of analytical chemistry. By incorporating a paramagnetic component within the cation or anion, MILs exhibit a strong response toward external magnetic fields. Careful design of the MIL structure has yielded magnetoactive compounds with unique physicochemical properties including high magnetic moments, enhanced hydrophobicity, and the ability to solvate a broad range of molecules. The structural tunability and paramagnetic properties of MILs have enabled magnet-based technologies that can easily be added to the analytical method workflow, complement needed extraction requirements, or target specific analytes. This review highlights the application of MILs in analytical chemistry and examines the important structural features of MILs that largely influence their physicochemical and magnetic properties. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Targeted analyte deconvolution and identification by four-way parallel factor analysis using three-dimensional gas chromatography with mass spectrometry data.

    PubMed

    Watson, Nathanial E; Prebihalo, Sarah E; Synovec, Robert E

    2017-08-29

    Comprehensive three-dimensional gas chromatography with time-of-flight mass spectrometry (GC³-TOFMS) creates an opportunity to explore a new paradigm in chemometric analysis. Using this newly described instrument and the well-understood Parallel Factor Analysis (PARAFAC) model, we present one option for utilization of the novel GC³-TOFMS data structure. We present a method which builds upon previous work in both GC³ and targeted analysis using PARAFAC to simplify some of the implementation challenges previously discovered. Conceptualizing the GC³-TOFMS instead as a one-dimensional gas chromatograph with GC × GC-TOFMS detection, we allow the instrument to create the PARAFAC target window natively. Each first-dimension modulation thus creates a full GC × GC-TOFMS chromatogram fully amenable to PARAFAC. A simple mixture of 115 compounds and a diesel sample are interrogated through this methodology. All test analyte targets are successfully identified in both mixtures. In addition, mass spectral matching of the PARAFAC loadings to library spectra yielded match values greater than 900 in 40 of 42 test analyte cases. Twenty-nine of these cases produced match values greater than 950. Copyright © 2017 Elsevier B.V. All rights reserved.
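    PARAFAC decomposes a multi-way array into a sum of outer products of per-mode factor vectors, usually fit by alternating least squares (ALS). A minimal three-way sketch with NumPy; the paper's data are four-way and fit with dedicated software, so this only illustrates the model being fit, with all shapes and settings chosen for the example:

```python
import numpy as np

def khatri_rao(u, v):
    # Column-wise Kronecker product: rows ordered (j, k) -> j*K + k
    r = u.shape[1]
    return (u[:, None, :] * v[None, :, :]).reshape(-1, r)

def parafac3(x, rank, n_iter=500, seed=0):
    # Alternating least squares for the trilinear model
    # x[i, j, k] ~= sum_r a[i, r] * b[j, r] * c[k, r]
    i_dim, j_dim, k_dim = x.shape
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((i_dim, rank))
    b = rng.standard_normal((j_dim, rank))
    c = rng.standard_normal((k_dim, rank))
    x0 = x.reshape(i_dim, j_dim * k_dim)                    # mode-1 unfolding
    x1 = np.moveaxis(x, 1, 0).reshape(j_dim, i_dim * k_dim)  # mode-2
    x2 = np.moveaxis(x, 2, 0).reshape(k_dim, i_dim * j_dim)  # mode-3
    for _ in range(n_iter):
        a = x0 @ np.linalg.pinv(khatri_rao(b, c).T)
        b = x1 @ np.linalg.pinv(khatri_rao(a, c).T)
        c = x2 @ np.linalg.pinv(khatri_rao(a, b).T)
    return a, b, c
```

    In the targeted workflow described above, each first-dimension modulation supplies one slab of such an array, and the resolved loadings in the mass-spectral mode are what get matched against library spectra.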

  7. Cryostat including heater to heat a target

    DOEpatents

    Pehl, Richard H.; Madden, Norman W.; Malone, Donald F.

    1990-01-01

    A cryostat is provided which comprises a vacuum vessel; a target disposed within the vacuum vessel; a heat sink disposed within the vacuum vessel for absorbing heat from the detector; a cooling mechanism for cooling the heat sink; a cryoabsorption mechanism for cryoabsorbing residual gas within the vacuum vessel; and a heater for maintaining the target above a temperature at which the residual gas is cryoabsorbed in the course of cryoabsorption of the residual gas by the cryoabsorption mechanism.

  8. MALDI based identification of soybean protein markers--possible analytical targets for allergen detection in processed foods.

    PubMed

    Cucu, Tatiana; De Meulenaer, Bruno; Devreese, Bart

    2012-02-01

    Soybean (Glycine max) is extensively used all over the world due to its nutritional qualities. However, soybean is included in the "big eight" list of food allergens. According to EU directive 2007/68/EC, food products containing soybeans have to be labeled in order to protect allergic consumers. Nevertheless, soybeans can still inadvertently be present in food products. The development of analytical methods for the detection of traces of allergens is important for the protection of allergic consumers. Mass spectrometry of marker proteolytic fragments of protein allergens is increasingly recognized as a detection method in food control. However, quantification of soybean at the peptide level is hindered by limited information regarding specific stable markers derived after proteolytic digestion. The aim of this study was to use MALDI-TOF/MS and MS/MS as a fast screening tool for the identification of stable soybean-derived tryptic markers that remain identifiable even when the proteins are subjected to various changes at the molecular level through reactions typically occurring during food processing (denaturation, the Maillard reaction, and oxidation). The peptides (401)Val-Arg(410) from the G1 glycinin (Gly m 6) and (518)Gln-Arg(528) from the α' chain of β-conglycinin (Gly m 5) proved to be the most stable. These peptides hold potential to be used as targets for the development of new analytical methods for the detection of soybean protein traces in processed foods. Copyright © 2011 Elsevier Inc. All rights reserved.

  9. Cryostat including heater to heat a target

    DOEpatents

    Pehl, R.H.; Madden, N.W.; Malone, D.F.

    1990-09-11

    A cryostat is provided which comprises a vacuum vessel; a target disposed within the vacuum vessel; a heat sink disposed within the vacuum vessel for absorbing heat from the detector; a cooling mechanism for cooling the heat sink; a cryoabsorption mechanism for cryoabsorbing residual gas within the vacuum vessel; and a heater for maintaining the target above a temperature at which the residual gas is cryoabsorbed in the course of cryoabsorption of the residual gas by the cryoabsorption mechanism. 2 figs.

  10. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  11. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and finger print.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

    In the field of sensors that target the detection of airborne analytes, Corona/lens-based collection provides a new path to achieving high sensitivity. An active-matrix-based analyte collection approach referred to as an "airborne analyte memory chip/recorder" is demonstrated, which captures and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in 
select

  13. Evaluating child welfare policies with decision-analytic simulation models.

    PubMed

    Goldhaber-Fiebert, Jeremy D; Bailey, Stephanie L; Hurlburt, Michael S; Zhang, Jinjin; Snowden, Lonnie R; Wulczyn, Fred; Landsverk, John; Horwitz, Sarah M

    2012-11-01

    The objective was to demonstrate decision-analytic modeling in support of Child Welfare policymakers considering implementing evidence-based interventions. Outcomes included permanency (e.g., adoptions) and stability (e.g., foster placement changes). Analyses of a randomized trial of KEEP (a foster parenting intervention) and NSCAW-1 estimated placement change rates and KEEP's effects. A microsimulation model generalized these findings to other Child Welfare systems. The model projected that KEEP could increase permanency and stability, identifying strategies targeting higher-risk children and geographical regions that achieve benefits efficiently. Decision-analytic models enable planners to gauge the value of potential implementations.

  14. A unified analytical drain current model for Double-Gate Junctionless Field-Effect Transistors including short channel effects

    NASA Astrophysics Data System (ADS)

    Raksharam; Dutta, Aloke K.

    2017-04-01

    In this paper, a unified analytical model for the drain current of a symmetric Double-Gate Junctionless Field-Effect Transistor (DG-JLFET) is presented. The operation of the device has been classified into four modes: subthreshold, semi-depleted, accumulation, and hybrid, with the main focus of this work being on the accumulation mode, which has not been dealt with in detail so far in the literature. A physics-based model, using a simplified one-dimensional approach, has been developed for this mode, and it has been successfully integrated with the model for the hybrid mode. It also includes the effect of carrier mobility degradation due to the transverse electric field, which was hitherto missing in the earlier models reported in the literature. The piece-wise models have been unified using suitable interpolation functions. In addition, the model includes the two most important short-channel effects pertaining to DG-JLFETs, namely Drain Induced Barrier Lowering (DIBL) and Subthreshold Swing (SS) degradation. The model is completely analytical, and is thus computationally highly efficient. The results of our model have shown an excellent match with those obtained from TCAD simulations for both long- and short-channel devices, as well as with the experimental data reported in the literature.

  15. TAPIR, a web server for the prediction of plant microRNA targets, including target mimics.

    PubMed

    Bonnet, Eric; He, Ying; Billiau, Kenny; Van de Peer, Yves

    2010-06-15

    We present a new web server called TAPIR, designed for the prediction of plant microRNA targets. The server offers the possibility to search for plant miRNA targets using a fast and a precise algorithm. The precise option is much slower but guarantees to find less perfectly paired miRNA-target duplexes. Furthermore, the precise option allows the prediction of target mimics, which are characterized by a miRNA-target duplex having a large loop, making them undetectable by traditional tools. The TAPIR web server can be accessed at: http://bioinformatics.psb.ugent.be/webtools/tapir. Supplementary data are available at Bioinformatics online.

  16. On-target separation of analyte with 3-aminoquinoline/α-cyano-4-hydroxycinnamic acid liquid matrix for matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Sekiya, Sadanori; Taniguchi, Kenichi; Tanaka, Koichi

    2012-03-30

    3-Aminoquinoline/α-cyano-4-hydroxycinnamic acid (3AQ/CHCA) is a liquid matrix (LM) for matrix-assisted laser desorption/ionization (MALDI) mass spectrometry, first reported by Kumar et al. in 1996. It is a viscous liquid and offers advantages including durable ion generation, owing to a self-healing surface, and good quantitative performance. In this study, we found a novel aspect of 3AQ/CHCA as a MALDI matrix: it concentrates hydrophilic material at the center of the droplet of the analyte-3AQ/CHCA mixture on a MALDI sample target well during evaporation of the water derived from the analyte solvent. This feature made it possible to separate not only the buffer components, but also the peptides and oligosaccharides from one another within 3AQ/CHCA. MALDI imaging analyses of the analyte-3AQ/CHCA droplet indicated that the oligosaccharides were distributed in the center, and the peptides in the whole area around the center, of 3AQ/CHCA. This 'on-target separation' effect was also applicable to glycoprotein digests such as ribonuclease B. These features of the 3AQ/CHCA liquid matrix eliminate the requirement for pretreatment and reduce sample handling losses, thus improving throughput and sensitivity. Copyright © 2012 John Wiley & Sons, Ltd.

  17. Analytical model for tilting proprotor aircraft dynamics, including blade torsion and coupled bending modes, and conversion mode operation

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1974-01-01

    An analytical model is developed for proprotor aircraft dynamics. The rotor model includes coupled flap-lag bending modes and blade torsion degrees of freedom. The rotor aerodynamic model is generally valid for high and low inflow, and for axial and nonaxial flight. For the rotor support, a cantilever wing is considered; incorporation of a more general support with this rotor model will be a straightforward matter.

  18. Novel medical therapeutics in glioblastomas, including targeted molecular therapies, current and future clinical trials.

    PubMed

    Quant, Eudocia C; Wen, Patrick Y

    2010-08-01

    The prognosis for glioblastoma is poor despite optimal therapy with surgery, radiation, and chemotherapy. New therapies that improve survival and quality of life are needed. Research has increased our understanding of the molecular pathways important for gliomagenesis and disease progression. Novel agents have been developed against these targets, including receptor tyrosine kinases, intracellular signaling molecules, epigenetic abnormalities, and tumor vasculature and microenvironment. This article reviews novel therapies for glioblastoma, with an emphasis on targeted agents. Copyright 2010 Elsevier Inc. All rights reserved.

  19. Magnetic nanoparticles for bio-analytical applications

    NASA Astrophysics Data System (ADS)

    Yedlapalli, Sri Lakshmi

    Magnetic nanoparticles are widely used in various fields of medicine, biology and separations. This dissertation focuses on the synthesis and use of magnetic nanoparticles for targeted drug delivery and analytical separations. The goals of this research include the synthesis of biocompatible, surface-modified, monodisperse superparamagnetic iron oxide nanoparticles (SPIONs) by novel techniques for targeted drug delivery, and the use of SPIONs as analytical sensing tools. Surface modification of SPIONs was performed with two different co-polymers: tri-block co-polymer Pluronics and octylamine-modified polyacrylic acid. Samples of SPIONs were subsequently modified with 4 different commercially available, FDA-approved tri-block copolymers (Pluronics), covering a wide range of molecular weights (5.75-14.6 kDa). A novel, technically simpler and faster phase transfer approach was developed to surface modify the SPIONs with Pluronics for drug delivery and other biomedical applications. The hydrodynamic diameter and aggregation properties of the Pluronic-modified SPIONs were studied by dynamic light scattering (DLS). The coverage of SPIONs with Pluronics was supported by IR spectroscopy and characterized by thermogravimetric analysis (TGA). The drug entrapment capacity of SPIONs was studied by UV-Vis spectroscopy using a hydrophobic carbocyanine dye, which serves as a model for hydrophobic drugs. These studies resulted in a comparison of physical properties and their implications for the drug loading capacities of the four types of Pluronic-coated SPIONs for drug delivery assessment. These drug delivery systems could be used for passive drug targeting. However, Pluronics lack the functional group necessary for bioconjugation and hence cannot achieve active targeting. SPIONs were functionalized with an octylamine-modified polyacrylic acid-based copolymer, providing water solubility and facile biomolecular conjugation. Epirubicin was loaded onto SPIONs and the drug entrapment was

  20. Analytic model for the long-term evolution of circular Earth satellite orbits including lunar node regression

    NASA Astrophysics Data System (ADS)

    Zhu, Ting-Lei; Zhao, Chang-Yin; Zhang, Ming-Jiang

    2017-04-01

    This paper aims to obtain an analytic approximation to the evolution of circular orbits governed by the Earth's J2 and the luni-solar gravitational perturbations. Assuming that the lunar orbital plane coincides with the ecliptic plane, Allan and Cook (Proc. R. Soc. A, Math. Phys. Eng. Sci. 280(1380):97, 1964) derived an analytic solution to the orbital plane evolution of circular orbits. Using their result as an intermediate solution, we establish an approximate analytic model that takes the lunar orbital inclination and its node regression into account. Finally, an approximate analytic expression is derived, which agrees well with the numerical results except for resonant cases, when the period of the reference orbit approximately equals integer multiples (especially 1 or 2 times) of the lunar node regression period.

  1. Comprehensive analytical strategy for biomonitoring of pesticides in urine by liquid chromatography–orbitrap high-resolution mass spectrometry.

    PubMed

    Roca, M; Leon, N; Pastor, A; Yusà, V

    2014-12-29

    In this study we propose an analytical strategy that combines a target approach for the quantitative analysis of contemporary pesticide metabolites with a comprehensive post-target screening for the identification of biomarkers of exposure to environmental contaminants in urine using liquid chromatography coupled to high-resolution mass spectrometry (LC–HRMS). The quantitative method for the target analysis of 29 urinary metabolites of organophosphate (OP) insecticides, synthetic pyrethroids, herbicides and fungicides was validated after a previous statistical optimization of the main factors governing the ion source ionization and a fragmentation study using the high energy collision dissociation (HCD) cell. The full scan accurate mass data were acquired with a resolving power of 50,000 FWHM (scan speed, 2 Hz), in both ESI+ and ESI− modes, and with and without HCD fragmentation. The method LOQ was lower than 3.2 μg L⁻¹ for the majority of the analytes. For post-target screening, a customized theoretical database was built for the identification of 60 metabolites including pesticides, PAHs, phenols, and other metabolites of environmental pollutants. For identification purposes, accurate mass with an error of less than 5 ppm, and diagnostic ions including isotopes and/or fragments, were used. The analytical strategy was applied to 20 urine samples collected from children living in the Valencia Region. Eleven target metabolites were detected with concentrations ranging from 1.18 to 131 μg L⁻¹. Likewise, several compounds were tentatively identified in the post-target analysis belonging to the families of phthalates, phenols and parabens. The proposed strategy is suitable for the determination of target pesticide biomarkers in urine in the framework of biomonitoring studies, and appropriate for the identification of other non-target metabolites.
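    The post-target identification step described above screens observed accurate masses against a theoretical database with a tolerance of less than 5 ppm. A minimal sketch of such a ppm-tolerance match is shown below; the metabolite names and m/z values are hypothetical placeholders, not compounds from the study.

    ```python
    def ppm_error(observed_mz, theoretical_mz):
        """Signed mass accuracy in parts per million (ppm)."""
        return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

    def match_candidates(observed_mz, database, tol_ppm=5.0):
        """Return names of database entries whose theoretical m/z lies
        within tol_ppm of the observed m/z."""
        return [name for name, mz in database.items()
                if abs(ppm_error(observed_mz, mz)) <= tol_ppm]

    # Hypothetical mini-database of theoretical [M+H]+ m/z values
    db = {"metabolite_A": 197.9173, "metabolite_B": 215.0708}
    hits = match_candidates(215.0712, db)  # ~1.9 ppm error, within tolerance
    ```

    In practice the tolerance check would be combined with the diagnostic isotope and fragment ions mentioned in the abstract before a tentative identification is accepted.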

  2. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni²⁺, Zn²⁺, Cu²⁺, PO₄³⁻, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step), as well as the introduction of sample liquid wicking areas, make it possible to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu²⁺) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  3. Feasibility of including green tea products for an analytically verified dietary supplement database

    USDA-ARS?s Scientific Manuscript database

    The Dietary Supplement Ingredient Database (DSID) is a federally-funded, publically-accessible dietary supplement database that currently contains analytically derived information on micronutrients in selected adult and children’s multivitamin and mineral (MVM) supplements. Other constituents in di...

  4. 76 FR 56193 - KAP Analytics, LLC ; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-4440-000] KAP Analytics... 204 Authorization This is a supplemental notice in the above-referenced proceeding of KAP Analytics... eSubscription link on the Web site that enables subscribers to receive e-mail notification when a...

  5. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  6. An analytical model for calculating microdosimetric distributions from heavy ions in nanometer site targets.

    PubMed

    Czopyk, L; Olko, P

    2006-01-01

    The analytical model of Xapsos used for calculating microdosimetric spectra is based on the observation that the straggling of energy loss can be approximated by a log-normal distribution of energy deposition. The model was applied to calculate microdosimetric spectra in spherical targets of nanometer dimensions from heavy ions at energies between 0.3 and 500 MeV amu⁻¹. We recalculated the originally assumed 1/E² initial delta electron spectrum by applying the continuous slowing down approximation for secondary electrons. We also modified the energy deposition from electrons of energy below 100 keV, taking into account the effective path length of the scattered electrons. Results of our model calculations agree favourably with results of Monte Carlo track structure simulations using MOCA-14 for light ions (Z = 1-8) of energy ranging from E = 0.3 to 10.0 MeV amu⁻¹, as well as with the results of Nikjoo for a wall-less proportional counter (Z = 18).
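    The core idea above, approximating energy-deposition straggling by a log-normal distribution, can be sketched numerically: given the mean and variance of the energy deposited in a site, the matching log-normal parameters follow from the standard moment relations, and single-event depositions can then be sampled. The moment values below are hypothetical illustration, not parameters from the paper.

    ```python
    import math
    import random

    def lognormal_params(mean, var):
        """Convert the mean and variance of energy deposition to the
        mu and sigma of the matching log-normal distribution."""
        sigma2 = math.log(1.0 + var / mean**2)
        mu = math.log(mean) - 0.5 * sigma2
        return mu, math.sqrt(sigma2)

    def sample_depositions(mean, var, n, seed=1):
        """Draw n single-event energy depositions from the fitted log-normal."""
        rng = random.Random(seed)
        mu, sigma = lognormal_params(mean, var)
        return [rng.lognormvariate(mu, sigma) for _ in range(n)]

    # Hypothetical moments (in keV) for a nanometre-scale site
    events = sample_depositions(mean=1.2, var=0.5, n=10_000)
    sample_mean = sum(events) / len(events)
    ```

    A histogram of such samples approximates the single-event microdosimetric spectrum under this straggling model.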

  7. Expanding the analyte set of the JPL Electronic Nose to include inorganic compounds

    NASA Technical Reports Server (NTRS)

    Ryan, M. A.; Homer, M. L.; Zhou, H.; Mannat, K.; Manfreda, A.; Kisor, A.; Shevade, A.; Yen, S. P. S.

    2005-01-01

    An array-based sensing system based on 32 polymer/carbon composite conductometric sensors is under development at JPL. Until the present phase of development, the analyte set has focused on organic compounds and a few selected inorganic compounds, notably ammonia and hydrazine.

  8. CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox

    NASA Astrophysics Data System (ADS)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano

    2018-03-01

    Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.

  9. Radiation-driven winds of hot stars. VI - Analytical solutions for wind models including the finite cone angle effect

    NASA Technical Reports Server (NTRS)

    Kudritzki, R. P.; Pauldrach, A.; Puls, J.; Abbott, D. C.

    1989-01-01

    Analytical solutions for radiation-driven winds of hot stars including the important finite cone angle effect (see Pauldrach et al., 1986; Friend and Abbott, 1986) are derived which approximate the detailed numerical solutions of the exact wind equation of motion very well. They allow a detailed discussion of the finite cone angle effect and provide, for given line force parameters k, alpha, delta, definite formulas for the mass-loss rate M and the terminal velocity v∞ as functions of the stellar parameters.

  10. Analytical applications of aptamers

    NASA Astrophysics Data System (ADS)

    Tombelli, S.; Minunni, M.; Mascini, M.

    2007-05-01

    Aptamers are single-stranded DNA or RNA ligands which can be selected for different targets starting from a library of molecules containing randomly created sequences. Aptamers have been selected to bind very different targets, from proteins to small organic dyes. Aptamers are proposed as alternatives to antibodies as biorecognition elements in analytical devices with ever increasing frequency, in order to satisfy the demand for quick, cheap, simple and highly reproducible analytical devices, especially for protein detection in the medical field or for the detection of smaller molecules in environmental and food analysis. In our recent experience, DNA and RNA aptamers, specific for three different proteins (Tat, IgE and thrombin), have been exploited as bio-recognition elements to develop specific biosensors (aptasensors). These recognition elements have been coupled to piezoelectric quartz crystals and surface plasmon resonance (SPR) devices as transducers, where the aptamers have been immobilized on the gold surface of the crystal electrodes or on SPR chips, respectively.

  11. An analytical and experimental study of the behavior of semi-infinite metal targets under hypervelocity impact

    NASA Technical Reports Server (NTRS)

    Chakrapani, B.; Rand, J. L.

    1971-01-01

    The material strength and strain rate effects associated with the hypervelocity impact problem were considered. A yield criterion involving the second and third invariants of the stress deviator and a strain-rate-sensitive constitutive equation were developed. The part of the total deformation which represents change in shape is attributable to the stress deviator. A constitutive equation is a means for analytically describing the mechanical response of a continuum under study. The accuracy of the yield criterion was verified utilizing published two- and three-dimensional experimental data. The constants associated with the constitutive equation were determined from one-dimensional quasistatic and dynamic experiments. Hypervelocity impact experiments were conducted on semi-infinite targets of 1100 aluminum, 6061 aluminum alloy, mild steel, and commercially pure lead using spherically shaped and normally incident Pyrex projectiles.

  12. Treating spondyloarthritis, including ankylosing spondylitis and psoriatic arthritis, to target: recommendations of an international task force

    PubMed Central

    Smolen, Josef S; Braun, Jürgen; Dougados, Maxime; Emery, Paul; FitzGerald, Oliver; Helliwell, Philip; Kavanaugh, Arthur; Kvien, Tore K; Landewé, Robert; Luger, Thomas; Mease, Philip; Olivieri, Ignazio; Reveille, John; Ritchlin, Christopher; Rudwaleit, Martin; Schoels, Monika; Sieper, Joachim; de Wit, Martinus; Baraliakos, Xenofon; Betteridge, Neil; Burgos-Vargas, Ruben; Collantes-Estevez, Eduardo; Deodhar, Atul; Elewaut, Dirk; Gossec, Laure; Jongkees, Merryn; Maccarone, Mara; Redlich, Kurt; van den Bosch, Filip; Wei, James Cheng-Chung; Winthrop, Kevin; van der Heijde, Désirée

    2014-01-01

    Background Therapeutic targets have been defined for diseases like diabetes, hypertension or rheumatoid arthritis and adhering to them has improved outcomes. Such targets are just emerging for spondyloarthritis (SpA). Objective To define the treatment target for SpA including ankylosing spondylitis and psoriatic arthritis (PsA) and develop recommendations for achieving the target, including a treat-to-target management strategy. Methods Based on results of a systematic literature review and expert opinion, a task force of expert physicians and patients developed recommendations which were broadly discussed and voted upon in a Delphi-like process. Level of evidence, grade and strength of the recommendations were derived by respective means. The commonalities between axial SpA, peripheral SpA and PsA were discussed in detail. Results Although the literature review did not reveal trials comparing a treat-to-target approach with another or no strategy, it provided indirect evidence regarding an optimised approach to therapy that facilitated the development of recommendations. The group agreed on 5 overarching principles and 11 recommendations; 9 of these recommendations related commonly to the whole spectrum of SpA and PsA, and only 2 were designed separately for axial SpA, peripheral SpA and PsA. The main treatment target, which should be based on a shared decision with the patient, was defined as remission, with the alternative target of low disease activity. Follow-up examinations at regular intervals that depend on the patient's status should safeguard the evolution of disease activity towards the targeted goal. Additional recommendations relate to extra-articular and extramusculoskeletal aspects and other important factors, such as comorbidity. While the level of evidence was generally quite low, the mean strength of recommendation was 9–10 (10: maximum agreement) for all recommendations. A research agenda was formulated. Conclusions The task force defined the

  13. An Analytical Means of Determining Mass Loss from High Velocity Rigid Penetrators based on the Thermodynamic and Mechanical Properties of the Penetrator and Target

    NASA Astrophysics Data System (ADS)

    Foster, Joseph C., Jr.; Jones, S. E.; Rule, William; Toness, Odin

    1999-06-01

    Sub-scale experimentation is commonly used as a cost-effective means of conducting terminal ballistics research. Analytical models of the penetration process focus on calculating the depth of penetration based on target density, target strength represented by the unconfined compressive strength (f'c), the areal density of the penetrator (W/A), and the impact velocity.1 Forrestal et al. have documented the mass loss from the penetrator during the penetration process and employed improved equations of motion.2 Various researchers have investigated the upper limits of rigid body penetration and identified the onset of instabilities.3 In an effort to better understand the physical processes associated with this instability, experimental techniques have been developed to capture the details of the penetrator and target and subject them to microscopic analysis.4 These results have served as motivation to explore new forms for the physics included in the penetration equation as a means of identifying the processes associated with high velocity instability. We have included target shear and nose friction in the formulation of the fundamental load function expressions.5 When the resulting equations of motion are integrated and combined with the thermodynamics indicated by microscopic analysis, methods are identified to calculate penetrator mass loss. A comparison of results with experimental data serves as an indicator of the thermodynamic state variables associated with the quasi-steady-state penetrator-target interface conditions. 1. Young, C. W., "Depth Predictions for Earth Penetrating Projectiles," Journal of Soil Mechanics and Foundations, Division of ASCE, May 1998, pp. 803-817. 2. M.J. Forrestal, D.J. Frew, S.J. Hanchak, and Brar, "Penetration of Grout and Concrete Targets with Ogive-Nose Steel Projectiles," Int. J. Impact Engng., Vol. 18, pp. 465-476, 1996. 3. Andrew J. Piekutowski, Michael J. Forrestal, Kevin L. Poormon, and Thomas L. Warren,

  14. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  15. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
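    The abstract above proposes setting an analytical threshold from the distribution of method background noise. One common way to realize such a noise-based threshold, shown here only as an illustrative sketch and not as the authors' specific procedure, is to place the cutoff a fixed number of standard deviations above the mean noise read count; the counts below are hypothetical.

    ```python
    import statistics

    def analytical_threshold(noise_counts, k=3.0):
        """Read-count threshold above which a signal is treated as real,
        set k standard deviations above the mean background noise."""
        mean = statistics.mean(noise_counts)
        sd = statistics.pstdev(noise_counts)  # population std. deviation
        return mean + k * sd

    # Hypothetical background noise read counts at non-allelic positions
    noise = [2, 5, 3, 4, 6, 2, 3, 5, 4, 3]
    threshold = analytical_threshold(noise)  # mean 3.7 + 3 * sd
    ```

    Reads falling below the threshold would be attributed to noise; the choice of k trades false positives against drop-out of genuine low-level signal.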

  16. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  17. Size separation of analytes using monomeric surfactants

    DOEpatents

    Yeung, Edward S.; Wei, Wei

    2005-04-12

    A sieving medium for use in the separation of analytes in a sample containing at least one such analyte comprises a monomeric non-ionic surfactant of the general formula, B-A, wherein A is a hydrophilic moiety and B is a hydrophobic moiety, present in a solvent at a concentration forming a self-assembled micelle configuration under selected conditions and having an aggregation number providing an equivalent weight capable of effecting the size separation of the sample solution so as to resolve a target analyte(s) in a solution containing the same, the size separation taking place in a chromatography or electrophoresis separation system.

  18. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    NASA Astrophysics Data System (ADS)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate- and graduate-level courses, and a professional certificate/degree program. All programs share the goal of preparing a STEM workforce for successful earth-analytics-driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous, and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and, in turn, program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present evaluation results from both an interdisciplinary undergraduate/graduate-level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, combining synchronous in-person teaching and active hands-on classroom learning with asynchronous learning in the form of online materials, leads to student success. Further, we present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  19. Chapter 16 - Predictive Analytics for Comprehensive Energy Systems State Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yingchen; Yang, Rui; Hodge, Brian S

    Energy sustainability is a subject of concern to many nations in the modern world. It is critical for electric power systems to diversify energy supply to include systems with different physical characteristics, such as wind energy, solar energy, electrochemical energy storage, thermal storage, bio-energy systems, geothermal, and ocean energy. Each system has its own range of control variables and targets. To be able to operate such a complex energy system, big-data analytics become critical to achieve the goal of predicting energy supplies and consumption patterns, assessing system operation conditions, and estimating system states - all providing situational awareness to power system operators. This chapter presents data analytics and machine learning-based approaches to enable predictive situational awareness of the power systems.
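
As a minimal illustration of the forecasting side of such analytics, a naive persistence baseline (our example, not the chapter's method) predicts near-term consumption from recent history; real systems replace this with the machine-learning forecasters the chapter surveys:

```python
def persistence_forecast(history, horizon=3, window=4):
    """Naive baseline for energy forecasting: predict the next
    `horizon` values as the mean of the last `window` observations.
    Any serious model must at least beat this baseline."""
    level = sum(history[-window:]) / window
    return [level] * horizon

load = [100, 104, 98, 102, 101, 99, 103, 100]  # hypothetical load series (MW)
fc = persistence_forecast(load)
```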

  20. Assessing Analytical Similarity of Proposed Amgen Biosimilar ABP 501 to Adalimumab.

    PubMed

    Liu, Jennifer; Eris, Tamer; Li, Cynthia; Cao, Shawn; Kuhns, Scott

    2016-08-01

    ABP 501 is being developed as a biosimilar to adalimumab. Comprehensive comparative analytical characterization studies have been conducted and completed. The objective of this study was to assess analytical similarity between ABP 501 and two adalimumab reference products (RPs), licensed by the United States Food and Drug Administration (adalimumab [US]) and authorized by the European Union (adalimumab [EU]), using state-of-the-art analytical methods. Comprehensive analytical characterization incorporating orthogonal analytical techniques was used to compare products. Physicochemical property comparisons comprised the primary structure related to amino acid sequence and post-translational modifications including glycans; higher-order structure; primary biological properties mediated by target and receptor binding; product-related substances and impurities; host-cell impurities; general properties of the finished drug product, including strength and formulation; subvisible and submicron particles and aggregates; and forced thermal degradation. ABP 501 had the same amino acid sequence and similar post-translational modification profiles compared with adalimumab RPs. Primary structure, higher-order structure, and biological activities were similar for the three products. Product-related size and charge variants and aggregate and particle levels were also similar. ABP 501 had very low residual host-cell protein and DNA. The finished ABP 501 drug product has the same strength with regard to protein concentration and fill volume as adalimumab RPs. ABP 501 and the RPs had a similar stability profile both in normal storage and thermal stress conditions. Based on the comprehensive analytical similarity assessment, ABP 501 was found to be similar to adalimumab with respect to physicochemical and biological properties.

  1. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
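
A sketch of the kind of predictive model described, here a toy logistic-regression scorer for medication-nonadherence risk. All feature names and coefficients are hypothetical, standing in for a model fitted on claims or EHR data:

```python
import math

# Hypothetical coefficients from a fitted adherence model (illustrative only).
COEFS = {"age_over_65": 0.4, "num_medications": 0.25, "prior_gap_days": 0.03}
INTERCEPT = -2.0

def nonadherence_risk(patient):
    """Logistic-regression-style risk score in [0, 1]."""
    z = INTERCEPT + sum(COEFS[k] * patient.get(k, 0) for k in COEFS)
    return 1.0 / (1.0 + math.exp(-z))

# A pharmacist could use the score to prioritize outreach:
p = {"age_over_65": 1, "num_medications": 6, "prior_gap_days": 20}
risk = nonadherence_risk(p)
```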

  2. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and provide interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
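
The paradigm described (partial results plus analyst steering) can be sketched minimally as a generator over prioritized subspaces. The class and its API are our illustration, not the Progressive Insights implementation:

```python
import heapq

class ProgressiveAnalytic:
    """Minimal sketch of progressive visual analytics: the algorithm
    emits a partial result per data subspace, and the analyst can
    re-prioritize unprocessed subspaces between emissions."""

    def __init__(self, subspaces):
        # (priority, name, data); lower priority value runs first
        self._queue = [(0, name, data) for name, data in subspaces.items()]
        heapq.heapify(self._queue)

    def prioritize(self, name, priority=-1):
        """Analyst intervention: move one subspace to the front."""
        self._queue = [(priority if n == name else p, n, d)
                       for p, n, d in self._queue]
        heapq.heapify(self._queue)

    def run(self):
        """Yield (subspace, partial result) as each chunk completes."""
        while self._queue:
            _, name, data = heapq.heappop(self._queue)
            yield name, sum(data) / len(data)  # stand-in computation

pa = ProgressiveAnalytic({"a": [1, 2], "b": [10, 20]})
pa.prioritize("b")            # analyst steers toward subspace "b"
results = list(pa.run())      # "b" is processed and shown first
```

In a real system each yielded partial result would update a visualization instead of being collected in a list.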

  3. An analytical expression for ion velocities at the wall including the sheath electric field and surface biasing for erosion modeling at JET ILW

    DOE PAGES

    Borodkina, I.; Borodin, D.; Brezinsek, S.; ...

    2017-04-12

    For simulation of plasma-facing component erosion in fusion experiments, an analytical expression for the ion velocity just before the surface impact including the local electric field and an optional surface biasing effect is suggested. Energy and angular impact distributions and the resulting effective sputtering yields were produced for several experimental scenarios at JET ILW mostly involving PFCs exposed to an oblique magnetic field. The analytic solution has been applied as an improvement to earlier ERO modelling of localized, Be outer limiter, RF-enhanced erosion, modulated by toggling of a remote, however magnetically connected ICRH antenna. The effective W sputtering yields due to D and Be ion impact in Type-I and Type-III ELMs and inter-ELM conditions were also estimated using the analytical approach and benchmarked by spectroscopy. The intra-ELM W sputtering flux increases almost 10 times in comparison to the inter-ELM flux.

  4. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  5. Media Literacy Interventions: A Meta-Analytic Review

    PubMed Central

    Jeong, Se-Hoon; Cho, Hyunyi; Hwang, Yoori

    2012-01-01

    Although numerous media literacy interventions have been developed and delivered over the past 3 decades, a comprehensive meta-analytic assessment of their effects has not been available. This study investigates the average effect size and moderators of 51 media literacy interventions. Media literacy interventions had positive effects (d=.37) on outcomes including media knowledge, criticism, perceived realism, influence, behavioral beliefs, attitudes, self-efficacy, and behavior. Moderator analyses indicated that interventions with more sessions were more effective, but those with more components were less effective. Intervention effects did not vary by the agent, target age, the setting, audience involvement, the topic, the country, or publication status. PMID:22736807
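
The average effect size reported (d = .37) comes from aggregating study-level effects. A standard inverse-variance weighted (fixed-effect) mean, the simplest of the models such a meta-analysis uses, can be computed as below; the study values are made up for illustration:

```python
def fixed_effect_mean(effects):
    """Inverse-variance weighted mean effect size (fixed-effect model).

    `effects`: list of (d, v) pairs, each study's effect size and
    its sampling variance; weight w_i = 1/v_i.
    """
    num = sum(d / v for d, v in effects)
    den = sum(1.0 / v for d, v in effects)
    return num / den

studies = [(0.30, 0.02), (0.45, 0.04), (0.40, 0.05)]  # hypothetical studies
mean_d = fixed_effect_mean(studies)
```

A random-effects model would additionally add a between-study variance component to each v before weighting.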

  6. Recent α decay half-lives and analytic expression predictions including superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Royer, G.; Zhang, H. F.

    2008-03-01

    New recent experimental α decay half-lives have been compared with the results obtained from previously proposed formulas depending only on the mass and charge numbers of the α emitter and the Qα value. For the heaviest nuclei they are also compared with calculations using the Density-Dependent M3Y (DDM3Y) effective interaction and the Viola-Seaborg-Sobiczewski (VSS) formulas. The correct agreement allows us to make predictions for the α decay half-lives of other still unknown superheavy nuclei from these analytic formulas using the extrapolated Qα of G. Audi, A. H. Wapstra, and C. Thibault [Nucl. Phys. A729, 337 (2003)].
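
The analytic formulas referred to depend only on A, Z, and the Qα value; schematically they take the Royer form below, where the coefficients a, b, c are fitted separately for the even-even, even-odd, odd-even, and odd-odd parity classes (numerical values are given in the cited works):

```latex
\log_{10} T_{1/2}\,[\mathrm{s}] \;=\; a \;+\; b\,A^{1/6}\sqrt{Z} \;+\; \frac{c\,Z}{\sqrt{Q_\alpha}}
```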

  7. Recent α decay half-lives and analytic expression predictions including superheavy nuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Royer, G.; Zhang, H. F.

    New recent experimental α decay half-lives have been compared with the results obtained from previously proposed formulas depending only on the mass and charge numbers of the α emitter and the Qα value. For the heaviest nuclei they are also compared with calculations using the Density-Dependent M3Y (DDM3Y) effective interaction and the Viola-Seaborg-Sobiczewski (VSS) formulas. The correct agreement allows us to make predictions for the α decay half-lives of other still unknown superheavy nuclei from these analytic formulas using the extrapolated Qα of G. Audi, A. H. Wapstra, and C. Thibault [Nucl. Phys. A729, 337 (2003)].

  8. Integrated Targeting and Guidance for Powered Planetary Descent

    NASA Astrophysics Data System (ADS)

    Azimov, Dilmurat M.; Bishop, Robert H.

    2018-02-01

    This paper presents an on-board guidance and targeting design that enables explicit state and thrust vector control and on-board targeting for planetary descent and landing. These capabilities are developed utilizing a new closed-form solution for the constant thrust arc of the braking phase of the powered descent trajectory. The key elements of proven targeting and guidance architectures, including braking and approach phase quartics, are employed. It is demonstrated that implementation of the proposed solution avoids numerical simulation iterations, thereby facilitating on-board execution of targeting procedures during the descent. It is shown that the shape of the braking phase constant thrust arc is highly dependent on initial mass and propulsion system parameters. The analytic solution process is explicit in terms of targeting and guidance parameters, while remaining generic with respect to planetary body and descent trajectory design. These features increase the feasibility of extending the proposed integrated targeting and guidance design to future cargo and robotic landing missions.

  9. Integrated Targeting and Guidance for Powered Planetary Descent

    NASA Astrophysics Data System (ADS)

    Azimov, Dilmurat M.; Bishop, Robert H.

    2018-06-01

    This paper presents an on-board guidance and targeting design that enables explicit state and thrust vector control and on-board targeting for planetary descent and landing. These capabilities are developed utilizing a new closed-form solution for the constant thrust arc of the braking phase of the powered descent trajectory. The key elements of proven targeting and guidance architectures, including braking and approach phase quartics, are employed. It is demonstrated that implementation of the proposed solution avoids numerical simulation iterations, thereby facilitating on-board execution of targeting procedures during the descent. It is shown that the shape of the braking phase constant thrust arc is highly dependent on initial mass and propulsion system parameters. The analytic solution process is explicit in terms of targeting and guidance parameters, while remaining generic with respect to planetary body and descent trajectory design. These features increase the feasibility of extending the proposed integrated targeting and guidance design to future cargo and robotic landing missions.

  10. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
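
Matrix effects of the kind reviewed here are conventionally quantified by comparing responses of a neat standard, a post-extraction spiked matrix, and a pre-extraction spiked matrix (the Matuszewski scheme). A minimal calculator, with made-up peak areas:

```python
def matrix_effect(neat, post_spiked, pre_spiked):
    """Matuszewski-style assessment, all values in percent:
    ME - matrix effect: post-extraction spike vs neat standard
    RE - recovery: pre-extraction spike vs post-extraction spike
    PE - process efficiency: pre-extraction spike vs neat standard
    Inputs are mean peak areas (or calibration-curve slopes)."""
    me = 100.0 * post_spiked / neat
    re = 100.0 * pre_spiked / post_spiked
    pe = 100.0 * pre_spiked / neat
    return me, re, pe

# hypothetical areas: ME < 100% indicates ion suppression by
# co-eluting matrix components, ME > 100% indicates enhancement
me, re, pe = matrix_effect(neat=1.00e6, post_spiked=0.80e6, pre_spiked=0.72e6)
```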

  11. Visual Analytics 101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    This course will introduce the field of Visual Analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  12. Morphoproteomics, E6/E7 in-situ hybridization, and biomedical analytics define the etiopathogenesis of HPV-associated oropharyngeal carcinoma and provide targeted therapeutic options.

    PubMed

    Brown, Robert E; Naqvi, Syed; McGuire, Mary F; Buryanek, Jamie; Karni, Ron J

    2017-08-17

    Human papillomavirus (HPV) has been identified as an etiopathogenetic factor in oropharyngeal squamous cell carcinoma. The HPV E6 and E7 oncogenes are instrumental in promoting proliferation and blocking differentiation leading to tumorigenesis. Although surgical intervention can remove such tumors, the potential for an etiologic field effect with recurrent disease is real. A downstream effector of the E7 oncoprotein, enhancer of zeste homolog 2 (EZH2), is known to promote proliferation and to pose a block in differentiation and, in turn, could lead to HPV-induced malignant transformation. However, the EZH2 pathway is amenable to low-toxicity therapies designed to promote differentiation to a more benign state and prevent recurrent disease by inhibiting the incorporation of HPV into the genome. This is the first study using clinical specimens to demonstrate EZH2 protein expression in oropharyngeal carcinoma (OPC). The study included eight patients with oropharyngeal carcinoma, confirmed p16INK4a-positive by immunohistochemistry (IHC). The tissue expression of E6/E7 messenger RNA (mRNA) was measured by RNAscope® in-situ hybridization technology. Expression of EZH2, Ki-67, and mitotic indices was assessed by morphoproteomic analysis. Biomedical analytics expanded the results with data from Ingenuity Pathway Analysis (IPA) and KEGG databases to construct a molecular network pathway for further insights. Expression of E6 and E7 oncogenes in p16INK4a-positive oropharyngeal carcinoma was confirmed. EZH2 and its correlates, including elevated proliferation index (Ki-67) and mitotic progression, were also present. Biomedical analytics validated the relationship between HPV E6 and E7 and the expression of the EZH2 pathway. There is morphoproteomic and mRNA evidence of the association of p16INK4a-HPV infection with the E6 and E7 oncogenes and the expression of EZH2, Ki-67 and mitotic progression in oropharyngeal carcinoma. The molecular network biology was confirmed by

  13. Strand Invasion Based Amplification (SIBA®): a novel isothermal DNA amplification technology demonstrating high specificity and sensitivity for a single molecule of target analyte.

    PubMed

    Hoser, Mark J; Mansukoski, Hannu K; Morrical, Scott W; Eboigbodin, Kevin E

    2014-01-01

    Isothermal nucleic acid amplification technologies offer significant advantages over polymerase chain reaction (PCR) in that they do not require thermal cycling or sophisticated laboratory equipment. However, non-target-dependent amplification has limited the sensitivity of isothermal technologies, and complex probes are usually required to distinguish between non-specific and target-dependent amplification. Here, we report a novel isothermal nucleic acid amplification technology, Strand Invasion Based Amplification (SIBA). SIBA technology is resistant to non-specific amplification, is able to detect a single molecule of target analyte, and does not require target-specific probes. The technology relies on the recombinase-dependent insertion of an invasion oligonucleotide (IO) into the double-stranded target nucleic acid. The duplex regions peripheral to the IO insertion site dissociate, thereby enabling target-specific primers to bind. A polymerase then extends the primers onto the target nucleic acid, leading to exponential amplification of the target. The primers are not substrates for the recombinase and are, therefore, unable to extend the target template in the absence of the IO. The inclusion of 2'-O-methyl RNA in the IO ensures that it is not extendible and that it does not take part in the extension of the target template. These characteristics ensure that the technology is resistant to non-specific amplification, since primer dimers or mis-primed products cannot amplify exponentially. Consequently, SIBA is highly specific and able to distinguish closely-related species with single molecule sensitivity in the absence of complex probes or sophisticated laboratory equipment. Here, we describe this technology in detail and demonstrate its use for the detection of Salmonella.

  14. Amendment to "Analytical Solution for the Convectively-Mixed Atmospheric Boundary Layer": Inclusion of Subsidence

    NASA Astrophysics Data System (ADS)

    Ouwersloot, H. G.; de Arellano, J. Vilà-Guerau

    2013-09-01

    In Ouwersloot and Vilà-Guerau de Arellano (Boundary-Layer Meteorol., doi:10.1007/s10546-013-9816-z, 2013, this issue), the analytical solutions for the boundary-layer height and scalar evolutions are derived for the convective boundary layer, based on the prognostic equations of mixed-layer slab models without taking subsidence into account. Here, we include and quantify the added effect of subsidence if the subsidence velocity scales linearly with height throughout the atmosphere. This enables analytical analyses for a wider range of observational cases. As a demonstration, the sensitivity of the boundary-layer height and the potential temperature jump to subsidence and the free tropospheric stability is graphically presented. The new relations show the importance of the temporal distribution of the surface buoyancy flux in determining the evolution if there is subsidence.
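
The effect of height-linear subsidence on the boundary-layer height can be illustrated with a generic mixed-layer budget, dh/dt = w_e - omega*h, where w_e is the entrainment growth rate and omega the subsidence divergence. This is a simplified numerical sketch with assumed constant w_e, not the paper's full analytical solution:

```python
def boundary_layer_height(h0, w_e, omega, dt=60.0, t_end=6 * 3600.0):
    """Euler integration of dh/dt = w_e - omega*h: growth by a
    constant entrainment velocity w_e [m/s], opposed by subsidence
    w_s(h) = -omega*h scaling linearly with height [1/s]."""
    h, t = h0, 0.0
    while t < t_end:
        h += dt * (w_e - omega * h)
        t += dt
    return h

# With linear subsidence the height relaxes toward w_e/omega (200 m here)
h = boundary_layer_height(h0=100.0, w_e=0.02, omega=1e-4)
```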

  15. Building pit dewatering: application of transient analytic elements.

    PubMed

    Zaadnoordijk, Willem J

    2006-01-01

    Analytic elements are well suited for the design of building pit dewatering. Wells and drains can be modeled accurately by analytic elements, both nearby to determine the pumping level and at some distance to verify the targeted drawdown at the building site and to estimate the consequences in the vicinity. The ability to shift locations of wells or drains easily makes the design process very flexible. The temporary pumping has transient effects, for which transient analytic elements may be used. This is illustrated using the free, open-source, object-oriented analytic element simulator Tim(SL) for the design of a building pit dewatering near a canal. Steady calculations are complemented with transient calculations. Finally, the bandwidths of the results are estimated using linear variance analysis.
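
The core operation in analytic element dewatering design is superposing closed-form well solutions. A transient sketch using the classical Theis solution (our illustrative parameters, and a series approximation of the well function rather than the TimSL implementation):

```python
import math

def theis_drawdown(Q, T, S, r, t):
    """Theis solution: s = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t).
    W(u) is approximated by its series expansion, valid for small u.
    Q [m3/d], T [m2/d], S [-], r [m], t [d]."""
    u = r * r * S / (4.0 * T * t)
    # W(u) = -gamma - ln(u) + u - u^2/4 + ... (Euler-Mascheroni gamma)
    W = -0.5772156649 - math.log(u) + u - u ** 2 / 4.0
    return Q / (4.0 * math.pi * T) * W

# Superpose two dewatering wells as seen from one observation point
# (10 m from well 1, 25 m from well 2), after one day of pumping:
s = theis_drawdown(500, 200, 1e-3, 10, 1.0) + theis_drawdown(500, 200, 1e-3, 25, 1.0)
```

Shifting a well in the design simply changes the r arguments, which is why the analytic element approach makes the design process so flexible.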

  16. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  17. Electrostatic Interactions between OmpG Nanopore and Analyte Protein Surface Can Distinguish between Glycosylated Isoforms.

    PubMed

    Fahie, Monifa A; Chen, Min

    2015-08-13

    The flexible loops decorating the entrance of the OmpG nanopore move dynamically during ionic current recording. The gating caused by these flexible loops changes when a target protein is bound. The gating is characterized by parameters including frequency, duration, and open-pore current, and these features combine to reveal the identity of a specific analyte protein. Here, we show that an OmpG nanopore equipped with a biotin ligand can distinguish glycosylated and deglycosylated isoforms of avidin by their differences in surface charge. Our studies demonstrate that the direct interaction between the nanopore and analyte surface, induced by the electrostatic attraction between the two molecules, is essential for protein isoform detection. Our technique is remarkably sensitive to the analyte surface, which may provide a useful tool for glycoprotein profiling.

  18. In silico target prediction for elucidating the mode of action of herbicides including prospective validation.

    PubMed

    Chiddarwar, Rucha K; Rohrer, Sebastian G; Wolf, Antje; Tresch, Stefan; Wollenhaupt, Sabrina; Bender, Andreas

    2017-01-01

    The rapid emergence of pesticide resistance has given rise to a demand for herbicides with new modes of action (MoA). In the agrochemical sector, with the availability of experimental high-throughput screening (HTS) data, it is now possible to utilize in silico target prediction methods in the early discovery phase to suggest the MoA of a compound via data mining of bioactivity data. While this approach is established in the pharmaceutical context, in the agrochemical area it poses rather different challenges, as we have found in this work, partially due to different chemistry, but even more so due to different (usually smaller) amounts of data and different ways of conducting HTS. With the aim of applying computational methods to facilitate herbicide target identification, 48,000 bioactivity data points against 16 herbicide targets were processed to train Laplacian-modified Naïve Bayesian (NB) classification models. The herbicide target prediction model ("HerbiMod") is an ensemble of 16 binary classification models, evaluated on internal, external, and prospective validation sets. In addition to the experimental inactives, 10,000 random agrochemical inactives were included in the training process, which was shown to improve the overall balanced accuracy of our models by up to 40%. For all the models, performance in terms of balanced accuracy of ≥80% was achieved in five-fold cross-validation. Ranking target predictions was addressed by means of z-scores, which improved predictivity over using raw scores alone. An external test set of 247 compounds from ChEMBL and a prospective test set of 394 compounds from BASF SE tested against five well-studied herbicide targets (ACC, ALS, HPPD, PDS and PROTOX) were used for further validation. Only 4% of the compounds in the external test set lay within the applicability domain, and extrapolation (and correct prediction) was hence impossible, which on one hand was surprising, and on the other hand illustrated the utilization of
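
The z-score ranking idea mentioned in the abstract can be sketched as follows: raw Bayesian scores are not comparable across per-target models, so each query score is standardized against that target's background score distribution before ranking. Target names are taken from the abstract; all scores are invented for illustration:

```python
import statistics

def rank_targets(query_scores, background):
    """Rank targets for one query compound by z-score,
    z = (score - mean_bg) / sd_bg, computed per target against a
    background score distribution, highest z first."""
    zs = {}
    for target, s in query_scores.items():
        bg = background[target]
        zs[target] = (s - statistics.mean(bg)) / statistics.stdev(bg)
    return sorted(zs.items(), key=lambda kv: kv[1], reverse=True)

bg = {"ALS": [1.0, 2.0, 3.0], "HPPD": [10.0, 20.0, 30.0]}
ranked = rank_targets({"ALS": 4.0, "HPPD": 25.0}, bg)
# ALS ranks first: its raw score is lower, but it is further above
# its own background distribution than HPPD's score is above its own
```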

  19. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Beard, Daniel A.

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  20. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  1. Planar optical waveguide based sandwich assay sensors and processes for the detection of biological targets including protein markers, pathogens and cellular debris

    DOEpatents

    Martinez, Jennifer S [Santa Fe, NM; Swanson, Basil I [Los Alamos, NM; Grace, Karen M [Los Alamos, NM; Grace, Wynne K [Los Alamos, NM; Shreve, Andrew P [Santa Fe, NM

    2009-06-02

    An assay element is described including recognition ligands bound to a film on a single mode planar optical waveguide, the film from the group of a membrane, a polymerized bilayer membrane, and a self-assembled monolayer containing polyethylene glycol or polypropylene glycol groups therein and an assay process for detecting the presence of a biological target is described including injecting a biological target-containing sample into a sensor cell including the assay element, with the recognition ligands adapted for binding to selected biological targets, maintaining the sample within the sensor cell for time sufficient for binding to occur between selected biological targets within the sample and the recognition ligands, injecting a solution including a reporter ligand into the sensor cell; and, interrogating the sample within the sensor cell with excitation light from the waveguide, the excitation light provided by an evanescent field of the single mode penetrating into the biological target-containing sample to a distance of less than about 200 nanometers from the waveguide thereby exciting the fluorescent-label in any bound reporter ligand within a distance of less than about 200 nanometers from the waveguide and resulting in a detectable signal.

  2. Semi-analytical model for a static sheath including a weakly collisional presheath

    NASA Astrophysics Data System (ADS)

    Shirafuji, Tatsuru; Denpoh, Kazuki

    2018-06-01

    A semi-analytical static sheath (SASS) model can provide a spatial potential profile on a biased surface with microstructures, which can be used for predicting ion trajectories on the surface. However, two- or three-dimensional SASS models require a search procedure for a sheath edge equipotential profile, at which ions have the Bohm velocity, as the starting positions for calculating ion trajectories. This procedure can be troublesome when the surface microstructures are geometrically complex. This difficulty is due to the fact that the SASS model cannot handle a presheath region. In this work, we propose a modified SASS model that can handle a presheath region. By using the modified SASS model, ion trajectories can be calculated from edges with arbitrary geometry without searching for the equipotential profile corresponding to sheath edges.
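The sheath-edge condition referenced above is the standard Bohm criterion; in conventional notation (our symbols, not necessarily the paper's), ions must enter the sheath with at least the Bohm velocity

```latex
u_B = \sqrt{\frac{k_B T_e}{m_i}}
```

where $T_e$ is the electron temperature and $m_i$ the ion mass. The search for starting positions in the SASS model amounts to locating the equipotential surface where this speed is reached.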

  3. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  4. Enzyme Biosensors for Biomedical Applications: Strategies for Safeguarding Analytical Performances in Biological Fluids

    PubMed Central

    Rocchitta, Gaia; Spanu, Angela; Babudieri, Sergio; Latte, Gavinella; Madeddu, Giordano; Galleri, Grazia; Nuvoli, Susanna; Bagella, Paola; Demartis, Maria Ilaria; Fiore, Vito; Manetti, Roberto; Serra, Pier Andrea

    2016-01-01

    Enzyme-based chemical biosensors are based on biological recognition. In order to operate, the enzymes must be available to catalyze a specific biochemical reaction and be stable under the normal operating conditions of the biosensor. Design of biosensors is based on knowledge about the target analyte, as well as the complexity of the matrix in which the analyte has to be quantified. This article reviews the problems resulting from the interaction of enzyme-based amperometric biosensors with complex biological matrices containing the target analyte(s). One of the most challenging disadvantages of amperometric enzyme-based biosensor detection is signal reduction from fouling agents and interference from chemicals present in the sample matrix. This article, therefore, investigates the principles of functioning of enzymatic biosensors, their analytical performance over time and the strategies used to optimize their performance. Moreover, the composition of biological fluids as a function of their interaction with biosensing will be presented. PMID:27249001

  5. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
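The partition, build, refine, and compare workflow described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the synthetic dataset and the two toy classifiers (a majority-class baseline and a threshold rule) are stand-ins, not the authors' actual models.

```python
import random

random.seed(42)

# Synthetic records: (feature, binary target outcome); higher feature
# values make the positive outcome more likely.
data = []
for _ in range(400):
    x = random.random()
    y = 1 if x + random.gauss(0, 0.2) > 0.5 else 0
    data.append((x, y))

# Partition: build and refine on one split, hold out naive data for testing.
split = int(0.75 * len(data))
train, test = data[:split], data[split:]

def majority_class(train):
    """Baseline classifier: always predict the most common training label."""
    ones = sum(y for _, y in train)
    label = 1 if ones >= len(train) - ones else 0
    return lambda x: label

def threshold_rule(train):
    """Predict 1 when the feature exceeds the midpoint of the class means."""
    pos = [x for x, y in train if y == 1]
    neg = [x for x, y in train if y == 0]
    cut = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: 1 if x > cut else 0

def accuracy(model, test):
    """Fraction of naive records classified correctly."""
    return sum(model(x) == y for x, y in test) / len(test)

# Compare predictive accuracy of the candidate classifiers on naive data.
results = {name: accuracy(builder(train), test)
           for name, builder in [("majority", majority_class),
                                 ("threshold", threshold_rule)]}
print(results)
```

Testing multiple algorithms on the same held-out partition, as here, is what lets the better classifier be selected on predictive accuracy rather than training fit.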

  6. Analytical method for nitroaromatic explosives in radiologically contaminated soil for ISO/IEC 17025 accreditation

    DOE PAGES

    Boggess, Andrew; Crump, Stephen; Gregory, Clint; ...

    2017-12-06

    Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automation and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties of these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curve. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.
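The figures of merit quoted above (percent relative error, percent relative standard deviation, and matrix-spike recovery) are computed from replicate measurements as sketched below. The replicate values are invented for illustration; they are not the study's data.

```python
import statistics

true_conc = 10.0                              # spiked level, ug/mL (assumed)
replicates = [9.7, 10.1, 9.9, 10.4, 9.8]      # measured replicates, ug/mL

mean = statistics.mean(replicates)
rel_error = abs(mean - true_conc) / true_conc * 100   # % relative error
rsd = statistics.stdev(replicates) / mean * 100       # % relative std. dev.
recovery = mean / true_conc * 100                     # % spike recovery

print(f"relative error {rel_error:.1f}%, RSD {rsd:.1f}%, recovery {recovery:.1f}%")
```

Relative error captures bias against the known spike, RSD captures replicate precision, and recovery expresses how much of the spiked analyte the extraction returned.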

  7. Analytical method for nitroaromatic explosives in radiologically contaminated soil for ISO/IEC 17025 accreditation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggess, Andrew; Crump, Stephen; Gregory, Clint

    Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automation and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties of these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curve. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.

  8. Analytical linear energy transfer model including secondary particles: calculations along the central axis of the proton pencil beam

    NASA Astrophysics Data System (ADS)

    Marsolat, F.; De Marzi, L.; Pouzoulet, F.; Mazal, A.

    2016-01-01

    In proton therapy, the relative biological effectiveness (RBE) depends on various types of parameters such as linear energy transfer (LET). An analytical model for LET calculation exists (Wilkens’ model), but secondary particles are not included in this model. In the present study, we propose a correction factor, L_sec, for Wilkens’ model in order to take into account the LET contributions of certain secondary particles. This study includes secondary protons and deuterons, since the effects of these two types of particles can be described by the same RBE-LET relationship. L_sec was evaluated by Monte Carlo (MC) simulations using the GATE/GEANT4 platform and was defined as the ratio of the LET_d distribution of all protons and deuterons to that of primary protons only. This method was applied to the innovative Pencil Beam Scanning (PBS) delivery systems and L_sec was evaluated along the beam axis. This correction factor indicates the high contribution of secondary particles in the entrance region, with L_sec values higher than 1.6 for a 220 MeV clinical pencil beam. MC simulations showed the impact of pencil beam parameters, such as mean initial energy, spot size, and depth in water, on L_sec. The variation of L_sec with these different parameters was integrated in a polynomial function of the L_sec factor in order to obtain a model universally applicable to all PBS delivery systems. The validity of this correction factor applied to Wilkens’ model was verified along the beam axis of various pencil beams in comparison with MC simulations. A good agreement was obtained between the corrected analytical model and the MC calculations, with mean-LET deviations along the beam axis of less than 0.05 keV μm⁻¹. These results demonstrate the efficacy of our new correction of the existing LET model in order to take into account secondary protons and deuterons along the pencil beam axis.
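In explicit notation (symbols assumed here, not necessarily the paper's), the correction factor defined in the abstract and its application to the analytical model can be written as:

```latex
L_{\mathrm{sec}}(z) = \frac{\mathrm{LET}_d^{\,\mathrm{all}\ p,d}(z)}{\mathrm{LET}_d^{\,\mathrm{primary}\ p}(z)},
\qquad
\mathrm{LET}_d^{\mathrm{corr}}(z) = L_{\mathrm{sec}}(z)\,\mathrm{LET}_d^{\mathrm{Wilkens}}(z),
```

where $z$ is the depth along the pencil beam axis, the numerator is the dose-averaged LET of all protons and deuterons, and the denominator is that of primary protons only.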

  9. Combining targeted and nontargeted data analysis for liquid chromatography/high-resolution mass spectrometric analyses.

    PubMed

    Croley, Timothy R; White, Kevin D; Wong, Jon; Callahan, John H; Musser, Steven M; Antler, Margaret; Lashin, Vitaly; McGibbon, Graham A

    2013-03-01

    Increasing importation of food and the diversity of potential contaminants have necessitated more analytical testing of these foods. Historically, mass spectrometric methods for testing foods were confined to monitoring selected ions (SIM or MRM), achieving sensitivity by focusing on targeted ion signals. A limiting factor in this approach is that any contaminants not included on the target list are not typically identified and retrospective data mining is limited. A potential solution is to utilize high-resolution MS to acquire accurate mass full-scan data. Based on the instrumental resolution, these data can be correlated to the actual mass of a contaminant, which would allow for identification of both target compounds and compounds that are not on a target list (nontargets). The focus of this research was to develop software algorithms to provide rapid and accurate data processing of LC/MS data to identify both targeted and nontargeted analytes. Software from a commercial vendor was developed to process LC/MS data and the results were compared to an alternate, vendor-supplied solution. The commercial software performed well and demonstrated the potential for a fully automated processing solution. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
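The core of the targeted-plus-nontargeted processing described above is matching each measured accurate mass against a target list within a tight mass tolerance, while retaining unmatched peaks for retrospective, nontargeted follow-up. A minimal sketch follows; the target compounds, masses, and 5 ppm window are illustrative assumptions, not the commercial software's algorithm.

```python
# Target list: compound name -> approximate monoisotopic neutral mass (Da).
targets = {
    "melamine": 126.0654,
    "caffeine": 194.0804,
}
TOL_PPM = 5.0                            # accurate-mass tolerance, ppm

peaks = [126.0657, 194.0803, 310.1412]   # measured neutral masses, Da

def match(peak, targets, tol_ppm):
    """Return the first target whose mass lies within tol_ppm of the peak."""
    for name, mass in targets.items():
        if abs(peak - mass) / mass * 1e6 <= tol_ppm:
            return name
    return None

hits = {p: match(p, targets, TOL_PPM) for p in peaks}
nontargeted = [p for p, name in hits.items() if name is None]
print(hits, nontargeted)
```

Because full-scan accurate-mass data are stored, the `nontargeted` list can be re-interrogated later against an expanded target list without reacquiring the sample.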

  10. Pediatric, Adolescent, and Young Adult Thyroid Carcinoma Harbors Frequent and Diverse Targetable Genomic Alterations, Including Kinase Fusions

    PubMed Central

    Schrock, Alexa B.; Anderson, Peter M.; Morris, John C.; Heilmann, Andreas M.; Holmes, Oliver; Wang, Kai; Johnson, Adrienne; Waguespack, Steven G.; Ou, Sai‐Hong Ignatius; Khan, Saad; Fung, Kar‐Ming; Stephens, Philip J.; Erlich, Rachel L.; Miller, Vincent A.; Ross, Jeffrey S.; Ali, Siraj M.

    2017-01-01

    Background. Thyroid carcinoma, which is rare in pediatric patients (age 0–18 years) but more common in adolescent and young adult (AYA) patients (age 15–39 years), carries the potential for morbidity and mortality. Methods. Hybrid‐capture‐based comprehensive genomic profiling (CGP) was performed prospectively on 512 consecutively submitted thyroid carcinomas, including 58 from pediatric and AYA (PAYA) patients, to identify genomic alterations (GAs), including base substitutions, insertions/deletions, copy number alterations, and rearrangements. This PAYA data series includes 41 patients with papillary thyroid carcinoma (PTC), 3 with anaplastic thyroid carcinoma (ATC), and 14 with medullary thyroid carcinoma (MTC). Results. GAs were detected in 93% (54/58) of PAYA cases, with a mean of 1.4 GAs per case. In addition to BRAF V600E mutations, detected in 46% (19/41) of PAYA PTC cases and in 1 of 3 AYA ATC cases, oncogenic fusions involving RET, NTRK1, NTRK3, and ALK were detected in 37% (15/41) of PAYA PTC and 33% (1/3) of AYA ATC cases. Ninety‐three percent (13/14) of MTC patients harbored RET alterations, including 3 novel insertions/deletions in exons 6 and 11. Two of these MTC patients with novel alterations in RET experienced clinical benefit from vandetanib treatment. Conclusion. CGP identified diverse clinically relevant GAs in PAYA patients with thyroid carcinoma, including 83% (34/41) of PTC cases harboring activating kinase mutations or activating kinase rearrangements. These genomic observations and index cases exhibiting clinical benefit from targeted therapy suggest that young patients with advanced thyroid carcinoma can benefit from CGP and rationally matched targeted therapy. Implications for Practice. The detection of diverse clinically relevant genomic alterations in the majority of pediatric, adolescent, and young adult patients with thyroid carcinoma in this study suggests that comprehensive genomic profiling may be beneficial for young

  11. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
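The standard-addition methodology mentioned above can be sketched as follows: spike known amounts of analyte into the actual biological matrix, fit response versus added amount, and extrapolate the fitted line to its x-intercept to estimate the endogenous level. All numbers below are synthetic, and the ideal linear detector is an assumption for illustration.

```python
spiked = [0.0, 5.0, 10.0, 20.0]     # added analyte (arbitrary units)
endogenous = 8.0                     # "unknown" level built into the data
response = [1.5 * (endogenous + s) for s in spiked]   # ideal linear response

# Ordinary least-squares fit of response vs. spiked amount.
n = len(spiked)
mx, my = sum(spiked) / n, sum(response) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(spiked, response))
         / sum((x - mx) ** 2 for x in spiked))
intercept = my - slope * mx

# Magnitude of the x-intercept estimates the endogenous concentration.
estimated = intercept / slope
print(round(estimated, 3))
```

Departure of the standard-addition slope from the surrogate-matrix calibration slope is one practical signal of non-parallelism between the two calibration approaches.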

  12. Implosion of multilayered cylindrical targets driven by intense heavy ion beams.

    PubMed

    Piriz, A R; Portugues, R F; Tahir, N A; Hoffmann, D H H

    2002-11-01

    An analytical model for the implosion of a multilayered cylindrical target driven by an intense heavy ion beam has been developed. The target is composed of a cylinder of frozen hydrogen or deuterium, which is enclosed in a thick shell of solid lead. This target has been designed for future high-energy-density matter experiments to be carried out at the Gesellschaft für Schwerionenforschung, Darmstadt. The model describes the implosion dynamics including the motion of the incident shock and the first reflected shock and allows for calculation of the physical conditions of the hydrogen at stagnation. The model predicts that the conditions of the compressed hydrogen are not sensitive to significant variations in target and beam parameters. These predictions are confirmed by one-dimensional numerical simulations and thus allow for a robust target design.

  13. A simple analytical model for dynamics of time-varying target leverage ratios

    NASA Astrophysics Data System (ADS)

    Lo, C. F.; Hui, C. H.

    2012-03-01

    In this paper we have formulated a simple theoretical model for the dynamics of the time-varying target leverage ratio of a firm under some assumptions based upon empirical observations. In our theoretical model the time evolution of the target leverage ratio of a firm can be derived self-consistently from a set of coupled Ito's stochastic differential equations governing the leverage ratios of an ensemble of firms by the nonlinear Fokker-Planck equation approach. The theoretically derived time paths of the target leverage ratio bear great resemblance to those used in the time-dependent stationary-leverage (TDSL) model [Hui et al., Int. Rev. Financ. Analy. 15, 220 (2006)]. Thus, our simple model is able to provide a theoretical foundation for the selected time paths of the target leverage ratio in the TDSL model. We also examine how the pace of the adjustment of a firm's target ratio, the volatility of the leverage ratio and the current leverage ratio affect the dynamics of the time-varying target leverage ratio. Hence, with the proposed dynamics of the time-dependent target leverage ratio, the TDSL model can be readily applied to generate the default probabilities of individual firms and to assess the default risk of the firms.
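A single firm's relaxation toward its target leverage ratio can be illustrated with an Euler-Maruyama simulation of a mean-reverting stochastic differential equation. This is a deliberately simplified, single-firm stand-in for the coupled Ito equations in the paper; the parameter names and values are assumptions, not the authors'.

```python
import random

def simulate(l0, target, kappa, sigma, dt=1 / 252, steps=252, seed=7):
    """Euler-Maruyama path of dL = kappa*(target - L)*dt + sigma*dW,
    returning the leverage ratio after `steps` increments."""
    random.seed(seed)
    l = l0
    for _ in range(steps):
        dw = random.gauss(0.0, dt ** 0.5)            # Brownian increment
        l += kappa * (target - l) * dt + sigma * dw  # mean-reverting step
    return l

# With sigma = 0 the path is deterministic and decays toward the target;
# kappa sets the pace of the adjustment, sigma the leverage-ratio volatility.
final = simulate(l0=0.9, target=0.5, kappa=3.0, sigma=0.0)
print(round(final, 4))
```

Raising `kappa` pulls the ratio to the target faster, while raising `sigma` widens the spread of terminal leverage ratios, mirroring the sensitivity analysis described in the abstract.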

  14. Teaching Analytical Thinking

    ERIC Educational Resources Information Center

    Behn, Robert D.; Vaupel, James W.

    1976-01-01

    Description of the philosophy and general nature of a course at Drake University that emphasizes basic concepts of analytical thinking, including think, decompose, simplify, specify, and rethink problems. Some sample homework exercises are included. The journal is available from University of California Press, Berkeley, California 94720.…

  15. Analyte detection using an active assay

    DOEpatents

    Morozov, Victor; Bailey, Charles L.; Evanskey, Melissa R.

    2010-11-02

    Analytes may be detected using an active assay by introducing an analyte solution containing a plurality of analytes to a lacquered membrane. The lacquered membrane may be a membrane having at least one surface treated with a layer of polymers. The lacquered membrane may be semi-permeable to nonanalytes. The layer of polymers may include cross-linked polymers. A plurality of probe molecules may be arrayed and immobilized on the lacquered membrane. An external force may be applied to the analyte solution to move the analytes towards the lacquered membrane. Movement may cause some or all of the analytes to bind to the lacquered membrane. In cases where probe molecules are present, some or all of the analytes may bind to probe molecules. The direction of the external force may be reversed to remove unbound or weakly bound analytes. Bound analytes may be detected using known detection methods.

  16. Protein-targeted corona phase molecular recognition

    PubMed Central

    Bisker, Gili; Dong, Juyao; Park, Hoyoung D.; Iverson, Nicole M.; Ahn, Jiyoung; Nelson, Justin T.; Landry, Markita P.; Kruss, Sebastian; Strano, Michael S.

    2016-01-01

    Corona phase molecular recognition (CoPhMoRe) uses a heteropolymer adsorbed onto and templated by a nanoparticle surface to recognize a specific target analyte. This method has not yet been extended to macromolecular analytes, including proteins. Herein we develop a variant of a CoPhMoRe screening procedure of single-walled carbon nanotubes (SWCNT) and use it against a panel of human blood proteins, revealing a specific corona phase that recognizes fibrinogen with high selectivity. In response to fibrinogen binding, SWCNT fluorescence decreases by >80% at saturation. Sequential binding of the three fibrinogen nodules is suggested by selective fluorescence quenching by isolated sub-domains and validated by the quenching kinetics. The fibrinogen recognition also occurs in a serum environment, at clinically relevant fibrinogen concentrations in human blood. These results open new avenues for synthetic, non-biological antibody analogues that recognize biological macromolecules, and hold great promise for medical and clinical applications. PMID:26742890

  17. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  18. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  19. Target scattering characteristics for OAM-based radar

    NASA Astrophysics Data System (ADS)

    Liu, Kang; Gao, Yue; Li, Xiang; Cheng, Yongqiang

    2018-02-01

    The target scattering characteristics are crucial for radar systems. However, very little research has been conducted on the recently developed orbital angular momentum (OAM) based radar system. To illustrate the role of the OAM-based radar cross section (ORCS), the conventional radar equation is modified by taking characteristics of the OAM waves into account. Subsequently, the ORCS is defined in analogy to the classical radar cross section (RCS). The unique features of the incident OAM-carrying field are analyzed. The scattered field is derived, and the analytical expressions of ORCSs for metal plate and cylinder targets are obtained. Furthermore, the ORCS and RCS are compared to illustrate the influences of OAM mode number, target size and signal frequency on the ORCS. Analytical studies demonstrate that the mirror-reflection phenomenon disappears and peak values of ORCS are in the non-specular direction. Finally, the ORCS features are summarized to show its advantages in radar target detection. This work can provide theoretical guidance to the design of OAM-based radar as well as the target detection and identification applications.
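For context, the classical monostatic radar equation that the ORCS analysis modifies has the standard form (the OAM-specific modification in the paper, which replaces the RCS with the ORCS, is not reproduced here):

```latex
P_r = \frac{P_t\, G_t\, G_r\, \lambda^2\, \sigma}{(4\pi)^3 R^4}
```

where $P_t$ and $P_r$ are the transmitted and received powers, $G_t$ and $G_r$ the antenna gains, $\lambda$ the wavelength, $\sigma$ the radar cross section, and $R$ the target range.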

  20. Learning Analytics: Challenges and Limitations

    ERIC Educational Resources Information Center

    Wilson, Anna; Watson, Cate; Thompson, Terrie Lynn; Drew, Valerie; Doyle, Sarah

    2017-01-01

    Learning analytic implementations are increasingly being included in learning management systems in higher education. We lay out some concerns with the way learning analytics--both data and algorithms--are often presented within an unproblematized Big Data discourse. We describe some potential problems with the often implicit assumptions about…

  1. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    PubMed

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This is due, at least in part, to the whole set of situations related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered, yet in routine diagnostics about 70% of errors arise in this phase. Moreover, errors during the pre-analytical phase carry over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  2. Feasibility of Including Green Tea Products for an Analytically Verified Dietary Supplement Database

    PubMed Central

    Saldanha, Leila; Dwyer, Johanna; Andrews, Karen; Betz, Joseph; Harnely, James; Pehrsson, Pamela; Rimmer, Catherine; Savarala, Sushma

    2015-01-01

    The Dietary Supplement Ingredient Database (DSID) is a federally funded, publicly accessible dietary supplement database that currently contains analytically-derived information on micronutrients in selected adult and children’s multivitamin and mineral (MVM) supplements. Other constituents in dietary supplement products such as botanicals are also of interest and thus are being considered for inclusion in the DSID. Thirty-eight constituents, mainly botanicals were identified and prioritized by a federal interagency committee. Green tea was selected from this list as the botanical for expansion of the DSID. This paper describes the process for prioritizing dietary ingredients in the DSID. It also discusses the criteria for inclusion of these ingredients, and the approach for selecting and testing products for the green tea pilot study. PMID:25817236

  3. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDES IN SPIKE SAMPLES

    EPA Science Inventory

    The Pesticides in Spikes data set contains the analytical results of measurements of up to 17 pesticides in 12 control samples (spikes) from 11 households. Measurements were made in samples of blood serum. Controls were used to assess recovery of target analytes from a sample m...

  4. Promoting Efficacy Research on Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  5. Metal-amplified Density Assays, (MADAs), including a Density-Linked Immunosorbent Assay (DeLISA).

    PubMed

    Subramaniam, Anand Bala; Gonidec, Mathieu; Shapiro, Nathan D; Kresse, Kayleigh M; Whitesides, George M

    2015-02-21

    This paper reports the development of Metal-amplified Density Assays, or MADAs - a method of conducting quantitative or multiplexed assays, including immunoassays, by using Magnetic Levitation (MagLev) to measure metal-amplified changes in the density of beads labeled with biomolecules. The binding of target analytes (e.g., proteins, antibodies, antigens) to complementary ligands immobilized on the surface of the beads, followed by a chemical amplification of the binding in a form that results in a change in the density of the beads (achieved by using gold nanoparticle-labeled biomolecules, and electroless deposition of gold or silver), translates analyte binding events into changes in density measurable by MagLev. A minimal model based on diffusion-limited growth of hemispherical nuclei on a surface reproduces the dynamics of the assay. A MADA - when performed with antigens and antibodies - is called a Density-Linked Immunosorbent Assay, or DeLISA. Two immunoassays provided a proof of principle: a competitive quantification of the concentration of neomycin in whole milk, and a multiplexed detection of antibodies against Hepatitis C virus NS3 protein and syphilis T. pallidum p47 protein in serum. MADAs, including DeLISAs, require, besides the requisite biomolecules and amplification reagents, minimal specialized equipment (two permanent magnets, a ruler or a capillary with calibrated length markings) and no electrical power to obtain a quantitative readout of analyte concentration. With further development, the method may be useful in resource-limited or point-of-care settings.
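
    The core arithmetic behind the density amplification is simple: deposited metal adds mass much faster than volume. A minimal sketch of this mass/volume balance for a gold shell on a polystyrene bead (all bead and shell parameters here are illustrative assumptions, not values from the paper):

    ```python
    from math import pi

    # Illustrative densities; the paper also uses silver deposition.
    RHO_AU = 19.3   # g/cm^3, gold
    RHO_PS = 1.05   # g/cm^3, typical polystyrene bead

    def bead_density_after_plating(r_um: float, shell_nm: float) -> float:
        """Effective density of a polystyrene bead of radius r_um (micrometers)
        coated by a uniform gold shell of thickness shell_nm (nanometers)."""
        r = r_um * 1e-4            # cm
        t = shell_nm * 1e-7        # cm
        v_bead = 4 / 3 * pi * r**3
        v_shell = 4 / 3 * pi * ((r + t)**3 - r**3)
        mass = RHO_PS * v_bead + RHO_AU * v_shell
        return mass / (v_bead + v_shell)

    # More bound analyte -> more gold deposited -> denser bead -> different
    # levitation height in the MagLev device.
    for t in (0, 10, 50, 100):
        print(f"{t:3d} nm shell: {bead_density_after_plating(5.0, t):.3f} g/cm^3")
    ```

    Even a tens-of-nanometers shell shifts the effective density measurably, which is what makes the MagLev readout sensitive to binding events.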

  6. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    NASA Astrophysics Data System (ADS)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; contributors, JET

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.

  7. Identification of Homeotic Target Genes in Drosophila Melanogaster Including Nervy, a Proto-Oncogene Homologue

    PubMed Central

    Feinstein, P. G.; Kornfeld, K.; Hogness, D. S.; Mann, R. S.

    1995-01-01

    In Drosophila, the specific morphological characteristics of each segment are determined by the homeotic genes that regulate the expression of downstream target genes. We used a subtractive hybridization procedure to isolate activated target genes of the homeotic gene Ultrabithorax (Ubx). In addition, we constructed a set of mutant genotypes that measures the regulatory contribution of individual homeotic genes to a complex target gene expression pattern. Using these mutants, we demonstrate that homeotic genes can regulate target gene expression at the start of gastrulation, suggesting a previously unknown role for the homeotic genes at this early stage. We also show that, in abdominal segments, the levels of expression for two target genes increase in response to high levels of Ubx, demonstrating that the normal down-regulation of Ubx in these segments is functional. Finally, the DNA sequence of cDNAs for one of these genes predicts a protein that is similar to a human proto-oncogene involved in acute myeloid leukemias. These results illustrate potentially general rules about the homeotic control of target gene expression and suggest that subtractive hybridization can be used to isolate interesting homeotic target genes. PMID:7498738

  8. Analytical solution for heat transfer in three-dimensional porous media including variable fluid properties

    NASA Technical Reports Server (NTRS)

    Siegel, R.; Goldstein, M. E.

    1972-01-01

    An analytical solution is obtained for flow and heat transfer in a three-dimensional porous medium. Coolant from a reservoir at constant pressure and temperature enters one portion of the boundary of the medium and exits through another portion of the boundary which is at a specified uniform temperature and uniform pressure. The variations of coolant density and viscosity with temperature are both taken into account. A general solution is found that provides the temperature distribution in the medium and the mass and heat fluxes along the portion of the surface through which the coolant is exiting.

  9. Sensor for detecting and differentiating chemical analytes

    DOEpatents

    Yi, Dechang [Metuchen, NJ; Senesac, Lawrence R [Knoxville, TN; Thundat, Thomas G [Knoxville, TN

    2011-07-05

    A sensor for detecting and differentiating chemical analytes includes a microscale body having a first end and a second end and a surface between the ends for adsorbing a chemical analyte. The surface includes at least one conductive heating track for heating the chemical analyte and also a conductive response track, which is electrically isolated from the heating track, for producing a thermal response signal from the chemical analyte. The heating track is electrically connected with a voltage source and the response track is electrically connected with a signal recorder. The microscale body is restrained at the first end and the second end and is substantially isolated from its surroundings therebetween, thus having a bridge configuration.

  10. Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1982-01-01

    Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…

  11. Analytical method for the evaluation of the outdoor air contamination by emerging pollutants using tree leaves as bioindicators.

    PubMed

    Barroso, Pedro José; Martín, Julia; Santos, Juan Luis; Aparicio, Irene; Alonso, Esteban

    2018-01-01

    In this work, an analytical method, based on sonication-assisted extraction, clean-up by dispersive solid-phase extraction and determination by liquid chromatography-tandem mass spectrometry, has been developed and validated for the simultaneous determination of 15 emerging pollutants in leaves from four ornamental tree species. Target compounds include perfluorinated organic compounds, plasticizers, surfactants, brominated flame retardants, and preservatives. The method was optimized using Box-Behnken statistical experimental design with response surface methodology and validated in terms of recovery, accuracy, precision, and method detection and quantification limits. Quantification of target compounds was carried out using matrix-matched calibration curves. The highest recoveries were achieved for the perfluorinated organic compounds (mean values up to 87%) and preservatives (up to 88%). The lowest recoveries were achieved for plasticizers (51%) and brominated flame retardants (63%). Method detection and quantification limits were in the ranges 0.01-0.09 ng/g dry matter (dm) and 0.02-0.30 ng/g dm, respectively, for most of the target compounds. The method was successfully applied to the determination of the target compounds on leaves from four tree species used as urban ornamental trees (Citrus aurantium, Celtis australis, Platanus hispanica, and Jacaranda mimosifolia). Graphical abstract: Analytical method for the biomonitoring of emerging pollutants in outdoor air.

  12. Analytical and Clinical Performance of Blood Glucose Monitors

    PubMed Central

    Boren, Suzanne Austin; Clarke, William L.

    2010-01-01

    Background The objective of this study was to understand the level of performance of blood glucose monitors as assessed in the published literature. Methods Medline from January 2000 to October 2009 and reference lists of included articles were searched to identify eligible studies. Key information was abstracted from eligible studies: blood glucose meters tested, blood sample, meter operators, setting, sample of people (number, diabetes type, age, sex, and race), duration of diabetes, years using a glucose meter, insulin use, recommendations followed, performance evaluation measures, and specific factors affecting the accuracy evaluation of blood glucose monitors. Results Thirty-one articles were included in this review. Articles were categorized as review articles of blood glucose accuracy (6 articles), original studies that reported the performance of blood glucose meters in laboratory settings (14 articles) or clinical settings (9 articles), and simulation studies (2 articles). A variety of performance evaluation measures were used in the studies. The authors did not identify any studies that demonstrated a difference in clinical outcomes. Examples of analytical tools used in the description of accuracy (e.g., correlation coefficient, linear regression equations, and International Organization for Standardization standards) and how these traditional measures can complicate the achievement of target blood glucose levels for the patient were presented. The benefits of using error grid analysis to quantify the clinical accuracy of patient-determined blood glucose values were discussed. Conclusions When examining blood glucose monitor performance in the real world, it is important to consider if an improvement in analytical accuracy would lead to improved clinical outcomes for patients. There are several examples of how analytical tools used in the description of self-monitoring of blood glucose accuracy could be irrelevant to treatment decisions. PMID:20167171

  13. Including scattering within the room acoustics diffusion model: An analytical approach.

    PubMed

    Foy, Cédric; Picaut, Judicaël; Valeau, Vincent

    2016-10-01

    Over the last 20 years, a statistical acoustic model has been developed to predict the reverberant sound field in buildings. This model is based on the assumption that the propagation of the reverberant sound field follows a transport process and, as an approximation, a diffusion process that can be easily solved numerically. This model, initially designed and validated for rooms with purely diffuse reflections, is extended in the present study to mixed reflections, with a proportion of specular and diffuse reflections defined by a scattering coefficient. The proposed mathematical developments lead to an analytical expression of the diffusion constant that is a function not only of the scattering coefficient, but also of the absorption coefficient of the walls. The results obtained with this extended diffusion model are then compared with the classical diffusion model, as well as with a sound-particle tracing approach considering mixed wall reflections. The comparison shows a good agreement for long rooms with uniform low absorption (α = 0.01) and uniform scattering. For a larger absorption (α = 0.1), the agreement is moderate, due to the fact that the proposed expression of the diffusion coefficient does not vary spatially. In addition, the proposed model is for now limited to uniform diffusion and should be extended in the future to more general cases.
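
    In the classical room-acoustics diffusion model the diffusion constant is D = λc/3, with mean free path λ = 4V/S for a room of volume V and total wall surface S. A minimal sketch (the paper's extended, scattering- and absorption-dependent expression for D is not reproduced here):

    ```python
    C_AIR = 343.0  # speed of sound in air, m/s

    def mean_free_path(volume: float, surface: float) -> float:
        """Classical mean free path of a sound particle in a room: 4V/S."""
        return 4.0 * volume / surface

    def diffusion_constant(volume: float, surface: float) -> float:
        """Classical diffusion constant D = lambda * c / 3."""
        return mean_free_path(volume, surface) * C_AIR / 3.0

    # Long room, 30 m x 5 m x 5 m, as in the validation geometry style:
    V = 30 * 5 * 5
    S = 2 * (30 * 5 + 30 * 5 + 5 * 5)
    print(f"lambda = {mean_free_path(V, S):.2f} m, D = {diffusion_constant(V, S):.1f} m^2/s")
    ```

    The extension discussed in the abstract replaces this constant D with an expression that additionally depends on the scattering and absorption coefficients of the walls.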

  14. Radar cross sections of standard and complex shape targets

    NASA Technical Reports Server (NTRS)

    Sohel, M. S.

    1974-01-01

    The theoretical, analytical, and experimental results are described for radar cross sections (RCS) of different-shaped targets. Various techniques for predicting RCS are given, and the RCS of finite standard targets are presented. Techniques used to predict the RCS of complex targets are described, and the RCS of complex shapes are provided.

  15. Analytical steady-state solutions for water-limited cropping systems using saline irrigation water

    NASA Astrophysics Data System (ADS)

    Skaggs, T. H.; Anderson, R. G.; Corwin, D. L.; Suarez, D. L.

    2014-12-01

    Due to the diminishing availability of good quality water for irrigation, it is increasingly important that irrigation and salinity management tools be able to target submaximal crop yields and support the use of marginal quality waters. In this work, we present a steady-state irrigated systems modeling framework that accounts for reduced plant water uptake due to root zone salinity. Two explicit, closed-form analytical solutions for the root zone solute concentration profile are obtained, corresponding to two alternative functional forms of the uptake reduction function. The solutions express a general relationship between irrigation water salinity, irrigation rate, crop salt tolerance, crop transpiration, and (using standard approximations) crop yield. Example applications are illustrated, including the calculation of irrigation requirements for obtaining targeted submaximal yields, and the generation of crop-water production functions for varying irrigation waters, irrigation rates, and crops. Model predictions are shown to be mostly consistent with existing models and available experimental data. Yet the new solutions possess advantages over available alternatives, including: (i) the solutions were derived from a complete physical-mathematical description of the system, rather than based on an ad hoc formulation; (ii) the analytical solutions are explicit and can be evaluated without iterative techniques; (iii) the solutions permit consideration of two common functional forms of salinity induced reductions in crop water uptake, rather than being tied to one particular representation; and (iv) the utilized modeling framework is compatible with leading transient-state numerical models.
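
    The paper's closed-form solutions couple water uptake reduction to root zone salinity; as a rough illustration of the underlying steady-state salt balance, here is a much simpler piston-flow sketch (no dispersion, roots exclude salt entirely, function names and numbers are illustrative assumptions):

    ```python
    def concentration_profile(c_irr, irr, transp, layer_fracs):
        """Salinity at the bottom of each soil layer at steady state.

        Under piston flow with total salt exclusion by roots, the salt flux
        c(z)*q(z) is conserved, so c(z) = c_irr * irr / q(z), where
        q(z) = irr - (cumulative uptake above z) is the remaining downward flux.

        c_irr: irrigation water salinity; irr: irrigation rate;
        transp: total transpiration (same units as irr);
        layer_fracs: fraction of transp extracted in each layer (sums to 1).
        """
        q = irr                   # downward water flux entering the profile
        concs = []
        for f in layer_fracs:
            q -= transp * f       # water removed by roots in this layer
            concs.append(c_irr * irr / q)
        return concs

    # Irrigating at 1.2x the transpiration demand (leaching fraction ~17%):
    profile = concentration_profile(c_irr=1.0, irr=1.2, transp=1.0,
                                    layer_fracs=[0.4, 0.3, 0.2, 0.1])
    print([round(c, 2) for c in profile])  # concentration factor grows with depth
    ```

    The drainage-water concentration equals c_irr divided by the leaching fraction, which is the classical leaching-requirement result; the paper's solutions refine this picture by letting salinity feed back on the uptake itself.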

  16. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDE METABOLITES IN SPIKE SAMPLES

    EPA Science Inventory

    The Pesticides in Spikes data set contains the analytical results of measurements of up to 17 pesticides in 12 control samples (spikes) from 11 households. Measurements were made in samples of blood serum. Controls were used to assess recovery of target analytes from a sample m...

  17. Difference in target definition using three different methods to include respiratory motion in radiotherapy of lung cancer.

    PubMed

    Sloth Møller, Ditte; Knap, Marianne Marquard; Nyeng, Tine Bisballe; Khalil, Azza Ahmed; Holt, Marianne Ingerslev; Kandi, Maria; Hoffmann, Lone

    2017-11-01

    Minimizing the planning target volume (PTV) while ensuring sufficient target coverage during the entire respiratory cycle is essential for free-breathing radiotherapy of lung cancer. Different methods are used to incorporate the respiratory motion into the PTV. Fifteen patients were analyzed. Respiration can be included in the target delineation process, creating a respiratory GTV, denoted iGTV. Alternatively, the respiratory amplitude (A) can be measured based on the 4D-CT and incorporated in the margin expansion. The GTV expanded by A yielded GTV+resp, which was compared to iGTV in terms of overlap. Three methods for PTV generation were compared: PTV_del (delineated iGTV expanded to CTV plus PTV margin), PTV_σ (GTV expanded to CTV, with A included as a random uncertainty in the CTV-to-PTV margin) and PTV_Σ (GTV expanded to CTV, followed by linear expansion of the CTV by A to CTV+resp, which was finally expanded to PTV_Σ). Deformation of tumor and lymph nodes during respiration resulted in volume changes between the respiratory phases. The overlap between iGTV and GTV+resp showed that on average 7% of iGTV was outside GTV+resp, implying that GTV+resp did not capture the tumor during the full deformable respiration cycle. A comparison of the PTV volumes showed that PTV_σ was smallest and PTV_Σ largest for all patients. PTV_σ was on average 14% (31 cm³) smaller than PTV_del, while PTV_del was 7% (20 cm³) smaller than PTV_Σ. PTV_σ yields the smallest volumes but does not ensure coverage of the tumor during the full respiratory motion due to tumor deformation. Incorporating the respiratory motion in the delineation (PTV_del) takes the entire respiratory cycle, including deformation, into account, but at the cost of larger treatment volumes. PTV_Σ should not be used, since it incorporates the disadvantages of both PTV_del and PTV_σ.
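
    Why a quadrature (random-uncertainty) treatment of the amplitude gives smaller margins than linear addition can be seen with a stand-in margin recipe. The sketch below uses the van Herk formula M = 2.5Σ + 0.7σ, not the study's actual margin recipes, and the σ_A ≈ A/3 conversion for quasi-sinusoidal motion is an assumption:

    ```python
    from math import sqrt

    def margin_linear(Sigma, sigma, A):
        """Amplitude A added linearly on top of the margin (PTV_Sigma-style)."""
        return 2.5 * Sigma + 0.7 * sigma + A

    def margin_quadrature(Sigma, sigma, A):
        """Amplitude folded into the random term as sigma_A ~ A/3 (PTV_sigma-style)."""
        return 2.5 * Sigma + 0.7 * sqrt(sigma**2 + (A / 3.0)**2)

    Sigma, sigma, A = 2.0, 2.0, 8.0   # mm, illustrative values
    print(round(margin_linear(Sigma, sigma, A), 1),
          round(margin_quadrature(Sigma, sigma, A), 1))  # -> 14.4 7.3
    ```

    The quadrature margin is roughly half the linear one here, mirroring the abstract's ordering PTV_σ < PTV_del < PTV_Σ, though the quadrature approach risks undercoverage when the tumor deforms rather than merely translates.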

  18. 42 CFR 493.1289 - Standard: Analytic systems quality assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Analytic systems quality assessment. 493... Nonwaived Testing Analytic Systems § 493.1289 Standard: Analytic systems quality assessment. (a) The... through 493.1283. (b) The analytic systems quality assessment must include a review of the effectiveness...

  19. Non-target screening to trace ozonation transformation products in a wastewater treatment train including different post-treatments.

    PubMed

    Schollée, Jennifer E; Bourgin, Marc; von Gunten, Urs; McArdell, Christa S; Hollender, Juliane

    2018-05-25

    Ozonation and subsequent post-treatments are increasingly implemented in wastewater treatment plants (WWTPs) for enhanced micropollutant abatement. While this technology is effective, micropollutant oxidation leads to the formation of ozonation transformation products (OTPs). Target and suspect screening provide information about known parent compounds and known OTPs, but for a more comprehensive picture, non-target screening is needed. Here, sampling was conducted at a full-scale WWTP to investigate OTP formation at four ozone doses (2, 3, 4, and 5 mg/L, ranging from 0.3 to 1.0 gO₃/gDOC) and subsequent changes during five post-treatment steps (i.e., sand filter, fixed bed bioreactor, moving bed bioreactor, and two granular activated carbon (GAC) filters, one relatively fresh and one pre-loaded). Samples were measured with online solid-phase extraction coupled to liquid chromatography high-resolution tandem mass spectrometry (LC-HRMS/MS) using electrospray ionization (ESI) in positive and negative modes. Existing non-target screening workflows were adapted to (1) examine the formation of potential OTPs at four ozone doses and (2) compare the removal of OTPs among five post-treatments. In (1), data processing included principal component analysis (PCA) and chemical knowledge on possible oxidation reactions to prioritize non-target features likely to be OTPs. Between 394 and 1328 unique potential OTPs were detected in positive ESI for the four ozone doses tested; between 12 and 324 unique potential OTPs were detected in negative ESI. At a specific ozone dose of 0.5 gO₃/gDOC, 27 parent compounds were identified and were related to 69 non-target features selected as potential OTPs. Two OTPs were confirmed with reference standards (venlafaxine N-oxide and chlorothiazide); 34 other potential OTPs were in agreement with literature data and/or reaction mechanisms. In (2), hierarchical cluster analysis (HCA) was applied on profiles detected in positive ESI mode across the

  20. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.

  1. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
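
    A toy genetic algorithm in the spirit of this record: choose the subset of precursor-to-fragment transitions whose summed comprehensive (DIA-style) intensities best track the targeted-assay values. The data are synthetic, and Pearson correlation stands in for the paper's fitness based on fitted linear models; encoding, operators, and parameters are illustrative assumptions:

    ```python
    import random
    random.seed(0)

    N_SAMPLES, N_TRANS = 20, 8
    targeted = [random.uniform(1, 100) for _ in range(N_SAMPLES)]
    # Transitions 0-3 track the targeted values; 4-7 are interference-ridden.
    dia = [[targeted[s] * random.uniform(0.9, 1.1) if t < 4
            else random.uniform(1, 100) for t in range(N_TRANS)]
           for s in range(N_SAMPLES)]

    def pearson(x, y):
        n = len(x); mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x); syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    def fitness(mask):
        """Correlation between targeted values and the selected-transition sums."""
        if not any(mask):
            return -1.0
        summed = [sum(row[t] for t in range(N_TRANS) if mask[t]) for row in dia]
        return pearson(targeted, summed)

    def evolve(pop_size=30, gens=40, p_mut=0.1):
        pop = [[random.random() < 0.5 for _ in range(N_TRANS)]
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]          # elitist selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_TRANS)  # one-point crossover
                child = a[:cut] + b[cut:]
                child = [not g if random.random() < p_mut else g for g in child]
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("selected transitions:", [t for t, keep in enumerate(best) if keep])
    print("fitness:", round(fitness(best), 4))
    ```

    On this synthetic data the GA converges toward the interference-free transitions, which is exactly the behavior the paper exploits to keep hybrid targeted/comprehensive quantification in agreement.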

  2. Specifying a target trial prevents immortal time bias and other self-inflicted injuries in observational analyses

    PubMed Central

    Hernán, Miguel A.; Sauer, Brian C.; Hernández-Díaz, Sonia; Platt, Robert; Shrier, Ian

    2016-01-01

    Many analyses of observational data are attempts to emulate a target trial. The emulation of the target trial may fail when researchers deviate from simple principles that guide the design and analysis of randomized experiments. We review a framework to describe and prevent biases, including immortal time bias, that result from a failure to align start of follow-up, specification of eligibility, and treatment assignment. We review some analytic approaches to avoid these problems in comparative effectiveness or safety research. PMID:27237061
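
    Immortal time bias, the framework's flagship example, can be demonstrated in a few lines: if "treated" status is defined by an event that can only happen after surviving some initial period, the treated group appears to live longer even when treatment does nothing. This simulation is illustrative only; the numbers are arbitrary assumptions:

    ```python
    import random
    random.seed(1)

    def simulate(n=20000, rate=0.01, rx_day=90):
        """Everyone shares the same exponential survival distribution; 'treatment'
        is dispensed at day rx_day, so only those still alive then can ever be
        classified as treated. Returns (mean survival treated, untreated)."""
        treated, untreated = [], []
        for _ in range(n):
            t = random.expovariate(rate)        # survival time, days
            (treated if t > rx_day else untreated).append(t)
        return sum(treated) / len(treated), sum(untreated) / len(untreated)

    mean_rx, mean_no = simulate()
    # The 'treated' appear to live far longer despite identical hazards,
    # purely because the first 90 days of their follow-up were immortal.
    print(round(mean_rx), round(mean_no))
    ```

    Aligning start of follow-up with treatment assignment, as a target trial specification forces you to do, removes the immortal person-time and the spurious benefit with it.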

  3. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  4. Limitations of analytical dose calculations for small field proton radiosurgery.

    PubMed

    Geng, Changran; Daartz, Juliane; Lam-Tin-Cheung, Kimberley; Bussiere, Marc; Shih, Helen A; Paganetti, Harald; Schuemann, Jan

    2017-01-07

    The purpose of the work was to evaluate the dosimetric uncertainties of an analytical dose calculation engine and the impact on treatment plans using small fields in intracranial proton stereotactic radiosurgery (PSRS) for a gantry-based double-scattering system. Fifty patients were evaluated, 10 for each of five diagnostic indications: arteriovenous malformation (AVM), acoustic neuroma (AN), meningioma (MGM), metastasis (METS), and pituitary adenoma (PIT). Treatment plans followed standard prescription and optimization procedures for PSRS. We performed comparisons between delivered dose distributions, determined by Monte Carlo (MC) simulations, and those calculated with the analytical dose calculation algorithm (ADC) used in our current treatment planning system in terms of dose volume histogram parameters and beam range distributions. Results show that the difference in the dose to 95% of the target (D95) is within 6% when applying measured field size output corrections for AN, MGM, and PIT. However, for AVM and METS, the differences can be as great as 10% and 12%, respectively. Normalizing the MC dose to the ADC dose based on the dose of voxels in a central area of the target reduces the difference in D95 to within 6% for all sites. The generally applied margin to cover uncertainties in range (3.5% of the prescribed range + 1 mm) is not sufficient to cover the range uncertainty for ADC in all cases, especially for patients with high tissue heterogeneity. The root mean square of the R90 difference, the difference in the position of distal falloff to 90% of the prescribed dose, is affected by several factors, especially the patient geometry heterogeneity, modulation and field diameter. In conclusion, implementation of Monte Carlo dose calculation techniques into the clinic can reduce the uncertainty of the target dose for proton stereotactic radiosurgery. If MC is not available for treatment planning, using MC dose distributions to

  5. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  6. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  7. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  8. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we perform ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental, by application of different weights to decision making criteria. All three scenarios indicate capillary electrophoresis-based procedure as the most preferable. Apart from that the details of ranking results differ for these three scenarios. The second run of rankings was done for scenarios that include metrological, economic and environmental criteria only, neglecting others. These results show that green analytical chemistry-based selection correlates with economic, while there is no correlation with metrological ones. This is an implication that green analytical chemistry can be brought into laboratories without analytical performance costs and it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
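    The ranking method named above can be sketched as follows: PROMETHEE II computes, for each alternative, a weighted net outranking flow from pairwise comparisons across criteria. The procedures, criteria, scores, and weights below are hypothetical stand-ins, not the paper's data, and the simplest 0/1 ("usual") preference function is assumed.

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Rank alternatives by PROMETHEE II net outranking flow.

    scores:   (n_alternatives, n_criteria) performance matrix
    weights:  criterion weights summing to 1
    maximize: per-criterion flag; False means lower is better
    """
    n, m = scores.shape
    phi = np.zeros(n)
    for k in range(m):
        col = scores[:, k] if maximize[k] else -scores[:, k]
        # Usual preference function: P(a, b) = 1 if a strictly beats b, else 0
        pref = (col[:, None] > col[None, :]).astype(float)
        # Weighted positive minus negative flow, normalized by (n - 1)
        phi += weights[k] * (pref.sum(axis=1) - pref.sum(axis=0)) / (n - 1)
    return phi

# Three hypothetical procedures scored on cost, LOD, and waste (all lower-is-better)
scores = np.array([[3.0, 0.10, 5.0],
                   [2.0, 0.50, 2.0],
                   [4.0, 0.05, 8.0]])
weights = np.array([0.4, 0.4, 0.2])
phi = promethee_ii(scores, weights, maximize=[False, False, False])
best = int(np.argmax(phi))  # index of the most preferable procedure
```

Re-running with different weight vectors reproduces the paper's "scenario" idea: each weighting (metrological, economic, environmental) yields its own ranking.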

  9. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    PubMed

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  10. Organic materials able to detect analytes

    NASA Technical Reports Server (NTRS)

    Swager, Timothy M. (Inventor); Zhu, Zhengguo (Inventor); Bulovic, Vladimir (Inventor); Rose, Aimee (Inventor); Madigan, Conor Francis (Inventor)

    2012-01-01

    The present invention generally relates to polymers with lasing characteristics that allow the polymers to be useful in detecting analytes. In one aspect, the polymer, upon an interaction with an analyte, may exhibit a change in a lasing characteristic that can be determined in some fashion. For example, interaction of an analyte with the polymer may affect the ability of the polymer to reach an excited state that allows stimulated emission of photons to occur, which may be determined, thereby determining the analyte. In another aspect, the polymer, upon interaction with an analyte, may exhibit a change in stimulated emission that is at least 10 times greater with respect to a change in the spontaneous emission of the polymer upon interaction with the analyte. The polymer may be a conjugated polymer in some cases. In one set of embodiments, the polymer includes one or more hydrocarbon side chains, which may be parallel to the polymer backbone in some instances. In another set of embodiments, the polymer may include one or more pendant aromatic rings. In yet another set of embodiments, the polymer may be substantially encapsulated in a hydrocarbon. In still another set of embodiments, the polymer may be substantially resistant to photobleaching. In certain aspects, the polymer may be useful in the detection of explosive agents, such as 2,4,6-trinitrotoluene (TNT) and 2,4-dinitrotoluene (DNT).

  11. Nanoscaled aptasensors for multi-analyte sensing

    PubMed Central

    Saberian-Borujeni, Mehdi; Johari-Ahar, Mohammad; Hamzeiy, Hossein; Barar, Jaleh; Omidi, Yadollah

    2014-01-01

    Introduction: Nanoscaled aptamers (Aps), as short single-stranded DNA or RNA oligonucleotides, are able to bind to their specific targets with high affinity, upon which they are considered powerful diagnostic and analytical sensing tools (the so-called "aptasensors"). Aptamers are selected from a random pool of oligonucleotides through a procedure known as "systematic evolution of ligands by exponential enrichment". Methods: In this work, the most recent studies in the field of aptasensors are reviewed and discussed with a main focus on the potential of aptasensors for multianalyte detection. Results: Due to the specific folding capability of aptamers in the presence of an analyte, aptasensors have been successfully exploited for the detection of a wide range of small and large molecules (e.g., drugs and their metabolites, toxins, and associated biomarkers in various diseases) at very low concentrations in biological fluids/samples, even in the presence of interfering species. Conclusion: Biological samples are generally complex matrices. Hence, the development of aptasensors with the capability to determine various targets simultaneously within a biological matrix remains the main challenge. To this end, the integration of key scientific domains such as bioengineering and systems biology with biomedical research is inevitable. PMID:25671177

  12. An Automated Directed Spectral Search Methodology for Small Target Detection

    NASA Astrophysics Data System (ADS)

    Grossman, Stanley I.

    Much of the current effort in remote sensing tackles macro-level problems such as determining the extent of wheat in a field, the general health of vegetation, or the extent of mineral deposits in an area. However, for many of the remaining remote sensing challenges being studied currently, such as border protection, drug smuggling, treaty verification, and the war on terror, most targets are very small in nature - a vehicle or even a person. While in typical macro-level problems the object of interest, such as vegetation, is known to be in the scene, for small target detection problems it is not usually known whether the desired small target even exists in the scene, never mind finding it in abundance. The ability to find specific small targets, such as vehicles, typifies this problem. Complicating the analyst's life, the growing number of available sensors is generating mountains of imagery, outstripping the analysts' ability to visually peruse them. This work presents the important factors influencing spectral exploitation using multispectral data and suggests a different approach to small target detection. The methodology of directed search is presented, including the use of scene-modeled spectral libraries, various search algorithms, and traditional statistical and ROC curve analysis. The work suggests a new metric to calibrate analysis, labeled the analytic sweet spot, as well as an estimation method for identifying the sweet spot threshold for an image. It also suggests a new visualization aid for highlighting the target in its entirety, called nearest neighbor inflation (NNI). It brings these all together to propose that these additions to the target detection arena allow for the construction of a fully automated target detection scheme. This dissertation next details experiments to support the hypothesis that the optimum detection threshold is the analytic sweet spot and that the estimation method adequately predicts it. Experimental results and analysis are presented for the proposed directed

  13. Effect of H-wave polarization on laser radar detection of partially convex targets in random media.

    PubMed

    El-Ocla, Hosam

    2010-07-01

    The performance of the laser radar cross section (LRCS) of conducting targets with large sizes is investigated numerically in free space and in random media. The LRCS is calculated using a boundary value method with beam wave incidence and H-wave polarization. Considered are the elements that contribute to the LRCS problem, including random medium strength, target configuration, and beam width. The effect of the creeping waves, stimulated by H-polarization, on the LRCS behavior is manifested. Targets of up to five wavelengths in size are sufficiently larger than the beam width and are sufficient for considering fairly complex targets. Scatterers are assumed to have analytical partially convex contours with inflection points.

  14. Trace level detection of analytes using artificial olfactometry

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik J. (Inventor); Wong, Bernard (Inventor)

    2002-01-01

    The present invention provides a device for detecting the presence of an analyte, such as for example, a lightweight device, including: a sample chamber having a fluid inlet port for the influx of the analyte; a fluid concentrator in flow communication with the sample chamber wherein the fluid concentrator has an absorbent material capable of absorbing the analyte and capable of desorbing a concentrated analyte; and an array of sensors in fluid communication with the concentrated analyte to be released from the fluid concentrator.

  15. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the

  16. Applying Advanced Analytical Approaches to Characterize the Impact of Specific Clinical Gaps and Profiles on the Management of Rheumatoid Arthritis.

    PubMed

    Ruiz-Cordell, Karyn D; Joubin, Kathy; Haimowitz, Steven

    2016-01-01

    The goal of this study was to add a predictive modeling approach to the meta-analysis of continuing medical education curricula to determine whether this technique can be used to better understand clinical decision making. Using the education of rheumatologists on rheumatoid arthritis management as a model, this study demonstrates how the combined methodology has the ability to not only characterize learning gaps but also identify those proficiency areas that have the greatest impact on clinical behavior. The meta-analysis included seven curricula with 25 activities. Learners who identified as rheumatologists were evaluated across multiple learning domains, using a uniform methodology to characterize learning gains and gaps. A performance composite variable (called the treatment individualization and optimization score) was then established as a target upon which predictive analytics were conducted. Significant predictors of the target included items related to the knowledge of rheumatologists and confidence concerning 1) treatment guidelines and 2) tests that measure disease activity. In addition, a striking demographic predictor related to geographic practice setting was also identified. The results demonstrate the power of advanced analytics to identify key predictors that influence clinical behaviors. Furthermore, the ability to provide an expected magnitude of change if these predictors are addressed has the potential to substantially refine educational priorities to those drivers that, if targeted, will most effectively overcome clinical barriers and lead to the greatest success in achieving treatment goals.

  17. Counterfeit drugs: analytical techniques for their identification.

    PubMed

    Martino, R; Malet-Martino, M; Gilard, V; Balayssac, S

    2010-09-01

    In recent years, the number of counterfeit drugs has increased dramatically, including not only "lifestyle" products but also vital medicines. Besides the threat to public health, the financial and reputational damage to pharmaceutical companies is substantial. The lack of robust information on the prevalence of fake drugs is an obstacle in the fight against drug counterfeiting. It is generally accepted that approximately 10% of drugs worldwide could be counterfeit, but it is also well known that this number covers very different situations depending on the country, the places where the drugs are purchased, and the definition of what constitutes a counterfeit drug. The chemical analysis of drugs suspected to be fake is a crucial step as counterfeiters are becoming increasingly sophisticated, rendering visual inspection insufficient to distinguish the genuine products from the counterfeit ones. This article critically reviews the recent analytical methods employed to control the quality of drug formulations, using as an example artemisinin derivatives, medicines particularly targeted by counterfeiters. Indeed, a broad panel of techniques have been reported for their analysis, ranging from simple and cheap in-field ones (colorimetry and thin-layer chromatography) to more advanced laboratory methods (mass spectrometry, nuclear magnetic resonance, and vibrational spectroscopies) through chromatographic methods, which remain the most widely used. The conclusion section of the article highlights the questions to be posed before selecting the most appropriate analytical approach.

  18. How health leaders can benefit from predictive analytics.

    PubMed

    Giga, Aliyah

    2017-11-01

    Predictive analytics can support a better integrated health system providing continuous, coordinated, and comprehensive person-centred care to those who could benefit most. In addition to dollars saved, using a predictive model in healthcare can generate opportunities for meaningful improvements in efficiency, productivity, costs, and better population health with targeted interventions toward patients at risk.

  19. Analytical model for the threshold voltage of III-V nanowire transistors including quantum effects

    NASA Astrophysics Data System (ADS)

    Marin, E. G.; Ruiz, F. G.; Tienda-Luna, I. M.; Godoy, A.; Gámiz, F.

    2014-02-01

    In this work we propose an analytical model for the threshold voltage (VT) of III-V cylindrical nanowires that takes into consideration the two-dimensional quantum confinement of the carriers, the Fermi-Dirac statistics, the wave-function penetration into the gate insulator, and the non-parabolicity of the conduction band structure. A simple expression for VT is obtained under some suitable approximations. The model results are compared to those of a 2D self-consistent Schrödinger-Poisson solver, demonstrating a good fit for different III-V materials, insulator thicknesses, and nanowire sizes with diameters down to 5 nm. The VT dependence on the confinement effective mass is discussed. The different contributions to VT are analyzed, showing significant variations among different III-V materials.

  20. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets

    PubMed Central

    Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio

    2017-01-01

    The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have been addressed to investigate the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and respond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage. PMID:28273801
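    The ideal discovery process the paper models can be sketched with a toy simulation: an advertiser transmits at a fixed interval plus a small random delay, and is discovered at the first advertising event that lands inside a scan window. All parameter names and values below are illustrative, and the sketch deliberately omits the chipset scanning gaps the paper measures.

```python
import random

def discovery_latency(adv_interval_ms, scan_interval_ms, scan_window_ms,
                      max_time_ms=60_000, seed=0):
    """Return the time (ms) of the first advertising event heard by the
    scanner, or None if no event falls inside a scan window in time."""
    rng = random.Random(seed)
    t = rng.uniform(0, adv_interval_ms)  # phase of the first advertising event
    while t < max_time_ms:
        # Scanner listens during [k*scanInterval, k*scanInterval + scanWindow)
        if t % scan_interval_ms < scan_window_ms:
            return t                     # discovered
        t += adv_interval_ms + rng.uniform(0, 10)  # advDelay in [0, 10] ms
    return None

# 10% scanner duty cycle: discovery usually succeeds, but with some latency
lat = discovery_latency(adv_interval_ms=100, scan_interval_ms=1000,
                        scan_window_ms=100, seed=1)
# Full-window scanning: the very first advertising event is heard
lat_full = discovery_latency(100, 1000, 1000)
```

Sweeping the parameters of such a model over many random seeds gives the kind of latency-vs-configuration curves that the paper compares against real-chipset measurements.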

  1. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets.

    PubMed

    Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio

    2017-03-03

    The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have been addressed to investigate the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and respond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage.

  2. Sensor arrays for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Freund, Michael S. (Inventor); Lewis, Nathan S. (Inventor)

    2000-01-01

    A sensor array for detecting an analyte in a fluid, comprising at least first and second chemically sensitive resistors electrically connected to an electrical measuring apparatus, wherein each of the chemically sensitive resistors comprises a mixture of a nonconductive material and a conductive material. Each resistor provides an electrical path through the mixture of nonconductive and conductive material. The resistors also exhibit a different resistance when contacted with a fluid comprising an analyte at a first concentration than when contacted with the analyte at a second, different concentration. A broad range of analytes can be detected using the sensors of the present invention. Examples of such analytes include, but are not limited to, alkanes, alkenes, alkynes, dienes, alicyclic hydrocarbons, arenes, alcohols, ethers, ketones, aldehydes, carbonyls, carbanions, polynuclear aromatics, organic derivatives, biomolecules, sugars, isoprenes, isoprenoids and fatty acids. Moreover, applications for the sensors of the present invention include, but are not limited to, environmental toxicology, remediation, biomedicine, material quality control, food monitoring and agricultural monitoring.
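    The readout such a chemiresistor array implies can be sketched numerically: each sensor reports a normalized resistance change, and the resulting response pattern is matched against stored fingerprints. All resistances and fingerprints below are invented for illustration, and the nearest-neighbor classifier is an assumption, not something the patent prescribes.

```python
def response_pattern(r_baseline, r_exposed):
    """Normalized differential response (R_exposed - R_baseline) / R_baseline
    for each sensor in the array."""
    return [(re - rb) / rb for rb, re in zip(r_baseline, r_exposed)]

def classify(pattern, library):
    """Assign the pattern to the closest stored fingerprint (Euclidean)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return min(library, key=lambda name: dist(pattern, library[name]))

library = {
    "ethanol": [0.10, 0.02, 0.30],   # hypothetical 3-sensor fingerprints
    "hexane":  [0.25, 0.01, 0.05],
}
baseline = [1000.0, 2200.0, 1500.0]  # ohms, before exposure
exposed = [1095.0, 2240.0, 1940.0]   # ohms, during exposure
pattern = response_pattern(baseline, exposed)
guess = classify(pattern, library)
```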

  3. An Evaluation of a Flight Deck Interval Management Algorithm Including Delayed Target Trajectories

    NASA Technical Reports Server (NTRS)

    Swieringa, Kurt A.; Underwood, Matthew C.; Barmore, Bryan; Leonard, Robert D.

    2014-01-01

    NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature air traffic management technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain precise in-trail spacing. During high demand operations, TMA-TM may produce a schedule and corresponding aircraft trajectories that include delay to ensure that a particular aircraft will be properly spaced from other aircraft at each schedule waypoint. These delayed trajectories are not communicated to the automation onboard the aircraft, forcing the IM aircraft to use the published speeds to estimate the target aircraft's time of arrival. As a result, the aircraft performing IM operations may follow an aircraft whose TMA-TM generated trajectories have substantial speed deviations from the speeds expected by the spacing algorithm. Previous spacing algorithms were not designed to handle this magnitude of uncertainty. A simulation was conducted to examine a modified spacing algorithm with the ability to follow aircraft flying delayed trajectories. The simulation investigated the use of the new spacing algorithm with various delayed speed profiles and wind conditions, as well as several other variables designed to simulate real-life variability. The results and conclusions of this study indicate that the new spacing algorithm generally exhibits good performance; however, some types of target aircraft speed profiles can cause the spacing algorithm to command less than optimal speed control behavior.

  4. Adjustment of pesticide concentrations for temporal changes in analytical recovery, 1992–2010

    USGS Publications Warehouse

    Martin, Jeffrey D.; Eberle, Michael

    2011-01-01

    Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ("spiked" QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in apparent environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report presents data and models related to the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as "pesticides") that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 through 2010 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Models of recovery, based on robust, locally weighted scatterplot smooths (lowess smooths) of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
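    The adjustment described above reduces to dividing a measured concentration by the modeled recovery fraction for the sample date. In the sketch below, a simple linear interpolation between smoothed points stands in for the report's lowess model of matrix-spike recoveries; all recovery values and the concentration are illustrative.

```python
def modeled_recovery(year, model):
    """Interpolate recovery (%) for a sample year.

    model: sorted (year, recovery_percent) pairs taken from the smooth.
    Years outside the modeled period are clamped to the endpoints."""
    (y0, r0), *rest = model
    if year <= y0:
        return r0
    for y1, r1 in rest:
        if year <= y1:
            return r0 + (r1 - r0) * (year - y0) / (y1 - y0)
        y0, r0 = y1, r1
    return r0

def adjust_to_full_recovery(measured, year, model):
    """Express a measured concentration at 100% recovery."""
    return measured / (modeled_recovery(year, model) / 100.0)

# Hypothetical smooth: recovery drifting from 90% (1992) down to 75% (2010)
model = [(1992, 90.0), (2001, 82.0), (2010, 75.0)]
adj = adjust_to_full_recovery(0.041, 2010, model)  # concentration in ug/L
```

Because the adjustment divides by a recovery below 100%, adjusted concentrations are higher than measured ones, and the correction grows as method performance degrades over time.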

  5. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytic integrable dynamical systems. In more detail, we prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0), with B having no eigenvalue of modulus 1 and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system x˙=Ax+f(x) in (C^n,0), with A having nonzero eigenvalues and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Furthermore we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in the way that our paper presents the concrete expression of the normal form in a restricted case.

  6. A Method for Analyzing Commonalities in Clinical Trial Target Populations

    PubMed Central

    He, Zhe; Carini, Simona; Hao, Tianyong; Sim, Ida; Weng, Chunhua

    2014-01-01

    ClinicalTrials.gov presents great opportunities for analyzing commonalities in clinical trial target populations to facilitate knowledge reuse when designing eligibility criteria of future trials or to reveal potential systematic biases in selecting population subgroups for clinical research. Towards this goal, this paper presents a novel data resource for enabling such analyses. Our method includes two parts: (1) parsing and indexing eligibility criteria text; and (2) mining common eligibility features and attributes of common numeric features (e.g., A1c). We designed and built a database called “Commonalities in Target Populations of Clinical Trials” (COMPACT), which stores structured eligibility criteria and trial metadata in a readily computable format. We illustrate its use in an example analytic module called CONECT using COMPACT as the backend. Type 2 diabetes is used as an example to analyze commonalities in the target populations of 4,493 clinical trials on this disease. PMID:25954450
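    The kind of commonality analysis COMPACT enables can be sketched as counting how often each structured eligibility feature appears across trials and summarizing the numeric bounds used for a feature such as A1c. The trial records below are invented for illustration; COMPACT's actual schema is not given in the abstract.

```python
from collections import Counter

# Hypothetical structured eligibility criteria: feature -> (low, high) bounds
trials = {
    "NCT-A": {"age": (18, 75), "hba1c": (7.0, 10.0)},
    "NCT-B": {"age": (21, 80), "hba1c": (6.5, 11.0)},
    "NCT-C": {"age": (18, 65)},
}

# How often does each eligibility feature appear across trials?
feature_counts = Counter(f for criteria in trials.values() for f in criteria)

def lower_bound_range(feature):
    """Spread of the lower bounds used for a numeric feature across trials."""
    lows = [lo for c in trials.values() if feature in c
            for lo, hi in [c[feature]]]
    return min(lows), max(lows)

common = feature_counts.most_common(1)[0]  # most frequently used feature
```

Applied to thousands of parsed trials, the same two steps (frequency of features, distribution of their numeric attributes) surface both reusable criteria and potential systematic biases in population selection.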

  7. Analytical evaluation of the trajectories of hypersonic projectiles launched into space

    NASA Astrophysics Data System (ADS)

    Stutz, John David

    An equation of motion has been derived that may be solved using simple analytic functions which describes the motion of a projectile launched from the surface of the Earth into space accounting for both Newtonian gravity and aerodynamic drag. The equation of motion is based upon the Kepler equation of motion differential and variable transformations with the inclusion of a decaying angular momentum driving function and appropriate simplifying assumptions. The new equation of motion is first compared to various numerical and analytical trajectory approximations in a non-rotating Earth reference frame. The Modified Kepler solution is then corrected to include Earth rotation and compared to a rotating Earth simulation. Finally, the modified equation of motion is used to predict the apogee and trajectory of projectiles launched into space by the High Altitude Research Project from 1961 to 1967. The new equation of motion allows for the rapid equalization of projectile trajectories and intercept solutions that may be used to calculate firing solutions to enable ground launched projectiles to intercept or rendezvous with targets in low Earth orbit such as ballistic missiles.

  8. Constraints in distortion-invariant target recognition system simulation

    NASA Astrophysics Data System (ADS)

    Iftekharuddin, Khan M.; Razzaque, Md A.

    2000-11-01

    Automatic target recognition (ATR) is a mature but active research area. In an earlier paper, we proposed a novel ATR approach for recognition of targets varying in fine details, rotation, and translation using a Learning Vector Quantization (LVQ) Neural Network (NN). The proposed approach performed segmentation of multiple objects and identification of the objects using the LVQ NN. In the current paper, we extend the previous approach to recognition of targets varying in rotation, translation, scale, and combinations of all three distortions. We obtain the analytical results of the system level design to show that the approach performs well under some constraints. The first constraint determines the size of the input images and input filters. The second constraint shows the limits on the amount of rotation, translation, and scale of input objects. We present the simulation verification of the constraints using DARPA's Moving and Stationary Target Acquisition and Recognition (MSTAR) images with different depression and pose angles. The simulation results using MSTAR images verify the analytical constraints of the system level design.

  9. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  10. Accuracy of Geographically Targeted Internet Advertisements on Google Adwords for Recruitment in a Randomized Trial

    PubMed Central

    Goldsmith, Lesley; Williams, Christopher J; Kamel Boulos, Maged N

    2012-01-01

    Background Google AdWords are increasingly used to recruit people into research studies and clinical services. They offer the potential to recruit from targeted control areas in cluster randomized controlled trials (RCTs), but little is known about the feasibility of accurately targeting ads by location and comparing with control areas. Objective To examine the accuracy and contamination of control areas by a location-targeted online intervention using Google AdWords in a pilot cluster RCT. Methods Based on previous use of online cognitive behavioral therapy for depression and population size, we purposively selected 16 of the 121 British postcode areas and randomized them to three intervention and one (do-nothing) control arms. Two intervention arms included use of location-targeted AdWords, and we compared these with the do-nothing control arm. We did not raise the visibility of our research website to normal Web searches. Users who clicked on the ad were directed to our project website, which collected the computer Internet protocol (IP) address, date, and time. Visitors were asked for their postcode area and to complete the Patient Health Questionnaire (depression). They were then offered links to several online depression resources. Google Analytics largely uses IP methods to estimate location, but AdWords uses additional information. We compared locations assessed by (1) Analytics, and (2) as self-identified by users. Results Ads were shown 300,523 times with 4207 click-throughs. There were few site visits except through AdWord click-throughs. Both methods of location assessment agreed there was little contamination of control areas. According to Analytics, 69.75% (2617/3752) of participants were in intervention areas, only 0.21% (8/3752) in control areas, but 30.04% (1127/3752) in other areas. However, according to user-stated postcodes, only 20.7% (463/2237) were in intervention areas, 0.98% (22/2236) in control areas, but 78.31% (1751/2236) in other areas. 
Both

  11. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  12. Beyond privacy and exposure: ethical issues within citizen-facing analytics.

    PubMed

    Grindrod, Peter

    2016-12-28

    We discuss the governing forces for analytics, especially concerning citizens' behaviours and their transactions, that depend on which of three spheres of operation an institution is in (corporate, public sector/government and academic). We argue that aspirations and missions also differ by sphere even as digital spaces have drawn these spheres ever closer together. We propose that citizens' expectations and implicit permissions for any exploitation of their data require the perception of a fair balance of benefits, which should be transparent (accessible to citizens) and justifiable. We point out that within the corporate sphere most analytics does not concern identity, targeted marketing, or any direct interference with individual citizens; instead it supports strategic decision-making, where the data are effectively anonymous. Within the three spheres we discuss the nature of models deployed in analytics, including 'black-box' modelling uncheckable by a human mind, and the need to track the provenance and workings of models. We also examine the recent evolution of personal data, where some behaviours, or tokens, identifying individuals (unique and yet non-random) are partially and jointly owned by other individuals that are themselves connected. We consider the ability of heavily and lightly regulated sectors to increase access or to stifle innovation. We also call for clear and inclusive definitions of 'data science and analytics', avoiding the narrow claims of those in technical sub-sectors or sub-themes. Finally, we examine some examples of unethical and abusive practices. We argue for an ethical responsibility to be placed upon professional data scientists to avoid abuses in the future. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).

  13. Specifying a target trial prevents immortal time bias and other self-inflicted injuries in observational analyses.

    PubMed

    Hernán, Miguel A; Sauer, Brian C; Hernández-Díaz, Sonia; Platt, Robert; Shrier, Ian

    2016-11-01

    Many analyses of observational data are attempts to emulate a target trial. The emulation of the target trial may fail when researchers deviate from simple principles that guide the design and analysis of randomized experiments. We review a framework to describe and prevent biases, including immortal time bias, that result from a failure to align start of follow-up, specification of eligibility, and treatment assignment. We review some analytic approaches to avoid these problems in comparative effectiveness or safety research. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in its draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts, checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, giving examples. Significance tests should be avoided: the success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
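    The equivalence-test idea recommended above can be sketched as a two one-sided tests (TOST) procedure. This is a minimal illustration, not a validated implementation: the ±2.0-unit acceptance limit and the data are invented, and the critical value is hardcoded for the one-sided 5% level at df = 8 (two groups of five runs each).

```python
from math import sqrt
from statistics import mean, stdev

# Sketch of a TOST equivalence check between sending and receiving units.
# Acceptance limit and data are invented; t_crit assumes df = 8, alpha = 0.05.
def tost_equivalent(sending, receiving, limit=2.0, t_crit=1.860):
    n1, n2 = len(sending), len(receiving)
    diff = mean(receiving) - mean(sending)
    # pooled variance of the two groups
    sp2 = ((n1 - 1) * stdev(sending) ** 2
           + (n2 - 1) * stdev(receiving) ** 2) / (n1 + n2 - 2)
    se = sqrt(sp2) * sqrt(1 / n1 + 1 / n2)
    # reject both H0: diff <= -limit and H0: diff >= +limit
    return (diff + limit) / se > t_crit and (diff - limit) / se < -t_crit

sending = [99.1, 100.2, 99.8, 100.5, 99.6]
receiving = [100.0, 100.8, 99.9, 100.6, 100.3]
print("equivalent:", tost_equivalent(sending, receiving))  # → equivalent: True
```

    Unlike a significance test, this procedure declares success only when the observed difference is demonstrably inside an analytically relevant limit, which is exactly the point the abstract makes.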

  15. Spatiotemporal and geometric optimization of sensor arrays for detecting analytes in fluids

    DOEpatents

    Lewis, Nathan S.; Freund, Michael S.; Briglin, Shawn M.; Tokumaru, Phil; Martin, Charles R.; Mitchell, David T.

    2006-10-17

    Sensor arrays and sensor array systems for detecting analytes in fluids. Sensors configured to generate a response upon introduction of a fluid containing one or more analytes can be located on one or more surfaces relative to one or more fluid channels in an array. Fluid channels can take the form of pores or holes in a substrate material. Fluid channels can be formed between one or more substrate plates. Sensors can be fabricated with substantially optimized sensor volumes to generate a response having a substantially maximized signal-to-noise ratio upon introduction of a fluid containing one or more target analytes. Methods of fabricating and using such sensor arrays and systems are also disclosed.

  16. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  17. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  18. Analysis of Method TO-14 target analytes using a cryofocusing high-speed gas chromatograph interfaced to a high-speed time-of-flight mass spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berkley, R.E.; Gardner, B.D.; Holland, J.F.

    1997-12-31

    A high-speed gas chromatograph coupled with a high-speed time-of-flight mass spectrometer was used to gain a six-fold increase in overall rate of analytical throughput for analysis of EPA Method TO-14 target compounds. Duration of chromatograms was 180 seconds. One hundred mass spectra per second, ranging from 35 to 270 mass units, were collected. Single ion chromatograms were searched at appropriate retention times for chromatographic peaks, which were integrated. Thirty-eight of the forty-one TO-14 target compounds were calibrated using standards at five concentrations from 2.5 to 40 ppb. Four grab samples of ambient air were collected at four different locations at an automobile repair facility, and two grab samples were collected less than one minute apart at a site near a chemical plant, just before and just after passage of three large diesel trucks. All samples were analyzed on the same day they were collected. Most of the duplicate analyses were in close agreement. Ability of the high-speed TOF/GC/MS system to perform analyses of TO-14 target compounds rapidly and precisely was demonstrated. This paper has been reviewed according to US Environmental Protection Agency peer and administrative review policies and approved for presentation and publication. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.
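    The calibration step described above (five standards from 2.5 to 40 ppb, integrated peak areas) amounts to a linear least-squares fit that is then inverted to quantify unknowns. The sketch below illustrates this with invented peak areas; it is not the study's actual calibration data.

```python
# Illustrative single-ion calibration in the spirit of the TO-14 workflow:
# fit integrated peak area vs. standard concentration, then invert for an
# unknown. Peak areas here are hypothetical.
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = slope*x + intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

conc_ppb = [2.5, 5.0, 10.0, 20.0, 40.0]              # five calibration standards
peak_area = [510.0, 1010.0, 2040.0, 4010.0, 8020.0]  # hypothetical responses

slope, intercept = linear_fit(conc_ppb, peak_area)
unknown_area = 3000.0
print(f"estimated concentration: {(unknown_area - intercept) / slope:.1f} ppb")
```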

  19. A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Lopes, Leonard V.

    2017-01-01

    More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method for deriving the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable-impedance liners is explored using multi-objective optimization.
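    The accuracy advantage of analytic derivatives over finite differences can be seen on a toy objective. This is a generic sketch, not the paper's acoustic-liner model: for f(x) = Σ x_i⁴, the analytic gradient 4x³ is exact, while a forward finite difference carries truncation error proportional to the step size.

```python
# Generic illustration of analytic vs. finite-difference gradients
# (not the paper's acoustic-liner model).
def f(x):
    return sum(v ** 4 for v in x)

def grad_analytic(x):
    """Exact gradient of sum(x_i^4)."""
    return [4 * v ** 3 for v in x]

def grad_fd(x, h=1e-3):
    """Forward finite-difference gradient, truncation error ~ 6*x^2*h per entry."""
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        g.append((f(xp) - f(x)) / h)
    return g

x = [1.0, 2.0, -1.5]
err = max(abs(a - b) for a, b in zip(grad_analytic(x), grad_fd(x)))
print(f"max finite-difference error: {err:.4f}")
```

    Beyond accuracy, analytic gradients cost one evaluation regardless of dimension, whereas finite differences need one extra function evaluation per design variable, which is why they enable larger design spaces.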

  20. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process from investment to operation and, finally, retirement. In recent years, an increasing interest in this concept has also emerged for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification, and method transfer), and finally retirement of the method. Regulatory bodies, too, have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This decreases the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, strongly contributing to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. It's time to put the C.A.R.T. before the H.O.R.S.E. or putting critical, analytical, and reflective thinking before

    Treesearch

    David L. Jewell

    2002-01-01

    Higher education is the target of criticism for, among other things, the failure to teach students how to think - critically, analytically, and reflectively - and for placing too much emphasis on career preparation or professional education. While a number of external factors have, perhaps, led to such criticism being warranted, faculty - including those in Recreation...

  2. Analytical performance of a bronchial genomic classifier.

    PubMed

    Hu, Zhanzhi; Whitney, Duncan; Anderson, Jessica R; Cao, Manqiu; Ho, Christine; Choi, Yoonha; Huang, Jing; Frink, Robert; Smith, Kate Porta; Monroe, Robert; Kennedy, Giulia C; Walsh, P Sean

    2016-02-26

    The current standard practice of lung lesion diagnosis often leads to inconclusive results, requiring additional diagnostic follow-up procedures that are invasive and often unnecessary due to the high benign rate in such lesions (Chest 143:e78S-e92, 2013). The Percepta bronchial genomic classifier was developed and clinically validated to provide more accurate classification of lung nodules and lesions that are inconclusive by bronchoscopy, using bronchial brushing specimens (N Engl J Med 373:243-51, 2015; BMC Med Genomics 8:18, 2015). The analytical performance of the Percepta test is reported here. Analytical performance studies were designed to characterize the stability of RNA in bronchial brushing specimens during collection and shipment; analytical sensitivity, defined as input RNA mass; analytical specificity (i.e., potentially interfering substances) as tested on blood and genomic DNA; and assay performance, including intra-run, inter-run, and inter-laboratory reproducibility. RNA content within bronchial brushing specimens preserved in RNAprotect is stable for up to 20 days at 4 °C with no changes in RNA yield or integrity. Analytical sensitivity studies demonstrated tolerance to variation in RNA input (157 ng to 243 ng). Analytical specificity studies utilizing cancer-positive and cancer-negative samples mixed with either blood (up to 10% input mass) or genomic DNA (up to 10% input mass) demonstrated no assay interference. The test is reproducible from RNA extraction through to Percepta test result, including variation across operators, runs, reagent lots, and laboratories (standard deviation of 0.26 for scores on a >6-unit scale). Analytical sensitivity, analytical specificity and robustness of the Percepta test were successfully verified, supporting its suitability for clinical use.

  3. Target annihilation by diffusing particles in inhomogeneous geometries

    NASA Astrophysics Data System (ADS)

    Cassi, Davide

    2009-09-01

    The survival probability of immobile targets annihilated by a population of random walkers on inhomogeneous discrete structures, such as disordered solids, glasses, fractals, polymer networks, and gels, is analytically investigated. It is shown that, while it cannot in general be related to the number of distinct visited points as in the case of homogeneous lattices, in the case of bounded coordination numbers its asymptotic behavior at large times can still be expressed in terms of the spectral dimension d̃, and its exact analytical expression is given. The results show that the asymptotic survival probability is site-independent on recurrent structures (d̃ ≤ 2), while on transient structures (d̃ > 2) it can strongly depend on the target position, and such dependence is explicitly calculated.

  4. Recent advances in analytical methods for the determination of 4-alkylphenols and bisphenol A in solid environmental matrices: A critical review.

    PubMed

    Salgueiro-González, N; Castiglioni, S; Zuccato, E; Turnes-Carou, I; López-Mahía, P; Muniategui-Lorenzo, S

    2018-09-18

    The problem of endocrine disrupting compounds (EDCs) in the environment has become a worldwide concern in recent decades. Besides their toxicological effects at low concentrations and their widespread use in industrial and household applications, these pollutants pose a risk for non-target organisms and also for public safety. Analytical methods to determine these compounds at trace levels in different matrices are urgently needed. This review critically discusses trends in analytical methods for well-known EDCs like alkylphenols and bisphenol A in solid environmental matrices, including sediment and aquatic biological samples (from 2006 to 2018). Information about extraction, clean-up and determination is covered in detail, including analytical quality parameters (QA/QC). Conventional and novel analytical techniques are compared, with their advantages and drawbacks. Ultrasound assisted extraction followed by solid phase extraction clean-up is the most widely used procedure for sediment and aquatic biological samples, although softer extraction conditions have been employed for the latter. The use of liquid chromatography followed by tandem mass spectrometry has greatly increased in the last five years. The majority of these methods have been employed for the analysis of river sediments and bivalve molluscs because of their usefulness in aquatic ecosystem (bio)monitoring programs. Green, simple, fast analytical methods are now needed to determine these compounds in complex matrices. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Accuracy of geographically targeted internet advertisements on Google AdWords for recruitment in a randomized trial.

    PubMed

    Jones, Ray B; Goldsmith, Lesley; Williams, Christopher J; Kamel Boulos, Maged N

    2012-06-20

    Google AdWords are increasingly used to recruit people into research studies and clinical services. They offer the potential to recruit from targeted control areas in cluster randomized controlled trials (RCTs), but little is known about the feasibility of accurately targeting ads by location and comparing with control areas. To examine the accuracy and contamination of control areas by a location-targeted online intervention using Google AdWords in a pilot cluster RCT. Based on previous use of online cognitive behavioral therapy for depression and population size, we purposively selected 16 of the 121 British postcode areas and randomized them to three intervention and one (do-nothing) control arms. Two intervention arms included use of location-targeted AdWords, and we compared these with the do-nothing control arm. We did not raise the visibility of our research website to normal Web searches. Users who clicked on the ad were directed to our project website, which collected the computer Internet protocol (IP) address, date, and time. Visitors were asked for their postcode area and to complete the Patient Health Questionnaire (depression). They were then offered links to several online depression resources. Google Analytics largely uses IP methods to estimate location, but AdWords uses additional information. We compared locations assessed by (1) Analytics, and (2) as self-identified by users. Ads were shown 300,523 times with 4207 click-throughs. There were few site visits except through AdWord click-throughs. Both methods of location assessment agreed there was little contamination of control areas. According to Analytics, 69.75% (2617/3752) of participants were in intervention areas, only 0.21% (8/3752) in control areas, but 30.04% (1127/3752) in other areas. However, according to user-stated postcodes, only 20.7% (463/2237) were in intervention areas, 0.98% (22/2236) in control areas, but 78.31% (1751/2236) in other areas. 
Both location assessments suggested most
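    The proportions reported in the abstract can be reproduced directly from the raw counts, which also makes the Analytics-assessed vs. user-stated discrepancy easy to inspect:

```python
# Reproduce the reported percentages from the (count, denominator) pairs
# given in the abstract.
def pct(part, whole):
    return round(100 * part / whole, 2)

analytics = {"intervention": (2617, 3752), "control": (8, 3752), "other": (1127, 3752)}
stated = {"intervention": (463, 2237), "control": (22, 2236), "other": (1751, 2236)}

for name, groups in (("Analytics", analytics), ("user-stated", stated)):
    print(name, {k: pct(*v) for k, v in groups.items()})
```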

  6. BEAM ON TARGET MODEL Produces All Gamma Ray Burst Phenomena Including Afterglow

    NASA Astrophysics Data System (ADS)

    Greyber, H.

    2000-12-01

    While one must applaud the splendid research by L. Piro et al and L. Amati et al reported in SCIENCE recently, one must question, as M. Rees and S. Woolsey have done, their conclusion that a ``supranova model" is the only explanation for these new X-ray observations. In fact L. Piro was quoted as saying, ``Our data helps rule out the scenario where two neutron stars or black holes collide. We think GRBs result from something similar to a supernova explosion, but much more powerful." A relatively unknown physical model for GRBs, Greyber's Beam On Target model (BOT), dating back to the first CGRO observations, can plausibly explain the iron emission lines observed for GRB991216, and also the mass of the dense medium within a light-day of the GRB being roughly equivalent to at least one-tenth solar mass, as well as the initial shedding of material followed by the GRB event. When a galaxy forms under gravitational collapse in the presence of a primordial magnetic field, Mestel and Strittmatter demonstrated that, for finite Ohmic diffusion, a growing equatorial current loop is formed. Even if this stable ``Storage Ring" has only 10^-9 of the total energy released during a typical galaxy's formation, the relativistic beam can possess 10^58 ergs. The GRB ``fireball" occurs when a target star races across the powerful beam, blowing off target material as a hot, rapidly expanding plasma cloud, simulating an explosion. Since currents in space are known to be sometimes filamentary, sharp millisecond spikes can be expected in some GRBs. Proton and alpha particle nuclear reactions produce a gamma ray beam. Beam particles impinging on denser cloud material create an electromagnetic shower, producing X-ray, optical and radio radiation. Since the Storage Ring has an intense magnetic field around it, synchrotron radiation is expected. The beam, striking a highly evolved massive target star, produces the iron emission lines. H. D. 
Greyber, in ``After the Dark Ages:When Galaxies

  7. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
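    The tiered screening described above can be sketched as a simple scoring rule. This is a hypothetical illustration in the spirit of the USGS procedure, not its actual criteria: the score weights and thresholds below are invented.

```python
# Hypothetical tiered screening rule: score each compound on occurrence,
# toxicity, and external priority, then bin into Tiers 1-3. Weights and
# thresholds are invented for illustration.
def classify(compound):
    score = 0
    score += 2 if compound["detection_freq"] > 0.10 else 0   # measured occurrence
    score += 2 if compound["toxicity_concern"] else 0        # aquatic/human toxicity
    score += 1 if compound["other_agency_priority"] else 0   # regulatory priority
    if score >= 4:
        return "Tier 1"   # high priority
    if score >= 2:
        return "Tier 2"   # moderate priority
    return "Tier 3"       # low priority

example = {"detection_freq": 0.35, "toxicity_concern": True, "other_agency_priority": True}
print(classify(example))  # → Tier 1
```

    The real procedure additionally uses predicted concentrations in water and predicted likelihood of occurrence in sediment, but the structure, multiple weighted lines of evidence collapsed into a small number of priority tiers, is the same.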

  8. A Functional Analytic Approach to Group Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, Luc

    2009-01-01

    This article provides a particular view on the use of Functional Analytical Psychotherapy (FAP) in a group therapy format. This view is based on the author's experiences as a supervisor of Functional Analytical Psychotherapy Groups, including groups for women with depression and groups for chronic pain patients. The contexts in which this approach…

  9. [Target and non-target screening of volatile organic compounds in industrial exhaust gas using thermal desorption-gas chromatography-mass spectrometry].

    PubMed

    Ma, Huilian; Jin, Jing; Li, Yun; Chen, Jiping

    2017-10-08

    A method for comprehensive screening of target and non-target volatile organic compounds (VOCs) in industrial exhaust gas using thermal desorption-gas chromatography-mass spectrometry (TD-GC-MS) has been developed. Two types of solid-phase adsorption column were compared, and the Tenax SS TD tube was selected. The analytes were enriched in the adsorption tube by constant-flow sampling and detected by TD-GC-MS in full-scan mode. Target compounds were quantified by the internal standard method, and the quantities of non-target compounds were calculated using the response coefficient of toluene. The method detection limits (MDLs) for the 24 VOCs were 1.06 to 5.44 ng, which can also be expressed as 0.004 to 0.018 mg/m³ assuming a sampling volume of 300 mL. The average recoveries were in the range of 78.4% to 89.4% with relative standard deviations (RSDs) of 3.9% to 14.4% (n=7). The established analytical method was applied to the comprehensive screening of VOCs in a waste incineration power plant in Dalian city. Twenty-nine VOCs were identified. Among these compounds, only five were target compounds set in advance, which accounted for 26.7% of the total VOCs identified. Therefore, this study further demonstrates the importance of screening non-target compounds in the analysis of VOCs in industrial exhaust gas and provides a useful reference for the complete determination of VOC distributions.
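
    The conversion of the MDLs from an absolute mass in ng to a concentration in mg/m³ follows directly from the stated 300 mL sampling volume, since 1 ng/mL is the same concentration as 1 mg/m³; a quick check:

```python
def mdl_mg_per_m3(mdl_ng, sample_volume_ml=300.0):
    # 1 ng/mL is equivalent to 1 mg/m^3 (both equal 1 µg/L),
    # so dividing nanograms by millilitres gives mg/m^3 directly.
    return mdl_ng / sample_volume_ml

print(round(mdl_mg_per_m3(1.06), 3))  # -> 0.004
print(round(mdl_mg_per_m3(5.44), 3))  # -> 0.018
```

    This reproduces the 0.004-0.018 mg/m³ range quoted in the abstract.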

  10. Analytical research on impacting load of aircraft crashing upon moveable concrete target

    NASA Astrophysics Data System (ADS)

    Zhu, Tong; Ou, Zhuocheng; Duan, Zhuoping; Huang, Fenglei

    2018-03-01

    The impact load of an aircraft crashing upon a moveable concrete target was analyzed in this paper by both theoretical and numerical methods. The aircraft was simplified as a one-dimensional rod, and stress-wave theory was used to derive a new load formula. Furthermore, to compare with previous experimental data, a numerical calculation based on the new formula was carried out, which showed good agreement with the experimental data. The approach, combining the new formula with a dedicated numerical method, can predict not only the impact load but also the deviation between moveable and static concrete targets.

  11. Ion implantation system and process for ultrasensitive determination of target isotopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, III, Orville T.; Liezers, Martin

    2016-09-13

    A system and process are disclosed for ultrasensitive determination of target isotopes of analytical interest in a sample. Target isotopes may be implanted in an implant area on a high-purity substrate to pre-concentrate the target isotopes free of contaminants. A known quantity of a tracer isotope may also be implanted. Target isotopes and tracer isotopes may be determined in a mass spectrometer. The present invention provides ultrasensitive determination of target isotopes in the sample.

  12. Validation of biomarkers to predict response to immunotherapy in cancer: Volume I - pre-analytical and analytical validation.

    PubMed

    Masucci, Giuseppe V; Cesano, Alessandra; Hawtin, Rachael; Janetzki, Sylvia; Zhang, Jenny; Kirsch, Ilan; Dobbin, Kevin K; Alvarez, John; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Butterfield, Lisa H; Thurin, Magdalena

    2016-01-01

    Immunotherapies have emerged as one of the most promising approaches to treat patients with cancer. Recently, there have been many clinical successes using checkpoint receptor blockade, including T cell inhibitory receptors such as cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4) and programmed cell death-1 (PD-1). Despite demonstrated successes in a variety of malignancies, responses only typically occur in a minority of patients in any given histology. Additionally, treatment is associated with inflammatory toxicity and high cost. Therefore, determining which patients would derive clinical benefit from immunotherapy is a compelling clinical question. Although numerous candidate biomarkers have been described, there are currently three FDA-approved assays based on PD-1 ligand expression (PD-L1) that have been clinically validated to identify patients who are more likely to benefit from a single-agent anti-PD-1/PD-L1 therapy. Because of the complexity of the immune response and tumor biology, it is unlikely that a single biomarker will be sufficient to predict clinical outcomes in response to immune-targeted therapy. Rather, the integration of multiple tumor and immune response parameters, such as protein expression, genomics, and transcriptomics, may be necessary for accurate prediction of clinical benefit. Before a candidate biomarker and/or new technology can be used in a clinical setting, several steps are necessary to demonstrate its clinical validity. Although regulatory guidelines provide general roadmaps for the validation process, their applicability to biomarkers in the cancer immunotherapy field is somewhat limited. Thus, Working Group 1 (WG1) of the Society for Immunotherapy of Cancer (SITC) Immune Biomarkers Task Force convened to address this need. In this two volume series, we discuss pre-analytical and analytical (Volume I) as well as clinical and regulatory (Volume II) aspects of the validation process as applied to predictive biomarkers

  13. Spatiotemporal and geometric optimization of sensor arrays for detecting analytes in fluids

    DOEpatents

    Lewis, Nathan S [La Canada, CA]; Freund, Michael S [Winnipeg, CA]; Briglin, Shawn S [Chittenango, NY]; Tokumaru, Phillip [Moorpark, CA]; Martin, Charles R [Gainesville, FL]; Mitchell, David [Newtown, PA]

    2009-09-29

    Sensor arrays and sensor array systems for detecting analytes in fluids. Sensors configured to generate a response upon introduction of a fluid containing one or more analytes can be located on one or more surfaces relative to one or more fluid channels in an array. Fluid channels can take the form of pores or holes in a substrate material. Fluid channels can be formed between one or more substrate plates. Sensors can be fabricated with substantially optimized sensor volumes to generate a response having a substantially maximized signal-to-noise ratio upon introduction of a fluid containing one or more target analytes. Methods of fabricating and using such sensor arrays and systems are also disclosed.

  14. Recent Developments in the VISRAD 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Macfarlane, Joseph; Golovkin, Igor; Sebald, James

    2017-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, Z, and LMJ. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling of laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. VISRAD includes a variety of user-friendly graphics for setting up targets and displaying results, can readily display views from any point in space, and can be used to generate image sequences for animations. We will discuss recent improvements to conveniently assess beam capture on target and beam clearance of diagnostic components, as well as plans for future developments.

  15. Subpixel target detection and enhancement in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Tiwari, K. C.; Arora, M.; Singh, D.

    2011-06-01

    Hyperspectral data, with the higher information content afforded by fine spectral resolution, are increasingly being used for various remote sensing applications, including information extraction at the subpixel level. Matching fine-spatial-resolution data are, however, usually lacking, particularly for target detection applications. Thus, there always exists a tradeoff between spectral and spatial resolution, driven by the type of application, its cost, and the associated analytical and computational complexity. Typically, whenever an object (manmade, natural, or any ground-cover class, variously called a target, endmember, component, or class) is spectrally but not spatially resolved, mixed pixels result in the image. Numerous manmade and/or natural disparate substances may thus occur inside such mixed pixels, giving rise to mixed-pixel classification or subpixel target detection problems. Various spectral unmixing models, such as Linear Mixture Modeling (LMM), are in vogue for recovering the components of a mixed pixel. Spectral unmixing outputs both the endmember spectra and their corresponding abundance fractions inside the pixel. It does not, however, provide the spatial distribution of these abundance fractions within a pixel, which limits the applicability of hyperspectral data for subpixel target detection. In this paper, a new inverse-Euclidean-distance-based super-resolution mapping method is presented that achieves subpixel target detection in hyperspectral images by adjusting the spatial distribution of abundance fractions within a pixel. Results obtained at different resolutions indicate that super-resolution mapping may effectively aid subpixel target detection.
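
    As a sketch of what inverse-Euclidean-distance super-resolution mapping might look like, the toy function below scores each subpixel by its inverse distance to the centres of the eight neighbouring pixels, weighted by those neighbours' abundance fractions, and labels the top-scoring subpixels as target. This is one plausible reading of such a scheme, not the authors' exact formulation; the function name, neighbourhood convention, and tie-breaking are assumptions:

```python
import numpy as np

def superresolve_pixel(frac, neighbor_fracs, scale=2):
    """Distribute a pixel's target abundance fraction over scale x scale subpixels.

    frac is the pixel's target abundance fraction; neighbor_fracs is a 3x3
    array of abundance fractions with the pixel itself in the centre. Each
    subpixel is scored by inverse Euclidean distance to the 8 neighbouring
    pixel centres, weighted by their fractions; the top-k subpixels
    (k = round(frac * scale**2)) are labelled target (1), the rest 0.
    """
    k = int(round(frac * scale * scale))
    scores = np.zeros((scale, scale))
    for i in range(scale):
        for j in range(scale):
            # subpixel centre in units where the pixel spans [0, 1) x [0, 1)
            sy, sx = (i + 0.5) / scale, (j + 0.5) / scale
            for ny in range(3):
                for nx in range(3):
                    if ny == 1 and nx == 1:
                        continue  # skip the centre pixel itself
                    cy, cx = ny - 0.5, nx - 0.5  # neighbour pixel centre
                    d = np.hypot(sy - cy, sx - cx)
                    scores[i, j] += neighbor_fracs[ny, nx] / d
    # label the k most strongly attracted subpixels as target
    order = np.argsort(scores, axis=None)[::-1]
    labels = np.zeros(scale * scale, dtype=int)
    labels[order[:k]] = 1
    return labels.reshape(scale, scale)
```

    For example, with frac = 0.5 and all the target abundance concentrated in the row of pixels above, the two target subpixels are placed in the upper half of the pixel, which is the spatial-coherence behaviour such mappings aim for.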

  16. Optimal starting conditions for the rendezvous maneuver: Analytical and computational approach

    NASA Astrophysics Data System (ADS)

    Ciarcia, Marco

    by the optimal trajectory. For the guidance trajectory, because of the replacement of the variable thrust direction of the powered subarc with a constant thrust direction, the optimal control problem degenerates into a mathematical programming problem with a relatively small number of degrees of freedom, more precisely: three for case (i), time-to-rendezvous free, and two for case (ii), time-to-rendezvous given. In particular, we consider the rendezvous between the Space Shuttle (chaser) and the International Space Station (target). Once a given initial distance SS-to-ISS is preselected, the present work supplies not only the best initial conditions for the rendezvous trajectory, but simultaneously the corresponding final conditions for the ascent trajectory. In Part B, an analytical solution of the Clohessy-Wiltshire equations is presented (i) neglecting the change of the spacecraft mass due to fuel consumption and (ii) assuming that the thrust is finite, that is, the trajectory includes powered subarcs flown with maximum thrust and coasting subarcs flown with zero thrust. Then, employing the found analytical solution, we study the rendezvous problem under the assumption that the initial separation coordinates and initial separation velocities are free except for the requirement that the initial chaser-to-target distance is given. The main contribution of Part B is the development of analytical solutions for the powered subarcs, an important extension of the analytical solutions already available for the coasting subarcs. One consequence is that the entire optimal trajectory can be described analytically. Another consequence is that the optimal control problems degenerate into mathematical programming problems. A further consequence is that, vis-a-vis the optimal control formulation, the mathematical programming formulation reduces the CPU time by a factor of order 1000. Key words.
Space trajectories, rendezvous, optimization, guidance, optimal control, calculus of
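
    For reference, the coasting (zero-thrust) solution of the Clohessy-Wiltshire equations that the abstract builds on is well known in closed form. A minimal state-propagation sketch, using the standard LVLH convention (x radial, y along-track, z cross-track, n the target's mean motion); this is the textbook solution, not the paper's extended powered-subarc solution:

```python
import math

def cw_propagate(state, t, n):
    """Closed-form Clohessy-Wiltshire coasting (zero-thrust) solution.

    state = (x, y, z, vx, vy, vz) of the chaser in the target's LVLH frame
    (x radial, y along-track, z cross-track); n = target mean motion [rad/s].
    Returns the state at time t.
    """
    x0, y0, z0, vx0, vy0, vz0 = state
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + 2 * (1 - c) / n * vy0
    y = 6 * (s - n * t) * x0 + y0 + 2 * (c - 1) / n * vx0 + (4 * s - 3 * n * t) / n * vy0
    z = z0 * c + (vz0 / n) * s
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    vz = -z0 * n * s + vz0 * c
    return (x, y, z, vx, vy, vz)

# e.g. for an ISS-like orbit n is roughly 0.00113 rad/s; propagate a
# 100 m radial offset for one minute of coasting:
state = cw_propagate((100.0, 0.0, 0.0, 0.0, 0.0, 0.0), 60.0, 0.00113)
```

    A pure radial offset returns to the same radial/cross-track position after one orbital period while drifting along-track by -12πx₀, the familiar secular drift of the CW dynamics.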

  17. Reliable detection of Bacillus anthracis, Francisella tularensis and Yersinia pestis by using multiplex qPCR including internal controls for nucleic acid extraction and amplification.

    PubMed

    Janse, Ingmar; Hamidjaja, Raditijo A; Bok, Jasper M; van Rotterdam, Bart J

    2010-12-08

    Several pathogens could seriously affect public health if not recognized in a timely manner. To reduce the impact of such highly pathogenic micro-organisms, rapid and accurate diagnostic tools are needed for their detection in various samples, including environmental samples. Multiplex real-time PCRs were designed for rapid and reliable detection of three major pathogens that have the potential to cause high morbidity and mortality in humans: B. anthracis, F. tularensis and Y. pestis. The developed assays detect three pathogen-specific targets, including at least one chromosomal target, and one target from B. thuringiensis which is used as an internal control for nucleic acid extraction from refractory spores as well as successful DNA amplification. Validation of the PCRs showed a high analytical sensitivity, specificity and coverage of diverse pathogen strains. The multiplex qPCR assays that were developed allow the rapid detection of 3 pathogen-specific targets simultaneously, without compromising sensitivity. The application of B. thuringiensis spores as internal controls further reduces false negative results. This ensures highly reliable detection, while template consumption and laboratory effort are kept at a minimum.

  18. Reliable detection of Bacillus anthracis, Francisella tularensis and Yersinia pestis by using multiplex qPCR including internal controls for nucleic acid extraction and amplification

    PubMed Central

    2010-01-01

    Background Several pathogens could seriously affect public health if not recognized in a timely manner. To reduce the impact of such highly pathogenic micro-organisms, rapid and accurate diagnostic tools are needed for their detection in various samples, including environmental samples. Results Multiplex real-time PCRs were designed for rapid and reliable detection of three major pathogens that have the potential to cause high morbidity and mortality in humans: B. anthracis, F. tularensis and Y. pestis. The developed assays detect three pathogen-specific targets, including at least one chromosomal target, and one target from B. thuringiensis which is used as an internal control for nucleic acid extraction from refractory spores as well as successful DNA amplification. Validation of the PCRs showed a high analytical sensitivity, specificity and coverage of diverse pathogen strains. Conclusions The multiplex qPCR assays that were developed allow the rapid detection of 3 pathogen-specific targets simultaneously, without compromising sensitivity. The application of B. thuringiensis spores as internal controls further reduces false negative results. This ensures highly reliable detection, while template consumption and laboratory effort are kept at a minimum. PMID:21143837

  19. Fabrication strategies, sensing modes and analytical applications of ratiometric electrochemical biosensors.

    PubMed

    Jin, Hui; Gui, Rijun; Yu, Jianbo; Lv, Wei; Wang, Zonghua

    2017-05-15

    Previously developed electrochemical biosensors with a single electric signal output are susceptible to interference from intrinsic and extrinsic factors. In contrast, ratiometric electrochemical biosensors (RECBSs) with dual electric signal outputs have an intrinsic built-in correction for system or background electric signals, and therefore exhibit significant potential to improve the accuracy and sensitivity of electrochemical sensing applications. In this review, we systematically summarize the fabrication strategies, sensing modes and analytical applications of RECBSs. First, the different fabrication strategies of RECBSs are introduced, covering analyte-induced single- and dual-dependent electrochemical signal strategies. Second, the different sensing modes of RECBSs are illustrated, such as differential pulse voltammetry, square wave voltammetry, cyclic voltammetry, alternating current voltammetry, electrochemiluminescence, and so forth. Third, the analytical applications of RECBSs are discussed based on the types of target analytes. Finally, forthcoming developments and future prospects in the research field of RECBSs are also highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Nine-analyte detection using an array-based biosensor

    NASA Technical Reports Server (NTRS)

    Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.

    2002-01-01

    A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.

  1. Analytical Sociology: A Bungean Appreciation

    NASA Astrophysics Data System (ADS)

    Wan, Poe Yu-ze

    2012-10-01

    Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve this goal, analytical sociologists demonstrate an unequivocal focus on the mechanism-based explanation grounded in action theory. In this article I attempt a critical appreciation of analytical sociology from the perspective of Mario Bunge's philosophical system, which I characterize as emergentist systemism. I submit that while the principles of analytical sociology and those of Bunge's approach have much in common, the latter brings to the fore the ontological status and explanatory importance of supra-individual actors (as concrete systems endowed with emergent causal powers) and macro-social mechanisms (as processes unfolding in and among social systems), and therefore it does not stipulate that every causal explanation of social facts has to include explicit references to individual-level actors and mechanisms. In this sense, Bunge's approach provides a reasonable middle course between the Scylla of sociological reification and the Charybdis of ontological individualism, and thus serves as an antidote to the untenable "strong program of microfoundations" to which some analytical sociologists are committed.

  2. Analytical and molecular dynamics studies on the impact loading of single-layered graphene sheet by fullerene

    NASA Astrophysics Data System (ADS)

    Hosseini-Hashemi, Shahrokh; Sepahi-Boroujeni, Amin; Sepahi-Boroujeni, Saeid

    2018-04-01

    The normal impact performance of a system comprising a fullerene molecule and a single-layered graphene sheet is studied in the present paper. First, through a mathematical approach, a new contact law is derived to describe the overall non-bonding interaction forces of the "hollow indenter-target" system. Preliminary verifications show that the derived contact law gives a reliable picture of the force field of the system, in good agreement with the results of molecular dynamics (MD) simulations. Afterwards, the equation of transversal motion of the graphene sheet is formulated on the basis of both the nonlocal theory of elasticity and the assumptions of classical plate theory. Then, to derive the dynamic behavior of the system, a set comprising the proposed contact law and the equations of motion of both the graphene sheet and the fullerene molecule is solved numerically. To evaluate the outcomes of this method, the problem is also modeled by MD simulation. Despite intrinsic differences between the analytical and MD methods, as well as various errors arising from the transient nature of the problem, acceptable agreement is established between the analytical and MD outcomes. As a result, the proposed analytical method can be reliably used to address similar impact problems. Furthermore, it is found that a single-layered graphene sheet is capable of trapping fullerenes approaching with low velocities. Otherwise, in case of rebound, the sheet effectively absorbs the predominant portion of the fullerene's energy.

  3. Target design for materials processing very far from equilibrium

    NASA Astrophysics Data System (ADS)

    Barnard, John J.; Schenkel, Thomas

    2016-10-01

    Local heating and electronic excitations can trigger phase transitions or novel material states that can be stabilized by rapid quenching. An example on the few-nanometer scale is the phase transitions induced by the passage of swift heavy ions in solids, where nitrogen-vacancy color centers form locally in diamond when ions heat the diamond matrix to warm-dense-matter conditions at 0.5 eV. We optimize mask geometries for target materials such as silicon and diamond to induce phase transitions by intense ion pulses (e.g., from NDCX-II or from laser-plasma acceleration). The goal is to rapidly heat a solid target volumetrically and to trigger a phase transition or local lattice reconstruction followed by rapid cooling. The stabilized phase can then be studied ex situ. We performed HYDRA simulations that calculate peak temperatures for a series of excitation conditions and cooling rates of crystal targets with micro-structured masks. A simple analytical model that includes ion heating and radial, diffusive cooling was developed and agrees closely with the HYDRA simulations. The model gives scaling laws that can guide the design of targets over a wide range of parameters, including those for NDCX-II and the proposed BELLA-i. This work was performed under the auspices of the U.S. DOE under contracts DE-AC52-07NA27344 (LLNL) and DE-AC02-05CH11231 (LBNL), and was supported by the US DOE Office of Science, Fusion Energy Sciences. LLNL-ABS-697271.

  4. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly, in segmental hair analysis pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
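
    The decomposition described above treats the analytical and pre-analytical contributions as independent variance components, so CV_T² = CV_A² + CV_P²; a quick check against the ranges reported in the abstract:

```python
import math

def preanalytical_cv(cv_total, cv_analytical):
    """Pre-analytical CV from total and analytical CVs, assuming
    independent variance components: CV_T^2 = CV_A^2 + CV_P^2."""
    return math.sqrt(cv_total**2 - cv_analytical**2)

def total_cv(cv_analytical, cv_preanalytical):
    """Total CV from its independent components (CVs in %)."""
    return math.sqrt(cv_analytical**2 + cv_preanalytical**2)

print(round(total_cv(13, 69), 1))         # -> 70.2
print(round(preanalytical_cv(29, 7), 1))  # -> 28.1
```

    The computed values sit at the upper and lower ends of the reported totals (29-70%), consistent with the pre-analytical component dominating the budget.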

  5. Insight solutions are correct more often than analytic solutions

    PubMed Central

    Salvi, Carola; Bricolo, Emanuela; Kounios, John; Bowden, Edward; Beeman, Mark

    2016-01-01

    How accurate are insights compared to analytical solutions? In four experiments, we investigated how participants’ solving strategies influenced their solution accuracies across different types of problems, including one that was linguistic, one that was visual and two that were mixed visual-linguistic. In each experiment, participants’ self-judged insight solutions were, on average, more accurate than their analytic ones. We hypothesised that insight solutions have superior accuracy because they emerge into consciousness in an all-or-nothing fashion when the unconscious solving process is complete, whereas analytic solutions can be guesses based on conscious, prematurely terminated, processing. This hypothesis is supported by the finding that participants’ analytic solutions included relatively more incorrect responses (i.e., errors of commission) than timeouts (i.e., errors of omission) compared to their insight responses. PMID:27667960

  6. Analytic materials

    PubMed Central

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882

  7. SuperTarget goes quantitative: update on drug–target interactions

    PubMed Central

    Hecker, Nikolai; Ahmed, Jessica; von Eichborn, Joachim; Dunkel, Mathias; Macha, Karel; Eckert, Andreas; Gilson, Michael K.; Bourne, Philip E.; Preissner, Robert

    2012-01-01

    There are at least two good reasons for the on-going interest in drug–target interactions: first, drug-effects can only be fully understood by considering a complex network of interactions to multiple targets (so-called off-target effects) including metabolic and signaling pathways; second, it is crucial to consider drug-target-pathway relations for the identification of novel targets for drug development. To address this on-going need, we have developed a web-based data warehouse named SuperTarget, which integrates drug-related information associated with medical indications, adverse drug effects, drug metabolism, pathways and Gene Ontology (GO) terms for target proteins. At present, the updated database contains >6000 target proteins, which are annotated with >330 000 relations to 196 000 compounds (including approved drugs); the vast majority of interactions include binding affinities and pointers to the respective literature sources. The user interface provides tools for drug screening and target similarity inclusion. A query interface enables the user to pose complex queries, for example, to find drugs that target a certain pathway, interacting drugs that are metabolized by the same cytochrome P450 or drugs that target proteins within a certain affinity range. SuperTarget is available at http://bioinformatics.charite.de/supertarget. PMID:22067455

  8. Quantitative targeted and retrospective data analysis of relevant pesticides, antibiotics and mycotoxins in bakery products by liquid chromatography-single-stage Orbitrap mass spectrometry.

    PubMed

    De Dominicis, Emiliano; Commissati, Italo; Gritti, Elisa; Catellani, Dante; Suman, Michele

    2015-01-01

    In addition to 'traditional' multi-residue and multi-contaminant multiple reaction monitoring (MRM) mass spectrometric techniques devoted to quantifying a list of targeted compounds, the global food industry requires non-targeted methods capable of detecting other potentially hazardous compounds. Ultra-high-performance liquid chromatography combined with a single-stage Orbitrap high-resolution mass spectrometer (UHPLC-HRMS Exactive™-Orbitrap Technology) was successfully exploited for the complete selective and quantitative determination of 33 target compounds within three major cross categories (pesticides, antibiotics and mycotoxins) in bakery matrices (specifically milk, wheat flour and mini-cakes). Resolution was set at 50 000 full width at half maximum (FWHM) to achieve the right compromise between an adequate scan speed and selectivity, allowing for the limitations related to the necessary generic sample preparation approach. An exact mass tolerance of 5 ppm and a minimum peak threshold of 10 000 units were fixed as the main identification conditions, with retention time and isotopic pattern as additional criteria to greatly reduce the risk of false-positive findings. Full validation for all the target analytes was performed: linearity, intermediate repeatability and recovery (28 analytes within 70-120%) were positively assessed; furthermore, limits of quantification between 5 and 100 µg kg⁻¹ (with most of the analytes having a limit of detection below 6 µg kg⁻¹) indicate good performance, compatible with almost all regulatory needs. Naturally contaminated and fortified mini-cakes, prepared through combined use of industrial and pilot plant production lines, were analysed at two different concentration levels, obtaining good overall quantitative results and providing preliminary indications of the potential of full-scan HRMS cluster analysis. The effectiveness of this analytical approach was also tested in
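
    The 5 ppm exact-mass identification criterion amounts to a simple relative-error test between the measured and theoretical m/z; a minimal sketch (the example m/z values below are illustrative, not taken from the paper):

```python
def ppm_error(measured_mz, theoretical_mz):
    # relative mass error in parts per million
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def matches(measured_mz, theoretical_mz, tol_ppm=5.0):
    # identification criterion: |error| within the ppm tolerance
    return abs(ppm_error(measured_mz, theoretical_mz)) <= tol_ppm

# illustrative check: a measured peak ~3.8 ppm from its theoretical m/z passes
print(matches(313.0719, 313.0707))  # -> True
```

    In practice this test would be combined with the retention-time and isotopic-pattern criteria the abstract mentions before an identification is accepted.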

  9. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  10. Ultrasonic analyte concentration and application in flow cytometry

    DOEpatents

    Kaduchak, Gregory; Goddard, Greg; Salzman, Gary; Sinha, Dipen; Martin, John C.; Kwiatkowski, Christopher; Graves, Steven

    2014-07-22

    The present invention includes an apparatus and corresponding method for concentrating analytes within a fluid flowing through a tube using acoustic radiation pressure. The apparatus includes a function generator that outputs a radio frequency electrical signal to a transducer that transforms the radio frequency electric signal to an acoustic signal and couples the acoustic signal to the tube. The acoustic signal is converted within the tube to acoustic pressure that concentrates the analytes within the fluid.

  11. Ultrasonic analyte concentration and application in flow cytometry

    DOEpatents

    Kaduchak, Gregory [Los Alamos, NM; Goddard, Greg [Los Alamos, NM; Salzman, Gary [White Rock, NM; Sinha, Dipen [Los Alamos, NM; Martin, John C [Los Alamos, NM; Kwiatkowski, Christopher [Los Alamos, NM; Graves, Steven [San Juan Pueblo, NM

    2008-03-11

    The present invention includes an apparatus and corresponding method for concentrating analytes within a fluid flowing through a tube using acoustic radiation pressure. The apparatus includes a function generator that outputs a radio frequency electrical signal to a transducer that transforms the radio frequency electric signal to an acoustic signal and couples the acoustic signal to the tube. The acoustic signal is converted within the tube to acoustic pressure that concentrates the analytes within the fluid.

  12. Ultrasonic analyte concentration and application in flow cytometry

    DOEpatents

    Kaduchak, Gregory; Goddard, Greg; Salzman, Gary; Sinha, Dipen; Martin, John C.; Kwiatkowski, Christopher; Graves, Steven

    2015-07-07

    The present invention includes an apparatus and corresponding method for concentrating analytes within a fluid flowing through a tube using acoustic radiation pressure. The apparatus includes a function generator that outputs a radio frequency electrical signal to a transducer that transforms the radio frequency electric signal to an acoustic signal and couples the acoustic signal to the tube. The acoustic signal is converted within the tube to acoustic pressure that concentrates the analytes within the fluid.

  13. Modeling pressure rise in gas targets

    NASA Astrophysics Data System (ADS)

    Jahangiri, P.; Lapi, S. E.; Publicover, J.; Buckley, K.; Martinez, D. M.; Ruth, T. J.; Hoehr, C.

    2017-05-01

    The purpose of this work is to introduce a universal mathematical model describing gas target behaviour on the steady-state time scale. To this end, an analytical model is proposed to study the pressure rise in targets used to produce medical isotopes on low-energy cyclotrons. The model is developed on the assumption that the system reaches steady state during irradiation. It is verified by experiments performed at different beam currents, gas types, and initial pressures on the 13 MeV cyclotron at TRIUMF, with excellent agreement.
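    The abstract does not give the model's actual form, but the kind of steady-state balance it describes can be sketched under two stated assumptions: an ideal gas at constant volume, and wall cooling lumped into a single conductance. All names and numerical values below are illustrative only:

```python
def steady_state_pressure(p0_kpa, t0_k, beam_power_w, wall_conductance_w_per_k):
    """Sketch of a steady-state gas-target pressure estimate.

    Assumptions (not from the paper): at steady state, beam power
    deposited in the gas equals heat conducted to the target walls,
    giving an effective gas temperature; the pressure then follows the
    isochoric ideal-gas relation P/P0 = T/T0."""
    t_ss = t0_k + beam_power_w / wall_conductance_w_per_k  # energy balance
    return p0_kpa * t_ss / t0_k                            # P scales with T

# Hypothetical example: 2500 kPa fill at 300 K, 200 W deposited,
# 1 W/K effective wall conductance
print(round(steady_state_pressure(2500.0, 300.0, 200.0, 1.0), 1))  # -> 4166.7
```

The real model would resolve density redistribution and beam heating profiles; this sketch only shows why pressure rises roughly linearly with deposited beam power at fixed conductance.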

  14. Atmospheric electromagnetic pulse propagation effects from thick targets in a terawatt laser target chamber.

    PubMed

    Remo, John L; Adams, Richard G; Jones, Michael C

    2007-08-20

    Generation and effects of atmospherically propagated electromagnetic pulses (EMPs) initiated by photoelectrons ejected by the high density and temperature target surface plasmas from multiterawatt laser pulses are analyzed. These laser radiation pulse interactions can significantly increase noise levels, thereby obscuring data (sometimes totally) and may even damage sensitive probe and detection instrumentation. Noise effects from high energy density (approximately multiterawatt) laser pulses (approximately 300-400 ps pulse widths) interacting with thick (approximately 1 mm) metallic and dielectric solid targets and dielectric-metallic powder mixtures are interpreted as transient resonance radiation associated with surface charge fluctuations on the target chamber that functions as a radiating antenna. Effective solutions that minimize atmospheric EMP effects on internal and proximate electronic and electro-optical equipment external to the system based on systematic measurements using Moebius loop antennas, interpretations of signal periodicities, and dissipation indicators determining transient noise origin characteristics from target emissions are described. Analytic models for the effect of target chamber resonances and associated noise current and temperature in a probe diode laser are described.

  15. Atmospheric electromagnetic pulse propagation effects from thick targets in a terawatt laser target chamber

    NASA Astrophysics Data System (ADS)

    Remo, John L.; Adams, Richard G.; Jones, Michael C.

    2007-08-01

    Generation and effects of atmospherically propagated electromagnetic pulses (EMPs) initiated by photoelectrons ejected by the high density and temperature target surface plasmas from multiterawatt laser pulses are analyzed. These laser radiation pulse interactions can significantly increase noise levels, thereby obscuring data (sometimes totally) and may even damage sensitive probe and detection instrumentation. Noise effects from high energy density (approximately multiterawatt) laser pulses (˜300-400 ps pulse widths) interacting with thick (˜1 mm) metallic and dielectric solid targets and dielectric-metallic powder mixtures are interpreted as transient resonance radiation associated with surface charge fluctuations on the target chamber that functions as a radiating antenna. Effective solutions that minimize atmospheric EMP effects on internal and proximate electronic and electro-optical equipment external to the system based on systematic measurements using Moebius loop antennas, interpretations of signal periodicities, and dissipation indicators determining transient noise origin characteristics from target emissions are described. Analytic models for the effect of target chamber resonances and associated noise current and temperature in a probe diode laser are described.

  16. Atmospheric electromagnetic pulse propagation effects from thick targets in a terawatt laser target chamber

    DOE PAGES

    Remo, John L.; Adams, Richard G.; Jones, Michael C.

    2007-08-16

    Generation and effects of atmospherically propagated electromagnetic pulses (EMPs) initiated by photoelectrons ejected by the high density and temperature target surface plasmas from multiterawatt laser pulses are analyzed. These laser radiation pulse interactions can significantly increase noise levels, thereby obscuring data (sometimes totally) and may even damage sensitive probe and detection instrumentation. Noise effects from high energy density (approximately multiterawatt) laser pulses (~300–400 ps pulse widths) interacting with thick (~1 mm) metallic and dielectric solid targets and dielectric–metallic powder mixtures are interpreted as transient resonance radiation associated with surface charge fluctuations on the target chamber that functions as a radiating antenna. Effective solutions that minimize atmospheric EMP effects on internal and proximate electronic and electro-optical equipment external to the system based on systematic measurements using Moebius loop antennas, interpretations of signal periodicities, and dissipation indicators determining transient noise origin characteristics from target emissions are described. Analytic models for the effect of target chamber resonances and associated noise current and temperature in a probe diode laser are described.

  17. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies.

    PubMed

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS-based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed and paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  18. High-current fast electron beam propagation in a dielectric target.

    PubMed

    Klimo, Ondrej; Tikhonchuk, V T; Debayle, A

    2007-01-01

    Recent experiments demonstrate an efficient transformation of a high-intensity laser pulse into a relativistic electron beam with a very high current density exceeding 10¹² A cm⁻². The propagation of such a beam inside the target is possible if its current is neutralized. This phenomenon is not well understood, especially in dielectric targets. In this paper, we study the propagation of a high-current-density electron beam in a plastic target using a particle-in-cell simulation code. The code includes both ionization of the plastic and collisions of newborn electrons. The numerical results are compared with a relatively simple analytical model and a reasonable agreement is found. The temporal evolution of the beam velocity distribution, the spatial density profile, and the propagation velocity of the ionization front are analyzed and their dependencies on the beam density and energy are discussed. The beam energy losses are mainly due to the target ionization induced by the self-generated electric field and the return current. For the highest beam density, a two-stream instability is observed to develop in the plasma behind the ionization front and it contributes to the beam energy losses.

  19. Robustness and fragility in coupled oscillator networks under targeted attacks.

    PubMed

    Yuan, Tianyu; Aihara, Kazuyuki; Tanaka, Gouhei

    2017-01-01

    The dynamical tolerance of coupled oscillator networks against local failures is studied. As the fraction of failed oscillator nodes gradually increases, the mean oscillation amplitude in the entire network decreases and then suddenly vanishes at a critical fraction as a phase transition. This critical fraction, widely used as a measure of the network robustness, was analytically derived for random failures but not for targeted attacks so far. Here we derive the general formula for the critical fraction, which can be applied to both random failures and targeted attacks. We consider the effects of targeting oscillator nodes based on their degrees. First we deal with coupled identical oscillators with homogeneous edge weights. Then our theory is applied to networks with heterogeneous edge weights and to those with nonidentical oscillators. The analytical results are validated by numerical experiments. Our results reveal the key factors governing the robustness and fragility of oscillator networks.
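    The amplitude-vanishing transition described above can be illustrated with a minimal simulation of globally coupled Stuart-Landau oscillators, a standard model in this literature, where "failed" nodes are given a negative growth rate. This is a generic sketch, not the paper's formulation; with all-to-all coupling every node has the same degree, so degree-based targeting reduces to choosing which nodes to deactivate:

```python
import cmath
import math
import random

def mean_amplitude(n=20, frac_failed=0.0, a=1.0, b=3.0,
                   coupling=0.5, omega=1.0, dt=0.01, steps=3000):
    """Mean oscillation amplitude of globally coupled Stuart-Landau
    oscillators after integration. Active nodes have growth rate +a;
    'failed' nodes have -b and are damped toward zero amplitude.
    All parameter values are illustrative assumptions."""
    random.seed(1)
    n_failed = int(round(frac_failed * n))
    alpha = [-b] * n_failed + [a] * (n - n_failed)
    # start on the unit circle with random phases
    z = [cmath.rect(1.0, random.uniform(0.0, 2.0 * math.pi)) for _ in range(n)]
    for _ in range(steps):
        zbar = sum(z) / n  # global mean field
        # Euler step of dz/dt = (alpha + i*omega - |z|^2) z + K (zbar - z)
        z = [zj + dt * ((alpha[j] + 1j * omega - abs(zj) ** 2) * zj
                        + coupling * (zbar - zj))
             for j, zj in enumerate(z)]
    return sum(abs(zj) for zj in z) / n

# Mean amplitude decreases as the failed fraction grows, e.g.
# mean_amplitude(frac_failed=0.0) > mean_amplitude(frac_failed=0.6)
```

On a heterogeneous network the coupling sum would run over each node's neighbors, and removing high-degree nodes would shift the critical fraction, which is what the paper's formula captures analytically.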

  20. ICDA: A Platform for Intelligent Care Delivery Analytics

    PubMed Central

    Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei

    2012-01-01

    The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA’s architecture is provided. Descriptions of four use cases are included to illustrate ICDA’s application within two different data environments. These use cases showcase the system’s flexibility and exemplify the types of analytics it enables. PMID:23304296

  1. Monoenergetic ion acceleration and Rayleigh-Taylor instability of the composite target irradiated by the laser pulse

    NASA Astrophysics Data System (ADS)

    Khudik, Vladimir; Yi, S. Austin; Shvets, Gennady

    2012-10-01

    Acceleration of ions in a two-species composite target irradiated by a circularly polarized laser pulse is studied analytically and via particle-in-cell (PIC) simulations. A self-consistent analytical model of the composite target is developed. In this model, target parameters are stationary in the center of mass of the system: heavy and light ions are completely separated from each other and form two layers, while electrons are bouncing in the potential well formed by the laser ponderomotive and electrostatic potentials. They are distributed in the direction of acceleration by the Boltzmann law and over velocities by the Maxwell-Jüttner law. The laser pulse interacts directly only with electrons in a thin sheath layer, and these electrons transfer the laser pressure to the target ions. In the fluid approximation, it is shown that the composite target is still susceptible to the Rayleigh-Taylor instability [1]. Using PIC simulations, we find the growth rate of initially seeded perturbations as a function of their wavenumber for different composite target parameters and compare it with analytical results. Useful scaling laws between this rate and the laser pulse pressure and target parameters are discussed. [1] T.P. Yu, A. Pukhov, G. Shvets, M. Chen, T. H. Ratliff, S. A. Yi, and V. Khudik, Phys. Plasmas, 18, 043110 (2011).

  2. Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements

    NASA Astrophysics Data System (ADS)

    Bakker, M.

    2017-12-01

    Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping is needed. Analytic element formulations exist for steady-state and transient flow in layered aquifers and unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage, horizontal wells of poly-line shape including skin effect, sharp changes in subsurface properties, and surface water features with leaky beds. Input files for analytic element models are simple, short and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic elements in parts of grid-based models where additional detail is needed. This presentation will give an overview of advanced flow features that can be modeled, many of which are implemented in free and open-source software.
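    A minimal flavor of the analytic element idea: superposing closed-form solutions gives the head at any point without a grid. The sketch below uses Thiem wells in a confined aquifer as the simplest element type; all parameter values and the near-well clipping are hypothetical choices for illustration:

```python
import math

def head(x, y, wells, T=100.0, h_ref=10.0, r_ref=1000.0):
    """Head field from superposed analytic elements (Thiem wells) in a
    confined aquifer. Each well is a tuple (xw, yw, Q) with discharge Q;
    T is transmissivity. Drawdowns are referenced so that each well
    contributes zero drawdown at distance r_ref. Values are hypothetical."""
    h = h_ref
    for xw, yw, Q in wells:
        r = max(math.hypot(x - xw, y - yw), 0.1)  # clip inside the well bore
        h -= Q / (2.0 * math.pi * T) * math.log(r_ref / r)
    return h

# Head is lowered near a pumping well and recovers with distance;
# a second well simply adds its own drawdown (superposition)
print(head(1.0, 0.0, [(0.0, 0.0, 500.0)]) < head(500.0, 0.0, [(0.0, 0.0, 500.0)]))
```

Real analytic element codes add line elements, inhomogeneities, and leaky boundaries in the same way: each element contributes a closed-form term, and the solution is their sum.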

  3. MALDI-TOF MS identification of anaerobic bacteria: assessment of pre-analytical variables and specimen preparation techniques.

    PubMed

    Hsu, Yen-Michael S; Burnham, Carey-Ann D

    2014-06-01

    Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has emerged as a tool for identifying clinically relevant anaerobes. We evaluated the analytical performance characteristics of the Bruker Microflex with Biotyper 3.0 software system for identification of anaerobes and examined the impact of direct formic acid (FA) treatment and other pre-analytical factors on MALDI-TOF MS performance. A collection of 101 anaerobic bacteria was evaluated, including Clostridium spp., Propionibacterium spp., Fusobacterium spp., Bacteroides spp., and other anaerobic bacteria of clinical relevance. The results of our study indicate that an on-target extraction with 100% FA improves the rate of accurate identification without introducing misidentification (P<0.05). In addition, we modified the reporting cutoffs for the Biotyper "score" that yield acceptable identification, finding that a score of ≥1.700 can maximize the rate of identification. Of interest, MALDI-TOF MS can correctly identify anaerobes grown in suboptimal conditions, such as on selective culture media and following oxygen exposure. In conclusion, we report on a number of simple and cost-effective pre- and post-analytical modifications that could enhance MALDI-TOF MS identification of anaerobic bacteria. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  5. Evaporative concentration on a paper-based device to concentrate analytes in a biological fluid.

    PubMed

    Wong, Sharon Y; Cabodi, Mario; Rolland, Jason; Klapperich, Catherine M

    2014-12-16

    We report the first demonstration of using heat on a paper device to rapidly concentrate a clinically relevant analyte of interest from a biological fluid. Our technology relies on the application of localized heat to a paper strip to evaporate off hundreds of microliters of liquid to concentrate the target analyte. This method can be used to enrich for a target analyte that is present at low concentrations within a biological fluid to enhance the sensitivity of downstream detection methods. We demonstrate our method by concentrating the tuberculosis-specific glycolipid, lipoarabinomannan (LAM), a promising urinary biomarker for the detection and diagnosis of tuberculosis. We show that the heat does not compromise the subsequent immunodetectability of LAM, and in 20 min, the tuberculosis biomarker was concentrated by nearly 20-fold in simulated urine. Our method requires only 500 mW of power, and sample flow is self-driven via capillary action. As such, our technology can be readily integrated into portable, battery-powered, instrument-free diagnostic devices intended for use in low-resource settings.
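    The enrichment mechanism above is a simple mass balance: evaporating solvent while the analyte is retained concentrates it by the ratio of initial to remaining volume. The sketch below reproduces a 20-fold enrichment with a hypothetical evaporation rate; the paper reports the fold and time, not this rate:

```python
def enrichment_fold(v_initial_ul, evap_rate_ul_per_min, minutes):
    """Fold-concentration from evaporating liquid while the analyte is
    retained: fold = V_initial / V_remaining. A constant-rate mass-balance
    sketch; the rate below is an assumption, not a value from the paper."""
    v_remaining = v_initial_ul - evap_rate_ul_per_min * minutes
    if v_remaining <= 0:
        raise ValueError("sample fully evaporated")
    return v_initial_ul / v_remaining

# Hypothetical: 500 uL evaporated at 23.75 uL/min for 20 min
print(enrichment_fold(500.0, 23.75, 20))  # -> 20.0
```

The nonlinearity matters in practice: most of the enrichment accrues in the final minutes, when the remaining volume is small.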

  6. Analytical performance evaluation of SAR ATR with inaccurate or estimated models

    NASA Astrophysics Data System (ADS)

    DeVore, Michael D.

    2004-09-01

    Hypothesis testing algorithms for automatic target recognition (ATR) are often formulated in terms of some assumed distribution family. The parameter values corresponding to a particular target class together with the distribution family constitute a model for the target's signature. In practice such models exhibit inaccuracy because of incorrect assumptions about the distribution family and/or because of errors in the assumed parameter values, which are often determined experimentally. Model inaccuracy can have a significant impact on performance predictions for target recognition systems. Such inaccuracy often causes model-based predictions that ignore the difference between assumed and actual distributions to be overly optimistic. This paper reports on research to quantify the effect of inaccurate models on performance prediction and to estimate the effect using only trained parameters. We demonstrate that for large observation vectors the class-conditional probabilities of error can be expressed as a simple function of the difference between two relative entropies. These relative entropies quantify the discrepancies between the actual and assumed distributions and can be used to express the difference between actual and predicted error rates. Focusing on the problem of ATR from synthetic aperture radar (SAR) imagery, we present estimators of the probabilities of error in both ideal and plug-in tests expressed in terms of the trained model parameters. These estimators are defined in terms of unbiased estimates for the first two moments of the sample statistic. We present an analytical treatment of these results and include demonstrations from simulated radar data.
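    The relative entropies mentioned above can be made concrete for univariate Gaussians, a common assumed distribution family. The closed form below is standard; its use as a discrepancy between actual and assumed class models is only a sketch of the abstract's idea, and all function names are mine:

```python
import math

def kl_gauss(mu_p, var_p, mu_q, var_q):
    """Relative entropy D(p || q) in nats between two univariate
    Gaussians p = N(mu_p, var_p) and q = N(mu_q, var_q)."""
    return 0.5 * (math.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def entropy_discrepancy(actual, assumed_a, assumed_b):
    """Difference of the two relative entropies from the actual class
    distribution to each assumed model -- the kind of quantity the
    abstract relates to class-conditional error probabilities.
    Each argument is a (mean, variance) pair; purely illustrative."""
    return kl_gauss(*actual, *assumed_a) - kl_gauss(*actual, *assumed_b)
```

When the assumed model matches the actual distribution the corresponding relative entropy vanishes, which is the sense in which model inaccuracy shifts predicted versus actual error rates.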

  7. SU-E-T-422: Fast Analytical Beamlet Optimization for Volumetric Intensity-Modulated Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Kenny S K; Lee, Louis K Y; Xing, L

    2015-06-15

    Purpose: To implement a fast optimization algorithm on a CPU/GPU heterogeneous computing platform and to obtain an optimal fluence for a given target dose distribution from the pre-calculated beamlets in an analytical approach. Methods: The 2D target dose distribution was modeled as an n-dimensional vector and estimated by a linear combination of independent basis vectors. The basis set was composed of the pre-calculated beamlet dose distributions at every 6 degrees of gantry angle and the cost function was set as the magnitude square of the vector difference between the target and the estimated dose distribution. The optimal weighting of the basis, which corresponds to the optimal fluence, was obtained analytically by the least-squares method. Those basis vectors with a positive weighting were selected for entering into the next level of optimization. In total, 7 levels of optimization were implemented in the study. Ten head-and-neck and ten prostate carcinoma cases were selected for the study and mapped to a round water phantom with a diameter of 20 cm. The Matlab computation was performed in a heterogeneous programming environment with Intel i7 CPU and NVIDIA Geforce 840M GPU. Results: In all selected cases, the estimated dose distribution was in a good agreement with the given target dose distribution and their correlation coefficients were found to be in the range of 0.9992 to 0.9997. Their root-mean-square error was monotonically decreasing and converging after 7 cycles of optimization. The computation took only about 10 seconds and the optimal fluence maps at each gantry angle throughout an arc were quickly obtained. Conclusion: An analytical approach is derived for finding the optimal fluence for a given target dose distribution and a fast optimization algorithm implemented on the CPU/GPU heterogeneous computing environment greatly reduces the optimization time.
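    The loop described above (solve the least-squares problem, keep only beamlets with positive weight, re-solve, for up to 7 levels) can be sketched in pure Python on a toy basis. This follows the abstract's description; the normal-equations solver and the toy data are my own assumptions, not the authors' implementation:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def optimize_fluence(beamlets, target, levels=7):
    """Least-squares beamlet weights via the normal equations; after each
    level, only beamlets with positive weight enter the next level, as in
    the abstract. Returns {beamlet index: weight}."""
    active = list(range(len(beamlets)))
    keep = []
    for _ in range(levels):
        B = [beamlets[j] for j in active]
        # normal equations: (B B^T) w = B t
        G = [[sum(x * y for x, y in zip(bi, bj)) for bj in B] for bi in B]
        rhs = [sum(x * t for x, t in zip(bi, target)) for bi in B]
        w = solve(G, rhs)
        keep = [(j, wj) for j, wj in zip(active, w) if wj > 0.0]
        if [j for j, _ in keep] == active:  # converged: nothing pruned
            break
        active = [j for j, _ in keep]
    return dict(keep)

# Toy example: three orthogonal beamlets; the third would need a negative
# (non-physical) weight, so it is pruned and the rest re-solved
print(optimize_fluence([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]], [2.0, 3.0, -1.0]))
```

Pruning negative weights and re-solving is what keeps the resulting fluence physically deliverable, since beamlet intensities cannot be negative.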

  8. A Lecture Supporting System Based on Real-Time Learning Analytics

    ERIC Educational Resources Information Center

    Shimada, Atsushi; Konomi, Shin'ichi

    2017-01-01

    A new lecture supporting system based on real-time learning analytics is proposed. Our target is on-site classrooms where teachers give their lectures, and a lot of students listen to teachers' explanation, conduct exercises etc. We utilize not only an e-Learning system, but also an e-Book system to collect real-time learning activities during the…

  9. Aquatic concentrations of chemical analytes compared to ...

    EPA Pesticide Factsheets

    We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting that more detailed characterization of these analytes may be warranted. Purpose: to provide sc
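    The screening comparison described above amounts to a hazard-quotient check: the ratio of measured concentration to effect concentration, flagged at exceedance and at the 1/10th-EC level used in the study. The function name and example numbers below are hypothetical:

```python
def screen(measured, effect_conc):
    """Screening-level comparison: hazard quotient HQ = C / EC.
    HQ >= 1 flags exceedance of the effect concentration; HQ >= 0.1
    flags analytes within a factor of ten of their EC (the study's
    threshold for suggesting further characterization)."""
    hq = measured / effect_conc
    if hq >= 1.0:
        return "exceeds EC"
    if hq >= 0.1:
        return "within 10x of EC"
    return "below screening level"

# Hypothetical values in consistent units (e.g. ug/L)
print(screen(0.2, 1.0))  # -> within 10x of EC
```

Note the caveat from the abstract: when a reporting limit exceeds the EC, a non-detect cannot rule out an exceedance, so the HQ is indeterminate rather than low.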

  10. The Case for Adopting Server-side Analytics

    NASA Astrophysics Data System (ADS)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher who in turn locally stores the data for analysis. The analyses tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier to transport to and store locally by the user and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is both often required with these data sets and which drives much of the throughput challenges. NASA's Big Data Task Force studied this issue. This paper will present the results of this study including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for

  11. Student Perceptions of Privacy Principles for Learning Analytics

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Schumacher, Clara

    2016-01-01

    The purpose of this study was to examine student perceptions of privacy principles related to learning analytics. Privacy issues for learning analytics include how personal data are collected and stored as well as how they are analyzed and presented to different stakeholders. A total of 330 university students participated in an exploratory study…

  12. Selenium- and tellurium-containing fluorescent molecular probes for the detection of biologically important analytes.

    PubMed

    Manjare, Sudesh T; Kim, Youngsam; Churchill, David G

    2014-10-21

    scientific fields. Organochalcogen (R-E-R) chemistry in such chemical recognition and supramolecular pursuits is a fundamental tool to allow chemists to explore stable organic-based probe modalities of interest to develop better spectroscopic tools for (neuro)biological applications. Chalcogen donor sites also provide sites where metals can coordinate, and facile oxidation may extend to the sulfone analogues (R-EO2-R) or beyond. Consequently, chemists can then make use of reliable reversible chemical probing platforms based on the chemical redox properties of selenium and tellurium atoms, whose valence state switches principally from 2 to 4 (and back to 2). The main organic molecular skeletons have involved chemical frames including boron-dipyrromethene (BODIPY) systems, extended cyanine groups, naphthalimide, rhodamine, and fluorescein cores, and isoselenazolone, pyrene, coumarin, benzoselenadiazole, and selenoguanine systems. Our group has tested many such molecular probe systems in cellular milieu and under a series of conditions and competitive environments. We have found that the most important analytes have been reactive oxygen species (ROS) such as superoxide and hypochlorite. Reactive nitrogen species (RNS) such as peroxynitrite are also potential targets. In addition, we have also considered Fenton chemistry systems. Our research and that of others shows that the action of ROS is often reversible with H2S or biothiols such as glutathione (GSH). We have also found that a second class of analytes are the thiols (RSH), in particular, biothiols. Here, the target group might involve an R-Se-Se-R group. The study of analytes also extends to metal ions, for example, Hg²⁺, and anions such as fluoride (F⁻), and we have developed NIR-based systems as well. These work through various photomechanisms, including photoinduced electron transfer (PET), twisted internal charge transfer (TICT), and internal charge transfer (ICT).
The growing understanding of this class of probe

  13. Analytical performance specifications for external quality assessment - definitions and descriptions.

    PubMed

    Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro

    2017-06-27

    External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand that a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 has proposed three models to set APS and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations from the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA) and on clear terminology for EQA APS. The recommended terminology covers six elements required to understand APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of the Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.
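    The core EQA check described above, assessing a result's deviation from the target value against an APS, can be sketched as follows, assuming the APS is expressed as a total-error limit in percent (one of several forms the paper's terminology covers; the function name and example numbers are hypothetical):

```python
def within_aps(result, target, aps_percent):
    """EQA-style check: is the deviation of a laboratory result from the
    assigned target value within the analytical performance specification?
    Here the APS is assumed to be a symmetric total-error limit in percent
    of the target value."""
    deviation_pct = abs(result - target) / target * 100.0
    return deviation_pct <= aps_percent

# Hypothetical example: result 104 against target 100 with a 5% APS
print(within_aps(104.0, 100.0, 5.0))  # -> True
```

As the paper stresses, the verdict only has meaning once the scheme states how the target was assigned, which property the APS addresses (total error, bias, imprecision, or uncertainty), and which Milan model set the limit.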

  14. Threat Assessment and Targeted Violence at Institutions of Higher Education: Implications for Policy and Practice Including Unique Considerations for Community Colleges

    ERIC Educational Resources Information Center

    Bennett, Laura; Bates, Michael

    2015-01-01

    This article provides an overview of the research on targeted violence, including campus violence, and the implications for policy and practice at institutions of higher education. Unique challenges of threat assessment in the community college setting are explored, and an overview of an effective threat assessment policy and team at William…

  15. -Omic and Electronic Health Record Big Data Analytics for Precision Medicine.

    PubMed

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2017-02-01

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Big data analytics is able to address -omic and EHR data challenges and support the paradigm shift toward precision medicine; by making sense of -omic and EHR data, it can improve healthcare outcomes with long-lasting societal impact.

  16. Satellite recovery - Attitude dynamics of the targets

    NASA Technical Reports Server (NTRS)

    Cochran, J. E., Jr.; Lahr, B. S.

    1986-01-01

    The problems of categorizing and modeling the attitude dynamics of uncontrolled artificial earth satellites which may be targets in recovery attempts are addressed. Methods of classification presented are based on satellite rotational kinetic energy, rotational angular momentum and orbit and on the type of control present prior to the benign failure of the control system. The use of approximate analytical solutions and 'exact' numerical solutions to the equations governing satellite attitude motions to predict uncontrolled attitude motion is considered. Analytical and numerical results are presented for the evolution of satellite attitude motions after active control termination.
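
    The uncontrolled attitude motion discussed above can be illustrated with its simplest case: torque-free rigid-body rotation. The sketch below numerically integrates Euler's equations with a fourth-order Runge-Kutta scheme and checks that rotational kinetic energy and angular momentum magnitude, the quantities on which the classification methods are based, stay constant. The inertia values and initial body rates are hypothetical:

```python
# Minimal sketch (not the authors' model): torque-free Euler equations
# for an uncontrolled satellite, integrated with RK4. Kinetic energy and
# angular momentum magnitude are conserved and serve as a numerical check.

I = (10.0, 15.0, 20.0)  # principal moments of inertia (kg m^2), hypothetical

def derivs(w):
    """Euler's equations: I1 w1' = (I2 - I3) w2 w3, and cyclic permutations."""
    w1, w2, w3 = w
    I1, I2, I3 = I
    return ((I2 - I3) / I1 * w2 * w3,
            (I3 - I1) / I2 * w3 * w1,
            (I1 - I2) / I3 * w1 * w2)

def rk4_step(w, dt):
    k1 = derivs(w)
    k2 = derivs(tuple(wi + 0.5 * dt * ki for wi, ki in zip(w, k1)))
    k3 = derivs(tuple(wi + 0.5 * dt * ki for wi, ki in zip(w, k2)))
    k4 = derivs(tuple(wi + dt * ki for wi, ki in zip(w, k3)))
    return tuple(wi + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for wi, a, b, c, d in zip(w, k1, k2, k3, k4))

def energy(w):
    """Rotational kinetic energy T = 1/2 * sum(Ii * wi^2)."""
    return 0.5 * sum(Ii * wi ** 2 for Ii, wi in zip(I, w))

def momentum(w):
    """Angular momentum magnitude |H| = sqrt(sum((Ii * wi)^2))."""
    return sum((Ii * wi) ** 2 for Ii, wi in zip(I, w)) ** 0.5

w = (0.02, 0.05, 0.5)   # initial body rates (rad/s), mostly major-axis spin
E0, H0 = energy(w), momentum(w)
for _ in range(10000):  # 100 s of motion at dt = 0.01 s
    w = rk4_step(w, 0.01)
print(abs(energy(w) - E0) < 1e-6, abs(momentum(w) - H0) < 1e-6)
```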

  17. Neoclassical transport including collisional nonlinearity.

    PubMed

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.

  18. Using Google Analytics to evaluate the impact of the CyberTraining project.

    PubMed

    McGuckin, Conor; Crowley, Niall

    2012-11-01

    A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook (www.cybertraining-project.org). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.

  19. Echo scintillation Index affected by cat-eye target's caliber with Cassegrain lens

    NASA Astrophysics Data System (ADS)

    Shan, Cong-miao; Sun, Hua-yan; Zhao, Yan-zhong; Zheng, Yong-hui

    2015-10-01

    The optical aperture of a cat-eye target produces an aperture-averaging effect on the probe beam of an active laser detection system, which can be used to identify optical targets. The echo scintillation characteristics of transmission-type lens targets were studied in previous work; examining how these characteristics differ for Cassegrain-lens targets can help classify targets. In this paper, the dependence of the echo scintillation characteristics on the caliber of a cat-eye target with a Cassegrain lens is discussed. Using the scintillation theory of spherical waves in weak atmospheric turbulence, the annular-aperture filter function and the Kolmogorov power spectrum, an analytic expression is derived for the scintillation index of the cat-eye target echo after two-way transmission along a horizontal path at normal incidence. The effects of the turbulence inner and outer scales on the echo scintillation index, together with an analytic expression for the scintillation index at the receiving aperture, are then presented using the modified Hill spectrum and the modified von Karman spectrum. The echo scintillation index decreases as the target aperture increases, and different ratios of inner to outer aperture diameter yield different scintillation-index curves. This result is significant for target recognition in active laser detection systems, where the echo scintillation index can be used to estimate the aperture of a cat-eye target and hence infer the target type.
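
    The qualitative trend reported above, echo scintillation decreasing as the aperture grows, can be illustrated with the standard one-way plane-wave weak-turbulence expressions (the Rytov variance and Andrews' aperture-averaging approximation). These are far simpler than the two-way annular-aperture result derived in the paper, and all parameter values below are hypothetical:

```python
import math

# Illustrative only: textbook one-way plane-wave formulas, not the paper's
# two-way annular-aperture derivation. Turbulence strength, wavelength and
# path length are hypothetical.

def rytov_variance(cn2: float, wavelength: float, path: float) -> float:
    """Point-receiver scintillation index for a plane wave in weak turbulence:
    sigma_I^2 = 1.23 * Cn^2 * k^(7/6) * L^(11/6)."""
    k = 2.0 * math.pi / wavelength
    return 1.23 * cn2 * k ** (7.0 / 6.0) * path ** (11.0 / 6.0)

def aperture_averaging_factor(diameter: float, wavelength: float, path: float) -> float:
    """Andrews' approximation A = [1 + 1.062 * k * D^2 / (4 L)]^(-7/6)."""
    k = 2.0 * math.pi / wavelength
    return (1.0 + 1.062 * k * diameter ** 2 / (4.0 * path)) ** (-7.0 / 6.0)

cn2, lam, L = 1e-15, 1.55e-6, 2000.0  # hypothetical Cn^2 (m^-2/3), wavelength (m), path (m)
sigma_point = rytov_variance(cn2, lam, L)
indices = [sigma_point * aperture_averaging_factor(d, lam, L)
           for d in (0.01, 0.05, 0.10, 0.20)]  # receiver diameters (m)
print(all(a > b for a, b in zip(indices, indices[1:])))  # index falls as D grows
```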

  20. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  1. Analytical Performance of Four Polymerase Chain Reaction (PCR) and Real Time PCR (qPCR) Assays for the Detection of Six Leishmania Species DNA in Colombia.

    PubMed

    León, Cielo M; Muñoz, Marina; Hernández, Carolina; Ayala, Martha S; Flórez, Carolina; Teherán, Aníbal; Cubides, Juan R; Ramírez, Juan D

    2017-01-01

    Leishmaniasis comprises a spectrum of parasitic diseases caused by protozoans of the genus Leishmania. Molecular tools have been widely employed for the detection of Leishmania due to their high sensitivity and specificity. However, the analytical performance of molecular platforms such as PCR and real-time PCR (qPCR) across a wide variety of molecular markers has never been evaluated. Herein, the aim was to evaluate the analytical performance of 4 PCR-based assays (designed on four different targets) applied on conventional and real-time PCR platforms. We evaluated the analytical performance of conventional PCR and real-time PCR, determining exclusivity and inclusivity, Anticipated Reportable Range (ARR), limit of detection (LoD) and accuracy using primers directed to the kDNA, HSP70, 18S and ITS-1 targets. We observed that the kDNA was the most sensitive but did not meet the criterion of exclusivity. The HSP70 presented a higher LoD in conventional PCR and qPCR in comparison with the other markers (1 × 10¹ and 1 × 10⁻¹ equivalent parasites/mL, respectively) and had a higher coefficient of variation in qPCR. No statistically significant differences were found between the days of the test with the four molecular markers. The present study revealed that the 18S marker presented the best performance in terms of analytical sensitivity and specificity for the qPCR in the species tested (species circulating in Colombia). Therefore, we recommend exploring the analytical and diagnostic performance in future studies using a broader number of species across America.
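
    As an illustration of how an LoD can be read off a dilution series, the sketch below takes the LoD as the lowest concentration at which at least 95% of replicates are detected. The hit counts are invented, and the 95% criterion is one common convention, not necessarily the study's exact procedure:

```python
# Hypothetical sketch: limit of detection (LoD) from a serial-dilution
# hit-rate experiment. The data below are invented for illustration.

def lod_95(dilution_series):
    """dilution_series: list of (concentration, hits, replicates) tuples,
    sorted from high to low concentration. Returns the lowest concentration
    at which the detection rate (and that of every higher concentration)
    is at least 95%, or None if even the highest concentration fails."""
    lod = None
    for conc, hits, n in dilution_series:  # walk from high to low
        if hits / n >= 0.95:
            lod = conc
        else:
            break
    return lod

# Hypothetical series in equivalent parasites/mL: (conc, hits, replicates)
series = [(1e3, 20, 20), (1e2, 20, 20), (1e1, 19, 20), (1e0, 12, 20)]
print(lod_95(series))  # 19/20 = 95% at 10 parasites/mL, only 60% at 1
```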

  2. Analytical Performance of Four Polymerase Chain Reaction (PCR) and Real Time PCR (qPCR) Assays for the Detection of Six Leishmania Species DNA in Colombia

    PubMed Central

    León, Cielo M.; Muñoz, Marina; Hernández, Carolina; Ayala, Martha S.; Flórez, Carolina; Teherán, Aníbal; Cubides, Juan R.; Ramírez, Juan D.

    2017-01-01

    Leishmaniasis comprises a spectrum of parasitic diseases caused by protozoans of the genus Leishmania. Molecular tools have been widely employed for the detection of Leishmania due to their high sensitivity and specificity. However, the analytical performance of molecular platforms such as PCR and real-time PCR (qPCR) across a wide variety of molecular markers has never been evaluated. Herein, the aim was to evaluate the analytical performance of 4 PCR-based assays (designed on four different targets) applied on conventional and real-time PCR platforms. We evaluated the analytical performance of conventional PCR and real-time PCR, determining exclusivity and inclusivity, Anticipated Reportable Range (ARR), limit of detection (LoD) and accuracy using primers directed to the kDNA, HSP70, 18S and ITS-1 targets. We observed that the kDNA was the most sensitive but did not meet the criterion of exclusivity. The HSP70 presented a higher LoD in conventional PCR and qPCR in comparison with the other markers (1 × 10¹ and 1 × 10⁻¹ equivalent parasites/mL, respectively) and had a higher coefficient of variation in qPCR. No statistically significant differences were found between the days of the test with the four molecular markers. The present study revealed that the 18S marker presented the best performance in terms of analytical sensitivity and specificity for the qPCR in the species tested (species circulating in Colombia). Therefore, we recommend exploring the analytical and diagnostic performance in future studies using a broader number of species across America. PMID:29046670

  3. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  4. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
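
    The mass-difference quantitation described above can be sketched with the usual relative-uptake ratio against undeuterated and fully deuterated controls, the latter correcting for back-exchange. The peptide masses and maxD below are hypothetical:

```python
# Sketch of HDX-MS quantitation: deuterium uptake is a centroid-mass
# difference relative to controls, not a peak intensity or area.
# All masses and the maxD value are hypothetical.

def relative_uptake(m_t: float, m_undeut: float, m_full: float) -> float:
    """Fraction (0..1) of the maximum possible deuteration at time t,
    with the fully deuterated control correcting for back-exchange."""
    return (m_t - m_undeut) / (m_full - m_undeut)

def deuterium_incorporated(m_t: float, m0: float, m100: float, max_d: int) -> float:
    """Number of deuterons incorporated, scaled by the peptide's number
    of exchangeable backbone amides (maxD)."""
    return max_d * relative_uptake(m_t, m0, m100)

# Hypothetical peptide: centroid masses in Da, 10 exchangeable amides.
m0, m100, max_d = 1000.00, 1008.00, 10
print(round(deuterium_incorporated(1003.20, m0, m100, max_d), 3))
```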

  5. Trace detection of analytes using portable raman systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alam, M. Kathleen; Hotchkiss, Peter J.; Martin, Laura E.

    Apparatuses and methods for in situ detection of a trace amount of an analyte are disclosed herein. In a general embodiment, the present disclosure provides a surface-enhanced Raman spectroscopy (SERS) insert including a passageway therethrough, where the passageway has a SERS surface positioned therein. The SERS surface is configured to adsorb molecules of an analyte of interest. A concentrated sample is caused to flow over the SERS surface. The SERS insert is then provided to a portable Raman spectroscopy system, where it is analyzed for the analyte of interest.

  6. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  7. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  8. Analytical study of comet nucleus samples

    NASA Technical Reports Server (NTRS)

    Albee, A. L.

    1989-01-01

    Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis and high-resolution mass spectrometry.

  9. Analytical Study of Ventilated Wind Tunnel Boundary Interference on V/ STOL Models Including Wake Curvature and Decay Effects

    DTIC Science & Technology

    1974-11-01

    [Abstract not available: the record contains only OCR fragments of the DTIC report documentation page.]

  10. Implementation of evidence-based home visiting programs aimed at reducing child maltreatment: A meta-analytic review.

    PubMed

    Casillas, Katherine L; Fauchier, Angèle; Derkash, Bridget T; Garrido, Edward F

    2016-03-01

    In recent years there has been an increase in the popularity of home visitation programs as a means of addressing risk factors for child maltreatment. The evidence supporting the effectiveness of these programs from several meta-analyses, however, is mixed. One potential explanation for this inconsistency explored in the current study involves the manner in which these programs were implemented. In the current study we reviewed 156 studies associated with 9 different home visitation program models targeted to caregivers of children between the ages of 0 and 5. Meta-analytic techniques were used to determine the impact of 18 implementation factors (e.g., staff selection, training, supervision, fidelity monitoring, etc.) and four study characteristics (publication type, target population, study design, comparison group) in predicting program outcomes. Results from analyses revealed that several implementation factors, including training, supervision, and fidelity monitoring, had a significant effect on program outcomes, particularly child maltreatment outcomes. Study characteristics, including the program's target population and the comparison group employed, also had a significant effect on program outcomes. Implications of the study's results for those interested in implementing home visitation programs are discussed. A careful consideration and monitoring of program implementation is advised as a means of achieving optimal study results. Copyright © 2015 Elsevier Ltd. All rights reserved.
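
    The inverse-variance pooling at the core of such meta-analytic techniques can be sketched as follows. A fixed-effect pooled estimate is shown with invented effect sizes and sampling variances; a real analysis of implementation factors would add moderator terms:

```python
import math

# Hedged sketch of inverse-variance pooling, the basic step behind
# meta-analytic reviews. Effect sizes and variances are invented.

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    est = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, se

effects = [0.30, 0.10, 0.25, 0.18]     # hypothetical per-study effect sizes (d)
variances = [0.02, 0.01, 0.04, 0.015]  # hypothetical sampling variances
est, se = pooled_effect(effects, variances)
print(round(est, 3), round(se, 3))
```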

  11. Developments in analytical instrumentation

    NASA Astrophysics Data System (ADS)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

  12. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics", the most significant new initiative of SoLAR, is identified.

  13. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  14. Chemical clocks, oscillations, and other temporal effects in analytical chemistry: oddity or viable approach?

    PubMed

    Prabhu, Gurpur Rakesh D; Witek, Henryk A; Urban, Pawel L

    2018-05-31

    Most analytical methods are based on "analogue" inputs from sensors of light, electric potentials, or currents. The signals obtained by such sensors are processed using certain calibration functions to determine concentrations of the target analytes. The signal readouts are normally done after an optimised and fixed time period, during which an assay mixture is incubated. This minireview covers another, somewhat unusual, analytical strategy, which relies on measuring the time interval between the occurrences of two distinguishable states in the assay reaction. These states manifest themselves via abrupt changes in the properties of the assay mixture (e.g. change of colour, appearance or disappearance of luminescence, change in pH, variations in optical activity or mechanical properties). In some cases, a correlation between the time of appearance/disappearance of a given property and the analyte concentration can also be observed. An example of an assay based on time measurement is an oscillating reaction, in which the period of oscillations is linked to the concentration of the target analyte. A number of chemo-chronometric assays, relying on existing (bio)transformations or artificially designed reactions, have been disclosed in the past few years. They are very attractive from the fundamental point of view, but so far only a few of them have been validated and used to address real-world problems. Can chemo-chronometric assays, then, become a practical tool for chemical analysis? Is there a need for further development of such assays? We aim to answer these questions.
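
    A toy model of a chemo-chronometric readout, not any specific assay from the review: a scavenger suppresses the observable change and is consumed at a rate proportional to the analyte concentration, so the switching time falls as the analyte concentration rises. All rate constants and concentrations are hypothetical:

```python
# Toy clock-reaction model: the colour change is delayed until a
# scavenger at concentration b0 is exhausted, giving a switching time
# t ~ b0 / (k * analyte). All parameter values are hypothetical.

def clock_time(analyte: float, b0: float = 1.0, k: float = 0.5,
               dt: float = 0.001) -> float:
    """Simulated time (s) until the scavenger is exhausted, by forward
    Euler integration of db/dt = -k * analyte (zero-order in scavenger)."""
    b, t = b0, 0.0
    while b > 0.0:
        b -= k * analyte * dt
        t += dt
    return t

# Calibration trend: higher analyte concentration -> shorter clock time.
times = [clock_time(a) for a in (0.5, 1.0, 2.0, 4.0)]
print(all(t1 > t2 for t1, t2 in zip(times, times[1:])))
```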

  15. Analytical effective tensor for flow-through composites

    DOEpatents

    Sviercoski, Rosangela De Fatima [Los Alamos, NM

    2012-06-19

    A machine, method and computer-usable medium for modeling the average flow of a substance through a composite material. The modeling includes an analytical calculation of an effective tensor K.sup.a suitable for use with a variety of media. The analytical calculation corresponds to an approximation to the tensor K, and proceeds by first computing the diagonal values and then identifying symmetries of the heterogeneity distribution. Additional calculations include determining the center of mass of the heterogeneous cell and its angle with respect to a defined Cartesian system, and using this angle in a rotation formula to compute the off-diagonal values and determine their sign.
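
    The off-diagonal step described in the abstract can be sketched in two dimensions: given the diagonal values and a symmetry angle theta, the full tensor follows from the rotation formula K = R(theta) diag(k1, k2) R(theta)^T. The diagonal values and angle below are hypothetical, and this is a generic illustration of the rotation formula, not the patented procedure:

```python
import math

# Sketch: rotating a diagonal 2-D tensor to obtain off-diagonal entries.
# Diagonal values and the heterogeneity-axis angle are hypothetical.

def rotate_tensor(k1: float, k2: float, theta: float):
    """Return the 2x2 tensor R(theta) @ diag(k1, k2) @ R(theta)^T
    as nested lists, expanded in closed form."""
    c, s = math.cos(theta), math.sin(theta)
    kxx = c * c * k1 + s * s * k2
    kyy = s * s * k1 + c * c * k2
    kxy = c * s * (k1 - k2)  # off-diagonal; sign follows the angle
    return [[kxx, kxy], [kxy, kyy]]

K = rotate_tensor(4.0, 1.0, math.pi / 6)  # 30-degree heterogeneity axis
print([[round(v, 3) for v in row] for row in K])
```

    The trace (and the eigenvalues k1, k2) are preserved by the rotation, which makes a convenient sanity check.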

  16. Colon-targeted delivery of live bacterial cell biotherapeutics including microencapsulated live bacterial cells

    PubMed Central

    Prakash, Satya; Malgorzata Urbanska, Aleksandra

    2008-01-01

    There has been ample interest in the delivery of therapeutic molecules using live cells. Oral delivery has been proposed as the best way to deliver live cells to humans for therapy, and the colon in particular has been suggested as the target site within the gastrointestinal (GI) tract. The main objective of these oral therapy procedures is to deliver live cells not only to treat diseases like colorectal cancer, inflammatory bowel disease, and other GI tract diseases like intestinal obstruction and gastritis, but also to deliver therapeutic molecules for overall therapy in various diseases such as renal failure, coronary heart disease, hypertension, and others. This review provides a comprehensive summary of recent advances in colon-targeted live bacterial cell biotherapeutics. The current status of bacterial cell therapy, the principles of artificial cells and their potential in oral delivery of live bacterial cell biotherapeutics for clinical applications, as well as future perspectives for biotherapeutics, are also discussed. PMID:19707368

  17. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies a problem-based learning method to a quantitative analytical chemistry course, "Analytical Chemistry II", especially the part related to essential oil analysis. The learning outcomes of this course include understanding of the lectures, skill in applying the course materials, and the ability to identify, formulate and solve chemical analysis problems. Study groups play an important role in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together on those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course has been shown to improve students' knowledge, skills, abilities and attitudes. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand materials and problems in English.

  18. Analytical Performance Characteristics of the Cepheid GeneXpert Ebola Assay for the Detection of Ebola Virus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinsky, Benjamin A.; Sahoo, Malaya K.; Sandlund, Johanna

    The recently developed Xpert® Ebola Assay is a novel nucleic acid amplification test for simplified detection of Ebola virus (EBOV) in whole blood and buccal swab samples. The assay targets sequences in two EBOV genes, lowering the risk for new variants to escape detection in the test. The objective of this report is to present analytical characteristics of the Xpert® Ebola Assay on whole blood samples. Our study evaluated the assay’s analytical sensitivity, analytical specificity, inclusivity and exclusivity performance in whole blood specimens. EBOV RNA, inactivated EBOV, and infectious EBOV were used as targets. The dynamic range of the assay, the inactivation of virus, and specimen stability were also evaluated. The lower limit of detection (LoD) for the assay using inactivated virus was estimated to be 73 copies/mL (95% CI: 51–97 copies/mL). The LoD for infectious virus was estimated to be 1 plaque-forming unit/mL, and for RNA to be 232 copies/mL (95% CI 163–302 copies/mL). The assay correctly identified five different Ebola viruses, Yambuku-Mayinga, Makona-C07, Yambuku-Ecran, Gabon-Ilembe, and Kikwit-956210, and correctly excluded all non-EBOV isolates tested. The conditions used by Xpert® Ebola for inactivation of infectious virus reduced EBOV titer by ≥6 logs. In conclusion, we found the Xpert® Ebola Assay to have high analytical sensitivity and specificity for the detection of EBOV in whole blood. It offers ease of use, fast turnaround time, and remote monitoring. The test has an efficient viral inactivation protocol, fulfills inclusivity and exclusivity criteria, and has specimen stability characteristics consistent with the need for decentralized testing. The simplicity of the assay should enable testing in a wide variety of laboratory settings, including remote laboratories that are not capable of performing highly complex nucleic acid amplification tests, and during outbreaks where time to detection is critical.

  19. Analytical Performance Characteristics of the Cepheid GeneXpert Ebola Assay for the Detection of Ebola Virus

    PubMed Central

    Pinsky, Benjamin A.; Sahoo, Malaya K.; Sandlund, Johanna; Kleman, Marika; Kulkarni, Medha; Grufman, Per; Nygren, Malin; Kwiatkowski, Robert; Baron, Ellen Jo; Tenover, Fred; Denison, Blake; Higuchi, Russell; Van Atta, Reuel; Beer, Neil Reginald; Carrillo, Alda Celena; Naraghi-Arani, Pejman; Mire, Chad E.; Ranadheera, Charlene; Grolla, Allen; Lagerqvist, Nina; Persing, David H.

    2015-01-01

    Background The recently developed Xpert® Ebola Assay is a novel nucleic acid amplification test for simplified detection of Ebola virus (EBOV) in whole blood and buccal swab samples. The assay targets sequences in two EBOV genes, lowering the risk for new variants to escape detection in the test. The objective of this report is to present analytical characteristics of the Xpert® Ebola Assay on whole blood samples. Methods and Findings This study evaluated the assay’s analytical sensitivity, analytical specificity, inclusivity and exclusivity performance in whole blood specimens. EBOV RNA, inactivated EBOV, and infectious EBOV were used as targets. The dynamic range of the assay, the inactivation of virus, and specimen stability were also evaluated. The lower limit of detection (LoD) for the assay using inactivated virus was estimated to be 73 copies/mL (95% CI: 51–97 copies/mL). The LoD for infectious virus was estimated to be 1 plaque-forming unit/mL, and for RNA to be 232 copies/mL (95% CI 163–302 copies/mL). The assay correctly identified five different Ebola viruses, Yambuku-Mayinga, Makona-C07, Yambuku-Ecran, Gabon-Ilembe, and Kikwit-956210, and correctly excluded all non-EBOV isolates tested. The conditions used by Xpert® Ebola for inactivation of infectious virus reduced EBOV titer by ≥6 logs. Conclusion In summary, we found the Xpert® Ebola Assay to have high analytical sensitivity and specificity for the detection of EBOV in whole blood. It offers ease of use, fast turnaround time, and remote monitoring. The test has an efficient viral inactivation protocol, fulfills inclusivity and exclusivity criteria, and has specimen stability characteristics consistent with the need for decentralized testing. The simplicity of the assay should enable testing in a wide variety of laboratory settings, including remote laboratories that are not capable of performing highly complex nucleic acid amplification tests, and during outbreaks where time to detection

  20. Analytical Performance Characteristics of the Cepheid GeneXpert Ebola Assay for the Detection of Ebola Virus

    DOE PAGES

    Pinsky, Benjamin A.; Sahoo, Malaya K.; Sandlund, Johanna; ...

    2015-11-12

    The recently developed Xpert® Ebola Assay is a novel nucleic acid amplification test for simplified detection of Ebola virus (EBOV) in whole blood and buccal swab samples. The assay targets sequences in two EBOV genes, lowering the risk for new variants to escape detection in the test. The objective of this report is to present analytical characteristics of the Xpert® Ebola Assay on whole blood samples. Our study evaluated the assay’s analytical sensitivity, analytical specificity, inclusivity and exclusivity performance in whole blood specimens. EBOV RNA, inactivated EBOV, and infectious EBOV were used as targets. The dynamic range of the assay, the inactivation of virus, and specimen stability were also evaluated. The lower limit of detection (LoD) for the assay using inactivated virus was estimated to be 73 copies/mL (95% CI: 51–97 copies/mL). The LoD for infectious virus was estimated to be 1 plaque-forming unit/mL, and for RNA to be 232 copies/mL (95% CI 163–302 copies/mL). The assay correctly identified five different Ebola viruses, Yambuku-Mayinga, Makona-C07, Yambuku-Ecran, Gabon-Ilembe, and Kikwit-956210, and correctly excluded all non-EBOV isolates tested. The conditions used by Xpert® Ebola for inactivation of infectious virus reduced EBOV titer by ≥6 logs. In conclusion, we found the Xpert® Ebola Assay to have high analytical sensitivity and specificity for the detection of EBOV in whole blood. It offers ease of use, fast turnaround time, and remote monitoring. The test has an efficient viral inactivation protocol, fulfills inclusivity and exclusivity criteria, and has specimen stability characteristics consistent with the need for decentralized testing. The simplicity of the assay should enable testing in a wide variety of laboratory settings, including remote laboratories that are not capable of performing highly complex nucleic acid amplification tests, and during outbreaks where time to detection is critical.
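
    The LoD figures above can be put in perspective with a simple Poisson sampling model: a reaction that draws a volume V of sample detects at least one target copy with probability 1 - exp(-c*V) at concentration c. This is a generic back-of-the-envelope sketch, not the authors' statistical method, and the 40 µL input volume below is a hypothetical value, not the assay's specification.

```python
import math

def lod95_poisson(volume_ml):
    # Under a Poisson sampling model, a reaction testing `volume_ml`
    # of sample detects >= 1 target copy with probability
    # 1 - exp(-c * volume_ml) at concentration c (copies/mL).
    # Solving 1 - exp(-c * V) = 0.95 for c gives the concentration
    # at which 95% of replicates are detected (the "95% LoD").
    return -math.log(0.05) / volume_ml

# Hypothetical 40 microliter (0.04 mL) effective input volume:
lod = lod95_poisson(0.04)  # ~75 copies/mL, the theoretical best case
```

    Even a perfectly efficient assay cannot beat this sampling bound, which is why LoDs of tens of copies/mL are already close to the limit for small input volumes.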

  1. The Role of Shaping the Client's Interpretations in Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Abreu, Paulo Roberto; Hubner, Maria Martha Costa; Lucchese, Fernanda

    2012-01-01

    Clinical behavior analysis often targets the shaping of clients' functional interpretations of, or rules about, their own behavior. These are referred to as clinically relevant behavior 3 (CRB3) in functional analytic psychotherapy (FAP). We suggest that CRB3s should be seen as contingency-specifying stimuli (CSS), due to their ability to change…

  2. Transport Phenomena in Thin Rotating Liquid Films Including: Nucleate Boiling

    NASA Technical Reports Server (NTRS)

    Faghri, Amir

    2005-01-01

    In this grant, experimental, numerical and analytical studies of heat transfer in a thin liquid film flowing over a rotating disk have been conducted. Heat transfer coefficients were measured experimentally in a rotating disk heat transfer apparatus where the disk was heated from below with electrical resistance heaters. The heat transfer measurements were supplemented by experimental characterization of the liquid film thickness using a novel laser-based technique. The heat transfer measurements show that the disk rotation plays an important role in enhancing heat transfer, primarily through the thinning of the liquid film. Experiments covered both momentum- and rotation-dominated regimes of the flow and heat transfer in this apparatus. Heat transfer measurements have been extended to include evaporation and nucleate boiling and these experiments are continuing in our laboratory. Empirical correlations have also been developed to provide useful information for design of compact high efficiency heat transfer devices. The experimental work has been supplemented by numerical and analytical analyses of the same problem. Both numerical and analytical results have been found to agree reasonably well with the experimental results on liquid film thickness and heat transfer coefficients/Nusselt numbers. The numerical simulations include the free surface liquid film flow and heat transfer under disk rotation including the conjugate effects. The analytical analysis utilizes an integral boundary layer approach from which

  3. The Challenge of Developing a Universal Case Conceptualization for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Bonow, Jordan T.; Maragakis, Alexandros; Follette, William C.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) targets a client's interpersonal behavior for change with the goal of improving his or her quality of life. One question guiding FAP case conceptualization is, "What interpersonal behavioral repertoires will allow a specific client to function optimally?" Previous FAP writings have suggested that a therapist…

  4. Selected Analytical Methods for Environmental Remediation and Recovery (SAM) - Home

    EPA Pesticide Factsheets

    The SAM Home page provides access to all information provided in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), and includes a query function allowing users to search methods by analyte, sample type and instrumentation.

  5. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  6. Methods and limitations in radar target imagery

    NASA Astrophysics Data System (ADS)

    Bertrand, P.

    An analytical examination of the reflectivity of radar targets is presented for the two-dimensional case of flat targets. A complex backscattering coefficient is defined for the amplitude and phase of the received field in comparison with the emitted field. The coefficient is dependent on the frequency of the emitted signal and the orientation of the target with respect to the transmitter. The target reflection is modeled in terms of the density of illuminated, colored points independent of one another. The target therefore is represented as an infinite family of densities indexed by the observational angle. Attention is given to the reflectivity parameters and their distribution function, and to the joint distribution function for the color, position, and directivity of bright points. It is shown that a fundamental ambiguity exists between the localization of the illuminated points and the determination of their directivity and color.

  7. A Modern Approach to College Analytical Chemistry.

    ERIC Educational Resources Information Center

    Neman, R. L.

    1983-01-01

    Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…

  8. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    PubMed Central

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has long-lasting societal impact. PMID:27740470

  9. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
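
    Method 2's NORMINV-based calculation amounts to asking what fraction of a Gaussian distribution, shifted by the normalized bias and widened by the normalized imprecision, falls outside the common reference limits. A minimal sketch of that idea (a hypothetical transcription using the Python standard library, not the authors' spreadsheet):

```python
from statistics import NormalDist

def fraction_outside(bias, imprecision, lo=-1.96, hi=1.96):
    # Fraction of results falling outside the common reference
    # interval [lo, hi] (in normalized units) for a method whose
    # analytical bias shifts the mean and whose imprecision
    # scales the standard deviation.
    d = NormalDist(mu=bias, sigma=imprecision)
    return d.cdf(lo) + (1.0 - d.cdf(hi))

# With zero bias and unit imprecision, ~5% of results fall
# outside the central 95% reference interval.
baseline = fraction_outside(0.0, 1.0)

# Any added bias pushes more results outside the limits.
biased = fraction_outside(0.5, 1.0)
```

    Combinations of bias and imprecision that hold this fraction at the level implied by the IFCC 120-individual recommendation would then define the performance specification.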

  10. An integrated molecular analysis of lung adenocarcinomas identifies potential therapeutic targets among TTF1-negative tumors including DNA repair proteins and Nrf2

    PubMed Central

    Cardnell, Robert J.G.; Behrens, Carmen; Diao, Lixia; Fan, YouHong; Tang, Ximing; Tong, Pan; Minna, John D.; Mills, Gordon B.; Heymach, John V.; Wistuba, Ignacio I.; Wang, Jing; Byers, Lauren A.

    2015-01-01

    Purpose Thyroid transcription factor-1 (TTF1) immunohistochemistry (IHC) is used clinically to differentiate primary lung adenocarcinomas (LUAD) from squamous lung cancers and metastatic adenocarcinomas from other primary sites. However, a subset of LUAD (15-20%) does not express TTF1 and TTF1-negative patients have worse clinical outcomes. As there are no established targeted agents with activity in TTF1-negative LUAD, we performed an integrated molecular analysis to identify potential therapeutic targets. Experimental Design Using two clinical LUAD cohorts (274 tumors), one from our institution (PROSPECT) and the TCGA, we interrogated proteomic profiles (by reverse-phase protein array (RPPA)), gene expression, and mutational data. Drug response data from 74 cell lines were used to validate potential therapeutic agents. Results Strong correlations were observed between TTF1 IHC and TTF1 measurements by RPPA (Rho=0.57, p<0.001) and gene expression (NKX2-1, Rho=0.61, p<0.001). Established driver mutations (e.g. BRAF and EGFR) were associated with high TTF1 expression. In contrast, TTF1-negative LUAD had a higher frequency of inactivating KEAP1 mutations (p=0.001). Proteomic profiling identified increased expression of DNA repair proteins (e.g., Chk1 and the DNA repair score) and suppressed PI3K/MAPK signaling among TTF1-negative tumors, with differences in total proteins confirmed at the mRNA level. Cell line analysis showed drugs targeting DNA repair to be more active in TTF1-low cell lines. Conclusions Combined genomic and proteomic analyses demonstrated infrequent alteration of validated lung cancer targets (including the absence of BRAF mutations in TTF1-negative LUAD), but identified novel potential targets for TTF1-negative LUAD including KEAP1/Nrf2 and DNA repair pathways. PMID:25878335

  11. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  12. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  13. Introducing Text Analytics as a Graduate Business School Course

    ERIC Educational Resources Information Center

    Edgington, Theresa M.

    2011-01-01

    Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…

  14. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is known now as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and pharmaceutical analytical technology (PAT).

  15. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is known now as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  16. Liquid-Liquid Extraction of Insecticides from Juice: An Analytical Chemistry Laboratory Experiment

    ERIC Educational Resources Information Center

    Radford, Samantha A.; Hunter, Ronald E., Jr.; Barr, Dana Boyd; Ryan, P. Barry

    2013-01-01

    A laboratory experiment was developed to target analytical chemistry students and to teach them about insecticides in food, sample extraction, and cleanup. Micro concentrations (sub-microgram/mL levels) of 12 insecticides spiked into apple juice samples are extracted using liquid-liquid extraction and cleaned up using either a primary-secondary…

  17. Strategic, Analytic and Operational Domains of Information Management.

    ERIC Educational Resources Information Center

    Diener, Richard AV

    1992-01-01

    Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…

  18. Integrating Variances into an Analytical Database

    NASA Technical Reports Server (NTRS)

    Sanchez, Carlos

    2010-01-01

    For this project, I enrolled in numerous SATERN courses that taught the basics of database programming. These include: Basic Access 2007 Forms, Introduction to Database Systems, Overview of Database Design, and others. My main job was to create an analytical database that can handle many stored forms and make it easy to interpret and organize. Additionally, I helped improve an existing database and populate it with information. These databases were designed to be used with data from Safety Variances and DCR forms. The research consisted of analyzing the database and comparing the data to find out which entries were repeated the most. If an entry happened to be repeated several times in the database, that would mean that the rule or requirement targeted by that variance has been bypassed many times already and so the requirement may not really be needed, but rather should be changed to allow the variance's conditions permanently. This project did not only restrict itself to the design and development of the database system, but also worked on exporting the data from the database to a different format (e.g. Excel or Word) so it could be analyzed in a simpler fashion. Thanks to the change in format, the data was organized in a spreadsheet that made it possible to sort the data by categories or types and helped speed up searches. Once my work with the database was done, the records of variances could be arranged so that they were displayed in numerical order, or one could search for a specific document targeted by the variances and restrict the search to only include variances that modified a specific requirement. A great part that contributed to my learning was SATERN, NASA's resource for education. Thanks to the SATERN online courses I took over the summer, I was able to learn many new things about computers and databases and also go more in depth into topics I already knew about.
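
    The repeated-entry analysis described above can be sketched in a few lines; the record fields here are hypothetical, since the actual Safety Variance and DCR form schemas are not given in the text:

```python
from collections import Counter

# Hypothetical variance records, each naming the requirement it bypasses.
variances = [
    {"id": 1, "requirement": "REQ-104"},
    {"id": 2, "requirement": "REQ-233"},
    {"id": 3, "requirement": "REQ-104"},
    {"id": 4, "requirement": "REQ-104"},
]

# Count how often each requirement is targeted; a requirement that is
# bypassed repeatedly is a candidate for permanent revision rather than
# continued case-by-case variances.
counts = Counter(v["requirement"] for v in variances)
most_bypassed, times = counts.most_common(1)[0]
```

    Exporting the same counts to a spreadsheet, as described above, then lets the records be sorted and filtered by the requirement they target.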

  19. Analytical and Biological Methods for Probing the Blood-Brain Barrier

    PubMed Central

    Sloan, Courtney D. Kuhnline; Nandi, Pradyot; Linz, Thomas H.; Aldrich, Jane V.; Audus, Kenneth L.; Lunte, Susan M.

    2013-01-01

    The blood-brain barrier (BBB) is an important interface between the peripheral and central nervous systems. It protects the brain against the infiltration of harmful substances and regulates the permeation of beneficial endogenous substances from the blood into the extracellular fluid of the brain. It can also present a major obstacle in the development of drugs that are targeted for the central nervous system. Several methods have been developed to investigate the transport and metabolism of drugs, peptides, and endogenous compounds at the BBB. In vivo methods include intravenous injection, brain perfusion, positron emission tomography, and microdialysis sampling. Researchers have also developed in vitro cell-culture models that can be employed to investigate transport and metabolism at the BBB without the complication of systemic involvement. All these methods require sensitive and selective analytical methods to monitor the transport and metabolism of the compounds of interest at the BBB. PMID:22708905

  20. Missing the target: including perspectives of women with overweight and obesity to inform stigma‐reduction strategies

    PubMed Central

    Himmelstein, M. S.; Gorin, A. A.; Suh, Y. J.

    2017-01-01

    Summary Objective Pervasive weight stigma and discrimination have led to ongoing calls for efforts to reduce this bias. Despite increasing research on stigma‐reduction strategies, perspectives of individuals who have experienced weight stigma have rarely been included to inform this research. The present study conducted a systematic examination of women with high body weight to assess their perspectives about a broad range of strategies to reduce weight‐based stigma. Methods Women with overweight or obesity (N = 461) completed an online survey in which they evaluated the importance, feasibility and potential impact of 35 stigma‐reduction strategies in diverse settings. Participants (91.5% who reported experiencing weight stigma) also completed self‐report measures assessing experienced and internalized weight stigma. Results Most participants assigned high importance to all stigma‐reduction strategies, with school‐based and healthcare approaches accruing the highest ratings. Adding weight stigma to existing anti‐harassment workplace training was rated as the most impactful and feasible strategy. The family environment was viewed as an important intervention target, regardless of participants' experienced or internalized stigma. Conclusion These findings underscore the importance of including people with stigmatized identities in stigma‐reduction research; their insights provide a necessary and valuable contribution that can inform ways to reduce weight‐based inequities and prioritize such efforts. PMID:28392929

  1. Contemporary Privacy Theory Contributions to Learning Analytics

    ERIC Educational Resources Information Center

    Heath, Jennifer

    2014-01-01

    With the continued adoption of learning analytics in higher education institutions, vast volumes of data are generated and "big data" related issues, including privacy, emerge. Privacy is an ill-defined concept and subject to various interpretations and perspectives, including those of philosophers, lawyers, and information systems…

  2. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKE SAMPLES

    EPA Science Inventory

    The Metals in Spikes data set contains the analytical results of measurements of up to 4 metals in 71 control samples (spikes) from 47 households. Measurements were made in samples of indoor and outdoor air, blood, and urine. Controls were used to assess recovery of target anal...

  3. Rapid and continuous analyte processing in droplet microfluidic devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strey, Helmut; Kimmerling, Robert; Bakowski, Tomasz

    The compositions and methods described herein are designed to introduce functionalized microparticles into droplets that can be manipulated in microfluidic devices by fields, including electric (dielectrophoretic) or magnetic fields, and extracted by splitting a droplet to separate the portion of the droplet that contains the majority of the microparticles from the part that is largely devoid of the microparticles. Within the device, channels are variously configured at Y- or T junctions that facilitate continuous, serial isolation and dilution of analytes in solution. The devices can be limited in the sense that they can be designed to output purified analytes that are then further analyzed in separate machines or they can include additional channels through which purified analytes can be further processed and analyzed.

  4. Annual banned-substance review: Analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans

    2018-01-01

    Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Considerations on the Use of Custom Accelerators for Big Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Minutoli, Marco

    Accelerators, including Graphic Processing Units (GPUs) for general purpose computation and many-core designs with wide vector units (e.g., Intel Phi), have become a common component of many high performance clusters. The appearance of more stable and reliable tools that can automatically convert code written in high-level specifications with annotations (such as C or C++) to hardware description languages (High-Level Synthesis - HLS) is also setting the stage for a broader use of reconfigurable devices (e.g., Field Programmable Gate Arrays - FPGAs) in high performance systems for the implementation of custom accelerators, helped by the fact that new processors include advanced cache-coherent interconnects for these components. In this chapter, we briefly survey the status of the use of accelerators in high performance systems targeted at big data analytics applications. We argue that, although the progress in the use of accelerators for this class of applications has been significant, differently from scientific simulations there still are gaps to close. This is particularly true for the "irregular" behaviors exhibited by no-SQL, graph databases. We focus our attention on the limits of HLS tools for data analytics and graph methods, and discuss a new architectural template that better fits the requirements of this class of applications. We validate the new architectural template by modifying the Graph Engine for Multithreaded System (GEMS) framework to support accelerators generated with such a methodology, and testing with queries coming from the Lehigh University Benchmark (LUBM). The architectural template better supports the task and memory level parallelism present in graph methods through a new control model and an enhanced memory interface. We show that our solution allows generating parallel accelerators, providing speed-ups with respect to conventional HLS flows. We finally draw conclusions and present a

  6. Implementation of an Autonomous Multi-Maneuver Targeting Sequence for Lunar Trans-Earth Injection

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan J.; Williams, Jacob

    2010-01-01

    Using a fully analytic initial guess estimate as a first iterate, a targeting procedure that constructs a flyable burn maneuver sequence to transfer a spacecraft from any closed Moon orbit to a desired Earth entry state is developed and implemented. The algorithm is built to support the need for an anytime abort capability for Orion. Based on project requirements, the Orion spacecraft must be able to autonomously calculate the translational maneuver targets for an entire Lunar mission. Translational maneuver target sequences for the Orion spacecraft include Lunar Orbit Insertion (LOI), Trans-Earth Injection (TEI), and Trajectory Correction Maneuvers (TCMs). This onboard capability is generally assumed to be supplemental to redundant ground computation in nominal mission operations and considered as a viable alternative primarily in loss of communications contingencies. Of these maneuvers, the ability to accurately and consistently establish a flyable 3-burn TEI target sequence is especially critical. The TEI is the sole means by which the crew can successfully return from the Moon to a narrowly banded Earth Entry Interface (EI) state. This is made even more critical by the desire for global access on the lunar surface. Currently, the designed propellant load is based on fully optimized TEI solutions for the worst case geometries associated with the accepted range of epochs and landing sites. This presents two challenges for an autonomous algorithm: in addition to being feasible, the targets must include burn sequences that do not exceed the anticipated propellant load.

  7. Recent Applications of Carbon-Based Nanomaterials in Analytical Chemistry: Critical Review

    PubMed Central

    Scida, Karen; Stege, Patricia W.; Haby, Gabrielle; Messina, Germán A.; García, Carlos D.

    2011-01-01

    The objective of this review is to provide a broad overview of the advantages and limitations of carbon-based nanomaterials with respect to analytical chemistry. Aiming to illustrate the impact of nanomaterials on the development of novel analytical applications, developments reported in the 2005–2010 period have been included and divided into sample preparation, separation, and detection. Within each section, fullerenes, carbon nanotubes, graphene, and composite materials will be addressed specifically. Also included, although only briefly discussed, is a section highlighting nanomaterials with interesting catalytic properties that can be used in the design of future devices for analytical chemistry. PMID:21458626

  8. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    analytics have been used in organizations for a variety of reasons for quite some time; ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in

  9. Accurate mass measurements and their appropriate use for reliable analyte identification.

    PubMed

    Godfrey, A Ruth; Brenton, A Gareth

    2012-09-01

    Accurate mass instrumentation is becoming increasingly available to non-expert users. These data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification has been described with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and Chemspider.

  10. Quantitative and Qualitative Relations between Motivation and Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Miele, David B.; Wigfield, Allan

    2014-01-01

    The authors examine two kinds of factors that affect students' motivation to engage in critical-analytic thinking. The first, which includes ability beliefs, achievement values, and achievement goal orientations, influences the "quantitative" relation between motivation and critical-analytic thinking; that is, whether students are…

  11. Analytic cognitive style predicts religious and paranormal belief.

    PubMed

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J; Fugelsang, Jonathan A

    2012-06-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracles, etc.) and paranormal beliefs (extrasensory perception, levitation, etc.) with performance measures of cognitive ability and analytic cognitive style. An analytic cognitive style negatively predicted both religious and paranormal beliefs when controlling for cognitive ability as well as religious engagement, sex, age, political ideology, and education. Participants more willing to engage in analytic reasoning were less likely to endorse supernatural beliefs. Further, an association between analytic cognitive style and religious engagement was mediated by religious beliefs, suggesting that an analytic cognitive style negatively affects religious engagement via lower acceptance of conventional religious beliefs. Results for types of God belief indicate that the association between an analytic cognitive style and God beliefs is more nuanced than mere acceptance and rejection, but also includes adopting less conventional God beliefs, such as Pantheism or Deism. Our data are consistent with the idea that two people who share the same cognitive ability, education, political ideology, sex, age and level of religious engagement can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Ontology for customer centric digital services and analytics

    NASA Astrophysics Data System (ADS)

    Keat, Ng Wai; Shahrir, Mohammad Shazri

    2017-11-01

    In computer science research, ontologies are commonly utilised to create a unified abstraction across many rich and different fields. In this paper, we apply the concept to the customer-centric domain of digital services analytics and present an analytics solution ontology. The essence is based on the traditional Entity Relationship Diagram (ERD), which was then abstracted to cover wider areas of customer-centric digital services. The ontology we developed covers both static aspects (customer identifiers) and dynamic aspects (customers' temporal interactions). The structure of the customer-scape is modelled with classes that represent different types of customer touch points, ranging from digital ones to digital stamps that represent physical analogies. The dynamic aspects of customer-centric digital services are modelled with a set of classes, with their importance represented in the different associations involving establishment and termination of the target interaction. The realised ontology can be used in the development of frameworks for customer-centric applications and for the specification of a common data format used by cooperating digital service applications.

  13. Dynamically analyte-responsive macrocyclic host-fluorophore systems.

    PubMed

    Ghale, Garima; Nau, Werner M

    2014-07-15

    …We will begin by describing the underlying principles that govern the use of macrocycle-fluorescent dye complexes to monitor time-dependent changes in analyte concentrations. Suitable chemosensing ensembles are introduced, along with their fluorescence responses (switch-on or switch-off). This includes supramolecular tandem assays in their product- and substrate-selective variants, and in their domino and enzyme-coupled modifications, with assays for amino acid decarboxylases, diamine oxidase, choline oxidase, proteases, methyltransferases, acetylcholinesterase (including an unpublished direct tandem assay), and potato apyrase as examples. It also includes the very recently introduced tandem membrane assays in their published influx and unpublished efflux variants, with the outer membrane protein F as channel protein and protamine as bidirectionally translocated analyte. As a proof of principle for environmental monitoring applications, we describe sensing ensembles for volatile hydrocarbons.

  14. Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.

    PubMed

    Valcárcel, Miguel

    2017-11-07

    This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.

  15. The drug target genes show higher evolutionary conservation than non-target genes.

    PubMed

    Lv, Wenhua; Xu, Yongdeng; Guo, Yiying; Yu, Ziqi; Feng, Guanglong; Liu, Panpan; Luan, Meiwei; Zhu, Hongjie; Liu, Guiyou; Zhang, Mingming; Lv, Hongchao; Duan, Lian; Shang, Zhenwei; Li, Jin; Jiang, Yongshuai; Zhang, Ruijie

    2016-01-26

    Although evidence indicates that drug target genes share some common evolutionary features, there have been few studies analyzing the evolutionary features of drug targets at an overall level. Therefore, we conducted an analysis to investigate the evolutionary characteristics of drug target genes. We compared the evolutionary conservation between human drug target genes and non-target genes by combining both evolutionary features and network topological properties in the human protein-protein interaction network. The evolution rate, conservation score and the percentage of orthologous genes across 21 species were included in our study. Meanwhile, four topological features, including the average shortest path length, betweenness centrality, clustering coefficient and degree, were considered for the comparison analysis. We obtained the following four results: compared with non-drug target genes, 1) drug target genes had lower evolutionary rates; 2) drug target genes had higher conservation scores; 3) drug target genes had higher percentages of orthologous genes and 4) drug target genes had a tighter network structure, including higher degrees, betweenness centrality and clustering coefficients, and lower average shortest path lengths. These results demonstrate that drug target genes are more evolutionarily conserved than non-drug target genes. We hope that our study will provide valuable information for other researchers who are interested in the evolutionary conservation of drug targets.
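    The topological comparison described in this record (degree and clustering coefficient between target and non-target gene sets) can be sketched as follows. This is an illustrative toy example with made-up gene names and edges, not the study's protein-protein interaction data:

    ```python
    # Toy sketch: compare mean degree and clustering coefficient between a
    # "target" and a "non-target" gene set in a small interaction graph.
    # Gene names (T1..N3) and edges are hypothetical, for illustration only.
    from collections import defaultdict
    from itertools import combinations

    edges = [("T1", "T2"), ("T1", "T3"), ("T2", "T3"),   # tight "target" cluster
             ("T3", "N1"), ("N1", "N2"), ("N2", "N3")]   # looser "non-target" chain

    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    def clustering(node):
        """Fraction of a node's neighbour pairs that are themselves connected."""
        neigh = adj[node]
        k = len(neigh)
        if k < 2:
            return 0.0
        links = sum(1 for u, v in combinations(neigh, 2) if v in adj[u])
        return 2.0 * links / (k * (k - 1))

    def mean(xs):
        return sum(xs) / len(xs)

    targets, non_targets = ["T1", "T2", "T3"], ["N1", "N2", "N3"]
    deg_t = mean([len(adj[g]) for g in targets])        # higher degree...
    deg_n = mean([len(adj[g]) for g in non_targets])
    clu_t = mean([clustering(g) for g in targets])      # ...and tighter clustering
    clu_n = mean([clustering(g) for g in non_targets])
    print(deg_t, deg_n, clu_t, clu_n)
    ```

    In this toy graph the "target" set comes out with both a higher mean degree and a higher mean clustering coefficient, mirroring the paper's qualitative finding of a tighter network structure around drug targets.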

  16. Exploratory Analysis in Learning Analytics

    ERIC Educational Resources Information Center

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  17. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  18. A sensitive multi-residue method for the determination of 35 micropollutants including pharmaceuticals, iodinated contrast media and pesticides in water.

    PubMed

    Valls-Cantenys, Carme; Scheurer, Marco; Iglesias, Mònica; Sacher, Frank; Brauch, Heinz-Jürgen; Salvadó, Victoria

    2016-09-01

    A sensitive, multi-residue method using solid-phase extraction followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed to determine a representative group of 35 analytes, including corrosion inhibitors, pesticides and pharmaceuticals such as analgesic and anti-inflammatory drugs, five iodinated contrast media, β-blockers and some of their metabolites and transformation products in water samples. Few other methods are capable of determining such a broad range of contrast media together with other analytes. We studied the parameters affecting the extraction of the target analytes, including sorbent selection and extraction conditions, their chromatographic separation (mobile phase composition and column) and detection conditions using two ionisation sources: electrospray ionisation (ESI) and atmospheric pressure chemical ionisation (APCI). In order to correct matrix effects, a total of 20 surrogate/internal standards were used. ESI was found to have better sensitivity than APCI. Recoveries ranging from 79 to 134% for tap water and 66 to 144% for surface water were obtained. Intra-day precision, calculated as relative standard deviation, was below 34% for tap water and below 21% for surface water, groundwater and effluent wastewater. Method quantification limits (MQL) were in the low ng L−1 range, except for the contrast agents iomeprol, amidotrizoic acid and iohexol (22, 25.5 and 17.9 ng L−1, respectively). Finally, the method was applied to the analysis of 56 real water samples as part of the validation procedure. All of the compounds were detected in at least some of the water samples analysed. Graphical Abstract: Multi-residue method for the determination of micropollutants including pharmaceuticals, iodinated contrast media and pesticides in waters by LC-MS/MS.
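    The validation figures quoted in this record (spike recovery and intra-day precision as relative standard deviation) are standard calculations. A minimal sketch with made-up replicate measurements, not the study's data:

    ```python
    # Illustrative sketch: spike recovery (%) and intra-day precision (RSD, %)
    # for one analyte. The spiked concentration and replicate readings below
    # are hypothetical numbers, not values from the paper.
    from statistics import mean, stdev

    spiked_conc = 100.0                      # ng/L added to the sample
    replicates = [92.0, 98.5, 95.0, 101.2]   # measured concentrations, ng/L

    recovery_pct = mean(replicates) / spiked_conc * 100.0
    rsd_pct = stdev(replicates) / mean(replicates) * 100.0  # sample std dev

    print(f"recovery = {recovery_pct:.1f} %")
    print(f"intra-day RSD = {rsd_pct:.1f} %")
    ```

    A recovery near 100% and an RSD below the method's acceptance threshold (e.g. the 21-34% figures reported above) would pass this validation step.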

  19. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    PubMed

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine if analytical bias is present when comparing methods, commutable samples, i.e. samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  20. Electrospray ion source with reduced analyte electrochemistry

    DOEpatents

    Kertesz, Vilmos [Knoxville, TN; Van Berkel, Gary [Clinton, TN

    2011-08-23

    An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and a counter-electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.

  1. Electrospray ion source with reduced analyte electrochemistry

    DOEpatents

    Kertesz, Vilmos; Van Berkel, Gary J

    2013-07-30

    An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and a counter-electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.

  2. Integrated signal probe based aptasensor for dual-analyte detection.

    PubMed

    Xiang, Juan; Pi, Xiaomei; Chen, Xiaoqing; Xiang, Lei; Yang, Minghui; Ren, Hao; Shen, Xiaojuan; Qi, Ning; Deng, Chunyan

    2017-10-15

    For multi-analyte detection, although sensitivity has commonly met practical requirements, reliability, reproducibility and stability need to be further improved. In this work, two different aptamer probes labeled with redox tags were used as signal probe 1 (sP1) and signal probe 2 (sP2), which were integrated into one unified DNA architecture to develop the integrated signal probe (ISP). Compared with conventional independent signal probes for simultaneous multi-analyte detection, the proposed ISP was more reproducible and accurate. This is because an ISP built on one DNA structure ensures identical modification conditions and an equal stoichiometric ratio between sP1 and sP2; furthermore, cross-interference between sP1 and sP2 can be prevented by regulating their complementary positions. The ISP-based assay system represents significant progress for dual-analyte detection. Combined with gold nanoparticle (AuNP) signal amplification, an ISP/AuNP-based aptasensor for sensitive dual-analyte detection was explored. Based on the DNA structural switching induced by targets binding to their aptamers, simultaneous dual-analyte detection was achieved simply by monitoring the electrochemical responses of methylene blue (MB) and ferrocene (Fc). The proposed detection system possesses such advantages as simplicity in design, easy operation, good reproducibility and accuracy, and high sensitivity and selectivity, which indicates the excellent potential of this aptasensor in the field of clinical diagnosis and other molecular sensors. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Concurrence of big data analytics and healthcare: A systematic review.

    PubMed

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges to its adoption in healthcare. It also intends to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of…

  4. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J; Grassberger, C; Paganetti, H

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1-2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we…

  5. Using personal glucose meters and functional DNA sensors to quantify a variety of analytical targets

    PubMed Central

    Xiang, Yu; Lu, Yi

    2012-01-01

    Portable, low-cost and quantitative detection of a broad range of targets at home and in the field has the potential to revolutionize medical diagnostics and environmental monitoring. Despite many years of research, very few such devices are commercially available. Taking advantage of the wide availability and low cost of the pocket-sized personal glucose meter—used worldwide by diabetes sufferers—we demonstrate a method to use such meters to quantify non-glucose targets, ranging from a recreational drug (cocaine, 3.4 μM detection limit) to an important biological cofactor (adenosine, 18 μM detection limit), to a disease marker (interferon-gamma of tuberculosis, 2.6 nM detection limit) and a toxic metal ion (uranium, 9.1 nM detection limit). The method is based on the target-induced release of invertase from a functional-DNA–invertase conjugate. The released invertase converts sucrose into glucose, which is detectable using the meter. The approach should be easily applicable to the detection of many other targets through the use of suitable functional-DNA partners (aptamers, DNAzymes or aptazymes). PMID:21860458
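    The quantification step in this scheme (target → invertase release → sucrose-to-glucose conversion → meter reading) ultimately relies on inverting a pre-measured calibration curve from glucose readings back to target concentration. A minimal sketch with hypothetical calibration points, not data from the paper:

    ```python
    # Hypothetical sketch: map a glucose-meter reading back to a target
    # concentration via piecewise-linear interpolation of a calibration curve.
    # The (target µM, glucose mg/dL) pairs are invented for illustration and
    # assumed monotonic over the working range.
    from bisect import bisect_left

    calibration = [(0.0, 20.0), (5.0, 45.0), (10.0, 80.0), (20.0, 130.0)]

    def target_from_reading(glucose_mg_dl):
        """Interpolate the calibration curve; clamp outside the measured range."""
        readings = [g for _, g in calibration]
        i = bisect_left(readings, glucose_mg_dl)
        if i == 0:
            return calibration[0][0]
        if i == len(calibration):
            return calibration[-1][0]
        (c0, g0), (c1, g1) = calibration[i - 1], calibration[i]
        return c0 + (c1 - c0) * (glucose_mg_dl - g0) / (g1 - g0)

    print(target_from_reading(62.5))  # halfway between 45 and 80 mg/dL -> 7.5 µM
    ```

    In practice the calibration would be measured for each functional-DNA partner, since the invertase release efficiency, and hence the glucose signal per unit of target, differs between aptamers.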

  6. Using personal glucose meters and functional DNA sensors to quantify a variety of analytical targets

    NASA Astrophysics Data System (ADS)

    Xiang, Yu; Lu, Yi

    2011-09-01

    Portable, low-cost and quantitative detection of a broad range of targets at home and in the field has the potential to revolutionize medical diagnostics and environmental monitoring. Despite many years of research, very few such devices are commercially available. Taking advantage of the wide availability and low cost of the pocket-sized personal glucose meter—used worldwide by diabetes sufferers—we demonstrate a method to use such meters to quantify non-glucose targets, ranging from a recreational drug (cocaine, 3.4 µM detection limit) to an important biological cofactor (adenosine, 18 µM detection limit), to a disease marker (interferon-gamma of tuberculosis, 2.6 nM detection limit) and a toxic metal ion (uranium, 9.1 nM detection limit). The method is based on the target-induced release of invertase from a functional-DNA-invertase conjugate. The released invertase converts sucrose into glucose, which is detectable using the meter. The approach should be easily applicable to the detection of many other targets through the use of suitable functional-DNA partners (aptamers, DNAzymes or aptazymes).

  7. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  8. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  9. Targeting Membrane-Bound Viral RNA Synthesis Reveals Potent Inhibition of Diverse Coronaviruses Including the Middle East Respiratory Syndrome Virus

    PubMed Central

    Bergström, Tomas; Kann, Nina; Adamiak, Beata; Hannoun, Charles; Kindler, Eveline; Jónsdóttir, Hulda R.; Muth, Doreen; Kint, Joeri; Forlenza, Maria; Müller, Marcel A.; Drosten, Christian; Thiel, Volker; Trybala, Edward

    2014-01-01

    Coronaviruses raise serious concerns as emerging zoonotic viruses without specific antiviral drugs available. Here we screened a collection of 16671 diverse compounds for anti-human coronavirus 229E activity and identified an inhibitor, designated K22, that specifically targets membrane-bound coronaviral RNA synthesis. K22 exerts its most potent antiviral activity after virus entry, during an early step of the viral life cycle. Specifically, the formation of double membrane vesicles (DMVs), a hallmark of coronavirus replication, was greatly impaired upon K22 treatment, accompanied by near-complete inhibition of viral RNA synthesis. K22-resistant viruses contained substitutions in non-structural protein 6 (nsp6), a membrane-spanning integral component of the viral replication complex implicated in DMV formation, corroborating that K22 targets membrane-bound viral RNA synthesis. Besides K22 resistance, the nsp6 mutants induced a reduced number of DMVs and displayed decreased specific infectivity, while RNA synthesis was not affected. Importantly, K22 inhibits a broad range of coronaviruses, including Middle East respiratory syndrome coronavirus (MERS-CoV), and efficient inhibition was achieved in primary human epithelial cultures representing the entry port of human coronavirus infection. Collectively, this study proposes an evolutionarily conserved step in the life cycle of positive-stranded RNA viruses, the recruitment of cellular membranes for viral replication, as a vulnerable and, most importantly, druggable target for antiviral intervention. We expect this mode of action to serve as a paradigm for the development of potent antiviral drugs to combat many animal and human virus infections. PMID:24874215

  10. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are showing an increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at the early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017. PMID:29850370

  11. Predictors of Bullying and Victimization in Childhood and Adolescence: A Meta-Analytic Investigation

    ERIC Educational Resources Information Center

    Cook, Clayton R.; Williams, Kirk R.; Guerra, Nancy G.; Kim, Tia E.; Sadek, Shelly

    2010-01-01

    Research on the predictors of 3 bully status groups (bullies, victims, and bully victims) for school-age children and adolescents was synthesized using meta-analytic procedures. The primary purpose was to determine the relative strength of individual and contextual predictors to identify targets for prevention and intervention. Age and how…

  12. Recent applications of carbon-based nanomaterials in analytical chemistry: critical review.

    PubMed

    Scida, Karen; Stege, Patricia W; Haby, Gabrielle; Messina, Germán A; García, Carlos D

    2011-04-08

    The objective of this review is to provide a broad overview of the advantages and limitations of carbon-based nanomaterials with respect to analytical chemistry. Aiming to illustrate the impact of nanomaterials on the development of novel analytical applications, developments reported in the 2005-2010 period have been included and divided into sample preparation, separation, and detection. Within each section, fullerenes, carbon nanotubes, graphene, and composite materials will be addressed specifically. Although only briefly discussed, included is a section highlighting nanomaterials with interesting catalytic properties that can be used in the design of future devices for analytical chemistry. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    Considering the current regulatory requirements for analytical method development, a reversed-phase high performance liquid chromatographic method for the routine analysis of etofenamate in dosage forms has been optimized using an analytical quality by design approach. Unlike the routine approach, the present study was initiated with an understanding of the quality target product profile, the analytical target profile, and a risk assessment of the method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μ), a binary pump and a photodiode array detector was used in this work. The experiments were planned by central composite design, which could save time, reagents and other resources. Sigma Tech software was used to plan and analyse the experimental observations and to obtain a quadratic process model. The process model was used to predict the retention time. The retention times predicted from the contour diagram were verified experimentally and agreed with the observed data. The optimized method used a flow rate of 1.2 ml/min with a mobile phase of methanol and 0.2% triethylamine in water (85:15, % v/v), pH adjusted to 6.5. The method was validated and verified for the targeted method performance, robustness and system suitability during method transfer. PMID:26997704

  14. Analytical impact time and angle guidance via time-varying sliding mode technique.

    PubMed

    Zhao, Yao; Sheng, Yongzhi; Liu, Xiangdong

    2016-05-01

    To provide a feasible solution for homing missiles with precise impact time and angle requirements, this paper develops a novel guidance law based on the nonlinear engagement dynamics. The guidance law is first designed under the assumption of a stationary target and then extended to a moving-target scenario. The time-varying sliding mode (TVSM) technique is applied to fulfill the terminal constraints, in which a specific TVSM surface is constructed with two unknown coefficients. One is tuned to meet the impact time requirement and the other is targeted with a global sliding mode, so that the impact angle constraint as well as zero miss distance can be satisfied. Because the proposed law possesses three guidance gains as design parameters, the intercept trajectory can be shaped according to the operational conditions and the missile's capability. To improve the tolerance of initial heading errors and broaden the application, a new frame of reference is also introduced. Furthermore, the analytical solutions of the flight trajectory, heading angle and acceleration command can be expressed in full for prediction and offline parameter selection by solving a first-order linear differential equation. Numerical simulation results for various scenarios validate the effectiveness of the proposed guidance law and demonstrate the accuracy of the analytic solutions. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
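
    The closed-form solutions mentioned above come from solving a first-order linear differential equation. The sketch below illustrates only that generic step, with an illustrative stand-in dynamic (the paper's actual guidance dynamics are not given in the abstract), checking the analytic solution against Euler integration:

```python
import math

# Illustrative first-order linear ODE: theta' = -a * (theta - theta_ref).
# Analytic solution: theta(t) = theta_ref + (theta0 - theta_ref) * exp(-a*t).
# This is a stand-in for closed-form trajectory prediction, not the actual
# guidance law from the paper.

def theta_analytic(t, theta0, theta_ref, a):
    return theta_ref + (theta0 - theta_ref) * math.exp(-a * t)

def theta_euler(t_end, theta0, theta_ref, a, steps=100_000):
    """Explicit Euler integration of the same ODE, for comparison."""
    dt = t_end / steps
    th = theta0
    for _ in range(steps):
        th += -a * (th - theta_ref) * dt
    return th

theta0, theta_ref, a, t_end = 0.5, 0.0, 1.2, 3.0
closed_form = theta_analytic(t_end, theta0, theta_ref, a)
numeric = theta_euler(t_end, theta0, theta_ref, a)
```

    The closed form gives the terminal heading directly, which is what makes offline parameter selection cheap compared with repeated numerical integration.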

  15. Semi-analytic model of plasma-jet-driven magneto-inertial fusion

    DOE PAGES

    Langendorf, Samuel J.; Hsu, Scott C.

    2017-03-01

    A semi-analytic model for plasma-jet-driven magneto-inertial fusion is presented here. Compressions of a magnetized plasma target by a spherically imploding plasma liner are calculated in one dimension (1D), accounting for compressible hydrodynamics and ionization of the liner material, energy losses due to conduction and radiation, fusion burn and alpha deposition, separate ion and electron temperatures in the target, magnetic pressure, and fuel burn-up. Results show 1D gains of 3–30 at spherical convergence ratio <15 and 20–40 MJ of liner energy, for cases in which the liner thickness is 1 cm and the initial radius of a preheated magnetized target is 4 cm. Some exploration of parameter space and physics settings is presented. The yields observed suggest that there is a possibility of igniting additional dense fuel layers to reach high gain.

  16. Hanford analytical sample projections FY 1998--FY 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  17. Enhancement in the sensitivity of microfluidic enzyme-linked immunosorbent assays through analyte preconcentration.

    PubMed

    Yanagisawa, Naoki; Dutta, Debashis

    2012-08-21

    In this Article, we describe a microfluidic enzyme-linked immunosorbent assay (ELISA) method whose sensitivity can be substantially enhanced through preconcentration of the target analyte around a semipermeable membrane. The reported preconcentration has been accomplished in our current work via electrokinetic means allowing a significant increase in the amount of captured analyte relative to nonspecific binding in the trapping/detection zone. Upon introduction of an enzyme substrate into this region, the rate of generation of the ELISA reaction product (resorufin) was observed to increase by over a factor of 200 for the sample and 2 for the corresponding blank compared to similar assays without analyte trapping. Interestingly, in spite of nonuniformities in the amount of captured analyte along the surface of our analysis channel, the measured fluorescence signal in the preconcentration zone increased linearly with time over an enzyme reaction period of 30 min and at a rate that was proportional to the analyte concentration in the bulk sample. In our current study, the reported technique has been shown to reduce the smallest detectable concentration of the tumor marker CA 19-9 and Blue Tongue Viral antibody by over 2 orders of magnitude compared to immunoassays without analyte preconcentration. When compared to microwell based ELISAs, the reported microfluidic approach not only yielded a similar improvement in the smallest detectable analyte concentration but also reduced the sample consumption in the assay by a factor of 20 (5 μL versus 100 μL).
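
    The quantitation principle above (fluorescence rising linearly in time, at a rate proportional to bulk analyte concentration) can be sketched as a slope fit. The data and the rate constant K below are synthetic, purely to illustrate the calculation:

```python
# Quantitation from a linear signal-vs-time trace: fit the slope by least
# squares, then convert to concentration via a calibration factor.
# The data and the calibration constant K are synthetic/hypothetical.

def slope(ts, ys):
    """Ordinary least-squares slope of y against t."""
    n = len(ts)
    tm = sum(ts) / n
    ym = sum(ys) / n
    num = sum((t - tm) * (y - ym) for t, y in zip(ts, ys))
    den = sum((t - tm) ** 2 for t in ts)
    return num / den

K = 2.5                                       # hypothetical signal rate per unit concentration
true_conc = 4.0
times = [0, 5, 10, 15, 20, 25, 30]            # minutes, as in a 30-min reaction
signal = [K * true_conc * t + 1.0 for t in times]   # constant offset = background

est_conc = slope(times, signal) / K
```

    Fitting the slope rather than a single endpoint is what makes the readout robust to the constant background term.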

  18. Kalman filter data assimilation: targeting observations and parameter estimation.

    PubMed

    Bellsky, Thomas; Kostelich, Eric J; Mahalov, Alex

    2014-06-01

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.
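
    The targeting rule described above (observe where the ensemble spread is largest) can be sketched with a toy ensemble and a scalar Kalman update. This is an illustrative simplification, not the LETKF used in the paper; cross-component updates are omitted:

```python
import random
import statistics

random.seed(0)

# Toy ensemble: 50 members, 5 state components with increasing spread.
N_ENS, DIM = 50, 5
ens = [[random.gauss(0.0, 0.5 + i) for i in range(DIM)] for _ in range(N_ENS)]

def ens_var(ensemble, i):
    return statistics.pvariance([m[i] for m in ensemble])

# Targeting rule from the abstract: observe the component whose ensemble
# variance is largest.
target = max(range(DIM), key=lambda i: ens_var(ens, i))

# Scalar perturbed-observation ensemble Kalman update of the targeted
# component only (a simplification of the full LETKF analysis step).
obs, obs_sd = 0.0, 0.1
prior_var = ens_var(ens, target)
K = prior_var / (prior_var + obs_sd ** 2)
for m in ens:
    m[target] += K * (obs + random.gauss(0.0, obs_sd) - m[target])
post_var = ens_var(ens, target)
```

    Placing the observation where the prior spread is largest maximizes the variance reduction achieved by a single measurement, which is the intuition behind the targeting result.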

  19. Big Data Analytics in Chemical Engineering.

    PubMed

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  20. Multi-analyte method development for analysis of brominated flame retardants (BFRs) and PBDE metabolites in human serum.

    PubMed

    Lu, Dasheng; Jin, Yu'e; Feng, Chao; Wang, Dongli; Lin, Yuanjie; Qiu, Xinlei; Xu, Qian; Wen, Yimin; She, Jianwen; Wang, Guoquan; Zhou, Zhijun

    2017-09-01

    Analytical methods that measure brominated flame retardants (BFRs) of different chemical polarities in human serum are commonly labor-intensive and tedious. Our study used acidified diatomaceous earth as solid-phase extraction (SPE) adsorbent and defatting material to simultaneously determine the most abundant BFRs and their metabolites with different polarities in human serum samples. The analytes include three types of commercial BFRs, tetrabromobisphenol A (TBBPA), hexabromocyclododecane (HBCD) isomers, and polybrominated diphenyl ethers (PBDEs), as well as the dominant hydroxylated BDE (OH-PBDE) and methoxylated BDE (MeO-PBDE) metabolites of PBDEs. The sample eluents were sequentially analyzed for PBDEs and MeO-BDEs by online gel permeation chromatography/gas chromatography-electron capture-negative ionization mass spectrometry (online GPC GC-ECNI-MS) and for TBBPA, HBCD, and OH-BDEs by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Method recoveries were 67-134% with a relative standard deviation (RSD) of less than 20%. Method detection limits (MDLs) were 0.30-4.20 pg/mL fresh weight (f.w.) for all analytes, except for BDE-209 (16 pg/mL f.w.). The methodology was also applied in a pilot study in which ten serum samples from healthy donors in China were analyzed; the majority of target analytes were detected at rates above 80%. To our knowledge, this is the first method to effectively determine most types of BFRs in a single aliquot of human serum. This new analytical method is more specific, sensitive, accurate, and time saving for routine biomonitoring of these BFRs and for integrated assessment of the health risks of BFR exposure.
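
    The method-performance figures quoted above (recovery and relative standard deviation) are computed from replicate spiked samples. A minimal sketch with hypothetical replicate values:

```python
import statistics

# Hypothetical replicate measurements (pg/mL) of a serum sample spiked at
# 100 pg/mL; the values are illustrative, not from the paper.
spike_level = 100.0
measured = [91.0, 88.5, 95.2, 90.1, 93.4, 89.8]

recovery_pct = statistics.mean(measured) / spike_level * 100.0
rsd_pct = statistics.stdev(measured) / statistics.mean(measured) * 100.0

# Acceptance window quoted in the abstract: recovery 67-134%, RSD < 20%.
acceptable = 67.0 <= recovery_pct <= 134.0 and rsd_pct < 20.0
```
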

  1. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  2. A comparison of two dose calculation algorithms-anisotropic analytical algorithm and Acuros XB-for radiation therapy planning of canine intranasal tumors.

    PubMed

    Nagata, Koichi; Pethel, Timothy D

    2017-07-01

    Although the anisotropic analytical algorithm (AAA) and Acuros XB (AXB) are both radiation dose calculation algorithms that take into account the heterogeneity within the radiation field, Acuros XB is inherently more accurate. The purpose of this retrospective method comparison study was to compare them and evaluate the dose discrepancy within the planning target volume (PTV). Radiation therapy (RT) plans of 11 dogs with intranasal tumors treated by radiation therapy at the University of Georgia were evaluated. All dogs were planned for intensity-modulated radiation therapy using nine equally spaced coplanar X-ray beams, and the dose was calculated with the anisotropic analytical algorithm. The same plan with the same monitor units was then recalculated using Acuros XB for comparison. Each dog's planning target volume was separated into air, bone, and tissue and evaluated. The mean dose to the planning target volume estimated by Acuros XB was 1.3% lower. It was 1.4% higher for air, 3.7% lower for bone, and 0.9% lower for tissue. The volume of the planning target volume covered by the prescribed dose decreased by 21% when Acuros XB was used, due to increased dose heterogeneity within the planning target volume. The anisotropic analytical algorithm relatively underestimates the dose heterogeneity and relatively overestimates the dose to the bone and tissue within the planning target volume for radiation therapy planning of canine intranasal tumors. This can be clinically significant, especially if tumor cells are present within the bone, because it may result in relative underdosing of the tumor. © 2017 American College of Veterinary Radiology.
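
    The plan-comparison metrics above (mean PTV dose difference and coverage by the prescribed dose) reduce to simple per-voxel arithmetic. A sketch with hypothetical voxel doses, not the study's data:

```python
# Hypothetical per-voxel PTV doses (Gy) from two dose calculation
# algorithms applied to the same plan; the values are illustrative only.
prescribed = 50.0
dose_a = [51.0, 50.5, 49.8, 52.0, 50.2, 49.9, 51.5, 50.8]   # e.g. AAA-like
dose_b = [50.4, 50.1, 48.2, 51.8, 49.6, 48.9, 51.2, 50.3]   # e.g. AXB-like

mean_a = sum(dose_a) / len(dose_a)
mean_b = sum(dose_b) / len(dose_b)
mean_diff_pct = (mean_b - mean_a) / mean_a * 100.0

def coverage(doses, rx):
    """Fraction of PTV voxels receiving at least the prescribed dose."""
    return sum(d >= rx for d in doses) / len(doses)

cov_a = coverage(dose_a, prescribed)
cov_b = coverage(dose_b, prescribed)
```

    A lower mean dose combined with wider voxel-to-voxel spread is exactly the pattern that shrinks coverage, mirroring the 21% coverage drop reported above.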

  3. Nuclear reactor target assemblies, nuclear reactor configurations, and methods for producing isotopes, modifying materials within target material, and/or characterizing material within a target material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toth, James J.; Wall, Donald; Wittman, Richard S.

    Target assemblies are provided that can include a uranium-comprising annulus. The assemblies can include target material consisting essentially of non-uranium material within the volume of the annulus. Reactors are disclosed that can include one or more discrete zones configured to receive target material. At least one uranium-comprising annulus can be within one or more of the zones. Methods for producing isotopes within target material are also disclosed, with the methods including providing neutrons to target material within a uranium-comprising annulus. Methods for modifying materials within target material are disclosed, as are methods for characterizing material within a target material.

  4. Optimization of analytical and pre-analytical conditions for MALDI-TOF-MS human urine protein profiles.

    PubMed

    Calvano, C D; Aresta, A; Iacovone, M; De Benedetto, G E; Zambonin, C G; Battaglia, M; Ditonno, P; Rutigliano, M; Bettocchi, C

    2010-03-11

    Protein analysis in biological fluids, such as urine, by means of mass spectrometry (MS) still suffers from insufficient standardization in protocols for sample collection, storage and preparation. In this work, the influence of these variables on the protein profiling of healthy donor human urine performed by matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was studied. A screening of various urine sample pre-treatment procedures and different sample deposition approaches on the MALDI target was performed. The influence of urine sample storage time and temperature on spectral profiles was evaluated by means of principal component analysis (PCA). The whole optimized procedure was eventually applied to the MALDI-TOF-MS analysis of human urine samples taken from prostate cancer patients. The best results in terms of detected ion number and abundance in the MS spectra were obtained by using home-made microcolumns packed with hydrophilic-lipophilic balance (HLB) resin as the sample pre-treatment method; this procedure was also less expensive and suitable for high-throughput analyses. Afterwards, the spin coating approach for sample deposition on the MALDI target plate was optimized, obtaining homogeneous and reproducible spots. PCA then indicated that low storage temperatures of acidified and centrifuged samples, together with short handling times, made it possible to obtain reproducible profiles without artifact contributions due to experimental conditions. Finally, interesting differences were found by comparing the MALDI-TOF-MS protein profiles of pooled urine samples of healthy donors and prostate cancer patients. The results showed that analytical and pre-analytical variables are crucial for the success of urine analysis, to obtain meaningful and reproducible data, even if the intra-patient variability is very difficult to avoid. It has been shown that pooled urine samples can be an effective way to simplify the comparison between
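
    PCA, used above to judge the reproducibility of spectral profiles, extracts the directions of largest variance. A minimal stdlib-only sketch recovering the first principal component of synthetic "profiles" by power iteration (the actual MALDI data are not reproduced here):

```python
import random

random.seed(1)

# Synthetic "profiles": 20 samples x 6 m/z bins whose variation is dominated
# by a single latent factor along the direction `base`. Illustrative only.
n_samples, n_bins = 20, 6
base = [random.random() for _ in range(n_bins)]
data = []
for _ in range(n_samples):
    t = random.gauss(0.0, 3.0)                        # latent factor score
    data.append([t * b + random.gauss(0.0, 0.1) for b in base])

# Mean-center each column, as PCA requires.
means = [sum(row[j] for row in data) / n_samples for j in range(n_bins)]
X = [[row[j] - means[j] for j in range(n_bins)] for row in data]

def cov_times(v):
    """Multiply (X^T X) by v without forming the covariance matrix."""
    Xv = [sum(X[i][j] * v[j] for j in range(n_bins)) for i in range(n_samples)]
    return [sum(X[i][j] * Xv[i] for i in range(n_samples)) for j in range(n_bins)]

pc1 = [1.0] * n_bins
for _ in range(200):                                  # power iteration
    w = cov_times(pc1)
    norm = sum(x * x for x in w) ** 0.5
    pc1 = [x / norm for x in w]
```

    Projecting each profile onto pc1 gives the scores that a PCA scatter plot of storage conditions would display.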

  5. Development of an analytical method for the targeted screening and multi-residue quantification of environmental contaminants in urine by liquid chromatography coupled to high resolution mass spectrometry for evaluation of human exposures.

    PubMed

    Cortéjade, A; Kiss, A; Cren, C; Vulliet, E; Buleté, A

    2016-01-01

    The aim of this study was to develop an analytical method and contribute to the assessment of the Exposome. Thus, a targeted urinary analysis of a wide range of contaminants that humans contact in daily routines was developed. The method focused on a list of 38 contaminants, including 12 pesticides, one pesticide metabolite, seven veterinary drugs, five parabens, one UV filter, one plastic additive, two surfactants and nine substances found in different products present in the everyday human environment. These contaminants were analyzed from a raw urinary matrix by high performance liquid chromatography coupled to high resolution mass spectrometry (HPLC-HRMS) with a quadrupole-time-of-flight (QqToF) instrument. A validation according to the FDA guidelines was employed to evaluate the specificity, linear or quadratic curve fitting, inter- and intra-day precision, accuracy and limits of detection and quantification (LOQ). The developed analysis allows for the quantification of 23 contaminants in the urine samples, with LOQs ranging between 4.3 ng.mL(-1) and 113.2 ng.mL(-1). This method was applied to 17 urine samples. Among the targeted contaminants, four compounds were detected in samples. One of the contaminants (tributyl phosphate) was detected below the LOQ. The three others (4-hydroxybenzoic acid, sodium dodecylbenzenesulfonate and O,O-diethyl thiophosphate potassium) were detected but did not fulfill the validation criteria for quantification. Among these four compounds, two were found in all samples: tributyl phosphate and the surfactant sodium dodecylbenzenesulfonate. Copyright © 2015 Elsevier B.V. All rights reserved.
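
    One common way to derive an LOQ (assumed here purely for illustration; the paper's exact criterion is not stated in the abstract) is 10 times the blank standard deviation divided by the calibration slope:

```python
import statistics

# Estimate a limit of quantification as LOQ = 10 * sd(blank) / slope,
# a common convention assumed here; the calibration data are hypothetical.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of y against x."""
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    num = sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
    den = sum((x - xm) ** 2 for x in xs)
    return num / den

conc = [0.0, 10.0, 25.0, 50.0, 100.0]            # ng/mL, hypothetical standards
area = [0.1, 20.4, 50.2, 100.5, 200.1]           # detector response
blanks = [0.10, 0.14, 0.08, 0.12, 0.11, 0.09]    # replicate blank responses

slope_cal = ols_slope(conc, area)
loq = 10.0 * statistics.stdev(blanks) / slope_cal
```
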

  6. Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.

    PubMed

    Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok

    2015-01-01

    Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases and discussed strategies pertinent to our settings to minimize their occurrence. We described the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests reduced significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Analytical errors are embedded within our total process setup, especially in the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.
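
    The phase-wise error rates reported above are simple proportions of the total test volume. A sketch with hypothetical counts (these are not the study's figures):

```python
# Partition laboratory errors by testing phase and express each as a
# percentage of the total test volume. The counts below are hypothetical.
total_tests = 589_510
errors = {"pre-analytical": 22_000, "analytical": 600, "post-analytical": 5_000}

rates = {phase: n / total_tests * 100.0 for phase, n in errors.items()}
overall_rate = sum(errors.values()) / total_tests * 100.0
```
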

  7. Planar optical waveguide based sandwich assay sensors and processes for the detection of biological targets including early detection of cancers

    DOEpatents

    Martinez, Jennifer S [Santa Fe, NM; Swanson, Basil I [Los Alamos, NM; Shively, John E [Arcadia, CA; Li, Lin [Monrovia, CA

    2009-06-02

    An assay element is described including recognition ligands adapted for binding to carcinoembryonic antigen (CEA) bound to a film on a single mode planar optical waveguide, the film selected from the group of a membrane, a polymerized bilayer membrane, and a self-assembled monolayer containing polyethylene glycol or polypropylene glycol groups therein. An assay process for detecting the presence of CEA is also described, including injecting a possible CEA-containing sample into a sensor cell including the assay element; maintaining the sample within the sensor cell for a time sufficient for binding to occur between CEA present within the sample and the recognition ligands; injecting a solution including a reporter ligand into the sensor cell; and interrogating the sample within the sensor cell with excitation light from the waveguide, the excitation light provided by an evanescent field of the single mode penetrating into the biological target-containing sample to a distance of less than about 200 nanometers from the waveguide, thereby exciting any bound reporter ligand within that distance and resulting in a detectable signal.

  8. NHEXAS PHASE I REGION 5 STUDY--METALS IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 1,906 dust samples. Dust samples were collected to assess potential residential sources of dermal and inhalation exposures and to examine relationships between analyte levels in dust and in personal and bioma...

  9. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    PubMed

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  10. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples.

    PubMed

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart; Larsen, Pia Bükmann

    2017-12-01

    Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability in incurred samples. We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, the initial concentrations of the analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used to determine the stability of each analyte. Additionally, evaporation from the decapped blood collection tubes and the residual platelet count in the plasma after centrifugation were quantified. We report a post-analytical stability of most routine analytes of ≥8 h and therefore - with few exceptions - suggest a standard 8-hour time limit for reordering and reanalysis of analytes in incurred samples. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
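
    The stability assessment above normalizes each re-measurement to the t=0 result and checks it against an acceptance limit. A sketch with hypothetical data and a hypothetical 5% bias limit (the paper applies analyte-specific internal criteria):

```python
# Normalize re-measured concentrations to the t=0 result and find the last
# time point whose bias stays within an acceptance limit. All numbers are
# hypothetical; the paper's analyte-specific criteria are not reproduced.
initial = 5.0                                 # t = 0 concentration
hours = [2, 4, 6, 8, 10]
remeasured = [5.02, 4.97, 4.93, 4.88, 4.55]
bias_limit_pct = 5.0                          # hypothetical acceptance limit

bias_pct = [(c - initial) / initial * 100.0 for c in remeasured]
stable_hours = 0
for t, b in zip(hours, bias_pct):
    if abs(b) > bias_limit_pct:
        break
    stable_hours = t
```

    With these numbers the analyte would be declared stable for 8 h, matching the style of the 8-hour limit proposed above.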

  11. Use of an activated beta-catenin to identify Wnt pathway target genes in caenorhabditis elegans, including a subset of collagen genes expressed in late larval development.

    PubMed

    Jackson, Belinda M; Abete-Luzi, Patricia; Krause, Michael W; Eisenmann, David M

    2014-04-16

    The Wnt signaling pathway plays a fundamental role during metazoan development, where it regulates diverse processes, including cell fate specification, cell migration, and stem cell renewal. Activation of the beta-catenin-dependent/canonical Wnt pathway up-regulates expression of Wnt target genes to mediate a cellular response. In the nematode Caenorhabditis elegans, a canonical Wnt signaling pathway regulates several processes during larval development; however, few target genes of this pathway have been identified. To address this deficit, we used a novel approach of conditionally activated Wnt signaling during a defined stage of larval life by overexpressing an activated beta-catenin protein, then used microarray analysis to identify genes showing altered expression compared with control animals. We identified 166 differentially expressed genes, of which 104 were up-regulated. A subset of the up-regulated genes was shown to have altered expression in mutants with decreased or increased Wnt signaling; we consider these genes to be bona fide C. elegans Wnt pathway targets. Among these was a group of six genes, including the cuticular collagen genes bli-1, col-38, col-49, and col-71. These genes show a peak of expression in the mid L4 stage during normal development, suggesting a role in adult cuticle formation. Consistent with this finding, reduction of function for several of the genes causes phenotypes suggestive of defects in cuticle function or integrity. Therefore, this work has identified a large number of putative Wnt pathway target genes during larval life, including a small subset of Wnt-regulated collagen genes that may function in synthesis of the adult cuticle.

  12. Aquatic concentrations of chemical analytes compared to ecotoxicity estimates

    USGS Publications Warehouse

    Kostich, Mitchell S.; Flick, Robert W.; Angela L. Batt,; Mash, Heath E.; Boone, J. Scott; Furlong, Edward T.; Kolpin, Dana W.; Glassmeyer, Susan T.

    2017-01-01

    We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting that more detailed characterization of these analytes may be warranted.
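
    The screening comparison above reduces to checking each measured concentration against its EC and against EC/10. A sketch with hypothetical concentrations and EC values:

```python
# Screen measured water concentrations against effect-concentration (EC)
# estimates, flagging exceedances of the EC and of EC/10 (the screening
# margin used in the abstract). All numbers below are hypothetical.
measured = {"copper": 12.0, "atrazine": 0.8, "ibuprofen": 0.05}   # ug/L
ec = {"copper": 9.0, "atrazine": 5.0, "ibuprofen": 10.0}          # ug/L

above_ec = sorted(a for a in measured if measured[a] > ec[a])
above_tenth_ec = sorted(a for a in measured
                        if ec[a] / 10.0 < measured[a] <= ec[a])
```
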

  13. Aquatic concentrations of chemical analytes compared to ecotoxicity estimates.

    PubMed

    Kostich, Mitchell S; Flick, Robert W; Batt, Angela L; Mash, Heath E; Boone, J Scott; Furlong, Edward T; Kolpin, Dana W; Glassmeyer, Susan T

    2017-02-01

    We describe screening level estimates of potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting that more detailed characterization of these analytes may be warranted. Published by Elsevier B.V.

  14. Target-in-the-loop phasing of a fiber laser array fed by a linewidth-broadened master oscillator

    NASA Astrophysics Data System (ADS)

    Hyde, Milo W.; Tyler, Glenn A.; Rosado Garcia, Carlos

    2017-05-01

    In a recent paper [J. Opt. Soc. Am. A 33, 1931-1937 (2016)], the target-in-the-loop (TIL) phasing of an RF-modulated or multi-phase-dithered fiber laser array, fed by a linewidth-broadened master oscillator (MO) source, was investigated. It was found that TIL phasing was possible even on a target with scattering features separated by more than the MO's coherence length as long as the received, backscattered irradiance changed with the array's modulation or phase dither. To simplify the problem and gain insight into how temporal coherence affects TIL phasing, speckle and atmospheric turbulence were omitted from the analysis. Here, the scenario analyzed in the prior work is generalized by including speckle and turbulence. First, the key analytical result from the prior paper is reviewed. Simulations, including speckle and turbulence, are then performed to test whether the conclusions derived from that result hold under more realistic conditions.

  15. Trace level detection of analytes using artificial olfactometry

    NASA Technical Reports Server (NTRS)

    Wong, Bernard (Inventor); Munoz, Beth C. (Inventor); Lewis, Nathan S. (Inventor); Kelso, David M. (Inventor); Severin, Erik J. (Inventor)

    2001-01-01

    The present invention provides methods for detecting the presence of an analyte indicative of various medical conditions, including halitosis, periodontal disease, and other diseases.

  16. Clustering in analytical chemistry.

    PubMed

    Drab, Klaudia; Daszykowski, Michal

    2014-01-01

    Data clustering plays an important role in the exploratory analysis of analytical data, and the use of clustering methods has been acknowledged in different fields of science. In this paper, principles of data clustering are presented with a direct focus on clustering of analytical data. The role of the clustering process in the analytical workflow is underlined, and its potential impact is emphasized.
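    As a toy illustration of exploratory clustering of analytical data, here is a self-contained k-means sketch; the deterministic initialization and the two-feature "measurements" are hypothetical, not from the paper.

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Minimal k-means for exploratory clustering of analytical data.

    X: (n_samples, n_features) matrix of measurements (e.g., elemental
    concentrations per sample). Returns (labels, centroids).
    """
    # Deterministic init: first k points as centroids (fine for a sketch).
    centroids = X[:k].astype(float)
    for _ in range(n_iter):
        # Assign each sample to the nearest centroid (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids as cluster means.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Two well-separated groups of hypothetical two-feature samples.
X = np.array([[1.0, 1.1], [0.9, 1.0], [1.1, 0.9],
              [8.0, 8.2], [7.9, 8.1], [8.1, 7.9]])
labels, cents = kmeans(X, k=2)
```

    In practice one would standardize the features first, since analytical variables often span very different scales.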

  17. Analytical Ferrography Standardization.

    DTIC Science & Technology

    1982-01-01

    AD-A116 508. Mechanical Technology Incorporated, Research and Development Division, Latham, NY. Analytical Ferrography Standardization, Final Report, January 1982. MTI Technical Report No. 82TRS6. P. B. Senholzi and A. S. Maciejewski, Applications Engineering.

  18. Cotinine analytical workshop report: consideration of analytical methods for determining cotinine in human body fluids as a measure of passive exposure to tobacco smoke.

    PubMed Central

    Watts, R R; Langone, J J; Knight, G J; Lewtas, J

    1990-01-01

    A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812

  19. Analytical strategies for organic food packaging contaminants.

    PubMed

    Sanchis, Yovana; Yusà, Vicent; Coscollà, Clara

    2017-03-24

    In this review, we present current approaches in the analysis of food-packaging contaminants. Gas and liquid chromatography coupled to mass spectrometry detection have been widely used in the analysis of some relevant families of these compounds such as primary aromatic amines, bisphenol A, bisphenol A diglycidyl ether and related compounds, UV-ink photoinitiators, perfluorinated compounds, phthalates and non-intentionally added substances. Main applications for sample treatment and different types of food-contact material migration studies have also been discussed. Pressurized Liquid Extraction, Solid-Phase Microextraction, Focused Ultrasound Solid-Liquid Extraction and QuEChERS have been mainly used in the extraction of food contact material (FCM) contaminants, due to the trend of minimising solvent consumption, automation of sample preparation and integration of extraction and clean-up steps. Recent advances in analytical methodologies have allowed unequivocal identification and confirmation of these contaminants using Liquid Chromatography coupled to High Resolution Mass Spectrometry (LC-HRMS) by exploiting mass accuracy and isotopic patterns. LC-HRMS has been used in the target analysis of primary aromatic amines in different plastic materials, but few studies have been carried out applying this technique in post-target and non-target analysis of FCM contaminants. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Intimacy Is a Transdiagnostic Problem for Cognitive Behavior Therapy: Functional Analytical Psychotherapy Is a Solution

    ERIC Educational Resources Information Center

    Wetterneck, Chad T.; Hart, John M.

    2012-01-01

    Problems with intimacy and interpersonal issues are exhibited across most psychiatric disorders. However, most of the targets in Cognitive Behavioral Therapy are primarily intrapersonal in nature, with few directly involved in interpersonal functioning and effective intimacy. Functional Analytic Psychotherapy (FAP) provides a behavioral basis for…

  1. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. 
Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.

  2. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix
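    The evaluation quantities the abstract notes were absent from the IBMWA output are easy to compute from a 2x2 confusion matrix; a sketch with hypothetical counts:

```python
def binary_metrics(tp, fp, fn, tn):
    """Standard classifier metrics from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    odds_ratio = (tp * tn) / (fp * fn)    # diagnostic odds ratio
    return sensitivity, specificity, odds_ratio

# Hypothetical confusion matrix: 80 TP, 10 FP, 20 FN, 90 TN.
sens, spec, or_ = binary_metrics(tp=80, fp=10, fn=20, tn=90)
```

    Confidence intervals for these quantities (e.g., via the log-odds standard error) are equally mechanical, which underlines the abstract's point that their omission is a reporting choice rather than a technical obstacle.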

  3. A Semi-Analytical Orbit Propagator Program for Highly Elliptical Orbits

    NASA Astrophysics Data System (ADS)

    Lara, M.; San-Juan, J. F.; Hautesserres, D.

    2016-05-01

    A semi-analytical orbit propagator to study the long-term evolution of spacecraft in Highly Elliptical Orbits is presented. The perturbation model includes the gravitational effects of the first nine zonal harmonics of Earth's potential and the main tesseral harmonics affecting the 2:1 resonance (which impacts Molniya-type orbits); the mass-point approximation for third-body perturbations, retaining only the second-order Legendre polynomial for the sun and the second- through sixth-order polynomials for the moon; solar radiation pressure; and atmospheric drag. The Hamiltonian formalism is used to model the forces of gravitational origin, and to avoid time-dependence issues the problem is formulated in the extended phase space. Solar radiation pressure is modeled as a potential and included in the Hamiltonian, whereas atmospheric drag is added as a generalized force. The semi-analytical theory is developed using perturbation techniques based on Lie transforms. Deprit's perturbation algorithm is applied up to second order in the second zonal harmonic, J2, including Kozai-type terms in the mean-elements Hamiltonian to obtain "centered" elements. The transformation is developed in closed form of the eccentricity, except for tesseral resonances, and the coupling between J2 and the moon's disturbing effects is neglected. This paper describes the semi-analytical theory, the semi-analytical orbit propagator program and some of the numerical validations.
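    For context, the classical first-order J2 secular rates, the lowest-order piece of the zonal model that the propagator carries to J9 and beyond, can be sketched as follows. The Molniya-like orbit parameters are illustrative; at the critical inclination (about 63.43 degrees) the apsidal drift nearly vanishes, which is why Molniya orbits sit there.

```python
import math

MU = 398600.4418      # km^3/s^2, Earth gravitational parameter
RE = 6378.137         # km, Earth equatorial radius
J2 = 1.08262668e-3    # Earth's second zonal harmonic

def j2_secular_rates(a, e, i):
    """First-order J2 secular rates (rad/s) of node and perigee.

    A classical textbook result; the paper's theory adds higher zonals,
    tesserals, drag, SRP and lunisolar terms on top of this.
    """
    n = math.sqrt(MU / a**3)              # mean motion
    p = a * (1.0 - e**2)                  # semilatus rectum
    k = 1.5 * n * J2 * (RE / p)**2
    raan_dot = -k * math.cos(i)                        # nodal regression
    argp_dot = 0.5 * k * (5.0 * math.cos(i)**2 - 1.0)  # apsidal rotation
    return raan_dot, argp_dot

# Molniya-like orbit: a ~ 26554 km, e ~ 0.72, critical inclination.
raan_dot, argp_dot = j2_secular_rates(26554.0, 0.72, math.radians(63.43))
```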

  4. Targeted proteomic assays for quantitation of proteins identified by proteogenomic analysis of ovarian cancer

    DOE PAGES

    Song, Ehwang; Gao, Yuqian; Wu, Chaochao; ...

    2017-07-19

    Here, mass spectrometry (MS) based targeted proteomic methods such as selected reaction monitoring (SRM) are becoming the method of choice for preclinical verification of candidate protein biomarkers. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute has investigated the standardization and analytical validation of the SRM assays and demonstrated robust analytical performance on different instruments across different laboratories. An Assay Portal has also been established by CPTAC to provide the research community a resource consisting of a large set of targeted MS-based assays, and a depository to share assays publicly, provided that assays meet the guidelines proposed by CPTAC. Herein, we report 98 SRM assays covering 70 candidate protein biomarkers previously reported as associated with ovarian cancer that have been thoroughly characterized according to the CPTAC Assay Characterization Guidance Document. The experiments, methods and results for characterizing these SRM assays for their MS response, repeatability, selectivity, stability, and reproducible detection of endogenous analytes are described in detail.

  5. Evaluation of Analytical Errors in a Clinical Chemistry Laboratory: A 3 Year Experience

    PubMed Central

    Sakyi, AS; Laing, EF; Ephraim, RK; Asibey, OF; Sadique, OK

    2015-01-01

    Background: Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. Aim: We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases, and we discuss strategies pertinent to our settings to minimize their occurrence. Materials and Methods: We described the occurrence of pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). Results: A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests reduced significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Conclusion: Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures including quality assessment programs for staff involved in pre-analytical processes should be intensified. PMID:25745569
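    The per-phase percentages reported above are simple ratios of error counts to a common denominator; a sketch of the arithmetic, using the counts and denominator as printed in the abstract purely for illustration:

```python
def error_rates(counts, total):
    """Per-phase error percentages for a laboratory QC review.

    counts: dict of phase -> number of errors; total: denominator
    (number of test requests). Figures below are the abstract's printed
    values, used here only to illustrate the calculation.
    """
    return {phase: 100.0 * n / total for phase, n in counts.items()}

rates = error_rates({"pre-analytical": 2210,
                     "analytical": 108,
                     "post-analytical": 512}, total=58950)
```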

  6. Kalman filter data assimilation: Targeting observations and parameter estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bellsky, Thomas, E-mail: bellskyt@asu.edu; Kostelich, Eric J.; Mahalov, Alex

    2014-06-15

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.
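    The intuition behind targeting, that observing the highest-variance state component reduces estimation error more than a fixed observation elsewhere, can be illustrated with a single Kalman covariance update on a toy 3-state system (all numbers illustrative, not from the paper):

```python
import numpy as np

def kf_update(P, H, R):
    """Posterior covariance of a linear Kalman filter for observation H.

    Smaller trace = better state estimate. Illustrates why observing
    the highest-variance component ('targeting') beats a fixed or
    random observation location.
    """
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    return (np.eye(P.shape[0]) - K @ H) @ P   # posterior covariance

P = np.diag([4.0, 0.5, 0.25])   # prior variances; component 0 is worst
R = np.array([[0.1]])           # observation noise variance

# Targeted: observe component 0 (largest variance) vs. a fixed choice.
P_targeted = kf_update(P, np.array([[1.0, 0.0, 0.0]]), R)
P_fixed    = kf_update(P, np.array([[0.0, 0.0, 1.0]]), R)
```

    The LETKF version in the paper applies the same idea with ensemble variances standing in for the exact prior covariance.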

  7. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among the CRP systems, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) polymerization are widely used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, molecularly imprinted polymer (MIP) micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures, including homopolymers, block copolymers, molecularly imprinted copolymers, and grafted copolymers, were synthesized by CRP methods for molecular separation, retention, or sensing. We expect that the CRP methods will become the most popular technique for preparing functional polymers that can be broadly applied in analytical chemistry.

  8. Analytical Chemistry Laboratory. Progress report for FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  9. Analytical and numerical treatment of drift-tearing modes in plasma slab

    NASA Astrophysics Data System (ADS)

    Mirnov, V. V.; Hegna, C. C.; Sovinec, C. R.; Howell, E. C.

    2016-10-01

    Two-fluid corrections to linear tearing modes include 1) diamagnetic drifts that reduce the growth rate and 2) electron and ion decoupling on short scales that can lead to fast reconnection. We have recently developed an analytical model that includes effects 1) and 2) and an important contribution from finite electron parallel thermal conduction. Both tendencies 1) and 2) are confirmed by an approximate analytic dispersion relation that is derived using a perturbative approach of small ion-sound gyroradius ρs. This approach is only valid at the beginning of the transition from the collisional to semi-collisional regimes. Further analytical and numerical work is performed to cover the full interval of ρs connecting these two limiting cases. Growth rates are computed from analytic theory with a shooting method. They match the resistive MHD regime with the dispersion relations known at asymptotically large ion-sound gyroradius. A comparison between this analytical treatment and linear numerical simulations using the NIMROD code with cold ions and hot electrons in a plasma slab is reported. The material is based on work supported by the U.S. DOE and NSF.

  10. Workplace mistreatment climate and potential employee and organizational outcomes: a meta-analytic review from the target's perspective.

    PubMed

    Yang, Liu-Qin; Caughlin, David E; Gazica, Michele W; Truxillo, Donald M; Spector, Paul E

    2014-07-01

    This meta-analytic study summarizes relations between workplace mistreatment climate (MC; specific to incivility, aggression, and bullying) and potential outcomes. We define MC as individual or shared perceptions of organizational policies, procedures, and practices that deter interpersonal mistreatment. We located 35 studies reporting results with individual perceptions of MC (psychological MC) that yielded 36 independent samples comprising 91,950 employees. Through our meta-analyses, we found significant mean correlations between psychological MC and employee and organizational outcomes including mistreatment reduction effort (motivation and performance), mistreatment exposure, strains, and job attitudes. Moderator analyses revealed that the psychological MC-outcome relations were generally stronger for perceived civility climate than for perceived aggression-inhibition climate, and content contamination of existing climate scales accentuated the magnitude of the relations between psychological MC and some outcomes (mistreatment exposure and employee strains). Further, the magnitudes of the psychological MC-outcome relations were generally comparable across studies using dominant (i.e., most commonly used) and other climate scales, but for some focal relations, magnitudes varied with respect to cross-sectional versus prospective designs. The 4 studies that assessed MC at the unit level had results largely consistent with those at the employee level.
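    Mean correlations in a meta-analysis of this kind are commonly pooled on the Fisher z scale; a bare-bones fixed-effect sketch with hypothetical study values (not the paper's data or its exact psychometric corrections):

```python
import math

def pooled_r(studies):
    """Fixed-effect pooled correlation via the Fisher z transform.

    studies: list of (r, n) pairs; each study is weighted by n - 3,
    the inverse variance of z. A bare-bones version of the pooling
    behind meta-analytic mean correlations.
    """
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)          # Fisher r-to-z
        w = n - 3.0                # inverse-variance weight
        num += w * z
        den += w
    return math.tanh(num / den)    # back-transform to r

# Hypothetical studies: (correlation, sample size).
r_bar = pooled_r([(0.30, 100), (0.20, 400), (0.25, 250)])
```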

  11. Sensitivity of fish density estimates to standard analytical procedures applied to Great Lakes hydroacoustic data

    USGS Publications Warehouse

    Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.

    2013-01-01

    Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculations of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.
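    Of the three GLSOP recommendations evaluated, noise subtraction is the easiest to sketch: Sv is a decibel quantity, so the noise estimate must be subtracted from the signal on the linear (power) scale, not as dB values. The numbers below are illustrative, not from the study.

```python
import math

def subtract_noise_db(sv_db, noise_db):
    """Noise subtraction for volume backscattering strength (Sv).

    Converts dB values to linear power, subtracts, and converts back;
    cells at or below the noise floor are flagged with -inf.
    """
    lin = 10 ** (sv_db / 10.0) - 10 ** (noise_db / 10.0)
    if lin <= 0.0:
        return float("-inf")      # cell is at or below the noise floor
    return 10.0 * math.log10(lin)

clean = subtract_noise_db(sv_db=-60.0, noise_db=-70.0)
```

    With a 10 dB signal-to-noise margin the correction is small (about 0.46 dB here), which matches the abstract's observation that noise subtraction mainly matters in deep, low-density layers.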

  12. The Effect of Contingent Reinforcement on Target Variables in Outpatient Psychotherapy for Depression: A Successful and Unsuccessful Case Using Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Kanter, Jonathan W.; Landes, Sara J.; Busch, Andrew M.; Rusch, Laura C.; Brown, Keri R.; Baruch, David E.; Holman, Gareth I.

    2006-01-01

    The current study investigated a behavior-analytic treatment, functional analytic psychotherapy (FAP), for outpatient depression utilizing two single-subject A/A+B designs. The baseline condition was cognitive behavioral therapy. Results demonstrated treatment success in 1 client after the addition of FAP and treatment failure in the 2nd. This…

  13. Means of introducing an analyte into liquid sampling atmospheric pressure glow discharge

    DOEpatents

    Marcus, R. Kenneth; Quarles, Jr., Charles Derrick; Russo, Richard E.; Koppenaal, David W.; Barinaga, Charles J.; Carado, Anthony J.

    2017-01-03

    A liquid sampling, atmospheric pressure, glow discharge (LS-APGD) device as well as systems that incorporate the device and methods for using the device and systems are described. The LS-APGD includes a hollow capillary for delivering an electrolyte solution to a glow discharge space. The device also includes a counter electrode in the form of a second hollow capillary that can deliver the analyte into the glow discharge space. A voltage across the electrolyte solution and the counter electrode creates the microplasma within the glow discharge space that interacts with the analyte to move it to a higher energy state (vaporization, excitation, and/or ionization of the analyte).

  14. Extrapolating target tracks

    NASA Astrophysics Data System (ADS)

    Van Zandt, James R.

    2012-05-01

    Steady-state performance of a tracking filter is traditionally evaluated immediately after a track update. However, there is commonly a further delay (e.g., processing and communications latency) before the tracks can actually be used. We analyze the accuracy of extrapolated target tracks for four tracking filters: the Kalman filter with the Singer maneuver model and worst-case correlation time, with piecewise constant white acceleration, and with continuous white acceleration, and the reduced state filter proposed by Mookerjee and Reifler [1, 2]. Performance evaluation of a tracking filter is significantly simplified by appropriate normalization. For the Kalman filter with the Singer maneuver model, the steady-state RMS error immediately after an update depends on only two dimensionless parameters [3]. By assuming a worst-case value of target acceleration correlation time, we reduce this to a single parameter without significantly changing the filter performance (within a few percent for air tracking) [4]. With this simplification, we find for all four filters that the RMS errors for the extrapolated state are functions of only two dimensionless parameters. We provide simple analytic approximations in each case.
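    The growth of track error with latency can be sketched for the continuous-white-acceleration case (one of the four filters considered): extrapolate the covariance by P(tau) = F P F^T + Q for a one-dimensional position/velocity state. The numbers are illustrative; the paper's normalized analysis is far more general.

```python
import numpy as np

def extrapolated_covariance(P, tau, q):
    """Position/velocity covariance extrapolated by latency tau.

    Nearly-constant-velocity model with continuous white acceleration
    of spectral density q. P is the 2x2 covariance at the last update.
    """
    F = np.array([[1.0, tau], [0.0, 1.0]])        # state transition
    Q = q * np.array([[tau**3 / 3.0, tau**2 / 2.0],
                      [tau**2 / 2.0, tau]])       # process noise
    return F @ P @ F.T + Q

P0 = np.diag([100.0, 25.0])     # m^2, (m/s)^2 at update time
rms_now  = np.sqrt(extrapolated_covariance(P0, 0.0, q=1.0)[0, 0])
rms_late = np.sqrt(extrapolated_covariance(P0, 2.0, q=1.0)[0, 0])
```

    Even this toy case shows the two contributions the paper quantifies: velocity uncertainty mapped through the latency, plus process noise accumulated during it.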

  15. Towards Secure and Trustworthy Cyberspace: Social Media Analytics on Hacker Communities

    ERIC Educational Resources Information Center

    Li, Weifeng

    2017-01-01

    Social media analytics is a critical research area spawned by the increasing availability of rich and abundant online user-generated content. So far, social media analytics has had a profound impact on organizational decision making in many aspects, including product and service design, market segmentation, customer relationship management, and…

  16. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, is composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Geochemical data are acquired chiefly through analytical techniques. Owing to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Analytical performance of 17 general chemistry analytes across countries and across manufacturers in the INPUtS project of EQA organizers in Italy, the Netherlands, Portugal, United Kingdom and Spain.

    PubMed

    Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula

    2017-02-01

    Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs) was organized by EQA institutes in Italy, the Netherlands, Portugal, UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of results of enzymes toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met and manufacturers should improve their performance for these analytes. Standardization of results of enzymes requires ongoing efforts.
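    Performance specifications derived from biological variation conventionally follow the classical Fraser formulas; a sketch of the "minimum" level used above (the glucose CVs are illustrative literature-style values, not the study's):

```python
import math

def bv_specs(cv_i, cv_g, level="minimum"):
    """Analytical performance specifications from biological variation.

    Classical Fraser formulas: desirable imprecision < 0.50*CVi and
    desirable bias < 0.250*sqrt(CVi^2 + CVg^2); the 'minimum' level
    relaxes both by a factor of 1.5, 'optimal' tightens them by 0.5.
    Total allowable error combines them at the 95% level.
    """
    f = {"optimal": 0.5, "desirable": 1.0, "minimum": 1.5}[level]
    cv_a = f * 0.50 * cv_i                                # imprecision limit, %
    bias = f * 0.250 * math.sqrt(cv_i**2 + cv_g**2)       # bias limit, %
    tea  = 1.65 * cv_a + bias                             # total allowable error
    return cv_a, bias, tea

# Illustrative BV values for glucose: CVi ~ 5.6%, CVg ~ 7.5%.
cv_a, bias, tea = bv_specs(5.6, 7.5, level="minimum")
```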

  18. psRNATarget: a plant small RNA target analysis server (2017 release).

    PubMed

    Dai, Xinbin; Zhuang, Zhaohong; Zhao, Patrick Xuechun

    2018-04-30

    Plant regulatory small RNAs (sRNAs), which include most microRNAs (miRNAs) and a subset of small interfering RNAs (siRNAs), such as the phased siRNAs (phasiRNAs), play important roles in regulating gene expression. Although generated from genetically distinct biogenesis pathways, these regulatory sRNAs share the same mechanisms for post-transcriptional gene silencing and translational inhibition. psRNATarget was developed to identify plant sRNA targets by (i) analyzing complementary matching between the sRNA sequence and target mRNA sequence using a predefined scoring schema and (ii) evaluating target site accessibility. This update enhances its analytical performance by developing a new scoring schema that is capable of discovering miRNA-mRNA interactions at higher 'recall rates' without significantly increasing total prediction output. The scoring procedure is customizable for the users to search both canonical and non-canonical targets. This update also enables transmitting and analyzing 'big' data empowered by (a) the implementation of multi-threading chunked file uploading, which can be paused and resumed, using HTML5 APIs and (b) the allocation of significantly more computing nodes to its back-end Linux cluster. The updated psRNATarget server has clear, compelling and user-friendly interfaces that enhance user experiences and present data clearly and concisely. psRNATarget is freely available at http://plantgrn.noble.org/psRNATarget/.
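    A drastically simplified version of the complementarity-scoring idea, mismatch and G:U wobble penalties, with penalties doubled inside a seed region, can be sketched as follows. This illustrates the concept only; it is not psRNATarget's exact schema, and the seed bounds and penalty values are assumptions for the sketch.

```python
PAIR = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}
WOBBLE = {("G", "U"), ("U", "G")}

def score(srna, site_3to5, seed=(2, 13)):
    """Toy sRNA-target complementarity score (lower = better match).

    srna: small RNA written 5'->3'; site_3to5: target site written
    3'->5' so position i pairs with position i. A mismatch costs 1.0,
    a G:U wobble 0.5, and penalties are doubled inside the seed region
    (1-based positions seed[0]..seed[1]).
    """
    total = 0.0
    for i, (a, b) in enumerate(zip(srna, site_3to5), start=1):
        if (a, b) in PAIR:
            p = 0.0
        elif (a, b) in WOBBLE:
            p = 0.5
        else:
            p = 1.0
        if seed[0] <= i <= seed[1]:
            p *= 2.0          # seed mismatches are weighted more heavily
        total += p
    return total

# A perfectly complementary (hypothetical) sRNA/site pair scores 0.
perfect = score("UGACGUAGGCUA", "ACUGCAUCCGAU")
```

    The second ingredient the abstract mentions, target-site accessibility, would be layered on top of a score like this, typically via an RNA secondary-structure energy model.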

  19. Electrochemical Detection of Multiple Bioprocess Analytes

    NASA Technical Reports Server (NTRS)

    Rauh, R. David

    2010-01-01

    An apparatus that includes a highly miniaturized thin-film electrochemical sensor array has been demonstrated as a prototype of instruments for simultaneous detection of multiple substances of interest (analytes) and measurement of acidity or alkalinity in bioprocess streams. Measurements of pH and of concentrations of nutrients and wastes in cell-culture media, made by use of these instruments, are to be used as feedback for optimizing the growth of cells or the production of desired substances by the cultured cells. The apparatus is designed to utilize samples of minimal volume so as to minimize any perturbation of monitored processes. The apparatus can function in a potentiometric mode (for measuring pH), an amperometric mode (detecting analytes via oxidation/reduction reactions), or both. The sensor array is planar and includes multiple thin-film microelectrodes covered with hydrous iridium oxide. The oxide layer on each electrode serves as both a protective and an electrochemical transducing layer. In its transducing role, the oxide provides electrical conductivity for amperometric measurement or pH response for potentiometric measurement. The oxide on an electrode can also serve as a matrix for one or more enzymes that render the electrode sensitive to a specific analyte. In addition to transducing electrodes, the array includes electrodes for potential control. The array can be fabricated by techniques familiar to the microelectronics industry. The sensor array is housed in a thin-film liquid-flow cell that has a total volume of about 100 μL. The flow cell is connected to a computer-controlled subsystem that periodically draws samples from the bioprocess stream to be monitored. Before entering the cell, each 100-μL sample is subjected to tangential-flow filtration to remove particles. In the present version of the apparatus, the electrodes are operated under the control of a potentiostat and are used to simultaneously measure the pH and the concentration of glucose.
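    The potentiometric pH mode described above reduces to a linear electrode calibration. The sketch below converts a measured iridium-oxide electrode potential into pH; the standard potential and slope are illustrative placeholders for values that would come from a two-point buffer calibration (hydrous iridium oxide films typically respond at roughly -59 to -80 mV per pH unit).

```python
def potential_to_ph(e_mv, e0_mv=600.0, slope_mv_per_ph=-70.0):
    """Invert the linear calibration E = E0 + slope * pH for a hydrous
    iridium-oxide electrode. E0 and the slope here are illustrative,
    not values from the apparatus described above; real values come
    from buffer calibration, with slopes near or above the Nernstian
    -59 mV/pH at room temperature."""
    return (e_mv - e0_mv) / slope_mv_per_ph

# Under this calibration, 110 mV vs. the reference corresponds to pH 7.
ph = potential_to_ph(110.0)
```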

  20. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  1. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  2. Multi-Target Camera Tracking, Hand-off and Display LDRD 158819 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert J.

    2014-10-01

    Modern security control rooms gather video and sensor feeds from tens to hundreds of cameras. Advanced camera analytics can detect motion from individual video streams and convert unexpected motion into alarms, but the interpretation of these alarms depends heavily upon human operators. Unfortunately, these operators can be overwhelmed when a large number of events happen simultaneously, or lulled into complacency due to frequent false alarms. This LDRD project has focused on improving video surveillance-based security systems by changing the fundamental focus from the cameras to the targets being tracked. If properly integrated, more cameras shouldn't lead to more alarms, more monitors, more operators, and increased response latency but instead should lead to better information and more rapid response times. Over the course of the LDRD we have been developing algorithms that take live video imagery from multiple video cameras, identify individual moving targets from the background imagery, and then display the results in a single 3D interactive video. In this document we summarize the work in developing this multi-camera, multi-target system, including lessons learned, tools developed, technologies explored, and a description of current capability.

  3. Multi-target camera tracking, hand-off and display LDRD 158819 final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert J.

    2014-10-01

    Modern security control rooms gather video and sensor feeds from tens to hundreds of cameras. Advanced camera analytics can detect motion from individual video streams and convert unexpected motion into alarms, but the interpretation of these alarms depends heavily upon human operators. Unfortunately, these operators can be overwhelmed when a large number of events happen simultaneously, or lulled into complacency due to frequent false alarms. This LDRD project has focused on improving video surveillance-based security systems by changing the fundamental focus from the cameras to the targets being tracked. If properly integrated, more cameras shouldn't lead to more alarms, more monitors, more operators, and increased response latency but instead should lead to better information and more rapid response times. Over the course of the LDRD we have been developing algorithms that take live video imagery from multiple video cameras, identify individual moving targets from the background imagery, and then display the results in a single 3D interactive video. In this document we summarize the work in developing this multi-camera, multi-target system, including lessons learned, tools developed, technologies explored, and a description of current capability.
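    The target-extraction step described in these two reports (separating moving targets from background imagery) can be caricatured as simple background subtraction. The actual LDRD algorithms are not detailed in the abstract, so this is only a toy sketch with an assumed per-pixel threshold.

```python
def moving_target_mask(frame, background, thresh=25):
    """Flag pixels whose absolute difference from a background model
    exceeds a threshold; the flagged blobs are the moving-target
    candidates that would then be tracked and handed off between
    cameras. Frames are 2-D lists of grayscale pixel values."""
    return [[abs(f - b) > thresh for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

# A 1x3 'frame': only the middle pixel changed relative to background.
mask = moving_target_mask([[10, 200, 30]], [[12, 90, 35]])
```

    A production system would of course use a statistical background model and morphological cleanup rather than a fixed threshold, but the pipeline shape is the same.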

  4. Affinity monolith chromatography: A review of principles and recent analytical applications

    PubMed Central

    Pfaunmiller, Erika L.; Paulemond, Marie Laura; Dupper, Courtney M.; Hage, David S.

    2012-01-01

    Affinity monolith chromatography (AMC) is a type of liquid chromatography that uses a monolithic support and a biologically-related binding agent as a stationary phase. AMC is a powerful method for the selective separation, analysis or studies of specific target compounds in a sample. This review discusses the basic principles of AMC and recent developments or applications of this method, with particular emphasis being given to work that has appeared in the last five years. Various materials that have been used to prepare columns for AMC are examined, including organic monoliths, silica monoliths, agarose monoliths and cryogels. These supports have been used in AMC for formats that have ranged from traditional columns to disks, microcolumns and capillaries. Many binding agents have also been employed in AMC, such as antibodies, enzymes, proteins, lectins, immobilized metal-ions and dyes. Some applications that have been reported with these binding agents in AMC are bioaffinity chromatography, immunoaffinity chromatography or immunoextraction, immobilized metal-ion affinity chromatography, dye-ligand affinity chromatography, chiral separations and biointeraction studies. Examples are presented from fields that include analytical chemistry, pharmaceutical analysis, clinical testing and biotechnology. Current trends and possible future directions in AMC are also discussed. PMID:23187827

  5. Design of ligand-targeted nanoparticles for enhanced cancer targeting

    NASA Astrophysics Data System (ADS)

    Stefanick, Jared F.

    Ligand-targeted nanoparticles are increasingly used as drug delivery vehicles for cancer therapy, yet have not consistently produced successful clinical outcomes. Although these inconsistencies may arise from differences in disease models and target receptors, nanoparticle design parameters can significantly influence therapeutic efficacy. By employing a multifaceted synthetic strategy to prepare peptide-targeted nanoparticles with high purity, reproducibility, and precisely controlled stoichiometry of functionalities, this work evaluates the roles of polyethylene glycol (PEG) coating, ethylene glycol (EG) peptide-linker length, peptide hydrophilicity, peptide density, and nanoparticle size on tumor targeting in a systematic manner. These parameters were analyzed in multiple disease models by targeting human epidermal growth factor receptor 2 (HER2) in breast cancer and very late antigen-4 (VLA-4) in multiple myeloma to demonstrate the widespread applicability of this approach. By increasing the hydrophilicity of the targeting peptide sequence and simultaneously optimizing the EG peptide-linker length, the in vitro cellular uptake of targeted liposomes was significantly enhanced. Specifically, including a short oligolysine chain adjacent to the targeting peptide sequence effectively increased cellular uptake ~80-fold using an EG6 peptide-linker compared to ~10-fold using an EG45 linker. In vivo, targeted liposomes prepared in a traditional manner lacking the oligolysine chain demonstrated similar biodistribution and tumor uptake to non-targeted liposomes. However, by including the oligolysine chain, targeted liposomes using an EG45 linker significantly improved tumor uptake ~8-fold over non-targeted liposomes, while the use of an EG6 linker decreased tumor accumulation and uptake, owing to differences in cellular uptake kinetics, clearance mechanisms, and binding site barrier effects. To further improve tumor targeting and enhance the selectivity of targeted

  6. Target screening and confirmation of 35 licit and illicit drugs and metabolites in hair by LC-MSMS.

    PubMed

    Lendoiro, Elena; Quintela, Oscar; de Castro, Ana; Cruz, Angelines; López-Rivadulla, Manuel; Concheiro, Marta

    2012-04-10

    A liquid chromatography-tandem mass spectrometry (LC-MSMS) target screening in 50 mg of hair was developed and fully validated for 35 analytes (Δ9-tetrahydrocannabinol (THC), morphine, 6-acetylmorphine, codeine, methadone, fentanyl, amphetamine, methamphetamine, 3,4-methylenedioxyamphetamine, 3,4-methylenedioxymethamphetamine, benzoylecgonine, cocaine, lysergic acid diethylamide, ketamine, scopolamine, alprazolam, bromazepam, clonazepam, diazepam, flunitrazepam, 7-aminoflunitrazepam, lorazepam, lormetazepam, nordiazepam, oxazepam, tetrazepam, triazolam, zolpidem, zopiclone, amitriptyline, citalopram, clomipramine, fluoxetine, paroxetine and venlafaxine). Hair decontamination was performed with dichloromethane, followed by incubation in 2 mL of acetonitrile at 50°C overnight. The extraction procedure comprised two steps: liquid-liquid extraction with hexane:ethyl acetate (55:45, v:v) at pH 9, followed by solid-phase extraction (Strata-X cartridges). Chromatographic separation was performed on an Atlantis T3 (2.1 mm × 100 mm, 3 μm) column, with acetonitrile and ammonium formate (pH 3) as mobile phase and a 32-min total run time. One transition per analyte was monitored in MRM mode. To confirm a positive result, a second injection monitoring 2 transitions was performed. The method was specific (no endogenous interferences, n=9); LOD was 0.2-50 pg/mg and LOQ 0.5-100 pg/mg; linearity ranged from 0.5-100 to 2000-20,000 pg/mg; imprecision was <15%; analytical recovery 85-115%; extraction efficiency 4.1-85.6%; and process efficiency 2.5-207.7%; 27 analytes showed ion suppression (up to -86.2%), 4 ion enhancement (up to 647.1%), and 4 no matrix effect; compounds showed good stability for 24-48 h in the autosampler. The method was applied to 17 forensic cases. In conclusion, a sensitive and specific target screening of 35 analytes in 50 mg of hair, including drugs of abuse (THC, cocaine, opiates, amphetamines) and medicines (benzodiazepines, antidepressants), was developed and validated, achieving lower cut
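    The matrix effect, extraction efficiency and process efficiency figures quoted above are conventionally computed from a post-extraction-addition experiment (the Matuszewski design). A minimal sketch, with illustrative peak areas:

```python
def lcms_validation_metrics(a_neat, b_postspike, c_prespike):
    """Matrix effect (ME), extraction efficiency/recovery (EE) and
    process efficiency (PE) from the post-extraction-addition design:
    A = analyte in neat solvent, B = blank extract spiked after
    extraction, C = blank matrix spiked before extraction.
    Inputs are peak areas; outputs are percentages. ME < 100% means
    ion suppression, ME > 100% ion enhancement."""
    me = 100.0 * b_postspike / a_neat
    ee = 100.0 * c_prespike / b_postspike
    pe = 100.0 * c_prespike / a_neat
    return me, ee, pe

# Illustrative peak areas, not values from the study:
me, ee, pe = lcms_validation_metrics(1000.0, 500.0, 400.0)
```

    Note that PE = ME x EE / 100, which is why a strong ion enhancement can push process efficiency well above 100%, as seen in the reported 207.7% maximum.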

  7. GTARG - THE TOPEX/POSEIDON GROUND TRACK MAINTENANCE MANEUVER TARGETING PROGRAM

    NASA Technical Reports Server (NTRS)

    Shapiro, B. E.

    1994-01-01

    GTARG, the TOPEX/POSEIDON Ground Track Maintenance Maneuver Targeting Program, was developed to assist in designing orbit-maintenance maneuvers for the TOPEX/POSEIDON satellite. These maneuvers ensure that the ground track is kept within 1 km of an approximately 9.9-day exact repeat pattern. Targeting strategies used by GTARG either maximize the time between maneuvers (longitude targeting) or force control-band exit to occur at specified intervals (time targeting). A runout mode allows for ground track propagation without targeting. The analytic mean-element propagation algorithm used in GTARG includes all perturbations that are known to cause significant variations in the satellite ground track. These include earth oblateness, luni-solar gravity, and drag, as well as the thrust due to impulsive maneuvers and unspecified along-track satellite-fixed forces. Merson's extension of Grove's theory is used for the computation of the geopotential field. Kaula's disturbing function is used to obtain the luni-solar gravitational perturbations. GTARG includes a satellite-unique drag model which incorporates an approximate mean orbital Jacchia-Roberts atmosphere and a variable mean area model. Error models include uncertainties due to orbit determination, maneuver execution, and drag unpredictability, as well as in the knowledge of along-track satellite-fixed forces. Maneuver delta-v magnitudes are targeted to precisely maintain either the unbiased ground track itself or a comfortable (3-sigma) error envelope about the unbiased ground track. GTARG is written in VAX FORTRAN for DEC VAX Series computers running VMS. GTARG output is provided in two forms: an executive report summary in tabular form, and a plot file formatted as EZPLOT input namelists. Although the EZPLOT program and documentation are included with GTARG, EZPLOT requires PGPLOT, which was written by the California Institute of Technology Astronomy Department. (For non
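    Longitude targeting exploits the fact that, under steady drag decay, the ground-track longitude offset drifts roughly parabolically in time, so the basic maneuver-design question is when the parabola leaves the control band. The sketch below solves that geometry only; GTARG's actual mean-element propagation, perturbation and error models are far more elaborate, and the numbers in the example are illustrative.

```python
import math

def time_to_band_exit(lam0, lam_rate, lam_accel, band):
    """First time t > 0 at which the longitude offset
    lam(t) = lam0 + lam_rate*t + 0.5*lam_accel*t**2
    leaves the control band |lam| <= band. Returns None if it never does."""
    roots = []
    for edge in (band, -band):
        a, b, c = 0.5 * lam_accel, lam_rate, lam0 - edge
        if abs(a) < 1e-15:                    # no drag-driven curvature
            if abs(b) > 1e-15:
                roots.append(-c / b)
            continue
        disc = b * b - 4.0 * a * c
        if disc >= 0.0:
            sq = math.sqrt(disc)
            roots.extend(((-b - sq) / (2.0 * a), (-b + sq) / (2.0 * a)))
    positive = [t for t in roots if t > 1e-12]
    return min(positive) if positive else None

# Start centered with no initial drift rate; a drag-driven acceleration of
# -2e-4 km/day^2 reaches the 1 km band edge after 100 days (illustrative).
t_exit = time_to_band_exit(0.0, 0.0, -2.0e-4, 1.0)
```

    Longitude targeting then chooses the post-maneuver drift rate so the parabola just grazes the opposite band edge, maximizing the time to the next maneuver.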

  8. Analytical beam-width characteristics of distorted cat-eye reflected beam

    NASA Astrophysics Data System (ADS)

    Zhao, Yanzhong; Shan, Congmiao; Zheng, Yonghui; Zhang, Laixian; Sun, Huayan

    2015-02-01

    The analytical expression for the beam width of the distorted cat-eye reflected beam under far-field conditions is derived using the approximate three-dimensional analytical formula for an oblique detection laser beam passing through a cat-eye optical lens with center shelter, together with the definition of the second-order moment, the Gamma function and integral functions. The laws governing the variation of the divergence angle and degree of astigmatism of the reflected light with incident angle, focal shift, aperture size, and center-shelter ratio are established by numerical calculation and physical analysis. The study reveals that the cat-eye reflected beam behaves like a beam transmitted and collimated by the target optical lens and has the same characteristics as a Gaussian beam. A proper choice of positive focal shift results in a divergence angle smaller than that with no focal shift. The astigmatism is mainly caused by the incidence angle.
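    The second-order-moment beam width used in the derivation has a direct numerical counterpart: the width is twice the intensity-weighted standard deviation of the transverse coordinate. A quick sketch, checked against an ideal Gaussian profile on a uniform grid:

```python
import math

def second_moment_width(x, intensity):
    """Beam radius from the second-order moment definition:
    w = 2*sigma, where sigma^2 is the intensity-weighted variance of
    the transverse coordinate. On a uniform grid the sums approximate
    the defining integrals; for an ideal Gaussian I ~ exp(-2 x^2/w0^2)
    this recovers the usual 1/e^2 radius w0."""
    total = sum(intensity)
    mean = sum(w * xi for w, xi in zip(intensity, x)) / total
    var = sum(w * (xi - mean) ** 2 for w, xi in zip(intensity, x)) / total
    return 2.0 * math.sqrt(var)

# Sanity check against a Gaussian with w0 = 1:
xs = [-6.0 + 12.0 * i / 4000 for i in range(4001)]
width = second_moment_width(xs, [math.exp(-2.0 * xi ** 2) for xi in xs])
```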

  9. [The concept of the development of the state of chemical-analytical environmental monitoring].

    PubMed

    Rakhmanin, Iu A; Malysheva, A G

    2013-01-01

    Chemical-analytical monitoring of environmental quality is based on accounting for trace amounts of substances. Given the multicomponent composition of the environment and the ongoing transformation of substances within it, determining the danger posed to population health by chemical pollution requires an evaluation that simultaneously accounts for the complex of substances actually present in the environment and entering it from different sources. Analytical monitoring of environmental quality and safety must therefore shift from an orientation toward the investigation of specific target substances to an assessment of the real complex of compounds.

  10. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  11. RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.

    PubMed

    Brown, Lawrence J

    2015-10-01

    This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for projected painful aspects in either the analyst or patient or both-affects that are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, object relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.

  12. Analytic Methods Used in Quality Control in a Compounding Pharmacy.

    PubMed

    Allen, Loyd V

    2017-01-01

    Analytical testing will no doubt become a more important part of pharmaceutical compounding as the public and regulatory agencies demand increasing documentation of the quality of compounded preparations. Compounding pharmacists must decide what types of testing and what amount of testing to include in their quality-control programs, and whether testing should be done in-house or outsourced. Like pharmaceutical compounding, analytical testing should be performed only by those who are appropriately trained and qualified. This article discusses the analytical methods that are used in quality control in a compounding pharmacy. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  13. Targeted Proteomic Quantification on Quadrupole-Orbitrap Mass Spectrometer*

    PubMed Central

    Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno

    2012-01-01

    There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundreds, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it is constituted of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment. The increased quality of quadrupole-orbitrap data has the potential to improve existing protein
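    On the data side, the HR/AM targeted modes described above (single ion monitoring and parallel reaction monitoring) ultimately reduce to extracting signal within a narrow ppm-level m/z window around each targeted species. A toy sketch of that extraction; the scan data structure is hypothetical, since real data would come from an mzML or vendor raw file:

```python
def within_ppm(mz_obs, mz_target, tol_ppm=5.0):
    """True if an observed centroid lies within +/- tol_ppm of the target m/z."""
    return abs(mz_obs - mz_target) / mz_target * 1e6 <= tol_ppm

def extract_xic(scans, mz_target, tol_ppm=5.0):
    """Extracted ion chromatogram: per scan, sum the intensities of all
    centroids falling inside the ppm window around the target m/z.
    Each scan is a (mz_list, intensity_list) pair."""
    return [sum(i for m, i in zip(mzs, intens)
                if within_ppm(m, mz_target, tol_ppm))
            for mzs, intens in scans]

# One scan with three centroids: the ones 2 ppm and 4.8 ppm off-target
# fall inside the default 5 ppm window; the 20 ppm one is rejected.
scans = [([499.999, 500.0024, 500.01], [5.0, 10.0, 100.0])]
xic = extract_xic(scans, 500.0)
```

    The narrowness of this window is what buys the extra selectivity over unit-resolution SRM: co-eluting interferences a few tens of ppm away simply never enter the trace.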

  14. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI and interested in visual analytics will find this information useful as well as a discussion on changes that need to be made to current HCI practices to make them more

  15. Designing monitoring programs for chemicals of emerging concern in potable reuse--what to include and what not to include?

    PubMed

    Drewes, J E; Anderson, P; Denslow, N; Olivieri, A; Schlenk, D; Snyder, S A; Maruya, K A

    2013-01-01

    This study discussed a proposed process to prioritize chemicals for reclaimed water monitoring programs, the selection of analytical methods required for their quantification, the toxicological relevance of chemicals of emerging concern regarding human health, and related issues. Given that thousands of chemicals are potentially present in reclaimed water and that information about those chemicals is rapidly evolving, a transparent, science-based framework was developed to guide prioritization of which compounds of emerging concern (CECs) should be included in reclaimed water monitoring programs. The recommended framework includes four steps: (1) compile environmental concentrations (e.g., the measured environmental concentration, or MEC) of CECs in the source water for reuse projects; (2) develop a monitoring trigger level (MTL) for each of these compounds (or groups thereof) based on toxicological relevance; (3) compare the environmental concentration (e.g., MEC) to the MTL; CECs with a MEC/MTL ratio greater than 1 should be prioritized for monitoring, while compounds with a ratio less than 1 should only be considered if they represent viable treatment process performance indicators; and (4) screen the priority list to ensure that a commercially available robust analytical method is available for each compound.
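    Steps 1-3 of the recommended framework amount to a simple ratio screen. A minimal sketch with purely illustrative compound names and concentrations (not values from the study):

```python
def prioritize_cecs(cecs):
    """Flag CECs whose measured environmental concentration (MEC)
    exceeds the monitoring trigger level (MTL), i.e. MEC/MTL > 1.
    'cecs' maps compound name -> (mec, mtl) in consistent units;
    returns the sorted list of compounds prioritized for monitoring."""
    return sorted(name for name, (mec, mtl) in cecs.items() if mec / mtl > 1.0)

# Hypothetical example entries:
priority = prioritize_cecs({
    "compound_A": (0.8, 0.1),    # ratio 8    -> prioritize for monitoring
    "compound_B": (0.05, 1.0),   # ratio 0.05 -> candidate indicator only
})
```

    Step 4 (confirming that a robust commercial analytical method exists) would then filter this priority list before it enters the monitoring program.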

  16. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing by reducing the development effort. Manufacturing simulation models are presented both as data analytics applications in their own right and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. A virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  17. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former case (EUBrazilCC) the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among others, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private

  18. Basic Information for EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    Contains basic information on the role and origins of the Selected Analytical Methods including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events

  19. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  1. Niosh analytical methods for Set G

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-12-01

    Industrial Hygiene sampling and analytical monitoring methods validated under the joint NIOSH/OSHA Standards Completion Program for Set G are contained herein. Monitoring methods for the following compounds are included: butadiene, heptane, ketene, methyl cyclohexane, octachloronaphthalene, pentachloronaphthalene, petroleum distillates, propylene dichloride, turpentine, dioxane, hexane, LPG, naphtha(coal tar), octane, pentane, propane, and stoddard solvent.

  2. Analysis of low molecular weight metabolites in tea using mass spectrometry-based analytical methods.

    PubMed

    Fraser, Karl; Harrison, Scott J; Lane, Geoff A; Otter, Don E; Hemar, Yacine; Quek, Siew-Young; Rasmussen, Susanne

    2014-01-01

    Tea is the second most consumed beverage in the world after water, and there are numerous reported health benefits as a result of consuming tea, such as reducing the risk of cardiovascular disease and many types of cancer. Thus, there is much interest in the chemical composition of teas, for example: defining components responsible for contributing to reported health benefits; defining quality characteristics such as product flavor; and monitoring for pesticide residues to comply with food safety import/export requirements. Covered in this review are some of the latest developments in mass spectrometry-based analytical techniques for measuring and characterizing low molecular weight components of tea, in particular primary and secondary metabolites. The methodology, more specifically the chromatography and detection mechanisms used in both targeted and non-targeted studies, is discussed along with its main advantages and disadvantages. Finally, we comment on the latest techniques that are likely to have significant benefit to analysts in the future, not merely in the area of tea research, but in the analytical chemistry of low molecular weight compounds in general.

  3. Amino acids in a targeted versus a non-targeted metabolomics LC-MS/MS assay. Are the results consistent?

    PubMed

    Klepacki, Jacek; Klawitter, Jost; Klawitter, Jelena; Karimpour-Fard, Anis; Thurman, Joshua; Ingle, Gordon; Patel, Dharmesh; Christians, Uwe

    2016-09-01

    The results of plasma amino acid patterns in samples from kidney transplant patients with good and impaired renal function using a targeted LC-MS/MS amino acid assay and a non-targeted metabolomics assay were compared. EDTA plasma samples were prospectively collected at baseline and 1, 2, 4, and 6 months post-transplant (n=116 patients, n=398 samples). Each sample was analyzed using both a commercial amino acid LC-MS/MS assay and a non-targeted metabolomics assay also based on MS/MS ion transitions. The results of both assays were independently statistically analyzed to identify amino acids associated with estimated glomerular filtration rates using correlation and partial least squares-discriminant analysis. Although there was overlap between the results of the targeted and non-targeted metabolomics assays (tryptophan, 1-methyl histidine), there were also substantial inconsistencies, with the non-targeted assay resulting in more "hits" than the targeted assay. Without further verification of the hits detected by the non-targeted discovery assay, this would have led to different interpretations of the results. There were also false negative results when the non-targeted assay was used (hydroxyproline). Several of these discrepancies could be explained by loss of sensitivity during analytical runs for selected amino acids (serine and threonine), retention time shifts, signals above the range of linear detector response, and integration of peaks not separated from background and interferences (aspartate) when the non-targeted metabolomics assay was used. Whenever assessment of a specific pathway such as amino acids is the focus of interest, a targeted assay seems preferable to a non-targeted metabolomics assay. Copyright © 2016. Published by Elsevier Inc.

  4. Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Johnson, Dennis C.

    1980-01-01

    Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)

  5. The analysis of non-linear dynamic behavior (including snap-through) of postbuckled plates by simple analytical solution

    NASA Technical Reports Server (NTRS)

    Ng, C. F.

    1988-01-01

    Static postbuckling and nonlinear dynamic analysis of plates are usually accomplished by multimode analyses, although the methods are complicated and do not give straightforward understanding of the nonlinear behavior. Assuming single-mode transverse displacement, a simple formula is derived for the transverse load displacement relationship of a plate under in-plane compression. The formula is used to derive a simple analytical expression for the static postbuckling displacement and nonlinear dynamic responses of postbuckled plates under sinusoidal or random excitation. Regions with softening and hardening spring behavior are identified. Also, the highly nonlinear motion of snap-through and its effects on the overall dynamic response can be easily interpreted using the single-mode formula. Theoretical results are compared with experimental results obtained using a buckled aluminum panel, using discrete frequency and broadband point excitation. Some important effects of the snap-through motion on the dynamic response of the postbuckled plates are found.
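
    The single-mode reduction described above leads to a Duffing-type oscillator with a negative linear stiffness, whose two wells are the static postbuckled positions. The following sketch (coefficients are hypothetical illustration values, not taken from the paper) shows how large excitation produces snap-through between the wells while small excitation does not:

```python
import math

def simulate(force_amp, a=1.0, b=1.0, damping=0.3, omega=1.2,
             x0=1.0, dt=0.01, t_end=200.0):
    """RK4 integration of  x'' + damping*x' - a*x + b*x**3 = F*cos(omega*t).

    The stiffness term -a*x + b*x**3 gives two stable equilibria at
    x = +/- sqrt(a/b), the static postbuckled positions; "snap-through"
    means jumping from one well to the other. Returns the minimum x seen.
    """
    def accel(t, x, v):
        return force_amp * math.cos(omega * t) - damping * v + a * x - b * x**3

    x, v, t = x0, 0.0, 0.0
    x_min = x
    while t < t_end:
        k1x, k1v = v, accel(t, x, v)
        k2x, k2v = v + dt/2*k1v, accel(t + dt/2, x + dt/2*k1x, v + dt/2*k1v)
        k3x, k3v = v + dt/2*k2v, accel(t + dt/2, x + dt/2*k2x, v + dt/2*k2v)
        k4x, k4v = v + dt*k3v, accel(t + dt, x + dt*k3x, v + dt*k3v)
        x += dt/6 * (k1x + 2*k2x + 2*k3x + k4x)
        v += dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
        t += dt
        x_min = min(x_min, x)
    return x_min

# Small excitation stays near the x = +1 well; large excitation snaps through.
print(simulate(0.10))   # positive: no snap-through
print(simulate(0.50))   # negative: snap-through occurred
```

    Near a well the response hardens or softens with amplitude, consistent with the regions identified in the abstract; the large-forcing case also exhibits the irregular cross-well motion that complicates the overall dynamic response.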

  6. Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures

    EPA Pesticide Factsheets

    The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations, and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.

  7. Spectral multivariate calibration without laboratory prepared or determined reference analyte values.

    PubMed

    Ottaway, Josh; Farrell, Jeremy A; Kalivas, John H

    2013-02-05

    An essential part to calibration is establishing the analyte calibration reference samples. These samples must characterize the sample matrix and measurement conditions (chemical, physical, instrumental, and environmental) of any sample to be predicted. Calibration usually requires measuring spectra for numerous reference samples in addition to determining the corresponding analyte reference values. Both tasks are typically time-consuming and costly. This paper reports on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory prepared or determined reference values. Instead, an analyte pure component spectrum is used in conjunction with nonanalyte spectra for calibration. Nonanalyte spectra can be from different sources including pure component interference samples, blanks, and constant analyte samples. The approach is also applicable to calibration maintenance when the analyte pure component spectrum is measured in one set of conditions and nonanalyte spectra are measured in new conditions. The PCTR method balances the trade-offs between calibration model shrinkage and the degree of orthogonality to the nonanalyte content (model direction) in order to obtain accurate predictions. Using visible and near-infrared (NIR) spectral data sets, the PCTR results are comparable to those obtained using ridge regression (RR) with reference calibration sets. The flexibility of PCTR also allows including reference samples if such samples are available.
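
    The trade-off PCTR balances can be illustrated with a minimal Tikhonov-style sketch (this is not the authors' implementation; the spectra, weights, and regularization values below are invented). A regression vector is sought with unit response to a pure-component spectrum while being shrunk and pushed toward orthogonality to nonanalyte spectra:

```python
import numpy as np

rng = np.random.default_rng(0)
n_wl = 50                     # number of wavelengths (hypothetical)

# Invented data: a pure-component analyte spectrum (peak at channel 25)
# and six non-analyte spectra (interferent peak at channel 10 plus noise).
s = np.exp(-0.5 * ((np.arange(n_wl) - 25) / 4.0) ** 2)
N = rng.normal(0, 0.1, (6, n_wl)) \
    + np.exp(-0.5 * ((np.arange(n_wl) - 10) / 3.0) ** 2)

# Tikhonov-style objective (sketch): find a regression vector b that
#   (a) responds to the analyte:            s @ b ~ 1
#   (b) is near-orthogonal to nonanalytes:  N @ b ~ 0
#   (c) stays small (model shrinkage):      lam * ||b|| small
lam, mu = 0.1, 1e3
A = np.vstack([np.sqrt(mu) * s,       # soft constraint enforcing s @ b = 1
               N,                     # rows pushing b away from nonanalytes
               lam * np.eye(n_wl)])   # Tikhonov shrinkage
y = np.concatenate([[np.sqrt(mu)], np.zeros(len(N) + n_wl)])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

print(round(float(s @ b), 3))                 # close to 1: unit analyte response
print(round(float(np.abs(N @ b).max()), 3))   # small: interferents suppressed
```

    Raising `lam` shrinks the model further at the cost of analyte response, which is the shrinkage-versus-orthogonality balance the abstract describes.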

  8. Selected field and analytical methods and analytical results in the Dutch Flats area, western Nebraska, 1995-99

    USGS Publications Warehouse

    Verstraeten, Ingrid M.; Steele, G.V.; Cannia, J.C.; Bohlke, J.K.; Kraemer, T.E.; Hitch, D.E.; Wilson, K.E.; Carnes, A.E.

    2001-01-01

    A study of the water resources of the Dutch Flats area in the western part of the North Platte Natural Resources District, western Nebraska, was conducted from 1995 through 1999 to describe the surface water and hydrogeology, the spatial distribution of selected water-quality constituents in surface and ground water, and the surface-water/ground-water interaction in selected areas. This report describes the selected field and analytical methods used in the study and selected analytical results from the study not previously published. Specifically, dissolved gases, age-dating data, and other isotopes collected as part of an intensive sampling effort in August and November 1998 and all uranium and uranium isotope data collected through the course of this study are included in the report.

  9. Updates to Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    View information on the latest updates to methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), including the newest recommended methods and publications.

  10. Analytical Chemists: A New Breed of Entrepreneurs.

    ERIC Educational Resources Information Center

    Analytical Chemistry, 1985

    1985-01-01

    Examines the involvement of college faculty in small business activities, indicating that university administrators have become decidedly more supportive of such ventures in recent years. A list of 14 start-up companies (showing type of services) founded recently by university analytical chemists is included. (JN)

  11. Analyte quantification with comprehensive two-dimensional gas chromatography: assessment of methods for baseline correction, peak delineation, and matrix effect elimination for real samples.

    PubMed

    Samanipour, Saer; Dimitriou-Christidis, Petros; Gros, Jonas; Grange, Aureline; Samuel Arey, J

    2015-01-02

    Comprehensive two-dimensional gas chromatography (GC×GC) is used widely to separate and measure organic chemicals in complex mixtures. However, approaches to quantify analytes in real, complex samples have not been critically assessed. We quantified 7 PAHs in a certified diesel fuel using GC×GC coupled to flame ionization detector (FID), and we quantified 11 target chlorinated hydrocarbons in a lake water extract using GC×GC with electron capture detector (μECD), further confirmed qualitatively by GC×GC with electron capture negative chemical ionization time-of-flight mass spectrometer (ENCI-TOFMS). Target analyte peak volumes were determined using several existing baseline correction algorithms and peak delineation algorithms. Analyte quantifications were conducted using external standards and also using standard additions, enabling us to diagnose matrix effects. We then applied several chemometric tests to these data. We find that the choice of baseline correction algorithm and peak delineation algorithm strongly influence the reproducibility of analyte signal, error of the calibration offset, proportionality of integrated signal response, and accuracy of quantifications. Additionally, the choice of baseline correction and the peak delineation algorithm are essential for correctly discriminating analyte signal from unresolved complex mixture signal, and this is the chief consideration for controlling matrix effects during quantification. The diagnostic approaches presented here provide guidance for analyte quantification using GC×GC. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
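
    The importance of baseline correction for separating analyte signal from background can be illustrated with a deliberately simple rolling-minimum corrector on a synthetic 1-D trace (invented data; real GC×GC algorithms operate on the full 2-D chromatographic plane and are considerably more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 500)
# Synthetic chromatogram: two analyte peaks on a drifting baseline + noise.
signal = (1.5 * np.exp(-0.5 * ((t - 3) / 0.1) ** 2)
          + 1.0 * np.exp(-0.5 * ((t - 7) / 0.1) ** 2)
          + 0.3 * t + rng.normal(0, 0.01, t.size))

def rolling_min_baseline(y, window):
    """Local minimum in a sliding window as a crude baseline estimate."""
    pad = window // 2
    ypad = np.pad(y, pad, mode="edge")
    return np.array([ypad[i:i + window].min() for i in range(y.size)])

baseline = rolling_min_baseline(signal, window=51)
corrected = signal - baseline

print(round(float(corrected.max()), 2))        # peak height above baseline
print(round(float(corrected[:50].mean()), 2))  # residual in a peak-free region
```

    Any bias left in `corrected` away from the peaks propagates directly into integrated peak volumes, which is why the choice of baseline algorithm influences calibration offset and quantification accuracy as reported above.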

  12. Trends in Process Analytical Technology: Present State in Bioprocessing.

    PubMed

    Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian

    2017-08-04

    Process analytical technology (PAT), the regulatory initiative for incorporating quality in pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, this can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract Hierarchy of QbD components.

  13. Robustness of network of networks under targeted attack.

    PubMed

    Dong, Gaogao; Gao, Jianxi; Du, Ruijin; Tian, Lixin; Stanley, H Eugene; Havlin, Shlomo

    2013-05-01

    The robustness of a network of networks (NON) under random attack has been studied recently [Gao et al., Phys. Rev. Lett. 107, 195701 (2011)]. Understanding how robust a NON is to targeted attacks is a major challenge when designing resilient infrastructures. We address here the question of how the robustness of a NON is affected by targeted attacks on high- or low-degree nodes. We introduce a targeted attack probability function that is dependent upon node degree and study the robustness of two types of NON under targeted attack: (i) a tree of n fully interdependent Erdős-Rényi or scale-free networks and (ii) a starlike network of n partially interdependent Erdős-Rényi networks. For any tree of n fully interdependent Erdős-Rényi networks and scale-free networks under targeted attack, we find that the network becomes significantly more vulnerable when nodes of higher degree have higher probability to fail. When the probability that a node will fail is proportional to its degree, for a NON composed of Erdős-Rényi networks we find analytical solutions for the mutual giant component P(∞) as a function of p, where 1-p is the initial fraction of failed nodes in each network. We also find analytical solutions for the critical fraction p(c), which causes the fragmentation of the n interdependent networks, and for the minimum average degree k[over ¯](min) below which the NON will collapse even if only a single node fails. For a starlike NON of n partially interdependent Erdős-Rényi networks under targeted attack, we find the critical coupling strength q(c) for different n. When q>q(c), the attacked system undergoes an abrupt first-order transition. When q≤q(c), the system displays a smooth second-order percolation transition. We also evaluate how the central network becomes more vulnerable as the number of networks with the same coupling strength q increases. The limit of q=0 represents no dependency, and the results are consistent with the classical
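
    The qualitative effect the abstract reports, that degree-targeted failures are far more damaging than random ones, can be reproduced with a toy single-network simulation (this is not the paper's interdependent-network analysis; sizes and parameters below are arbitrary, and the removal rule is the high-degree limit of a targeted attack probability function):

```python
import random
from collections import Counter

def er_edges(n, k_avg, rng):
    """Edge list of an Erdos-Renyi graph G(n, p) with mean degree k_avg."""
    p = k_avg / (n - 1)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

def giant_fraction(n, edges, removed):
    """Size of the largest surviving component, as a fraction of n."""
    parent = list(range(n))
    def find(x):                        # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        if u in removed or v in removed:
            continue
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    alive = [v for v in range(n) if v not in removed]
    if not alive:
        return 0.0
    sizes = Counter(find(v) for v in alive)
    return max(sizes.values()) / n

rng = random.Random(42)
n, k_avg, frac = 1500, 4.0, 0.3
edges = er_edges(n, k_avg, rng)

degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# Targeted attack: remove the highest-degree nodes first.
by_degree = sorted(range(n), key=lambda v: -degree[v])
targeted = set(by_degree[:int(frac * n)])
# Random attack: remove the same number of uniformly chosen nodes.
randomized = set(rng.sample(range(n), int(frac * n)))

print(giant_fraction(n, edges, targeted))    # smaller giant component
print(giant_fraction(n, edges, randomized))  # larger giant component
```

    In the paper's setting the damage is amplified further, because node failures in one network propagate to its interdependent partners.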

  14. VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR ...

    EPA Pesticide Factsheets

    There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a system with the ability to prepare and quickly analyze a large number of contaminated samples for the traditional chemical agents, as well as numerous toxic industrial chemicals. Liquid samples (both aqueous and organic), solid samples (e.g., soil), vapor samples (e.g., air) and mixed state samples, all ranging from household items to deceased animals, may require some level of analyses. To meet this challenge, the U.S. Environmental Protection Agency (U.S. EPA) National Homeland Security Research Center, in collaboration with experts from across U.S. EPA and other Federal Agencies, initiated an effort to identify analytical methods for the chemical and biological agents that could be used to respond to a terrorist attack or a homeland security incident. U.S. EPA began development of standard analytical protocols (SAPs) for laboratory identification and measurement of target agents in case of a contamination threat. These methods will be used to help assist in the identification of existing contamination, the effectiveness of decontamination, as well as clearance for the affected population to reoccupy previously contaminated areas. One of the first SAPs developed was for the determin

  15. Off-target effects of sulforaphane include the derepression of long terminal repeats through histone acetylation events.

    PubMed

    Baier, Scott R; Zbasnik, Richard; Schlegel, Vicki; Zempleni, Janos

    2014-06-01

    Sulforaphane is a naturally occurring isothiocyanate in cruciferous vegetables. Sulforaphane inhibits histone deacetylases, leading to the transcriptional activation of genes including tumor suppressor genes. The compound has attracted considerable attention in the chemoprevention of prostate cancer. Here we tested the hypothesis that sulforaphane is not specific for tumor suppressor genes but also activates loci such as long terminal repeats (LTRs), which might impair genome stability. Studies were conducted using chemically pure sulforaphane in primary human IMR-90 fibroblasts and in broccoli sprout feeding studies in healthy adults. Sulforaphane (2.0 μM) caused an increase in LTR transcriptional activity in cultured cells. Consumption of broccoli sprouts (34, 68 or 102 g) by human volunteers caused a dose dependent elevation in LTR mRNA in circulating leukocytes, peaking at more than a 10-fold increase. This increase in transcript levels was associated with an increase in histone H3 K9 acetylation marks in LTR 15 in peripheral blood mononuclear cells from subjects consuming sprouts. Collectively, this study suggests that sulforaphane has off-target effects that warrant further investigation when recommending high levels of sulforaphane intake, despite its promising activities in chemoprevention. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Investigation of the persistence of nerve agent degradation analytes on surfaces through wipe sampling and detection with ultrahigh performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Willison, Stuart A

    2015-01-20

    The persistence of chemical warfare nerve agent degradation analytes on surfaces is important, from indicating the presence of nerve agent on a surface to guiding environmental restoration of a site after a release. Persistence was investigated for several chemical warfare nerve agent degradation analytes on indoor surfaces and presents an approach for wipe sampling of surfaces, followed by wipe extraction and liquid chromatography-tandem mass spectrometry detection. Commercially available wipe materials were investigated to determine optimal wipe recoveries. Tested surfaces included porous/permeable (vinyl tile, painted drywall, and wood) and largely nonporous/impermeable (laminate, galvanized steel, and glass) surfaces. Wipe extracts were analyzed by ultrahigh performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). UPLC provides a separation of targeted degradation analytes in addition to being nearly four times faster than high-performance liquid chromatography, allowing for greater throughput after a large-scale contamination incident and subsequent remediation events. Percent recoveries from nonporous/impermeable surfaces were 60-103% for isopropyl methylphosphonate (IMPA), GB degradate; 61-91% for ethyl methylphosphonate (EMPA), VX degradate; and 60-98% for pinacolyl methylphosphonate (PMPA), GD degradate. Recovery efficiencies for methyl phosphonate (MPA), nerve agent degradate, and ethylhydrogen dimethylphosphonate (EHDMAP), GA degradate, were lower, perhaps due to matrix effects. Diisopropyl methylphosphonate, GB impurity, was not recovered from surfaces. The resulting detection limits for wipe extracts were 0.065 ng/cm(2) for IMPA, 0.079 ng/cm(2) for MPA, 0.040 ng/cm(2) for EMPA, 0.078 ng/cm(2) for EHDMAP, and 0.013 ng/cm(2) for PMPA. The data indicate that laboratories may hold wipe samples for up to 30 days prior to analysis. Target analytes were observed to persist on surfaces for at least 6 weeks.
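
    The quality metrics quoted above, percent recovery and surface detection limits, follow standard arithmetic, sketched below with entirely hypothetical numbers (a generic 3-sigma-style estimate, not the study's actual QA procedure):

```python
import statistics

spiked = 10.0                      # ng/cm^2 applied to a test surface (hypothetical)
measured = [6.3, 7.1, 6.8]         # replicate wipe-extract results (hypothetical)

# Percent recovery: fraction of the applied analyte returned by the
# wipe-extract-analyze sequence.
recoveries = [100.0 * m / spiked for m in measured]

# A common 3-sigma style detection limit: 3 x standard deviation of
# low-level replicates divided by the calibration slope.
low_level = [0.021, 0.025, 0.019, 0.023]   # hypothetical low-spike results, ng/cm^2
slope = 1.0                                # calibration slope (signal per ng/cm^2)
mdl = 3 * statistics.stdev(low_level) / slope

print(recoveries)
print(round(mdl, 4))
```

    Low recoveries, as seen for MPA and EHDMAP, inflate the effective detection limit on real surfaces even when the instrumental limit is unchanged.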

  17. Audiovisual synchrony enhances BOLD responses in a brain network including multisensory STS while also enhancing target-detection performance for both modalities

    PubMed Central

    Marchant, Jennifer L; Ruff, Christian C; Driver, Jon

    2012-01-01

    The brain seeks to combine related inputs from different senses (e.g., hearing and vision), via multisensory integration. Temporal information can indicate whether stimuli in different senses are related or not. A recent human fMRI study (Noesselt et al. [2007]: J Neurosci 27:11431–11441) used auditory and visual trains of beeps and flashes with erratic timing, manipulating whether auditory and visual trains were synchronous or unrelated in temporal pattern. A region of superior temporal sulcus (STS) showed higher BOLD signal for the synchronous condition. But this could not be related to performance, and it remained unclear if the erratic, unpredictable nature of the stimulus trains was important. Here we compared synchronous audiovisual trains to asynchronous trains, while using a behavioral task requiring detection of higher-intensity target events in either modality. We further varied whether the stimulus trains had predictable temporal pattern or not. Synchrony (versus lag) between auditory and visual trains enhanced behavioral sensitivity (d') to intensity targets in either modality, regardless of predictable versus unpredictable patterning. The analogous contrast in fMRI revealed BOLD increases in several brain areas, including the left STS region reported by Noesselt et al. [2007: J Neurosci 27:11431–11441]. The synchrony effect on BOLD here correlated with the subject-by-subject impact on performance. Predictability of temporal pattern did not affect target detection performance or STS activity, but did lead to an interaction with audiovisual synchrony for BOLD in inferior parietal cortex. PMID:21953980

  18. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact, all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed-form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions, a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples, using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior of i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.
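
    The classic constant-property baseline that the asymptotic analysis refines can be stated in a few lines: the figure of merit ZT and the resulting maximum conversion efficiency of an idealized junction (textbook formulas; the material values below are hypothetical):

```python
import math

# Hypothetical constant material properties for one couple.
S = 220e-6     # Seebeck coefficient, V/K
rho = 1.0e-5   # electrical resistivity, ohm*m
kappa = 1.5    # thermal conductivity, W/(m*K)
Th, Tc = 500.0, 300.0          # hot and cold shoe temperatures, K
Tm = 0.5 * (Th + Tc)           # mean temperature used in ZT

# Dimensionless figure of merit and classic maximum-efficiency formula.
ZT = S**2 / (rho * kappa) * Tm
eta_carnot = 1 - Tc / Th
eta = eta_carnot * (math.sqrt(1 + ZT) - 1) / (math.sqrt(1 + ZT) + Tc / Th)

print(round(ZT, 2))
print(round(eta, 3))
```

    The asymptotic treatment in the abstract relaxes exactly the assumptions baked into these formulas: constant S, rho, and kappa, fixed shoe temperatures, and no lateral heat loss.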

  19. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  20. An analytical study on groundwater flow in drainage basins with horizontal wells

    NASA Astrophysics Data System (ADS)

    Wang, Jun-Zhi; Jiang, Xiao-Wei; Wan, Li; Wang, Xu-Sheng; Li, Hailong

    2014-06-01

    Analytical studies on release/capture zones are often limited to a uniform background groundwater flow. In fact, for basin-scale problems, the undulating water table would lead to the development of hierarchically nested flow systems, which are more complex than a uniform flow. Under the premise that the water table is a replica of undulating topography and hardly influenced by wells, an analytical solution of hydraulic head is derived for a two-dimensional cross section of a drainage basin with horizontal injection/pumping wells. Based on the analytical solution, distributions of hydraulic head, stagnation points and flow systems (including release/capture zones) are explored. The superposition of injection/pumping wells onto the background flow field leads to the development of new internal stagnation points and new flow systems (including release/capture zones). Generally speaking, the existence of n injection/pumping wells would result in up to n new internal stagnation points and up to 2n new flow systems (including release/capture zones). The analytical study presented, which integrates traditional well hydraulics with the theory of regional groundwater flow, is useful in understanding basin-scale groundwater flow influenced by human activities.
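
    The superposition idea above, each well adding a new stagnation point to the background flow, can be sketched in its simplest textbook setting: a single pumping well in a uniform flow (this is not the paper's basin-scale solution with an undulating water table; all values are invented):

```python
import math

U = 1.0e-5    # uniform background Darcy flux, m/s, in the +x direction
Q = 2.0e-3    # well extraction rate per unit aquifer thickness, m^2/s

def conj_velocity(z):
    """dW/dz for the complex potential W(z) = U*z - (Q/(2*pi))*ln(z):
    uniform flow plus a pumping well (sink) at the origin."""
    return U - Q / (2 * math.pi * z)

# Setting dW/dz = 0 yields one new stagnation point per well, here on the
# downgradient side of the well on the x-axis.
x_stag = Q / (2 * math.pi * U)
print(round(x_stag, 2))                  # stagnation distance, ~31.83 m
print(abs(conj_velocity(x_stag)))        # ~0: velocity vanishes there
```

    The streamline through the stagnation point bounds the capture zone, whose far-upstream width is Q/U (200 m here); with n wells in a basin-scale flow, up to n such stagnation points and up to 2n new flow systems appear, as the abstract states.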

  1. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  2. Web Analytics: A Picture of the Academic Library Web Site User

    ERIC Educational Resources Information Center

    Black, Elizabeth L.

    2009-01-01

    This article describes the usefulness of Web analytics for understanding the users of an academic library Web site. Using a case study, the analysis describes how Web analytics can answer questions about Web site user behavior, including when visitors come, the duration of the visit, how they get there, the technology they use, and the most…

  3. IMPROVED METHOD FOR THE STORAGE OF GROUND WATER SAMPLES CONTAINING VOLATILE ORGANIC ANALYTES

    EPA Science Inventory

    The sorption of volatile organic analytes from water samples by the Teflon septum surface used with standard glass 40-ml sample collection vials was investigated. Analytes tested included alkanes, isoalkanes, olefins, cycloalkanes, a cycloalkene, monoaromatics, a polynuclear arom...

  4. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite and current analytical electrochemistry including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  5. The case for visual analytics of arsenic concentrations in foods.

    PubMed

    Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R

    2010-05-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.

  6. The Case for Visual Analytics of Arsenic Concentrations in Foods

    PubMed Central

    Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.

    2010-01-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005

  7. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is

  8. Aptamer- and nucleic acid enzyme-based systems for simultaneous detection of multiple analytes

    DOEpatents

    Lu, Yi [Champaign, IL]; Liu, Juewen [Albuquerque, NM]

    2011-11-15

    The present invention provides aptamer- and nucleic acid enzyme-based systems for simultaneously determining the presence and optionally the concentration of multiple analytes in a sample. Methods of utilizing the system and kits that include the sensor components are also provided. The system includes a first reactive polynucleotide that reacts to a first analyte; a second reactive polynucleotide that reacts to a second analyte; a third polynucleotide; a fourth polynucleotide; a first particle, coupled to the third polynucleotide; a second particle, coupled to the fourth polynucleotide; and at least one quencher, for quenching emissions of the first and second quantum dots, coupled to the first and second reactive polynucleotides. The first particle includes a quantum dot having a first emission wavelength. The second particle includes a second quantum dot having a second emission wavelength different from the first emission wavelength. The third polynucleotide and the fourth polynucleotide are different.

  9. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the resulting velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, which monitors the Rayleigh scattering from the beads through real-time image processing. A significant difference in the velocity of the beads was observed in the presence of as little as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied in the detection and quantification of rare analytes that can be useful in the diagnosis and the treatment of diseases, such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies for the target biomarkers.
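The velocity measurement described above can be illustrated with a minimal sketch. This is not the authors' software: the frame rate, the bead track, and the least-squares slope estimator are assumptions chosen only to show how a repulsion velocity might be extracted from tracked bead positions.

```python
# Illustrative sketch (not the authors' software): estimate a bead's
# repulsion velocity from tracked positions in successive video frames
# using a least-squares slope. Frame rate and track values are invented.

def velocity_from_track(positions_um, fps):
    """Least-squares slope of position (um) versus time (s) -> um/s."""
    n = len(positions_um)
    times = [i / fps for i in range(n)]
    t_mean = sum(times) / n
    x_mean = sum(positions_um) / n
    num = sum((t - t_mean) * (x - x_mean)
              for t, x in zip(times, positions_um))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# A bead drifting 2 um per frame at 30 frames per second moves ~60 um/s:
track = [10.0 + 2.0 * i for i in range(10)]
print(velocity_from_track(track, fps=30.0))
```

A least-squares slope over several frames is less noise-sensitive than differencing two consecutive positions, which is why tracking software commonly fits the whole track.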

  10. Analytical surveillance of emerging drugs of abuse and drug formulations

    PubMed Central

    Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan

    2012-01-01

    Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240

  11. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure the environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale, are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and the analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of using multivariate statistics in evaluation of the environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  12. Filamentation instability of a fast electron beam in a dielectric target.

    PubMed

    Debayle, A; Tikhonchuk, V T

    2008-12-01

    High-intensity laser-matter interaction is an efficient method for producing high-current relativistic electron beams. At current densities exceeding several kA μm⁻², beam propagation is maintained by an almost complete current neutralization by the target electrons. In such a geometry of two oppositely directed flows, beam instabilities can develop, depending on the target and beam parameters. The present paper proposes an analytical description of the filamentation instability of an electron beam propagating through an insulator target. It is shown that the collisionless and resistive instabilities compete with the ionization instability. The latter process is dominant in insulator targets, where field ionization by the fast beam provides free electrons for the neutralization current.

  13. First passage times for multiple particles with reversible target-binding kinetics

    NASA Astrophysics Data System (ADS)

    Grebenkov, Denis S.

    2017-10-01

    We investigate the first passage problem for multiple particles that diffuse towards a target, partially adsorb there, and then desorb after a finite exponentially distributed residence time. We search for the first time when m particles undergoing such reversible target-binding kinetics are found simultaneously on the target that may trigger an irreversible chemical reaction or a biophysical event. Even if the particles are independent, the finite residence time on the target yields an intricate temporal coupling between particles. We compute analytically the mean first passage time (MFPT) for two independent particles by mapping the original problem to higher-dimensional surface-mediated diffusion and solving the coupled partial differential equations. The respective effects of the adsorption and desorption rates on the MFPT are revealed and discussed.

  14. The role of multi-target policy instruments in agri-environmental policy mixes.

    PubMed

    Schader, Christian; Lampkin, Nicholas; Muller, Adrian; Stolze, Matthias

    2014-12-01

    The Tinbergen Rule has been used to criticise multi-target policy instruments for being inefficient. The aim of this paper is to clarify the role of multi-target policy instruments using the case of agri-environmental policy. Employing an analytical linear optimisation model, this paper demonstrates that there is no general contradiction between multi-target policy instruments and the Tinbergen Rule, if multi-target policy instruments are embedded in a policy-mix with a sufficient number of targeted instruments. We show that the relation between cost-effectiveness of the instruments, related to all policy targets, is the key determinant for an economically sound choice of policy instruments. If economies of scope with respect to achieving policy targets are realised, a higher cost-effectiveness of multi-target policy instruments can be achieved. Using the example of organic farming support policy, we discuss several reasons why economies of scope could be realised by multi-target agri-environmental policy instruments. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Nuclease Target Site Selection for Maximizing On-target Activity and Minimizing Off-target Effects in Genome Editing

    PubMed Central

    Lee, Ciaran M; Cradick, Thomas J; Fine, Eli J; Bao, Gang

    2016-01-01

    The rapid advancement in targeted genome editing using engineered nucleases such as ZFNs, TALENs, and CRISPR/Cas9 systems has resulted in a suite of powerful methods that allows researchers to target any genomic locus of interest. A complementary set of design tools has been developed to aid researchers with nuclease design, target site selection, and experimental validation. Here, we review the various tools available for target selection in designing engineered nucleases, and for quantifying nuclease activity and specificity, including web-based search tools and experimental methods. We also elucidate challenges in target selection, especially in predicting off-target effects, and discuss future directions in precision genome editing and its applications. PMID:26750397

  16. RDF SKETCH MAPS - KNOWLEDGE COMPLEXITY REDUCTION FOR PRECISION MEDICINE ANALYTICS.

    PubMed

    Thanintorn, Nattapon; Wang, Juexin; Ersoy, Ilker; Al-Taie, Zainab; Jiang, Yuexu; Wang, Duolin; Verma, Megha; Joshi, Trupti; Hammer, Richard; Xu, Dong; Shin, Dmitriy

    2016-01-01

    Realization of precision medicine ideas requires significant research effort to be able to spot subtle differences in complex diseases at the molecular level to develop personalized therapies. It is especially important in many cases of highly heterogeneous cancers. Precision diagnostics and therapeutics of such diseases demands interrogation of vast amounts of biological knowledge coupled with novel analytic methodologies. For instance, pathway-based approaches can shed light on the way tumorigenesis takes place in individual patient cases and pinpoint novel drug targets. However, comprehensive analysis of hundreds of pathways and thousands of genes creates a combinatorial explosion that is challenging for medical practitioners to handle at the point of care. Here we extend our previous work on mapping clinical omics data to curated Resource Description Framework (RDF) knowledge bases to derive influence diagrams of interrelationships of biomarker proteins, diseases and signal transduction pathways for personalized theranostics. We present RDF Sketch Maps - a computational method to reduce knowledge complexity for precision medicine analytics. The method of RDF Sketch Maps is inspired by the way a sketch artist conveys only important visual information and discards other unnecessary details. In our case, we compute and retain only so-called RDF Edges - places with highly important diagnostic and therapeutic information. To do this we utilize 35 maps of human signal transduction pathways obtained by transforming 300 KEGG maps into a highly processable RDF knowledge base. We have demonstrated the potential clinical utility of RDF Sketch Maps in hematopoietic cancers, including analysis of pathways associated with Hairy Cell Leukemia (HCL) and Chronic Myeloid Leukemia (CML), where we achieved up to 20-fold reduction in the number of biological entities to be analyzed, while retaining the most likely important entities.
In experiments with pathways associated with HCL a generated RDF

  17. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

    The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by their response dependence on radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple and completely analytical algorithm for the calculation of the relative TL efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. In conclusion, the calculated efficiency values have then been implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field have been performed and verified against experimental data.

  18. Electrochemical detection for microscale analytical systems: a review.

    PubMed

    Wang, Joseph

    2002-02-11

    As the field of chip-based microscale systems continues its rapid growth, there are urgent needs for developing compatible detection modes. Electrochemical detection offers considerable promise for such microfluidic systems, with features that include remarkable sensitivity, inherent miniaturization and portability, independence of optical path length or sample turbidity, low cost, low power requirements, and high compatibility with advanced micromachining and microfabrication technologies. This paper highlights recent advances, directions and key strategies in controlled-potential electrochemical detectors for miniaturized analytical systems. Subjects covered include the design and integration of the electrochemical detection system, its requirements and operational principles, common electrode materials, derivatization reactions, electrical-field decouplers, typical applications and future prospects. It is expected that electrochemical detection will become a powerful tool for microscale analytical systems and will facilitate the creation of truly portable (and possibly disposable) devices.

  19. Applications of Optical Microcavity Resonators in Analytical Chemistry

    PubMed Central

    Wade, James H.; Bailey, Ryan C.

    2018-01-01

    Optical resonator sensors are an emerging class of analytical technologies that use recirculating light confined within a microcavity to sensitively measure the surrounding environment. Bolstered by advances in microfabrication, these devices can be configured for a wide variety of chemical or biomolecular sensing applications. The review begins with a brief description of optical resonator sensor operation followed by discussions regarding sensor design, including different geometries, choices of material systems, methods of sensor interrogation, and new approaches to sensor operation. Throughout, key recent developments are highlighted, including advancements in biosensing and other applications of optical sensors. Alternative sensing mechanisms and hybrid sensing devices are then discussed in terms of their potential for more sensitive and rapid analyses. Brief concluding statements offer our perspective on the future of optical microcavity sensors and their promise as versatile detection elements within analytical chemistry. PMID:27049629

  20. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
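The atomic operators named in the abstract can be illustrated with a short sketch. The semantics below (selection as node filtering, aggregation as merging nodes that share an attribute into super-nodes) are assumptions for illustration, not the paper's formal algebra:

```python
# Minimal sketch (assumed semantics, not the paper's formal algebra) of two
# atomic graph operators: selection (filter nodes by a predicate) and
# aggregation (merge nodes sharing an attribute value into super-nodes).

def select(nodes, edges, pred):
    """Keep only nodes satisfying pred, and edges between surviving nodes."""
    keep = {n for n, attrs in nodes.items() if pred(attrs)}
    return ({n: nodes[n] for n in keep},
            {(u, v) for u, v in edges if u in keep and v in keep})

def aggregate(nodes, edges, key):
    """Group nodes by attribute `key`; lift edges to the super-node level."""
    groups = {}
    for n, attrs in nodes.items():
        groups.setdefault(attrs[key], set()).add(n)
    mapping = {n: g for g, members in groups.items() for n in members}
    super_edges = {(mapping[u], mapping[v]) for u, v in edges
                   if mapping[u] != mapping[v]}
    return groups, super_edges

# Hypothetical network graph: two hosts talking to one server.
nodes = {"a": {"type": "host"}, "b": {"type": "host"}, "c": {"type": "server"}}
edges = {("a", "c"), ("b", "c")}
groups, super_edges = aggregate(nodes, edges, "type")
```

Aggregating on `"type"` collapses the two hosts into one super-node, mirroring how VA tools summarize a large graph before rendering it.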

  1. A new approach to data evaluation in the non-target screening of organic trace substances in water analysis.

    PubMed

    Müller, Alexander; Schulz, Wolfgang; Ruck, Wolfgang K L; Weber, Walter H

    2011-11-01

    Non-target screening via high performance liquid chromatography-mass spectrometry (HPLC-MS) has gained increasingly in importance for monitoring organic trace substances in water resources targeted for the production of drinking water. In this article a new approach for evaluating the data from non-target HPLC-MS screening in water is introduced and its advantages are demonstrated using the supply of drinking water as an example. The crucial difference between this and other approaches is the comparison of samples based on compounds (features) determined from their full-scan data. In so doing, we take advantage of the temporal, spatial, or process-based relationships among the samples by applying the set operators UNION, INTERSECT, and COMPLEMENT to the features of each sample. This approach considers all compounds detectable by the analytical method used, which is the fundamental meaning of non-target screening: all analytical information from the applied technique is included in further data evaluation. In the given example, in just one step, all detected features (1729) of a landfill leachate sample could be examined for their relevance to water purification and drinking water. This study shows that 1721 out of 1729 features were not relevant for the water purification. Only eight features could be determined in the untreated water, and three of them were found in the final drinking water after ozonation. In so doing, it was possible to identify 1-adamantylamine as a contaminant from the landfill in the drinking water at a concentration in the range of 20 ng L⁻¹. To support the identification of relevant compounds and their transformation products, the DAIOS database (Database-Assisted Identification of Organic Substances) was used. This database concept includes functions such as product ion search to increase the efficiency of the database query after the screening. To identify related transformation products the database function
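The sample comparison described above reduces naturally to set arithmetic. A minimal sketch, with hypothetical (retention time, m/z) pairs standing in for real full-scan features:

```python
# Illustrative sketch (not the authors' software): comparing non-target
# HPLC-MS samples via set operations on their detected features. A
# "feature" is reduced here to a hypothetical (retention_time, m/z) pair.

def union(a, b):
    return a | b

def intersect(a, b):
    return a & b

def complement(a, b):
    """Features in sample a that are absent from sample b."""
    return a - b

# Invented feature sets for two related samples:
leachate = {(3.2, 152.1), (5.7, 208.3), (8.1, 302.2)}
untreated_water = {(3.2, 152.1), (9.9, 410.0)}

# Leachate features that persist into the untreated water:
relevant = intersect(leachate, untreated_water)
# Leachate features removed before the water intake:
not_relevant = complement(leachate, untreated_water)
```

Chaining these operators along a process (leachate → untreated water → drinking water) is what lets one step reduce thousands of features to the handful that actually matter for purification.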

  2. Analyte discrimination from chemiresistor response kinetics.

    PubMed

    Read, Douglas H; Martin, James E

    2010-08-15

    Chemiresistors are polymer-based sensors that transduce the sorption of a volatile organic compound into a resistance change. Like other polymer-based gas sensors that function through sorption, chemiresistors can be selective for analytes on the basis of the affinity of the analyte for the polymer. However, a single sensor cannot, in and of itself, discriminate between analytes, since a small concentration of an analyte that has a high affinity for the polymer might give the same response as a high concentration of another analyte with a low affinity. In this paper we use a field-structured chemiresistor to demonstrate that its response kinetics can be used to discriminate between analytes, even between those that have identical chemical affinities for the polymer phase of the sensor. The response kinetics is shown to be independent of the analyte concentration, and thus the magnitude of the sensor response, but is found to vary inversely with the analyte's saturation vapor pressure. Saturation vapor pressures often vary greatly from analyte to analyte, so analysis of the response kinetics offers a powerful method for obtaining analyte discrimination from a single sensor.
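The discrimination principle can be sketched with a simple first-order response model. Both the model r(t) = A(1 − exp(−t/τ)) and the inverse τ-versus-vapor-pressure relation below are illustrative assumptions, not the paper's calibration:

```python
import math

# Hedged sketch of the discrimination idea: model the sensor response as a
# first-order rise r(t) = A * (1 - exp(-t / tau)). The amplitude A tracks
# analyte concentration, while the time constant tau is assumed here to
# scale inversely with saturation vapor pressure p_sat, as the abstract
# reports; the functional forms and constants are invented.

def response(t, amplitude, tau):
    return amplitude * (1.0 - math.exp(-t / tau))

def tau_from_psat(p_sat, k=10.0):
    """Hypothetical inverse relation between tau and vapor pressure."""
    return k / p_sat

# Same analyte at two concentrations: amplitudes differ...
r_low = response(5.0, amplitude=1.0, tau=tau_from_psat(2.0))
r_high = response(5.0, amplitude=3.0, tau=tau_from_psat(2.0))
# ...but the normalized kinetics coincide, so tau identifies the analyte:
assert abs(r_high / 3.0 - r_low / 1.0) < 1e-12
```

Because τ depends on the analyte but not its concentration, fitting the response curve's time constant gives a single sensor a second, concentration-independent axis for discrimination.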

  3. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  4. Semi-Analytic Reconstruction of Flux in Finite Volume Formulations

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2006-01-01

    Semi-analytic reconstruction uses the analytic solution to a second-order, steady, ordinary differential equation (ODE) to simultaneously evaluate the convective and diffusive flux at all interfaces of a finite volume formulation. The second-order ODE is itself a linearized approximation to the governing first- and second-order partial differential equation conservation laws. Thus, semi-analytic reconstruction defines a family of formulations for finite volume interface fluxes using analytic solutions to approximating equations. Limiters are not applied in a conventional sense; rather, diffusivity is adjusted in the vicinity of changes in sign of eigenvalues in order to achieve a sufficiently small cell Reynolds number in the analytic formulation across critical points. Several approaches for application of semi-analytic reconstruction to the solution of one-dimensional scalar equations are introduced. Results are compared with exact analytic solutions to Burgers' equation as well as a conventional, upwind discretization using Roe's method. One approach, the end-point wave speed (EPWS) approximation, is further developed for more complex applications. One-dimensional vector equations are tested on a quasi-one-dimensional nozzle application. The EPWS algorithm has a more compact difference stencil than Roe's algorithm, but reconstruction time is approximately a factor of four larger than for Roe's method. Though both are second-order accurate schemes, Roe's method approaches a grid-converged solution with fewer grid points. Reconstruction of flux in the context of multi-dimensional, vector conservation laws, including effects of thermochemical nonequilibrium in the Navier-Stokes equations, is developed.
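For context, the conventional baseline mentioned above, Roe's method applied to the inviscid Burgers' equation, can be written in a few lines. This is the standard scalar Roe flux, not the EPWS reconstruction itself:

```python
# Standard Roe interface flux for the inviscid Burgers equation
# u_t + (u^2/2)_x = 0, shown only as the conventional upwind baseline
# that semi-analytic reconstruction is compared against (not the
# paper's EPWS scheme).

def burgers_flux(u):
    return 0.5 * u * u

def roe_flux(ul, ur):
    a = 0.5 * (ul + ur)  # Roe-averaged wave speed for Burgers' flux
    return (0.5 * (burgers_flux(ul) + burgers_flux(ur))
            - 0.5 * abs(a) * (ur - ul))

# For a wave moving to the right (positive Roe speed), the interface
# flux reduces to the left (upwind) flux f(u_L):
print(roe_flux(2.0, 1.0))  # 2.0 == burgers_flux(2.0)
```

For scalar conservation laws the Roe flux is exactly the upwind flux away from sonic points, which is why it serves as the natural reference discretization.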

  5. Application of capability indices and control charts in the analytical method control strategy.

    PubMed

    Oliva, Alexis; Llabres Martinez, Matías

    2017-08-01

    In this study, we assessed the usefulness of control charts in combination with the process capability indices Cpm and Cpk in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in control and stable. Different criteria were used to establish the specification limits (i.e., analyst requirements) for fixed method performance (i.e., method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (Cpm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the Cpk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. Real example data from a SEC method with light-scattering detection were used as a model, whereas previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
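The capability indices discussed above have textbook definitions that are easy to compute. A sketch with invented data and specification limits (not the paper's SEC dataset):

```python
import math

# Textbook definitions of the process capability indices; the mean,
# standard deviation, target, and specification limits are invented
# for illustration.

def cpk(mean, sd, lsl, usl):
    """Cpk = min(USL - mean, mean - LSL) / (3 * sigma)."""
    return min(usl - mean, mean - lsl) / (3.0 * sd)

def cpm(mean, sd, lsl, usl, target):
    """Taguchi index: penalizes deviation of the mean from target T."""
    return (usl - lsl) / (6.0 * math.sqrt(sd**2 + (mean - target)**2))

# A centered, on-target method whose +/-3-sigma control limits coincide
# with the specification limits gives Cpm = Cpk = 1, the "capable"
# borderline case discussed in the abstract:
mean, sd, target = 100.0, 2.0, 100.0
lsl, usl = mean - 3 * sd, mean + 3 * sd
print(cpk(mean, sd, lsl, usl))          # 1.0
print(cpm(mean, sd, lsl, usl, target))  # 1.0
```

Cpk = 1 only says the ±3σ spread just fits the specification; minimum-capability criteria such as Pearn and Shu's demand a margin beyond that.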

  6. Analytical and experimental investigations of the oblique detonation wave engine concept

    NASA Technical Reports Server (NTRS)

    Menees, Gene P.; Adelman, Henry G.; Cambier, Jean-Luc

    1990-01-01

    Wave combustors, which include the oblique detonation wave engine (ODWE), are attractive propulsion concepts for hypersonic flight. These engines utilize oblique shock or detonation waves to rapidly mix, ignite, and combust the air-fuel mixture in thin zones in the combustion chamber. Benefits of these combustion systems include shorter and lighter engines which require less cooling and can provide thrust at higher Mach numbers than conventional scramjets. The wave combustor's ability to operate at lower combustor inlet pressures may allow the vehicle to operate at lower dynamic pressures which could lessen the heating loads on the airframe. The research program at NASA-Ames includes analytical studies of the ODWE combustor using Computational Fluid Dynamics (CFD) codes which fully couple finite rate chemistry with fluid dynamics. In addition, experimental proof-of-concept studies are being performed in an arc heated hypersonic wind tunnel. Several fuel injection designs were studied analytically and experimentally. In-stream strut fuel injectors were chosen to provide good mixing with minimal stagnation pressure losses. Measurements of flow field properties behind the oblique wave are compared to analytical predictions.

  7. Analytical and experimental investigations of the oblique detonation wave engine concept

    NASA Technical Reports Server (NTRS)

    Menees, Gene P.; Adelman, Henry G.; Cambier, Jean-Luc

    1991-01-01

    Wave combustors, which include the Oblique Detonation Wave Engine (ODWE), are attractive propulsion concepts for hypersonic flight. These engines utilize oblique shock or detonation waves to rapidly mix, ignite, and combust the air-fuel mixture in thin zones in the combustion chamber. Benefits of these combustion systems include shorter and lighter engines which will require less cooling and can provide thrust at higher Mach numbers than conventional scramjets. The wave combustor's ability to operate at lower combustor inlet pressures may allow the vehicle to operate at lower dynamic pressures which could lessen the heating loads on the airframe. The research program at NASA-Ames includes analytical studies of the ODWE combustor using CFD codes which fully couple finite rate chemistry with fluid dynamics. In addition, experimental proof-of-concept studies are being carried out in an arc heated hypersonic wind tunnel. Several fuel injection designs were studied analytically and experimentally. In-stream strut fuel injectors were chosen to provide good mixing with minimal stagnation pressure losses. Measurements of flow field properties behind the oblique wave are compared to analytical predictions.

  8. Analytic theory of high-order-harmonic generation by an intense few-cycle laser pulse

    NASA Astrophysics Data System (ADS)

    Frolov, M. V.; Manakov, N. L.; Popov, A. M.; Tikhonova, O. V.; Volkova, E. A.; Silaev, A. A.; Vvedenskii, N. V.; Starace, Anthony F.

    2012-03-01

    We present a theoretical model for describing the interaction of an electron, weakly bound in a short-range potential, with an intense, few-cycle laser pulse. General definitions for the differential probability of above-threshold ionization and for the yield of high-order-harmonic generation (HHG) are presented. For HHG we then derive detailed analytic expressions for the spectral density of generated radiation in terms of the key laser parameters, including the number N of optical cycles in the pulse and the carrier-envelope phase (CEP). In particular, in the tunneling approximation, we provide detailed derivations of the closed-form formulas presented briefly by M. V. Frolov et al. [Phys. Rev. A 83, 021405(R) (2011)], which were used to describe key features of HHG by both H and Xe atom targets in an intense, few-cycle laser pulse. We then provide a complete analysis of the dependence of the HHG spectrum on both N and the CEP φ of an N-cycle laser pulse. Most importantly, we show analytically that the structure of the HHG spectrum stems from interference between electron wave packets originating from electron ionization from neighboring half-cycles near the peak of the intensity envelope of the few-cycle laser pulse. Such interference is shown to be very sensitive to the CEP. The usual HHG spectrum for a monochromatic driving laser field (comprising harmonic peaks at odd multiples of the carrier frequency and spaced by twice the carrier frequency) is shown analytically to occur only in the limit of very large N, and begins to form, as N increases, in the energy region beyond the HHG plateau cutoff.

  9. Analytical Derivation and Experimental Evaluation of Short-Bearing Approximation for Full Journal Bearing

    NASA Technical Reports Server (NTRS)

    Dubois, George B; Ocvirk, Fred W

    1953-01-01

    An approximate analytical solution including the effect of end leakage from the oil film of short plain bearings is presented because of the importance of endwise flow in sleeve bearings of the short lengths commonly used. The analytical approximation is supported by experimental data, resulting in charts which facilitate analysis of short plain bearings. The analytical approximation includes the endwise flow and that part of the circumferential flow which is related to surface velocity and film thickness but neglects the effect of film pressure on the circumferential flow. In practical use, this approximation applies best to bearings having a length-diameter ratio up to 1, and the effects of elastic deflection, inlet oil pressure, and changes of clearance with temperature minimize the relative importance of the neglected term. The analytical approximation was found to be an extension of a little-known pressure-distribution function originally proposed by Michell and Cardullo.

  10. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, past and present, that should be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, as well as the more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  11. A chemodynamic approach for estimating losses of target organic chemicals from water during sample holding time

    USGS Publications Warehouse

    Capel, P.D.; Larson, S.J.

    1995-01-01

    Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace, sorption to the walls and cap of the sample bottle, and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimate the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.

  12. Analytical Chemistry in Russia.

    PubMed

    Zolotov, Yuri

    2016-09-06

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development, and especially the manufacturing, of analytical instruments should be improved; even so, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  13. ANALYTICAL METHOD READINESS FOR THE CONTAMINANT CANDIDATE LIST

    EPA Science Inventory

    The Contaminant Candidate List (CCL), which was promulgated in March 1998, includes 50 chemical and 10 microbiological contaminants/contaminant groups. At the time of promulgation, analytical methods were available for 6 inorganic and 28 organic contaminants. Since then, 4 anal...

  14. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.
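
    The cross-validation idea mentioned in the abstract can be sketched generically; the data and the one-parameter "model" below are synthetic stand-ins, not the authors' cohort or modeling framework:

```python
import numpy as np

# Minimal k-fold cross-validation sketch (synthetic data, purely illustrative
# of the validation idea discussed in the review).  A single dose-like
# feature predicts a binary toxicity outcome.
rng = np.random.default_rng(0)
dose = rng.uniform(60.0, 80.0, size=200)                  # hypothetical Gy values
toxicity = (dose + rng.normal(0.0, 3.0, size=200)) > 72   # noisy threshold truth

def cv_accuracy(x, y, k=5):
    idx = rng.permutation(len(x))
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # "train" a one-parameter model: pick the threshold that best
        # separates the training outcomes
        candidates = np.linspace(x[train].min(), x[train].max(), 101)
        accs = [np.mean((x[train] > t) == y[train]) for t in candidates]
        t_best = candidates[int(np.argmax(accs))]
        scores.append(np.mean((x[test] > t_best) == y[test]))
    return float(np.mean(scores))

print(round(cv_accuracy(dose, toxicity), 3))   # mean held-out accuracy
```

    The point of holding out each fold is exactly the one the review makes: the model parameter is chosen on training data only, so the reported accuracy estimates generalization rather than fit.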

  15. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211

  16. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Restrictions on the sale, distribution and use of... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte... include the statement for class I exempt ASR's: “Analyte Specific Reagent. Analytical and performance...

  17. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  18. The role played by Gerhard Adler in the development of analytical psychology internationally and in the UK.

    PubMed

    Casement, Ann

    2014-02-01

    The Jungian analyst Gerhard Adler left Berlin and re-settled in London in 1936. He was closely involved with the professionalization of analytical psychology internationally and in the UK, including the formation of the International Association for Analytical Psychology (IAAP) and The Society of Analytical Psychology (SAP). The tensions that arose within the latter organization led to a split that ended in the formation of the Association of Jungian Analysts (AJA). A further split at AJA resulted in the creation of another organization, the Independent Group of Analytical Psychologists (IGAP). Adler's extensive publications include his role as an editor of Jung's Collected Works and as editor of the C.G. Jung Letters. © 2014, The Society of Analytical Psychology.

  19. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    PubMed Central

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-01-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, with less sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we have developed a novel iterative reconstruction approach (DIRECT: Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968

  20. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    NASA Astrophysics Data System (ADS)

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-05-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of time-of-flight (TOF) scanners, with less sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we have developed a novel iterative reconstruction approach (DIRECT: direct image reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias versus variance performance to iterative TOF reconstruction with a matched resolution model.

  1. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
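
    The cross-talk mechanism can be sketched with a simple assumed signal model (illustrative only; the paper's experimentally determined constants and exact fitting function may differ):

```python
import numpy as np

# Sketch of the cross-talk problem and a hyperbolic calibration.
# Assumed simple signal model (not the paper's exact equations):
#   analyte signal   A = k * C
#   IS channel reads I = I0 + f * A     (f = fractional isotopic overlap)
#   measured ratio   R = A / I = k*C / (I0 + f*k*C)  -> hyperbolic in C
k, I0, f = 1000.0, 5000.0, 0.02
C_true = np.array([0.1, 1.0, 10.0, 100.0, 500.0])
R = (k * C_true) / (I0 + f * k * C_true)

# The model rearranges to R = a*C / (1 + b*C) with a = k/I0, b = f*k/I0,
# which inverts in closed form: C = R / (a - b*R).
a, b = k / I0, f * k / I0
C_back = R / (a - b * R)
print(np.allclose(C_back, C_true))   # True: the nonlinear form removes the bias

# A straight line through the low end, by contrast, under-reports high C:
slope = R[0] / C_true[0]             # naive single-point linear calibration
print((R[-1] / slope) / C_true[-1])  # fraction recovered, well below 1
```

    Note how the bias only appears at a high analyte/internal-standard ratio, matching the conditions the abstract identifies as problematic.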

  2. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  3. Targeted Nanomaterials for Phototherapy

    PubMed Central

    Chitgupi, Upendra; Qin, Yiru; Lovell, Jonathan F.

    2017-01-01

    Phototherapies involve the irradiation of target tissues with light. To further enhance selectivity and potency, numerous molecularly targeted photosensitizers and photoactive nanoparticles have been developed. Active targeting typically involves harnessing the affinity between a ligand and a cell surface receptor for improved accumulation in the targeted tissue. Targeting ligands including peptides, proteins, aptamers and small molecules have been explored for phototherapy. In this review, recent examples of targeted nanomaterials used in phototherapy are summarized. PMID:29071178

  4. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  5. Limitless Analytic Elements

    NASA Astrophysics Data System (ADS)

    Strack, O. D. L.

    2018-02-01

    We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.

  6. Generalized Subset Designs in Analytical Chemistry.

    PubMed

    Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan

    2017-06-20

    Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to their high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry, including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) a stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
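
    As background to the generalization, the classical two-level case the article starts from can be sketched in a few lines: a 2^(3-1) half-fraction built from the defining relation I = ABC, together with its fold-over (the complementary half the abstract's balanced subdesigns are analogous to):

```python
import itertools

# Classical 2^(3-1) fractional factorial with defining relation I = ABC:
# enumerate the full 2^2 design in factors A and B, then set C = A*B.
# This is the two-level baseline that the article's generalized subset
# designs extend to multiple levels and mixed factor types.
runs = []
for a, b in itertools.product((-1, 1), repeat=2):
    runs.append((a, b, a * b))       # half-fraction: 4 runs instead of 8

for run in runs:
    print(run)

# Complementary fold-over half: negate the generator (C = -A*B).
# Together the two halves recover the full 2^3 factorial.
foldover = [(a, b, -c) for a, b, c in runs]
```

    Every run in the half-fraction satisfies A*B*C = +1 and every fold-over run satisfies A*B*C = -1, which is what makes the two subsets complementary and balanced.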

  7. State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups

    PubMed Central

    Plata, María Reyes; Contento, Ana María; Ríos, Angel

    2010-01-01

    (Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors in routine analytical work has been truncated to some extent, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field world-wide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the more representative achievements carried out by these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric or electro-mechanical devices, used for laboratory or field analyses. The capabilities to be used in different applied areas are also critically discussed. PMID:22319260

  8. Combination of Cyclodextrin and Ionic Liquid in Analytical Chemistry: Current and Future Perspectives.

    PubMed

    Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah

    2017-09-03

    The growth in driving force and popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in the field of analytical chemistry has resulted in an exponential increase in their exploitation and production. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4)-linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals, or molecules in the absence of covalent bonds. ILs, in turn, are ionic fluids comprising only cations and anions, often with immeasurably low vapor pressure, making them green or "designer" solvents. The cooperative effect between CDs and ILs, owing to their fascinating properties, has left its footprint on recent developments in analytical chemistry. This comprehensive review gives an overview of recent studies and analytical trends in the application of CDs combined with ILs, which have beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques (solid phase extraction, magnetic solid phase extraction, cloud point extraction, and microextraction) and separation techniques (gas chromatography, high-performance liquid chromatography, and capillary electrophoresis), as well as applications of electrochemical sensors as electrode modifiers, with references to recent applications. This review will highlight the nature of interactions and synergic effects between CDs, ILs, and analytes. It is hoped that it will stimulate further research in analytical chemistry.

  9. Effective treatment of glioblastoma requires crossing the blood–brain barrier and targeting tumors including cancer stem cells: The promise of nanomedicine

    PubMed Central

    Kim, Sang-Soo; Harford, Joe B.; Pirollo, Kathleen F.; Chang, Esther H.

    2015-01-01

    Glioblastoma multiforme (GBM) is the most aggressive and lethal type of brain tumor. Both therapeutic resistance and restricted permeation of drugs across the blood–brain barrier (BBB) play a major role in the poor prognosis of GBM patients. Accumulated evidence suggests that in many human cancers, including GBM, therapeutic resistance can be attributed to a small fraction of cancer cells known as cancer stem cells (CSCs). CSCs have been shown to have stem cell-like properties that enable them to evade traditional cytotoxic therapies, and so new CSC-directed anti-cancer therapies are needed. Nanoparticles have been designed to selectively deliver payloads to relevant target cells in the body, and there is considerable interest in the use of nanoparticles for CSC-directed anti-cancer therapies. Recent advances in the field of nanomedicine offer new possibilities for overcoming CSC-mediated therapeutic resistance and thus significantly improving management of GBM. In this review, we will examine the current nanomedicine approaches for targeting CSCs and their therapeutic implications. The inhibitory effect of various nanoparticle-based drug delivery system towards CSCs in GBM tumors is the primary focus of this review. PMID:26116770

  10. Analytical model for screening potential CO2 repositories

    USGS Publications Warehouse

    Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.

    2011-01-01

    Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. © 2011 Springer Science+Business Media B.V.
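
    The flavor of closed-form screening arithmetic can be illustrated with a much simpler, assumed model: a steady-state, single-phase radial Darcy (Thiem-type) estimate of injection pressure buildup. This is not the paper's two-phase CO2/brine model, and all parameter values below are assumed:

```python
import math

# Order-of-magnitude screening arithmetic of the kind closed-form models
# enable.  Thiem-type pressure buildup at radius r for injection rate Q,
# steady single-phase radial Darcy flow in a confined layer:
#   dP(r) = Q * mu / (2*pi*k*b) * ln(R_out / r)
Q     = 0.05        # volumetric injection rate, m^3/s   (assumed)
mu    = 6e-5        # CO2 viscosity at depth, Pa*s       (assumed)
k     = 1e-13       # permeability, m^2 (~100 mD)
b     = 50.0        # aquifer thickness, m
R_out = 10_000.0    # radius of (quasi-)steady influence, m (assumed)

def delta_p(r):
    """Pressure buildup (Pa) at radial distance r from the well."""
    return Q * mu / (2 * math.pi * k * b) * math.log(R_out / r)

print(delta_p(0.1) / 1e6)    # buildup at the well face, in MPa
```

    Because the expression is closed-form, a candidate repository can be screened in microseconds, which is precisely the advantage over full numerical simulation the abstract emphasizes.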

  11. Contributions of Analytical Chemistry to the Clinical Laboratory.

    ERIC Educational Resources Information Center

    Skogerboe, Kristen J.

    1988-01-01

    Highlights several analytical techniques that are being used in state-of-the-art clinical labs. Illustrates how other advances in instrumentation may contribute to clinical chemistry in the future. Topics include: biosensors, polarization spectroscopy, chemiluminescence, fluorescence, photothermal deflection, and chromatography in clinical…

  12. Special Issue: Learning Analytics in Higher Education

    ERIC Educational Resources Information Center

    Lester, Jaime; Klein, Carrie; Rangwala, Huzefa; Johri, Aditya

    2017-01-01

    The purpose of this monograph is to give readers a practical and theoretical foundation in learning analytics in higher education, including an understanding of the challenges and incentives that are present in the institution, in the individual, and in the technologies themselves. Among questions that are explored and answered are: (1) What are…

  13. Systems and Methods for Composable Analytics

    DTIC Science & Technology

    2014-04-29

    simplistic module that performs a mathematical operation on two numbers. The most important method is the Execute() method. This will get called when it is...addition, an input control is also specified in the example below. In this example, the mathematical operator can only be chosen from a preconfigured...approaches. Some of the industries that could benefit from Composable Analytics include pharmaceuticals, health care, insurance, actuaries, and

  14. Analytic solution of field distribution and demagnetization function of ideal hollow cylindrical field source

    NASA Astrophysics Data System (ADS)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-09-01

    The Halbach-type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-conserving field source, which has attracted intense interest in many practical applications. Here, using the complex variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
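
    The simplest member of this family of analytic Halbach-field results, the ideal dipole cylinder with its uniform bore field B = B_r ln(R_o/R_i), gives the flavor of such closed-form expressions (the paper treats general multipole orders and the field inside the magnet material; the geometry and remanence below are assumed):

```python
import math

# Classic closed-form result for an ideal Halbach cylinder in the dipole
# case: the bore field is uniform and depends only on the remanence and
# the radius ratio,  B_inside = B_r * ln(R_o / R_i).
B_r = 1.2               # remanence of the magnet material, T (e.g. NdFeB)
R_i, R_o = 0.02, 0.06   # inner and outer radii, m (assumed geometry)

B_inside = B_r * math.log(R_o / R_i)
print(round(B_inside, 3))   # 1.318 T; exceeds B_r once R_o/R_i > e
```

    The fact that the bore field can exceed the material's remanence is one reason such arrays are described as volume-compact and energy-conserving field sources.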

  15. Forced degradation and impurity profiling: recent trends in analytical perspectives.

    PubMed

    Jain, Deepti; Basniwal, Pawan Kumar

    2013-12-01

    This review gives an epigrammatic impression of recent trends in analytical perspectives on forced degradation and impurity profiling of pharmaceuticals, including active pharmaceutical ingredients (APIs) as well as drug products, during 2008-2012. These trends are discussed in terms of year of publication; column, matrix (API and dosage form), and type of elution in chromatography (isocratic and gradient); and the therapeutic category of the drugs analysed. The review focuses distinctly on a comprehensive update of various analytical methods, including hyphenated techniques, for the identification and quantification of threshold levels of impurities and degradants in different pharmaceutical matrices. © 2013 Elsevier B.V. All rights reserved.

  16. Lithium target performance evaluation for low-energy accelerator-based in vivo measurements using gamma spectroscopy.

    PubMed

    Aslam; Prestwich, W V; McNeill, F E

    2003-03-01

    The operating conditions at the McMaster KN Van de Graaff accelerator have been optimized to produce neutrons via the (7)Li(p, n)(7)Be reaction for in vivo neutron activation analysis. In a number of earlier studies (development of an accelerator based system for in vivo neutron activation analysis measurements of manganese in humans, Ph.D. Thesis, McMaster University, Hamilton, ON, Canada; Appl. Radiat. Isot. 53 (2000) 657; in vivo measurement of some trace elements in human bone, Ph.D. Thesis, McMaster University, Hamilton, ON, Canada), a significant discrepancy between the experimental and the calculated neutron doses was pointed out. The hypotheses formulated in the above references to explain the deviation of the experimental results from analytical calculations have been tested experimentally. The performance of the lithium target for neutron production has been evaluated by measuring the (7)Be activity produced as a result of the (p, n) interaction with (7)Li. In contradiction to the formulated hypotheses, lithium target performance was found to be mainly affected by inefficient target cooling and the presence of an oxide layer on the target surface. An appropriate choice of these parameters resulted in neutron yields the same as predicted by analytical calculations.

  17. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.

  18. Extended analytical formulas for the perturbed Keplerian motion under a constant control acceleration

    NASA Astrophysics Data System (ADS)

    Zuiani, Federico; Vasile, Massimiliano

    2015-03-01

    This paper presents a set of analytical formulae for the perturbed Keplerian motion of a spacecraft under the effect of a constant control acceleration. The proposed set of formulae can treat control accelerations that are fixed in either a rotating or inertial reference frame. Moreover, the contribution of the zonal harmonic is included in the analytical formulae. It will be shown that the proposed analytical theory allows for the fast computation of long, multi-revolution spirals while maintaining good accuracy. The combined effect of different perturbations and of the shadow regions due to solar eclipse is also included. Furthermore, a simplified control parameterisation is introduced to optimise thrusting patterns with two thrust arcs and two coast arcs per revolution. This simple parameterisation is shown to ensure enough flexibility to describe complex low-thrust spirals. The accuracy and speed of the proposed analytical formulae are compared against a full numerical integration with different integration schemes. An averaging technique is then proposed as an application of the analytical formulae. Finally, the paper presents an example of design of an optimal low-thrust spiral to transfer a spacecraft from an elliptical to a circular orbit around the Earth.
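    The paper benchmarks its formulae against full numerical integration. As a rough illustration of what such a reference computation looks like (not the paper's code), the sketch below integrates planar two-body motion under a constant-magnitude tangential acceleration, one of the control cases treated; the gravitational parameter, step size, and thrust level are assumptions chosen for illustration.

    ```python
    import math

    MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2 (assumed value)

    def accel(state, a_t):
        """Two-body gravity plus a constant-magnitude tangential thrust,
        i.e. an acceleration fixed in the rotating (velocity-aligned) frame."""
        x, y, vx, vy = state
        r = math.hypot(x, y)
        v = math.hypot(vx, vy)
        ax = -MU * x / r**3 + a_t * vx / v
        ay = -MU * y / r**3 + a_t * vy / v
        return (vx, vy, ax, ay)

    def rk4_step(state, a_t, dt):
        """One classical Runge-Kutta 4 step of the equations of motion."""
        def add(s, k, h):
            return tuple(si + h * ki for si, ki in zip(s, k))
        k1 = accel(state, a_t)
        k2 = accel(add(state, k1, dt / 2), a_t)
        k3 = accel(add(state, k2, dt / 2), a_t)
        k4 = accel(add(state, k3, dt), a_t)
        return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                     for s, a, b, c, d in zip(state, k1, k2, k3, k4))

    # Start on a 7000 km circular orbit and thrust tangentially for ~one revolution.
    r0 = 7000.0
    v0 = math.sqrt(MU / r0)
    state = (r0, 0.0, 0.0, v0)
    for _ in range(6000):
        state = rk4_step(state, a_t=1e-4, dt=1.0)  # km/s^2, 1 s steps

    r_final = math.hypot(state[0], state[1])
    print(round(r_final, 1))  # continuous tangential thrust slowly raises the orbit
    ```

    An analytical theory like the one in the paper replaces this step-by-step propagation with closed-form updates per revolution, which is where the claimed speed-up comes from.
    
    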

  19. The Viking X ray fluorescence experiment - Analytical methods and early results

    NASA Technical Reports Server (NTRS)

    Clark, B. C., III; Castro, A. J.; Rowe, C. D.; Baird, A. K.; Rose, H. J., Jr.; Toulmin, P., III; Christian, R. P.; Kelliher, W. C.; Keil, K.; Huss, G. R.

    1977-01-01

    Ten samples of the Martian regolith have been analyzed by the Viking lander X ray fluorescence spectrometers. Because of high-stability electronics, inclusion of calibration targets, and special data encoding within the instruments, the quality of the analyses performed on Mars is closely equivalent to that attainable with the same instruments operated in the laboratory. Determination of absolute elemental concentrations requires gain drift adjustments, subtraction of background components, and use of a mathematical response model with adjustable parameters set by prelaunch measurements on selected rock standards. Bulk fines at both Viking landing sites are quite similar in composition, implying that a chemically and mineralogically homogeneous regolith covers much of the surface of the planet. Important differences between samples include a higher sulfur content in what appear to be duricrust fragments than in fines, and a lower iron content in fines taken from beneath large rocks than in those taken from unprotected surface material. Further extensive reduction of these data will allow more precise and more accurate analytical numbers to be determined and thus a more comprehensive understanding of elemental trends between samples.

  20. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and incidentally in treated water as well. In fact, complete absence of any trace pollutant in treated drinking water is an illusion, as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack toxicity data from which to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesired because of customer perception. This raises the questions of how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action, and how effective treatment methods need to be to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived by using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals the target values were set at 0.1 μg/L. The target values for the total sum of genotoxic chemicals, the total sum of steroid hormones and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary lengths. Copyright © 2013 Elsevier Ltd. All rights reserved.
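    The Q21 target values quoted in the abstract amount to a small rule table. A minimal sketch of how such a screening rule could be encoded (the category names and function are illustrative, not from the paper):

    ```python
    # Q21 target values in ug/L, as summarised in the abstract above.
    INDIVIDUAL_TARGETS = {
        "genotoxic": 0.01,
        "steroid_endocrine": 0.01,
        "other_organic": 0.1,
    }
    SUM_TARGETS = {  # limits on the summed concentration per category
        "genotoxic": 0.01,
        "steroid_endocrine": 0.01,
        "other_organic": 1.0,
    }

    def exceeds_q21(category, concentration, total_for_category):
        """Flag a measurement if either the individual target value or the
        category's summed target value is exceeded (illustrative helper)."""
        return (concentration > INDIVIDUAL_TARGETS[category]
                or total_for_category > SUM_TARGETS[category])

    ok = exceeds_q21("other_organic", 0.05, 0.4)   # within both limits
    flagged = exceeds_q21("genotoxic", 0.02, 0.02)  # exceeds both limits
    print(ok, flagged)
    ```
    
    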

  1. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between PPCPs and endocrine disrupting compounds (EDCs), and between both of these and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  3. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses, which plays an important role in our everyday lives. Volatile compounds are, for example, characteristics of food where some of them can be perceivable by humans because of their aroma. They have a great influence on the decision making of consumers when they choose to use a product or not. In the case where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food product is characterized by analytical means to provide a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product leading to developing new products, which are more acceptable by consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although requirements are simple, low-cost technology is an attractive research target in this domain; most of the data are generated with very high-resolution analytical instruments. Such data gathered based on different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the question of the application of chemometrics for aroma analysis but also of the use of different analytical instruments in this field, highlighting the research needed for future focus.

  4. Centredale Manor Superfund Site in Rhode Island Included on EPA List of Sites Targeted for Immediate Attention

    EPA Pesticide Factsheets

    Today, the U.S. Environmental Protection Agency released the list of Superfund sites that Administrator Pruitt has targeted for immediate and intense attention. The Centredale Manor Restoration Project superfund site is one of the 21 sites on the list.

  5. Target-Pathogen: a structural bioinformatic approach to prioritize drug targets in pathogens.

    PubMed

    Sosa, Ezequiel J; Burguener, Germán; Lanzarotti, Esteban; Defelipe, Lucas; Radusky, Leandro; Pardo, Agustín M; Marti, Marcelo; Turjanski, Adrián G; Fernández Do Porto, Darío

    2018-01-04

    Available genomic data for pathogens have created new opportunities for drug discovery and development to fight them, including new resistant and multiresistant strains. In particular, structural data must be integrated with both gene information and experimental results. In this sense, there is a lack of an online resource that allows genome-wide data consolidation from diverse sources together with thorough bioinformatic analysis that allows easy filtering and scoring for fast target selection for drug discovery. Here, we present the Target-Pathogen database (http://target.sbg.qb.fcen.uba.ar/patho), designed and developed as an online resource that allows the integration and weighting of protein information such as function, metabolic role, off-targeting, structural properties including druggability, essentiality and omic experiments, to facilitate the identification and prioritization of candidate drug targets in pathogens. We include in the database 10 genomes of some of the most relevant microorganisms for human health (Mycobacterium tuberculosis, Mycobacterium leprae, Klebsiella pneumoniae, Plasmodium vivax, Toxoplasma gondii, Leishmania major, Wolbachia bancrofti, Trypanosoma brucei, Shigella dysenteriae and Schistosoma mansoni) and show its applicability. New genomes can be uploaded upon request. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Target-Pathogen: a structural bioinformatic approach to prioritize drug targets in pathogens

    PubMed Central

    Sosa, Ezequiel J; Burguener, Germán; Lanzarotti, Esteban; Radusky, Leandro; Pardo, Agustín M; Marti, Marcelo

    2018-01-01

    Abstract Available genomic data for pathogens have created new opportunities for drug discovery and development to fight them, including new resistant and multiresistant strains. In particular, structural data must be integrated with both gene information and experimental results. In this sense, there is a lack of an online resource that allows genome-wide data consolidation from diverse sources together with thorough bioinformatic analysis that allows easy filtering and scoring for fast target selection for drug discovery. Here, we present the Target-Pathogen database (http://target.sbg.qb.fcen.uba.ar/patho), designed and developed as an online resource that allows the integration and weighting of protein information such as function, metabolic role, off-targeting, structural properties including druggability, essentiality and omic experiments, to facilitate the identification and prioritization of candidate drug targets in pathogens. We include in the database 10 genomes of some of the most relevant microorganisms for human health (Mycobacterium tuberculosis, Mycobacterium leprae, Klebsiella pneumoniae, Plasmodium vivax, Toxoplasma gondii, Leishmania major, Wolbachia bancrofti, Trypanosoma brucei, Shigella dysenteriae and Schistosoma mansoni) and show its applicability. New genomes can be uploaded upon request. PMID:29106651

  7. Analytical applications of microbial fuel cells. Part II: Toxicity, microbial activity and quantification, single analyte detection and other uses.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells were rediscovered twenty years ago and are now a very active research area. The reasons behind this new activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately have a myriad of new uses of this technology been presented by research groups around the world, which combine both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of analytical sciences. In Part I, a general introduction to biologically based analytical methods, including bioassays, biosensors, MFC designs and operating principles, as well as perhaps the main and earliest presented application, the use as a BOD sensor, was reviewed. In Part II, other proposed uses are presented and discussed. Like other microbially based analytical systems, MFCs are satisfactory systems to measure and integrate complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity, microbial metabolism, and, of special interest to space exploration, life sensors. Also, some methods with higher specificity, proposed to detect a single analyte, are presented. Different possibilities to increase selectivity and sensitivity by using molecular biology or other modern techniques are also discussed here. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations, there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin layer as well as high pressure liquid chromatography. From now on, mass spectrometry will also be available as an analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, as well as processing, are pointed out, and possible ways to overcome them are sketched. © 2016 S. Karger GmbH, Freiburg.

  9. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  10. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  11. First Steps in FAP: Experiences of Beginning Functional Analytic Psychotherapy Therapist with an Obsessive-Compulsive Personality Disorder Client

    ERIC Educational Resources Information Center

    Manduchi, Katia; Schoendorff, Benjamin

    2012-01-01

    Practicing Functional Analytic Psychotherapy (FAP) for the first time can seem daunting to therapists. Establishing a deep and intense therapeutic relationship, identifying FAP's therapeutic targets of clinically relevant behaviors, and using contingent reinforcement to help clients emit more functional behavior in the therapeutic relationship all…

  12. Learning Analytics: Readiness and Rewards

    ERIC Educational Resources Information Center

    Friesen, Norm

    2013-01-01

    This position paper introduces the relatively new field of learning analytics, first by considering the relevant meanings of both "learning" and "analytics," and then by looking at two main levels at which learning analytics can be or has been implemented in educational organizations. Although integrated turnkey systems or…

  13. Analytic model of a multi-electron atom

    NASA Astrophysics Data System (ADS)

    Skoromnik, O. D.; Feranchuk, I. D.; Leonau, A. U.; Keitel, C. H.

    2017-12-01

    A fully analytical approximation for the observable characteristics of many-electron atoms is developed via a complete and orthonormal hydrogen-like basis with a single-effective charge parameter for all electrons of a given atom. The basis completeness allows us to employ the secondary-quantized representation for the construction of regular perturbation theory, which includes in a natural way correlation effects, converges fast and enables an effective calculation of the subsequent corrections. The hydrogen-like basis set provides a possibility to perform all summations over intermediate states in closed form, including both the discrete and continuous spectra. This is achieved with the help of the decomposition of the multi-particle Green function in a convolution of single-electronic Coulomb Green functions. We demonstrate that our fully analytical zeroth-order approximation describes the whole spectrum of the system, provides accuracy, which is independent of the number of electrons and is important for applications where the Thomas-Fermi model is still utilized. In addition already in second-order perturbation theory our results become comparable with those via a multi-configuration Hartree-Fock approach.

  14. Highly accurate analytic formulae for projectile motion subjected to quadratic drag

    NASA Astrophysics Data System (ADS)

    Turkyilmazoglu, Mustafa

    2016-05-01

    The classical problem of the motion of a projectile fired (thrown) above the horizon through resistive air exerting a quadratic drag on the object is revisited in this paper. No exact solution is known that describes the full physical event under such a resistance force. Finding elegant analytical approximations for the most interesting engineering features of the dynamical behavior of the projectile is the principal target. To this end, analytical explicit expressions are derived that accurately predict the maximum height, its arrival time, as well as the flight range of the projectile at the highest ascent. The most significant property of the proposed formulas is that they are not restricted to the initial speed and firing angle of the object, nor to the drag coefficient of the medium. In combination with the available approximations in the literature, it is possible to gain information about the flight and complete the picture of a trajectory with high precision, without having to numerically simulate the full governing equations of motion.
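    The quantities approximated analytically in the paper (apex time, maximum height, range at apex) are easy to obtain numerically, which is how such formulae are usually validated. A minimal sketch, using simple Euler integration and assumed parameter values (the drag constant k here is the quadratic drag coefficient per unit mass):

    ```python
    import math

    def apex(v0, angle_deg, k, dt=1e-4, g=9.81):
        """Integrate dv/dt = -g*j - k*|v|*v (quadratic drag per unit mass)
        and return (time, height, horizontal range) at the highest point."""
        theta = math.radians(angle_deg)
        x, y = 0.0, 0.0
        vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
        t = 0.0
        while vy > 0.0:  # apex reached when the vertical velocity crosses zero
            v = math.hypot(vx, vy)
            ax = -k * v * vx
            ay = -g - k * v * vy
            vx += ax * dt
            vy += ay * dt
            x += vx * dt
            y += vy * dt
            t += dt
        return t, y, x

    # Assumed example: 50 m/s launch at 45 degrees, k = 0.01 1/m.
    t_apex, h_apex, x_apex = apex(v0=50.0, angle_deg=45.0, k=0.01)
    print(t_apex, h_apex, x_apex)
    ```

    Against this reference, drag noticeably lowers both the apex time and the maximum height relative to the vacuum values (about 3.60 s and 63.7 m for these inputs), which is the regime the paper's closed-form approximations target.
    
    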

  15. Achromatic-chromatic colorimetric sensors for on-off type detection of analytes.

    PubMed

    Heo, Jun Hyuk; Cho, Hui Hun; Lee, Jin Woong; Lee, Jung Heon

    2014-12-21

    We report the development of achromatic colorimetric sensors: sensors changing their colors from achromatic black to other chromatic colors. An achromatic colorimetric sensor was prepared by mixing a general colorimetric indicator, whose color changes between chromatic colors, with a complementary colored dye that does not react with the targeted analyte. As the color of an achromatic colorimetric sensor changes from black to a chromatic color, the color change can be recognized with the naked eye much more easily than with general colorimetric sensors. More importantly, the achromatic colorimetric sensors enable on-off type recognition of the presence of analytes, which has not been achieved with most colorimetric sensors. In addition, the color changes of some achromatic colorimetric sensors (achromatic Eriochrome Black T and achromatic Benedict's solution) could be recognized with the naked eye at much lower concentration ranges than with normal chromatic colorimetric sensors. These results provide new opportunities in the use of colorimetric sensors for diverse applications, such as harsh industrial, environmental, and biological detection.

  16. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    There has been an immense amount of visibility of doping issues on the international stage over the past 12 months, with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods that are continuously updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews, appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Analytical performance of the various acquisition modes in Orbitrap MS and MS/MS.

    PubMed

    Kaufmann, Anton

    2018-04-30

    Quadrupole Orbitrap instruments (Q Orbitrap) permit high-resolution mass spectrometry (HRMS)-based full scan acquisitions and have a number of acquisition modes where the quadrupole isolates a particular mass range prior to a possible fragmentation and HRMS-based acquisition. Selecting the proper acquisition mode(s) is essential if trace analytes are to be quantified in complex matrix extracts. Depending on the particular requirements, such as sensitivity, selectivity of detection, linear dynamic range, and speed of analysis, different acquisition modes may have to be chosen. This is particularly important in the field of multi-residue analysis (e.g., pesticides or veterinary drugs in food samples) where a large number of analytes within a complex matrix have to be detected and reliably quantified. Meeting the specific detection and quantification performance criteria for every targeted compound may be challenging. It is the aim of this paper to describe the strengths and the limitations of the currently available Q Orbitrap acquisition modes. In addition, the incorporation of targeted acquisitions between full scan experiments is discussed. This approach is intended to integrate compounds that require an additional degree of sensitivity or selectivity into multi-residue methods. This article is protected by copyright. All rights reserved.

  18. ENVIRONMENTAL IMMUNOCHEMISTRY RESPONDING TO A SPECTRUM OF ANALYTICAL NEEDS

    EPA Science Inventory

    A review, with 13 references, is given of the field of environmental immunochemistry, which brings together several specialties, including analytical chemistry, biochemistry, molecular biology, and environmental engineering. This multidisciplinary nature is both a benefit and a confus...

  19. Discordance between net analyte signal theory and practical multivariate calibration.

    PubMed

    Brown, Christopher D

    2004-08-01

    Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
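    Lorber's net analyte signal is the part of the pure analyte spectrum orthogonal to the space spanned by the interferent spectra. A minimal sketch of that construction on a toy four-channel example (pure Python, Gram-Schmidt projection; the spectra are made up for illustration):

    ```python
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def net_analyte_signal(s, interferents):
        """Project the pure analyte spectrum s onto the orthogonal complement
        of the span of the interferent spectra: orthogonalise the interferents
        (Gram-Schmidt), then subtract each projection of s onto that basis."""
        basis = []
        for b in interferents:
            w = list(b)
            for q in basis:
                c = dot(w, q) / dot(q, q)
                w = [wi - c * qi for wi, qi in zip(w, q)]
            if dot(w, w) > 1e-12:  # skip linearly dependent interferents
                basis.append(w)
        nas = list(s)
        for q in basis:
            c = dot(nas, q) / dot(q, q)
            nas = [ni - c * qi for ni, qi in zip(nas, q)]
        return nas

    # Toy 4-channel spectra: the analyte overlaps a single interferent.
    s_k = [1.0, 2.0, 1.0, 0.0]
    interf = [[0.0, 1.0, 1.0, 1.0]]
    nas = net_analyte_signal(s_k, interf)
    print([round(v, 3) for v in nas])  # orthogonal to the interferent spectrum
    ```

    The review's point is that inverse calibration methods (PLS, PCR) do not in general converge to this vector, despite the common assertion that they do.
    
    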

  20. Analytic barrage attack model. Final report, January 1986-January 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St Ledger, J.W.; Naegeli, R.E.; Dowden, N.A.

    An analytic model is developed for a nuclear barrage attack, assuming weapons with no aiming error and a cookie-cutter damage function. The model is then extended with approximations for the effects of aiming error and distance damage sigma. The final result is a fast running model which calculates probability of damage for a barrage attack. The probability of damage is accurate to within seven percent or better, for weapon reliabilities of 50 to 100 percent, distance damage sigmas of 0.5 or less, and zero to very large circular error probabilities. FORTRAN 77 coding is included in the report for the analytic model and for a numerical model used to check the analytic results.
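    The baseline case described above (no aiming error, cookie-cutter damage) has a simple geometric interpretation: a target is damaged exactly when it lies within the lethal radius of some aim point. A minimal Monte Carlo sketch of that case, as a stand-in for the report's closed-form model (the grid layout, spacing, and radii are assumptions for illustration):

    ```python
    import math
    import random

    def barrage_damage_probability(spacing, lethal_radius, trials=20000, seed=1):
        """Cookie-cutter barrage on an infinite square grid of aim points with
        no aiming error: a target dropped uniformly in a grid cell is damaged
        iff it lies within lethal_radius of the nearest aim point. By symmetry
        it suffices to sample one cell centred on an aim point."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            x = rng.uniform(-spacing / 2, spacing / 2)
            y = rng.uniform(-spacing / 2, spacing / 2)
            if math.hypot(x, y) <= lethal_radius:
                hits += 1
        return hits / trials

    # Lethal circle inside the cell: P = pi*r^2 / s^2 = pi/4 here.
    p_partial = barrage_damage_probability(spacing=2.0, lethal_radius=1.0)
    # Lethal radius >= spacing/sqrt(2): the cell is fully covered, P = 1.
    p_full = barrage_damage_probability(spacing=2.0, lethal_radius=1.5)
    print(p_partial, p_full)
    ```

    The report's extensions (aiming error, distance damage sigma) replace the sharp in/out test with a probabilistic damage function, which is exactly what makes a closed-form treatment nontrivial.
    
    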

  1. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structure determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  2. Analysis of structural dynamic data from Skylab. Volume 2: Skylab analytical and test model data

    NASA Technical Reports Server (NTRS)

    Demchak, L.; Harcrow, H.

    1976-01-01

    The orbital configuration test modal data, analytical test correlation modal data, and analytical flight configuration modal data are presented. Tables showing the generalized mass contributions (GMCs) for each of the thirty test modes are given, along with the two-dimensional mode shape plots and tables of GMCs for the test-correlated analytical modes. The two-dimensional mode shape plots for the analytical modes and the uncoupled and coupled modes of the orbital flight configuration at three development phases of the model are included.

  3. Magnetically attached sputter targets

    DOEpatents

    Makowiecki, Daniel M.; McKernan, Mark A.

    1994-01-01

    An improved method and assembly for attaching sputtering targets to cathode assemblies of sputtering systems which includes a magnetically permeable material. The magnetically permeable material is imbedded in a target base that is brazed, welded, or soldered to the sputter target, or is mechanically retained in the target material. Target attachment to the cathode is achieved by virtue of the permanent magnets and/or the pole pieces in the cathode assembly that create magnetic flux lines adjacent to the backing plate, which strongly attract the magnetically permeable material in the target assembly.

  4. Adjustment of Pesticide Concentrations for Temporal Changes in Analytical Recovery, 1992-2006

    USGS Publications Warehouse

    Martin, Jeffrey D.; Stone, Wesley W.; Wydoski, Duane S.; Sandstrom, Mark W.

    2009-01-01

    Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ('spiked' QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report examines temporal changes in the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as 'pesticides') that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 to 2006 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Temporal changes in pesticide recovery were investigated by calculating robust, locally weighted scatterplot smooths (lowess smooths) for the time series of pesticide recoveries in 5,132 laboratory reagent spikes; 1,234 stream-water matrix spikes; and 863 groundwater matrix spikes. A 10-percent smoothing window was selected to show broad, 6- to 12-month time scale changes in recovery for most of the 52 pesticides. Temporal patterns in recovery were similar (in phase) for laboratory reagent spikes and for matrix spikes for most pesticides. In-phase temporal changes among spike types support the hypothesis that temporal change in method performance is the primary cause of temporal change in recovery. Although temporal patterns of recovery were in phase for most pesticides, recovery in matrix spikes was greater
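
    The 10-percent lowess smoothing described above can be approximated with a minimal locally weighted linear smoother. This is a simplified sketch on synthetic data, not the USGS code, and it omits the robustness iterations of a full lowess.

```python
import numpy as np

def lowess_smooth(x, y, frac=0.10):
    """Minimal locally weighted linear smoother with tricube weights,
    a simplified stand-in for a robust lowess. `frac` is the smoothing
    window; 10 percent, as in the study."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                        # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3    # tricube weights
        sw = np.sqrt(w)                                # weighted least squares
        A = np.vstack([np.ones(k), x[idx]]).T
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

# Synthetic "recovery" series: a slow downward drift plus analytical noise.
t = np.arange(200.0)
recovery = 100 - 0.05 * t + np.random.default_rng(1).normal(0, 3, 200)
smooth = lowess_smooth(t, recovery, frac=0.10)
```

    The smoothed curve tracks the drift while suppressing the sample-to-sample noise, which is what makes broad 6- to 12-month changes in recovery visible.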

  5. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    ERIC Educational Resources Information Center

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  6. Development of Bone Targeting Drugs.

    PubMed

    Stapleton, Molly; Sawamoto, Kazuki; Alméciga-Díaz, Carlos J; Mackenzie, William G; Mason, Robert W; Orii, Tadao; Tomatsu, Shunji

    2017-06-23

    The skeletal system, comprising bones, ligaments, cartilage and their connective tissues, is critical for the structure and support of the body. Diseases that affect the skeletal system can be difficult to treat, mainly because of the avascular cartilage region. Targeting drugs to the site of action can not only increase efficacy but also reduce toxicity. Bone-targeting drugs are designed with either of two general targeting moieties, aimed at the entire skeletal system or a specific cell type. Most bone-targeting drugs utilize an affinity to hydroxyapatite, a major component of the bone matrix that includes a high concentration of positively-charged Ca2+. The strategies for designing such targeting moieties can involve synthetic and/or biological components including negatively-charged amino acid peptides or bisphosphonates. Efficient delivery of bone-specific drugs provides significant impact in the treatment of skeletal related disorders including infectious diseases (osteoarthritis, osteomyelitis, etc.), osteoporosis, and metabolic skeletal dysplasia. Despite recent advances, however, both delivering the drug to its target without losing activity and avoiding adverse local effects remain a challenge. In this review, we investigate the current development of bone-targeting moieties, their efficacy and limitations, and discuss future directions for the development of these specific targeted treatments.

  7. Development of Bone Targeting Drugs

    PubMed Central

    Stapleton, Molly; Sawamoto, Kazuki; Alméciga-Díaz, Carlos J.; Mackenzie, William G.; Mason, Robert W.; Orii, Tadao; Tomatsu, Shunji

    2017-01-01

    The skeletal system, comprising bones, ligaments, cartilage and their connective tissues, is critical for the structure and support of the body. Diseases that affect the skeletal system can be difficult to treat, mainly because of the avascular cartilage region. Targeting drugs to the site of action can not only increase efficacy but also reduce toxicity. Bone-targeting drugs are designed with either of two general targeting moieties, aimed at the entire skeletal system or a specific cell type. Most bone-targeting drugs utilize an affinity to hydroxyapatite, a major component of the bone matrix that includes a high concentration of positively-charged Ca2+. The strategies for designing such targeting moieties can involve synthetic and/or biological components including negatively-charged amino acid peptides or bisphosphonates. Efficient delivery of bone-specific drugs provides significant impact in the treatment of skeletal related disorders including infectious diseases (osteoarthritis, osteomyelitis, etc.), osteoporosis, and metabolic skeletal dysplasia. Despite recent advances, however, both delivering the drug to its target without losing activity and avoiding adverse local effects remain a challenge. In this review, we investigate the current development of bone-targeting moieties, their efficacy and limitations, and discuss future directions for the development of these specific targeted treatments. PMID:28644392

  8. Development and validation of a multiplex real-time PCR method to simultaneously detect 47 targets for the identification of genetically modified organisms.

    PubMed

    Cottenet, Geoffrey; Blancpain, Carine; Sonnard, Véronique; Chuah, Poh Fong

    2013-08-01

    Considering the increase in the total cultivated land area dedicated to genetically modified organisms (GMO), consumers' perception of GMO and the need to comply with various local GMO legislations, efficient and accurate analytical methods are needed for their detection and identification. Considered the gold standard for GMO analysis, the real-time polymerase chain reaction (RTi-PCR) technology was optimised to produce a high-throughput GMO screening method. Based on 24 simultaneous multiplex RTi-PCR reactions running on a ready-to-use 384-well plate, this new procedure allows the detection and identification of 47 targets in seven samples in duplicate. To comply with GMO analytical quality requirements, a negative and a positive control were analysed in parallel. In addition, an internal positive control was also included in each reaction well for the detection of potential PCR inhibition. Tested on non-GM materials, on different GM events and on proficiency test samples, the method offered high specificity and sensitivity, with an absolute limit of detection between 1 and 16 copies depending on the target. Easy to use, fast and cost-efficient, this multiplex approach fits the purpose of GMO testing laboratories.

  9. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists who collaboratively design and compose data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  10. The Analytical Chemistry of Drug Monitoring in Athletes

    NASA Astrophysics Data System (ADS)

    Bowers, Larry D.

    2009-07-01

    The detection and deterrence of the abuse of performance-enhancing drugs in sport are important to maintaining a level playing field among athletes and to decreasing the risk to athletes’ health. The World Anti-Doping Program consists of six documents, three of which play a role in analytical development: The World Anti-Doping Code, The List of Prohibited Substances and Methods, and The International Standard for Laboratories. Among the classes of prohibited substances, three have given rise to the most recent analytical developments in the field: anabolic agents; peptide and protein hormones; and methods to increase oxygen delivery to the tissues, including recombinant erythropoietin. Methods for anabolic agents, including designer steroids, have been enhanced through the use of liquid chromatography/tandem mass spectrometry and gas chromatography/combustion/isotope-ratio mass spectrometry. Protein and peptide identification and quantification have benefited from advances in liquid chromatography/tandem mass spectrometry. Incorporation of techniques such as flow cytometry and isoelectric focusing have supported the detection of blood doping.

  11. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  12. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption kinetics rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling microliter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model of the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive, and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.
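
    A minimal sketch of the well-mixed binding kinetics underlying such a model: simple 1:1 reversible binding integrated by Euler stepping. The rate constants and concentrations are assumed for illustration and are not taken from the paper.

```python
# Well-mixed 1:1 binding: d[AB]/dt = kon*[A][B] - koff*[AB], Euler-integrated.
# All rate constants and concentrations below are illustrative assumptions.
kon, koff = 1e5, 1e-3      # association (1/(M*s)) and dissociation (1/s) rates
A0, B0 = 1e-8, 1e-9        # total analyte and capture-probe concentrations (M)
dt, steps = 0.05, 120_000  # 6000 s of simulated time, well past equilibrium
ab = 0.0                   # bound-complex concentration (M)
for _ in range(steps):
    ab += dt * (kon * (A0 - ab) * (B0 - ab) - koff * ab)

occupancy = ab / B0        # fraction of capture probe occupied (~0.49 here,
                           # since A0 equals the dissociation constant koff/kon)
```

    In a diffusion-limited assay the effective on-rate would be far lower; active mixing pushes the system toward these intrinsic reaction-limited kinetics, which is the effect the device exploits.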

  13. Nuclear Security: Target Analysis-rev

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surinder Paul; Gibbs, Philip W.; Bultz, Garl A.

    2014-03-01

    The objectives of this presentation are to understand target identification, including roll-up and protracted theft; evaluate target identification in the SNRI; recognize the target characteristics and consequence levels; and understand graded safeguards.

  14. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  15. AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent

    Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content, including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy, and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that no suitable AmO2 reference materials are currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.

  16. Design challenges in nanoparticle-based platforms: Implications for targeted drug delivery systems

    NASA Astrophysics Data System (ADS)

    Mullen, Douglas Gurnett

    Characterization and control of heterogeneous distributions of nanoparticle-ligand components are major design challenges for nanoparticle-based platforms. This dissertation begins with an examination of a poly(amidoamine) (PAMAM) dendrimer-based targeted delivery platform. A folic acid-targeted modular platform was developed to target human epithelial cancer cells. Although active targeting was observed in vitro, active targeting was not found in vivo using a mouse tumor model. A major flaw of this platform design was that it did not provide for characterization or control of the component distribution. Motivated by the problems experienced with the modular design, the actual composition of nanoparticle-ligand distributions was examined using a model dendrimer-ligand system. High-pressure liquid chromatography (HPLC) resolved the distribution of components in samples with mean ligand/dendrimer ratios ranging from 0.4 to 13. A peak-fitting analysis enabled the quantification of the component distribution. Quantified distributions were found to be significantly more heterogeneous than commonly expected, and standard analytical parameters, namely the mean ligand/nanoparticle ratio, failed to adequately represent the component heterogeneity. The distribution of components was also found to be sensitive to particle modifications that preceded the ligand conjugation. With the knowledge gained from this detailed distribution analysis, a new platform design was developed to provide a system with dramatically improved control over the number of components and with improved batch reproducibility. Using semi-preparative HPLC, individual dendrimer-ligand components were isolated. The isolated dendrimers with precise numbers of ligands were characterized by NMR and analytical HPLC. In total, nine different dendrimer-ligand components were obtained with degrees of purity ≥80%. This system has the potential to serve as a platform to which a precise number of functional molecules
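
    The gap between a mean ligand/nanoparticle ratio and the underlying heterogeneity can be illustrated with a Poisson model; this is an assumption for illustration (the dissertation's measured distributions were broader still), not the author's analysis.

```python
from math import exp, factorial

def poisson_pmf(k: int, mean: float) -> float:
    """Probability of exactly k ligands on a particle under Poisson statistics."""
    return mean ** k * exp(-mean) / factorial(k)

# Even under this idealized model, a batch with a mean of 5 ligands/dendrimer
# has only ~17.5% of particles carrying exactly 5 ligands, and a small but
# nonzero fraction carrying none at all.
frac_exact = poisson_pmf(5, 5.0)   # ≈ 0.175
frac_zero = poisson_pmf(0, 5.0)    # ≈ 0.007
```

    This is why the mean ratio alone, as the dissertation argues, fails to represent the component distribution.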

  17. "Analytic continuation" of N = 2 minimal model

    NASA Astrophysics Data System (ADS)

    Sugawara, Yuji

    2014-04-01

    In this paper we discuss what theory should be identified as the "analytic continuation", under N → −N, of the N = 2 minimal model with central charge ĉ = 1 − 2/N. We clarify how the elliptic genus of the expected model is written in terms of holomorphic linear combinations of the "modular completions" introduced in [T. Eguchi and Y. Sugawara, JHEP 1103, 107 (2011)] in the SL(2)_{N+2}/U(1) supercoset theory. We further discuss how this model could be interpreted as a kind of model of the SL(2)_{N+2}/U(1) supercoset in the (R̃, R̃) sector, in which only the discrete spectrum appears in the torus partition function and the potential IR divergence due to the non-compactness of the target space is removed. We also briefly discuss possible definitions of the sectors with other spin structures.

  18. Laser-induced extreme magnetic field in nanorod targets

    NASA Astrophysics Data System (ADS)

    Lécz, Zsolt; Andreev, Alexander

    2018-03-01

    The application of nano-structured target surfaces in laser-solid interaction has attracted significant attention in the last few years. Their ability to absorb significantly more laser energy promises a possible route for advancing the currently established laser ion acceleration concepts. However, it is crucial to have a better understanding of field evolution and electron dynamics during laser-matter interactions before employing such exotic targets. This paper focuses on magnetic field generation in nano-forest targets consisting of parallel nanorods grown on plane surfaces. A general scaling law for the amplitude of the self-generated quasi-static magnetic field is given, and it is shown that fields of up to 1 MT are achievable with current technology. The analytical results are supported by three-dimensional particle-in-cell simulations. Non-parallel arrangements of nanorods have also been considered, which result in the generation of donut-shaped azimuthal magnetic fields in a larger volume.

  19. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    PubMed

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by a lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results from the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was calculated from their diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring Hb values with a single analytical method. VOI was 8.22, 7.1 and 9.44 mm³ for probes that collected a target of 10 mg of feces, and 3.08 mm³ for one probe that targeted 2 mg of feces. The ratio between recovered and target amounts ranged from 56% to 121%. Different changes in the measured Hb values were observed when increasing amounts of feces were added to commercial buffers. The amount of material collected is determined by the design of the probe. Three out of four manufacturers declare the same target amount while using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step toward fecal test harmonization and fulfillment of the ISO 15189 requirements.
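
    The geometric VOI calculation described above can be sketched for a hypothetical cylindrical probe groove. The dimensions and the stool density below are assumptions chosen for illustration (to land near one reported VOI), not values from the study.

```python
import math

def cylinder_voi_mm3(diameter_mm: float, length_mm: float) -> float:
    """Volume of interest of a cylindrical sampling groove, in mm^3."""
    return math.pi * (diameter_mm / 2.0) ** 2 * length_mm

# Hypothetical groove dimensions near one reported VOI (8.22 mm^3); the
# stool density of 1.06 mg/mm^3 is likewise an assumed round value.
voi = cylinder_voi_mm3(1.5, 4.65)            # ≈ 8.2 mm^3
collected_mg = voi * 1.06                    # mass of feces actually collected
recovery_pct = 100.0 * collected_mg / 10.0   # vs. the declared 10 mg target
```

    Under these assumptions the probe collects roughly 87% of its declared 10 mg target, illustrating how geometry alone drives the 56%-121% recovered/target ratios the study reports.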

  20. Integrin Targeted MR Imaging

    PubMed Central

    Tan, Mingqian; Lu, Zheng-Rong

    2011-01-01

    Magnetic resonance imaging (MRI) is a powerful medical diagnostic imaging modality for integrin-targeted imaging, which uses the magnetic resonance of tissue water protons to display tissue anatomic structures with high spatial resolution. Contrast agents are often used in MRI to highlight specific regions of the body and make them easier to visualize. There are four main classes of MRI contrast agents based on their different contrast mechanisms: T1, T2, chemical exchange saturation transfer (CEST) agents, and heteronuclear contrast agents. Integrins are an important family of heterodimeric transmembrane glycoproteins that function as mediators of cell-cell and cell-extracellular matrix interactions. Overexpressed integrins can be used as molecular targets for designing suitable integrin-targeted contrast agents for MR molecular imaging. An integrin-targeted contrast agent includes a targeting agent specific to a target integrin, a paramagnetic agent, and a linker connecting the targeting agent with the paramagnetic agent. Proper selection of targeting agents is critical for targeted MRI contrast agents to bind effectively to integrins for in vivo imaging. An ideal integrin-targeted MR contrast agent should be non-toxic, provide strong contrast enhancement at the target sites, and be completely excreted from the body after MR imaging. An overview of integrin-targeted MR contrast agents based on small-molecular and macromolecular Gd(III) complexes, lipid nanoparticles, and superparamagnetic nanoparticles is provided for MR molecular imaging. By using proper delivery systems for loading sufficient Gd(III) chelates or superparamagnetic nanoparticles, effective molecular imaging of integrins with MRI has been demonstrated in animal models. PMID:21547154