Sample records for developing analytical approaches

  1. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  2. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  3. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  4. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  5. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
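
    As a rough illustration of the standard-addition methodology introduced above, the sketch below fits a spiked-concentration series and recovers the endogenous level from the x-intercept of the fitted line; the analyte name and all numbers are invented for illustration, not taken from the paper.

    ```python
    # Standard addition: spike known analyte amounts into the biological
    # matrix, fit response vs. spiked concentration, and recover the
    # endogenous concentration from the x-intercept of the fitted line.
    import numpy as np

    # Illustrative data: spiked valine concentrations (uM) and LC-MS/MS
    # peak-area ratios (analyte / stable-isotope-labeled internal standard).
    spiked = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
    response = np.array([0.82, 1.24, 1.66, 2.49, 4.17])

    slope, intercept = np.polyfit(spiked, response, 1)

    # Endogenous concentration = |x-intercept| = intercept / slope.
    endogenous = intercept / slope
    print(f"endogenous concentration ~ {endogenous:.1f} uM")

    # Parallelism check (per the abstract): the slope in biological matrix
    # should match the slope of a surrogate-matrix calibration line.
    ```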

  6. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
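
    For readers unfamiliar with AHP mechanics, the sketch below shows the pairwise-comparison step the abstract refers to: criteria weights derived from the principal eigenvector of a comparison matrix, plus Saaty's consistency check. The matrix values are hypothetical, not the study's inspection criteria.

    ```python
    # AHP: derive criteria weights from a pairwise-comparison matrix via the
    # principal eigenvector, and check judgement consistency (CR < 0.1).
    import numpy as np

    # Hypothetical 3-criterion comparison matrix on Saaty's 1-9 scale
    # (A[i, j] = how much more important criterion i is than criterion j).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                # principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                   # normalized criteria weights

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)       # consistency index
    cr = ci / 0.58                             # random index RI = 0.58 for n = 3
    print("weights:", weights.round(3), "CR:", round(cr, 3))
    ```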

  7. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level]

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that the pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  8. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
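
    The information-theoretic side of such a comparison can be illustrated with a small entropy calculation; the sketch below computes the information gain a decision-tree learner would use to rank a bag-of-features indicator, with toy data rather than anything from the study.

    ```python
    # Information gain of a binary feature for a class label: the quantity a
    # machine-learning approach would use to pick decision-tree splits.
    import numpy as np

    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature, labels):
        total = entropy(labels)
        for value in np.unique(feature):
            mask = feature == value
            total -= mask.mean() * entropy(labels[mask])
        return total

    # Toy "bag of features" presence indicator vs. class label.
    feature = np.array([1, 1, 1, 0, 0, 0, 1, 0])
    labels  = np.array(['a', 'a', 'a', 'b', 'b', 'b', 'b', 'a'])
    print(f"information gain = {information_gain(feature, labels):.3f} bits")
    ```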

  9. Development of An Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used by Non-EPA Decision Makers (Final Contractor Report)

    EPA Science Inventory

    EPA announced the availability of the final contractor report entitled, Development of an Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used By Non EPA Decision Makers. This contractor report analyzed how ...

  10. An analytical approach to γ-ray self-shielding effects for radioactive bodies encountered in nuclear decommissioning scenarios.

    PubMed

    Gamage, K A A; Joyce, M J

    2011-10-01

    A novel analytical approach is described that accounts for self-shielding of γ radiation in decommissioning scenarios. The approach is developed with plutonium-239, cobalt-60 and caesium-137 as examples; stainless steel and concrete have been chosen as the media for cobalt-60 and caesium-137, respectively. The analytical methods have been compared with MCNPX 2.6.0 simulations. A simple, linear correction factor relates the analytical results and the simulated estimates. This has the potential to greatly simplify the estimation of self-shielding effects in decommissioning activities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption-kinetics rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling microliter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Value of Flexibility - Phase 1

    DTIC Science & Technology

    2010-09-25

    weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to...research activities is in developing a coherent value based definition of flexibility that is based on an analytical framework that is mathematically

  13. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  14. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including the various techniques, detection systems, and automation tools that are available for effective separation, enhanced selectivity and sensitivity in the quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations: simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided, covering separation conditions, validation aspects and applicable conclusions. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues in mass detection, matrix effects, chiral aspects, etc., are provided for consideration during method development.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binotti, M.; Zhu, G.; Gray, A.

    An analytical approach, as an extension of a newly developed method -- First-principle OPTical Intercept Calculation (FirstOPTIC) -- is proposed to treat the geometrical impact of three-dimensional (3-D) effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the suite of FirstOPTIC code. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical approach to treating 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.

  16. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    ERIC Educational Resources Information Center

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  17. Observability during planetary approach navigation

    NASA Technical Reports Server (NTRS)

    Bishop, Robert H.; Burkhart, P. Daniel; Thurman, Sam W.

    1993-01-01

    The objective of the research is to develop an analytic technique to predict the relative navigation capability of different Earth-based radio navigation measurements. In particular, the problem is to determine the relative ability of geocentric range and Doppler measurements to detect the effects of the target planet gravitational attraction on the spacecraft during the planetary approach and near-encounter mission phases. A complete solution to the two-dimensional problem has been developed. Relatively simple analytic formulas are obtained for range and Doppler measurements which describe the observability content of the measurement data along the approach trajectories. An observability measure is defined which is based on the observability matrix for nonlinear systems. The results show good agreement between the analytic observability analysis and the computational batch processing method.
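
    A minimal sketch of an observability measure of the kind defined above, here for a generic linear(ized) system rather than the actual approach-navigation dynamics: build the observability matrix from (A, C) and report its rank together with its smallest singular value as a scalar measure.

    ```python
    # Observability measure for a linearized system x' = A x, y = C x:
    # build O = [C; CA; CA^2; ...] and use its smallest singular value as a
    # scalar measure of how observable the state is from the measurements.
    import numpy as np

    def observability_measure(A, C):
        n = A.shape[0]
        blocks = [C]
        for _ in range(n - 1):
            blocks.append(blocks[-1] @ A)
        O = np.vstack(blocks)
        return np.linalg.matrix_rank(O), np.linalg.svd(O, compute_uv=False).min()

    # Hypothetical 2-state example comparing two single-output sensors.
    A = np.array([[0.0, 1.0],
                  [-1.0, 0.0]])
    rank_r, meas_r = observability_measure(A, np.array([[1.0, 0.0]]))  # "range"
    rank_d, meas_d = observability_measure(A, np.array([[0.0, 1.0]]))  # "Doppler"
    print(rank_r, meas_r, rank_d, meas_d)
    ```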

  18. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
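
    One plausible realization of the notebook-to-APP practice described above uses ipywidgets inside a Jupyter Notebook; the pipeline step below is a hypothetical stand-in, not code from the paper, and requires a running notebook with ipywidgets installed.

    ```python
    # Wrapping an analytics pipeline step in interactive widgets turns a
    # Jupyter Notebook cell into a small point-and-click APP.
    import pandas as pd
    from ipywidgets import interact, IntSlider

    df = pd.DataFrame({"age": [34, 51, 29, 63, 47],
                       "los_days": [2, 7, 1, 12, 5]})   # toy healthcare data

    def summarize(min_age=30):
        """Hypothetical pipeline step: cohort filter + summary statistics."""
        cohort = df[df["age"] >= min_age]
        return cohort["los_days"].describe()

    # Renders a slider; re-runs the analysis on every change.
    interact(summarize, min_age=IntSlider(min=20, max=70, step=5, value=30))
    ```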

  19. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
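
    For contrast, the conventional product-term moderation test that the authors move beyond looks like the following in a regression framework; the data are simulated and the variable names arbitrary.

    ```python
    # The standard moderation test the paper critiques: add a predictor-by-
    # moderator product term and test the significance of its coefficient.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({"x": rng.normal(size=n), "m": rng.normal(size=n)})
    # Simulate an effect of x on y whose strength depends on the moderator m.
    df["y"] = 0.5 * df["x"] + 0.3 * df["m"] + 0.4 * df["x"] * df["m"] \
              + rng.normal(size=n)

    fit = smf.ols("y ~ x * m", data=df).fit()      # x * m expands to x + m + x:m
    print(fit.params["x:m"], fit.pvalues["x:m"])   # significance of interaction
    ```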

  20. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  1. Determination of aerodynamic sensitivity coefficients in the transonic and supersonic regimes

    NASA Technical Reports Server (NTRS)

    Elbanna, Hesham M.; Carlson, Leland A.

    1989-01-01

    The quasi-analytical approach is developed to compute airfoil aerodynamic sensitivity coefficients in the transonic and supersonic flight regimes. Initial investigation verifies the feasibility of this approach as applied to the transonic small perturbation residual expression. Results are compared to those obtained by the direct (finite difference) approach and both methods are evaluated to determine their computational accuracies and efficiencies. The quasi-analytical approach is shown to be superior and worth further investigation.
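
    The trade-off between analytic sensitivities and the direct finite-difference approach can be seen in miniature on a scalar function; the toy example below is not the transonic small-perturbation residual, only the general pattern of comparing the two.

    ```python
    # Sensitivity coefficients two ways: exact (analytic) differentiation of
    # a response vs. a one-sided finite-difference approximation, whose
    # accuracy depends on the step size.
    import numpy as np

    def response(alpha):               # toy "aerodynamic" response
        return alpha * np.exp(-alpha ** 2)

    def analytic_sensitivity(alpha):   # d(response)/d(alpha), by hand
        return np.exp(-alpha ** 2) * (1 - 2 * alpha ** 2)

    alpha = 0.7
    for h in (1e-1, 1e-3, 1e-6):
        fd = (response(alpha + h) - response(alpha)) / h
        err = abs(fd - analytic_sensitivity(alpha))
        print(f"h={h:g}  finite-difference={fd:+.6f}  abs error={err:.2e}")
    ```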

  2. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as Pubmed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
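
    The partition-and-merge pattern behind such near-linear scaling can be sketched with Python's standard multiprocessing module; this illustrates the concept only and is unrelated to the engine's actual implementation.

    ```python
    # Data-parallel text processing: split a corpus across worker processes
    # and merge per-worker term counts, the pattern behind near-linear
    # scaling on embarrassingly parallel stages.
    from collections import Counter
    from multiprocessing import Pool

    def count_terms(docs):
        c = Counter()
        for doc in docs:
            c.update(doc.lower().split())
        return c

    if __name__ == "__main__":
        corpus = ["Visual analytics of text", "Scalable text engines",
                  "Text processing at scale", "Analytics for analysts"] * 1000
        n_workers = 4
        chunks = [corpus[i::n_workers] for i in range(n_workers)]
        with Pool(n_workers) as pool:
            partial = pool.map(count_terms, chunks)
        totals = sum(partial, Counter())      # merge per-worker counts
        print(totals.most_common(3))
    ```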

  3. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  4. Strategic assay deployment as a method for countering analytical bottlenecks in high throughput process development: case studies in ion exchange chromatography.

    PubMed

    Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel

    2012-01-01

    High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96 well plates can place a considerable strain upon assay capabilities, generating a bottleneck that limits significantly the speed of process characterization. This article proposes an approach designed to counter this challenge; Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which set of techniques is the most appropriate to use and how best to deploy these to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced both the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).

  5. Satellite Orbit Under Influence of a Drag - Analytical Approach

    NASA Astrophysics Data System (ADS)

    Martinović, M. M.; Šegan, S. D.

    2017-12-01

    The report studies some changes in the orbital elements of artificial Earth satellites under the influence of atmospheric drag. In order to develop possibilities of applying the results in many future cases, an analytical interpretation of the orbital element perturbations is given via useful, though very long, expressions. The development is based on the TD88 air density model, recently upgraded with some additional terms. Some expressions and formulae were developed with the computer algebra system Mathematica and tested in some hypothetical cases. The results are in good agreement with the iterative (numerical) approach.

  6. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes.

    PubMed

    Del Piero, Larissa B; Saxbe, Darby E; Margolin, Gayla

    2016-06-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and non-linear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes

    PubMed Central

    Del Piero, Larissa B.; Saxbe, Darby E.; Margolin, Gayla

    2016-01-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and nonlinear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. PMID:27038840

  8. Semi-Analytic Reconstruction of Flux in Finite Volume Formulations

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2006-01-01

    Semi-analytic reconstruction uses the analytic solution to a second-order, steady, ordinary differential equation (ODE) to simultaneously evaluate the convective and diffusive flux at all interfaces of a finite volume formulation. The second-order ODE is itself a linearized approximation to the governing first- and second-order partial differential equation conservation laws. Thus, semi-analytic reconstruction defines a family of formulations for finite volume interface fluxes using analytic solutions to approximating equations. Limiters are not applied in a conventional sense; rather, diffusivity is adjusted in the vicinity of changes in sign of eigenvalues in order to achieve a sufficiently small cell Reynolds number in the analytic formulation across critical points. Several approaches for application of semi-analytic reconstruction for the solution of one-dimensional scalar equations are introduced. Results are compared with exact analytic solutions to Burgers' Equation as well as a conventional, upwind discretization using Roe's method. One approach, the end-point wave speed (EPWS) approximation, is further developed for more complex applications. One-dimensional vector equations are tested on a quasi one-dimensional nozzle application. The EPWS algorithm has a more compact difference stencil than Roe's algorithm, but reconstruction time is approximately a factor of four larger than for Roe. Though both are second-order accurate schemes, Roe's method approaches a grid-converged solution with fewer grid points. Reconstruction of flux in the context of multi-dimensional, vector conservation laws including effects of thermochemical nonequilibrium in the Navier-Stokes equations is developed.
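
    A classical instance of reconstructing interface flux from the analytic solution of a linearized ODE, in the same spirit as the method above (though not the paper's EPWS formulation), is the exponential scheme for steady linear advection-diffusion:

    ```python
    # Interface flux from the exact solution of a u' - nu u'' = 0 on a cell
    # face of width h with end values uL, uR: the flux F = a u - nu u' is
    # constant along the face and equals the expression below. It recovers
    # central differencing as Pe -> 0 and upwinding as Pe grows large.
    import math

    def exponential_flux(uL, uR, a, nu, h):
        Pe = a * h / nu                     # cell Peclet number
        if abs(Pe) < 1e-12:                 # pure diffusion limit
            return -nu * (uR - uL) / h
        return a * (uL - (uR - uL) / math.expm1(Pe))

    # Sanity checks against the two limits.
    print(exponential_flux(1.0, 0.0, a=1.0, nu=0.01, h=1.0))   # -> a*uL (upwind)
    print(exponential_flux(1.0, 0.0, a=1e-8, nu=1.0, h=1.0))   # -> -nu*(uR-uL)/h
    ```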

  9. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
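
    A minimal QSAR-style sketch is shown below, with plain cross-validated linear regression standing in for the Genetic Function Approximation used by the authors; the descriptors and responses are synthetic.

    ```python
    # QSAR sketch: regress sensor response (film resistance change) on
    # molecular descriptors and check predictive ability by cross-validation.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_analytes, n_descriptors = 40, 5
    X = rng.normal(size=(n_analytes, n_descriptors))   # descriptor matrix
    true_w = np.array([0.8, 0.0, -0.5, 0.3, 0.0])      # only some descriptors matter
    y = X @ true_w + 0.1 * rng.normal(size=n_analytes) # simulated dR/R response

    model = LinearRegression()
    q2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"cross-validated R^2 (predictive ability) = {q2:.2f}")
    model.fit(X, y)
    print("descriptor coefficients:", model.coef_.round(2))
    ```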

  10. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indictors that include pavement distress,...

  11. Technology advancement for integrative stem cell analyses.

    PubMed

    Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi

    2014-12-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among individual constituents within a given population. The problems associated with such a blanket approach to analysis and characterization only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing a concept of vertical and horizontal approaches, we propose that adequate methods for the integration of information are needed, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment.

  12. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photo-labile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography and implemented a quality by design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design approach with a curvature as a center point was applied to optimize the analytical conditions for NIF and its degradants. The effects of mobile phase composition (MPC) and flow rate (FR) on the system suitability parameters were determined. The selected condition was validated by cross-validation using a leave-one-out technique. Alteration of the MPC significantly affected retention time. Furthermore, an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the number of theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1–16 µg/mL and showed good linearity, precision, and accuracy, with an efficient analysis time within 10 min.
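
    The 2² full factorial design with a center point can be generated and fitted as in the sketch below; the coded levels and retention times are placeholders, not the study's data.

    ```python
    # 2^2 full factorial design with a center point, then a linear model with
    # an interaction term for a response such as retention time.
    import itertools
    import numpy as np

    # Coded levels: -1/+1 for MPC and FR, plus the center point (0, 0).
    design = np.array(list(itertools.product([-1.0, 1.0], repeat=2)) + [(0.0, 0.0)])
    mpc, fr = design[:, 0], design[:, 1]

    # Hypothetical measured retention times (min) for the five runs.
    rt = np.array([9.8, 6.1, 9.2, 5.7, 7.6])

    # Model rt = b0 + b1*MPC + b2*FR + b12*MPC*FR via least squares.
    Xmat = np.column_stack([np.ones(len(design)), mpc, fr, mpc * fr])
    coef, *_ = np.linalg.lstsq(Xmat, rt, rcond=None)
    b0, b1, b2, b12 = coef
    print(f"b0={b0:.2f}  MPC effect={b1:.2f}  FR effect={b2:.2f}  interaction={b12:.2f}")
    ```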

  13. Method development and qualification of capillary zone electrophoresis for investigation of therapeutic monoclonal antibody quality.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2016-10-01

    Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are very adequate tools to monitor size- and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature. Most of them are performed based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can be a good platform for CZE method development. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. The analytical system stability was represented by the calculated RSD% values of the area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  15. Multi-analytical Approaches Informing the Risk of Sepsis

    NASA Astrophysics Data System (ADS)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

    Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stay. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive, goal-oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be enhanced by multi-modal analytic methods that together provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.
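
    The multi-modal setup described (regression, decision tree, and cluster analysis applied to the same predictors) might be arranged as follows with scikit-learn; the data are simulated, not patient records.

    ```python
    # Three analytic views of the same sepsis-risk dataset: logistic
    # regression coefficients, decision-tree importances, and cluster
    # structure, compared for the variables each approach highlights.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(7)
    n = 500
    X = rng.normal(size=(n, 3))          # e.g. temperature, WBC, lactate (standardized)
    risk = 1.2 * X[:, 0] + 0.8 * X[:, 2] # variables 0 and 2 drive sepsis risk
    y = (risk + rng.normal(size=n) > 0).astype(int)

    logit = LogisticRegression().fit(X, y)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    print("regression coefficients: ", logit.coef_.round(2))
    print("tree feature importances:", tree.feature_importances_.round(2))
    print("sepsis rate per cluster: ", [y[clusters == k].mean().round(2) for k in (0, 1)])
    ```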

  16. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters are required to input for the model of finite element analysis (FEA) during the preliminary design process and optimization, the equivalent method was developed to analyze the mechanical...

  17. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  18. Development of an Analytical Method for Dibutyl Phthalate Determination Using Surrogate Analyte Approach

    PubMed Central

    Farzanehfar, Vahid; Faizi, Mehrdad; Naderi, Nima; Kobarfard, Farzad

    2017-01-01

    Dibutyl phthalate (DBP) is a phthalic acid ester and is widely used in polymeric products to make them more flexible. DBP is found in almost every plastic material and is believed to be persistent in the environment. Various analytical methods have been used to measure DBP in different matrices. Considering the ubiquitous nature of DBP, the most important challenges in DBP analyses are the contamination of even analytical-grade organic solvents with this compound and the lack of availability of a true blank matrix to construct the calibration line. The standard addition method or the use of artificial matrices reduces the precision and accuracy of the results. In this study, a surrogate analyte approach based on using a deuterium-labeled analyte (DBP-d4) to construct the calibration line was applied to determine DBP in hexane samples. PMID:28496469

  19. Advances in Assays and Analytical Approaches for Botulinum Toxin Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.

    2010-08-04

    Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.

  20. Optimal guidance law development for an advanced launch system

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J.; Leung, Martin S. K.

    1995-01-01

    The objective of this research effort was to develop a real-time guidance approach for launch vehicle ascent to orbit injection. Various analytical approaches combined with a variety of model-order and model-complexity reductions were investigated. Singular perturbation methods were first attempted and found to be unsatisfactory. A second approach based on regular perturbation analysis was subsequently investigated. It also fails because the aerodynamic effects (ignored in the zero-order solution) are too large to be treated as perturbations. Therefore, the study demonstrates that perturbation methods alone (both regular and singular) are inadequate for use in developing a guidance algorithm for the atmospheric flight phase of a launch vehicle. During a second phase of the research effort, a hybrid analytic/numerical approach was developed and evaluated. The approach combines the numerical method of collocation with the analytical method of regular perturbations. The concept of choosing intelligent interpolating functions is also introduced. Regular perturbation analysis allows the use of a crude representation for the collocation solution, and intelligent interpolating functions further reduce the number of elements without sacrificing approximation accuracy. As a result, the combined method forms a powerful tool for solving real-time optimal control problems. Details of the approach are illustrated in a fourth-order nonlinear example. The hybrid approach is then applied to the launch vehicle problem. The collocation solution is derived from a bilinear tangent steering law, and results in a guidance solution for the entire flight regime that includes both atmospheric and exoatmospheric flight phases.

  1. Integrating DNA strand displacement circuitry to the nonlinear hybridization chain reaction.

    PubMed

    Zhang, Zhuo; Fan, Tsz Wing; Hsing, I-Ming

    2017-02-23

    Programmable and modular attributes of DNA molecules allow one to develop versatile sensing platforms that can be operated isothermally and enzyme-free. In this work, we present an approach to integrate upstream DNA strand displacement circuits that can be turned on by a sequence-specific microRNA analyte with a downstream nonlinear hybridization chain reaction for a cascading hyperbranched nucleic acid assembly. This system provides a two-step amplification strategy for highly sensitive detection of the miRNA analyte, conducive to multiplexed detection. Multiple miRNA analytes were tested with our integrated circuitry using the same downstream signal amplification setting, showing the decoupling of the nonlinear self-assembly from the analyte sequence. Compared with the reported methods, our signal amplification approach provides an additional control module for higher-order DNA self-assembly and could be developed into a promising platform for the detection of critical nucleic-acid-based biomarkers.

  2. On-orbit evaluation of the control system/structural mode interactions on OSO-8

    NASA Technical Reports Server (NTRS)

    Slafer, L. I.

    1980-01-01

    The Orbiting Solar Observatory-8 experienced severe structural mode/control loop interaction problems during the spacecraft development. Extensive analytical studies, using the hybrid coordinate modeling approach, and comprehensive ground testing were carried out in order to achieve the system's precision pointing performance requirements. A recent series of flight tests were conducted with the spacecraft in which a wide bandwidth, high resolution telemetry system was utilized to evaluate the on-orbit flexible dynamics characteristics of the vehicle along with the control system performance. This paper describes the results of these tests, reviewing the basic design problem, analytical approach taken, ground test philosophy, and on-orbit testing. Data from the tests was used to determine the primary mode frequency, damping, and servo coupling dynamics for the on-orbit condition. Additionally, the test results have verified analytically predicted differences between the on-orbit and ground test environments. The test results have led to a validation of both the analytical modeling and servo design techniques used during the development of the control system, and also verified the approach taken to vehicle and servo ground testing.

  3. Mechanics of additively manufactured porous biomaterials based on the rhombicuboctahedron unit cell.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-01-01

    Thanks to recent developments in additive manufacturing techniques, it is now possible to fabricate porous biomaterials with arbitrarily complex micro-architectures. Micro-architectures of such biomaterials determine their physical and biological properties, meaning that one could potentially improve the performance of such biomaterials through rational design of micro-architecture. The relationship between the micro-architecture of porous biomaterials and their physical and biological properties has therefore received increasing attention recently. In this paper, we studied the mechanical properties of porous biomaterials made from a relatively unexplored unit cell, namely rhombicuboctahedron. We derived analytical relationships that relate the micro-architecture of such porous biomaterials, i.e. the dimensions of the rhombicuboctahedron unit cell, to their elastic modulus, Poisson's ratio, and yield stress. Finite element models were also developed to validate the analytical solutions. Analytical and numerical results were compared with experimental data from one of our recent studies. It was found that analytical solutions and numerical results show a very good agreement particularly for smaller values of apparent density. The elastic moduli predicted by analytical and numerical models were in very good agreement with experimental observations too. While in excellent agreement with each other, analytical and numerical models somewhat over-predicted the yield stress of the porous structures as compared to experimental data. As the ratio of the vertical struts to the inclined struts, α, approaches zero and infinity, the rhombicuboctahedron unit cell respectively approaches the octahedron (or truncated cube) and cube unit cells. For those limits, the analytical solutions presented here were found to approach the analytic solutions obtained for the octahedron, truncated cube, and cube unit cells, meaning that the presented solutions are generalizations of the analytical solutions obtained for several other types of porous biomaterials. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights into big data analytics methods in the context of science within various communities and offer different views of how approaches of correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  5. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions

    NASA Astrophysics Data System (ADS)

    Donahue, William; Newhauser, Wayne D.; Ziegler, James F.

    2016-09-01

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.
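
    The 6-parameter model itself is not given in this record. As a much simpler stand-in that illustrates the same idea of a closed-form range-energy relation, the classic Bragg-Kleeman rule R = alpha * E**p (with the commonly quoted values alpha = 0.0022 cm/MeV**p and p = 1.77 for protons in water) can be coded in a few lines; stopping power then follows by differentiating the range-energy relation:

        # Minimal sketch: Bragg-Kleeman power-law range-energy relation for protons
        # in water, R = alpha * E**p. This is a simpler stand-in for the
        # 6-parameter model described in the abstract, which is not reproduced here.

        ALPHA = 0.0022  # cm / MeV**p, empirical constant for protons in water
        P = 1.77        # dimensionless exponent

        def csda_range_cm(energy_mev: float) -> float:
            """Approximate CSDA range in water for a proton of given energy (MeV)."""
            return ALPHA * energy_mev ** P

        def stopping_power_mev_per_cm(energy_mev: float) -> float:
            """Stopping power from the inverse derivative of the range:
            S(E) = dE/dx = 1 / (dR/dE) = E**(1-p) / (p * alpha)."""
            return energy_mev ** (1.0 - P) / (P * ALPHA)

        if __name__ == "__main__":
            for e in (50.0, 100.0, 200.0):
                print(f"E = {e:6.1f} MeV  R ~ {csda_range_cm(e):6.2f} cm  "
                      f"S ~ {stopping_power_mev_per_cm(e):6.2f} MeV/cm")

    At 100 MeV this gives R of roughly 7.6 cm, close to the tabulated proton range in water, which is why the two-parameter rule is a popular back-of-the-envelope check.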

  6. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions.

    PubMed

    Donahue, William; Newhauser, Wayne D; Ziegler, James F

    2016-09-07

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u(-1) to 450 MeV u(-1) or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.

  7. A Skills Approach to Career Development.

    ERIC Educational Resources Information Center

    Grites, Thomas J.

    1983-01-01

    A counseling approach encourages students' development of job-applicable, career-transferable skills to meet the changing demands of specialization, automation, mobility, urban growth, and industrial trends in the job market. These include writing; speaking; research; and analytical, organizational, leadership, interpersonal, and quantitative…

  8. Grade 8 students' capability of analytical thinking and attitude toward science through teaching and learning about soil and its pollution based on the science, technology and society (STS) approach

    NASA Astrophysics Data System (ADS)

    Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

    This study reported Grade 8 students' analytical thinking and attitude toward science in teaching and learning about soil and its pollution through the science, technology and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. The teaching and learning about soil and its pollution through the STS approach was carried out for 6 weeks. The soil and its pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitude toward science were assessed during their learning by participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students could develop their capability of analytical thinking. They could generate ideas and exhibit characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data, and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated them. The paper discusses the implications of these findings for science teaching and learning through STS in Thailand.

  9. The Development of Verbal and Visual Working Memory Processes: A Latent Variable Approach

    ERIC Educational Resources Information Center

    Koppenol-Gonzalez, Gabriela V.; Bouwmeester, Samantha; Vermunt, Jeroen K.

    2012-01-01

    Working memory (WM) processing in children has been studied with different approaches, focusing on either the organizational structure of WM processing during development (factor analytic) or the influence of different task conditions on WM processing (experimental). The current study combined both approaches, aiming to distinguish verbal and…

  10. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    PubMed

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

    An analytical method is developed for studying the light distribution at the output end of a hollow n-sided polygonal light pipe illuminated by a light source with a Gaussian distribution. Mirror transformation matrices and a special algorithm for removing void virtual images are created to obtain the location and direction vector of each effective virtual image on the entrance plane. The analytical method is verified by Monte Carlo ray tracing, and four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and of the Gaussian light source. The analytical approach will be useful for designing and selecting hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.
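
    The wall geometry below is illustrative and not taken from the paper, but it shows the core virtual-image operation the abstract describes: each mirror wall maps the source point to a virtual image through a reflection matrix. A minimal 2-D sketch:

        import numpy as np

        # Each mirror wall of the light pipe maps a source point to a virtual
        # image by reflection; repeated reflections "unfold" the pipe. The
        # square-pipe geometry below is illustrative only.

        def reflect_point(x, p0, d):
            """Reflect point x across the line through p0 with unit direction d."""
            d = d / np.linalg.norm(d)
            R = 2.0 * np.outer(d, d) - np.eye(2)   # reflection matrix about d
            return p0 + R @ (x - p0)

        # Example: a square pipe of half-width a; first-order virtual images of
        # a source are its reflections in the four walls.
        a = 1.0
        source = np.array([0.2, -0.1])
        walls = [  # (point on wall, wall direction)
            (np.array([ a, 0.0]), np.array([0.0, 1.0])),   # right wall
            (np.array([-a, 0.0]), np.array([0.0, 1.0])),   # left wall
            (np.array([0.0,  a]), np.array([1.0, 0.0])),   # top wall
            (np.array([0.0, -a]), np.array([1.0, 0.0])),   # bottom wall
        ]
        images = [reflect_point(source, p0, d) for p0, d in walls]
        print(np.round(images, 3))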

  11. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying video analytics (VA) in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented, and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house-built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which automatically detects a "Visual Event", and EventBrowser, which displays and allows perusal of the "Visual Details" captured at the "Visual Event". To deal with both open-architecture and closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  12. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  13. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently empirical for the Space Shuttle External Tank (ET) and is handled by simulated service testing of pre-cracked panels.

  14. Invention in Argument.

    ERIC Educational Resources Information Center

    Fahnestock, Jeanne; Secor, Marie

    A genre approach to teaching the argumentative essay in composition classes has been developed. The need for this approach emanated from problems associated with the other methods of teaching persuasive discourse, such as the logical/analytic, content/problem solving, and rhetorical/generative approaches. The genre approach depends on the…

  15. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  16. Cognitive-Developmental and Behavior-Analytic Theories: Evolving into Complementarity

    ERIC Educational Resources Information Center

    Overton, Willis F.; Ennis, Michelle D.

    2006-01-01

    Historically, cognitive-developmental and behavior-analytic approaches to the study of human behavior change and development have been presented as incompatible alternative theoretical and methodological perspectives. This presumed incompatibility has been understood as arising from divergent sets of metatheoretical assumptions that take the form…

  17. In-orbit evaluation of the control system/structural mode interactions of the OSO-8 spacecraft

    NASA Technical Reports Server (NTRS)

    Slafer, L. I.

    1979-01-01

    The Orbiting Solar Observatory-8 experienced severe structural mode/control loop interaction problems during the spacecraft development. Extensive analytical studies, using the hybrid coordinate modeling approach, and comprehensive ground testing were carried out in order to achieve the system's precision pointing performance requirements. A recent series of flight tests were conducted with the spacecraft in which a wide bandwidth, high resolution telemetry system was utilized to evaluate the on-orbit flexible dynamics characteristics of the vehicle along with the control system performance. The paper describes the results of these tests, reviewing the basic design problem, analytical approach taken, ground test philosophy, and on-orbit testing. Data from the tests was used to determine the primary mode frequency, damping, and servo coupling dynamics for the on-orbit condition. Additionally, the test results have verified analytically predicted differences between the on-orbit and ground test environments, and have led to a validation of both the analytical modeling and servo design techniques used during the development of the control system.

  18. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education.

    PubMed

    Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-10-06

    Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues; the focus of this study is its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process of transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, alongside the context- or need-driven analytics approach.

  19. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues; the focus of this study is its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process of transforming data into information to support educators' decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, alongside the context- or need-driven analytics approach. PMID:27731840

  20. Using a Systematic Approach to Develop a Chemistry Course Introducing Students to Instrumental Analysis

    ERIC Educational Resources Information Center

    Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher

    2013-01-01

    A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…

  1. A Corpus-Based Approach to Online Materials Development for Writing Research Articles

    ERIC Educational Resources Information Center

    Chang, Ching-Fen; Kuo, Chih-Hua

    2011-01-01

    There has been increasing interest in the possible applications of corpora to both linguistic research and pedagogy. This study takes a corpus-based, genre-analytic approach to discipline-specific materials development. Combining corpus analysis with genre analysis makes it possible to develop teaching materials that are not only authentic but…

  2. Approach to method development and validation in capillary electrophoresis for enantiomeric purity testing of active basic pharmaceutical ingredients.

    PubMed

    Sokoliess, Torsten; Köller, Gerhard

    2005-06-01

    A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.

  3. Harmonization of strategies for the validation of quantitative analytical procedures. A SFSTP proposal--Part I.

    PubMed

    Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L

    2004-11-15

    This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is today widely accepted in all domains of activity where measurements are made. Nevertheless, the simple question of whether or not an analytical procedure is acceptable for a given application remains incompletely resolved in many cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the criteria of validation to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help industrial scientists in charge of drug development to apply those regulatory recommendations. While these first two guides contributed widely to the use and progress of analytical validation, they nevertheless present weaknesses regarding the conclusions of the statistical tests performed and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The decision rule is based on the use of the accuracy profile, uses the notion of total error, and makes it possible to simplify the validation of an analytical procedure while controlling the risk associated with its use. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method, as stated in all regulatory documents.

  4. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    USGS Publications Warehouse

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
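
    A minimal sketch of the conversion-model idea, using synthetic numbers rather than the study's data: fit a regression that maps the partial-congener sum to the full 209-congener sum from paired samples, then apply it to new partial sums. (The study used linear regression; a log-log form is assumed here because concentrations are strictly positive.)

        import numpy as np

        # Synthetic paired data: a "full" 209-congener sum and a partial
        # 119-congener sum covering roughly 93% of the total, with noise.
        rng = np.random.default_rng(42)
        n = 30
        sum209 = rng.lognormal(mean=3.0, sigma=0.8, size=n)
        sum119 = 0.93 * sum209 * rng.normal(1.0, 0.05, size=n)

        # Fit log-log linear model: log(sum209) = b0 + b1 * log(sum119)
        b1, b0 = np.polyfit(np.log(sum119), np.log(sum209), deg=1)

        def convert_to_sum209(partial_sum):
            """Estimate the 209-congener total from a 119-congener sum."""
            return np.exp(b0 + b1 * np.log(partial_sum))

        pred = convert_to_sum209(sum119)
        rpd = 100 * np.abs(pred - sum209) / ((pred + sum209) / 2)
        print(f"mean relative percent difference: {rpd.mean():.1f}%")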

  5. Capabilities for Intercultural Dialogue

    ERIC Educational Resources Information Center

    Crosbie, Veronica

    2014-01-01

    The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…

  6. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. An HPLC-ELSD method for the separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and was then applied to the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis objects with complex compositions.
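
    A minimal sketch of the probability-based design space calculation, with an invented response model and invented acceptance limits (the paper's fitted models are not reproduced in this record): Monte-Carlo sampling of the model's predictive uncertainty gives, for each grid point of CMP settings, the probability that the CMA stays within specification.

        import numpy as np

        rng = np.random.default_rng(0)

        def retention_time(flow, temp, noise):
            # Hypothetical fitted quadratic model for a peak's retention time
            # as a function of two CMPs, plus predictive noise.
            return 12.0 - 4.0 * flow - 0.05 * temp + 1.5 * flow**2 + noise

        flows = np.linspace(0.8, 1.2, 21)    # mL/min (assumed CMP range)
        temps = np.linspace(25.0, 40.0, 16)  # deg C  (assumed CMP range)
        n_sim = 2000
        prob_ok = np.zeros((len(flows), len(temps)))

        for i, f in enumerate(flows):
            for j, t in enumerate(temps):
                noise = rng.normal(0.0, 0.4, n_sim)   # predictive sd (assumed)
                rt = retention_time(f, t, noise)
                prob_ok[i, j] = np.mean((rt >= 6.0) & (rt <= 9.0))  # spec limits

        # Design space = CMP settings where P(CMA in spec) exceeds a threshold
        design_space = prob_ok >= 0.90
        print(f"{design_space.sum()} of {design_space.size} grid points qualify")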

  7. Accurate analytical modeling of junctionless DG-MOSFET by green's function approach

    NASA Astrophysics Data System (ADS)

    Nandi, Ashutosh; Pandey, Nilesh

    2017-11-01

    An accurate analytical model of the junctionless double-gate MOSFET (JL-DG-MOSFET) in the subthreshold regime of operation is developed in this work using a Green's function approach. The approach considers 2-D mixed boundary conditions and multi-zone techniques to provide an exact analytical solution to the 2-D Poisson's equation. The Fourier coefficients are calculated correctly to derive the potential equations that are further used to model the channel current and subthreshold slope of the device. The threshold voltage roll-off is computed from parallel shifts of the Ids-Vgs curves between the long-channel and short-channel devices. It is observed that the Green's function approach of solving the 2-D Poisson's equation in both the oxide and silicon regions can accurately predict the channel potential, subthreshold current (Isub), threshold voltage (Vt) roll-off, and subthreshold slope (SS) of both long- and short-channel devices designed with different doping concentrations and with higher as well as lower tsi/tox ratios. All the analytical model results are verified through comparisons with TCAD Sentaurus simulation results. It is observed that the model matches quite well with the TCAD device simulations.
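
    The multi-zone solution itself is not reproduced in this record; as a generic reminder of the form such series solutions take, separation of variables for the 2-D Laplace part of the problem on a channel of thickness t_si gives

        \phi(x, y) \;=\; \sum_{n=1}^{\infty} \left[ A_n \sinh\!\left( \frac{n\pi x}{t_{si}} \right) + B_n \cosh\!\left( \frac{n\pi x}{t_{si}} \right) \right] \sin\!\left( \frac{n\pi y}{t_{si}} \right),

    with the Fourier coefficients A_n and B_n fixed by the mixed boundary conditions at the source, drain, and gate oxides, plus a particular solution of Poisson's equation accounting for the doping term.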

  8. Flexible aircraft dynamic modeling for dynamic analysis and control synthesis

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1989-01-01

    The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.

  9. xQuake: A Modern Approach to Seismic Network Analytics

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation introduces the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices, as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, which is essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach for event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g. using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable seismic community support in the further development of its capabilities.

  10. Analytical and quasi-Bayesian methods as development of the iterative approach for mixed radiation biodosimetry.

    PubMed

    Słonecka, Iwona; Łukasik, Krzysztof; Fornalski, Krzysztof W

    2018-06-04

    The present paper proposes two methods of calculating the components of the dose absorbed by the human body after exposure to a mixed neutron and gamma radiation field. The article presents a novel approach that replaces the common iterative method with an analytical form, thus reducing the calculation time. It also shows the possibility of estimating the neutron and gamma doses when their ratio in a mixed beam is not precisely known.

  11. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level

    PubMed Central

    Savalei, Victoria; Rhemtulla, Mijke

    2017-01-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data—that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study. PMID:29276371

  12. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level.

    PubMed

    Savalei, Victoria; Rhemtulla, Mijke

    2017-08-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data-that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study.
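
    The stage-1 estimation (saturated-model maximum likelihood on the incomplete items) is not reproduced here, but the analytic character of the second stage is easy to illustrate: because scale scores are linear composites of items, their means and covariances follow directly from the item-level estimates, with no imputation. A minimal sketch with invented numbers:

        import numpy as np

        # Assume stage 1 has already produced item-level mean and covariance
        # estimates (e.g. via EM / saturated-model FIML on the incomplete
        # items; not implemented here). Numbers are illustrative.
        p = 6                                   # six items, two 3-item scales
        mu_items = np.array([2.1, 2.3, 1.9, 3.0, 3.2, 2.8])    # stage-1 means
        Sigma_items = 0.3 * np.ones((p, p)) + 0.7 * np.eye(p)  # stage-1 cov

        # Composite matrix: rows define scale scores as sums of items
        A = np.array([
            [1, 1, 1, 0, 0, 0],   # scale 1 = items 1-3
            [0, 0, 0, 1, 1, 1],   # scale 2 = items 4-6
        ], dtype=float)

        mu_scales = A @ mu_items              # E[AX]  = A mu
        Sigma_scales = A @ Sigma_items @ A.T  # Cov[AX] = A Sigma A'

        print("scale means:", mu_scales)
        print("scale covariance:\n", Sigma_scales)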

  13. Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures

    EPA Pesticide Factsheets

    The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.

  14. Random Forests for Evaluating Pedagogy and Informing Personalized Learning

    ERIC Educational Resources Information Center

    Spoon, Kelly; Beemer, Joshua; Whitmer, John C.; Fan, Juanjuan; Frazee, James P.; Stronach, Jeanne; Bohonak, Andrew J.; Levine, Richard A.

    2016-01-01

    Random forests are presented as an analytics foundation for educational data mining tasks. The focus is on course- and program-level analytics including evaluating pedagogical approaches and interventions and identifying and characterizing at-risk students. As part of this development, the concept of individualized treatment effects (ITE) is…

  15. Data analytics approach to create waste generation profiles for waste management and collection.

    PubMed

    Niska, Harri; Serkkola, Ari

    2018-04-30

    Extensive monitoring data on waste generation are increasingly collected in order to implement cost-efficient and sustainable waste management operations. In addition, geospatial data from different registries of society are being opened for free use. Novel data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as the basis for waste management and collection. In this paper, a data-based approach using the self-organizing map (SOM) and the k-means algorithm is developed for creating a set of waste generation type profiles. The approach is demonstrated using extensive container-level waste weighing data collected in the metropolitan area of Helsinki, Finland. The results obtained highlight the potential of advanced data analytics approaches for producing more detailed waste generation information, e.g. as the basis for tailored feedback services for waste producers and for the planning and optimization of waste collection and recycling. Copyright © 2018 Elsevier Ltd. All rights reserved.
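
    A minimal sketch of the SOM-plus-k-means pipeline on synthetic container features (the paper's feature definitions and data are not reproduced here); the MiniSom package is assumed for the SOM step:

        import numpy as np
        from minisom import MiniSom          # pip install minisom
        from sklearn.cluster import KMeans

        # Train a SOM on per-container features, then cluster the SOM codebook
        # vectors with k-means to obtain a small set of generation profiles.
        # The feature matrix is synthetic; real inputs would be derived from
        # container-level weighing data (e.g. mean weekly weight, fill
        # variability, seasonal amplitude).
        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 3))        # 500 containers, 3 features
        X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize

        som = MiniSom(8, 8, X.shape[1], sigma=1.5, learning_rate=0.5,
                      random_seed=1)
        som.train_random(X, 5000)

        codebook = som.get_weights().reshape(-1, X.shape[1])  # 64 prototypes
        profiles = KMeans(n_clusters=5, n_init=10, random_state=1).fit(codebook)

        # Assign each container to a profile via its best-matching SOM unit
        bmu = np.array([np.ravel_multi_index(som.winner(x), (8, 8)) for x in X])
        container_profile = profiles.labels_[bmu]
        print(np.bincount(container_profile))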

  16. Technology Advancement for Integrative Stem Cell Analyses

    PubMed Central

    Jeong, Yoon

    2014-01-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among the individual constituents within a given population. The problems associated with such a blanket approach only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods for the integration of information are needed, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment. PMID:24874188

  17. Technosocial Predictive Analytics in Support of Naturalistic Decision Making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Malone, Elizabeth L.

    2009-06-23

    A main challenge we face in fostering sustainable growth is to anticipate outcomes through predictive and proactive analysis across domains as diverse as energy, security, the environment, health, and finance, in order to maximize opportunities, influence outcomes, and counter adversities. The goal of this paper is to present new methods for anticipatory analytical thinking which address this challenge through the development of a multi-perspective approach to predictive modeling as the core of a creative decision making process. This approach is uniquely multidisciplinary in that it strives to create decision advantage through the integration of human and physical models, and leverages knowledge management and visual analytics to support creative thinking by facilitating the achievement of interoperable knowledge inputs and enhancing the user's cognitive access. We describe a prototype system which implements this approach and exemplify its functionality with reference to a use case in which predictive modeling is paired with analytic gaming to support collaborative decision-making in the domain of agricultural land management.

  18. The Water-Energy-Food Nexus: A systematic review of methods for nexus assessment

    NASA Astrophysics Data System (ADS)

    Albrecht, Tamee R.; Crootof, Arica; Scott, Christopher A.

    2018-04-01

    The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex resource and development challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, while the WEF nexus offers a promising conceptual approach, the use of WEF nexus methods to systematically evaluate water, energy, and food interlinkages or support development of socially and politically-relevant resource policies has been limited. This paper reviews WEF nexus methods to provide a knowledge base of existing approaches and promote further development of analytical methods that align with nexus thinking. The systematic review of 245 journal articles and book chapters reveals that (a) use of specific and reproducible methods for nexus assessment is uncommon (less than one-third); (b) nexus methods frequently fall short of capturing interactions among water, energy, and food—the very linkages they conceptually purport to address; (c) assessments strongly favor quantitative approaches (nearly three-quarters); (d) use of social science methods is limited (approximately one-quarter); and (e) many nexus methods are confined to disciplinary silos—only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. To help overcome these limitations, we derive four key features of nexus analytical tools and methods—innovation, context, collaboration, and implementation—from the literature that reflect WEF nexus thinking. By evaluating existing nexus analytical approaches based on these features, we highlight 18 studies that demonstrate promising advances to guide future research. This paper finds that to address complex resource and development challenges, mixed-methods and transdisciplinary approaches are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and decision-makers.

  19. Optimal design of piezoelectric transformers: a rational approach based on an analytical model and a deterministic global optimization.

    PubMed

    Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand

    2007-07-01

    This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of analytical models, which describe the corresponding optimization problem, with an exact global optimization software package named IBBA, developed by the second author, to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.
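
    IBBA itself is not reproduced in this record; the following toy illustrates the interval branch-and-bound principle on which such exact solvers rest: bound the objective over boxes with interval arithmetic, discard boxes whose lower bound cannot beat the incumbent, and split the rest. A 1-D sketch with a hand-built interval enclosure:

        # Toy interval branch-and-bound for the objective f(x) = x**4 - 3*x**2 + x.

        def interval_bounds(lo, hi):
            """Naive interval enclosure of f over [lo, hi]."""
            def even_pow(lo, hi, k):
                # even powers: minimum is 0 if the interval straddles 0
                cands = [lo**k, hi**k]
                mn = 0.0 if lo <= 0.0 <= hi else min(cands)
                return mn, max(cands)
            x4_lo, x4_hi = even_pow(lo, hi, 4)
            x2_lo, x2_hi = even_pow(lo, hi, 2)
            return x4_lo - 3*x2_hi + lo, x4_hi - 3*x2_lo + hi

        def branch_and_bound(lo, hi, tol=1e-6):
            f = lambda x: x**4 - 3*x**2 + x
            best = min(f(lo), f(hi), f(0.5 * (lo + hi)))   # incumbent
            boxes = [(lo, hi)]
            while boxes:
                a, b = boxes.pop()
                lb, _ = interval_bounds(a, b)
                if lb > best - tol:      # box cannot contain a better minimum
                    continue
                m = 0.5 * (a + b)
                best = min(best, f(m))
                if b - a > tol:
                    boxes += [(a, m), (m, b)]
            return best

        print(branch_and_bound(-2.0, 2.0))   # global minimum, about -3.51

    Unlike stochastic heuristics, pruning by verified bounds is what makes the result a certified global optimum, which is the sense in which the paper's design approach is "exact".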

  20. Major advances in testing of dairy products: milk component and dairy product attribute testing.

    PubMed

    Barbano, D M; Lynch, J M

    2006-04-01

    Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. There is therefore a strong incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance, combined with significant improvements in both the chemical and instrumental methods, have allowed improved analytical performance for payment testing to be achieved. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value-added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for the identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.

  1. A geovisual analytic approach to understanding geo-social relationships in the international trade network.

    PubMed

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly 'balkanized' (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above.
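
    A minimal sketch of the Monte-Carlo validation step, with synthetic coordinates and cluster labels standing in for the trade-network data: compare the observed indicator (here, mean within-cluster geographic distance) against its distribution under random relabeling.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 120
        coords = rng.uniform(0, 100, size=(n, 2))   # country "locations" (synthetic)
        labels = rng.integers(0, 4, size=n)         # network cluster labels (synthetic)

        def mean_within_cluster_distance(coords, labels):
            total, count = 0.0, 0
            for c in np.unique(labels):
                pts = coords[labels == c]
                for i in range(len(pts)):
                    for j in range(i + 1, len(pts)):
                        total += np.linalg.norm(pts[i] - pts[j])
                        count += 1
            return total / count

        observed = mean_within_cluster_distance(coords, labels)
        null = np.array([
            mean_within_cluster_distance(coords, rng.permutation(labels))
            for _ in range(500)
        ])
        p_value = np.mean(null <= observed)   # one-sided: are clusters more compact?
        print(f"observed = {observed:.2f}, p = {p_value:.3f}")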

  2. A Geovisual Analytic Approach to Understanding Geo-Social Relationships in the International Trade Network

    PubMed Central

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M.

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly ‘balkanized’ (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above. PMID:24558409

  3. Clinical reasoning and case-based decision making: the fundamental challenge to veterinary educators.

    PubMed

    May, Stephen A

    2013-01-01

    Confusion about the nature of human reasoning and its appropriate application to patients has hampered veterinary students' development of these skills. Expertise is associated with greater ability to deploy pattern recognition (type 1 reasoning), which is aided by progressive development of data-driven, forward reasoning (in contrast to scientific, backward reasoning), analytical approaches that lead to schema acquisition. The associative nature of type 1 reasoning makes it prone to bias, particularly in the face of "cognitive miserliness," when clues that indicate the need for triangulation with an analytical approach are ignored. However, combined reasoning approaches, from the earliest stages, are more successful than one approach alone, so it is important that those involved in curricular design and delivery promote student understanding of reasoning generally, and the situations in which reasoning goes awry, and develop students' ability to reason safely and accurately whether presented with a familiar case or with a case that they have never seen before.

  4. On finding the analytic dependencies of the external field potential on the control function when optimizing the beam dynamics

    NASA Astrophysics Data System (ADS)

    Ovsyannikov, A. D.; Kozynchenko, S. A.; Kozynchenko, V. A.

    2017-12-01

    When developing a particle accelerator for generating high-precision beams, the design of the injection system is of importance, because it largely determines the output characteristics of the beam. In the present paper we consider injection systems consisting of electrodes with given potentials. The design of such systems requires carrying out simulation of beam dynamics in electrostatic fields. For external field simulation we use a new approach, proposed by A.D. Ovsyannikov, which is based on analytical approximations, or the finite difference method, taking into account the real geometry of the injection system. Software for solving the problems of beam dynamics simulation and optimization in injection systems for non-relativistic beams has been developed. Both beam dynamics and electric field simulations in the injection system, using the analytical approach and the finite difference method, have been performed, and the results are presented in this paper.

  5. Immunoanalysis Methods for the Detection of Dioxins and Related Chemicals

    PubMed Central

    Tian, Wenjing; Xie, Heidi Qunhui; Fu, Hualing; Pei, Xinhui; Zhao, Bin

    2012-01-01

    With the development of biotechnology, approaches based on antibodies, such as the enzyme-linked immunosorbent assay (ELISA), the active aryl hydrocarbon immunoassay (Ah-I), and other multi-analyte immunoassays, have been utilized as alternatives to the conventional techniques based on gas chromatography and mass spectrometry for the analysis of dioxin and dioxin-like compounds in environmental and biological samples. These screening methods have been verified as rapid, simple, and cost-effective. This paper provides an overview of the development and application of antibody-based approaches, such as ELISA, Ah-I, and multi-analyte immunoassays, covering sample extraction and cleanup, antigen design, antibody preparation, and immunoanalysis. However, in order to meet the requirements for on-site fast detection and relative quantification of dioxins in the environment, further optimization is needed to make these immuno-analytical methods more sensitive and easier to use. PMID:23443395

  6. Environmental management strategy: four forces analysis.

    PubMed

    Doyle, Martin W; Von Windheim, Jesko

    2015-01-01

    We develop an analytical approach for more systematically analyzing environmental management problems in order to develop strategic plans. This approach can be deployed by agencies, non-profit organizations, corporations, or other organizations and institutions tasked with improving environmental quality. The analysis relies on assessing the underlying natural processes followed by articulation of the relevant societal forces causing environmental change: (1) science and technology, (2) governance, (3) markets and the economy, and (4) public behavior. The four forces analysis is then used to strategize which types of actions might be most effective at influencing environmental quality. Such strategy has been under-used and under-valued in environmental management outside of the corporate sector, and we suggest that this four forces analysis is a useful analytic to begin developing such strategy.

  7. Environmental Management Strategy: Four Forces Analysis

    NASA Astrophysics Data System (ADS)

    Doyle, Martin W.; Von Windheim, Jesko

    2015-01-01

    We develop an analytical approach for more systematically analyzing environmental management problems in order to develop strategic plans. This approach can be deployed by agencies, non-profit organizations, corporations, or other organizations and institutions tasked with improving environmental quality. The analysis relies on assessing the underlying natural processes followed by articulation of the relevant societal forces causing environmental change: (1) science and technology, (2) governance, (3) markets and the economy, and (4) public behavior. The four forces analysis is then used to strategize which types of actions might be most effective at influencing environmental quality. Such strategy has been under-used and under-valued in environmental management outside of the corporate sector, and we suggest that this four forces analysis is a useful analytic to begin developing such strategy.

  8. Assessment regarding the use of the computer aided analytical models in the calculus of the general strength of a ship hull

    NASA Astrophysics Data System (ADS)

    Hreniuc, V.; Hreniuc, A.; Pescaru, A.

    2017-08-01

    Solving a general strength problem of a ship hull may be done using analytical approaches, which are useful for deducing the buoyancy force distribution and the weight force distribution along the hull, as well as the geometrical characteristics of the sections. These data are used to draw the free-body diagrams and to compute the stresses. General strength problems require a large amount of calculation, so it is of interest how a computer may be used to solve such problems. Using computer programming, an engineer may conceive software instruments based on analytical approaches. However, before developing the computer code, the research topic must be thoroughly analysed; in this way a meta-level of understanding of the problem is reached. The following stage is to conceive an appropriate development strategy for original software instruments that support the rapid development of computer-aided analytical models. The geometrical characteristics of the sections may be computed using a Boolean algebra that operates with 'simple' geometrical shapes, as sketched below. By 'simple' we mean that for the corresponding shapes we have direct calculation relations; the set of 'simple' shapes also includes geometrical entities bounded by curves approximated as spline functions or as polygons. To conclude, computer programming offers the necessary support to solve general strength ship hull problems using analytical methods.
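
    A minimal sketch of the 'simple shapes' idea, restricted to rectangles (stand-ins for the richer shape set, including spline- and polygon-bounded regions, that the abstract mentions): section area, centroid, and second moment follow by summation and the parallel-axis theorem. Dimensions are illustrative.

        def composite_section(shapes):
            """shapes: list of (width, height, z_centroid) rectangles; z vertical.
            Returns total area, centroid height, and second moment about the
            horizontal axis through the composite centroid (parallel-axis
            theorem). Negative-area shapes could represent cut-outs."""
            area = sum(w * h for w, h, _ in shapes)
            z_bar = sum(w * h * z for w, h, z in shapes) / area
            inertia = sum(w * h**3 / 12.0 + w * h * (z - z_bar)**2
                          for w, h, z in shapes)
            return area, z_bar, inertia

        # Example: simplified midship section = bottom plate + two sides + deck
        shapes = [
            (10.0, 0.02, 0.01),   # bottom plating, 10 m wide, 20 mm thick
            (0.02, 6.0, 3.0),     # port side plating, 6 m deep
            (0.02, 6.0, 3.0),     # starboard side plating
            (10.0, 0.02, 5.99),   # deck plating
        ]
        a, zb, izz = composite_section(shapes)
        print(f"A = {a:.3f} m^2, centroid z = {zb:.2f} m, I = {izz:.3f} m^4")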

  9. Two Approaches to Estimating the Effect of Parenting on the Development of Executive Function in Early Childhood

    ERIC Educational Resources Information Center

    Blair, Clancy; Raver, C. Cybele; Berry, Daniel J.

    2014-01-01

    In the current article, we contrast 2 analytical approaches to estimate the relation of parenting to executive function development in a sample of 1,292 children assessed longitudinally between the ages of 36 and 60 months of age. Children were administered a newly developed and validated battery of 6 executive function tasks tapping inhibitory…

  10. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  11. Geometrical enhancement of the electric field: Application of fractional calculus in nanoplasmonics

    NASA Astrophysics Data System (ADS)

    Baskin, E.; Iomin, A.

    2011-12-01

    We developed an analytical approach for wave propagation in metal-dielectric nanostructures in the quasi-static limit. This consideration establishes a link between the fractal geometry of the nanostructure and fractional integro-differentiation. The method is based on fractional calculus and permits one to obtain analytical expressions for the electric-field enhancement.
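
    For reference, the basic operator underlying such treatments is the Riemann-Liouville fractional integral of order \alpha > 0 (a standard definition, not a result specific to this paper); in such treatments the order is tied to the fractal dimension of the composite structure:

        (I^{\alpha} f)(x) \;=\; \frac{1}{\Gamma(\alpha)} \int_{0}^{x} (x - t)^{\alpha - 1} f(t)\, dt, \qquad \alpha > 0.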

  12. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques that preserve these principles in numerous applications. By using only fractions of the solvent and sample required by classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review comprehensively surveys, in two parts, developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, their operational advantages, and their potentials are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market and their ingredients must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their ability to fulfil current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and in the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods are generally based on liquid chromatography with UV detection, and gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Force 2025 and Beyond Strategic Force Design Analytic Model

    DTIC Science & Technology

    2017-01-12

    This report documents the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. The research develops a methodology for evaluating force designs and describes a data development methodology that characterizes the data required to construct a force design model using this approach. Figure 1 depicts the core ideas of the force design model, and Figure 2 gives an overview of the methodology.

  15. Climate Analytics as a Service. Chapter 11

    NASA Technical Reports Server (NTRS)

    Schnase, John L.

    2016-01-01

    Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.

  16. Analytical and numerical treatment of drift-tearing modes in plasma slab

    NASA Astrophysics Data System (ADS)

    Mirnov, V. V.; Hegna, C. C.; Sovinec, C. R.; Howell, E. C.

    2016-10-01

    Two-fluid corrections to linear tearing modes include 1) diamagnetic drifts that reduce the growth rate and 2) electron and ion decoupling on short scales that can lead to fast reconnection. We have recently developed an analytical model that includes effects 1) and 2) together with an important contribution from finite electron parallel thermal conduction. Both tendencies 1) and 2) are confirmed by an approximate analytic dispersion relation derived using a perturbative approach in the small ion-sound gyroradius ρs. This approach is only valid at the beginning of the transition from the collisional to the semi-collisional regime. Further analytical and numerical work is performed to cover the full interval of ρs connecting these two limiting cases. Growth rates are computed from the analytic theory with a shooting method; they match the resistive MHD results at small ρs and the dispersion relations known at asymptotically large ion-sound gyroradius. A comparison between this analytical treatment and linear numerical simulations using the NIMROD code with cold ions and hot electrons in a plasma slab is reported. The material is based on work supported by the U.S. DOE and NSF.
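    The paper's dispersion relation is not reproduced above, so the following is only a generic sketch of the shooting method it mentions, applied to a stand-in eigenvalue problem y'' = (x^2 - E)y with decaying boundary conditions (exact eigenvalues E = 2n + 1):

      # Generic shooting-method sketch (stand-in problem, not the paper's equations).
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      L = 6.0  # domain half-width; the eigenfunction must decay by |x| = L

      def rhs(x, y, E):
          # first-order form of y'' = (x**2 - E) * y
          return [y[1], (x**2 - E) * y[0]]

      def shoot(E):
          # launch the decaying tail solution from x = -L and record y(+L)
          sol = solve_ivp(rhs, [-L, L], [1e-8, 1e-8 * L], args=(E,),
                          rtol=1e-9, atol=1e-12)
          return sol.y[0, -1]  # residual; changes sign across an eigenvalue

      E0 = brentq(shoot, 0.5, 1.5)  # bracket the lowest eigenvalue
      print(E0)                     # ~1.0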

  17. Two Analyte Calibration From The Transient Response Of Potentiometric Sensors Employed With The SIA Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cartas, Raul; Mimendia, Aitor; Valle, Manel del

    2009-05-23

    Calibration models for multi-analyte electronic tongues have commonly been built using a set of sensors, at least one per analyte under study. Complex signals recorded with these systems are formed by the sensors' responses to the analytes of interest plus interferents, from which a multivariate response model is then developed. This work describes a data treatment method for the simultaneous quantification of two species in solution employing the signal from a single sensor. The approach takes advantage of the complex information recorded in one electrode's transient after sample insertion to build the calibration models for both analytes. The signal from the electrode was first processed by discrete wavelet transform, to extract useful information and reduce its length, and then by artificial neural networks to fit a model. Two different potentiometric sensors were used as case studies to corroborate the effectiveness of the approach.
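    As a rough sketch of that two-stage pipeline (wavelet compression, then a neural-network fit), with entirely synthetic transients and hypothetical response shapes standing in for the real sensor data:

      # Wavelet-compression + ANN calibration sketch (synthetic data;
      # PyWavelets and scikit-learn assumed available).
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 256)

      def transient(c1, c2):
          # hypothetical two-analyte transient: two overlapping responses + noise
          return (c1 * (1 - np.exp(-8 * t)) + c2 * (1 - np.exp(-2 * t))
                  + 0.01 * rng.standard_normal(t.size))

      conc = rng.uniform(0.1, 1.0, size=(200, 2))   # training concentrations
      X = np.array([np.concatenate(pywt.wavedec(transient(c1, c2), 'db4', level=4)[:3])
                    for c1, c2 in conc])            # keep the coarse coefficients
      model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
      model.fit(X, conc)                            # one model, two analyte outputs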

  18. Marshall Space Flight Center ECLSS technology activities

    NASA Technical Reports Server (NTRS)

    Wieland, Paul

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) technology activities are presented. Topics covered include: analytical development; ECLSS modeling approach; example of water reclamation modeling needs; and hardware development and testing.

  19. Profiling Approaches to Teaching in Higher Education: A Cluster-Analytic Study

    ERIC Educational Resources Information Center

    Stes, Ann; Van Petegem, Peter

    2014-01-01

    Teaching approaches in higher education have already been the subject of a considerable body of research. An important contribution was Prosser and Trigwell's development of the Approaches to Teaching Inventory (ATI). The present study aims to map out the approaches to teaching profiles of teachers in higher education on the basis of their scores…

  20. The rise of environmental analytical chemistry as an interdisciplinary activity.

    PubMed

    Brown, Richard

    2009-07-01

    Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and the Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions of new data, methods, case studies, and instrumentation, or new interpretations and developments of existing data, case studies, methods, and instrumentation, relating to analytical and/or environmental chemistry, to the Analytical and Environmental Chemistry domains, are welcome and will be considered equally.

  1. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons were developed. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite difference approach; and (5) a graphics package.

  2. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    NASA Technical Reports Server (NTRS)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  3. NAFLA - A Simulation Tool for the Analytical Estimation of Contaminant Plume Lengths

    NASA Astrophysics Data System (ADS)

    Kumar Yadav, Prabhas; Händel, Falk; Müller, Christian; Liedl, Rudolf; Dietrich, Peter

    2013-03-01

    Groundwater pollution with organic contaminants remains a worldwide problem. Before selecting any remediation technique, it is important to pre-assess contaminated sites with respect to their hazard. Several analytical and numerical approaches have been used for this, and for an initial assessment of contaminated sites the MS-Excel tool "NAFLA" was developed. "NAFLA" allows a quick and straightforward calculation and comparison of several analytical approaches for estimating the maximum plume length under steady-state conditions. These approaches differ from each other in source geometry, model domain orientation, and in the consideration of (bio)chemical reactions within the domain. In this communication, we provide details about the development of "NAFLA", its possible usage, and information for users. The tool is especially designed for application in student education and by authorities and consultants.

  4. Effect of primary and secondary parameters on analytical estimation of effective thermal conductivity of two phase materials using unit cell approach

    NASA Astrophysics Data System (ADS)

    S, Chidambara Raja; P, Karthikeyan; Kumaraswamidhas, L. A.; M, Ramu

    2018-05-01

    Most thermal design systems involve two-phase materials, and the analysis of such systems requires a detailed understanding of the thermal characteristics of the two-phase material. This article develops a geometry-dependent unit cell approach model that accounts for the effects of all primary parameters (conductivity ratio and concentration) and secondary parameters (geometry, contact resistance, natural convection, Knudsen effects and radiation) in the estimation of the effective thermal conductivity of two-phase materials. The analytical equations are formulated based on an isotherm approach for 2-D and 3-D spatially periodic media. The developed models are validated against standard models and are suited to all kinds of operating conditions. The results show substantial improvement over existing models and are in good agreement with the experimental data.
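    The article's unit-cell equations are not reproduced above; as a reference point only, a classic baseline that any such model is usually compared against is the Maxwell-Eucken relation for a dilute dispersion:

      # Maxwell-Eucken baseline for effective conductivity (reference model,
      # not the article's unit-cell equations).
      def maxwell_eucken(k_c, k_d, phi):
          """k_c: continuous-phase conductivity, k_d: dispersed-phase
          conductivity, phi: dispersed-phase volume fraction."""
          num = 2 * k_c + k_d + 2 * phi * (k_d - k_c)
          den = 2 * k_c + k_d - phi * (k_d - k_c)
          return k_c * num / den

      print(maxwell_eucken(k_c=0.6, k_d=30.0, phi=0.2))  # e.g. fluid + solid grains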

  5. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    ERIC Educational Resources Information Center

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  6. Image correlation and sampling study

    NASA Technical Reports Server (NTRS)

    Popp, D. J.; Mccormack, D. S.; Sedwick, J. L.

    1972-01-01

    The development of analytical approaches for solving image correlation and image sampling of multispectral data is discussed. Relevant multispectral image statistics which are applicable to image correlation and sampling are identified. The general image statistics include intensity mean, variance, amplitude histogram, power spectral density function, and autocorrelation function. The translation problem associated with digital image registration and the analytical means for comparing commonly used correlation techniques are considered. General expressions for determining the reconstruction error for specific image sampling strategies are developed.
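    The registration problem described here is commonly attacked with normalized cross-correlation; a minimal sketch of the FFT-based translation estimate (synthetic images, integer shifts only) might look like:

      # Translation estimation by FFT cross-correlation (illustrative sketch).
      import numpy as np

      def ncc_shift(ref, img):
          """Return the (dy, dx) to apply to img (via np.roll) to align it with ref."""
          a = ref - ref.mean()
          b = img - img.mean()
          corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
          dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
          # map wrap-around indices to signed shifts
          dy = dy - ref.shape[0] if dy > ref.shape[0] // 2 else dy
          dx = dx - ref.shape[1] if dx > ref.shape[1] // 2 else dx
          return dy, dx

      rng = np.random.default_rng(1)
      ref = rng.random((64, 64))
      img = np.roll(ref, (3, -5), axis=(0, 1))
      print(ncc_shift(ref, img))  # -> (-3, 5)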

  7. Safety and Suitability for Service Assessment Testing for Surface and Underwater Launched Munitions

    DTIC Science & Technology

    2014-12-05

    Considerations of test efficiency tend to associate the Analytical S3 Test Approach with large, complex munition systems and the Empirical S3 Test Approach with smaller, less complex munition systems. The Analytical S3 Test Approach (Figure 3) evaluates the safety margin of the system. The Empirical S3 Test Approach requires more test assets than the Analytical S3 Test Approach to establish the safety margin of the system, and is generally applicable to small munitions.

  8. The role of analytical science in natural resource decision making

    NASA Astrophysics Data System (ADS)

    Miller, Alan

    1993-09-01

    There is a continuing debate about the proper role of analytical (positivist) science in natural resource decision making. Two diametrically opposed views are evident, arguing for and against a more extended role for scientific information. The debate takes on a different complexion if one recognizes that certain kinds of problem, referred to here as “wicked” or “trans-science” problems, may not be amenable to the analytical process. Indeed, the mistaken application of analytical methods to trans-science problems may not only be a waste of time and money but also serve to hinder policy development. Since many environmental issues are trans-science in nature, then it follows that alternatives to analytical science need to be developed. In this article, the issues involved in the debate are clarified by examining the impact of the use of analytical methods in a particular case, the spruce budworm controversy in New Brunswick. The article ends with some suggestions about a “holistic” approach to the problem.

  9. Analysis methods for Kevlar shield response to rotor fragments

    NASA Technical Reports Server (NTRS)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  10. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  11. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    By considering the current regulatory requirements for analytical method development, a reversed phase high performance liquid chromatographic method for routine analysis of etofenamate in dosage form has been optimized using the analytical quality by design approach. Unlike the routine approach, the present study was initiated with an understanding of the quality target product profile, the analytical target profile and a risk assessment for method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μ), a binary pump and a photodiode array detector was used in this work. The experiments were conducted according to a plan based on central composite design, which could save time, reagents and other resources. Sigma Tech software was used to plan and analyse the experimental observations and obtain a quadratic process model. The process model was used to predict the retention time. The predictions from the contour diagram for retention time were verified experimentally and agreed with the actual data. The optimized method used a flow rate of 1.2 ml/min with a mobile phase of methanol and 0.2% triethylamine in water (85:15, % v/v), pH adjusted to 6.5. The method was validated and verified for targeted method performance, robustness and system suitability during method transfer. PMID:26997704
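    The quadratic process model mentioned above is, generically, a second-order response surface fitted to the design points; a minimal sketch with invented coded factor levels and retention times (not the paper's data):

      # Fitting a quadratic response-surface model to central-composite-design
      # data (all numbers invented for illustration).
      import numpy as np

      x1 = np.array([-1, -1, 1, 1, -1.41, 1.41, 0, 0, 0, 0, 0])      # factor 1, coded
      x2 = np.array([-1, 1, -1, 1, 0, 0, -1.41, 1.41, 0, 0, 0])      # factor 2, coded
      rt = np.array([8.2, 7.1, 6.4, 5.9, 8.0, 5.7, 7.8, 6.2, 6.9, 7.0, 6.8])

      A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
      b, *_ = np.linalg.lstsq(A, rt, rcond=None)
      print(b)  # b0, b1, b2, b11, b22, b12 of the quadratic model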

  12. Educational Approaches to Entrepreneurship in Higher Education: A View from the Swedish Horizon

    ERIC Educational Resources Information Center

    Hoppe, Magnus; Westerberg, Mats; Leffler, Eva

    2017-01-01

    Purpose: The purpose of this paper is to present and develop models of educational approaches to entrepreneurship that can provide complementary analytical structures to better study, enact and reflect upon the role of entrepreneurship in higher education. Design/methodology/approach: A general framework for entrepreneurship education is developed…

  13. 75 FR 41173 - Call for Information: Information on Greenhouse Gas Emissions Associated With Bioenergy and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ... oversimplify a complex issue. If this is the case, what alternative approaches or additional analytical tools... Information to solicit information and viewpoints from interested parties on approaches to accounting for... comment on developing an approach for such emissions under the Prevention of Significant Deterioration...

  14. New approaches in GMO detection.

    PubMed

    Querci, Maddalena; Van den Bulcke, Marc; Zel, Jana; Van den Eede, Guy; Broll, Hermann

    2010-03-01

    The steady rate of development and diffusion of genetically modified plants, and the increasing diversification of their characteristics, genes and genetic control elements, pose a challenge for the analysis of genetically modified organisms (GMOs). It is expected that in the near future the picture will be even more complex. Traditional approaches, mostly based on the sequential detection of one target at a time or on limited multiplexing that allows only a few targets to be analysed at once, no longer meet the testing requirements. Along with new analytical technologies, new approaches for the detection of GMOs authorized for commercial purposes in various countries have been developed that rely on (1) a smart and accurate strategy for target selection, (2) the use of high-throughput systems or platforms for the detection of multiple targets and (3) algorithms that allow the conversion of analytical results into an indication of the individual GMOs potentially present in an unknown sample. This paper reviews the latest progress made in GMO analysis, taking examples from the most recently developed strategies and tools, and addresses some of the critical aspects related to these approaches.

  15. A Semi-Analytical Solution to Time Dependent Groundwater Flow Equation Incorporating Stream-Wetland-Aquifer Interactions

    NASA Astrophysics Data System (ADS)

    Boyraz, Uǧur; Melek Kazezyılmaz-Alhan, Cevza

    2017-04-01

    Groundwater is a vital element of the hydrologic cycle, and the analytical and numerical solutions of different forms of the groundwater flow equation play an important role in understanding the hydrological behavior of subsurface water. The interaction between groundwater and surface water bodies can be determined using these solutions. In this study, new hypothetical approaches are applied to a groundwater flow system in order to contribute to the studies on surface water/groundwater interactions. A time-dependent problem is considered in a 2-dimensional stream-wetland-aquifer system. A sloped stream boundary is used to represent the interaction between stream and aquifer; the rest of the aquifer boundaries are treated as no-flux boundaries. In addition, a wetland is considered as a surface water body which lies over the whole aquifer. The effect of the interaction between the wetland and the aquifer is taken into account with a source/sink term in the groundwater flow equation, and the interaction flow is calculated using Darcy's approach. A semi-analytical solution is developed for the 2-dimensional groundwater flow equation in 5 steps. First, Laplace and Fourier cosine transforms are employed to obtain the general solution in the Fourier and Laplace domains. Then, the initial and boundary conditions are applied to obtain the particular solution. Finally, the inverse Fourier transform is carried out analytically and the inverse Laplace transform is carried out numerically to obtain the final solution in the space and time domains. In order to verify the semi-analytical solution, an explicit finite difference algorithm is developed, and the analytical and numerical solutions are compared for synthetic examples. The comparison shows that the analytical solution gives accurate results.
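    The numerical Laplace inversion in the final step is commonly done with the Gaver-Stehfest algorithm; a self-contained sketch on a transform with a known inverse (an assumption here - the paper does not name its inversion scheme):

      # Gaver-Stehfest numerical Laplace inversion (requires F(s) real and smooth).
      import math

      def stehfest(F, t, N=12):
          """Invert the Laplace transform F(s) at time t > 0; N must be even."""
          ln2 = math.log(2.0)
          total = 0.0
          for k in range(1, N + 1):
              V = 0.0  # Stehfest weight V_k
              for j in range((k + 1) // 2, min(k, N // 2) + 1):
                  V += (j ** (N // 2) * math.factorial(2 * j)
                        / (math.factorial(N // 2 - j) * math.factorial(j)
                           * math.factorial(j - 1) * math.factorial(k - j)
                           * math.factorial(2 * j - k)))
              total += (-1) ** (k + N // 2) * V * F(k * ln2 / t)
          return total * ln2 / t

      a, t = 0.5, 2.0
      print(stehfest(lambda s: 1.0 / (s + a), t), math.exp(-a * t))  # should agree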

  16. Hydrocarbon-Fueled Rocket Engine Plume Diagnostics: Analytical Developments and Experimental Results

    NASA Technical Reports Server (NTRS)

    Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; St. Cyr, William W.

    2006-01-01

    A viewgraph presentation describing experimental results and analytical developments about plume diagnostics for hydrocarbon-fueled rocket engines is shown. The topics include: 1) SSC Plume Diagnostics Background; 2) Engine Health Monitoring Approach; 3) Rocket Plume Spectroscopy Simulation Code; 4) Spectral Simulation for 10 Atomic Species and for 11 Diatomic Molecular Electronic Bands; 5) "Best" Lines for Plume Diagnostics for Hydrocarbon-Fueled Rocket Engines; 6) Experimental Set Up for the Methane Thruster Test Program and Experimental Results; and 7) Summary and Recommendations.

  17. BEAMS Lab: Novel approaches to finding a balance between throughput and sensitivity

    NASA Astrophysics Data System (ADS)

    Liberman, Rosa G.; Skipper, Paul L.; Prakash, Chandra; Shaffer, Christopher L.; Flarakos, Jimmy; Tannenbaum, Steven R.

    2007-06-01

    Development of 14C AMS has long pursued the twin goals of maximizing both sensitivity and precision in the interest, among others, of optimizing radiocarbon dating. Application of AMS to biomedical research is less constrained with respect to sensitivity requirements, but more demanding of high throughput. This work presents some technical and conceptual developments in sample processing and analytical instrumentation designed to streamline the process of extracting quantitative data from the various types of samples encountered in analytical biochemistry.

  18. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also commented in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Evaluating revised biomass equations: are some forest types more equivalent than others?

    Treesearch

    Coeli M. Hoover; James E. Smith

    2016-01-01

    Background: In 2014, Chojnacky et al. published a revised set of biomass equations for trees of temperate US forests, expanding on an existing equation set (published in 2003 by Jenkins et al.), both of which were developed from published equations using a meta-analytical approach. Given the similarities in the approach to developing the equations, an examination of...

  20. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using microparticles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly continue to appear. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537

  1. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a way of approaching the likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used instantly; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  2. Analytical surveillance of emerging drugs of abuse and drug formulations

    PubMed Central

    Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan

    2012-01-01

    Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240

  3. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a way of approaching the likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used instantly; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
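    A toy illustration of the network view of adversary paths (node names, detection probabilities, and the use of networkx are all our assumptions, not the authors' implementation): the most critical path is the one minimizing the chance of detection, found by summing -ln(1 - p) edge weights.

      # Most-critical-path sketch on a hypothetical facility graph.
      import math
      import networkx as nx

      G = nx.DiGraph()
      edges = [  # (from, to, probability of detection on that segment)
          ("offsite", "fence", 0.3), ("fence", "door_A", 0.6),
          ("fence", "door_B", 0.4), ("door_A", "vault", 0.8),
          ("door_B", "vault", 0.7),
      ]
      for u, v, p in edges:
          G.add_edge(u, v, w=-math.log(1.0 - p))

      path = nx.shortest_path(G, "offsite", "vault", weight="w")
      p_miss = math.exp(-nx.shortest_path_length(G, "offsite", "vault", weight="w"))
      print(path, "P(detected at least once) =", 1.0 - p_miss)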

  4. Analytical Description of the H/D Exchange Kinetic of Macromolecule.

    PubMed

    Kostyukevich, Yury; Kononikhin, Alexey; Popov, Igor; Nikolaev, Eugene

    2018-04-17

    We present an accurate analytical solution obtained for the system of rate equations describing the isotope exchange process for molecules containing an arbitrary number of equivalent labile atoms. The exact solution was obtained using Mathematica 7.0 software and has the form of a time-dependent Gaussian distribution. For the case when the forward exchange considerably overlaps the back exchange, it is possible to estimate the activation energy of the reaction by obtaining the temperature dependence of the reaction degree. Using a previously developed approach for performing H/D exchange directly in the ESI source, we have estimated the activation energies for ions with different functional groups and found them to be in the range of 0.04-0.3 eV. Since the value of the activation energy depends on the type of functional group, the developed approach can have potential analytical applications for determining the types of functional groups in complex mixtures, such as petroleum, humic substances, bio-oil, and so on.
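    The activation-energy estimate from a temperature series is a standard Arrhenius fit; a sketch with invented rate constants (the paper's data are not reproduced):

      # Arrhenius estimate of activation energy in eV (illustrative numbers).
      import numpy as np

      k_B = 8.617e-5                                 # Boltzmann constant, eV/K
      T = np.array([300.0, 320.0, 340.0, 360.0])     # K
      k = np.array([1.0e2, 1.6e2, 2.5e2, 3.6e2])     # hypothetical rate constants

      slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
      E_a = -slope * k_B                             # ln k = ln A - E_a/(k_B T)
      print(f"E_a = {E_a:.2f} eV")                   # ~0.2 eV, inside the quoted range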

  5. Freeze-thaw approach: A practical sample preparation strategy for residue analysis of multi-class veterinary drugs in chicken muscle.

    PubMed

    Zhang, Meiyu; Li, Erfen; Su, Yijuan; Song, Xuqin; Xie, Jingmeng; Zhang, Yingxia; He, Limin

    2018-06-01

    Seven drugs from different classes, namely fluoroquinolones (enrofloxacin, ciprofloxacin, sarafloxacin), sulfonamides (sulfadimidine, sulfamonomethoxine), and macrolides (tilmicosin, tylosin), were used as test compounds and administered orally to chickens. A simple extraction step after cryogenic freezing allows the effective extraction of multi-class veterinary drug residues from minced chicken muscle by vortex mixing. On the basis of the optimized freeze-thaw approach, a convenient, selective, and reproducible liquid chromatography with tandem mass spectrometry method was developed. At three spiking levels in blank and medicated chicken muscles, the average recoveries of the analytes were in the ranges of 71-106 and 63-119%, respectively. All relative standard deviations were <20%. The limits of quantification of the analytes were 0.2-5.0 ng/g. Regardless of the residue level, there were no significant differences (P > 0.05) in the average contents of almost any of the analytes in medicated chickens between this method and specific literature methods for the determination of individual analytes. Finally, the developed method was successfully extended to the monitoring of residues of 55 common veterinary drugs in food animal muscles. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Empirical and semi-analytical models for predicting peak outflows caused by embankment dam failures

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Chen, Yunliang; Wu, Chao; Peng, Yong; Song, Jiajun; Liu, Wenjun; Liu, Xin

    2018-07-01

    Prediction of the peak discharge of floods has attracted great attention from researchers and engineers. In the present study, nine typical nonlinear mathematical models are established based on a database of 40 historical dam failures. The first eight models, developed with a series of regression analyses, are purely empirical, while the last one is a semi-analytical approach derived from an analytical solution of dam-break floods in a trapezoidal channel. Water depth above breach invert (Hw), volume of water stored above breach invert (Vw), embankment length (El), and average embankment width (Ew) are used as independent variables to develop empirical formulas for estimating the peak outflow from breached embankment dams. The multiple regression analysis indicates that a function using the former two variables (i.e., Hw and Vw) produces considerably more accurate results than one using the latter two (i.e., El and Ew). The semi-analytical approach works best in terms of both prediction accuracy and uncertainty, and the established empirical models produce reasonable results except for the model using El alone. Moreover, the present models have been compared with other models available in the literature for estimating peak discharge.
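    The log-space regression behind such power-law formulas is straightforward to reproduce; a sketch with synthetic failure records (the paper's 40-dam database and fitted coefficients are not reproduced here):

      # Power-law fit Qp = a * Hw**b * Vw**c via least squares in log space.
      import numpy as np

      rng = np.random.default_rng(2)
      Hw = rng.uniform(5, 50, 40)                    # m, head above breach invert
      Vw = rng.uniform(1e5, 1e9, 40)                 # m^3, storage above invert
      Qp = 0.9 * Hw**1.2 * Vw**0.35 * rng.lognormal(0.0, 0.2, 40)  # "observed"

      A = np.column_stack([np.ones(40), np.log(Hw), np.log(Vw)])
      coef, *_ = np.linalg.lstsq(A, np.log(Qp), rcond=None)
      a, b, c = np.exp(coef[0]), coef[1], coef[2]
      print(f"Qp ~ {a:.2f} * Hw^{b:.2f} * Vw^{c:.2f}")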

  7. ANALYTIC MODELING OF STARSHADES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cash, Webster

    2011-09-01

    External occulters, otherwise known as starshades, have been proposed as a solution to one of the highest priority yet technically vexing problems facing astrophysics - the direct imaging and characterization of terrestrial planets around other stars. New apodization functions, developed over the past few years, now enable starshades of just a few tens of meters diameter to occult central stars so efficiently that the orbiting exoplanets can be revealed and other high-contrast imaging challenges addressed. In this paper, an analytic approach to the analysis of these apodization functions is presented. It is used to develop a tolerance analysis suitable for use in designing practical starshades. The results provide a mathematical basis for understanding starshades and a quantitative approach to setting tolerances.

  8. Effect of Vibration on Retention Characteristics of Screen Acquisition Systems

    NASA Technical Reports Server (NTRS)

    Tegart, J. R.; Park, A. C.

    1977-01-01

    An analytical and experimental investigation of the effect of vibration on the retention characteristics of screen acquisition systems was performed. The functioning of surface tension devices using fine-mesh screens requires that the pressure differential acting on the screen be less than its pressure retention capability. When exceeded, screen breakdown will occur and gas-free expulsion of propellant will no longer be possible. An analytical approach to predicting the effect of vibration was developed. This approach considers the transmission of the vibration to the screens of the device and the coupling of the liquid and the screen in establishing the screen response. A method of evaluating the transient response of the gas/liquid interface within the screen was also developed.

  9. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing by reducing the development effort. Manufacturing simulation models are presented both as data analytics applications in themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  10. Study and characterization of a MEMS micromirror device

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2004-08-01

    In this paper, advances in our study and characterization of a MEMS micromirror device are presented. The micromirror device, of 510 μm characteristic length, operates in a dynamic mode with a maximum displacement on the order of 10 μm along its principal optical axis and oscillation frequencies of up to 1.3 kHz. Developments are carried out by analytical, computational, and experimental methods. Analytical and computational nonlinear geometrical models are developed in order to determine the optimal loading-displacement operational characteristics of the micromirror. Due to the operational mode of the micromirror, the experimental characterization of its loading-displacement transfer function requires the utilization of advanced optical metrology methods. The optoelectronic holography (OEH) methodologies based on multiple wavelengths that we are developing to perform such characterization are described. It is shown that the combined analytical, computational, and experimental approach is effective in our developments.

  11. A weight-of-evidence approach to identify nanomaterials in consumer products: a case study of nanoparticles in commercial sunscreens.

    PubMed

    Cuddy, Michael F; Poda, Aimee R; Moser, Robert D; Weiss, Charles A; Cairns, Carolyn; Steevens, Jeffery A

    2016-01-01

    Nanoscale ingredients in commercial products represent a point of emerging environmental concern due to recent findings that correlate toxicity with small particle size. A weight-of-evidence (WOE) approach based upon multiple lines of evidence (LOE) is developed here to assess nanomaterials as they exist in consumer product formulations, providing a qualitative assessment of the presence of nanomaterials, along with a baseline estimate of nanoparticle concentration if nanomaterials do exist. Electron microscopy, analytical separations, and X-ray detection methods were used to identify and characterize nanomaterials in sunscreen formulations. The WOE/LOE approach as applied to four commercial sunscreen products indicated that all four contained at least 10% dispersed primary particles having at least one dimension <100 nm. These analyses confirmed that the constituents were comprised of zinc oxide (ZnO) or titanium dioxide (TiO2). The screening approaches developed herein offer a streamlined, facile means to identify potentially hazardous nanomaterial constituents with minimal abrasive processing of the raw material.

  12. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  13. Single Cell Proteomics in Biomedicine: High-dimensional Data Acquisition, Visualization and Analysis

    PubMed Central

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-01-01

    New insights into cellular heterogeneity in the last decade have provoked the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features and limitations of the analytical methods, along with the designated biological questions they seek to answer, will be discussed. Particular attention will be given to those information theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. PMID:28128880

  14. Theory for the three-dimensional Mercedes-Benz model of water.

    PubMed

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A

    2009-11-21

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.
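    For readers unfamiliar with the simulation side, a bare-bones Metropolis step for plain Lennard-Jones spheres is sketched below; note the simplifications relative to the paper (NVT rather than isothermal-isobaric, and no Gaussian hydrogen-bonding term):

      # Minimal NVT Metropolis Monte Carlo for Lennard-Jones spheres (sketch).
      import numpy as np

      rng = np.random.default_rng(3)
      N, L, T = 30, 6.0, 1.0                         # particles, box edge, k_B*T
      pos = rng.random((N, 3)) * L

      def energy_one(i, pos):
          # LJ energy of particle i with minimum-image periodic boundaries
          d = pos - pos[i]
          d -= L * np.round(d / L)
          r2 = (d * d).sum(axis=1)
          r2[i] = np.inf                             # skip self-interaction
          inv6 = 1.0 / r2**3
          return 4.0 * np.sum(inv6 * inv6 - inv6)

      for step in range(1000):
          i = rng.integers(N)
          old, e_old = pos[i].copy(), energy_one(i, pos)
          pos[i] = (pos[i] + 0.1 * (rng.random(3) - 0.5)) % L
          if rng.random() >= np.exp(min(0.0, -(energy_one(i, pos) - e_old) / T)):
              pos[i] = old                           # reject the trial move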

  15. Theory for the three-dimensional Mercedes-Benz model of water

    PubMed Central

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-01-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the “right answer,” we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim’s Ornstein–Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation. PMID:19929057

  16. Theory for the three-dimensional Mercedes-Benz model of water

    NASA Astrophysics Data System (ADS)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-11-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.

  17. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  18. Analytical challenges in sports drug testing.

    PubMed

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.

  19. Effect of Binding Components in Complex Sample Matrices on Recovery in Direct Immersion Solid-Phase Microextraction: Friends or Foe?

    PubMed

    Alam, Md Nazmul; Pawliszyn, Janusz

    2018-02-20

    The development of matrix-compatible coatings for solid-phase microextraction (SPME) has enabled direct extraction of analytes from complex sample matrices. The direct immersion (DI) mode of SPME, when utilized in conjunction with such extraction phases, facilitates extraction of a wide range of analytes from complex matrices without fouling or coating saturation. In this work, mathematical models and computational simulations were employed to investigate the effect of binding components present in complex samples on the recovery of small molecules of varying logP for extractions carried out using the direct immersion approach. The presented findings corroborate that the studied approach indeed enables the extraction of both polar and nonpolar analytes from complex matrices, provided a suitable sorbent is employed. Further results indicated that, in certain cases, the extraction kinetics of a given analyte in its free form may depend on the desorption kinetics of its bound form from matrix components, which can lower the total recoveries of analytes with high affinity for the matrix. However, the binding of analytes to matrix components also enables SPME to extract a balanced quantity of analytes across a range of logP values, facilitated by multiphase equilibria, with a single extraction device.
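    The equilibrium picture described here can be written down compactly; a sketch of the standard SPME mass balance with simple 1:1 matrix binding (all parameter values hypothetical):

      # Equilibrium SPME uptake with matrix binding (illustrative sketch).
      def spme_uptake(C_total, K_fs, V_f, V_s, K_b, C_bind):
          """Amount extracted at equilibrium. C_total: total analyte conc. (mol/L),
          K_fs: coating/sample partition coefficient, V_f, V_s: coating and sample
          volumes (L), K_b: binding constant (L/mol), C_bind: binder conc. (mol/L)."""
          C_free = C_total / (1.0 + K_b * C_bind)    # free fraction drives uptake
          return K_fs * V_f * C_free * V_s / (K_fs * V_f + V_s)

      print(spme_uptake(C_total=1e-6, K_fs=1e3, V_f=5e-7, V_s=10.0,
                        K_b=1e4, C_bind=1e-5))       # mol extracted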

  20. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs) and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the bench mark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  1. Heat Transfer Analysis of Thermal Protection Structures for Hypersonic Vehicles

    NASA Astrophysics Data System (ADS)

    Zhou, Chen; Wang, Zhijin; Hou, Tianjiao

    2017-11-01

    This research aims to develop an analytical approach to the heat transfer problem of thermal protection systems (TPS) for hypersonic vehicles. The Laplace transform and the integral method are used to describe the temperature distribution through the TPS subject to aerodynamic heating during flight. Time-dependent incident heat flux is also taken into account. Two different cases, with heat-flux and radiation boundary conditions, are studied and discussed. The results are compared with those obtained by finite element analyses and show good agreement. Although temperature profiles for such problems can be readily accessed via numerical simulation, analytical solutions give greater insight into the physical essence of the heat transfer problem. Furthermore, with the analytical approach, rapid thermal analyses and even thermal optimization can be achieved during preliminary TPS design.
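
    As a point of reference, a classical closed form of the kind such Laplace-transform analyses generalize (standard conduction theory, not reproduced from the paper): for a semi-infinite solid initially at T_0 heated by a constant surface flux q_0,

        T(x,t) - T_0 = \frac{q_0}{k}\left[\, 2\sqrt{\frac{\alpha t}{\pi}}\; e^{-x^2/(4\alpha t)} \;-\; x\,\operatorname{erfc}\!\left(\frac{x}{2\sqrt{\alpha t}}\right) \right],

    with thermal diffusivity \alpha = k/(\rho c); the paper extends this style of solution to time-dependent flux and radiation boundary conditions.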

  2. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted considerable interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proved to be powerful, simple, rapid, and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization, and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that account for their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  3. Developing Students' Ideas about Lens Imaging: Teaching Experiments with an Image-Based Approach

    ERIC Educational Resources Information Center

    Grusche, Sascha

    2017-01-01

    Lens imaging is a classic topic in physics education. To guide students from their holistic viewpoint to the scientists' analytic viewpoint, an image-based approach to lens imaging has recently been proposed. To study the effect of the image-based approach on undergraduate students' ideas, teaching experiments are performed and evaluated using…

  4. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

    The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. Several analytical and numerical computer models have been and are being developed to cover this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools that run on very limited computational power, while numerical models can provide more detailed information, with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of the simulated systems.
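
    As an example of the analytical screening-tool class contrasted here with numerical codes, a hedged sketch of a textbook one-dimensional advection-dispersion solution with first-order biodegradation (a van Genuchten-type closed form; all parameter values are illustrative, not from the chapter):

        # Minimal analytical screening model: dc/dt = D c_xx - v c_x - lam*c,
        # with constant inlet concentration c(0,t) = c0 and c(x,0) = 0.
        import numpy as np
        from scipy.special import erfc

        def c_analytical(x, t, v=0.5, D=0.05, lam=0.01, c0=1.0):
            """Concentration c(x,t); v advection (m/d), D dispersion (m2/d),
            lam first-order decay (1/d). Naive form; may overflow for large x."""
            u = v * np.sqrt(1.0 + 4.0 * lam * D / v**2)
            a = np.exp(x * (v - u) / (2.0 * D)) * erfc((x - u * t) / (2.0 * np.sqrt(D * t)))
            b = np.exp(x * (v + u) / (2.0 * D)) * erfc((x + u * t) / (2.0 * np.sqrt(D * t)))
            return 0.5 * c0 * (a + b)

        x = np.linspace(0.01, 10.0, 50)   # m downstream of the source
        print(c_analytical(x, t=30.0))    # concentration profile after 30 days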

  5. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  6. Mixture modeling methods for the assessment of normal and abnormal personality, part II: longitudinal models.

    PubMed

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.

  7. Analytical Approach Validation for the Spin-Stabilized Satellite Attitude

    NASA Technical Reports Server (NTRS)

    Zanardi, Maria Cecilia F. P. S.; Garcia, Roberta Veloso; Kuga, Helio Koiti

    2007-01-01

    An analytical approach to spin-stabilized spacecraft attitude prediction is presented that accounts for the influence of residual magnetic torques on a satellite in an elliptical orbit. Assuming a quadrupole model for the Earth's magnetic field, an analytical averaging method is applied to obtain the mean residual torque over each orbital period. The orbit mean anomaly is used to compute the average components of the residual torque in the spacecraft body frame reference system. The theory is developed for time variations in the orbital elements, giving rise to many curvature integrals. It is observed that the residual magnetic torque has no component along the spin axis. Including this torque in the rotational motion differential equations of a spin-stabilized spacecraft yields conditions for deriving an analytical solution. The solution shows that the residual torque does not affect the spin velocity magnitude, contributing only to the precession and drift of the spacecraft's spin axis. The theory developed has been applied to the Brazilian spin-stabilized satellites, which are quite appropriate for verification and comparison of the theory with the data generated and processed by the Satellite Control Center of the Brazil National Research Institute. The results show the period over which the analytical solution can be used for attitude propagation, within the dispersion range of the attitude determination system performance of the Satellite Control Center of the Brazil National Research Institute.

  8. Stochastic modeling of macrodispersion in unsaturated heterogeneous porous media. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeh, T.C.J.

    1995-02-01

    Spatial heterogeneity of geologic media leads to uncertainty in predicting both flow and transport in the vadose zone. In this work an efficient and flexible combined analytical-numerical Monte Carlo approach is developed for the analysis of steady-state flow and transient transport processes in highly heterogeneous, variably saturated porous media. The approach is also used to investigate the validity of linear, first-order analytical stochastic models. With the Monte Carlo analysis, accurate estimates of the ensemble conductivity, head, velocity, and concentration means and covariances are obtained; the statistical moments describing displacement of solute plumes, solute breakthrough at a compliance surface, and time of first exceedance of a given solute flux level are analyzed; and the cumulative probability density functions for solute flux across a compliance surface are investigated. The results of the Monte Carlo analysis show that for very heterogeneous flow fields, and particularly in anisotropic soils, the linearized analytical predictions of soil water tension and soil moisture flux become erroneous. Analytical, linearized Lagrangian transport models also overestimate both the longitudinal and the transverse spreading of the mean solute plume in very heterogeneous soils and in dry soils. A combined analytical-numerical conditional simulation algorithm is also developed to estimate the impact of in-situ soil hydraulic measurements on reducing the uncertainty of concentration and solute flux predictions.
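
    A hedged toy version of the combined analytical-numerical Monte Carlo idea (illustrative parameters, not the report's fields): sample random layered conductivity fields, apply the analytical series-flow result (the harmonic mean) to each realization, and form ensemble moments.

        # Monte Carlo over random layered conductivity fields (1-D flow normal
        # to equal-thickness layers, so K_eff is the harmonic mean per realization).
        import numpy as np

        rng = np.random.default_rng(0)
        n_layers, n_real = 50, 2000
        mu, sigma = np.log(1e-4), 1.5                 # lognormal K (m/s), strongly heterogeneous

        K = rng.lognormal(mu, sigma, size=(n_real, n_layers))
        K_eff = n_layers / np.sum(1.0 / K, axis=1)    # analytical result per realization

        print("ensemble mean K_eff:", K_eff.mean())   # Monte Carlo moment estimates
        print("ensemble std  K_eff:", K_eff.std())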

  9. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The prerequisite for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may serve as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), matrix effects, and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
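
    One common calibration strategy for exactly this problem is standard addition, since no analyte-free matrix exists for endogenous substances. A minimal hedged sketch with invented numbers (the paper discusses the validation consequences, not this specific code):

        # Standard addition: spike the authentic matrix with known increments,
        # fit the line, and read the endogenous level off the x-intercept.
        import numpy as np

        spike = np.array([0.0, 5.0, 10.0, 20.0])      # added analyte, ug/L (invented)
        signal = np.array([0.42, 0.65, 0.88, 1.34])   # instrument response (invented)

        slope, intercept = np.polyfit(spike, signal, 1)
        endogenous = intercept / slope                # |x-intercept| = native concentration
        print(f"estimated endogenous concentration: {endogenous:.1f} ug/L")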

  10. Introductory Guide to the Statistics of Molecular Genetics

    ERIC Educational Resources Information Center

    Eley, Thalia C.; Rijsdijk, Fruhling

    2005-01-01

    Background: This introductory guide presents the main two analytical approaches used by molecular geneticists: linkage and association. Methods: Traditional linkage and association methods are described, along with more recent advances in methodologies such as those using a variance components approach. Results: New methods are being developed all…

  11. Lead Slowing-Down Spectrometry Time Spectral Analysis for Spent Fuel Assay: FY11 Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulisek, Jonathan A.; Anderson, Kevin K.; Bowyer, Sonya M.

    2011-09-30

    Developing a method for the accurate, direct, and independent assay of the fissile isotopes in bulk materials (such as used fuel) from next-generation domestic nuclear fuel cycles is a goal of the Office of Nuclear Energy, Fuel Cycle R&D, Material Protection and Control Technology (MPACT) Campaign. To meet this goal, MPACT supports a multi-institutional collaboration, of which PNNL is a part, to study the feasibility of Lead Slowing Down Spectroscopy (LSDS). This technique is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic masses in used fuel with an uncertainty considerably lower than the approximately 10% typical of today's confirmatory assay methods. This document is a progress report for FY2011 PNNL analysis and algorithm development. Progress made by PNNL in FY2011 continues to indicate the promise of LSDS analysis and algorithms applied to used fuel. PNNL developed an empirical model based on calibration of the LSDS to responses generated from well-characterized used fuel. The empirical model accounts for self-shielding effects using empirical basis vectors calculated from the singular value decomposition (SVD) of a matrix containing the true self-shielding functions of the used-fuel assembly models. The potential for direct and independent assay of the sum of the masses of 239Pu and 241Pu to within approximately 3% over a wide used-fuel parameter space was demonstrated. Also in FY2011, PNNL continued to develop an analytical model. These efforts included the addition of six more non-fissile absorbers to the analytical shielding function and treatment of the non-uniformity of the neutron flux across the LSDS assay chamber. A hybrid analytical-empirical approach was developed to determine the mass of total Pu (the sum of the masses of 239Pu, 240Pu, and 241Pu), which is an important quantity in safeguards. Results using this hybrid method were of approximately the same accuracy as those of the pure empirical approach, and total Pu was determined with much better accuracy by the hybrid approach than by the pure analytical approach. In FY2012, PNNL will continue efforts to optimize its empirical model and minimize its reliance on calibration data. In addition, PNNL will continue to develop the analytical model, considering effects such as neutron scattering in the fuel and cladding, as well as neutrons streaming through gaps between fuel pins in the fuel assembly.
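
    A hedged sketch of the SVD-basis idea described in the report (invented self-shielding curves, not PNNL's model or data): build empirical basis vectors from a library of simulated self-shielding functions, then fit a measured response in the reduced basis.

        # Empirical basis vectors via SVD of a library of self-shielding functions.
        import numpy as np

        rng = np.random.default_rng(1)
        E = np.linspace(0.1, 10.0, 200)                      # slowing-down energy grid (arbitrary units)
        library = np.stack([np.exp(-a * E) for a in rng.uniform(0.2, 2.0, 30)])

        U, s, Vt = np.linalg.svd(library.T, full_matrices=False)
        basis = U[:, :4]                                     # leading empirical basis vectors

        measured = np.exp(-0.7 * E) + 0.01 * rng.standard_normal(E.size)  # noisy "assay" response
        coeff, *_ = np.linalg.lstsq(basis, measured, rcond=None)          # least-squares fit in the basis
        recon = basis @ coeff
        print("relative fit error:", np.linalg.norm(recon - measured) / np.linalg.norm(measured))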

  12. Modern reaction-based indicator systems

    PubMed Central

    2010-01-01

    Traditional analyte-specific synthetic receptors or sensors have been developed on the basis of supramolecular interactions (e.g., hydrogen bonding, electrostatics, weak coordinative bonds). Unfortunately, this approach is often subject to limitations. As a result, increasing attention within the chemical sensor community is turning to the use of analyte-specific molecular indicators, wherein substrate-triggered reactions are used to signal the presence of a given analyte. This tutorial review highlights recent reaction-based indicator systems that have been used to detect selected anions, cations, reactive oxygen species, and neutral substrates. PMID:19587959

  13. Proteomics: from hypothesis to quantitative assay on a single platform. Guidelines for developing MRM assays using ion trap mass spectrometers.

    PubMed

    Han, Bomie; Higgs, Richard E

    2008-09-01

    High-throughput HPLC-mass spectrometry (HPLC-MS) is routinely used to profile biological samples for potential protein markers of disease, drug efficacy, and toxicity. The discovery technology has advanced to the point where translating hypotheses from proteomic profiling studies into clinical use is the bottleneck to realizing the full potential of these approaches. The first step in this translation is the development and analytical validation of a higher-throughput assay with improved sensitivity and selectivity relative to typical profiling assays. Multiple reaction monitoring (MRM) assays are an attractive approach for this stage of biomarker development given their improved sensitivity and specificity, the speed at which the assays can be developed, and the quantitative nature of the assay. While profiling assays are performed with ion trap mass spectrometers, MRM assays are traditionally developed on quadrupole-based mass spectrometers. Developing MRM assays on the same instrument used in the profiling analysis enables a seamless and rapid transition from hypothesis generation to validation. This report provides guidelines for rapidly developing an MRM assay using the same mass spectrometry platform used for profiling experiments (typically ion traps) and reviews methodological and analytical validation considerations. The analytical validation guidelines presented are drawn from existing practices for immunological assays and are applicable to any mass spectrometry platform technology.

  14. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545

  15. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  16. Social Exclusion/Inclusion: Foucault's Analytics of Exclusion, the Political Ecology of Social Inclusion and the Legitimation of Inclusive Education

    ERIC Educational Resources Information Center

    Peters, Michael A.; Besley, Tina A. C.

    2014-01-01

    This article offers a broad philosophical and historical background to the dyad of social exclusion/inclusion by examining the analytics and politics of exclusion first by reference to Michel Foucault who studies the modern history of exclusion and makes it central to his approach in understanding the development of modern institutions of emerging…

  17. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  18. Analytical Approaches to Verify Food Integrity: Needs and Challenges.

    PubMed

    Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M

    2016-09-01

    A brief overview of the main analytical approaches and practices used to determine food authenticity is presented, addressing as well the food supply chain and future requirements to mitigate food fraud more effectively. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood that fraudsters could penetrate the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and, possibly, sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is already evident today, for instance in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods, as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, an approach that could also be applied to methods for contaminants and adulterants in food. The food industry has taken these many challenges on board, working closely with all stakeholders and continuously communicating progress in a fully transparent manner.

  19. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    PubMed

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  20. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and on the surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements of volume, weight, and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturizing proven terrestrial technology. We describe here methods for developing analytical instruments that detect and identify a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology focus on the development of GC detectors that provide sample identification independent of GC retention-time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  1. 3D Tensorial Elastodynamics for Isotropic Media on Vertically Deformed Meshes

    NASA Astrophysics Data System (ADS)

    Shragge, J. C.

    2017-12-01

    Solutions of the 3D elastodynamic wave equation are sometimes required in industrial and academic applications of elastic reverse-time migration (E-RTM) and full waveform inversion (E-FWI) that involve vertically deformed meshes. Examples include incorporating irregular free-surface topography and handling internal boundaries (e.g., the water bottom) directly in the computational mesh. In 3D E-RTM and E-FWI applications, the number of forward modeling simulations can reach the tens of thousands (per iteration), which necessitates the development of stable, accurate, and efficient 3D elastodynamics solvers. For topographic scenarios, most finite-difference solution approaches use a change-of-variable strategy that carries a number of computational challenges, including difficulties in handling the free-surface boundary condition. In this study, I follow a tensorial approach and use a generalized family of analytic transforms to develop a set of analytic equations for 3D elastodynamics that directly incorporates vertical grid deformation. Importantly, this analytic approach allows the specification of an analytic free-surface boundary condition appropriate for vertically deformed meshes. These equations are straightforward and efficient to solve using a velocity-stress formulation with mimetic finite-difference (MFD) operators implemented on a fully staggered grid. Moreover, I demonstrate that MFD methods deliver stable, accurate, and efficient numerical solutions for typical topographic scenarios. Examples demonstrate that high-quality elastic wavefields can be generated for surfaces exhibiting significant topographic relief.
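
    A hedged sketch of the generic change of variables underlying such vertically deformed meshes (the paper's family of analytic transforms is more general): mapping (x, y, z) to (\xi, \eta, \zeta) with \xi = x, \eta = y, and \zeta = \zeta(x, y, z), the chain rule gives

        \partial_x f = \partial_\xi f + \zeta_x\,\partial_\zeta f, \qquad
        \partial_y f = \partial_\eta f + \zeta_y\,\partial_\zeta f, \qquad
        \partial_z f = \zeta_z\,\partial_\zeta f,

    so the topographic free surface becomes a coordinate surface \zeta = const on which the boundary condition can be imposed exactly.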

  2. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent preparing Modern Era Retrospective-Analysis for Research and Applications (MERRA) data for data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figure: (A) MERRA/AS software stack; (B) example MERRA/AS interfaces.)
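
    A hedged toy example of the MapReduce pattern invoked above (plain Python, not the vCDS/MERRA API): each chunk of a distributed dataset is mapped to a (sum, count) pair, and the pairs are reduced to a global mean.

        # MapReduce-style global mean over chunked data (simulated temperatures, K).
        from functools import reduce
        import numpy as np

        chunks = [np.random.default_rng(i).normal(288.0, 5.0, 1000) for i in range(12)]

        mapped = [(c.sum(), c.size) for c in chunks]                            # map step
        total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)  # reduce step
        print("global mean:", total / count)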

  3. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2014-01-01

    Monitoring the misuse of drugs and the abuse of substances and methods potentially or evidently improving athletic performance by analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, many of which present a temptation for sportsmen and women due to assumed or attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing published between October 2012 and September 2013 is summarized and reviewed, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.

  4. Efficient Online Optimized Quantum Control for Adiabatic Quantum Computation

    NASA Astrophysics Data System (ADS)

    Quiroz, Gregory

    Adiabatic quantum computation (AQC) relies on controlled adiabatic evolution to implement a quantum algorithm. While control evolution can take many forms, properly designed time-optimal control has been shown to be particularly advantageous for AQC. Grover's search algorithm is one such example where analytically-derived time-optimal control leads to improved scaling of the minimum energy gap between the ground state and first excited state and thus, the well-known quadratic quantum speedup. Analytical extensions beyond Grover's search algorithm present a daunting task that requires potentially intractable calculations of energy gaps and a significant degree of model certainty. Here, an in situ quantum control protocol is developed for AQC. The approach is shown to yield controls that approach the analytically-derived time-optimal controls for Grover's search algorithm. In addition, the protocol's convergence rate as a function of iteration number is shown to be essentially independent of system size. Thus, the approach is potentially scalable to many-qubit systems.
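
    For orientation, the well-known local-adiabatic result for Grover-type AQC that such learned controls are benchmarked against (our summary of the Roland-Cerf schedule in standard notation, not the talk's): with H(s) interpolating between the initial and problem Hamiltonians, the spectral gap and the local speed limit are

        g(s) = \sqrt{1 - 4\,\frac{N-1}{N}\, s(1-s)}, \qquad
        \left|\frac{ds}{dt}\right| \le \varepsilon\, g^{2}(s)
        \;\Longrightarrow\;
        T \approx \frac{\pi}{2\varepsilon}\sqrt{N},

    so spending evolution time where the gap is smallest recovers the quadratic quantum speedup, whereas a naive linear schedule requires T = O(N).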

  5. Characteristics, Properties and Analytical Methods of Amoxicillin: A Review with Green Approach.

    PubMed

    de Marco, Bianca Aparecida; Natori, Jéssica Sayuri Hisano; Fanelli, Stefany; Tótoli, Eliane Gandolpho; Salgado, Hérida Regina Nunes

    2017-05-04

    Bacterial infections are the second leading cause of global mortality. It is therefore extremely important to study antimicrobial agents. Amoxicillin is an antimicrobial agent that belongs to the class of penicillins; it has bactericidal activity and is widely used in the Brazilian health system. In the literature, several analytical methods are described for the identification and quantification of this penicillin; these are essential for its quality control, which ensures that product characteristics, therapeutic efficacy, and patient safety are maintained. Thus, this study presents a brief literature review on amoxicillin and the analytical methods developed for the analysis of this drug in official and scientific papers. The major analytical methods found were high-performance liquid chromatography (HPLC), ultra-performance liquid chromatography (U-HPLC), capillary electrophoresis, iodometry, and diffuse reflectance infrared Fourier transform. It is essential to note that most of the developed methods used toxic and hazardous solvents, which makes it necessary for industry and researchers to develop environmentally friendly techniques that provide enhanced benefits to the environment and staff.

  6. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features, such as track independency, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is thus demonstrated, showing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.

  7. Adverse outcome pathway networks: Development, analytics and applications

    EPA Science Inventory

    The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental biological processes...

  8. Adverse outcome pathway networks: Development, analytics, and applications

    EPA Science Inventory

    Product Description: The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental ...

  9. A dynamical-systems approach for computing ice-affected streamflow

    USGS Publications Warehouse

    Holtschlag, David J.

    1996-01-01

    A dynamical-systems approach was developed and evaluated for computing ice-affected streamflow. The approach provides for dynamic simulation and parameter estimation of site-specific equations relating ice effects to routinely measured environmental variables. Comparison indicates that results from the dynamical-systems approach ranked higher than results from 11 analytical methods previously investigated on the basis of accuracy and feasibility criteria. Additional research will likely lead to further improvements in the approach.

  10. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  11. HPLC method development for evolving applications in the pharmaceutical industry and nanoscale chemistry

    NASA Astrophysics Data System (ADS)

    Castiglione, Steven Louis

    As scientific research trends toward trace levels and smaller architectures, the analytical chemist is often faced with the challenge of quantitating such species in a variety of matrices. The challenge is heightened when the analytes prove to be potentially toxic or possess physical or chemical properties that make traditional analytical methods problematic. In such cases, the successful development of an acceptable quantitative method plays a critical role in the ability to further develop the species under study. This is particularly true for pharmaceutical impurities and nanoparticles (NPs). The first portion of the research focuses on the development of a part-per-billion-level HPLC method for a substituted phenazine-class pharmaceutical impurity. This method was required because a rapid methodology was needed to quantitatively determine levels of a potentially toxic phenazine moiety in order to ensure patient safety. As the synthetic pathway for the active ingredient was continuously refined to produce progressively lower amounts of the phenazine impurity, increasingly sensitive quantitative methods were required. The approaches evolved across four discrete methods, each employing a unique scheme for analyte detection. All developed methods were evaluated with regard to accuracy, precision, and linear adherence, as well as ancillary benefits and detriments; one method in this evolution, for example, demonstrated the ability to resolve and detect other species from the phenazine class. The second portion of the research focuses on the development of an HPLC method for the quantitative determination of NP size distributions. The current methodology for the determination of NP sizes employs transmission electron microscopy (TEM), which requires drying the sample without altering particle size and which, in many cases, may prove infeasible due to cost or availability. The feasibility of an HPLC method for NP size characterization evolved across three methods, each employing a different approach for size resolution. These methods were evaluated primarily for sensitivity, which proved to be a substantial hurdle to further development but does not appear to deter future research efforts.

  12. Researching Vocabulary Development: A Conversation Analytic Approach

    ERIC Educational Resources Information Center

    Reichert, Tetyana

    2016-01-01

    This paper contributes to the much debated yet still largely unanswered question of how second language (L2) learning is anchored and configured in and through social interaction. Using a sociointeractional approach to second language (L2) learning (e.g., Hellermann, 2008; Mondada & Pekarek Doehler, 2004; Pekarek Doehler, 2010), I examine…

  13. Teaching Trends in Virtual Education: An Interpretative Approach

    ERIC Educational Resources Information Center

    Pando, Victor F.

    2018-01-01

    Based on the theoretical context of discussions about teaching trends in virtual education, particularly for higher education, this research develops an interpretation of some of these trends, resizing what was registered about it by other authors. To that end, a documentary study with an interpretative-analytical approach was carried out. The…

  14. A Graphical Approach to Teaching Amplifier Design at the Undergraduate Level

    ERIC Educational Resources Information Center

    Assaad, R. S.; Silva-Martinez, J.

    2009-01-01

    Current methods of teaching basic amplifier design at the undergraduate level need further development to match today's technological advances. The general class approach to amplifier design is analytical and heavily based on mathematical manipulations. However, students' mathematical abilities are generally modest, creating a void in which…

  15. The evolution of analytical chemistry methods in foodomics.

    PubMed

    Gallo, Monica; Ferranti, Pasquale

    2016-01-08

    The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches, based on different techniques to study the chemical composition of a food at the molecular level, make it possible to define a 'food fingerprint' valuable for assessing the nutritional value, safety, quality, authenticity, and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors, and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability, and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies and so advance our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review is to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. At-line nanofractionation with parallel mass spectrometry and bioactivity assessment for the rapid screening of thrombin and factor Xa inhibitors in snake venoms.

    PubMed

    Mladic, Marija; Zietek, Barbara M; Iyer, Janaki Krishnamoorthy; Hermarij, Philip; Niessen, Wilfried M A; Somsen, Govert W; Kini, R Manjunatha; Kool, Jeroen

    2016-02-01

    Snake venoms comprise complex mixtures of peptides and proteins that modulate diverse physiological functions upon envenomation of the prey organism. The components of snake venoms are studied as research tools and as potential drug candidates. However, bioactivity determination with subsequent identification and purification of the bioactive compounds is a demanding and often laborious effort involving different analytical and pharmacological techniques. This study describes the development and optimization of an integrated analytical approach for activity profiling and identification of venom constituents targeting the cardiovascular system, in particular the enzymes thrombin and factor Xa. The approach encompasses reversed-phase liquid chromatography (RPLC) analysis of a crude snake venom with parallel mass spectrometry (MS) and bioactivity analysis. The analytical and pharmacological parts of this approach are linked by at-line nanofractionation: bioactivity is assessed after high-resolution nanofractionation (6 s/well) onto high-density 384-well microtiter plates and subsequent freeze-drying of the plates. The nanofractionation and bioassay conditions were optimized to maintain LC resolution while achieving good bioassay sensitivity. The developed integrated analytical approach was successfully applied to the fast screening of snake venoms for compounds affecting thrombin and factor Xa activity. Parallel accurate MS measurements allowed the observed bioactivity to be correlated with peptide/protein masses. This resulted in the identification of a few interesting peptides with activity toward the drug target factor Xa from a screening campaign involving venoms of 39 snake species. In addition, many positive protease activity peaks were observed in most venoms analysed. These protease fingerprint chromatograms were found to be similar for evolutionarily closely related species and as such might serve as generic snake protease bioactivity fingerprints in biological studies of venoms. Copyright © 2015 Elsevier Ltd. All rights reserved.
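
    To make the fraction-collection bookkeeping concrete, a small hedged sketch (assumed parameters; well_index is our illustration, not the authors' software). At 6 s/well, one 384-well plate covers 384 x 6 s = 38.4 min of LC effluent:

        # Map an LC retention time onto a 384-well nanofractionation plate.
        def well_index(retention_s: float, start_s: float = 0.0, width_s: float = 6.0) -> int:
            """Return the 0-based well (0-383) collecting the given retention time (s)."""
            idx = int((retention_s - start_s) // width_s)
            if not 0 <= idx < 384:
                raise ValueError("retention time falls outside this plate")
            return idx

        print(well_index(754.0))   # -> well 125 for a peak eluting at 12 min 34 s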

  17. Development of a category 2 approach system model

    NASA Technical Reports Server (NTRS)

    Johnson, W. A.; Mcruer, D. T.

    1972-01-01

    An analytical model is presented which provides, as its primary output, the probability of a successful Category II approach. Typical applications are included using several example systems (manual and automatic) which are subjected to random gusts and deterministic wind shear. The primary purpose of the approach system model is to establish a structure containing the system elements, command inputs, disturbances, and their interactions in an analytical framework so that the relative effects of changes in the various system elements on precision of control and available margins of safety can be estimated. The model is intended to provide insight for the design and integration of suitable autopilot, display, and navigation elements; and to assess the interaction of such elements with the pilot/copilot.

  18. Matrix Effect Compensation in Small-Molecule Profiling for an LC-TOF Platform Using Multicomponent Postcolumn Infusion.

    PubMed

    González, Oskar; van Vliet, Michael; Damen, Carola W N; van der Kloet, Frans M; Vreeken, Rob J; Hankemeier, Thomas

    2015-06-16

    The possible presence of matrix effects is one of the main concerns in liquid chromatography-mass spectrometry (LC-MS)-driven bioanalysis, owing to their impact on the reliability of the quantitative results obtained. Here we propose an approach to correct for the matrix effect in LC-MS with electrospray ionization using postcolumn infusion of eight internal standards (PCI-IS). We applied this approach to a generic ultraperformance liquid chromatography-time-of-flight (UHPLC-TOF) platform developed for small-molecule profiling, with a main focus on drugs. Different urine samples were spiked with 19 drugs of differing physicochemical properties and analyzed in order to study the matrix effect (in absolute and relative terms). Furthermore, calibration curves for each analyte were constructed, and quality control samples at different concentration levels were analyzed to check the applicability of this approach in quantitative analysis. The matrix effect profiles of the PCI-ISs differed, confirming that the matrix effect is compound-dependent and that the most suitable PCI-IS therefore has to be chosen for each analyte. Chromatograms were reconstructed using analyte and PCI-IS responses, and these were used to develop an optimized method that compensates for variation in ionization efficiency. The approach presented here dramatically improved the results in terms of matrix effect. Furthermore, calibration curves of higher quality are obtained, the dynamic range is enhanced, and the accuracy and precision of QC samples are increased. The use of PCI-ISs is a very promising step toward an analytical platform free of matrix effects, which can make LC-MS analysis even more successful, adding higher reliability in quantification to its intrinsic high sensitivity and selectivity.
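
    A hedged sketch of the correction idea (our illustration, not the published algorithm): the continuously infused internal standard traces ionization efficiency across the run, so dividing the analyte trace by the normalized PCI-IS trace compensates suppression or enhancement.

        # Point-by-point PCI-IS correction on simulated chromatograms.
        import numpy as np

        def pci_correct(analyte: np.ndarray, pci_is: np.ndarray) -> np.ndarray:
            """Scale the analyte signal by the normalized PCI-IS response."""
            return analyte * np.median(pci_is) / pci_is

        t = np.linspace(0.0, 10.0, 500)                            # run time, min
        suppression = 1.0 - 0.4 * np.exp(-((t - 4.0) ** 2))        # simulated matrix dip
        analyte = np.exp(-((t - 4.0) ** 2) / 0.05) * suppression   # suppressed analyte peak
        pci_is = 1.0 * suppression                                 # PCI-IS sees the same dip

        corrected = pci_correct(analyte, pci_is)
        print("peak height before/after:", analyte.max(), corrected.max())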

  19. Current Status of Mycotoxin Analysis: A Critical Review.

    PubMed

    Shephard, Gordon S

    2016-07-01

    It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.

  20. Annual banned-substance review: Analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans

    2018-01-01

    Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.

  1. New analytic results for speciation times in neutral models.

    PubMed

    Gernhard, Tanja

    2008-05-01

    In this paper, we investigate the standard Yule model and a recently studied model of speciation and extinction, the "critical branching process." We develop an analytic way, as opposed to the common simulation approach, of calculating the speciation times in a reconstructed phylogenetic tree. Simple expressions for the density and the moments of the speciation times are obtained. Methods for dating a speciation event become valuable if no time scale is available for the reconstructed phylogenetic trees. A missing time scale could be due to supertree methods, morphological data, or molecular data that violates the molecular clock. Our analytic approach is particularly useful for the model with extinction, since simulations of birth-death processes conditioned on obtaining n extant species today are quite delicate. Further, simulations are very time consuming for large n under both models.
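
    A hedged illustration of the unconditioned pure-birth (Yule) process underlying the paper (the paper's analytic results concern the harder conditioned, reconstructed setting): with k extant lineages and speciation rate lambda, the wait to the next speciation is exponential with rate k*lambda, so the expected time to grow from 1 to n species is a harmonic sum.

        # Simulate Yule inter-speciation waiting times and check the analytic mean.
        import numpy as np

        rng = np.random.default_rng(42)
        lam, n, reps = 1.0, 10, 100_000

        k = np.arange(1, n)                                    # lineage counts 1..n-1
        waits = rng.exponential(1.0 / (k * lam), size=(reps, n - 1))
        print("simulated mean:", waits.sum(axis=1).mean())
        print("analytic  mean:", (1.0 / (k * lam)).sum())      # harmonic-number formula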

  2. Higher Education for Sustainable Development in Japan: Policy and Progress

    ERIC Educational Resources Information Center

    Nomura, Ko; Abe, Osamu

    2010-01-01

    Purpose: The purpose of this paper is to review key developments and the role of governmental support in the field of education for sustainable development (ESD) in higher education in Japan. Design/methodology/approach: This is an analytical review paper on policy and practice, using an evaluative perspective to consider developments, challenges…

  3. DEVELOPMENT OF SAMPLING AND ANALYTICAL METHODS FOR THE MEASUREMENT OF NITROUS OXIDE FROM FOSSIL FUEL COMBUSTION SOURCES

    EPA Science Inventory

    The report documents the technical approach and results achieved while developing a grab sampling method and an automated, on-line gas chromatography method suitable to characterize nitrous oxide (N2O) emissions from fossil fuel combustion sources. The two methods developed have...

  4. Transcutaneous analyte measuring method (TAMM): a reflective, noninvasive, near-infrared blood chemistry analyzer

    NASA Astrophysics Data System (ADS)

    Schlager, Kenneth J.; Ruchti, Timothy L.

    1995-04-01

    TAMM, for Transcutaneous Analyte Measuring Method, is a near-infrared spectroscopic technique for the noninvasive measurement of human blood chemistry. A near-infrared indium gallium arsenide (InGaAs) photodiode array spectrometer has been developed and tested on over 1,000 patients as part of an SBIR program sponsored by the Naval Medical Research and Development Command. Nine blood analytes were measured and evaluated during preclinical testing: sodium, chloride, calcium, potassium, bicarbonate, BUN, glucose, hematocrit, and hemoglobin. A reflective, rather than transmissive, approach to measurement was taken to avoid variations resulting from skin color and sensor positioning. The current status of the instrumentation, the neural network pattern recognition algorithms, and test results are discussed.

  5. Validity of Particle-Counting Method Using Laser-Light Scattering for Detecting Platelet Aggregation in Diabetic Patients

    NASA Astrophysics Data System (ADS)

    Nakadate, Hiromichi; Sekizuka, Eiichi; Minamitani, Haruyuki

    We aimed to study the validity of a new analytical approach that reflects the phase from platelet activation to the formation of small platelet aggregates. We hoped that this new approach would enable us to use the particle-counting method with laser-light scattering to measure platelet aggregation in healthy controls and in diabetic patients without complications. We measured agonist-induced platelet aggregation for 10 min. Agonist was added to the platelet-rich plasma 1 min after measurement started. We compared the total scattered light intensity from small aggregates over a 10-min period (established analytical approach) and that over a 2-min period from 1 to 3 min after measurement started (new analytical approach). Consequently, platelet aggregation in diabetics with HbA1c ≥ 6.5% was significantly greater than in healthy controls by both analytical approaches. However, platelet aggregation in diabetics with HbA1c < 6.5%, i.e., patients in the early stages of diabetes, was significantly greater than in healthy controls only by the new analytical approach, not by the established analytical approach. These results suggest that platelet aggregation as detected by the particle-counting method using laser-light scattering could be applied in clinical examinations using our new analytical approach.
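
    The difference between the two analytical approaches is simply the integration window applied to the same scattered-light record. The sketch below makes the two windows explicit; the one-sample-per-second signal and the timing values are stated assumptions taken from the abstract.

      # Established (whole-record) vs. new (1-3 min) analysis windows.
      import numpy as np

      t = np.arange(0, 600)                              # seconds; 10-min run
      intensity = np.random.default_rng(0).random(600)   # placeholder signal

      established = intensity.sum()                      # full 10-min record
      new_window = intensity[(t >= 60) & (t < 180)].sum()  # 1-3 min window
      print(established, new_window)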

  6. Sample-independent approach to normalize two-dimensional data for orthogonality evaluation using whole separation space scaling.

    PubMed

    Jáčová, Jaroslava; Gardlo, Alžběta; Friedecký, David; Adam, Tomáš; Dimandja, Jean-Marie D

    2017-08-18

    Orthogonality is a key parameter used to evaluate the separation power of chromatography-based two-dimensional systems. The separation data must be scaled before orthogonality can be assessed. Current scaling approaches are sample-dependent: the extent of the retention space that is converted into a normalized retention space is set according to the retention times of the first and last analytes to elute from a unique sample. The presence or absence of a highly retained analyte in a sample can thus significantly influence the amount of information (in terms of the total amount of separation space) contained in the normalized retention space considered for the calculation of orthogonality. We propose a Whole Separation Space Scaling (WOSEL) approach that accounts for the whole separation space delineated by the analytical method, not the sample. This approach enables an orthogonality-based evaluation of the efficiency of the analytical system that is independent of the sample selected. The WOSEL method was compared to two currently used orthogonality approaches through the evaluation of in silico-generated chromatograms and real separations of human biofluids and petroleum samples. WOSEL exhibits sample-to-sample stability of 3.8% on real samples, compared to 7.0% and 10.1%, respectively, for the two other methods. Using real analyses, we also demonstrate that some previously developed approaches can provide misleading conclusions about the overall orthogonality of a two-dimensional chromatographic system. Copyright © 2017 Elsevier B.V. All rights reserved.
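
    The contrast between the two scalings can be stated compactly: sample-dependent scaling normalizes by the first and last analytes that happen to elute, whereas whole-separation-space scaling normalizes by the retention window that the method itself delineates. The sketch below shows both normalizations; the retention times and method bounds are invented, and the real WOSEL procedure is defined in the paper.

      # Sample-dependent vs. whole-separation-space normalization (sketch).
      import numpy as np

      rt = np.array([[2.1, 0.8],      # (1D, 2D) retention times of 3 analytes
                     [5.6, 1.9],
                     [9.3, 3.2]])

      def scale_sample_dependent(rt):
          lo, hi = rt.min(axis=0), rt.max(axis=0)  # set by the sample itself
          return (rt - lo) / (hi - lo)

      def scale_whole_space(rt, lo, hi):
          lo, hi = np.asarray(lo), np.asarray(hi)  # set by the method window
          return (rt - lo) / (hi - lo)

      print(scale_sample_dependent(rt))
      print(scale_whole_space(rt, lo=(0.0, 0.0), hi=(12.0, 4.0)))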

  7. Macroeconomic Activity Module - NEMS Documentation

    EIA Publications

    2016-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Macroeconomic Activity Module (MAM) used to develop the Annual Energy Outlook for 2016 (AEO2016). The report catalogues and describes the module assumptions, computations, methodology, parameter estimation techniques, and mainframe source code.

  8. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  9. Scientific Analysis and Documentation Provided by EPA Regional Labs

    EPA Pesticide Factsheets

    The 10 EPA Regional laboratories provide maximum flexibility to support Agency response to natural disasters and emergencies by developing effective approaches for a wide range of analytical challenges.

  10. War-gaming application for future space systems acquisition part 1: program and technical baseline war-gaming modeling and simulation approaches

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.

    2017-05-01

    This paper describes static Bayesian game models with "Pure" and "Mixed" games for the development of an optimum Program and Technical Baseline (PTB) solution for affordable acquisition of future space systems. The paper discusses System Engineering (SE) frameworks and analytical and simulation modeling approaches for developing the optimum PTB solutions from both the government and contractor perspectives.

  11. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    PubMed

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Industrial Demand Module - NEMS Documentation

    EIA Publications

    2014-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Module. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code.

  13. Analysis of THG modes for femtosecond laser pulse

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Sidorov, Pavel S.

    2017-05-01

    THG is used nowadays in many practical applications, such as substance diagnostics and the imaging of biological objects. With the development of new materials and technologies (for example, photonic crystals), attention to the analysis of the THG process has grown, and understanding the features of THG is therefore a problem of current interest. Earlier, we developed a new analytical approach that uses the problem invariant to construct an analytical solution for the THG process. It should be stressed that we did not use the basic non-depletion approximation for the fundamental wave; nevertheless, a long-pulse-duration approximation and a plane-wave approximation were applied. The analytical solution demonstrates, in particular, an optical bistability property (and many other regimes of frequency tripling) for the third harmonic generation process. Obviously, however, this approach does not reflect the influence of medium dispersion on the frequency tripling. Therefore, in this paper we analyze the THG efficiency of a femtosecond laser pulse taking into account second-order dispersion as well as the effect of self- and cross-modulation of the interacting waves on the frequency conversion process. The analysis is carried out by computer simulation on the basis of Schrödinger equations describing the process under consideration.

  14. A singularity free analytical solution of artificial satellite motion with drag

    NASA Technical Reports Server (NTRS)

    Scheifele, G.; Mueller, A. C.; Starke, S. E.

    1977-01-01

    The connection between the existing Delaunay-Similar and Poincare-Similar satellite theories in the true anomaly version is outlined for the J(2) perturbation and the new drag approach. An overall description of the concept of the approach is given, and the necessary expansions and the procedure to arrive at the computer program for the canonical forces are delineated. The procedure for the analytical integration of these developed equations is described. In addition, some numerical results are given. The computer program for the algebraic multiplication of the Fourier series, which creates the FORTRAN coding in an automatic manner, is described and documented.

  15. Fluorescence Spectroscopy for the Monitoring of Food Processes.

    PubMed

    Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd

    Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in the development of rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in the identification, classification, authentication, and quantification of foods and in the optimization of different parameters during food handling, processing, and storage, and it is typically combined with chemometric tools. Chemometrics helps to retrieve useful information from the spectral data used to characterize food samples. This contribution discusses in detail the potential of fluorescence spectroscopy, combined with different chemometric approaches, for the qualitative and quantitative analysis of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, and vegetables.
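
    As a concrete example of the chemometric step, principal component analysis (PCA) is one of the most common tools for compressing fluorescence spectra into a few scores used for classification. The sketch below computes PCA scores by singular value decomposition on a synthetic sample-by-wavelength matrix; the data and dimensions are placeholders, not results from the review.

      # PCA scores from a (samples x emission channels) spectral matrix.
      import numpy as np

      rng = np.random.default_rng(42)
      spectra = rng.random((6, 100))           # 6 samples, 100 channels (fake)
      Xc = spectra - spectra.mean(axis=0)      # mean-center before PCA
      U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
      scores = U * S                           # sample scores on the PCs
      explained = S**2 / np.sum(S**2)
      print(scores[:, :2])                     # first two PCs, e.g. for plots
      print("variance explained:", np.round(explained[:2], 3))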

  16. The effect of inclined soil layers on surface vibration from underground railways using a semi-analytical approach

    NASA Astrophysics Data System (ADS)

    Jones, S.; Hunt, H.

    2009-08-01

    Ground vibration due to underground railways is a significant source of disturbance for people living or working near the subways. The numerical models used to predict vibration levels have inherent uncertainty which must be understood to give confidence in the predictions. A semi-analytical approach is developed herein to investigate the effect of soil layering on the surface vibration of a halfspace, where both soil properties and layer inclination angles are varied. The study suggests that both the material properties and the inclination angle of the layers have a significant effect (±10 dB) on the surface vibration response.

  17. NCI-FDA Interagency Oncology Task Force Workshop Provides Guidance for Analytical Validation of Protein-based Multiplex Assays | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for the analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. To make the workshop unique, a case-study approach was used to discuss issues related to

  18. Mixing of two co-directional Rayleigh surface waves in a nonlinear elastic material.

    PubMed

    Morlock, Merlin B; Kim, Jin-Yeon; Jacobs, Laurence J; Qu, Jianmin

    2015-01-01

    The mixing of two co-directional, initially monochromatic Rayleigh surface waves in an isotropic, homogeneous, and nonlinear elastic solid is investigated using analytical, finite element method, and experimental approaches. The analytical investigations show that while the horizontal velocity component can form a shock wave, the vertical velocity component can form a pulse independent of the specific ratios of the fundamental frequencies and amplitudes that are mixed. This analytical model is then used to simulate the development of the fundamentals, second harmonics, and the sum and difference frequency components over the propagation distance. The analytical model is further extended to include diffraction effects in the parabolic approximation. Finally, the frequency and amplitude ratios of the fundamentals are identified which provide maximum amplitudes of the second harmonics as well as of the sum and difference frequency components, to help guide effective material characterization; this approach should make it possible to measure the acoustic nonlinearity of a solid not only with the second harmonics, but also with the sum and difference frequency components. Results of the analytical investigations are then confirmed using the finite element method and the experimental feasibility of the proposed technique is validated for an aluminum specimen.

  19. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  20. Unlocking Proteomic Heterogeneity in Complex Diseases through Visual Analytics

    PubMed Central

    Bhavnani, Suresh K.; Dang, Bryant; Bellala, Gowtham; Divekar, Rohit; Visweswaran, Shyam; Brasier, Allan; Kurosky, Alex

    2015-01-01

    Despite years of preclinical development, biological interventions designed to treat complex diseases like asthma often fail in phase III clinical trials. These failures suggest that current methods to analyze biomedical data might be missing critical aspects of biological complexity such as the assumption that cases and controls come from homogeneous distributions. Here we discuss why and how methods from the rapidly evolving field of visual analytics can help translational teams (consisting of biologists, clinicians, and bioinformaticians) to address the challenge of modeling and inferring heterogeneity in the proteomic and phenotypic profiles of patients with complex diseases. Because a primary goal of visual analytics is to amplify the cognitive capacities of humans for detecting patterns in complex data, we begin with an overview of the cognitive foundations for the field of visual analytics. Next, we organize the primary ways in which a specific form of visual analytics called networks have been used to model and infer biological mechanisms, which help to identify the properties of networks that are particularly useful for the discovery and analysis of proteomic heterogeneity in complex diseases. We describe one such approach called subject-protein networks, and demonstrate its application on two proteomic datasets. This demonstration provides insights to help translational teams overcome theoretical, practical, and pedagogical hurdles for the widespread use of subject-protein networks for analyzing molecular heterogeneities, with the translational goal of designing biomarker-based clinical trials, and accelerating the development of personalized approaches to medicine. PMID:25684269

  1. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  2. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  3. College Students' Motivation and Learning Strategies Profiles and Academic Achievement: A Self-Determination Theory Approach

    ERIC Educational Resources Information Center

    Liu, Woon Chia; Wang, Chee Keng John; Kee, Ying Hwa; Koh, Caroline; Lim, Boon San Coral; Chua, Lilian

    2014-01-01

    The development of effective self-regulated learning strategies is of interest to educationalists. In this paper, we examine inherent individual differences in self-regulated learning based on the Motivated Strategies for Learning Questionnaire (MSLQ) using a cluster analytic approach and examine cluster differences in terms of self-determination theory…

  4. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  5. DidFail: Coverage and Precision Enhancement

    DTIC Science & Technology

    2017-07-07

    Android developer website [4] for a detailed description of these methods. To test this approach, we developed two example apps with different complexities...broadcast receivers by performing an extra analytical step to find dynamic receivers and append descriptions of them to the manifest file before the data

  6. Laboratory services series: a programmed maintenance system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuxbury, D.C.; Srite, B.E.

    1980-01-01

    The diverse facilities, operations, and equipment at a major national research and development laboratory require a systematic, analytical approach to operating equipment maintenance. A computer-scheduled preventive maintenance program is described, including program development, equipment identification, maintenance and inspection instructions, scheduling, personnel, and equipment history.

  7. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  8. Determination of linuron and related compounds in soil by microwave-assisted solvent extraction and reversed-phase liquid chromatography with UV detection.

    PubMed

    Molins, C; Hogendoorn, E A; Dijkman, E; Heusinkveld, H A; Baumann, R A

    2000-02-11

    The combination of microwave-assisted solvent extraction (MASE) and reversed-phase liquid chromatography (RPLC) with UV detection has been investigated for the efficient determination of phenylurea herbicides in soils, involving the single-residue method (SRM) approach (linuron) and the multi-residue method (MRM) approach (monuron, monolinuron, isoproturon, metobromuron, diuron, and linuron). Critical parameters of MASE, viz., extraction temperature, water content, and extraction solvent, were varied in order to optimise recoveries of the analytes while simultaneously minimising co-extraction of soil interferences. The optimised extraction procedure was applied to different types of soil with an organic carbon content of 0.4-16.7%. Besides freshly spiked soil samples, method validation included the analysis of samples with aged residues. A comparative study of the applicability of RPLC-UV with and without column switching for the processing of uncleaned extracts was carried out. For some of the tested analyte/matrix combinations the one-column approach (LC mode) is feasible. In comparison to LC, coupled-column LC (LC-LC mode) provides high selectivity in single-residue analysis (linuron) and, although less pronounced, in multi-residue analysis (all six phenylurea herbicides); the clean-up performance of LC-LC improves both time of analysis and sample throughput. In the MRM approach the developed procedure involving MASE and LC-LC-UV provided acceptable recoveries (range 80-120%) and RSDs (<12%) at levels of 10 microg/kg (n=9) and 50 microg/kg (n=7), respectively, for most analyte/matrix combinations. Recoveries from aged-residue samples spiked at a level of 100 microg/kg (n=7) ranged, depending on the analyte/soil type combination, from 41 to 113%, with RSDs ranging from 1 to 35%. In the SRM approach the developed LC-LC procedure was applied for the determination of linuron in 28 sandy soil samples collected in a field study. Linuron could be determined in soil with a limit of quantitation of 10 microg/kg.

  9. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant effort has been expended to generate models that meet control system designers' needs and to develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples in which results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is offered.

  10. Zero Energy Districts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, Benjamin J

    This presentation shows how NREL is approaching Zero Energy Districts, including key opportunities, design strategies, and master planning concepts. The presentation also covers URBANopt, an advanced analytical platform for districts that is being developed by NREL.

  11. Lagrangian based methods for coherent structure detection

    NASA Astrophysics Data System (ADS)

    Allshouse, Michael R.; Peacock, Thomas

    2015-09-01

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
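
    The double-gyre mentioned above is fully specified by a stream function, so the geometric approach can be illustrated end to end: advect a grid of particles through the velocity field and take finite differences of the flow map to obtain a finite-time Lyapunov exponent (FTLE) field, whose ridges approximate the influential material lines. The sketch below uses the common parameter choices A = 0.1, eps = 0.25, omega = 2*pi/10; the integrator and grid resolution are arbitrary simplifications.

      # FTLE field for the canonical double-gyre flow (illustrative sketch).
      import numpy as np

      A, eps, om = 0.1, 0.25, 2 * np.pi / 10

      def vel(x, y, t):
          a = eps * np.sin(om * t)
          b = 1 - 2 * eps * np.sin(om * t)
          f, dfdx = a * x**2 + b * x, 2 * a * x + b
          u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
          v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
          return u, v

      def flow_map(X, Y, t0, T, steps=200):
          x, y, t, dt = X.copy(), Y.copy(), t0, T / steps
          for _ in range(steps):               # midpoint (RK2) advection
              u1, v1 = vel(x, y, t)
              u2, v2 = vel(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
              x, y, t = x + dt * u2, y + dt * v2, t + dt
          return x, y

      T = 10.0
      X, Y = np.meshgrid(np.linspace(0, 2, 101), np.linspace(0, 1, 51))
      xf, yf = flow_map(X, Y, 0.0, T)
      dx, dy = X[0, 1] - X[0, 0], Y[1, 0] - Y[0, 0]
      # Cauchy-Green tensor C = F^T F from flow-map gradients
      xfx, xfy = np.gradient(xf, dx, axis=1), np.gradient(xf, dy, axis=0)
      yfx, yfy = np.gradient(yf, dx, axis=1), np.gradient(yf, dy, axis=0)
      C11, C22, C12 = xfx**2 + yfx**2, xfy**2 + yfy**2, xfx * xfy + yfx * yfy
      lmax = 0.5 * (C11 + C22) + np.sqrt(0.25 * (C11 - C22)**2 + C12**2)
      ftle = np.log(np.maximum(lmax, 1e-12)) / (2 * T)
      print("max FTLE:", ftle.max())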

  12. Salivary biomarker development using genomic, proteomic and metabolomic approaches

    PubMed Central

    2012-01-01

    The use of saliva as a diagnostic sample provides a non-invasive, cost-efficient method of sample collection for disease screening without the need for highly trained professionals. Saliva collection is far more practical and safe compared with invasive methods of sample collection, because of the infection risk from contaminated needles during, for example, blood sampling. Furthermore, the use of saliva could increase the availability of accurate diagnostics for remote and impoverished regions. However, the development of salivary diagnostics has required technical innovation to allow stabilization and detection of analytes in the complex molecular mixture that is saliva. The recent development of cost-effective room temperature analyte stabilization methods, nucleic acid pre-amplification techniques and direct saliva transcriptomic analysis have allowed accurate detection and quantification of transcripts found in saliva. Novel protein stabilization methods have also facilitated improved proteomic analyses. Although candidate biomarkers have been discovered using epigenetic, transcriptomic, proteomic and metabolomic approaches, transcriptomic analyses have so far achieved the most progress in terms of sensitivity and specificity, and progress towards clinical implementation. Here, we review recent developments in salivary diagnostics that have been accomplished using genomic, transcriptomic, proteomic and metabolomic approaches. PMID:23114182

  13. Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)

    NASA Astrophysics Data System (ADS)

    Bishop, M. P.; Houser, C.; Lemmons, K.

    2015-12-01

    Traditional learning limits the potential for self-discovery and for the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms, and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or have yet to be formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation, and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South Texas Sand Sheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model, its spatial predictions, and the observed distribution of dune activity in 2010. Students perceived that the analytical reasoning approach was significantly better for understanding desertification than traditional lecture, and that it promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
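
    A fuzzy cognitive map of the kind the students built is, computationally, just a signed weight matrix iterated through a squashing function until the concept activations settle. The toy below shows one such update loop; the four concepts and all weights are invented for illustration and are not the course's desertification model.

      # Minimal fuzzy cognitive map (FCM) iteration (illustrative weights).
      import numpy as np

      concepts = ["grazing", "vegetation cover", "wind erosion", "dune activity"]
      W = np.array([[0.0, -0.6,  0.0,  0.0],   # grazing suppresses vegetation
                    [0.0,  0.0, -0.7, -0.5],   # vegetation damps erosion, dunes
                    [0.0,  0.0,  0.0,  0.8],   # erosion drives dune activity
                    [0.0,  0.0,  0.0,  0.0]])

      def step(x, W):
          # Kosko-style update with self-memory and sigmoid squashing
          return 1.0 / (1.0 + np.exp(-(x + x @ W)))

      x = np.array([0.9, 0.5, 0.2, 0.1])       # initial activations
      for _ in range(20):
          x = step(x, W)
      print(dict(zip(concepts, np.round(x, 2))))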

  14. Single cell proteomics in biomedicine: High-dimensional data acquisition, visualization, and analysis.

    PubMed

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-02-01

    New insights into cellular heterogeneity over the last decade have provoked the development of a variety of single-cell omics tools at a lightning pace. The resultant high-dimensional single-cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single-cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single-cell data. The underlying assumptions, unique features, and limitations of the analytical methods, along with the designated biological questions they seek to answer, will be discussed. Particular attention will be given to those information-theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. NASA Occupant Protection Standards Development

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles

    2012-01-01

    Historically, spacecraft landing systems have been tested with human volunteers because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury, and (2) it is unclear whether the model applies to NASA applications. Because of these limitations, a new validated, analytical approach was desired. Leveraging the current state of the art in automotive safety and racing, a new approach was developed. The approach has several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARVs); (4) rigorously validate the IARVs with sub-injurious human testing; and (5) use the validated IARVs to update standards and vehicle requirements.

  16. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
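
    The quantitative backbone of qHNMR is that the integrated signal area is proportional to the number of protons times the molar amount of the species. With an internal calibrant, this gives the standard relation below; it is quoted here as the generic internal-calibration formula, which may differ in detail from the exact protocol of this study.

      \[
      m_a \;=\; \frac{I_a}{I_{\mathrm{cal}}}\cdot
                \frac{N_{\mathrm{cal}}}{N_a}\cdot
                \frac{M_a}{M_{\mathrm{cal}}}\cdot m_{\mathrm{cal}},
      \]

    where I denotes integrated areas, N the number of protons contributing to each integrated signal, M the molar masses, and m_cal the weighed mass of the calibrant.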

  17. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    NASA Astrophysics Data System (ADS)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

    As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status of and progress made so far in using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" to facilitate THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurements are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  18. An Analytic Framework to Support E.Learning Strategy Development

    ERIC Educational Resources Information Center

    Marshall, Stephen J.

    2012-01-01

    Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e.learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e.learning in higher education institutions from…

  19. Development of Systems Engineering Competency Career Development Model: An Analytical Approach using Blooms Taxonomy

    DTIC Science & Technology

    2014-06-01

    [Extraction residue from the report's front matter: table-of-contents entries (Purpose/Benefit; Scope; Incorporating DAU SPRDE CL/POs & ELOs; Mapping to Fit Bloom's Taxonomy) and an acronym list (PSE: Program Systems Engineering; RDT&E: Research, Development, Test and Engineering; SE: systems engineering; SME: Subject Matter Expert; SPAWAR).]

  20. Modelling vortex-induced fluid-structure interaction.

    PubMed

    Benaroya, Haym; Gabbai, Rene D

    2008-04-13

    The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators, one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's-principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails formulating generalized equations of motion, as a superset of the flow-oscillator models, and developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations that will allow the modelling of multiple-degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model; based on different assumptions, one can derive a variety of flow-oscillator models.
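
    A representative member of the flow-oscillator subclass discussed above is the van der Pol wake-oscillator model (e.g., Facchinetti, de Langre, and Biolley), in which a linear structural oscillator is coupled to a self-excited wake variable. It is quoted here only to make the two-oscillator structure concrete; it is not the generalized variational model derived in the paper:

      \[
      \ddot{y} + 2\zeta\omega_s\,\dot{y} + \omega_s^2\,y = s\,q, \qquad
      \ddot{q} + \varepsilon\,\omega_f\,(q^2 - 1)\,\dot{q} + \omega_f^2\,q
        = A\,\ddot{y},
      \]

    where y is the structural displacement, q a fluctuating wake variable, omega_s and omega_f the structural and vortex-shedding frequencies, and s, A, epsilon coupling and tuning parameters.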

  1. A Generalized Approach to Forensic Dye Identification: Development and Utility of Reference Libraries.

    PubMed

    Groves, Ethan; Palenik, Skip; Palenik, Christopher S

    2018-04-18

    While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.

  2. A semi-analytical refrigeration cycle modelling approach for a heat pump hot water heater

    NASA Astrophysics Data System (ADS)

    Panaras, G.; Mathioulakis, E.; Belessiotis, V.

    2018-04-01

    The use of heat pump systems in applications such as the production of hot water or space heating makes the modelling of the underlying processes important, both for evaluating the performance of existing systems and for design purposes. The proposed semi-analytical model offers the opportunity to estimate the performance of a heat pump system producing hot water without using detailed geometrical data or any performance data. This is important, as for many commercial systems the type and characteristics of the subcomponents involved can hardly be determined, thus not allowing the implementation of more analytical approaches or the exploitation of the manufacturers' catalogue performance data. The analysis addresses the issues related to developing models of the subcomponents involved in the studied system. Issues not discussed thoroughly in the existing literature, such as the refrigerant mass inventory when an accumulator is present, are examined effectively.

  3. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic because one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center-of-mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well-known Gay-Berne ellipsoid-of-revolution liquid crystal model and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an "aggressive" CG methodology designed to model multi-component biological membranes at very large length and time scales. PMID:19281167

  4. Analytical caustic surfaces

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1987-01-01

    This document discusses the determination of caustic surfaces in terms of rays, reflectors, and wavefronts. Analytical caustics are obtained as a family of lines, a set of points, and several types of equations for geometries encountered in optics and microwave applications. Standard methods of differential geometry are applied under different approaches: directly to reflector surfaces, and alternatively, to wavefronts, to obtain analytical caustics of two sheets or branches. Gauss/Seidel aberrations are introduced into the wavefront approach, forcing the retention of all three coefficients of both the first- and the second-fundamental forms of differential geometry. An existing method for obtaining caustic surfaces through exploitation of the singularities in flux density is examined, and several constant-intensity contour maps are developed using only the intrinsic Gaussian, mean, and normal curvatures of the reflector. Numerous references are provided for extending the material of the present document to the morphologies of caustics and their associated diffraction patterns.

  5. Improved Quantification of Free and Ester-Bound Gallic Acid in Foods and Beverages by UHPLC-MS/MS.

    PubMed

    Newsome, Andrew G; Li, Yongchao; van Breemen, Richard B

    2016-02-17

    Hydrolyzable tannins are measured routinely during the characterization of food and beverage samples. Most methods for the determination of hydrolyzable tannins use hydrolysis or methanolysis to convert complex tannins to small molecules (gallic acid, methyl gallate, and ellagic acid) for quantification by HPLC-UV. Often unrecognized analytical limitations and sources of variability inherent in these approaches include the variable mass fraction (0-0.90) that is released as analyte, contributions of sources other than tannins to hydrolyzable gallate (which can exceed 10% by weight), the undifferentiated measurement of free and total analyte, and the lack of controls to account for degradation. An accurate, specific, sensitive, and higher-throughput approach for the determination of hydrolyzable gallate based on ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) that overcomes these limitations was developed.

  6. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    Continuing studies associated with the development of the quasi-analytical (QA) sensitivity method for three-dimensional transonic flow about wings are presented. Furthermore, initial results using the quasi-analytical approach were obtained and compared to those computed using the finite difference (FD) approach. The basic goals achieved were: (1) carrying out various debugging operations pertaining to the quasi-analytical method; (2) adding section design variables to the sensitivity equation in the form of multiple right-hand sides; (3) reconfiguring the analysis/sensitivity package in order to facilitate the execution of analysis/FD/QA test cases; and (4) enhancing the display of output data to allow careful examination of the results and to permit various comparisons of sensitivity derivatives obtained using the FD/QA methods to be conducted easily and quickly. In addition to discussing the above goals, the results of executing subcritical and supercritical test cases are presented.

  7. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  8. International Natural Gas Model 2011, Model Documentation Report

    EIA Publications

    2013-01-01

    This report documents the objectives, analytical approach and development of the International Natural Gas Model (INGM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  9. 78 FR 46325 - Pacific Fishery Management Council (Pacific Council); Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... Groundfish Subcommittee teleconference is to discuss analytical approaches for a meta-analysis of... development of analyses used to inform proxy F MSY harvest rates for consideration by the Pacific Council's...

  10. An illustrative analysis of technological alternatives for satellite communications

    NASA Technical Reports Server (NTRS)

    Metcalfe, M. R.; Cazalet, E. G.; North, D. W.

    1979-01-01

    The demand for satellite communications services in the domestic market is discussed. Two approaches to increasing system capacity are the expansion of service into frequencies presently allocated but not used for satellite communications, and the development of technologies that provide a greater level of service within the currently used frequency bands. The development of economic models and analytic techniques for evaluating capacity expansion alternatives such as these is presented. The satellite orbit-spectrum problem is examined, and some suitable analytic approaches are outlined. An illustrative analysis of domestic communications satellite technology options for providing increased levels of service is also presented. The analysis illustrates the use of probabilities and decision trees in analyzing alternatives, and provides insight into the important aspects of the orbit-spectrum problem that would warrant inclusion in a larger-scale analysis.

  11. Viscous damping and spring force in periodic perforated planar microstructures when the Reynolds’ equation cannot be applied

    PubMed Central

    Homentcovschi, Dorel; Miles, Ronald N.

    2010-01-01

    A model of squeeze-film behavior is developed based on Stokes' equations for viscous, compressible, isothermal flows. The flow domain is an axisymmetrical, unit-cell approximation of a planar, periodic, perforated microstructure. The model is developed for cases when the lubrication approximation cannot be applied. The complex force generated by vibrations of the diaphragm driving the flow has two components: the damping force and the spring force. While at large frequencies the spring force dominates, at low (acoustical) frequencies the damping force is the most important part. The analytical approach developed here yields an explicit formula for both forces. In addition, using a finite element software package, the damping force is also obtained numerically. A comparison is made between the analytic result, the numerical solution, and some experimental data found in the literature, which validates the analytic formula and provides compelling arguments about its value in designing microelectromechanical devices. PMID:20329828

  12. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course

    ERIC Educational Resources Information Center

    Lanigan, Katherine C.

    2008-01-01

    Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…

  13. Systematically reviewing and synthesizing evidence from conversation analytic and related discursive research to inform healthcare communication practice and policy: an illustrated guide

    PubMed Central

    2013-01-01

    Background: Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods: We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures: reviewing existing systematic review methods and our own prior experience of applying these; clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing; holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing; and attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues, including illness progression and dying. Results: We produced a step-by-step guide, which we describe here in terms of eight stages and illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing and new techniques designed for working with conversation analytic evidence. Conclusions: The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181

  14. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE PAGES

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik; ...

    2017-10-06

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment-region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment-region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, namely the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES runs. Although the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment-region model is combined with the one-dimensional model of Massman, which is applicable in the interior of the canopy, to attain an analytical model capable of describing the mean flow over the full canopy domain. Finally, the model is tested against an analytical model based on a linearization approach.
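    For context, the canopy drag sink and the adjustment length referred to above take the standard canopy-flow form; this sketch shows the conventional definitions, not the authors' exact closure:

```latex
% Streamwise RANS momentum budget inside the canopy, with drag sink
% (C_d: drag coefficient, a: plant area density):
\bar{u}\frac{\partial \bar{u}}{\partial x}
  + \bar{w}\frac{\partial \bar{u}}{\partial z}
  = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x}
    - \frac{\partial \overline{u'w'}}{\partial z}
    - C_d\,a\,\bar{u}^2
% Balancing advection against drag gives the adjustment length scale:
L_c = \frac{1}{C_d\,a}
```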

  15. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment-region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment-region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, namely the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES runs. Although the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment-region model is combined with the one-dimensional model of Massman, which is applicable in the interior of the canopy, to attain an analytical model capable of describing the mean flow over the full canopy domain. Finally, the model is tested against an analytical model based on a linearization approach.

  16. Stochastic modelling of the hydrologic operation of rainwater harvesting systems

    NASA Astrophysics Data System (ADS)

    Guo, Rui; Guo, Yiping

    2018-07-01

    Rainwater harvesting (RWH) systems are an effective low impact development practice that provides both water supply and runoff reduction benefits. A stochastic modelling approach is proposed in this paper to quantify the water supply reliability and stormwater capture efficiency of RWH systems. The input rainfall series is represented as a marked Poisson process and two typical water use patterns are analytically described. The stochastic mass balance equation is solved analytically, and based on this, explicit expressions relating system performance to system characteristics are derived. The performances of a wide variety of RWH systems located in five representative climatic regions of the United States are examined using the newly derived analytical equations. Close agreements between analytical and continuous simulation results are shown for all the compared cases. In addition, an analytical equation is obtained expressing the required storage size as a function of the desired water supply reliability, average water use rate, as well as rainfall and catchment characteristics. The equations developed herein constitute a convenient and effective tool for sizing RWH systems and evaluating their performances.
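    The analytical expressions lend themselves to verification by brute-force continuous simulation. A minimal sketch of the simulation side of that comparison, with hypothetical parameter values throughout (arrival rate, mean depth, tank size, and demand are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters (illustrative only, not the paper's values).
rate = 1 / 2.5       # rainfall event arrival rate (events/day)
mean_depth = 8.0     # mean event depth (mm), exponential marks
area = 100.0         # roof catchment area (m^2)
runoff_coeff = 0.9   # fraction of rainfall captured
storage = 5.0        # tank capacity (m^3)
demand = 0.2         # constant water use (m^3/day)
n_days = 50_000

# Marked Poisson process: exponential interarrival times and depths.
rain = np.zeros(n_days)
t = rng.exponential(1 / rate)
while t < n_days:
    rain[int(t)] += rng.exponential(mean_depth)
    t += rng.exponential(1 / rate)

# Daily mass balance: inflow, spill at capacity, then withdrawal.
level, supplied = 0.0, 0.0
for day in range(n_days):
    level = min(level + runoff_coeff * area * rain[day] / 1000.0, storage)
    use = min(level, demand)
    level -= use
    supplied += use

print(f"water supply reliability ~ {supplied / (demand * n_days):.3f}")
```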

  17. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to adaptive thresholding in shape anomaly detection. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observed shape hindered by strong measurement noise; the distribution depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is confirmed by computer simulations, which show nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take variations in the noise level into account.
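    The shape-independence claim is easy to probe numerically. A minimal sketch that estimates the matched-shape score distribution at a given noise level and derives a threshold from it (the template, noise level, and 1% quantile are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Any template works if the score distribution is shape-independent;
# a single sine "cycle" stands in for a reference ECG shape here.
n = 256
ref = np.sin(np.linspace(0, 2 * np.pi, n))

sigma = 1.0          # assumed (or estimated) measurement-noise level
scores = np.array([cosine(ref, ref + rng.normal(0, sigma, n))
                   for _ in range(20_000)])

# Adaptive threshold: flag a shape as anomalous if its score falls below
# the 1st percentile of the matched-shape distribution at this noise level.
threshold = np.quantile(scores, 0.01)
print(f"adaptive threshold at sigma={sigma}: {threshold:.3f}")
```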

  18. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1994-01-01

    The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The usage of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e. three dimensional) at transonic flow conditions. (4) The incremental iterative technique has been applied to the three dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; and coupled aeroelastic results for an aspect ratio three wing in transonic flow have been obtained.

  19. Comparison of NMR simulations of porous media derived from analytical and voxelized representations.

    PubMed

    Jin, Guodong; Torres-Verdín, Carlos; Toumelin, Emmanuel

    2009-10-01

    We develop and compare two formulations of the random-walk method, grain-based and voxel-based, to simulate the nuclear-magnetic-resonance (NMR) response of fluids contained in various models of porous media. The grain-based approach uses a spherical grain pack as input, where the solid surface is defined analytically without approximation. In the voxel-based approach, the input is a computed-tomography or computer-generated image of reconstructed porous media. Implementation of the two approaches is largely the same, except for the representation of the porous media. For comparison, both approaches are applied to various analytical and digitized models of porous media: an isolated spherical pore, a simple cubic packing of spheres, and random packings of monodisperse and polydisperse spheres. We find that spin magnetization decays much faster in the digitized models than in their analytical counterparts. The difference in decay rate relates to the overestimation of surface area due to the discretization of the sample; it cannot be eliminated even as the voxel size decreases. However, once the effect of the surface-area increase is accounted for in the simulation of surface relaxation, good quantitative agreement is found between the two approaches. Different grain or pore shapes entail different rates of increase of surface area, and we therefore emphasize that the value of the "surface-area-corrected" coefficient may not be universal. Using the example of an X-ray CT image of a Fontainebleau rock sample, we show that voxel size has a significant effect on the calculated surface area and, therefore, on the numerically simulated magnetization response.
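    The surface-area overestimation is a purely geometric effect and can be reproduced in a few lines; a self-contained illustration (not the authors' code) that voxelizes a unit sphere and compares the exposed-face area with the analytic 4πr², showing a bias that persists as the grid is refined:

```python
import numpy as np

# Voxelize a sphere of radius 1 and count exposed voxel faces; the face
# count times the face area systematically overestimates 4*pi*r^2, and
# refining the voxel size does not remove the bias (the effect the
# abstract attributes to digitized models).
for n in (32, 64, 128):               # voxels per axis
    h = 2.0 / n                       # voxel edge length
    ax = (np.arange(n) + 0.5) * h - 1.0
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    solid = x**2 + y**2 + z**2 <= 1.0     # solid grain, pore outside

    pad = np.pad(solid, 1)                # empty border, no wraparound
    faces = 0
    for axis in range(3):
        faces += np.count_nonzero(pad & ~np.roll(pad, 1, axis))
        faces += np.count_nonzero(pad & ~np.roll(pad, -1, axis))
    digitized = faces * h**2
    print(f"n={n:4d}: voxel area / analytic area = {digitized / (4*np.pi):.3f}")
```

    The ratio tends to about 1.5 rather than 1, which is why a correction for the surface-area increase is needed before the two random-walk formulations agree.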

  20. A robust and versatile signal-on fluorescence sensing strategy based on SYBR Green I dye and graphene oxide

    PubMed Central

    Qiu, Huazhang; Wu, Namei; Zheng, Yanjie; Chen, Min; Weng, Shaohuang; Chen, Yuanzhong; Lin, Xinhua

    2015-01-01

    A robust and versatile signal-on fluorescence sensing strategy was developed to provide label-free detection of various target analytes. The strategy used SYBR Green I dye and graphene oxide as signal reporter and signal-to-background ratio enhancer, respectively. Multidrug resistance protein 1 (MDR1) gene and mercury ion (Hg2+) were selected as target analytes to investigate the generality of the method. The linear relationship and specificity of the detections showed that the sensitive and selective analyses of target analytes could be achieved by the proposed strategy with low detection limits of 0.5 and 2.2 nM for MDR1 gene and Hg2+, respectively. Moreover, the strategy was used to detect real samples. Analytical results of MDR1 gene in the serum indicated that the developed method is a promising alternative approach for real applications in complex systems. Furthermore, the recovery of the proposed method for Hg2+ detection was acceptable. Thus, the developed label-free signal-on fluorescence sensing strategy exhibited excellent universality, sensitivity, and handling convenience. PMID:25565810

  1. An analytic approach for the study of pulsar spindown

    NASA Astrophysics Data System (ADS)

    Chishtie, F. A.; Zhang, Xiyang; Valluri, S. R.

    2018-07-01

    In this work we develop an analytic approach to the study of pulsar spindown. We use the monopolar spindown model of Alvarez and Carramiñana (2004 Astron. Astrophys. 414 651–8), which assumes an inverse linear law of magnetic field decay of the pulsar, to extract an all-order formula for the spindown parameters using the Taylor series representation of Jaranowski et al (1998 Phys. Rev. D 58 063001). We further extend the analytic model to incorporate the quadrupole term that accounts for the emission of gravitational radiation, and obtain expressions for the period P and frequency f in terms of transcendental equations. We derive the analytic solution for pulsar frequency spindown in the absence of glitches. We examine the different cases that arise in the analysis of the roots in the solution of the nonlinear differential equation for pulsar period evolution. We provide expressions for the spindown parameters and find that the spindown values are in reasonable agreement with observations. A detection of gravitational waves from pulsars will be the next landmark in the field of multi-messenger gravitational wave astronomy.
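    For readers outside the field, the ingredients referred to above have compact standard forms; a sketch of the conventional relations (not the paper's all-order formula):

```latex
% Taylor-series representation of the rotation frequency:
f(t) = f_0 + \dot{f}_0\,(t - t_0) + \tfrac{1}{2}\,\ddot{f}_0\,(t - t_0)^2 + \dots
% A power-law spindown ties the coefficients together via the braking index:
\dot{f} = -K f^{\,n}, \qquad n = \frac{f\,\ddot{f}}{\dot{f}^{\,2}}
% n = 3 for pure magnetic-dipole braking; n = 5 for pure quadrupole
% (gravitational-wave) braking.
```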

  2. Raman spectroscopy as a process analytical technology for pharmaceutical manufacturing and bioprocessing.

    PubMed

    Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R

    2017-01-01

    Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors affecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy that have expanded its scope as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and Raman spectroscopy is now successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.

  3. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Improved full analytical polygon-based method using Fourier analysis of the three-dimensional affine transformation.

    PubMed

    Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia

    2014-03-01

    Previous research [Appl. Opt. 52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue that research and propose an improved full analytical polygon-based method built upon this theory. Vertex vectors of the primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level, angle-dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole-matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate the shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed over the conventional full analytical approach. Optical experimental results are demonstrated which prove that the proposed method can effectively reconstruct three-dimensional scenes.
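    The analytical shortcut rests on the Fourier affine theorem; a sketch in one common sign convention (the paper's notation may differ):

```latex
% If the arbitrary triangle is the affine image of the primitive one,
g(\mathbf{x}) = f(\mathbf{A}\mathbf{x} + \mathbf{b}),
% its spectrum follows from the primitive spectrum F without numerical
% integration:
G(\mathbf{u}) = \frac{1}{\lvert \det\mathbf{A} \rvert}\,
  \exp\!\bigl(2\pi i\,\mathbf{b}^{\mathsf{T}} \mathbf{A}^{-\mathsf{T}} \mathbf{u}\bigr)\,
  F\!\bigl(\mathbf{A}^{-\mathsf{T}} \mathbf{u}\bigr)
```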

  5. Characterizing the uncertainty of classification methods and its impact on the performance of crowdsourcing

    NASA Astrophysics Data System (ADS)

    Ribera, Javier; Tahboub, Khalid; Delp, Edward J.

    2015-03-01

    Video surveillance systems are widely deployed for public safety. Real-time monitoring and alerting are among the key requirements for building an intelligent video surveillance system. Real-life settings introduce many challenges that can impact the performance of real-time video analytics, which should therefore be resilient to adverse and changing scenarios. In this paper we present several approaches to characterizing the uncertainty of a classifier and incorporate crowdsourcing at the times when the method is uncertain about making a particular decision. Incorporating crowdsourcing when a real-time video analytic method is uncertain about a particular decision is known as online active learning from crowds. We evaluate our proposed approach by testing a method we developed previously for crowd flow estimation. We present three different approaches to characterizing the uncertainty of the classifier in the automatic crowd flow estimation method and test them by introducing video quality degradations. Criteria to aggregate crowdsourcing results are also proposed and evaluated. An experimental evaluation is conducted using a publicly available dataset.
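    One common way to operationalize "uncertain" is the predictive entropy of the classifier's output, with majority voting as the aggregation criterion; a minimal sketch of that single combination (the paper evaluates several characterizations and aggregation criteria, and the threshold here is an illustrative choice):

```python
import numpy as np

def predictive_entropy(probs):
    """Shannon entropy of a classifier's class-probability vector."""
    p = np.clip(probs, 1e-12, 1.0)
    return -np.sum(p * np.log(p))

def decide(probs, crowd_votes, entropy_threshold=0.5):
    """Route to the crowd when the classifier is uncertain.

    probs             -- class probabilities from the automatic method
    crowd_votes       -- callable returning a list of crowd labels
    entropy_threshold -- illustrative cut-off, tuned in practice
    """
    if predictive_entropy(probs) < entropy_threshold:
        return int(np.argmax(probs))          # confident: trust the classifier
    votes = crowd_votes()                     # uncertain: ask the crowd
    return int(np.bincount(votes).argmax())   # majority-vote aggregation

# Toy usage: an uncertain prediction triggers crowdsourcing.
print(decide(np.array([0.55, 0.45]), crowd_votes=lambda: [1, 1, 0]))
```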

  6. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied, and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate it by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1–14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case and nearly the same at frequency, with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high-quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in hydraulic conductivity as well. PMID:25248170
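    As background, the quantity a conventional DMA fit targets is the complex modulus; a sketch of the standard definition (the FE inversion above replaces the analytical approximation used to extract it):

```latex
% Sinusoidal strain \epsilon(t) = \epsilon_0 \sin(\omega t) produces a
% phase-shifted stress \sigma(t) = \sigma_0 \sin(\omega t + \delta), so
E^{\ast}(\omega) = \frac{\sigma_0}{\epsilon_0}\,e^{i\delta}
  = E'(\omega) + i\,E''(\omega)
% E' is the storage modulus, E'' the loss modulus, and
% \tan\delta = E''/E' the loss tangent.
```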

  7. A New Look at Multiple Goal Pursuit: The Promise of a Person-Centered Approach

    ERIC Educational Resources Information Center

    Wormington, Stephanie Virgine; Linnenbrink-Garcia, Lisa

    2017-01-01

    The current study reviewed and synthesized studies employing a person-centered approach to studying achievement goals. Towards this end, a common labeling scheme was developed for goal profiles. Ten profile types were identified across studies and compared via meta-analytic techniques in terms of academic motivation, social/emotional well-being,…

  8. An Investigation of Visual, Aural, Motion and Control Movement Cues.

    ERIC Educational Resources Information Center

    Matheny, W. G.; And Others

    A study was conducted to determine the ways in which multi-sensory cues can be simulated and effectively used in the training of pilots. Two analytical bases, one called the stimulus environment approach and the other an information array approach, are developed along with a cue taxonomy. Cues are postulated on the basis of information gained from…

  9. Understanding Movement: A Sociocultural Approach to Exploring Moving Humans

    ERIC Educational Resources Information Center

    Larsson, Hakan; Quennerstedt, Mikael

    2012-01-01

    The purpose of the article is to outline a sociocultural way of exploring human movement. Our ambition is to develop an analytical framework where moving humans are explored in terms of what it means to move as movements are performed by somebody, for a certain purpose, and in a certain situation. We find this approach in poststructural…

  10. Residential Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Model Documentation - Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code.

  11. Turbofan forced mixer lobe flow modeling. 1: Experimental and analytical assessment

    NASA Technical Reports Server (NTRS)

    Barber, T.; Paterson, R. W.; Skebe, S. A.

    1988-01-01

    A joint analytical and experimental investigation of three-dimensional flowfield development within the lobe region of turbofan forced mixer nozzles is described. The objective was to develop a method for predicting the lobe exit flowfield. In the analytical approach, a linearized inviscid aerodynamic theory was used to represent the axial and secondary flows within the three-dimensional convoluted mixer lobes, and a three-dimensional boundary layer analysis was applied thereafter to account for viscous effects. The experimental phase of the program employed three planar mixer lobe models having different waveform shapes and lobe heights, for which detailed measurements were made of the three-dimensional velocity field and total pressure field at the lobe exit plane. Velocity data were obtained using laser Doppler velocimetry (LDV), while total pressure probing and hot-wire anemometry were employed to define exit-plane total pressure and boundary layer development. Data and analysis were compared to assess the prediction accuracy of the analytical model. As a result of this study, a planar mixer geometry analysis was developed. A principal conclusion is that the global mixer lobe flowfield is inviscid and can be predicted from an inviscid analysis and a Kutta condition.

  12. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    PubMed

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  13. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provide state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  14. Assessing Leadership Knowledge in a Principalship Preparation Programme

    ERIC Educational Resources Information Center

    Seong, David Ng Foo

    2013-01-01

    Purpose: The purpose of this paper is to assess leadership learning in a principalship development programme. Design/methodology/approach: This case study adopted Popper's three worlds as an analytical framework to assess leadership learning in a principalship development programme. The unit of assessment of learning is knowledge--more…

  15. Developing analytical approaches to explore the connectionbetween endocrine-active pharmaceuticals in waterto effects in fish

    EPA Science Inventory

    The emphasis of this research project was to develop, and optimize, a solid-phase extraction (SPE) method and high performance liquid chromatography-electrospray ionization- mass spectrometry (LC-MS/MS) method, such that a linkage between the detection of endocrine active pharma...

  16. A new method for constructing analytic elements for groundwater flow.

    NASA Astrophysics Data System (ADS)

    Strack, O. D.

    2007-12-01

    The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., O. D. L. Strack, Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41, 1005, 2003). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have also been developed for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons. First, it requires the existence of the elementary solutions; second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure presented here generalizes this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach is applied, along with numerical examples, to the modified Helmholtz equation and to the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.
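    For orientation, a sketch of the superposition principle and the modified Helmholtz equation named above (generic forms, not the new elements themselves):

```latex
% An analytic element solution is a superposition of basis functions that
% each satisfy the governing equation exactly; boundary conditions are met
% by choosing the coefficients a_n:
\Phi(\mathbf{x}) = \sum_{n} a_n\,\phi_n(\mathbf{x}),
\qquad \nabla^2 \phi_n - \lambda^2 \phi_n = 0
% (modified Helmholtz case). The construction described above starts from
% a harmonic function, \nabla^2 \varphi = 0, and generalizes it so that
% the result satisfies the target equation.
```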

  17. A Peer Coaching-Based Professional Development Approach to Improving the Learning Participation and Learning Design Skills of In-Service Teachers

    ERIC Educational Resources Information Center

    Ma, Ning; Xin, Shuang; Du, Jia-Yuan

    2018-01-01

    Personalized learning based on learning analytics has become increasingly important for teachers' development via providing adaptive contents and strategies for teachers by identifying their questions and needs. Currently, most studies on teachers' professional development focus on pre-service teachers, and studies on teachers' personalized…

  18. On-orbit cryogenic fluid transfer

    NASA Technical Reports Server (NTRS)

    Aydelott, J. C.; Gille, J. P.; Eberhardt, R. N.

    1984-01-01

    A number of future NASA and DOD missions have been identified that will require, or could benefit from, resupply of cryogenic liquids in orbit. The most promising approach for accomplishing cryogenic fluid transfer in the weightless environment of space is the thermodynamic filling technique. This approach involves initially reducing the receiver tank temperature using several charge-hold-vent cycles, followed by filling the tank without venting. Martin Marietta Denver Aerospace, under contract to the NASA Lewis Research Center, is currently developing analytical models to describe the on-orbit cryogenic fluid transfer process. A detailed design of a shuttle-attached experimental facility, which will provide the data necessary to verify the analytical models, is also being performed.

  19. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically, using inputs and the Student-t distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-t distribution can encompass the exact solution.
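    The benchmark has a closed form, which makes the input-sampling idea easy to illustrate. A minimal sketch, with illustrative perturbation magnitudes (not the paper's values), that propagates uncertain inputs through the exact centerline-velocity formula and wraps the result in a Student-t interval:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Exact centerline velocity for fully developed laminar flow between
# stationary parallel plates: u_max = (-dp/dx) * h^2 / (8 * mu).
def u_max(dpdx, h, mu):
    return -dpdx * h**2 / (8.0 * mu)

# Treat each input as an uncertain (oscillatory) term; the spreads
# below are illustrative assumptions.
n = 10
samples = u_max(
    dpdx=rng.normal(-100.0, 2.0, n),   # pressure gradient (Pa/m)
    h=rng.normal(0.01, 1e-4, n),       # plate spacing (m)
    mu=rng.normal(1e-3, 2e-5, n),      # dynamic viscosity (Pa*s)
)

mean = samples.mean()
half_width = stats.t.ppf(0.975, df=n - 1) * samples.std(ddof=1) / np.sqrt(n)
print(f"u_max = {mean:.4f} +/- {half_width:.4f} m/s (95% Student-t interval)")
print(f"exact nominal value: {u_max(-100.0, 0.01, 1e-3):.4f} m/s")
```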

  20. Collector modulation in high-voltage bipolar transistor in the saturation mode: Analytical approach

    NASA Astrophysics Data System (ADS)

    Dmitriev, A. P.; Gert, A. V.; Levinshtein, M. E.; Yuferev, V. S.

    2018-04-01

    A simple analytical model is developed, capable of replacing the numerical solution of a system of nonlinear partial differential equations by solving a simple algebraic equation when analyzing the collector resistance modulation of a bipolar transistor in the saturation mode. In this approach, the leakage of the base current into the emitter and the recombination of non-equilibrium carriers in the base are taken into account. The data obtained are in good agreement with the results of numerical calculations and make it possible to describe both the motion of the front of the minority carriers and the steady state distribution of minority carriers across the collector in the saturation mode.

  1. Measurement-based reliability prediction methodology. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Linn, Linda Shen

    1991-01-01

    In the past, analytical and measurement-based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. That issue is addressed here. A combined statistical/analytical approach is proposed that uses measurements from one environment to model the system failure behavior in a new environment. A comparison of the predicted results with the actual data from the new environment shows a close correspondence.

  2. A lanthanide-based chemosensor for bioavailable Fe3+ using a fluorescent siderophore: an assay displacement approach.

    PubMed

    Orcutt, Karen M; Jones, W Scott; McDonald, Andrea; Schrock, David; Wallace, Karl J

    2010-01-01

    The measurement of trace analytes in aqueous systems has become increasingly important for understanding ocean primary productivity. In oceanography, iron (Fe) is a key element in regulating ocean productivity and microplankton assemblages, and it has been identified as a causative element in the development of some harmful algal blooms. The chemosensor developed in this study is based on an indicator displacement approach that utilizes time-resolved fluorescence and fluorescence resonance energy transfer as the sensing mechanism to achieve detection of Fe3+ ions at concentrations as low as 5 nM. This novel approach holds promise for the development of photoactive chemosensors for ocean deployment.

  3. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    NASA Astrophysics Data System (ADS)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The earth analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate/degree program. All programs share the goal of preparing a STEM workforce for successful earth-analytics-driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous, and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from the evaluation of both an interdisciplinary undergraduate/graduate-level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, which includes synchronous in-person teaching and active hands-on classroom learning combined with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  4. Analyte-driven switching of DNA charge transport: de novo creation of electronic sensors for an early lung cancer biomarker.

    PubMed

    Thomas, Jason M; Chakraborty, Banani; Sen, Dipankar; Yu, Hua-Zhong

    2012-08-22

    A general approach is described for the de novo design and construction of aptamer-based electrochemical biosensors for potentially any analyte of interest, ranging from small ligands to biological macromolecules. As a demonstration of the approach, we report the rapid development of a made-to-order electronic sensor for a newly reported early biomarker for lung cancer (CTAP III/NAP2). The steps include the in vitro selection and characterization of DNA aptamer sequences, design and biochemical testing of wholly DNA sensor constructs, and translation to a functional electrode-bound sensor format. The working principle of this distinct class of electronic biosensors is the enhancement of DNA-mediated charge transport in response to analyte binding. We first verify such analyte-responsive charge transport switching in solution, using biochemical methods; successful sensor variants were then immobilized on gold electrodes. We show that using these sensor-modified electrodes, CTAP III/NAP2 can be detected with both high specificity and sensitivity (Kd ≈ 1 nM) through a direct electrochemical reading. To investigate the underlying basis of analyte-binding-induced conductivity switching, we carried out Förster resonance energy transfer (FRET) experiments. The FRET data establish that the conductivity switching in these sensors results from very subtle structural/conformational changes, rather than large-scale, global folding events. The implications of this finding are discussed with respect to possible charge transport switching mechanisms in electrode-bound sensors. Overall, the approach we describe here represents a unique design principle for aptamer-based electrochemical sensors; its application should enable rapid, on-demand access to a class of portable biosensors that offer robust, inexpensive, and operationally simplified alternatives to conventional antibody-based immunoassays.

  5. A Case Study: Analyzing City Vitality with Four Pillars of Activity-Live, Work, Shop, and Play.

    PubMed

    Griffin, Matt; Nordstrom, Blake W; Scholes, Jon; Joncas, Kate; Gordon, Patrick; Krivenko, Elliott; Haynes, Winston; Higdon, Roger; Stewart, Elizabeth; Kolker, Natali; Montague, Elizabeth; Kolker, Eugene

    2016-03-01

    This case study evaluates and tracks the vitality of a city (Seattle), based on a data-driven approach using strategic, robust, and sustainable metrics. It was conducted collaboratively by the Downtown Seattle Association (DSA) and CDO Analytics teams. The DSA is a nonprofit organization focused on making the city of Seattle and its Downtown a healthy and vibrant place to Live, Work, Shop, and Play. DSA primarily operates through public policy advocacy, community and business development, and marketing. In 2010, the organization turned to CDO Analytics (cdoanalytics.org) to develop a process that can guide and strategically focus DSA efforts and resources for maximal benefit to the city of Seattle and its Downtown. CDO Analytics was asked to develop clear, easily understood, and robust metrics for a baseline evaluation of the health of the city, as well as for ongoing monitoring and comparisons of vitality, sustainability, and growth. The DSA and CDO Analytics teams strategized on how to effectively assess and track the vitality of Seattle and its Downtown. The two teams filtered a variety of data sources and evaluated the veracity of multiple diverse metrics. This iterative process resulted in a small number of strategic, simple, reliable, and sustainable metrics across four pillars of activity: Live, Work, Shop, and Play. Data from the 5 years before 2010 were used to develop and train the metrics and model, and data from 2010 onward were used for testing and validation. This work enabled DSA to routinely track these strategic metrics, use them to monitor the vitality of Downtown Seattle, prioritize improvements, and identify new value-added programs. As a result, the four-pillar approach became an integral part of the data-driven decision-making and execution of the Seattle community's improvement activities. The approach described in this case study is actionable, robust, inexpensive, and easy to adopt and sustain. It can be applied to cities, districts, counties, regions, states, or countries, enabling cross-comparisons and improvements of vitality, sustainability, and growth.

  6. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used to solve the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin-wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
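    The Kreisselmeier-Steinhauser function mentioned above is a smooth envelope over multiple objectives or constraints; a sketch of its standard form:

```latex
% KS envelope combining objectives/constraints f_1, ..., f_m into one
% smooth function (rho controls the sharpness of the max approximation):
KS(\mathbf{x}) = f_{\max}(\mathbf{x}) + \frac{1}{\rho}
  \ln \sum_{i=1}^{m} \exp\!\bigl[\rho\,\bigl(f_i(\mathbf{x}) - f_{\max}(\mathbf{x})\bigr)\bigr],
\qquad f_{\max}(\mathbf{x}) = \max_i f_i(\mathbf{x})
% KS -> f_max as rho -> infinity; shifting by f_max keeps the
% exponentials numerically well conditioned.
```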

  7. Phase-0/microdosing studies using PET, AMS, and LC-MS/MS: a range of study methodologies and conduct considerations. Accelerating development of novel pharmaceuticals through safe testing in humans - a practical guide.

    PubMed

    Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T

    2017-05-01

    Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use, and implied safety, of limited exposures to test articles. The use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometry (AMS), positron emission tomography (PET), and liquid chromatography tandem mass spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade have demonstrated the reliability of extrapolation from sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include the study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilizing combinations of these analytic techniques increases the versatility of study designs and the power of the data obtained.

  8. Surface modified capillary electrophoresis combined with in solution isoelectric focusing and MALDI-TOF/TOF MS: a gel-free multidimensional electrophoresis approach for proteomic profiling--exemplified on human follicular fluid.

    PubMed

    Hanrieder, Jörg; Zuberovic, Aida; Bergquist, Jonas

    2009-04-24

    Development of miniaturized analytical tools continues to be of great interest for facing the challenges of proteomic analysis of complex biological samples such as human body fluids. In light of these challenges, special emphasis is put on the speed and simplicity of newly designed technological approaches as well as the need for cost efficiency and low sample consumption. In this study, we present an alternative multidimensional bottom-up approach to proteomic profiling for fast, efficient, and sensitive protein analysis in complex biological matrices. The presented setup was based on sample pre-fractionation using microscale in-solution isoelectric focusing (IEF) followed by tryptic digestion and subsequent capillary electrophoresis (CE) coupled off-line to matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometry (MALDI-TOF MS/MS). For high-performance CE separation, PolyE-323 modified capillaries were applied to minimize analyte-wall interactions. The potential of the analytical setup was demonstrated on human follicular fluid (hFF), a typical complex human body fluid with clinical implications. The obtained results show significant identification of 73 unique proteins (identified at the 95% significance level), including mostly acute-phase proteins but also protein identities that are well known to be extensively involved in follicular development.

  9. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  10. Answer first: Applying the heuristic-analytic theory of reasoning to examine student intuitive thinking in the context of physics

    NASA Astrophysics Data System (ADS)

    Kryjevskaia, Mila; Stetzer, MacKenzie R.; Grosz, Nathaniel

    2014-12-01

    We have applied the heuristic-analytic theory of reasoning to interpret inconsistencies in student reasoning approaches to physics problems. This study was motivated by an emerging body of evidence that suggests that student conceptual and reasoning competence demonstrated on one task often fails to be exhibited on another. Indeed, even after instruction specifically designed to address student conceptual and reasoning difficulties identified by rigorous research, many undergraduate physics students fail to build reasoning chains from fundamental principles even though they possess the required knowledge and skills to do so. Instead, they often rely on a variety of intuitive reasoning strategies. In this study, we developed and employed a methodology that allowed for the disentanglement of student conceptual understanding and reasoning approaches through the use of sequences of related questions. We have shown that the heuristic-analytic theory of reasoning can be used to account for, in a mechanistic fashion, the observed inconsistencies in student responses. In particular, we found that students tended to apply their correct ideas in a selective manner that supported a specific and likely anticipated conclusion while neglecting to employ the same ideas to refute an erroneous intuitive conclusion. The observed reasoning patterns were consistent with the heuristic-analytic theory, according to which reasoners develop a "first-impression" mental model and then construct an argument in support of the answer suggested by this model. We discuss implications for instruction and argue that efforts to improve student metacognition, which serves to regulate the interaction between intuitive and analytical reasoning, is likely to lead to improved student reasoning.

  11. Peculiarities of the momentum distribution functions of strongly correlated charged fermions

    NASA Astrophysics Data System (ADS)

    Larkin, A. S.; Filinov, V. S.; Fortov, V. E.

    2018-01-01

    A new numerical version of the Wigner approach to the quantum thermodynamics of strongly coupled systems of particles has been developed for extreme conditions, where analytical approximations based on different kinds of perturbation theory cannot be applied. An explicit analytical expression for the Wigner function has been obtained in the linear and harmonic approximations. Fermi statistical effects are accounted for by an effective pair pseudopotential that depends on the coordinates, momenta, and degeneracy parameter of the particles and takes into account Pauli blocking of fermions. A new quantum Monte Carlo method for calculating average values of arbitrary quantum operators has been developed. Calculations of the momentum distribution functions and the pair correlation functions of the degenerate ideal Fermi gas have been carried out to test the developed approach. Comparison of the obtained momentum distribution functions of strongly correlated Coulomb systems with the Maxwell-Boltzmann and Fermi distributions shows the significant influence of interparticle interaction both at small momenta and in the high-energy quantum ‘tails’.
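    For reference, the central object has a standard definition; a sketch for a single degree of freedom (the paper works with the many-particle analogue and effective pair pseudopotentials):

```latex
% Wigner function of the canonical density operator \hat{\rho} = e^{-\beta\hat{H}}/Z:
W(p, q) = \frac{1}{2\pi\hbar} \int d\xi\;
  e^{-i p \xi/\hbar}\,
  \left\langle q + \tfrac{\xi}{2} \right| \hat{\rho} \left| q - \tfrac{\xi}{2} \right\rangle
% Thermodynamic averages become phase-space integrals over Weyl symbols:
\langle \hat{A} \rangle = \iint dp\,dq\; A_W(p, q)\, W(p, q)
```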

  12. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  13. Developing students’ ideas about lens imaging: teaching experiments with an image-based approach

    NASA Astrophysics Data System (ADS)

    Grusche, Sascha

    2017-07-01

    Lens imaging is a classic topic in physics education. To guide students from their holistic viewpoint to the scientists’ analytic viewpoint, an image-based approach to lens imaging has recently been proposed. To study the effect of the image-based approach on undergraduate students’ ideas, teaching experiments are performed and evaluated using qualitative content analysis. Some of the students’ ideas have not been reported before, namely those related to blurry lens images, and those developed by the proposed teaching approach. To describe learning pathways systematically, a conception-versus-time coordinate system is introduced, specifying how teaching actions help students advance toward a scientific understanding.

  14. Using a dyadic logistic multilevel model to analyze couple data.

    PubMed

    Preciado, Mariana A; Krull, Jennifer L; Hicks, Andrew; Gipson, Jessica D

    2016-02-01

    There is growing recognition within the sexual and reproductive health field of the importance of incorporating both partners' perspectives when examining sexual and reproductive health behaviors. Yet the analytical approaches needed to address couple data have not been readily integrated and utilized within the demographic and public health literature. This paper provides readers unfamiliar with analytical approaches to couple data with an applied example of dyadic logistic multilevel modeling, a useful approach for assessing the individual, partner, and couple characteristics that are related to individuals' reproductively relevant beliefs, attitudes, and behaviors. The use of multilevel models in reproductive health research can help researchers develop a more comprehensive picture of the way in which individuals' reproductive health outcomes are situated in a larger relationship and cultural context. Copyright © 2016 Elsevier Inc. All rights reserved.
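    A sketch of the model family being described, in one common notation (the predictors are generic placeholders, not the paper's variables):

```latex
% Two-level dyadic logistic model: individuals i (level 1) nested in
% couples j (level 2), with actor, partner, and couple-level predictors:
\operatorname{logit}\,\Pr(y_{ij} = 1)
  = \beta_0 + \beta_1\,x_{ij} + \beta_2\,x_{i'j} + \beta_3\,z_j + u_j,
\qquad u_j \sim \mathcal{N}(0, \sigma_u^2)
% x_{ij}: the respondent's own characteristic; x_{i'j}: the partner's;
% z_j: a couple-level characteristic; u_j: the shared couple effect.
```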

  15. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    PubMed Central

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-01-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, with their reduced sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting, incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide bias vs. variance performance matched to iterative TOF reconstruction with a matched resolution model. PMID:27032968

  16. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    NASA Astrophysics Data System (ADS)

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-05-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of time-of-flight (TOF) scanners, with their reduced sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: direct image reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting, incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide bias versus variance performance matched to iterative TOF reconstruction with a matched resolution model.

  17. World Energy Projection System Plus Model Documentation: Coal Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  18. World Energy Projection System Plus Model Documentation: Transportation Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  19. World Energy Projection System Plus Model Documentation: Residential Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  20. World Energy Projection System Plus Model Documentation: Refinery Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  1. World Energy Projection System Plus Model Documentation: Main Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  2. Transportation Sector Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model.

  3. World Energy Projection System Plus Model Documentation: Electricity Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  4. Advanced information processing system: The Army fault tolerant architecture conceptual study. Volume 2: Army fault tolerant architecture design and analysis

    NASA Technical Reports Server (NTRS)

    Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.

    1992-01-01

    Described here are the Army Fault Tolerant Architecture (AFTA) hardware architecture and components, and the operating system. The architectural and operational theory of the AFTA Fault Tolerant Data Bus is discussed. The test and maintenance strategy developed for use in fielded AFTA installations is presented. An approach to be used in reducing the probability of AFTA failure due to common mode faults is described. Analytical models for AFTA performance, reliability, availability, life cycle cost, weight, power, and volume are developed. An approach is presented for using VHSIC Hardware Description Language (VHDL) to describe and design AFTA's developmental hardware. A plan is described for verifying and validating key AFTA concepts during the Dem/Val phase. Analytical models and partial mission requirements are used to generate AFTA configurations for the TF/TA/NOE and Ground Vehicle missions.

  5. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision-making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
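
    As a toy illustration of the analytical-redundancy idea (a generic parity-space construction, not the thesis's design procedure), the sketch below builds residuals that vanish for any true state and respond only to sensor faults; the sensor geometry and fault size are assumptions of the example.

    ```python
    import numpy as np
    from scipy.linalg import null_space

    C = np.array([[1.0, 0.0],    # four sensors observing a 2-D state
                  [0.0, 1.0],
                  [1.0, 1.0],
                  [1.0, -1.0]])
    V = null_space(C.T).T        # rows span the parity (left null) space of C

    x_true = np.array([2.0, -1.0])
    y = C @ x_true               # healthy measurements
    y_faulty = y.copy()
    y_faulty[2] += 0.5           # bias fault on sensor 3

    print("residual, healthy:", np.round(V @ y, 6))        # ~ 0 for any state
    print("residual, faulty: ", np.round(V @ y_faulty, 3)) # nonzero under fault
    ```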

  6. Performance enhancement of Pt/TiO2/Si UV-photodetector by optimizing light trapping capability and interdigitated electrodes geometry

    NASA Astrophysics Data System (ADS)

    Bencherif, H.; Djeffal, F.; Ferhati, H.

    2016-09-01

    This paper presents a hybrid approach, based on an analytical and metaheuristic investigation, to study the impact of interdigitated electrode engineering on both the speed and the optical performance of an Interdigitated Metal-Semiconductor-Metal Ultraviolet Photodetector (IMSM-UV-PD). In this context, analytical models of the speed and optical performance have been developed and validated against experimental results, with good agreement. Moreover, the developed analytical models have been used as objective functions to determine the optimized design parameters, including the interdigit configuration effect, via a Multi-Objective Genetic Algorithm (MOGA). The ultimate goal of the proposed hybrid approach is to identify the design parameters associated with maximum electrical and optical device performance. The optimized IMSM-PD not only reveals superior performance in terms of photocurrent and response time, but also exhibits higher optical reliability against the optical losses due to active-area shadowing effects. The advantages offered by the proposed design methodology suggest the possibility of overcoming the most challenging problems with the communication speed and power requirements of the UV optical interconnect: high derived current and commutation speed in the UV receiver.
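
    To illustrate the multi-objective selection step in generic terms (this is not the authors' MOGA or their analytical models), the sketch below samples hypothetical interdigit geometries, evaluates two placeholder objectives, and keeps the non-dominated (Pareto-optimal) set:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    designs = rng.uniform(0.5, 5.0, size=(500, 2))  # e.g. finger width, spacing [um]

    def objectives(d):
        width, spacing = d[:, 0], d[:, 1]
        response_time = width * spacing / 10.0   # minimize (placeholder model)
        shadow_loss = width / (width + spacing)  # minimize (placeholder model)
        return np.column_stack([response_time, shadow_loss])

    F = objectives(designs)
    pareto = [i for i in range(len(F))
              if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                         for j in range(len(F)))]
    print(f"{len(pareto)} Pareto-optimal designs out of {len(F)}")
    ```

    A genetic algorithm would evolve the candidate set toward this front rather than sampling it at random, but the dominance test is the same.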

  7. Metrological approach to quantitative analysis of clinical samples by LA-ICP-MS: A critical review of recent studies.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta

    2018-05-15

    Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on proper sample preparation and calibration; it is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation, and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result, and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Development and validation of a fast and simple multi-analyte procedure for quantification of 40 drugs relevant to emergency toxicology using GC-MS and one-point calibration.

    PubMed

    Meyer, Golo M J; Weber, Armin A; Maurer, Hans H

    2014-05-01

    Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because this allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach with liquid-liquid extraction and one-point calibration was developed and validated for quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced workload and turnaround time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification via fast one-point calibration was successfully applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
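
    A minimal sketch of how one-point calibration works in general (the function, peak areas, and concentrations below are hypothetical, not the validated method's parameters): the analyte/internal-standard response ratio of the sample is scaled by that of a single calibrator of known concentration.

    ```python
    def one_point_quant(area_analyte, area_is, cal_area_analyte, cal_area_is, cal_conc):
        """Concentration estimate from a single calibrator (units of cal_conc)."""
        response_sample = area_analyte / area_is
        response_cal = cal_area_analyte / cal_area_is
        return cal_conc * response_sample / response_cal

    # example: calibrator prepared at 1.0 mg/L
    conc = one_point_quant(area_analyte=8.2e5, area_is=4.0e5,
                           cal_area_analyte=5.0e5, cal_area_is=4.1e5,
                           cal_conc=1.0)
    print(f"estimated concentration: {conc:.2f} mg/L")
    ```

    The single calibrator pins the response factor each day, which is what keeps the turnaround below one hour at the cost of assuming linearity through the origin.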

  9. Improved detection of chemical substances from colorimetric sensor data using probabilistic machine learning

    NASA Astrophysics Data System (ADS)

    Mølgaard, Lasse L.; Buus, Ole T.; Larsen, Jan; Babamoradi, Hamid; Thygesen, Ida L.; Laustsen, Milan; Munk, Jens Kristian; Dossi, Eleftheria; O'Keeffe, Caroline; Lässig, Lina; Tatlow, Sol; Sandström, Lars; Jakobsen, Mogens H.

    2017-05-01

    We present a data-driven machine learning approach to detect drug and explosives precursors using colorimetric sensor technology for air sampling. The sensing technology has been developed in the context of the CRIM-TRACK project. At present, a fully integrated portable prototype for air sampling with disposable sensing chips and automated data acquisition has been developed. The prototype allows for fast, user-friendly sampling, which has made it possible to produce large datasets of colorimetric data for different target analytes in laboratory and simulated real-world application scenarios. To make use of the highly multivariate data produced from the colorimetric chip, a number of machine learning techniques are employed to provide reliable classification of target analytes against confounders found in the air streams. We demonstrate that a data-driven machine learning method using dimensionality reduction in combination with a probabilistic classifier makes it possible to produce informative features and a high detection rate of analytes. Furthermore, the probabilistic machine learning approach provides a means of automatically identifying unreliable measurements that could produce false predictions. The robustness of the colorimetric sensor has been evaluated in a series of experiments focusing on the amphetamine precursor phenylacetone as well as the improvised-explosives precursor hydrogen peroxide. The analysis demonstrates that the system is able to detect analytes in clean air and mixed with substances that occur naturally in real-world sampling scenarios. The technology under development in CRIM-TRACK has potential as an effective tool for controlling trafficking of illegal drugs, for explosives detection, and for other law enforcement applications.
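
    The general pattern described — dimensionality reduction followed by a probabilistic classifier, with low-confidence predictions flagged as unreliable — can be sketched as below; the synthetic data, component count, and 0.9 confidence threshold are assumptions of this example, not the CRIM-TRACK pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 50)),   # two classes, 50 "dye" channels
                   rng.normal(0.8, 1.0, (100, 50))])
    y = np.repeat([0, 1], 100)                        # 0 = confounder, 1 = analyte

    model = make_pipeline(PCA(n_components=5), LogisticRegression())
    model.fit(X, y)

    for p in model.predict_proba(X[:5]):
        label = "unreliable" if p.max() < 0.9 else f"class {p.argmax()}"
        print(np.round(p, 3), "->", label)
    ```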

  10. Modeling and analysis of cascade solar cells

    NASA Technical Reports Server (NTRS)

    Ho, F. D.

    1986-01-01

    A brief review is given of the present status of the development of cascade solar cells. It is known that photovoltaic efficiencies can be improved through this development. The design and analysis of multijunction cells, however, are quite complicated. The main goal is to find a method that compromises between accuracy and simplicity for modeling a cascade solar cell. Three approaches are presently under way: (1) an equivalent circuit approach, (2) a numerical approach, and (3) an analytical approach. Here, the first and second approaches are discussed. The equivalent circuit approach, using SPICE (Simulation Program with Integrated Circuit Emphasis) for the cascade cells and the cascade-cell array, is highlighted. Methods of extracting parameters for modeling are discussed.
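
    In the spirit of the equivalent-circuit approach (a minimal sketch with illustrative parameters, not the record's SPICE model), two ideal single-diode junctions in series share one current, and the stack voltage is the sum of the junction voltages at that current:

    ```python
    import numpy as np

    VT = 0.02585  # thermal voltage at 300 K [V]

    def junction_voltage(i, i_ph, i0, n=1.0):
        """Invert I = Iph - I0*(exp(V/(n*VT)) - 1) for the junction voltage."""
        return n * VT * np.log((i_ph - i) / i0 + 1.0)

    i_ph_top, i_ph_bot = 0.014, 0.016  # photocurrents [A]; top junction limits
    i0_top, i0_bot = 1e-19, 1e-12      # saturation currents [A]

    i = np.linspace(0.0, 0.0139, 200)  # series current up to the limiting Iph
    v = junction_voltage(i, i_ph_top, i0_top) + junction_voltage(i, i_ph_bot, i0_bot)
    p = i * v
    k = p.argmax()
    print(f"max power ~ {p[k]*1e3:.2f} mW at I = {i[k]*1e3:.2f} mA, V = {v[k]:.2f} V")
    ```

    Current matching — the stack current being capped by the weakest junction — falls out of the series constraint, which is the key complication the full models must handle.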

  11. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  12. Thermodiffusion in concentrated ferrofluids: A review and current experimental and numerical results on non-magnetic thermodiffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprenger, Lisa, E-mail: Lisa.Sprenger@tu-dresden.de; Lange, Adrian; Odenbach, Stefan

    2013-12-15

    Ferrofluids are colloidal suspensions consisting of magnetic nanoparticles dispersed in a carrier liquid. Their thermodiffusive behaviour is rather strong compared to molecular binary mixtures, leading to a Soret coefficient (S_T) of 0.16 K⁻¹. Former experiments with dilute magnetic fluids have been done with thermogravitational columns or horizontal thermodiffusion cells by different research groups. Considering the horizontal thermodiffusion cell, a former analytical approach has been used to solve the phenomenological diffusion equation in one dimension, assuming a constant concentration gradient over the cell's height. The current experimental work is based on the horizontal separation cell and emphasises the comparison of the concentration development in differently concentrated magnetic fluids and at different temperature gradients. The ferrofluid investigated is the kerosene-based EMG905 (Ferrotec), compared with the APG513A (Ferrotec), both containing magnetite nanoparticles. The experiments prove that the separation process depends linearly on the temperature gradient and that a constant concentration gradient develops in the setup due to the separation. Analytical one-dimensional and numerical three-dimensional approaches to solving the diffusion equation are derived and compared with the solution used so far for dilute fluids, to see if formerly made assumptions also hold for more highly concentrated fluids. Both the analytical and numerical solutions, either in a phenomenological or a thermodynamic description, are able to reproduce the separation signal gained from the experiments. The Soret coefficient can then be determined to be 0.184 K⁻¹ in the analytical case and 0.29 K⁻¹ in the numerical case. Former theoretical approaches for dilute magnetic fluids underestimate the strength of the separation in the case of a concentrated ferrofluid.
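
    A hedged numerical sketch of the one-dimensional steady state implied by the phenomenological diffusion equation, dc/dz = -S_T c (1 - c) dT/dz; the cell height, temperature gradient, and mean concentration below are placeholder values, with S_T taken from the analytical value quoted above.

    ```python
    import numpy as np

    S_T = 0.184      # Soret coefficient [1/K]
    c0 = 0.05        # mean particle volume fraction (assumed)
    dT_dz = 200.0    # temperature gradient [K/m] (assumed)
    H = 0.001        # cell height [m] (assumed)

    z = np.linspace(0.0, H, 1001)
    dz = z[1] - z[0]
    c = np.empty_like(z)
    c[0] = c0
    for k in range(len(z) - 1):  # explicit Euler integration of dc/dz
        c[k + 1] = c[k] - S_T * c[k] * (1.0 - c[k]) * dT_dz * dz

    print(f"separation across cell: dc = {c[0] - c[-1]:.3e}")
    print(f"linear estimate:        dc = {S_T * c0 * (1 - c0) * dT_dz * H:.3e}")
    ```

    For weak separation the profile is nearly linear, which is consistent with the constant-concentration-gradient assumption discussed in the record.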

  13. A Collaborative Approach to Designing Graduate Admission Studies: A Model for Influencing Program Planning and Policy. AIR 1999 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Delaney, Anne Marie

    This paper presents the rationale, research design, analytical approaches, and results of a graduate admission study which examined the motivation and enrollment decision processes of students accepted to a newly redesigned Master of Business Administration (MBA) Program. The study was developed collaboratively by the institution's Office of…

  14. Using a Data Mining Approach to Develop a Student Engagement-Based Institutional Typology. IR Applications, Volume 18, February 8, 2009

    ERIC Educational Resources Information Center

    Luan, Jing; Zhao, Chun-Mei; Hayek, John C.

    2009-01-01

    Data mining provides both systematic and systemic ways to detect patterns of student engagement among students at hundreds of institutions. Using traditional statistical techniques alone, the task would be significantly difficult--if not impossible--considering the size and complexity in both data and analytical approaches necessary for this…

  15. An innovative approach to the development of a portable unit for analytical flame characterization in a microgravity environment

    NASA Technical Reports Server (NTRS)

    Dubinskiy, Mark A.; Kamal, Mohammed M.; Misra, Prabhaker

    1995-01-01

    The availability of manned laboratory facilities in space offers wonderful opportunities and challenges in microgravity combustion science and technology. In turn, the fundamentals of microgravity combustion science can be studied via spectroscopic characterization of free radicals generated in flames. The laser-induced fluorescence (LIF) technique is a noninvasive method of considerable utility in combustion physics and chemistry, suitable not only for monitoring specific species and their kinetics but also for imaging of flames. This makes LIF one of the most important tools for microgravity combustion science. Flame characterization under microgravity conditions using LIF is expected to be more informative than other methods aimed at searching for effects like the pumping phenomenon that can be modeled via ground-level experiments. A primary goal of our work was to develop an innovative approach to devising an LIF-based analytical unit suitable for in-space flame characterization. It was decided to follow two approaches in tandem: (1) use the existing laboratory (non-portable) equipment and determine the optimal set of flame parameters that can be used as analytical criteria for flame characterization under microgravity conditions; and (2) use state-of-the-art developments in laser technology and concentrate some effort on devising a layout for the portable analytical equipment. This paper presents an up-to-date summary of the results of our experiments aimed at the creation of a portable device for combustion studies in a microgravity environment, based on a portable UV tunable solid-state laser for excitation of free radicals normally present in flames in detectable amounts. A systematic approach has allowed us to make a convenient choice of species under investigation, as well as of the proper tunable laser system, and has enabled us to carry out LIF experiments on free radicals using a solid-state laser tunable in the UV.

  16. A simulation technique for predicting thickness of thermal sprayed coatings

    NASA Technical Reports Server (NTRS)

    Goedjen, John G.; Miller, Robert A.; Brindley, William J.; Leissler, George W.

    1995-01-01

    The complexity of many of the components being coated today using the thermal spray process makes the trial and error approach traditionally followed in depositing a uniform coating inadequate, thereby necessitating a more analytical approach to developing robotic trajectories. A two dimensional finite difference simulation model has been developed to predict the thickness of coatings deposited using the thermal spray process. The model couples robotic and component trajectories and thermal spraying parameters to predict coating thickness. Simulations and experimental verification were performed on a rotating disk to evaluate the predictive capabilities of the approach.

  17. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

    Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of the equivalence acceptance criterion and the quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σ_R, where σ_R is the reference product variability estimated by the sample standard deviation S_R from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄_R ± K × σ_R, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of test product lots must fall within the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation S_R underestimates the true reference product variability σ_R. As a result, substituting S_R for σ_R in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on the Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed. A biosimilar is a generic version of the original biological drug product. A key component of biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on the application of statistical methods to establish a similarity margin and an appropriate test for equivalence between the two products. This paper discusses statistical issues with demonstration of analytical similarity and provides alternative approaches to potentially mitigate these problems. © PDA, Inc. 2016.
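
    The paper's central point — that correlation among reference lots makes S_R a downward-biased estimate of σ_R — can be reproduced in a few lines of simulation (my own sketch with an assumed equicorrelation structure, not the authors' derivation):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    sigma_R, n_lots, n_sim = 1.0, 10, 20000

    for rho in (0.0, 0.3, 0.6):
        cov = sigma_R**2 * ((1 - rho) * np.eye(n_lots)
                            + rho * np.ones((n_lots, n_lots)))
        lots = rng.multivariate_normal(np.zeros(n_lots), cov, size=n_sim)
        s_R = lots.std(axis=1, ddof=1)   # per-simulation sample SD of the lots
        print(f"rho = {rho:.1f}: mean S_R ~ {s_R.mean():.3f}  "
              f"(approx. theory sqrt(1-rho) = {np.sqrt(1 - rho):.3f})")
    ```

    For equicorrelated lots E[S_R²] = σ_R²(1 − ρ), so the bias grows with ρ and directly shrinks the 1.5σ_R margin and the quality range.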

  18. New vistas in refractive laser beam shaping with an analytic design approach

    NASA Astrophysics Data System (ADS)

    Duerr, Fabian; Thienpont, Hugo

    2014-05-01

    Many commercial, medical, and scientific applications of the laser have been developed since its invention. Some of these applications require a specific beam irradiance distribution to ensure optimal performance. Often, it is possible to apply geometrical methods to design laser beam shapers. This common design approach is based on the ray mapping between the input plane and the output beam. Geometric ray mapping designs with two plano-aspheric lenses have been thoroughly studied in the past. Even though analytic expressions for various ray mapping functions do exist, the surface profiles of the lenses are still calculated numerically. In this work, we present an alternative novel design approach that allows direct calculation of the rotationally symmetric lens profiles described by analytic functions. Starting from the example of a basic beam expander, a set of functional differential equations is derived from Fermat's principle. This formalism allows calculating the exact lens profiles described by Taylor series coefficients up to very high orders. To demonstrate the versatility of this new approach, two further cases are solved: a Gaussian-to-flat-top irradiance beam shaping system, and a beam shaping system that generates a more complex dark-hollow Gaussian (donut-like) irradiance profile with zero intensity in the on-axis region. The presented ray tracing results confirm the high accuracy of all calculated solutions and indicate the potential of this design approach for refractive beam shaping applications.
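
    For context, the classical energy-conservation ray mapping underlying such designs can be written down directly for the Gaussian-to-flat-top case (a standard mapping, not the record's functional-differential-equation method; the beam radii are illustrative): each input ray at radius r is sent to the output radius enclosing the same fractional power.

    ```python
    import numpy as np

    w = 1.0  # input 1/e^2 Gaussian beam radius (assumed)
    R = 2.0  # desired flat-top radius (assumed)

    def ray_mapping(r):
        """Output radius enclosing the same fractional power as input radius r."""
        return R * np.sqrt(1.0 - np.exp(-2.0 * r**2 / w**2))

    r = np.array([0.25, 0.5, 1.0, 1.5, 2.0])
    print("r   ->", np.round(r, 3))
    print("rho ->", np.round(ray_mapping(r), 3))
    ```

    The lens surfaces are then whatever profiles realize this mapping while keeping the output wavefront flat, which is where the Taylor-series formalism described above comes in.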

  19. Developing End-User Innovation from Circuits of Learning

    ERIC Educational Resources Information Center

    Fosstenløkken, Siw M.

    2015-01-01

    Purpose: This paper aims to raise the question of how end-user product innovation is developed by exploring the underlying learning mechanisms that drive such idea realization in practice. A trialogical learning perspective from educational science is applied as an analytical approach to enlighten the black box of learning dynamics in user…

  20. Temperament Profiles from Infancy to Middle Childhood: Development and Associations with Behavior Problems

    ERIC Educational Resources Information Center

    Janson, Harald; Mathiesen, Kristin S.

    2008-01-01

    The authors applied I-States as Objects Analysis (ISOA), a recently proposed person-oriented analytic approach, to the study of temperament development in 921 Norwegian children from a population-based sample. A 5-profile classification based on cluster analysis of standardized mother reports of activity, sociability, emotionality, and shyness at…

  1. Basic Human Needs: A Development Planning Approach. AID Discussion Paper No. 38.

    ERIC Educational Resources Information Center

    Crosswell, Michael

    The monograph explores basic needs of all human beings and considers various patterns of growth and development toward meeting these needs on a sustainable basis. The purpose of the study is to improve knowledge of analytical studies, research results, and financial assistance policies among personnel of the Agency for International Development…

  2. Establishing and performing targeted multi-residue analysis for lipid mediators and fatty acids in small clinical plasma samples.

    USDA-ARS?s Scientific Manuscript database

    LC-MS/MS and GC-MS based targeted metabolomics is typically conducted by analyzing and quantifying a cascade of metabolites with methods specifically developed for the metabolite class. Here we describe an approach for the development of multi-residue analytical profiles, calibration standards, and ...

  3. Social Networks and Smoking: Exploring the Effects of Peer Influence and Smoker Popularity through Simulations

    ERIC Educational Resources Information Center

    Schaefer, David R.; adams, jimi; Haas, Steven A.

    2013-01-01

    Adolescent smoking and friendship networks are related in many ways that can amplify smoking prevalence. Understanding and developing interventions within such a complex system requires new analytic approaches. We draw on recent advances in dynamic network modeling to develop a technique that explores the implications of various intervention…

  4. Exact and approximate solutions for transient squeezing flow

    NASA Astrophysics Data System (ADS)

    Lang, Ji; Santhanam, Sridhar; Wu, Qianhong

    2017-10-01

    In this paper, we report two novel theoretical approaches to examine a fast-developing flow in a thin fluid gap, which is widely observed in industrial applications and biological systems. The problem is characterized by a very small Reynolds number and Strouhal number, making the convective acceleration of the fluid negligible while its local acceleration is not. We have developed an exact solution for this problem which shows that the flow starts from an inviscid limit, when the viscous effect has no time to appear, and is followed by a subsequent developing flow in which the viscous effect penetrates progressively into the entire fluid gap. An approximate solution is also developed using a boundary layer integral method. This solution precisely captures the general behavior of the transient fluid flow process and agrees very well with the exact solution. We also performed numerical simulation using Ansys-CFX. Excellent agreement between the analytical and numerical solutions is obtained, indicating the validity of the analytical approaches. The study presented herein fills a gap in the literature and will have a broad impact on industrial and biomedical applications.

  5. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    NASA Astrophysics Data System (ADS)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed, predetermined order, leading to a fixed computational cost, and can thus be considered "non-iterative." This makes it possible to derive analytical forces, avoiding the usual energy conservation (i.e., drift) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time-step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making them a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
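
    A generic sketch of the truncation idea (not the production TCG code or a polarizable force field): run exactly `order` conjugate-gradient iterations on a symmetric positive definite stand-in for the polarization equations T·mu = E, giving a fixed, "non-iterative" cost.

    ```python
    import numpy as np

    def truncated_cg(T, E, order=2):
        """Fixed-cost CG: stop after `order` iterations instead of at a tolerance."""
        mu = np.zeros_like(E)
        r = E - T @ mu
        p = r.copy()
        for _ in range(order):
            Tp = T @ p
            alpha = (r @ r) / (p @ Tp)
            mu += alpha * p
            r_new = r - alpha * Tp
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
        return mu

    rng = np.random.default_rng(0)
    A = rng.normal(size=(6, 6))
    T = A @ A.T + 6.0 * np.eye(6)   # SPD stand-in for the polarization matrix
    E = rng.normal(size=6)
    for k in (1, 2, 3):
        err = np.linalg.norm(truncated_cg(T, E, k) - np.linalg.solve(T, E))
        print(f"TCG{k}: |mu - mu_exact| = {err:.2e}")
    ```

    Because the truncated solution is an explicit closed-form function of E, its analytical gradient can be differentiated exactly, which is what removes the energy drift of tolerance-based solvers.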

  6. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces.

    PubMed

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-28

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed, predetermined order, leading to a fixed computational cost, and can thus be considered "non-iterative." This makes it possible to derive analytical forces, avoiding the usual energy conservation (i.e., drift) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time-step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making them a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.

  7. Shifting from Stewardship to Analytics of Massive Science Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.

    2015-12-01

    Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving, and analysis from future remote sensing missions, be it from Earth science satellites, planetary robotic missions, or massive radio observatories, may not scale, as more capable instruments stress existing architectural approaches and systems with more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches towards data analytics in which users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.

  8. Do different data analytic approaches generate discrepant findings when measuring mother-infant HPA axis attunement?

    PubMed

    Bernard, Nicola K; Kashy, Deborah A; Levendosky, Alytia A; Bogat, G Anne; Lonstein, Joseph S

    2017-03-01

    Attunement between mothers and infants in their hypothalamic-pituitary-adrenal (HPA) axis responsiveness to acute stressors is thought to benefit the child's emerging physiological and behavioral self-regulation, as well as their socioemotional development. However, there is no universally accepted definition of attunement in the literature, which appears to have resulted in inconsistent statistical analyses for determining its presence or absence, and contributed to discrepant results. We used a series of data analytic approaches, some previously used in the attunement literature and others not, to evaluate the attunement between 182 women and their 1-year-old infants in their HPA axis responsivity to acute stress. Cortisol was measured in saliva samples taken from mothers and infants before and twice after a naturalistic laboratory stressor (infant arm restraint). The results of the data analytic approaches were mixed, with some analyses suggesting attunement while others did not. The strengths and weaknesses of each statistical approach are discussed, and an analysis using a cross-lagged model that considered both time and interactions between mother and infant appeared the most appropriate. Greater consensus in the field about the conceptualization and analysis of physiological attunement would be valuable in order to advance our understanding of this phenomenon. © 2016 Wiley Periodicals, Inc.
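
    A schematic of the cross-lagged analysis favored above, on synthetic data (the coupling coefficients and noise levels are invented for illustration): each partner's later cortisol is regressed on both partners' earlier cortisol, and attunement shows up as nonzero cross-partner paths.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 182                              # dyad count, matching the sample above
    mom_t1 = rng.normal(size=n)
    inf_t1 = 0.4 * mom_t1 + rng.normal(scale=0.9, size=n)
    mom_t2 = 0.6 * mom_t1 + 0.2 * inf_t1 + rng.normal(scale=0.7, size=n)
    inf_t2 = 0.5 * inf_t1 + 0.3 * mom_t1 + rng.normal(scale=0.7, size=n)

    X = np.column_stack([np.ones(n), mom_t1, inf_t1])  # [intercept, mom, infant]
    b_mom, *_ = np.linalg.lstsq(X, mom_t2, rcond=None)
    b_inf, *_ = np.linalg.lstsq(X, inf_t2, rcond=None)
    print("mom_t2 ~ 1 + mom_t1 + inf_t1 ->", np.round(b_mom, 2))
    print("inf_t2 ~ 1 + mom_t1 + inf_t1 ->", np.round(b_inf, 2))
    ```

    The off-diagonal (partner-to-actor) coefficients are the attunement paths; the full model in the record additionally handles repeated measures and measurement timing jointly.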

  9. Silicone rod extraction followed by liquid desorption-large volume injection-programmable temperature vaporiser-gas chromatography-mass spectrometry for trace analysis of priority organic pollutants in environmental water samples.

    PubMed

    Delgado, Alejandra; Posada-Ureta, Oscar; Olivares, Maitane; Vallejo, Asier; Etxebarria, Nestor

    2013-12-15

    In this study, priority organic pollutants usually found in environmental water samples were considered in two extraction and analysis approaches. Among those compounds, organochlorine compounds, pesticides, phthalates, phenols, and residues of pharmaceutical and personal care products were included. The extraction and analysis steps were based on silicone rod (SR) extraction followed by liquid desorption in combination with large volume injection-programmable temperature vaporiser (LVI-PTV) and gas chromatography-mass spectrometry (GC-MS). Variables affecting the analytical response as a function of the programmable temperature vaporiser (PTV) parameters were first optimised following an experimental design approach. The SR extraction and desorption conditions were assessed afterwards, including matrix modification, extraction time, and stripping solvent composition. Subsequently, the possibility of performing membrane enclosed sorptive coating (MESCO) extraction as a modified extraction approach was also evaluated. The optimised method showed low method detection limits (3-35 ng L(-1)), acceptable accuracy (78-114%), and precision values (<13%) for most of the studied analytes regardless of the aqueous matrix. Finally, the developed approach was successfully applied to the determination of target analytes in aqueous environmental matrices including estuarine and wastewater samples. © 2013 Elsevier B.V. All rights reserved.

  10. Development of methods to monitor ionization modification from dosing vehicles and phospholipids in study samples.

    PubMed

    Chang, Min; Li, Yongchao; Angeles, Reginald; Khan, Samina; Chen, Lian; Kaplan, Julia; Yang, Liyu

    2011-08-01

    Two approaches to monitoring the matrix effect on ionization in study samples are described. One approach is the addition of multiple reaction monitoring transitions to the bioanalytical method to monitor the presence of known ionization-modifying components of the matrix: for example, m/z 184→125 (or m/z 184→184) and m/z 133→89 may be used for phospholipids and polyethylene-oxide-containing surfactants, respectively. This approach requires no additional equipment and can be readily adapted for most methods; however, it detects only the intended interfering compounds and provides little quantitative indication of whether the matrix effect is within the tolerable range (±15%). The other approach requires the addition of an infusion pump and the identification of an appropriate surrogate of the analyte to be infused for determining modification of the analyte's ionization. This second approach detects interferences in the sample regardless of the source (i.e., dosing vehicle components, co-administered drugs and their metabolites, phospholipids, plasticizers, and endogenous components introduced by the disease state).

  11. Lagrangian based methods for coherent structure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu; Peacock, Thomas, E-mail: tomp@mit.edu

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
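
    As a concrete anchor for the sample model mentioned above, the canonical double-gyre velocity field with its usual parameter values can be integrated directly; the coherent-structure diagnostics themselves (geometric, probabilistic, cluster, or braid based) go beyond this sketch.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    A, eps, om = 0.1, 0.25, 2.0 * np.pi / 10.0  # standard double-gyre parameters

    def velocity(t, xy):
        x, y = xy
        a = eps * np.sin(om * t)
        b = 1.0 - 2.0 * eps * np.sin(om * t)
        f = a * x**2 + b * x
        dfdx = 2.0 * a * x + b
        u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
        v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
        return [u, v]

    # advect a few tracers over one forcing period on the [0,2] x [0,1] domain
    for x0 in (0.5, 1.0, 1.5):
        sol = solve_ivp(velocity, (0.0, 10.0), [x0, 0.5], rtol=1e-8)
        print(f"start ({x0:.1f}, 0.5) -> end ({sol.y[0, -1]:.3f}, {sol.y[1, -1]:.3f})")
    ```

    Trajectory ensembles like this are the common input: the geometric and probabilistic methods post-process dense grids of them, while the cluster- and braid-based methods work from only a sparse set.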

  12. An approach to the design of wide-angle optical systems with special illumination and IFOV requirements

    NASA Astrophysics Data System (ADS)

    Pravdivtsev, Andrey V.

    2012-06-01

    The article presents an approach to the design of wide-angle optical systems with special illumination and instantaneous field of view (IFOV) requirements. Unevenness of illumination reduces the dynamic range of the system, which negatively influences the system's ability to perform its task. The resulting illumination on the detector depends, among other factors, on changes in the IFOV. It is also necessary to consider the IFOV in the synthesis of data processing algorithms, as it directly affects the potential "signal/background" ratio in the case of statistically homogeneous backgrounds. A numerical-analytical approach that simplifies the design of wide-angle optical systems with special illumination and IFOV requirements is presented. The solution can be used for optical systems whose field of view is greater than 180 degrees. Illumination calculation in optical CAD is based on computationally expensive tracing of a large number of rays. The author proposes using analytical expressions for some of the characteristics on which illumination depends. The remaining characteristics are determined numerically with less computationally expensive operands, and this calculation is not performed at every optimization step. The results of the analytical calculation are inserted into the merit function of the optical CAD optimizer. As a result, the optimizer load is reduced, since less computationally expensive operands are used. This reduces the time and resources required to develop a system with the desired characteristics. The proposed approach simplifies the creation and understanding of the requirements for the quality of the optical system, reduces the time and resources required to develop an optical system, and allows creating more efficient EOS.

  13. Analytical optical scattering in clouds

    NASA Technical Reports Server (NTRS)

    Phanord, Dieudonne D.

    1989-01-01

    An analytical optical model for the scattering of light due to lightning by clouds of different geometries is being developed. The self-consistent approach and the equivalent medium concept of Twersky were used to treat the case corresponding to outside illumination. Thus, the resulting multiple scattering problem is transformed, with knowledge of the bulk parameters, into scattering by a single obstacle in isolation. Based on the size parameter of a typical water droplet as compared to the incident wavelength, the problem for the single scatterer equivalent to the distribution of cloud particles can be solved by either Mie or Rayleigh scattering theory. The supercomputing code of Wiscombe can be used immediately to produce results that can be compared to the Monte Carlo computer simulation for outside incidence. A fairly reasonable inverse approach using the solution of the outside-illumination case was proposed to model analytically the situation for point sources located inside the optically thick cloud. Its mathematical details are still being investigated. When finished, it will provide scientists an enhanced capability to study more realistic clouds. For testing purposes, the direct approach to the inside illumination of clouds by lightning is under consideration. An analytical solution for the cubic cloud will soon be obtained. For cylindrical or spherical clouds, preliminary results are needed for scattering by bounded obstacles above or below a penetrable surface interface.

  14. Spiral trajectory design: a flexible numerical algorithm and base analytical equations.

    PubMed

    Pipe, James G; Zwart, Nicholas R

    2014-01-01

    Spiral-based trajectories for magnetic resonance imaging can be advantageous, but are often cumbersome to design or create. This work presents a flexible numerical algorithm for designing trajectories based on explicit definition of radial undersampling, and also gives several analytical expressions for characterizing the base (critically sampled) class of these trajectories. Expressions for the gradient waveform, based on slew and amplitude limits, are developed such that a desired pitch in the spiral k-space trajectory is followed. The source code for this algorithm, written in C, is publicly available. Analytical expressions approximating the spiral trajectory (ignoring the radial component) are given to characterize measurement time, gradient heating, maximum gradient amplitude, and off-resonance phase for slew-limited and gradient-amplitude-limited cases. Several numerically calculated trajectories are illustrated, and base Archimedean spirals are compared with analytically obtained results. Several different waveforms illustrate that the desired slew and amplitude limits are reached, as are the desired undersampling patterns, using the numerical method. For base Archimedean spirals, the results of the numerical and analytical approaches are in good agreement. A versatile numerical algorithm was developed and released as publicly available code. Approximate analytical formulas are given that help characterize spiral trajectories. Copyright © 2013 Wiley Periodicals, Inc.
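
    A crude greedy sketch of the numerical design idea (my own simplification, not the record's published C algorithm): step an Archimedean trajectory k(θ) = θ/(2π·FOV)·e^{iθ} forward at a fixed raster time, shrinking each angular step until both the gradient amplitude and slew limits are respected. The hardware limits, FOV, and resolution here are assumptions.

    ```python
    import numpy as np

    gamma = 42.58e6                 # 1H gyromagnetic ratio [Hz/T]
    fov, res = 0.24, 0.003          # field of view and resolution [m] (assumed)
    gmax, smax = 0.03, 120.0        # amplitude [T/m] and slew [T/m/s] limits
    dt = 4e-6                       # gradient raster time [s]
    c = 1.0 / (2.0 * np.pi * fov)   # pitch: one 1/FOV ring per turn [cycles/m/rad]
    kmax = 1.0 / (2.0 * res)

    theta, g_prev, n, g_peak = 0.0, np.zeros(2), 0, 0.0
    while c * theta < kmax:
        # largest angular step allowed by the amplitude limit |dk/dt| <= gamma*gmax
        dtheta = gamma * gmax * dt / (c * np.sqrt(1.0 + theta * theta))
        while True:                 # halve until the (discrete) slew limit holds too
            th = theta + dtheta
            k_old = c * theta * np.array([np.cos(theta), np.sin(theta)])
            k_new = c * th * np.array([np.cos(th), np.sin(th)])
            g = (k_new - k_old) / (gamma * dt)      # gradient for this step [T/m]
            if np.linalg.norm(g - g_prev) / dt <= smax or dtheta < 1e-9:
                break
            dtheta *= 0.5
        theta, g_prev, n = th, g, n + 1
        g_peak = max(g_peak, np.linalg.norm(g))

    print(f"{n} samples, readout {n * dt * 1e3:.2f} ms, peak |g| = {g_peak*1e3:.1f} mT/m")
    ```

    Early in the readout the step is slew-limited (tight turns), and it transitions to the amplitude-limited regime at larger radii, which mirrors the two analytical regimes characterized in the record.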

  15. Adverse outcome pathway networks II: Network analytics

    EPA Science Inventory

    The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental biological processes...

  16. Semi-analytical approach to estimate railroad tank car shell puncture

    DOT National Transportation Integrated Search

    2011-03-16

    This paper describes the development of engineering-based equations to estimate the puncture resistance of railroad tank cars under a generalized shell or side impact scenario. Resistance to puncture is considered in terms of puncture velocity, which...

  17. World Energy Projection System Plus Model Documentation: Greenhouse Gases Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  18. World Energy Projection System Plus Model Documentation: Natural Gas Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  19. World Energy Projection System Plus Model Documentation: District Heat Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  20. World Energy Projection System Plus Model Documentation: Industrial Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  1. Productivity Measurement: An Analytic Approach

    DTIC Science & Technology

    1983-09-01

    Leadership and Management Development Center (AU), Maxwell Air Force Base, Alabama. Technical report LMDC-TR-83-4, by Charles R. White, USAFRES, September 1983. Approved for public release; distribution unlimited.

  2. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axes, multi-factor stacked bar charts, and interactive semi-automated filtering for input and output data, together with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis, and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. In the first example, our Visual Analytics approach enabled the analyst to observe how the element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. In the second example, our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen. We determine that this reaction is thermodynamically favorable under a broad range of conditions, including low temperatures and the absence of microbial catalysts. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.

  3. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the physicochemical changes that are induced upon analyte binding can be exploited to generate a detectable signal for sensing applications. As research in this area has grown, a number of creative approaches for improving the selectivity and sensitivity (i.e., detection limit) of these sensors have emerged. For applications in drug delivery systems, therapeutic release can be triggered by competitive molecular interactions or physicochemical changes in the network. Additionally, including degradable units within the network can enable sustained and responsive therapeutic release. Several exciting examples exploiting the analyte-responsive behavior of hydrogels for the treatment of cancer, diabetes, and irritable bowel syndrome are discussed in detail. We expect that creative and combinatorial approaches used in the design of analyte-responsive hydrogels will continue to yield materials with great potential in the fields of sensing and drug delivery.

  4. Meeting future information needs for Great Lakes fisheries management

    USGS Publications Warehouse

    Christie, W.J.; Collins, John J.; Eck, Gary W.; Goddard, Chris I.; Hoenig, John M.; Holey, Mark; Jacobson, Lawrence D.; MacCallum, Wayne; Nepszy, Stephen J.; O'Gorman, Robert; Selgeby, James

    1987-01-01

    Description of information needs for management of Great Lakes fisheries is complicated by recent changes in biology and management of the Great Lakes, development of new analytical methodologies, and a transition in management from a traditional unispecies approach to a multispecies/community approach. A number of general problems with the collection and management of data and information for fisheries management need to be addressed (i.e. spatial resolution, reliability, computerization and accessibility of data, design of sampling programs, standardization and coordination among agencies, and the need for periodic review of procedures). Problems with existing data collection programs include size selectivity and temporal trends in the efficiency of fishing gear, inadequate creel survey programs, bias in age estimation, lack of detailed sea lamprey (Petromyzon marinus) wounding data, and data requirements for analytical techniques that are underutilized by managers of Great Lakes fisheries. The transition to multispecies and community approaches to fisheries management will require policy decisions by the management agencies, adequate funding, and a commitment to develop programs for collection of appropriate data on a long-term basis.

  5. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters and variables associated with the manufacturing process will be alleviated over time. Investments made in tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress made in recent years on QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  6. Bioparticles assembled using low frequency vibration immune to evacuation drifts

    NASA Astrophysics Data System (ADS)

    Shao, Fenfen; Whitehill, James David; Ng, Tuck Wah

    2012-08-01

    The use of low frequency vibration on suspensions of glass beads in a droplet has been shown to develop a strong degree of patterning (to a ring) due to the manner in which the surface waves are modified. Functionalized glass beads that serve as bioparticles permit sensitive readings when concentrated at specific locations. However, time-controlled exposure to analytes is desirable, and replacing the liquid medium with analyte through extraction is needed to conserve time. Nevertheless, we show here that extraction with a porous medium, which is simple and usable in the field, will strongly displace the patterned beads. The liquid removal was found to depend on two mechanisms that affect the shape of the droplet: contact hysteresis due to pinning of the outer edge, and liquid being drawn into the porous medium. From this, we developed and demonstrated a modified well structure that prevented micro-bead displacement during evacuation. A further advantage of this approach is that analyte need only be dispensed at the location of the aggregated particles, which minimizes analyte usage. This was established analytically here.

  7. Headspace single drop microextraction versus dispersive liquid-liquid microextraction using magnetic ionic liquid extraction solvents.

    PubMed

    An, Jiwoo; Rahn, Kira L; Anderson, Jared L

    2017-05-15

    A headspace single drop microextraction (HS-SDME) method and a dispersive liquid-liquid microextraction (DLLME) method were developed using two tetrachloromanganate ([MnCl4]2-)-based magnetic ionic liquids (MILs) as extraction solvents for the determination of twelve aromatic compounds, including four polyaromatic hydrocarbons, by reversed-phase high-performance liquid chromatography (HPLC). The analytical performance of the developed HS-SDME method was compared to the DLLME approach employing the same MILs. In the HS-SDME approach, the magnetic field generated by the magnet was exploited to suspend the MIL solvent from the tip of a rod magnet. The utilization of MILs in HS-SDME resulted in a highly stable microdroplet under elevated temperatures and long extraction times, overcoming the common challenge of droplet instability encountered in traditional SDME approaches. The low UV absorbance of the ([MnCl4]2-)-based MILs permitted direct analysis of the analyte-enriched extraction solvent by HPLC. In HS-SDME, the effects of the ionic strength of the sample solution, the temperature of the extraction system, the extraction time, the stir rate, and the headspace volume on extraction efficiencies were examined. Coefficients of determination (R2) ranged from 0.994 to 0.999, and limits of detection (LODs) varied from 0.04 to 1.0 μg/L, with relative recoveries from lake water ranging from 70.2% to 109.6%. For the DLLME method, parameters including the disperser solvent type and volume, the ionic strength of the sample solution, the mass of extraction solvent, and the extraction time were studied and optimized. Coefficients of determination for the DLLME method varied from 0.997 to 0.999, with LODs ranging from 0.05 to 1.0 μg/L. Relative recoveries from lake water samples ranged from 68.7% to 104.5%. Overall, the DLLME approach permitted faster extraction times and higher enrichment factors for analytes with low vapor pressure, whereas the HS-SDME approach exhibited better extraction efficiencies for analytes with relatively higher vapor pressure. Copyright © 2017 Elsevier B.V. All rights reserved.
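
    The figures of merit quoted above (R2, LOD, relative recovery) follow standard definitions; the short sketch below works through them on made-up calibration data, using the common 3.3*sigma/slope convention for the LOD (the paper may use a different convention).

      # Worked example of calibration figures of merit on made-up data.
      import numpy as np

      conc = np.array([0.5, 1, 5, 10, 50, 100])            # standards, ug/L
      signal = np.array([12.0, 25, 118, 242, 1190, 2410])  # peak areas

      slope, intercept = np.polyfit(conc, signal, 1)
      pred = slope * conc + intercept
      r2 = 1 - ((signal - pred) ** 2).sum() / ((signal - signal.mean()) ** 2).sum()

      sigma = np.std(signal - pred, ddof=2)        # residual SD of the fit
      lod = 3.3 * sigma / slope                    # ug/L (one common convention)

      found, added = 8.4, 10.0                     # hypothetical lake-water spike
      recovery = 100 * found / added
      print(f"R2={r2:.4f}, LOD={lod:.2f} ug/L, recovery={recovery:.1f}%")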

  8. Some contingencies of spelling

    PubMed Central

    Lee, Vicki L.; Sanderson, Gwenda M.

    1987-01-01

    This paper presents some speculation about the contingencies that might select standard spellings. The speculation is based on a new development in the teaching of spelling—the process writing approach, which lets standard spellings emerge collateral to a high frequency of reading and writing. The paper discusses this approach, contrasts it with behavior-analytic research on spelling, and suggests some new directions for this latter research based on a behavioral interpretation of the process writing approach to spelling. PMID:22477529

  9. Conductivity of graphene in the framework of Dirac model: Interplay between nonzero mass gap and chemical potential

    NASA Astrophysics Data System (ADS)

    Klimchitskaya, G. L.; Mostepanenko, V. M.; Petrov, V. M.

    2017-12-01

    The complete theory of the electrical conductivity of graphene at arbitrary temperature is developed, taking into account the mass-gap parameter and the chemical potential. Both the in-plane and out-of-plane conductivities of graphene are expressed via the components of the polarization tensor in (2+1)-dimensional space-time, analytically continued to the real frequency axis. Simple analytic expressions for both the real and imaginary parts of the conductivity of graphene are obtained at zero and nonzero temperature. They demonstrate an interesting interplay depending on the values of the mass gap and chemical potential. In the local limit, several results obtained earlier using various approximate and phenomenological approaches are reproduced, refined, and generalized. Numerical computations of both the real and imaginary parts of the conductivity of graphene are performed to illustrate the obtained results. The analytic expressions for the conductivity of graphene obtained in this paper can serve as a guide in the comparison between different theoretical approaches and between experiment and theory.
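
    For orientation, the standard local-limit benchmark that such calculations reproduce for gapless, undoped graphene is the universal conductivity (a textbook result, not stated in the abstract):

      \[
      \sigma_0 \;=\; \frac{e^{2}}{4\hbar} \;=\; \frac{\pi e^{2}}{2h},
      \]

    with departures from \(\sigma_0\) governed by the mass gap, the chemical potential, and the temperature.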

  10. Strategy to improve the quantitative LC-MS analysis of molecular ions resistant to gas-phase collision induced dissociation: application to disulfide-rich cyclic peptides.

    PubMed

    Ciccimaro, Eugene; Ranasinghe, Asoka; D'Arienzo, Celia; Xu, Carrie; Onorato, Joelle; Drexler, Dieter M; Josephs, Jonathan L; Poss, Michael; Olah, Timothy

    2014-12-02

    Due to the observed inefficiency of collision-induced dissociation (CID) fragmentation, developing sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) assays for CID-resistant compounds is especially challenging. As an alternative to traditional LC-MS/MS, we present here a methodology that preserves the intact analyte ion for quantification by selectively filtering ions while reducing chemical noise. Utilizing a quadrupole-Orbitrap MS, the target ion is selectively isolated while interfering matrix components undergo MS/MS fragmentation by CID, allowing noise-free detection of the analyte's surviving molecular ion. In this manner, CID affords additional selectivity during high resolution accurate mass analysis by eliminating isobaric interferences, a fundamentally different concept from the traditional approach of monitoring a target analyte's unique fragment following CID. This survivor-selected ion monitoring (survivor-SIM) approach has allowed sensitive and specific detection of disulfide-rich cyclic peptides extracted from plasma.

  11. An evolution based biosensor receptor DNA sequence generation algorithm.

    PubMed

    Kim, Eungyeong; Lee, Malrey; Gatton, Thomas M; Lee, Jaewan; Zang, Yupeng

    2010-01-01

    A biosensor is composed of a bioreceptor, an associated recognition molecule, and a signal transducer that can selectively detect target substances for analysis. DNA based biosensors utilize receptor molecules that allow hybridization with the target analyte. However, most DNA biosensor research uses oligonucleotides as the target analytes and does not address the potential problems of real samples. The identification of recognition molecules suitable for real target analyte samples is an important step towards further development of DNA biosensors. This study examines the characteristics of DNA used as bioreceptors and proposes a hybrid evolution-based DNA sequence generating algorithm, based on DNA computing, to identify suitable DNA bioreceptor recognition molecules for stable hybridization with real target substances. The Traveling Salesman Problem (TSP) approach is applied in the proposed algorithm to evaluate the safety and fitness of the generated DNA sequences. This approach improves efficiency and stability for enhanced and variable-length DNA sequence generation and allows extension to generation of variable-length DNA sequences with diverse receptor recognition requirements.
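
    The sketch below illustrates the evolutionary loop described above in simplified form. The fitness function here (GC content near 50% plus a penalty on self-complementary ends prone to hairpins) is a hypothetical stand-in for the paper's TSP-based safety and fitness evaluation.

      # Simplified sketch of evolution-based DNA receptor sequence generation.
      # The fitness below is a stand-in, not the paper's TSP-based evaluation.
      import random

      BASES = "ACGT"
      COMP = str.maketrans("ACGT", "TGCA")

      def fitness(seq):
          gc = (seq.count("G") + seq.count("C")) / len(seq)
          # crude hairpin propensity: 5' end matching the reverse complement of 3' end
          hairpin = sum(a == b for a, b in zip(seq[:5], seq[::-1].translate(COMP)[:5]))
          return -abs(gc - 0.5) - 0.1 * hairpin

      def mutate(seq, rate=0.05):
          return "".join(random.choice(BASES) if random.random() < rate else b
                         for b in seq)

      pop = ["".join(random.choices(BASES, k=20)) for _ in range(50)]
      for _ in range(200):                    # generations
          pop.sort(key=fitness, reverse=True)
          parents = pop[:10]                  # truncation selection
          pop = parents + [mutate(random.choice(parents)) for _ in range(40)]

      print(max(pop, key=fitness))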

  12. Displacement potential solution of a guided deep beam of composite materials under symmetric three-point bending

    NASA Astrophysics Data System (ADS)

    Rahman, M. Muzibur; Ahmad, S. Reaz

    2017-12-01

    An analytical investigation of the elastic fields in a guided deep beam of orthotropic composite material under symmetric three-point bending is carried out using a displacement potential boundary modeling approach. Here, the formulation is developed as a single function of the space variables, defined in terms of the displacement components, which must satisfy the mixed boundary conditions. The relevant displacement and stress components are expressed as infinite series using Fourier integrals along with suitable polynomials consistent with the boundary conditions. The results are presented mainly in the form of graphs and are verified against finite element solutions using ANSYS. This study shows that the analytical and numerical solutions are in good agreement, which enhances the reliability of the displacement potential approach.

  13. In situ intracellular spectroscopy with surface enhanced Raman spectroscopy (SERS)-enabled nanopipettes.

    PubMed

    Vitol, Elina A; Orynbayeva, Zulfiya; Bouchard, Michael J; Azizkhan-Clifford, Jane; Friedman, Gary; Gogotsi, Yury

    2009-11-24

    We report on a new analytical approach to intracellular chemical sensing that utilizes a surface-enhanced Raman spectroscopy (SERS)-enabled nanopipette. The probe consists of a glass capillary with a 100-500 nm tip coated with gold nanoparticles. The fixed geometry of the gold nanoparticles allows us to overcome the limitations of the traditional approach to intracellular SERS using metal colloids. We demonstrate that the SERS-enabled nanopipettes can be used for in situ analysis of living cell function in real time. In addition, the SERS functionality of these probes allows tracking of their localization in a cell. The developed probes can also be applied to highly sensitive chemical analysis of nanoliter volumes of chemicals in a variety of environmental and analytical applications.

  14. Helios: Understanding Solar Evolution Through Text Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randazzese, Lucien

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand-curated set of full-text documents from Thomson Scientific and other sources.

  15. Development and application of dynamic simulations of a subsonic wind tunnel

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Cole, G. L.; Seidel, R. C.; Arpasi, D. J.

    1986-01-01

    Efforts are currently underway at NASA Lewis to improve and expand ground test facilities and to develop supporting technologies to meet anticipated aeropropulsion research needs. Many of these efforts have been focused on a proposed rehabilitation of the Altitude Wind Tunnel (AWT). In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide input to the AWT final design process. This paper describes the approach taken to develop analytical, dynamic computer simulations of the AWT, and the use of these simulations as test-beds for: (1) predicting the dynamic response characteristics of the AWT, and (2) evaluating proposed AWT control concepts. Plans for developing a portable, real-time simulator for the AWT facility are also described.

  16. Analytic model for ultrasound energy receivers and their optimal electric loads II: Experimental validation

    NASA Astrophysics Data System (ADS)

    Gorostiaga, M.; Wapler, M. C.; Wallrabe, U.

    2017-10-01

    In this paper, we verify the two optimal electric load concepts based on the zero reflection condition and on the power maximization approach for ultrasound energy receivers. We test a high loss 1-3 composite transducer, and find that the measurements agree very well with the predictions of the analytic model for plate transducers that we have developed previously. Additionally, we also confirm that the power maximization and zero reflection loads are very different when the losses in the receiver are high. Finally, we compare the optimal load predictions by the KLM and the analytic models with frequency dependent attenuation to evaluate the influence of the viscosity.

  17. Chemometric applications to assess quality and critical parameters of virgin and extra-virgin olive oil. A review.

    PubMed

    Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo

    2016-03-24

    Today, virgin and extra-virgin olive oils (VOO and EVOO) are foods subject to a large number of analytical tests intended to ensure their quality and genuineness. Almost all official methods demand extensive use of reagents and manpower; because of that, analytical development in this area is continuously evolving. This review therefore focuses on analytical methods for EVOO/VOO that use fast, smart approaches based on chemometric techniques to reduce analysis time, reagent consumption, costly equipment, and manpower. Experimental approaches combining chemometrics with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR, and Raman), NMR spectroscopy, and other more complex techniques like chromatography, calorimetry, and electrochemical techniques applied to EVOO/VOO production and analysis are discussed throughout this work. The advantages and drawbacks of this association have also been highlighted. Chemometrics has been shown to be a powerful tool for the oil industry. In fact, it has been shown how chemometrics can be implemented along the different steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  19. Teaching Analytical Chemistry to Pharmacy Students: A Combined, Iterative Approach

    ERIC Educational Resources Information Center

    Masania, Jinit; Grootveld, Martin; Wilson, Philippe B.

    2018-01-01

    Analytical chemistry has often been a difficult subject to teach in a classroom or lecture-based context. Numerous strategies for overcoming the inherently practical-based difficulties have been suggested, each with differing pedagogical theories. Here, we present a combined approach to tackling the problem of teaching analytical chemistry, with…

  20. Analytical study of the heat loss attenuation by clothing on thermal manikins under radiative heat loads.

    PubMed

    Den Hartog, Emiel A; Havenith, George

    2010-01-01

    For wearers of protective clothing in radiation environments, no quantitative guidelines are available for the effect of a radiative heat load on heat exchange. Under the European Union-funded project ThermProtect, an analytical effort was defined to address the issue of radiative heat load while wearing protective clothing. Because much information from thermal manikin experiments in thermal radiation environments became available within the ThermProtect project, these experimental data sets were used to verify the analytical approach. The analytical approach provided a good prediction of the heat loss in the manikin experiments; 95% of the variance was explained by the model. The model has not yet been validated at high radiative heat loads and neglects some physical properties of the radiation emissivity. Still, it provides a pragmatic approach and may be useful for practical implementation in protective clothing standards for moderate thermal radiation environments.
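
    One plausible minimal form of such an analytical treatment (a sketch of standard clothing heat-balance relations, not necessarily the authors' exact model) balances the absorbed radiative load against dry heat transfer through the clothing:

      \[
      q_{\mathrm{rad}} \;=\; \varepsilon\,\sigma\!\left(\bar{T}_{r}^{4} - T_{\mathrm{cl}}^{4}\right),
      \qquad
      \frac{T_{\mathrm{skin}} - T_{\mathrm{cl}}}{I_{\mathrm{cl}}} + q_{\mathrm{rad}}
      \;=\; h_{c}\left(T_{\mathrm{cl}} - T_{\mathrm{air}}\right),
      \]

    where \(\varepsilon\) is the clothing surface emissivity, \(\sigma\) the Stefan-Boltzmann constant, \(\bar{T}_{r}\) the mean radiant temperature, \(I_{\mathrm{cl}}\) the dry thermal resistance of the clothing, and \(h_{c}\) the convective coefficient; solving for the clothing surface temperature \(T_{\mathrm{cl}}\) yields the attenuated heat loss.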

  1. Evolution of accelerometer methods for physical activity research.

    PubMed

    Troiano, Richard P; McClain, James J; Brychta, Robert J; Chen, Kong Y

    2014-07-01

    The technology and application of current accelerometer-based devices in physical activity (PA) research allow the capture and storage or transmission of large volumes of raw acceleration signal data. These rich data not only provide opportunities to improve PA characterisation, but also bring logistical and analytic challenges. We discuss how researchers and developers from multiple disciplines are responding to the analytic challenges and how advances in data storage, transmission and big data computing will minimise logistical challenges. These new approaches also bring the need for several paradigm shifts for PA researchers, including a shift from count-based approaches and regression calibrations for PA energy expenditure (PAEE) estimation to activity characterisation and EE estimation based on features extracted from raw acceleration signals. Furthermore, a collaborative approach towards analytic methods is proposed to facilitate PA research, which requires a shift away from multiple independent calibration studies. Finally, we make the case for a distinction between PA represented by accelerometer-based devices and PA assessed by self-report. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
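
    A minimal sketch of the raw-signal feature extraction this shift implies is given below; the sampling rate, epoch length, and features (ENMO, vector-magnitude SD, dominant frequency) are illustrative choices, not a prescribed pipeline.

      # Sketch: per-epoch features from a raw tri-axial acceleration signal,
      # the kind of shift away from count-based processing described above.
      import numpy as np

      FS = 80            # Hz, sampling rate (assumed)
      EPOCH = 10 * FS    # 10-second epochs

      def features(epoch_xyz):
          vm = np.sqrt((epoch_xyz ** 2).sum(axis=1))    # vector magnitude, g
          enmo = np.maximum(vm - 1.0, 0).mean()         # gravity-subtracted mean
          spec = np.abs(np.fft.rfft(vm - vm.mean()))
          freqs = np.fft.rfftfreq(len(vm), d=1 / FS)
          dom = freqs[spec.argmax()]                    # dominant frequency, Hz
          return enmo, vm.std(), dom

      # simulated 75 s of signal: noise around 1 g on the vertical axis
      signal = np.random.default_rng(1).normal(0, 0.2, size=(6000, 3)) + [0, 0, 1]
      for i in range(0, len(signal) - EPOCH + 1, EPOCH):
          print(features(signal[i:i + EPOCH]))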

  2. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.
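
    For orientation, the entropy generation invoked here follows from the textbook entropy balance for an open system over the lifetime of the process:

      \[
      S_{\mathrm{gen}} \;=\; \Delta S_{\mathrm{sys}}
      \;-\; \sum_{k}\frac{Q_{k}}{T_{k}}
      \;-\; \sum_{\mathrm{in}} m\,s \;+\; \sum_{\mathrm{out}} m\,s \;\geq\; 0,
      \]

    where \(Q_{k}\) is the heat exchanged with a reservoir at temperature \(T_{k}\) and \(m\), \(s\) are the mass and specific entropy of the inflowing and outflowing streams; the hypothesis discussed above asserts that the stationary state maximizes the rate \(dS_{\mathrm{gen}}/dt\).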

  3. Effects of joints in truss structures

    NASA Technical Reports Server (NTRS)

    Ikegami, R.

    1988-01-01

    The response of truss-type structures for future space applications, such as the Large Deployable Reflector (LDR), will be directly affected by joint performance. Some of the objectives of research at BAC were to characterize structural joints, establish analytical approaches that incorporate joint characteristics, and experimentally establish the validity of the analytical approaches. The test approach to characterize joints for both erectable and deployable-type structures was based upon a Force State Mapping Technique. The approach pictorially shows how the nonlinear joint results can be used for equivalent linear analysis. Testing of the Space Station joints developed at LaRC (a hinged joint at 2 Hz and a clevis joint at 2 Hz) successfully revealed the nonlinear characteristics of the joints. The Space Station joints were effectively linear when loaded to plus or minus 500 pounds with a corresponding displacement of about plus or minus 0.0015 inch. It was indicated that good linear joints exist which are compatible with erected structures, but that difficulty may be encountered if nonlinear-type joints are incorporated in the structure.

  4. Analytic hierarchy process-based approach for selecting a Pareto-optimal solution of a multi-objective, multi-site supply-chain planning problem

    NASA Astrophysics Data System (ADS)

    Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi

    2017-07-01

    The current manufacturing environment has changed from traditional single-plant to multi-site supply chain where multiple plants are serving customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer satisfaction demand level is developed. The proposed solution approach yields to a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
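
    The AHP step can be made concrete with a short sketch: derive criterion weights from a pairwise comparison matrix via its principal eigenvector, check the consistency ratio, and score the Pareto solutions. All judgments and scores below are hypothetical.

      # Sketch of the AHP selection step over Pareto-optimal solutions.
      import numpy as np

      # cost vs quality vs service-level judgments (Saaty 1-9 scale, hypothetical)
      A = np.array([[1, 3, 5],
                    [1/3, 1, 2],
                    [1/5, 1/2, 1]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                              # priority vector (criterion weights)

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
      cr = ci / 0.58                            # random index RI = 0.58 for n = 3
      assert cr < 0.1, "judgments too inconsistent"

      # normalized scores of three Pareto solutions on (cost, quality, service)
      S = np.array([[0.9, 0.5, 0.6],
                    [0.6, 0.8, 0.7],
                    [0.4, 0.9, 0.9]])
      best = np.argmax(S @ w)
      print(f"weights={np.round(w, 3)}, CR={cr:.3f}, best solution #{best + 1}")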

  5. Causal inference as an emerging statistical approach in neurology: an example for epilepsy in the elderly.

    PubMed

    Moura, Lidia Mvr; Westover, M Brandon; Kwasnik, David; Cole, Andrew J; Hsu, John

    2017-01-01

    The elderly population faces an increasing number of cases of chronic neurological conditions, such as epilepsy and Alzheimer's disease. Because the elderly with epilepsy are commonly excluded from randomized controlled clinical trials, there are few rigorous studies to guide clinical practice. When the elderly are eligible for trials, they either rarely participate or frequently have poor adherence to therapy, thus limiting both generalizability and validity. In contrast, large observational data sets are increasingly available, but are susceptible to bias when analyzed with common analytic approaches. Recent developments in causal-inference analytic approaches also introduce the possibility of emulating randomized controlled trials to yield valid estimates. We provide a practical example of the application of the principles of causal inference to a large observational data set of patients with epilepsy. This review also provides a framework for comparative-effectiveness research in chronic neurological conditions.

  6. Quantitative PCR for genetic markers of human fecal pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...
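
    Enumeration with such qPCR assays typically inverts a standard curve of quantification cycle against log10 copies; a minimal worked example with made-up Cq values:

      # Illustrative qPCR standard-curve quantification (made-up data).
      import numpy as np

      log_copies = np.array([2, 3, 4, 5, 6])           # standards, log10 copies/rxn
      cq = np.array([31.2, 27.9, 24.5, 21.1, 17.8])    # measured cycle thresholds

      slope, intercept = np.polyfit(log_copies, cq, 1)
      efficiency = 10 ** (-1 / slope) - 1              # amplification efficiency

      cq_unknown = 25.3
      copies = 10 ** ((cq_unknown - intercept) / slope)
      print(f"E={efficiency:.1%}, unknown ~ {copies:.0f} copies/reaction")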

  7. Mitigation methods for temporary concrete traffic barrier effects on flood water flows.

    DOT National Transportation Integrated Search

    2011-07-01

    A combined experimental and analytical approach was put together to evaluate the hydraulic performance and : stability of TxDOT standard and modified temporary concrete traffic barriers (TCTBs) in extreme flood. : Rating curves are developed for diff...

  8. Trainee Characteristics and Perceptions of HIV/AIDS Training Quality.

    ERIC Educational Resources Information Center

    Panter, A. T.; Huba, G. J.; Melchior, Lisa A.; Anderson, Donna; Driscoll, Mary; German, Victor F.; Henderson, Harold; Henderson, Ron; Lalonde, Bernadette; Uldall, Karnina K.; Zalumas, Jacqueline

    2000-01-01

    Reports findings from 7 HIV/AIDS education and training projects involving more than 600 training sessions. Trainee characteristics were related to their assessments of training quality, using a regression decision-tree analytic approach. Discusses implications for curriculum development. (SLD)

  9. Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection

    NASA Astrophysics Data System (ADS)

    Raimalwala, K.; Faragalli, M.; Reid, E.

    2018-04-01

    The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.

  10. Role of chromatography in the development of Standard Reference Materials for organic analysis.

    PubMed

    Wise, Stephen A; Phinney, Karen W; Sander, Lane C; Schantz, Michele M

    2012-10-26

    The certification of chemical constituents in natural-matrix Standard Reference Materials (SRMs) at the National Institute of Standards and Technology (NIST) can require the use of two or more independent analytical methods. The independence among the methods is generally achieved by taking advantage of differences in extraction, separation, and detection selectivity. This review describes the development of the independent analytical methods approach at NIST, and its implementation in the measurement of organic constituents such as contaminants in environmental materials, nutrients and marker compounds in food and dietary supplement matrices, and health diagnostic and nutritional assessment markers in human serum. The focus of this review is the important and critical role that separation science techniques play in achieving the necessary independence of the analytical steps in the measurement of trace-level organic constituents in natural matrix SRMs. Published by Elsevier B.V.

  11. Analytical Chemistry in the Regulatory Science of Medical Devices.

    PubMed

    Wang, Yi; Guan, Allan; Wickramasekara, Samanthi; Phillips, K Scott

    2018-06-12

    In the United States, regulatory science is the science of developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of all Food and Drug Administration-regulated products. Good regulatory science facilitates consumer access to innovative medical devices that are safe and effective throughout the Total Product Life Cycle (TPLC). Because the need to measure things is fundamental to the regulatory science of medical devices, analytical chemistry plays an important role, contributing to medical device technology in two ways: It can be an integral part of an innovative medical device (e.g., diagnostic devices), and it can be used to support medical device development throughout the TPLC. In this review, we focus on analytical chemistry as a tool for the regulatory science of medical devices. We highlight recent progress in companion diagnostics, medical devices on chips for preclinical testing, mass spectrometry for postmarket monitoring, and detection/characterization of bacterial biofilm to prevent infections.

  12. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  13. Delamination Assessment Tool for Spacecraft Composite Structures

    NASA Astrophysics Data System (ADS)

    Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert

    2012-07-01

    Fortunately, only a few cases are known in which failure of spacecraft structures due to undetected damage has resulted in the loss of a spacecraft or launcher mission. However, several problems related to damage tolerance, and in particular delamination of composite materials, have been encountered during structure development in various ESA projects and qualification testing. To avoid such costly failures during development, launch, or service of spacecraft, launcher, and reusable launch vehicle (RLV) structures, a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool”, which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and RLV structures, addressing analytical and numerical methodologies, material-, subcomponent- and component testing, as well as non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test and inspection methods, and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space and INVENT GmbH.

  14. "Dip-and-read" paper-based analytical devices using distance-based detection with color screening.

    PubMed

    Yamada, Kentaro; Citterio, Daniel; Henry, Charles S

    2018-05-15

    An improved paper-based analytical device (PAD) using color screening to enhance device performance is described. Current detection methods for PADs relying on the distance-based signalling motif can be slow due to the assay time being limited by capillary flow rates that wick fluid through the detection zone. For traditional distance-based detection motifs, analysis can take up to 45 min for a channel length of 5 cm. By using a color screening method, quantification with a distance-based PAD can be achieved in minutes through a "dip-and-read" approach. A colorimetric indicator line deposited onto a paper substrate using inkjet-printing undergoes a concentration-dependent colorimetric response for a given analyte. This color intensity-based response has been converted to a distance-based signal by overlaying a color filter with a continuous color intensity gradient matching the color of the developed indicator line. As a proof-of-concept, Ni quantification in welding fume was performed as a model assay. The results of multiple independent user testing gave mean absolute percentage error and average relative standard deviations of 10.5% and 11.2% respectively, which were an improvement over analysis based on simple visual color comparison with a read guide (12.2%, 14.9%). In addition to the analytical performance comparison, an interference study and a shelf life investigation were performed to further demonstrate practical utility. The developed system demonstrates an alternative detection approach for distance-based PADs enabling fast (∼10 min), quantitative, and straightforward assays.

  15. High-frequency phase shift measurement greatly enhances the sensitivity of QCM immunosensors.

    PubMed

    March, Carmen; García, José V; Sánchez, Ángel; Arnau, Antonio; Jiménez, Yolanda; García, Pablo; Manclús, Juan J; Montoya, Ángel

    2015-03-15

    Although conventional (5-20 MHz) quartz crystal microbalance (QCM) sensors are widely used for in-liquid biosensing applications, improving their sensitivity remains an unsolved, challenging task. With the help of a new electronic characterization approach based on phase-change measurements at a constant fixed frequency, a highly sensitive and versatile high fundamental frequency (HFF) QCM immunosensor has been successfully developed and tested for use in pesticide (carbaryl and thiabendazole) analysis. The analytical performance of several immunosensors was compared in competitive immunoassays taking the carbaryl insecticide as the model analyte. The highest sensitivity was exhibited by the 100 MHz HFF-QCM carbaryl immunosensor. When the results were compared with those reported for 9 MHz QCM, the analytical parameters clearly showed an improvement of one order of magnitude in sensitivity (estimated as the I50 value) and two orders of magnitude in the limit of detection (LOD): 30 μg/L vs 0.66 μg/L (I50) and 11 μg/L vs 0.14 μg/L (LOD), for 9 and 100 MHz, respectively. For the fungicide thiabendazole, the I50 value was roughly the same as that previously reported for SPR under the same biochemical conditions, whereas the LOD improved by a factor of 2. The analytical performance achieved by high frequency QCM immunosensors surpassed that of conventional QCM and SPR, closely approaching the most sensitive ELISAs. The developed 100 MHz QCM immunosensor strongly improves sensitivity in biosensing and can therefore be considered a very promising new analytical tool for in-liquid applications where highly sensitive detection is required. Copyright © 2014 Elsevier B.V. All rights reserved.
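
    The gain from moving 9 MHz to 100 MHz is consistent with the Sauerbrey relation for QCM sensors (standard theory, cited here for context rather than taken from the abstract), in which the frequency shift per unit mass grows as the square of the fundamental frequency:

      \[
      \Delta f \;=\; -\,\frac{2 f_{0}^{2}}{A\sqrt{\rho_{q}\,\mu_{q}}}\,\Delta m,
      \]

    where \(A\) is the active electrode area and \(\rho_{q}\), \(\mu_{q}\) are the density and shear modulus of quartz; the scaling factor \((100/9)^{2} \approx 120\) is in line with the one-to-two order of magnitude improvements reported above.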

  16. Using Multiple Lenses to Examine the Development of Beginning Biology Teachers' Pedagogical Content Knowledge for Teaching Natural Selection Simulations

    NASA Astrophysics Data System (ADS)

    Sickel, Aaron J.; Friedrichsen, Patricia

    2018-02-01

    Pedagogical content knowledge (PCK) has become a useful construct to examine science teacher learning. Yet, researchers conceptualize PCK development in different ways. The purpose of this longitudinal study was to use three analytic lenses to understand the development of three beginning biology teachers' PCK for teaching natural selection simulations. We observed three early-career biology teachers as they taught natural selection in their respective school contexts over two consecutive years. Data consisted of six interviews with each participant. Using the PCK model developed by Magnusson et al. (1999), we examined topic-specific PCK development utilizing three different lenses: (1) expansion of knowledge within an individual knowledge base, (2) integration of knowledge across knowledge bases, and (3) knowledge that explicitly addressed core concepts of natural selection. We found commonalities across the participants, yet each lens was also useful to understand the influence of different factors (e.g., orientation, subject matter preparation, and the idiosyncratic nature of teacher knowledge) on PCK development. This multi-angle approach provides implications for considering the quality of beginning science teachers' knowledge and future research on PCK development. We conclude with an argument that explicitly communicating lenses used to understand PCK development will help the research community compare analytic approaches and better understand the nature of science teacher learning.

  17. On the development and benchmarking of an approach to model gas transport in fractured media with immobile water storage

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Ortiz, J. P.; Pandey, S.; Karra, S.; Viswanathan, H. S.; Stauffer, P. H.; Anderson, D. N.; Bradley, C. R.

    2017-12-01

    In unsaturated fractured media, the rate of gas transport is much greater than that of liquid transport in many applications (e.g., soil vapor extraction operations, methane leaks from hydraulic fracturing, shallow CO2 transport from geologic sequestration operations, and later-time radionuclide gas transport from underground nuclear explosions). However, the relatively immobile pore water can inhibit or promote gas transport for soluble constituents by providing storage. In scenarios with constant pressure gradients, gas transport will be retarded. In scenarios with reversing pressure gradients (i.e., barometric pressure variations), pore water storage can enhance gas transport by providing a ratcheting mechanism. Recognizing the computational efficiency that can be gained using a single-phase model and the necessity of considering pore water storage, we develop a Richards'-equation solution approach that includes kinetic dissolution/volatilization of constituents. Henry's Law governs the equilibrium gaseous/aqueous phase partitioning in the approach. The approach is implemented in a development branch of the PFLOTRAN simulator. We verify the approach with analytical solutions of: (1) 1D gas diffusion, (2) 1D gas advection, (3) sinusoidal barometric pumping of a fracture, and (4) gas transport along a fracture with uniform flow and diffusive walls. We demonstrate the retardation of gas transport in cases with constant pressure gradients and the enhancement of gas transport with reversing pressure gradients. The figure presents the verification of our approach against the analytical solution for barometric pumping of a fracture from Nilson et al. (1991), in which the horizontal axis is the distance into the matrix block from the fracture.
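
    In the equilibrium-partitioning limit, the retardation by immobile pore water described above takes a standard form (stated for orientation; the approach in the paper treats dissolution/volatilization kinetically rather than at equilibrium):

      \[
      R \;=\; 1 + \frac{\theta_{w}}{\theta_{a}\,K_{H}},
      \qquad
      v_{\mathrm{eff}} \;=\; \frac{v_{g}}{R},
      \]

    where \(\theta_{a}\) and \(\theta_{w}\) are the air- and water-filled porosities and \(K_{H} = C_{\mathrm{gas}}/C_{\mathrm{aq}}\) is the dimensionless Henry constant, so that more soluble gases (smaller \(K_{H}\)) are retarded more strongly under constant pressure gradients.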

  18. The Life Course Perspective on Drug Use: A Conceptual Framework for Understanding Drug Use Trajectories

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; Longshore, Douglas; Anglin, M. Douglas

    2007-01-01

    This article discusses the life course perspective on drug use, including conceptual and analytic issues involved in developing the life course framework to explain how drug use trajectories develop during an individual's lifetime and how this knowledge can guide new research and approaches to management of drug dependence. Central concepts…

  19. Dynamics of Complexity and Accuracy: A Longitudinal Case Study of Advanced Untutored Development

    ERIC Educational Resources Information Center

    Polat, Brittany; Kim, Youjin

    2014-01-01

    This longitudinal case study follows a dynamic systems approach to investigate an under-studied research area in second language acquisition, the development of complexity and accuracy for an advanced untutored learner of English. Using the analytical tools of dynamic systems theory (Verspoor et al. 2011) within the framework of complexity,…

  1. Checking Equity: Why Differential Item Functioning Analysis Should Be a Routine Part of Developing Conceptual Assessments

    ERIC Educational Resources Information Center

    Martinková, Patricia; Drabinová, Adéla; Liaw, Yuan-Ling; Sanders, Elizabeth A.; McFarland, Jenny L.; Price, Rebecca M.

    2017-01-01

    We provide a tutorial on differential item functioning (DIF) analysis, an analytic method useful for identifying potentially biased items in assessments. After explaining a number of methodological approaches, we test for gender bias in two scenarios that demonstrate why DIF analysis is crucial for developing assessments, particularly because…
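
    One widely used DIF method that such a tutorial typically covers is logistic-regression DIF: test whether group membership predicts an item response after conditioning on the total score. The sketch below uses simulated data with a known injected DIF effect.

      # Sketch of logistic-regression DIF on simulated item data.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 1000
      group = rng.integers(0, 2, n)              # 0 = reference, 1 = focal
      total = rng.normal(0, 1, n)                # matching criterion (ability proxy)
      logit = 0.8 * total + 0.5 * group          # 0.5 = injected uniform DIF
      item = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X0 = sm.add_constant(np.column_stack([total]))          # no group term
      X1 = sm.add_constant(np.column_stack([total, group]))   # with group term
      m0 = sm.Logit(item, X0).fit(disp=0)
      m1 = sm.Logit(item, X1).fit(disp=0)
      lr = 2 * (m1.llf - m0.llf)                 # likelihood-ratio stat, 1 df
      print(f"LR stat = {lr:.2f} (chi2(1) critical value at 5%: 3.84)")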

  2. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  3. Promoting Appropriate Behavior in Daily Life Contexts Using Functional Analytic Psychotherapy in Early-Adolescent Children

    ERIC Educational Resources Information Center

    Cattivelli, Roberto; Tirelli, Valentina; Berardo, Federica; Perini, Silvia

    2012-01-01

    The topics of social skills development in adolescents and ways to promote this process have been amply investigated in both the clinical and educational literature. Yet, although this line of research has led to the development of many different approaches for this population, most have shown little effectiveness in promoting further social…

  4. Cotinine analytical workshop report: consideration of analytical methods for determining cotinine in human body fluids as a measure of passive exposure to tobacco smoke.

    PubMed Central

    Watts, R R; Langone, J J; Knight, G J; Lewtas, J

    1990-01-01

    A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812

  5. Cross-Disciplinary Consultancy to Bridge Public Health Technical Needs and Analytic Developers: Asyndromic Surveillance Use Case

    PubMed Central

    Faigen, Zachary; Deyneka, Lana; Ising, Amy; Neill, Daniel; Conway, Mike; Fairchild, Geoffrey; Gunn, Julia; Swenson, David; Painter, Ian; Johnson, Lauren; Kiley, Chris; Streichert, Laura; Burkom, Howard

    2015-01-01

    Introduction: We document a funded effort to bridge the gap between constrained scientific challenges of public health surveillance and methodologies from academia and industry. Component tasks are the collection of epidemiologists’ use case problems, multidisciplinary consultancies to refine them, and dissemination of problem requirements and shareable datasets. We describe an initial use case and consultancy as a concrete example and challenge to developers. Materials and Methods: Supported by the Defense Threat Reduction Agency Biosurveillance Ecosystem project, the International Society for Disease Surveillance formed an advisory group to select tractable use case problems and convene inter-disciplinary consultancies to translate analytic needs into well-defined problems and to promote development of applicable solution methods. The initial consultancy’s focus was a problem originated by the North Carolina Department of Health and its NC DETECT surveillance system: Derive a method for detection of patient record clusters worthy of follow-up based on free-text chief complaints and without syndromic classification. Results: Direct communication between public health problem owners and analytic developers was informative to both groups and constructive for the solution development process. The consultancy achieved refinement of the asyndromic detection challenge and of solution requirements. Participants summarized and evaluated solution approaches and discussed dissemination and collaboration strategies. Practice Implications: A solution meeting the specification of the use case described above could improve human monitoring efficiency with expedited warning of events requiring follow-up, including otherwise overlooked events with no syndromic indicators. This approach can remove obstacles to collaboration with efficient, minimal data-sharing and without costly overhead. PMID:26834939

  6. A two-dimensional analytical model of vapor intrusion involving vertical heterogeneity.

    PubMed

    Yao, Yijun; Verginelli, Iason; Suuberg, Eric M

    2017-05-01

    In this work, we present an analytical chlorinated vapor intrusion (CVI) model that can estimate source-to-indoor-air concentration attenuation by simulating the two-dimensional (2-D) vapor concentration profile in vertically heterogeneous soils overlying a homogeneous vapor source. The analytical solution describing the 2-D soil gas transport was obtained by applying a modified Schwarz-Christoffel mapping method. A partial field validation showed that the developed model provides results (especially in terms of indoor emission rates) in line with the measured data from a case involving a building overlying a layered soil. In further testing, it was found that the new analytical model can very closely replicate the results of three-dimensional (3-D) numerical models at steady state in scenarios involving layered soils overlying homogeneous groundwater sources. By contrast, by adopting a two-layer approach (capillary fringe and vadose zone) as employed in the EPA implementation of the Johnson and Ettinger model, the spatially and temporally averaged indoor concentrations in the case of groundwater sources can be up to two orders of magnitude higher than those estimated by the numerical model. In short, the model proposed in this work can represent an easy-to-use tool that simulates the subsurface soil gas concentration in layered soils overlying a homogeneous vapor source while keeping the simplicity of an analytical approach that requires much less computational effort.
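
    The role of vertical layering can be illustrated with a much simpler 1-D stand-in for the 2-D model above: Millington-Quirk effective diffusivities per layer, combined as resistances in series. All soil and chemical values below are hypothetical.

      # Sketch: series combination of layer diffusion resistances with
      # Millington-Quirk effective diffusivities (1-D simplification, not
      # the authors' Schwarz-Christoffel solution). Values are hypothetical.
      D_AIR = 7.0e-6   # m^2/s, TCE in air (approximate)
      D_WAT = 1.0e-9   # m^2/s, TCE in water (approximate)
      KH = 0.4         # dimensionless Henry constant, C_gas/C_aq

      # (thickness m, air-filled porosity, water-filled porosity)
      layers = [(1.0, 0.28, 0.10),   # sandy vadose zone
                (0.5, 0.05, 0.33),   # clay lens
                (0.1, 0.01, 0.37)]   # capillary fringe

      def d_eff(theta_a, theta_w):
          theta_t = theta_a + theta_w
          return (D_AIR * theta_a ** (10 / 3)
                  + D_WAT / KH * theta_w ** (10 / 3)) / theta_t ** 2

      total_l = sum(l for l, _, _ in layers)
      d_overall = total_l / sum(l / d_eff(a, w) for l, a, w in layers)
      print(f"overall effective diffusivity = {d_overall:.2e} m^2/s")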

  7. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
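
    A common shoaling measure computed from multi-fish tracking output is the mean inter-individual distance per frame; the sketch below computes it on simulated coordinates (the authors' analysis tool is written in R, so this Python sketch is an independent illustration, not their code).

      # Sketch: mean inter-individual distance (IID) per frame from
      # multi-fish tracking coordinates. Input layout is hypothetical.
      import numpy as np

      def mean_iid(frame_xy):
          """frame_xy: (n_fish, 2) array of x, y positions in one frame."""
          diffs = frame_xy[:, None, :] - frame_xy[None, :, :]
          dists = np.sqrt((diffs ** 2).sum(-1))
          n = len(frame_xy)
          return dists[np.triu_indices(n, k=1)].mean()

      # e.g. 5 fish tracked over 1000 frames in a 30 cm arena (simulated)
      track = np.random.default_rng(0).uniform(0, 30, size=(1000, 5, 2))
      iid_series = np.array([mean_iid(f) for f in track])
      print(f"mean IID: {iid_series.mean():.1f} cm (SD {iid_series.std():.1f})")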

  9. Political and clinical developments in analytical psychology, 1972-2014: subjectivity, equality and diversity-inside and outside the consulting room.

    PubMed

    Samuels, Andrew

    2014-11-01

    Utilizing Jung's idea of theory as a 'personal confession', the author charts his own development as a theorist, establishing links between his personal history and his ideas. Such links include his relationship with both parents, his sexuality, his cultural heritage, and his fascination with Tricksters and with Hermes. There follows a substantial critical interrogation of what the author discerns as the two main lines of clinical theorizing in contemporary analytical psychotherapy: interpretation of transference-countertransference, and the relational approach. His conclusion is that neither is superior to the other and neither is in fact adequate as a basis for clinical work. The focus then shifts to explore a range of political and social aspects of the clinical project of analytical psychology: economic inequality, diversity within the professional field, and Jung's controversial ideas about Jews and Africans. The author calls for an apology from the 'Jungian community' for remarks about Africans analogous to the apology already issued for remarks about Jews. The paper is dedicated to the author's friend Fred Plaut (1913-2009). © 2014, The Society of Analytical Psychology.

  10. Large space structures controls research and development at Marshall Space Flight Center: Status and future plans

    NASA Technical Reports Server (NTRS)

    Buchanan, H. J.

    1983-01-01

    Work performed in the Large Space Structures Controls research and development program at Marshall Space Flight Center is described. Studies to develop a multilevel control approach that supports a modular, or building block, approach to the buildup of space platforms are discussed. A concept has been developed and tested in a three-axis computer simulation utilizing a five-body model of a basic space platform module. Analytical efforts have continued to focus on extension of the basic theory and its subsequent application. Consideration is also given to specifications for evaluating several algorithms for controlling the shape of Large Space Structures.

  11. Quantitative determination of carcinogenic mycotoxins in human and animal biological matrices and animal-derived foods using multi-mycotoxin and analyte-specific high performance liquid chromatography-tandem mass spectrometric methods.

    PubMed

    Cao, Xiaoqin; Li, Xiaofei; Li, Jian; Niu, Yunhui; Shi, Lu; Fang, Zhenfeng; Zhang, Tao; Ding, Hong

    2018-01-15

    A sensitive and reliable multi-mycotoxin-based method was developed to identify and quantify several carcinogenic mycotoxins in human blood and urine, as well as edible animal tissues, including muscle and liver tissue from swine and chickens, using liquid chromatography-tandem mass spectrometry (LC-MS/MS). For the toxicokinetic studies with individual mycotoxins, highly sensitive analyte-specific LC-MS/MS methods were developed for rat plasma and urine. Sample purification consisted of a rapid 'dilute and shoot' approach for urine samples, a simple 'dilute, evaporate and shoot' approach for plasma samples and a 'QuEChERS' procedure for edible animal tissues. The multi-mycotoxin and analyte-specific methods were validated in-house: the limits of detection (LOD) for the multi-mycotoxin and analyte-specific methods ranged from 0.02 to 0.41 μg/kg (μg/L) and from 0.01 to 0.19 μg/L, respectively, and the limits of quantification (LOQ) ranged from 0.10 to 1.02 μg/kg (μg/L) and from 0.09 to 0.47 μg/L, respectively. Apparent recoveries of the samples spiked with 0.25 to 4 μg/kg (μg/L) ranged from 60.1% to 109.8%, with relative standard deviations below 15%. The methods were successfully applied to real samples. To the best of our knowledge, this is the first study carried out using a small group of patients from the Chinese population with hepatocellular carcinoma to assess their exposure to carcinogenic mycotoxins using biomarkers. Finally, the multi-mycotoxin method is a useful analytical method for assessing exposure to mycotoxins in edible animal tissues. The analyte-specific methods could be useful during toxicokinetic and toxicological studies. Copyright © 2017. Published by Elsevier B.V.
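
    The abstract does not state how the LOD/LOQ figures were derived; a common convention (ICH Q2(R1)) estimates them from a calibration curve as 3.3·σ/S and 10·σ/S. The sketch below shows that convention only as an assumption, with invented calibration data.

    ```python
    # Minimal sketch of the ICH calibration-curve approach to LOD/LOQ
    # (3.3*sigma/S and 10*sigma/S). Whether the cited study used this exact
    # convention is an assumption; the spiking levels and areas are invented.
    import numpy as np

    conc = np.array([0.25, 0.5, 1.0, 2.0, 4.0])                   # ug/kg
    response = np.array([310.0, 640.0, 1250.0, 2480.0, 5050.0])   # peak areas

    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # SD of regression residuals (n - 2 dof)

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"LOD = {lod:.3f} ug/kg, LOQ = {loq:.3f} ug/kg")
    ```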

  12. Quantifying Acoustic Impacts on Marine Mammals and Sea Turtles: Methods and Analytical Approach for Phase III Training and Testing

    DTIC Science & Technology

    2017-06-16

    Blackstock, Sarah A.; Joseph O... December 2017. ...Navy's Phase III Study Areas as described in each Environmental Impact Statement/Overseas Environmental Impact Statement and describes the methods...

  13. Improving early cycle economic evaluation of diagnostic technologies.

    PubMed

    Steuten, Lotte M G; Ramsey, Scott D

    2014-08-01

    The rapidly increasing range and expense of new diagnostics compels consideration of a different, more proactive approach to the health economic evaluation of diagnostic technologies. Early cycle economic evaluation is a decision-analytic approach to evaluating technologies in development so as to increase the return on investment as well as the patient and societal impact. This paper describes examples of 'early cycle economic evaluations' as applied to diagnostic technologies and highlights challenges in their real-time application. It shows that, especially in the field of diagnostics, with rapid technological developments and a changing regulatory climate, early cycle economic evaluation can play a guiding role in improving the efficiency of the diagnostics innovation process. In the next five years, attention will move beyond the methodological and analytic challenges of early cycle economic evaluation towards the challenge of effectively applying it to improve diagnostic research and development and patient value. Future work in this area should therefore be 'strong on principles and soft on metrics', that is, favor the metrics that resonate most clearly with the various decision makers in this field.

  14. Quality by design (QbD), Process Analytical Technology (PAT), and design of experiment applied to the development of multifunctional sunscreens.

    PubMed

    Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim

    2017-02-01

    Multifunctional formulations are of great importance to ensure better skin protection from harm caused by ultraviolet (UV) radiation. Despite the advantages of the Quality by Design and Process Analytical Technology approaches for the development and optimization of new products, we found in the literature only a few studies concerning their application in the cosmetic industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of sunscreens using chemometric analyses. Linear discriminant analysis allowed the classification of unknown formulations, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of the ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to examine the compounds in isolation and in combination and to confirm the antioxidant action of ferulic acid alongside its sunscreen action, since the presence of this component increased the in vitro antioxidant activity by 90%.
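
    A minimal sketch of multi-component quantification with PLS regression, in the spirit of the step described above. The "spectra" are synthetic stand-ins for UV transmittance data; the component count and noise levels are assumptions, not the study's setup.

    ```python
    # Sketch: calibrate a PLS model that predicts three component
    # concentrations from a full spectrum, then score it on held-out mixtures.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 60, 120
    Y = rng.uniform(0.0, 5.0, size=(n_samples, 3))   # three actives, % w/w
    pure = rng.random((3, n_wavelengths))            # pure-component "spectra"
    X = Y @ pure + rng.normal(0.0, 0.02, size=(n_samples, n_wavelengths))

    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25,
                                              random_state=0)
    pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
    print("R^2 on held-out mixtures:", round(pls.score(X_te, Y_te), 3))
    ```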

  15. Combining analytical hierarchy process and agglomerative hierarchical clustering in search of expert consensus in green corridors development management.

    PubMed

    Shapira, Aviad; Shoshany, Maxim; Nir-Goldenberg, Sigal

    2013-07-01

    Environmental management and planning are instrumental in resolving conflicts arising between societal needs for economic development on the one hand and for open green landscapes on the other. Allocating green corridors between fragmented core green areas may provide a partial solution to these conflicts. Decisions regarding green corridor development require the assessment of alternative allocations based on multiple criteria evaluations. The Analytical Hierarchy Process provides a methodology both for a structured and consistent extraction of such evaluations and for the search for consensus among experts regarding the weights assigned to the different criteria. Implementing this methodology with 15 Israeli experts (landscape architects, regional planners, and geographers) revealed inherent differences in expert opinions in this field beyond professional divisions. The use of Agglomerative Hierarchical Clustering allowed the identification of clusters representing common decisions regarding criterion weights. Aggregating the evaluations of these clusters revealed an important dichotomy between a pragmatist approach that emphasizes the weight of statutory criteria and an ecological approach that emphasizes the role of natural conditions in allocating green landscape corridors.
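
    A sketch of the two building blocks named in the abstract: AHP criterion weights from a pairwise-comparison matrix (principal eigenvector, with Saaty's consistency ratio), then agglomerative clustering of the experts' weight vectors. The comparison values, criterion names, and expert data are invented for illustration.

    ```python
    # Sketch: AHP weights plus hierarchical clustering of expert weight vectors.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    def ahp_weights(A):
        """Principal-eigenvector weights and Saaty consistency ratio."""
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = A.shape[0]
        ci = (vals.real[k] - n) / (n - 1)
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
        return w, ci / ri

    # One expert's pairwise comparisons of 3 hypothetical criteria
    # (statutory, ecological, scenic).
    A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], dtype=float)
    w, cr = ahp_weights(A)
    print("weights:", w.round(3), "consistency ratio:", round(cr, 3))

    # Cluster 15 experts' weight vectors to look for consensus groups.
    experts = np.random.default_rng(1).dirichlet([2, 2, 2], size=15)
    labels = fcluster(linkage(experts, method="ward"), t=2,
                      criterion="maxclust")
    print("cluster assignment per expert:", labels)
    ```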

  16. Combining Analytical Hierarchy Process and Agglomerative Hierarchical Clustering in Search of Expert Consensus in Green Corridors Development Management

    NASA Astrophysics Data System (ADS)

    Shapira, Aviad; Shoshany, Maxim; Nir-Goldenberg, Sigal

    2013-07-01

    Environmental management and planning are instrumental in resolving conflicts arising between societal needs for economic development on the one hand and for open green landscapes on the other. Allocating green corridors between fragmented core green areas may provide a partial solution to these conflicts. Decisions regarding green corridor development require the assessment of alternative allocations based on multiple criteria evaluations. The Analytical Hierarchy Process provides a methodology both for a structured and consistent extraction of such evaluations and for the search for consensus among experts regarding the weights assigned to the different criteria. Implementing this methodology with 15 Israeli experts—landscape architects, regional planners, and geographers—revealed inherent differences in expert opinions in this field beyond professional divisions. The use of Agglomerative Hierarchical Clustering allowed the identification of clusters representing common decisions regarding criterion weights. Aggregating the evaluations of these clusters revealed an important dichotomy between a pragmatist approach that emphasizes the weight of statutory criteria and an ecological approach that emphasizes the role of natural conditions in allocating green landscape corridors.

  17. Quantification of free and total desmosine and isodesmosine in human urine by liquid chromatography tandem mass spectrometry: a comparison of the surrogate-analyte and the surrogate-matrix approach for quantitation.

    PubMed

    Ongay, Sara; Hendriks, Gert; Hermans, Jos; van den Berge, Maarten; ten Hacken, Nick H T; van de Merbel, Nico C; Bischoff, Rainer

    2014-01-24

    In spite of data suggesting the potential of urinary desmosine (DES) and isodesmosine (IDS) as biomarkers for elevated lung elastic fiber turnover, further validation in large-scale studies of COPD populations, as well as the analysis of longitudinal samples, is required. Validated analytical methods that allow the accurate and precise quantification of DES and IDS in human urine are mandatory in order to properly evaluate the outcome of such clinical studies. In this work, we present the development and full validation of two methods that allow DES and IDS measurement in human urine, one for the free and one for the total (free+peptide-bound) forms. To this end, we compared the two principal approaches used for the absolute quantification of endogenous compounds in biological samples: analysis against calibrators containing authentic analyte in a surrogate matrix, or containing a surrogate analyte in the authentic matrix. The validated methods were employed for the analysis of a small set of samples including healthy never-smokers, healthy current-smokers and COPD patients. This is the first time that the analysis of urinary free DES, free IDS, total DES, and total IDS has been fully validated and that the surrogate-analyte approach has been evaluated for their quantification in biological samples. Results indicate that the presented methods have the necessary quality and level of validation to assess the potential of urinary DES and IDS levels as biomarkers for the progression of COPD and the effect of therapeutic interventions. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Multi-site study of additive genetic effects on fractional anisotropy of cerebral white matter: Comparing meta- and mega-analytical approaches for data pooling.

    PubMed

    Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E; Mandl, René C; Almasy, Laura; Booth, Tom; Brouwer, Rachel M; Curran, Joanne E; de Zubicaray, Greig I; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T; Hong, L Elliot; Landman, Bennett A; Lemaitre, Hervé; Lopez, Lorna M; Martin, Nicholas G; McMahon, Katie L; Mitchell, Braxton D; Olvera, Rene L; Peterson, Charles P; Starr, John M; Sussmann, Jessika E; Toga, Arthur W; Wardlaw, Joanna M; Wright, Margaret J; Wright, Susan N; Bastin, Mark E; McIntosh, Andrew M; Boomsma, Dorret I; Kahn, René S; den Braber, Anouk; de Geus, Eco J C; Deary, Ian J; Hulshoff Pol, Hilleke E; Williamson, Douglas E; Blangero, John; van 't Ent, Dennis; Thompson, Paul M; Glahn, David C

    2014-07-15

    Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for the joint analysis of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time, to understand the variability of the estimates. Overall, the meta- and mega-genetic analyses produced robust estimates of heritability. Copyright © 2014 Elsevier Inc. All rights reserved.
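
    A minimal sketch contrasting the two meta-analytical weightings named in the abstract: sample-size weighting and inverse-variance (standard-error) weighting of per-cohort heritability estimates. All cohort numbers below are invented.

    ```python
    # Sketch: pool per-cohort heritability estimates two ways.
    import numpy as np

    h2 = np.array([0.55, 0.70, 0.62, 0.48, 0.66])   # per-cohort heritability
    se = np.array([0.08, 0.05, 0.06, 0.10, 0.07])   # standard errors
    n  = np.array([350, 800, 500, 220, 380])        # cohort sample sizes

    w_n  = n / n.sum()                               # sample-size weights
    w_se = (1.0 / se**2) / (1.0 / se**2).sum()       # inverse-variance weights

    print("sample-size weighted h2:", round((w_n * h2).sum(), 3))
    print("inverse-variance weighted h2:", round((w_se * h2).sum(), 3))
    print("pooled SE (inverse-variance):",
          round((1.0 / (1.0 / se**2).sum()) ** 0.5, 3))
    ```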

  19. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has limited the necessary collaboration and, more importantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  20. Prediction of In-hospital Mortality in Emergency Department Patients With Sepsis: A Local Big Data-Driven, Machine Learning Approach.

    PubMed

    Taylor, R Andrew; Pare, Joseph R; Venkatesh, Arjun K; Mowafi, Hani; Melnick, Edward R; Fleischman, William; Hall, M Kennedy

    2016-03-01

    Predictive analytics in emergency care has mostly been limited to the use of clinical decision rules (CDRs) in the form of simple heuristics and scoring systems. In the development of CDRs, limitations in analytic methods and concerns with usability have generally constrained models to a preselected small set of variables judged to be clinically relevant and to rules that are easily calculated. Furthermore, CDRs frequently suffer from questions of generalizability, take years to develop, and lack the ability to be updated as new information becomes available. Newer analytic and machine learning techniques capable of harnessing the large number of variables that are already available through electronic health records (EHRs) may better predict patient outcomes and facilitate automation and deployment within clinical decision support systems. In this proof-of-concept study, a local, big data-driven, machine learning approach is compared to existing CDRs and traditional analytic methods using the prediction of sepsis in-hospital mortality as the use case. This was a retrospective study of adult ED visits admitted to the hospital meeting criteria for sepsis from October 2013 to October 2014. Sepsis was defined as meeting criteria for systemic inflammatory response syndrome with an infectious admitting diagnosis in the ED. ED visits were randomly partitioned into an 80%/20% split for training and validation. A random forest model (machine learning approach) was constructed using over 500 clinical variables from data available within the EHRs of four hospitals to predict in-hospital mortality. The machine learning prediction model was then compared to a classification and regression tree (CART) model, logistic regression model, and previously developed prediction tools on the validation data set using area under the receiver operating characteristic curve (AUC) and chi-square statistics. There were 5,278 visits among 4,676 unique patients who met criteria for sepsis. Of the 4,222 patients in the training group, 210 (5.0%) died during hospitalization, and of the 1,056 patients in the validation group, 50 (4.7%) died during hospitalization. The AUCs with 95% confidence intervals (CIs) for the different models were as follows: random forest model, 0.86 (95% CI = 0.82 to 0.90); CART model, 0.69 (95% CI = 0.62 to 0.77); logistic regression model, 0.76 (95% CI = 0.69 to 0.82); CURB-65, 0.73 (95% CI = 0.67 to 0.80); MEDS, 0.71 (95% CI = 0.63 to 0.77); and mREMS, 0.72 (95% CI = 0.65 to 0.79). The random forest model AUC was statistically different from all other models (p ≤ 0.003 for all comparisons). In this proof-of-concept study, a local big data-driven, machine learning approach outperformed existing CDRs as well as traditional analytic techniques for predicting in-hospital mortality of ED patients with sepsis. Future research should prospectively evaluate the effectiveness of this approach and whether it translates into improved clinical outcomes for high-risk sepsis patients. The methods developed serve as an example of a new model for predictive analytics in emergency care that can be automated, applied to other clinical outcomes of interest, and deployed in EHRs to enable locally relevant clinical predictions. © 2015 by the Society for Academic Emergency Medicine.

  1. Prediction of In-hospital Mortality in Emergency Department Patients With Sepsis: A Local Big Data–Driven, Machine Learning Approach

    PubMed Central

    Taylor, R. Andrew; Pare, Joseph R.; Venkatesh, Arjun K.; Mowafi, Hani; Melnick, Edward R.; Fleischman, William; Hall, M. Kennedy

    2018-01-01

    Objectives Predictive analytics in emergency care has mostly been limited to the use of clinical decision rules (CDRs) in the form of simple heuristics and scoring systems. In the development of CDRs, limitations in analytic methods and concerns with usability have generally constrained models to a preselected small set of variables judged to be clinically relevant and to rules that are easily calculated. Furthermore, CDRs frequently suffer from questions of generalizability, take years to develop, and lack the ability to be updated as new information becomes available. Newer analytic and machine learning techniques capable of harnessing the large number of variables that are already available through electronic health records (EHRs) may better predict patient outcomes and facilitate automation and deployment within clinical decision support systems. In this proof-of-concept study, a local, big data–driven, machine learning approach is compared to existing CDRs and traditional analytic methods using the prediction of sepsis in-hospital mortality as the use case. Methods This was a retrospective study of adult ED visits admitted to the hospital meeting criteria for sepsis from October 2013 to October 2014. Sepsis was defined as meeting criteria for systemic inflammatory response syndrome with an infectious admitting diagnosis in the ED. ED visits were randomly partitioned into an 80%/20% split for training and validation. A random forest model (machine learning approach) was constructed using over 500 clinical variables from data available within the EHRs of four hospitals to predict in-hospital mortality. The machine learning prediction model was then compared to a classification and regression tree (CART) model, logistic regression model, and previously developed prediction tools on the validation data set using area under the receiver operating characteristic curve (AUC) and chi-square statistics. Results There were 5,278 visits among 4,676 unique patients who met criteria for sepsis. Of the 4,222 patients in the training group, 210 (5.0%) died during hospitalization, and of the 1,056 patients in the validation group, 50 (4.7%) died during hospitalization. The AUCs with 95% confidence intervals (CIs) for the different models were as follows: random forest model, 0.86 (95% CI = 0.82 to 0.90); CART model, 0.69 (95% CI = 0.62 to 0.77); logistic regression model, 0.76 (95% CI = 0.69 to 0.82); CURB-65, 0.73 (95% CI = 0.67 to 0.80); MEDS, 0.71 (95% CI = 0.63 to 0.77); and mREMS, 0.72 (95% CI = 0.65 to 0.79). The random forest model AUC was statistically different from all other models (p ≤ 0.003 for all comparisons). Conclusions In this proof-of-concept study, a local big data–driven, machine learning approach outperformed existing CDRs as well as traditional analytic techniques for predicting in-hospital mortality of ED patients with sepsis. Future research should prospectively evaluate the effectiveness of this approach and whether it translates into improved clinical outcomes for high-risk sepsis patients. The methods developed serve as an example of a new model for predictive analytics in emergency care that can be automated, applied to other clinical outcomes of interest, and deployed in EHRs to enable locally relevant clinical predictions. PMID:26679719
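
    A minimal sketch of the modeling comparison described above: a random forest versus logistic regression on an 80%/20% split, scored by AUC. Synthetic data stand in for the EHR variables; the class imbalance and feature counts merely echo the study's scale and nothing here reproduces its features or results.

    ```python
    # Sketch: compare a machine-learning model to a traditional one by AUC.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=5278, n_features=500,
                               n_informative=40, weights=[0.95, 0.05],
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                              stratify=y, random_state=0)

    models = [("random forest", RandomForestClassifier(n_estimators=500,
                                                       random_state=0)),
              ("logistic regression", LogisticRegression(max_iter=2000))]
    for name, model in models:
        model.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{name}: AUC = {auc:.2f}")
    ```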

  2. Two approaches to estimating the effect of parenting on the development of executive function in early childhood.

    PubMed

    Blair, Clancy; Raver, C Cybele; Berry, Daniel J

    2014-02-01

    In the current article, we contrast 2 analytical approaches to estimating the relation of parenting to executive function development in a sample of 1,292 children assessed longitudinally between 36 and 60 months of age. Children were administered a newly developed and validated battery of 6 executive function tasks tapping inhibitory control, working memory, and attention shifting. Residualized change analysis indicated that higher-quality parenting, as reflected in higher scores on widely used measures of parenting at both earlier and later time points, predicted more positive gain in executive function at 60 months. Latent change score models, in which parenting and executive function over time were held to standards of longitudinal measurement invariance, provided additional evidence of the association between change in parenting quality and change in executive function. In these models, cross-lagged paths indicated that in addition to parenting predicting change in executive function, executive function bidirectionally predicted change in parenting quality. Results were robust with the addition of covariates, including child sex, race, maternal education, and household income-to-needs ratio. Strengths and drawbacks of the 2 analytic approaches are discussed, and the findings are considered in light of emerging methodological innovations for testing the extent to which executive function is malleable and open to the influence of experience.

  3. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    NASA Technical Reports Server (NTRS)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

    In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and the propagation methods need further development before the fully numerical method becomes useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  4. Protein instability and immunogenicity: roadblocks to clinical application of injectable protein delivery systems for sustained release.

    PubMed

    Jiskoot, Wim; Randolph, Theodore W; Volkin, David B; Middaugh, C Russell; Schöneich, Christian; Winter, Gerhard; Friess, Wolfgang; Crommelin, Daan J A; Carpenter, John F

    2012-03-01

    Protein instability and immunogenicity are two main roadblocks to the clinical success of novel protein drug delivery systems. In this commentary, we discuss the need for more extensive analytical characterization in relation to concerns about protein instability in injectable drug delivery systems for sustained release. We then will briefly address immunogenicity concerns and outline current best practices for using state-of-the-art analytical assays to monitor protein stability for both conventional and novel therapeutic protein dosage forms. Next, we provide a summary of the stresses on proteins arising during preparation of drug delivery systems and subsequent in vivo release. We note the challenges and difficulties in achieving the absolute requirement of quantitatively assessing the degradation of protein molecules in a drug delivery system. We describe the potential roles for academic research in further improving protein stability and developing new analytical technologies to detect protein degradation byproducts in novel drug delivery systems. Finally, we provide recommendations for the appropriate approaches to formulation design and assay development to ensure that stable, minimally immunogenic formulations of therapeutic proteins are created. These approaches should help to increase the probability that novel drug delivery systems for sustained protein release will become more readily available as effective therapeutic agents to treat and benefit patients. Copyright © 2011 Wiley Periodicals, Inc.

  5. Current antiviral drugs and their analysis in biological materials - Part II: Antivirals against hepatitis and HIV viruses.

    PubMed

    Nováková, Lucie; Pavlík, Jakub; Chrenková, Lucia; Martinec, Ondřej; Červený, Lukáš

    2018-01-05

    This review is Part II of a series aiming to provide a comprehensive overview of currently used antiviral drugs and to show modern approaches to their analysis. While Part I addressed antivirals against herpes viruses and antivirals against respiratory viruses, this part concerns antivirals against hepatitis viruses (B and C) and the human immunodeficiency virus (HIV). Many novel antivirals against hepatitis C virus (HCV) and HIV have been introduced into clinical practice over the last decade. The recently broadening portfolio of these groups of antivirals is reflected in the increasing number of analytical methods developed to meet the needs of clinical practice. Part II summarizes the mechanisms of action of antivirals against hepatitis B virus (HBV), HCV, and HIV, their use in clinical practice, and analytical methods for the individual classes. It also provides an expert opinion on the state of the art in the bioanalysis of these drugs. The analytical methods reflect the novelty of these chemical structures and use by far the most current approaches, such as simple and high-throughput sample preparation and fast separation, often by means of UHPLC-MS/MS. Proper method validation based on the requirements of bioanalytical guidelines is an inherent part of the developed methods. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Dissecting the human immunologic memory for pathogens.

    PubMed

    Zielinski, Christina E; Corti, Davide; Mele, Federico; Pinto, Dora; Lanzavecchia, Antonio; Sallusto, Federica

    2011-03-01

    Studies on immunologic memory in animal models and especially in the human system are instrumental to identify mechanisms and correlates of protection necessary for vaccine development. In this article, we provide an overview of the cellular basis of immunologic memory. We also describe experimental approaches based on high throughput cell cultures, which we have developed to interrogate human memory T cells, B cells, and plasma cells. We discuss how these approaches can provide new tools and information for vaccine design, in a process that we define as 'analytic vaccinology'. © 2011 John Wiley & Sons A/S.

  7. Analytic theory of the selection mechanism in the Saffman-Taylor problem. [concerning shape of fingers in Hele-Shaw cell

    NASA Technical Reports Server (NTRS)

    Hong, D. C.; Langer, J. S.

    1986-01-01

    An analytic approach to the problem of predicting the widths of fingers in a Hele-Shaw cell is presented. The analysis is based on the WKB technique developed recently for dealing with the effects of surface tension in the problem of dendritic solidification. It is found that the relation between the dimensionless width lambda and the dimensionless group of parameters containing the surface tension, nu, has the form lambda - 1/2 ∝ nu^(2/3) in the limit of small nu.

  8. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  9. Developments in Impeller/Seal Secondary Flow Path Modeling for Dynamic Force Coefficients and Leakage

    NASA Technical Reports Server (NTRS)

    Palazzolo, Alan; Bhattacharya, Avijit; Athavale, Mahesh; Venkataraman, Balaji; Ryan, Steve; Funston, Kerry

    1997-01-01

    This paper highlights bulk-flow and CFD-based models prepared to calculate force and leakage properties for seals and shrouded-impeller leakage paths. The bulk-flow approach uses a Hirs-based friction model, and the CFD approach solves the Navier-Stokes (NS) equations with a finite whirl orbit or via analytical perturbation. The results show good agreement in most instances with available benchmarks.

  10. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1992-01-01

    Research conducted during the period from July 1991 through December 1992 is covered. A method based upon the quasi-analytical approach was developed for computing the aerodynamic sensitivity coefficients of three-dimensional wings in transonic and subsonic flow. In addition, for comparison purposes, the method computes the aerodynamic sensitivity coefficients using the finite-difference approach. The accuracy and validity of the methods are currently under investigation.

  11. Shuttle payload bay dynamic environments: Summary and conclusion report for STS flights 1-5 and 9

    NASA Technical Reports Server (NTRS)

    Oconnell, M.; Garba, J.; Kern, D.

    1984-01-01

    The vibration, acoustic, and low-frequency loads data from the first 5 shuttle flights are presented, together with the engineering analysis of those data. Vibroacoustic data from STS-9 are also presented because they represent the only data taken on a large payload. Payload dynamic environment predictions developed with the participation of various NASA and industrial centers are presented, along with a comparison of analytical loads methodology predictions with flight data, including a brief description of the methodologies employed in developing those predictions for payloads. The review of prediction methodologies illustrates how different centers have approached the problems of developing shuttle dynamic environmental predictions and criteria. Ongoing research activities related to the shuttle dynamic environments are also described, as is analytical software recently developed for the prediction of payload acoustic and vibration environments.

  12. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  13. Combining Model-Based and Feature-Driven Diagnosis Approaches - A Case Study on Electromechanical Actuators

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Roychoudhury, Indranil; Balaban, Edward; Saxena, Abhinav

    2010-01-01

    Model-based diagnosis typically uses analytical redundancy to compare predictions from a model against observations from the system being diagnosed. However, this approach does not work very well when it is not feasible to create analytic relations describing all the observed data, e.g., for vibration data, which is usually sampled at very high rates and requires very detailed finite element models to describe its behavior. In such cases, features (in the time and frequency domains) that contain diagnostic information are extracted from the data. Since this is a computationally intensive process, it is not efficient to extract all the features all the time. In this paper, we present an approach that combines the analytic model-based and feature-driven diagnosis approaches. The analytic approach is used to reduce the set of possible faults, and features are then chosen to best distinguish among the remaining faults. We describe an implementation of this approach on the Flyable Electro-mechanical Actuator (FLEA) test bed.

  14. From analytic inversion to contemporary IMRT optimization: Radiation therapy planning revisited from a mathematical perspective

    PubMed Central

    Censor, Yair; Unkelbach, Jan

    2011-01-01

    In this paper we look at the development of radiation therapy treatment planning from a mathematical point of view. Historically, planning for Intensity-Modulated Radiation Therapy (IMRT) has been considered as an inverse problem. We discuss first the two fundamental approaches that have been investigated to solve this inverse problem: Continuous analytic inversion techniques on one hand, and fully-discretized algebraic methods on the other hand. In the second part of the paper, we review another fundamental question which has been subject to debate from the beginning of IMRT until the present day: The rotation therapy approach versus fixed angle IMRT. This builds a bridge from historic work on IMRT planning to contemporary research in the context of Intensity-Modulated Arc Therapy (IMAT). PMID:21616694

  15. An analytical approach to the rise velocity of periodic bubble trains in non-Newtonian fluids.

    PubMed

    Frank, X; Li, H Z; Funfschilling, D

    2005-01-01

    The present study aims at providing insight into the acceleration mechanism of a bubble chain rising in shear-thinning viscoelastic fluids. The experimental investigation by Particle Image Velocimetry (PIV), birefringence visualisation and rheological simulation shows that two aspects are central to bubble interactions in such media: the creation of stress by the passage of bubbles, and its relaxation due to the fluid's memory, forming an evanescent corridor of reduced viscosity. Interactions between bubbles were taken into account mainly through a linear superposition of the stress evolution behind each bubble. An analytical approach, together with these rheological considerations, was developed to compute the rise velocity of a bubble chain as a function of the injection period and bubble volume. The model predictions compare satisfactorily with the experimental investigation.

  16. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
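
    An illustrative construction of the kind of probabilistic matrix the abstract refers to, not the paper's exact matrix-based model: a detection-probability matrix D with D[i, j] = P(check j detects fault i), combined into per-fault and expected system-level detection probabilities. All probabilities below are invented, and independence of checks is an added assumption.

    ```python
    # Sketch: probabilistic evaluation of on-line checks via a detection matrix.
    import numpy as np

    # 3 fault classes x 4 on-line checks; probabilities are hypothetical.
    D = np.array([[0.90, 0.50, 0.00, 0.20],
                  [0.10, 0.95, 0.60, 0.00],
                  [0.00, 0.30, 0.85, 0.70]])
    fault_prob = np.array([0.5, 0.3, 0.2])  # relative likelihood of each fault

    # A fault escapes only if every check misses it (independence assumed).
    p_detect = 1.0 - np.prod(1.0 - D, axis=1)
    print("per-fault detection probability:", p_detect.round(3))
    print("expected system coverage:", round(float(fault_prob @ p_detect), 3))
    ```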

  17. ENVIRONMENTAL MASS SPECTROMETRY: EMERGING CONTAMINANTS AND CURRENT ISSUES, 2004 REVIEW

    EPA Science Inventory

    This review covers developments in environmental mass spectrometry over the period of 2002-2003. A few significant references that appeared between January and March 2004 are also included. This review is in keeping with a current approach of Analytical Chemistry to include onl...

  18. Microethnographic Discourse Analysis in an Inquiry Classroom

    ERIC Educational Resources Information Center

    Moses, Lindsey

    2012-01-01

    This article addresses the relationship among theories related to classroom language and literacy events by first examining the researcher's theoretical perspective on discourse and sociocultural theories of learning development. The analytical heuristic for a microethnographic approach using a variety of theoretical tools is discussed and…

  19. 77 FR 13607 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ... Transformation Grants: Use of System Dynamic Modeling and Economic Analysis in Select Communities--New--National... community interventions. Using a system dynamics approach, CDC also plans to conduct simulation modeling... the development of analytic tools for system dynamics modeling under more limited conditions. The...

  20. AN INTERDISCIPLINARY APPROACH TO VALUING WATER FROM BRUSH CONTROL

    EPA Science Inventory

    An analytical methodology utilizing models from three disciplines is developed to assess the viability of brush control for water yield in the Frio River Basin, TX. Ecological, hydrologic, and economic models are used to portray changes in forage production and water supply result...

  1. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1988-01-01

    The initial effort was concentrated on developing the quasi-analytical approach for two-dimensional transonic flow. To keep the problem computationally efficient and straightforward, only the two-dimensional flow was considered and the problem was modeled using the transonic small perturbation equation.

  2. Modelling Social Learning in Monkeys

    ERIC Educational Resources Information Center

    Kendal, Jeremy R.

    2008-01-01

    The application of modelling to social learning in monkey populations has been a neglected topic. Recently, however, a number of statistical, simulation and analytical approaches have been developed to help examine social learning processes, putative traditions, the use of social learning strategies and the diffusion dynamics of socially…

  3. Local fields and effective conductivity tensor of ellipsoidal particle composite with anisotropic constituents

    NASA Astrophysics Data System (ADS)

    Kushch, Volodymyr I.; Sevostianov, Igor; Giraud, Albert

    2017-11-01

    An accurate semi-analytical solution of the conductivity problem for a composite with an anisotropic matrix and arbitrarily oriented anisotropic ellipsoidal inhomogeneities has been obtained. The developed approach combines the superposition principle with the multipole expansion of the perturbation fields of the inhomogeneities in terms of ellipsoidal harmonics, and reduces the boundary value problem to an infinite system of linear algebraic equations for the induced multipole moments of the inhomogeneities. A complete full-field solution is obtained for multi-particle models comprising inhomogeneities of diverse shape, size, orientation and properties, which enables an adequate account of the microstructure parameters. The solution is valid for general-type anisotropy of the constituents and arbitrary orientation of the orthotropy axes. The effective conductivity tensor of the particulate composite with anisotropic constituents is evaluated in the framework of the generalized Maxwell homogenization scheme. Application of the developed method to composites with imperfect ellipsoidal interfaces is straightforward, and their incorporation yields probably the most general model of a composite that may be considered within an analytical approach.

  4. Automation of data processing and calculation of retention parameters and thermodynamic data for gas chromatography

    NASA Astrophysics Data System (ADS)

    Makarycheva, A. I.; Faerman, V. A.

    2017-02-01

    An analysis of automation patterns is performed, and a programming solution for automating the processing of chromatographic data and their subsequent storage is developed with the help of a software package, Mathcad, and MS Excel spreadsheets. The proposed approach permits modification of the data-processing algorithm and does not require the participation of programming experts. The approach provides measurement of retention times and retention volumes, specific retention volumes, differential molar free energies of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The developed solution is aimed at use in a small research group and was tested on a series of new gas chromatography sorbents. Retention parameters and thermodynamic sorption quantities were calculated for more than 20 analytes. The resulting data are provided in a form accessible for comparative analysis and make it possible to identify sorbents with the most favorable properties for solving specific analytical problems.

  5. Approaches for the analysis of low molecular weight compounds with laser desorption/ionization techniques and mass spectrometry.

    PubMed

    Bergman, Nina; Shevchenko, Denys; Bergquist, Jonas

    2014-01-01

    This review summarizes various approaches for the analysis of low molecular weight (LMW) compounds by different laser desorption/ionization mass spectrometry techniques (LDI-MS). It is common to use an agent to assist the ionization, and small molecules are normally difficult to analyze by, e.g., matrix assisted laser desorption/ionization mass spectrometry (MALDI-MS) using the common matrices available today, because the latter are generally small organic compounds themselves. This often results in severe suppression of analyte peaks, or interference of the matrix and analyte signals in the low mass region. However, intrinsic properties of several LDI techniques such as high sensitivity, low sample consumption, high tolerance towards salts and solid particles, and rapid analysis have stimulated scientists to develop methods to circumvent matrix-related issues in the analysis of LMW molecules. Recent developments within this field as well as historical considerations and future prospects are presented in this review.

  6. Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.

    PubMed

    Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio

    2010-03-26

    Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
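
    A minimal sketch of the deterministic-versus-stochastic comparison the study makes, on a toy birth-death model (constant production k1, first-order degradation k2) rather than the actual auxin-transport equations: the ODE steady state against Gillespie realisations, which also expose the variability the abstract highlights.

    ```python
    # Sketch: ODE mean vs. Gillespie stochastic simulation of dx/dt = k1 - k2*x.
    import numpy as np

    k1, k2, t_end = 10.0, 0.1, 50.0
    print("ODE steady state:", k1 / k2)

    def gillespie(seed):
        rng = np.random.default_rng(seed)
        t, x = 0.0, 0
        while t < t_end:
            rates = np.array([k1, k2 * x])      # production, degradation
            total = rates.sum()
            t += rng.exponential(1.0 / total)   # time to next reaction
            x += 1 if rng.random() < rates[0] / total else -1
        return x

    samples = [gillespie(s) for s in range(200)]
    print("stochastic mean +/- SD at t=50:",
          round(np.mean(samples), 1), "+/-", round(np.std(samples), 1))
    ```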

  7. Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.

    PubMed

    Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis

    2016-07-01

    Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description of the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of approaches, assumptions, and scopes of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and its structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, several areas of improvement remain necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.

  8. Analysis and characterization of heparin impurities.

    PubMed

    Beni, Szabolcs; Limtiaco, John F K; Larive, Cynthia K

    2011-01-01

    This review discusses recent developments in analytical methods available for the sensitive separation, detection and structural characterization of heparin contaminants. The adulteration of raw heparin with oversulfated chondroitin sulfate (OSCS) in 2007-2008 spawned a global crisis resulting in extensive revisions to the pharmacopeia monographs on heparin and prompting the FDA to recommend the development of additional physicochemical methods for the analysis of heparin purity. The analytical chemistry community quickly responded to this challenge, developing a wide variety of innovative approaches, several of which are reported in this special issue. This review provides an overview of methods of heparin isolation and digestion, discusses known heparin contaminants, including OSCS, and summarizes recent publications on heparin impurity analysis using sensors, near-IR, Raman, and NMR spectroscopy, as well as electrophoretic and chromatographic separations.

  9. An Optoelectronic Nose for Detection of Toxic Gases

    PubMed Central

    Lim, Sung H.; Feng, Liang; Kemling, Jonathan W.; Musto, Christopher J.; Suslick, Kenneth S.

    2009-01-01

    We have developed a simple colorimetric sensor array (CSA) for the detection of a wide range of volatile analytes and applied it to the detection of toxic gases. The sensor consists of a disposable array of cross-responsive nanoporous pigments whose colors are changed by diverse chemical interactions with analytes. Although no single chemically responsive pigment is specific for any one analyte, the pattern of color change for the array is a unique molecular fingerprint. Clear differentiation among 19 different toxic industrial chemicals (TICs) within two minutes of exposure at IDLH (immediately dangerous to life or health) concentration has been demonstrated. Quantification of each analyte is easily accomplished based on the color change of the array, and excellent detection limits have been demonstrated, generally below the PELs (permissible exposure limits). Identification of the TICs was readily achieved using a standard chemometric approach, i.e., hierarchical clustering analysis (HCA), with no misclassifications over 140 trials. PMID:20160982

  10. Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.

    PubMed

    Trentelman, Karen

    2017-06-12

    Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.

  11. An optoelectronic nose for the detection of toxic gases.

    PubMed

    Lim, Sung H; Feng, Liang; Kemling, Jonathan W; Musto, Christopher J; Suslick, Kenneth S

    2009-10-01

    We have developed a simple colorimetric sensor array that detects a wide range of volatile analytes and then applied it to the detection of toxic gases. The sensor consists of a disposable array of cross-responsive nanoporous pigments with colours that are changed by diverse chemical interactions with analytes. Although no single chemically responsive pigment is specific for any one analyte, the pattern of colour change for the array is a unique molecular fingerprint. Clear differentiation among 19 different toxic industrial chemicals (TICs) within two minutes of exposure at concentrations immediately dangerous to life or health were demonstrated. Based on the colour change of the array, quantification of each analyte was accomplished easily, and excellent detection limits were achieved, generally below the permissible exposure limits. Different TICs were identified readily using a standard chemometric approach (hierarchical clustering analysis), with no misclassifications over 140 trials.
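
    A sketch of the chemometric step named above: hierarchical clustering analysis (HCA) of color-change "fingerprints", where each trial is the concatenated RGB difference vector of a hypothetical 36-spot array. The analyte names, array size, and data are synthetic; this is not the authors' dataset or pipeline.

    ```python
    # Sketch: Ward-linkage HCA of synthetic colorimetric fingerprints.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(3)
    analytes = ["NH3", "Cl2", "SO2"]  # hypothetical TICs
    # Three archetype fingerprints (36 spots x 3 color channels) + replicates.
    archetypes = rng.normal(0.0, 30.0, size=(3, 36 * 3))
    X = np.vstack([a + rng.normal(0.0, 2.0, size=(10, 36 * 3))
                   for a in archetypes])

    Z = linkage(X, method="ward")              # HCA on Euclidean distance
    labels = fcluster(Z, t=3, criterion="maxclust")
    print("cluster labels per trial:", labels)  # replicates should co-cluster
    ```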

  12. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
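
    The following sketch shows the kind of distributed space-time aggregation described above, written against open-source PySpark rather than the ArcGIS Server tools themselves; the input file, column names, and bin sizes are hypothetical.

      # Sketch of a distributed space-time aggregation in PySpark (open-source
      # Spark stands in for the ArcGIS Server GeoAnalytics tools described
      # above). "positions.csv" and its columns (lon, lat, ts) are hypothetical;
      # ts is assumed to parse as a timestamp.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("space-time-bins").getOrCreate()
      df = spark.read.csv("positions.csv", header=True, inferSchema=True)

      binned = (df
                .withColumn("lon_bin", F.floor(F.col("lon") / 0.1))   # ~0.1 degree cells
                .withColumn("lat_bin", F.floor(F.col("lat") / 0.1))
                .groupBy("lon_bin", "lat_bin", F.window("ts", "1 hour"))
                .count())                                             # points per space-time cell

      binned.show()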

  13. A Blended Approach to Learning: Added Value and Lessons Learnt from Students' Use of Computer-Based Materials for Neurological Analysis

    ERIC Educational Resources Information Center

    Davies, Alison; Ramsay, Jill; Lindfield, Helen; Couperthwaite, John

    2005-01-01

    This paper examines BSc Physiotherapy students' experiences of developing their neurological observational and analytical skills using a blend of traditional classroom activities and computer-based materials at the University of Birmingham. New teaching and learning resources were developed and supported in the School of Health Sciences using Web…

  14. Research on Employment in the Rural Nonfarm Sector in Africa. African Rural Employment Paper No. 5.

    ERIC Educational Resources Information Center

    Liedholm, Carl

    Within the context of the role of rural employment in overall economic development, the objectives were to summarize existing knowledge of the rural African nonfarm sector and to develop an analytical framework for examining utilization of labor in this sector, using a descriptive profile, a theoretical model, and a research approach to rural…

  15. Finding accurate frontiers: A knowledge-intensive approach to relational learning

    NASA Technical Reports Server (NTRS)

    Pazzani, Michael; Brunk, Clifford

    1994-01-01

    An approach to analytic learning is described that searches for accurate entailments of a Horn Clause domain theory. A hill-climbing search, guided by an information based evaluation function, is performed by applying a set of operators that derive frontiers from domain theories. The analytic learning system is one component of a multi-strategy relational learning system. We compare the accuracy of concepts learned with this analytic strategy to concepts learned with an analytic strategy that operationalizes the domain theory.

  16. HPAEC-PAD for oligosaccharide analysis-novel insights into analyte sensitivity and response stability.

    PubMed

    Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra

    2017-12-01

    The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as in beverages and infant milk products, demands the availability of tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has been developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study which gives an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability, due to recession of the gold electrode. By an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach which takes the inevitable and analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of PAD response drop, leading to improved data normalization.
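
    A minimal sketch of how such an analyte-specific one-phase decay could be fitted, assuming the usual plateau-plus-exponential form y = plateau + span*exp(-k*t); the data points and starting values are invented, not taken from the study.

      # Fitting an analyte-specific one-phase decay to the PAD response drop.
      # The model form is assumed from the phrase "one-phase decay"; the data
      # points are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def one_phase_decay(t, plateau, span, k):
          return plateau + span * np.exp(-k * t)

      t = np.array([0, 5, 10, 20, 40, 80], dtype=float)      # run time, hours
      resp = np.array([1.00, 0.91, 0.84, 0.74, 0.63, 0.55])  # relative PAD response

      popt, _ = curve_fit(one_phase_decay, t, resp, p0=(0.5, 0.5, 0.05))
      plateau, span, k = popt
      # Normalize later measurements back to the t = 0 sensitivity:
      corrected = resp / one_phase_decay(t, *popt) * (plateau + span)
      print(popt, corrected)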

  17. Comparison of three methods for wind turbine capacity factor estimation.

    PubMed

    Ditkovich, Y; Kuperman, A

    2014-01-01

    Three approaches to calculating the capacity factor of fixed-speed wind turbines are reviewed and compared using a case study. The first, "quasiexact" approach utilizes discrete raw wind data (in histogram form) and the manufacturer-provided turbine power curve (also in discrete form) to numerically calculate the capacity factor. The second, "analytic" approach employs a continuous probability distribution function fitted to the wind data, as well as a continuous turbine power curve resulting from double polynomial fitting of the manufacturer-provided power curve data. The latter approach, while being an approximation, can be solved analytically, thus providing valuable insight into the aspects affecting the capacity factor. Moreover, several other figures of merit of wind turbine performance may be derived based on the analytical approach. The third, "approximate" approach, valid in the case of Rayleigh winds only, employs a nonlinear approximation of the capacity factor versus average wind speed curve, requiring only the rated power and rotor diameter of the turbine. It is shown that the results obtained by the three approaches are very close, reinforcing the validity of the analytically derived approximations, which may be used for wind turbine performance evaluation.
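
    A numerical sketch in the spirit of the "quasiexact" approach: integrate a power curve against a Rayleigh wind-speed distribution to obtain the capacity factor. The power-curve shape and all parameter values are illustrative assumptions.

      # Capacity factor as the expected power over the wind distribution,
      # divided by rated power. The 2 MW power curve and 7 m/s mean wind
      # speed are hypothetical.
      import numpy as np

      v_mean, p_rated = 7.0, 2000.0            # m/s, kW
      v_in, v_rated, v_out = 3.0, 12.0, 25.0   # cut-in, rated, cut-out speeds

      v = np.linspace(0.0, 30.0, 3001)
      # Rayleigh pdf parameterized by the mean wind speed
      f = (np.pi * v / (2 * v_mean**2)) * np.exp(-np.pi * v**2 / (4 * v_mean**2))

      # Simple cubic power curve between cut-in and rated speed
      p = np.where((v >= v_in) & (v < v_rated),
                   p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0)
      p = np.where((v >= v_rated) & (v <= v_out), p_rated, p)

      cf = np.sum(p * f) * (v[1] - v[0]) / p_rated   # numerical integration
      print(f"capacity factor ~ {cf:.2f}")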

  18. A new frequency approach for light flicker evaluation in electric power systems

    NASA Astrophysics Data System (ADS)

    Feola, Luigi; Langella, Roberto; Testa, Alfredo

    2015-12-01

    In this paper, a new analytical estimator for light flicker in the frequency domain is proposed, which is able to take into account also the frequency components neglected by the classical methods in the literature. The analytical solutions apply to any generic stationary signal affected by interharmonic distortion. The analytical light flicker estimator is applied to numerous numerical case studies with the goal of showing (i) the correctness and the improvements of the analytical approach with respect to the other methods in the literature and (ii) the accuracy of the results compared to those obtained by means of the classical International Electrotechnical Commission (IEC) flickermeter. The usefulness of the proposed analytical approach is that it can be included in signal processing tools for interharmonic penetration studies for the integration of renewable energy sources in future smart grids.

  19. Experimental Validation of the Transverse Shear Behavior of a Nomex Core for Sandwich Panels

    NASA Astrophysics Data System (ADS)

    Farooqi, M. I.; Nasir, M. A.; Ali, H. M.; Ali, Y.

    2017-05-01

    This work deals with the determination of the transverse shear moduli of a Nomex® honeycomb core for sandwich panels. The out-of-plane shear characteristics of such panels depend on the transverse shear moduli of the honeycomb core. These moduli were determined experimentally, numerically, and analytically: numerical simulations were performed using a unit cell model, and three analytical approaches were applied. Analytical calculations showed that two of the approaches provided reasonable predictions of the transverse shear modulus as compared with experimental results. However, the approach based upon classical lamination theory showed large deviations from the experimental data. Numerical simulations also showed a trend similar to that resulting from the analytical models.

  20. A Theoretical and Experimental Study for a Developing Flow in a Thin Fluid Gap

    NASA Astrophysics Data System (ADS)

    Wu, Qianhong; Lang, Ji; Jen, Kei-Peng; Nathan, Rungun; Vucbmss Team

    2016-11-01

    In this paper, we report a novel theoretical and experimental approach to examine a fast-developing flow in a thin fluid gap. Although the phenomenon is widely observed in industrial applications and biological systems, there is a lack of analytical approaches that capture the instantaneous fluid response to a sudden impact. An experimental setup was developed that contains a piston instrumented with a laser displacement sensor and a pressure transducer. A sudden impact was imposed on the piston, creating a fast compaction of the thin fluid gap underneath. The motion of the piston was captured by the laser displacement sensor, and the fluid pressure build-up and relaxation were recorded by the pressure transducer. For this dynamic process, a novel analytical approach was developed. It starts with the inviscid limit, when the viscous fluid effect has no time to appear. This short process is followed by a developing flow, in which the inviscid core flow region shrinks and the viscous wall region grows until the entire fluid gap is filled with viscous fluid flow. A boundary layer integral method is used during this process. Lastly, the flow becomes completely viscous-dominated, featuring a typical squeeze flow in a thin gap. Excellent agreement between the theory and the experiment was achieved. The study presented herein, filling a gap in the literature, will have broad impact in industrial and biomedical applications. This research was supported by the National Science Foundation under Award #1511096.
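
    For the final, viscous-dominated stage, a minimal sketch of classical squeeze flow under quasi-steady lubrication and no-slip assumptions (Stefan's law for a rigid circular plate); the geometry and fluid values are illustrative, not the experimental ones.

      # Late-stage squeeze flow: resisting force on a circular plate of radius R
      # closing a Newtonian film of thickness h at speed dh/dt,
      #   F = 3*pi*mu*R**4*|dh/dt| / (2*h**3)
      # (quasi-steady lubrication, no slip). All numbers are illustrative.
      import numpy as np

      mu = 1.0e-3        # Pa.s, water-like viscosity
      R = 0.02           # m, piston radius
      h = np.array([500e-6, 200e-6, 100e-6, 50e-6])  # gap heights, m
      dhdt = -1.0e-3     # m/s, closing speed

      F = 3 * np.pi * mu * R**4 * abs(dhdt) / (2 * h**3)
      for gap, force in zip(h, F):
          print(f"h = {gap*1e6:6.0f} um -> resisting force ~ {force:8.3f} N")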

  1. On the relationship between the causal-inference and meta-analytic paradigms for the validation of surrogate endpoints.

    PubMed

    Alonso, Ariel; Van der Elst, Wim; Molenberghs, Geert; Buyse, Marc; Burzykowski, Tomasz

    2015-03-01

    The increasing cost of drug development has raised the demand for surrogate endpoints when evaluating new drugs in clinical trials. However, over the years, it has become clear that surrogate endpoints need to be statistically evaluated and deemed valid before they can be used as substitutes for "true" endpoints in clinical studies. Nowadays, two paradigms, based on causal inference and meta-analysis, dominate the scene. Nonetheless, although the literature emanating from these paradigms is wide, until now the relationship between them has largely been left unexplored. In the present work, we discuss the conceptual framework underlying both approaches and study the relationship between them using theoretical elements and the analysis of a real case study. Furthermore, we show, on the one hand, that the meta-analytic approach can be embedded within a causal-inference framework and, on the other, that it can be heuristically justified why surrogate endpoints successfully evaluated using this approach will often be appealing from a causal-inference perspective as well. A newly developed and user-friendly R package, Surrogate, is provided to carry out the evaluation exercise. © 2014, The International Biometric Society.

  2. A practical model of thin disk regenerative amplifier based on analytical expression of ASE lifetime

    NASA Astrophysics Data System (ADS)

    Zhou, Huang; Chyla, Michal; Nagisetty, Siva Sankar; Chen, Liyuan; Endo, Akira; Smrz, Martin; Mocek, Tomas

    2017-12-01

    In this paper, a practical model of a thin disk regenerative amplifier is developed based on an analytical approach in which Drew A. Copeland [1] evaluated the loss rate of the upper laser level due to ASE and derived an analytical expression for the effective lifetime of the upper laser level, taking the Lorentzian stimulated emission line shape and total internal reflection into account. By adopting the analytical expression for the effective lifetime in the rate equations, we have developed a less numerically intensive model for predicting and analyzing the performance of a thin disk regenerative amplifier. Thanks to the model, an optimized combination of various parameters can be obtained to avoid saturation, period-doubling bifurcation, or first-pulse suppression prior to experiments. The effective lifetime due to ASE is also analyzed against various parameters. The simulated results fit well with experimental data. By fitting more experimental results with the numerical model, we can improve the parameters of the model, such as the reflective factor, which is used to determine the weight of boundary reflection within the influence of ASE. This practical model will be used to explore the scaling limits imposed by ASE on the thin disk regenerative amplifier being developed at the HiLASE Centre.

  3. Microchip integrating magnetic nanoparticles for allergy diagnosis.

    PubMed

    Teste, Bruno; Malloggi, Florent; Siaugue, Jean-Michel; Varenne, Anne; Kanoufi, Frederic; Descroix, Stéphanie

    2011-12-21

    We report on the development of a simple and easy-to-use microchip dedicated to allergy diagnosis. This microchip combines the advantages of homogeneous immunoassays (species diffusion) and heterogeneous immunoassays (easy separation and preconcentration steps). In vitro allergy diagnosis is based on specific Immunoglobulin E (IgE) quantitation; to that end, we have developed and integrated magnetic core-shell nanoparticles (MCSNPs) as an IgE capture nanoplatform in a microdevice, taking benefit from both their magnetic and colloidal properties. Integrating such an immunosupport makes it possible to perform the target analyte (IgE) capture in the colloidal phase, thus increasing the analyte capture kinetics, since both immunological partners are diffusing during the immune reaction. This colloidal approach improves the analyte capture kinetics 1000-fold compared to conventional methods. Moreover, based on the MCSNPs' magnetic properties and on the magnetic chamber we previously developed, the MCSNPs, and therefore the target, can be confined and preconcentrated within the microdevice prior to the detection step. The MCSNP preconcentration factor achieved was about 35,000, which allows high sensitivity to be reached without catalytic amplification during the detection step. The developed microchip offers many advantages: the analytical procedure is fully integrated on-chip, analyses are performed in a short assay time (20 min), and sample and reagent consumption is reduced to a few microlitres (5 μL), while a low limit of detection can be achieved (about 1 ng mL(-1)).

  4. The Identification and Significance of Intuitive and Analytic Problem Solving Approaches Among College Physics Students

    ERIC Educational Resources Information Center

    Thorsland, Martin N.; Novak, Joseph D.

    1974-01-01

    Described is an approach to assessment of intuitive and analytic modes of thinking in physics. These modes of thinking are associated with Ausubel's theory of learning. High ability in either intuitive or analytic thinking was associated with success in college physics, with high learning efficiency following a pattern expected on the basis of…

  5. Scattering of In-Plane Waves by Elastic Wedges

    NASA Astrophysics Data System (ADS)

    Mohammadi, K.; Asimaki, D.; Fradkin, L.

    2014-12-01

    The scattering of seismic waves by elastic wedges has been a topic of interest in seismology and geophysics for many decades. Analytical, semi-analytical, experimental and numerical studies on idealized wedges have provided insight into the seismic behavior of continental margins, mountain roots and crustal discontinuities. Published results, however, have almost exclusively focused on incident Rayleigh waves and out-of-plane body (SH) waves. Complementing the existing body of work, we here present results from our study on the response of elastic wedges to incident P or SV waves, an idealized problem that can provide valuable insight into the understanding and parameterization of topographic amplification of seismic ground motion. We first show our earlier work on explicit finite difference simulations of SV-wave scattering by elastic wedges over a wide range of internal angles. We next present a semi-analytical solution that we developed using the approach proposed by Gautesen, to describe the scattered wavefield in the immediate vicinity of the wedge's tip (near-field). We use the semi-analytical solution to validate the numerical analyses, and improve resolution of the amplification factor at the wedge vertex, which spikes when the internal wedge angle approaches the critical angle of incidence.

  6. Analytics that Inform the University: Using Data You Already Have

    ERIC Educational Resources Information Center

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making at the University of Central Florida. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  7. Quantifying residues from postharvest fumigation of almonds and walnuts with propylene oxide

    USDA-ARS?s Scientific Manuscript database

    A novel analytical approach, involving solvent extraction with methyl tert-butyl ether (MTBE) followed by gas chromatography (GC), was developed to quantify residues that result from the postharvest fumigation of almonds and walnuts with propylene oxide (PPO). Verification and quantification of PPO,...

  8. Characteristics of Effective Leadership Networks

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Azah, Vera Ndifor

    2016-01-01

    Purpose: The purpose of this paper is to inquire about the characteristics of effective school leadership networks and the contribution of such networks to the development of individual leaders' professional capacities. Design/methodology/approach: The study used path-analytic techniques with survey data provided by 450 school and district leaders…

  9. Strategic Analysis of Irregular Warfare

    DTIC Science & Technology

    2010-03-01

    the same mathematical equations used by Lanchester. It is time to develop new analytical methods and models ... a basis on which to build, similar to what Lanchester provided almost 100 years ago. Figure 9 portrays both Lanchester's approach and an irregular warfare approach.
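
    For reference, Lanchester's aimed-fire ("square law") attrition model mentioned in the excerpt can be stated and integrated in a few lines; the coefficients and initial strengths below are illustrative.

      # Lanchester's aimed-fire equations: dx/dt = -a*y, dy/dt = -b*x.
      # Coefficients and initial force strengths are illustrative.
      import numpy as np
      from scipy.integrate import solve_ivp

      a, b = 0.02, 0.01          # effectiveness of Y against X, and X against Y

      def lanchester(t, s):
          x, y = s
          return [-a * y, -b * x]

      sol = solve_ivp(lanchester, (0.0, 90.0), [1000.0, 800.0], max_step=0.5)
      x, y = sol.y
      # The square-law invariant b*x^2 - a*y^2 stays (numerically) constant:
      print(b * x[0]**2 - a * y[0]**2, b * x[-1]**2 - a * y[-1]**2)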

  10. USING HISTORICAL BIOLOGICAL DATA TO EVALUATE STATUS AND TRENDS IN THE BIG DARBY CREEK WATERSHED (OHIO, USA)

    EPA Science Inventory

    Assessment of watershed ecological status and trends is challenging for managers who lack randomly or consistently sampled data, or monitoring programs developed from a watershed perspective. This study investigated analytical approaches for assessment of status and trends using ...

  11. Looking for new biomarkers of skin wound vitality with a cytokine-based multiplex assay: preliminary study.

    PubMed

    Peyron, Pierre-Antoine; Baccino, Éric; Nagot, Nicolas; Lehmann, Sylvain; Delaby, Constance

    2017-02-01

    Determination of skin wound vitality is an important issue in forensic practice. No reliable biomarker currently exists. Quantification of inflammatory cytokines in injured skin with MSD® technology is an innovative and promising approach. This preliminary study aims to develop a protocol for the preparation and analysis of skin samples. Samples from ante mortem wounds, post mortem wounds, and intact skin ("control samples") were taken from corpses at autopsy. After the pre-analytical protocol had been optimized in terms of skin homogenization and protein extraction, the concentration of TNF-α was measured in each sample with the MSD® approach. Five other cytokines of interest (IL-1β, IL-6, IL-10, IL-12p70 and IFN-γ) were then simultaneously quantified with an MSD® multiplex assay. The optimal pre-analytical conditions consist of a protein extraction from a 6 mm diameter skin sample in a PBS buffer with Triton 0.05%. Our results show the linearity and reproducibility of the TNF-α quantification with MSD®, and an inter- and intra-individual variability of the protein concentrations. The MSD® multiplex assay is likely to detect differential skin concentrations for each cytokine of interest. This preliminary study was used to develop and optimize the pre-analytical and analytical conditions of the MSD® method using injured and healthy skin samples, for the purpose of identifying the cytokine, or set of cytokines, that may serve as biomarkers of skin wound vitality.

  12. Liquid Metering Centrifuge Sticks (LMCS): A Centrifugal Approach to Metering Known Sample Volumes for Colorimetric Solid Phase Extraction (C-SPE)

    NASA Technical Reports Server (NTRS)

    Gazda, Daniel B.; Schultz, John R.; Clarke, Mark S.

    2007-01-01

    Phase separation is one of the most significant obstacles encountered during the development of analytical methods for water quality monitoring in spacecraft environments. Removing air bubbles from water samples prior to analysis is a routine task on earth; however, in the absence of gravity, this routine task becomes extremely difficult. This paper details the development and initial ground testing of liquid metering centrifuge sticks (LMCS), devices designed to collect and meter a known volume of bubble-free water in microgravity. The LMCS uses centrifugal force to eliminate entrapped air and reproducibly meter liquid sample volumes for analysis with Colorimetric Solid Phase Extraction (C-SPE). C-SPE is a sorption-spectrophotometric platform that is being developed as a potential spacecraft water quality monitoring system. C-SPE utilizes solid phase extraction membranes impregnated with analyte-specific colorimetric reagents to concentrate and complex target analytes in spacecraft water samples. The mass of analyte extracted from the water sample is determined using diffuse reflectance (DR) data collected from the membrane surface and an analyte-specific calibration curve. The analyte concentration can then be calculated from the mass of extracted analyte and the volume of the sample analyzed. Previous flight experiments conducted in microgravity conditions aboard the NASA KC-135 aircraft demonstrated that the inability to collect and meter a known volume of water using a syringe was a limiting factor in the accuracy of C-SPE measurements. Herein, results obtained from ground based C-SPE experiments using ionic silver as a test analyte and either the LMCS or syringes for sample metering are compared to evaluate the performance of the LMCS. These results indicate very good agreement between the two sample metering methods and clearly illustrate the potential of utilizing centrifugal forces to achieve phase separation and metering of water samples in microgravity.

  13. Quasi-Steady Evolution of Hillslopes in Layered Landscapes: An Analytic Approach

    NASA Astrophysics Data System (ADS)

    Glade, R. C.; Anderson, R. S.

    2018-01-01

    Landscapes developed in layered sedimentary or igneous rocks are common on Earth, as well as on other planets. Features such as hogbacks, exposed dikes, escarpments, and mesas exhibit resistant rock layers adjoining more erodible rock in tilted, vertical, or horizontal orientations. Hillslopes developed in the erodible rock are typically characterized by steep, linear-to-concave slopes or "ramps" mantled with material derived from the resistant layers, often in the form of large blocks. Previous work on hogbacks has shown that feedbacks between weathering and transport of the blocks and underlying soft rock can create relief over time and lead to the development of concave-up slope profiles in the absence of rilling processes. Here we employ an analytic approach, informed by numerical modeling and field data, to describe the quasi-steady state behavior of such rocky hillslopes for the full spectrum of resistant layer dip angles. We begin with a simple geometric analysis that relates structural dip to erosion rates. We then explore the mechanisms by which our numerical model of hogback evolution self-organizes to meet these geometric expectations, including adjustment of soil depth, erosion rates, and block velocities along the ramp. Analytical solutions relate easily measurable field quantities such as ramp length, slope, block size, and resistant layer dip angle to local incision rate, block velocity, and block weathering rate. These equations provide a framework for exploring the evolution of layered landscapes and pinpoint the processes for which we require a more thorough understanding to predict their evolution over time.

  14. Use of the self-organising map network (SOMNet) as a decision support system for regional mental health planning.

    PubMed

    Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R

    2018-04-25

    Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between their structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment). This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
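
    A minimal sketch of the underlying self-organising map step, using the third-party minisom package as a stand-in for the SOMNet tooling; the indicator matrix is synthetic rather than the Spanish catchment-area data.

      # Self-organising map sketch with the third-party `minisom` package.
      # The 106 x 8 indicator matrix is synthetic, standing in for the
      # catchment-area healthcare indicators analysed in the study.
      import numpy as np
      from minisom import MiniSom

      rng = np.random.default_rng(1)
      indicators = rng.random((106, 8))        # 106 areas x 8 care indicators

      som = MiniSom(6, 6, input_len=8, sigma=1.0, learning_rate=0.5, random_seed=1)
      som.random_weights_init(indicators)
      som.train_random(indicators, 5000)       # unsupervised topology-preserving fit

      # Map each catchment area onto its best-matching unit; nearby cells on
      # the 6x6 grid correspond to areas with similar healthcare patterns.
      positions = np.array([som.winner(x) for x in indicators])
      print(positions[:10])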

  15. Understanding the Scalability of Bayesian Network Inference using Clique Tree Growth Curves

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole Jakob

    2009-01-01

    Bayesian networks (BNs) are used to represent and efficiently compute with multi-variate probability distributions in a wide range of disciplines. One of the main approaches to performing computation in BNs is clique tree clustering and propagation. In this approach, BN computation consists of propagation in a clique tree compiled from a Bayesian network. There is a lack of understanding of how clique tree computation time, and BN computation time more generally, depend on variations in BN size and structure. On the one hand, complexity results tell us that many interesting BN queries are NP-hard or worse to answer, and it is not hard to find application BNs where the clique tree approach cannot be used in practice. On the other hand, it is well-known that tree-structured BNs can be used to answer probabilistic queries in polynomial time. In this article, we develop an approach to characterizing clique tree growth as a function of parameters that can be computed in polynomial time from BNs, specifically: (i) the ratio of the number of a BN's non-root nodes to the number of root nodes, or (ii) the expected number of moral edges in their moral graphs. Our approach is based on combining analytical and experimental results. Analytically, we partition the set of cliques in a clique tree into different sets, and introduce a growth curve for each set. For the special case of bipartite BNs, we consequently have two growth curves, a mixed clique growth curve and a root clique growth curve. In experiments, we systematically increase the degree of the root nodes in bipartite Bayesian networks, and find that root clique growth is well-approximated by Gompertz growth curves. It is believed that this research improves the understanding of the scaling behavior of clique tree clustering, provides a foundation for benchmarking and developing improved BN inference and machine learning algorithms, and presents an aid for analytical trade-off studies of clique tree clustering using growth curves.
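
    A small sketch of fitting a Gompertz growth curve y(d) = A*exp(-b*exp(-c*d)) to clique-growth measurements, as in the experiments described; the data below are synthetic placeholders.

      # Fitting a Gompertz growth curve to clique growth as a function of
      # root-node degree d. The data are synthetic placeholders for the
      # experimental curves.
      import numpy as np
      from scipy.optimize import curve_fit

      def gompertz(d, A, b, c):
          return A * np.exp(-b * np.exp(-c * d))

      d = np.arange(1, 16, dtype=float)   # root-node degree
      size = gompertz(d, 120.0, 5.0, 0.4) * (
          1 + 0.03 * np.random.default_rng(2).standard_normal(d.size))

      (A, b, c), _ = curve_fit(gompertz, d, size, p0=(100.0, 4.0, 0.3))
      print(A, b, c)   # asymptote, displacement, and growth-rate parameters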

  16. Application of Precipitate Free Zone Growth Kinetics to the β-Phase Depletion Behavior in a CoNiCrAlY Coating Alloy: An Analytical Approach

    NASA Astrophysics Data System (ADS)

    Chen, H.

    2018-06-01

    This paper concerns the β-phase depletion kinetics of a thermally sprayed free-standing CoNiCrAlY (Co-31.7 pct Ni-20.8 pct Cr-8.1 pct Al-0.5 pct Y, all in wt pct) coating alloy. An analytical β-phase depletion model based on the precipitate free zone growth kinetics was developed to calculate the β-phase depletion kinetics during isothermal oxidation. This approach, which accounts for the molar volume of the alloy, the interfacial energy of the γ/β interface, and the Al concentration at the γ/γ + β boundary, requires the Al concentrations in the β-phase depletion zone as the input rather than the oxidation kinetics at the oxide/coating interface. The calculated β-phase depletion zones derived from the current model were compared with experimental results. It is shown that the calculated β-phase depletion zones using the current model are in reasonable agreement with those obtained experimentally. The constant compositional terms used in the model are likely to cause the discrepancies between the model predictions and experimental results. This analytical approach, which shows a reasonable correlation with experimental results, demonstrates good reliability for the fast evaluation of lifetime predictions of MCrAlY coatings.

  18. Relative motion using analytical differential gravity

    NASA Technical Reports Server (NTRS)

    Gottlieb, Robert G.

    1988-01-01

    This paper presents a new approach to the computation of the motion of one satellite relative to another. The trajectory of the reference satellite is computed accurately subject to geopotential perturbations. This precise trajectory is used as a reference in computing the position of a nearby body, or bodies. The problem that arises in this approach is differencing nearly equal terms in the geopotential model, especially as the separation of the reference and nearby bodies approaches zero. By developing closed form expressions for differences in higher order and degree geopotential terms, the numerical problem inherent in the differencing approach is eliminated.
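
    The numerical difficulty the paper addresses can be seen in a generic miniature (not the paper's geopotential expressions): subtracting nearly equal terms loses significant digits, while an algebraically equivalent closed-form difference does not.

      # Catastrophic cancellation in miniature: the difference of two nearly
      # equal reciprocals versus an equivalent closed form. Values are generic
      # illustrations, not the paper's geopotential terms.
      r1 = 7.000000e6                     # m, reference orbit radius
      r2 = r1 + 1.0e-2                    # nearby body, 1 cm away

      naive = 1.0 / r1 - 1.0 / r2         # difference of nearly equal terms
      closed = (r2 - r1) / (r1 * r2)      # same quantity, rearranged

      print(f"naive  = {naive:.17e}")
      print(f"closed = {closed:.17e}")
      print(f"relative disagreement ~ {abs(naive - closed) / closed:.1e}")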

  19. A strategy to determine operating parameters in tissue engineering hollow fiber bioreactors

    PubMed Central

    Shipley, RJ; Davidson, AJ; Chan, K; Chaudhuri, JB; Waters, SL; Ellis, MJ

    2011-01-01

    The development of tissue engineering hollow fiber bioreactors (HFB) requires the optimal design of the geometry and operating parameters of the system. This article provides a strategy for specifying operating conditions for the system based on mathematical models of oxygen delivery to the cell population. Analytical and numerical solutions of these models are developed based on Michaelis–Menten kinetics. Depending on the minimum oxygen concentration required to culture a functional cell population, together with the oxygen uptake kinetics, the strategy dictates the model needed to describe mass transport so that the operating conditions can be defined. If cmin ≫ Km, we capture oxygen uptake using zero-order kinetics and proceed analytically. This enables operating equations to be developed that allow the user to choose the medium flow rate, lumen length, and ECS depth to provide a prescribed value of cmin. When cmin is not large compared with Km, we use numerical techniques to solve the full Michaelis–Menten kinetics and present operating data for the bioreactor. The strategy presented utilizes both analytical and numerical approaches and can be applied to any cell type with known oxygen transport properties and uptake kinetics. PMID:21370228
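
    A sketch of the modelling choice described above, assuming a simple plug-flow oxygen balance along the lumen (not the paper's full transport model): full Michaelis–Menten uptake solved numerically versus the zero-order analytic limit. All parameter values are hypothetical.

      # Oxygen concentration along the fiber: full Michaelis-Menten uptake
      # (numerical) versus the zero-order limit (analytic). A plug-flow
      # balance stands in for the full model; all values are hypothetical.
      import numpy as np
      from scipy.integrate import solve_ivp

      Q = 1.0e-9        # m^3/s, medium flow rate
      Vmax = 1.0e-9     # mol/(m.s), maximum uptake per unit fiber length
      Km = 0.01         # mol/m^3
      c_in = 0.2        # mol/m^3, inlet oxygen concentration
      L = 0.1           # m, lumen length

      def dcdx(x, c):
          return [-(Vmax / Q) * c[0] / (Km + c[0])]   # Michaelis-Menten uptake

      sol = solve_ivp(dcdx, (0.0, L), [c_in])
      c_min_full = sol.y[0, -1]
      c_min_zero_order = c_in - Vmax * L / Q          # valid when c >> Km
      print(c_min_full, c_min_zero_order)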

  20. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448

  1. Automatic computer procedure for generating exact and analytical kinetic energy operators based on the polyspherical approach: General formulation and removal of singularities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ndong, Mamadou; Lauvergnat, David; Nauts, André

    2013-11-28

    We present new techniques for an automatic computation of the kinetic energy operator in analytical form. These techniques are based on the use of the polyspherical approach and are extended to take into account Cartesian coordinates as well. An automatic procedure is developed where analytical expressions are obtained by symbolic calculations. This procedure is a full generalization of the one presented in Ndong et al. [J. Chem. Phys. 136, 034107 (2012)]. The correctness of the new implementation is analyzed by comparison with results obtained from the TNUM program. We give several illustrations that could be useful for users of the code. In particular, we discuss some cyclic compounds which are important in photochemistry. Among others, we show that choosing a well-adapted parameterization and decomposition into subsystems can allow one to avoid singularities in the kinetic energy operator. We also discuss a relation between polyspherical and Z-matrix coordinates: this comparison could be helpful for building an interface between the new code and a quantum chemistry package.

  2. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.

    2017-12-01

    Modern seismic networks present a number of challenges, but perhaps most notably are those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work still remains to be done in this area. However, the latter two challenges demand special attention. Station availability is impacted by weather, equipment failure or the adding or removing of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  3. Analytical reverse time migration: An innovation in imaging of infrastructures using ultrasonic shear waves.

    PubMed

    Asadollahi, Aziz; Khazanovich, Lev

    2018-04-11

    The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
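
    At the heart of RTM is the zero-lag cross-correlation imaging condition between the source and receiver wavefields; the sketch below shows that step only, with random placeholder wavefields, since the paper's contribution is obtaining those wavefields analytically rather than by storing full simulations.

      # Zero-lag cross-correlation imaging condition,
      #   I(x) = sum_t S(x, t) * R(x, t),
      # between the forward source wavefield and the back-propagated receiver
      # wavefield. The wavefields here are random placeholders.
      import numpy as np

      nx, nz, nt = 64, 64, 500
      rng = np.random.default_rng(3)
      source_wf = rng.standard_normal((nt, nz, nx))    # S(x, t), forward-propagated
      receiver_wf = rng.standard_normal((nt, nz, nx))  # R(x, t), back-propagated

      image = np.einsum("tzx,tzx->zx", source_wf, receiver_wf)  # zero-lag sum
      print(image.shape)   # (nz, nx) reflectivity-like image of the section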

  4. Molecularly imprinted polymer coupled with dispersive liquid-liquid microextraction and injector port silylation: a novel approach for the determination of 3-phenoxybenzoic acid in complex biological samples using gas chromatography-tandem mass spectrometry.

    PubMed

    Mudiam, Mohana Krishna Reddy; Chauhan, Abhishek; Jain, Rajeev; Dhuriya, Yogesh Kumar; Saxena, Prem Narain; Khanna, Vinay Kumar

    2014-01-15

    A novel analytical approach based on molecularly imprinted solid phase extraction (MISPE) coupled with dispersive liquid-liquid microextraction (DLLME) and injector port silylation (IPS) has been developed for the selective preconcentration, derivatization and analysis of 3-phenoxybenzoic acid (3-PBA) using gas chromatography-tandem mass spectrometry (GC-MS/MS) in complex biological samples such as rat blood and liver. Factors affecting the synthesis of the MIP were evaluated, and the best monomer and cross-linker were selected based on binding affinity studies. Various parameters of MISPE, DLLME and IPS were optimized for the selective preconcentration and derivatization of 3-PBA. The developed method offers good linearity over the calibration ranges of 0.02-2.5 ng mg(-1) and 7.5-2000 ng mL(-1) for liver and blood, respectively. Under optimized conditions, the recovery of 3-PBA in liver and blood samples was found to be in the range of 83-91%. The detection limit was found to be 0.0045 ng mg(-1) in liver and 1.82 ng mL(-1) in blood. The SRM transitions 271→227 and 271→197 were selected as the quantifier and qualifier transitions for the 3-PBA derivative. Intra- and inter-day precision, for five replicates within a day and over five successive days, was found to be less than 8%. The developed method was successfully applied to real samples, i.e., rat blood and tissue, for quantitative evaluation of 3-PBA. The analytical approach developed is rapid, economic, simple and eco-friendly, and is of immense utility for the analysis of analytes with polar functional groups in complex biological samples by GC-MS/MS. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Quantitative Profiling of Endogenous Fat-Soluble Vitamins and Carotenoids in Human Plasma Using an Improved UHPSFC-ESI-MS Interface.

    PubMed

    Petruzziello, Filomena; Grand-Guillaume Perrenoud, Alexandre; Thorimbert, Anita; Fogwill, Michael; Rezzi, Serge

    2017-07-18

    Analytical solutions enabling the quantification of circulating levels of liposoluble micronutrients, such as vitamins and carotenoids, are currently limited to either a single analyte or a reduced panel of analytes. The requirement to use multiple approaches hampers the investigation of biological variability in a large number of samples in a time- and cost-efficient manner. With the goal of developing high-throughput and robust quantitative methods for the profiling of micronutrients in human plasma, we introduce a novel, validated workflow for the determination of 14 fat-soluble vitamins and carotenoids in a single run. Automated supported liquid extraction was optimized and implemented to process 48 samples in parallel in 1 h, and the analytes were measured using ultrahigh-performance supercritical fluid chromatography coupled to tandem mass spectrometry in less than 8 min. Improved mass spectrometry interface hardware was built to minimize the post-decompression volume and to allow better control of the chromatographic effluent density on its route toward and into the ion source. In addition, a specific make-up solvent condition was developed to ensure the solubility of both analytes and matrix constituents after mobile phase decompression. The optimized interface resulted in improved spray-plume stability and preserved matrix-compound solubility, leading to enhanced hyphenation robustness while ensuring both suitable analytical repeatability and improved detection sensitivity. The overall methodology gives recoveries within 85-115%, as well as within- and between-day coefficients of variation of 2 and 14%, respectively.

  6. Development and in-line validation of a Process Analytical Technology to facilitate the scale up of coating processes.

    PubMed

    Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P

    2013-05-05

    Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which can be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and is usually not implemented in the production process; many scientific approaches stop at the level of feasibility studies and never make the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of the highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method, and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing, could be shown. Finally, the method was validated according to the European Medicines Agency (EMA) guideline, with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
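
    A minimal sketch of the PLS step, correlating spectra with the coated amount of API via scikit-learn's PLSRegression; the spectra and reference values are synthetic placeholders for the in-line Raman data.

      # PLS regression of spectra against the coated amount of API.
      # Spectra and reference values are synthetic placeholders.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      n_spectra, n_wavenumbers = 60, 800
      coated_amount = np.linspace(0.0, 10.0, n_spectra)        # mg API per tablet
      peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 400) / 10) ** 2)
      spectra = (coated_amount[:, None] * peak                  # API band grows
                 + rng.normal(0.0, 0.05, (n_spectra, n_wavenumbers)))

      pls = PLSRegression(n_components=3)
      pls.fit(spectra, coated_amount)
      print(pls.predict(spectra[-5:]).ravel())  # predicted coated amounts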

  7. The Evaluation of a Self-Enumerated Scale of Quality of Life (CASP-19) in the Context of Research on Ageing: A Combination of Exploratory and Confirmatory Approaches

    ERIC Educational Resources Information Center

    Wiggins, R. D.; Netuveli, G.; Hyde, M.; Higgs, P.; Blane, D.

    2008-01-01

    This paper describes the conceptual development of a self-enumerated scale of quality of life (CASP-19) and presents an empirical evaluation of its structure using a combination of exploratory and confirmatory factor analytic approaches across three different survey settings for older people living in England and Wales in the new millennium. All…

  8. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  9. Mechanisms of chemical vapor generation by aqueous tetrahydridoborate. Recent developments toward the definition of a more general reaction model

    NASA Astrophysics Data System (ADS)

    D'Ulivo, Alessandro

    2016-05-01

    A reaction model describing the reactivity of metal and semimetal species with aqueous tetrahydridoborate (THB) has been drawn up, taking into account the mechanism of chemical vapor generation (CVG) of hydrides, recent evidence on the mechanism of interference and formation of byproducts in arsane generation, and other evidence in the fields of nanoparticle synthesis and the catalytic hydrolysis of THB by metal nanoparticles. The new "non-analytical" reaction model is of more general validity than the previously described "analytical" reaction model for CVG. The non-analytical model is valid for the reaction of a single analyte with THB and for conditions approaching those typically encountered in the synthesis of nanoparticles and macroprecipitates. It reduces to the previously proposed analytical model under conditions typically employed in CVG for trace analysis (analyte below the μM level, borane/analyte ≫ 10(3) mol/mol, no interference). The non-analytical reaction model is not able to explain all the interference effects observed in CVG, which can be achieved only by assuming interaction among the species of the reaction pathways of different analytical substrates. The reunification of CVG, the synthesis of nanoparticles by aqueous THB, and the catalytic hydrolysis of THB within a common framework contributes to the rationalization of the complex reactivity of aqueous THB with metal and semimetal species.

  10. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
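
    A minimal genetic algorithm in the spirit described above: evolve a fixed-size subset of transitions whose summed comprehensive-mode signal best agrees with the targeted result. The data and the R²-based fitness are assumptions for illustration, not the authors' exact procedure.

      # Evolve a subset of fragment transitions maximizing agreement (here,
      # squared correlation) between comprehensive and targeted quantification.
      # Data and fitness are synthetic stand-ins for the published procedure.
      import numpy as np

      rng = np.random.default_rng(5)
      n_trans, n_samples, k = 20, 30, 6
      targeted = rng.random(n_samples) * 100                     # targeted values
      noise = rng.normal(0, 5, (n_trans, n_samples))
      transitions = targeted * rng.random((n_trans, 1)) + noise  # per-transition signals

      def fitness(mask):
          est = transitions[mask].sum(axis=0)
          return np.corrcoef(est, targeted)[0, 1] ** 2           # R^2 agreement

      def random_mask():
          m = np.zeros(n_trans, bool)
          m[rng.choice(n_trans, k, replace=False)] = True
          return m

      pop = [random_mask() for _ in range(40)]
      for gen in range(50):
          pop.sort(key=fitness, reverse=True)
          parents = pop[:20]
          children = []
          for _ in range(20):
              a, b = rng.choice(20, 2, replace=False)
              union = np.flatnonzero(parents[a] | parents[b])    # crossover pool
              child = np.zeros(n_trans, bool)
              child[rng.choice(union, k, replace=False)] = True
              if rng.random() < 0.2:                             # mutation: swap one
                  on = rng.choice(np.flatnonzero(child))
                  off = rng.choice(np.flatnonzero(~child))
                  child[on], child[off] = False, True
              children.append(child)
          pop = parents + children

      best = max(pop, key=fitness)
      print(np.flatnonzero(best), fitness(best))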

  11. Extension of biomass estimates to pre-assessment periods using density dependent surplus production approach.

    PubMed

    Horbowy, Jan; Tomczak, Maciej T

    2017-01-01

    Biomass reconstructions to pre-assessment periods for commercially important and exploitable fish species are important tools for understanding long-term processes and fluctuations at the stock and ecosystem level. For some stocks, only fisheries statistics and fishery-dependent data are available for periods before surveys were conducted. Methods for the backward extension of analytical biomass assessments to years for which only total catch volumes are available were developed and tested in this paper. Two of the approaches developed apply the concept of the surplus production rate (SPR), which is shown to be stock-density dependent if stock dynamics are governed by classical stock-production models. The other approach uses a modified form of the Schaefer production model that allows backward biomass estimation. The performance of the methods was tested on the Arctic cod and North Sea herring stocks, for which analytical biomass estimates extend back to the late 1940s. Next, the methods were applied to extend biomass estimates of the North-east Atlantic mackerel from the 1970s (analytical biomass estimates available) to the 1950s, for which only total catch volumes were available. For comparison, a method which employs a constant SPR, estimated as an average of the observed values, was also applied. The analyses showed that the performance of the methods is stock- and data-specific; methods that work well for one stock may fail for others. The constant SPR method is not recommended in cases where the SPR is relatively high and the catch volumes in the reconstructed period are low.
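
    A sketch of the backward-extension idea with a Schaefer-type model: given next year's biomass and the catch, solve the production equation for this year's biomass by root finding. The parameter values and series are illustrative, not the fitted values from the paper.

      # Backward step of a Schaefer-type production model: given B_{t+1} and
      # catch C_t, solve B_{t+1} = B_t + r*B_t*(1 - B_t/K) - C_t for B_t.
      # r, K, and the series are illustrative.
      import numpy as np
      from scipy.optimize import brentq

      r, K = 0.5, 1000.0                     # growth rate, carrying capacity

      def backward_step(B_next, catch):
          g = lambda B: B + r * B * (1 - B / K) - catch - B_next
          return brentq(g, 1e-6, K)          # biomass that produces B_next

      B = [600.0]                            # first year with an analytical estimate
      catches = [80.0, 90.0, 70.0, 100.0]    # annual catches, most recent first
      for C in catches:
          B.append(backward_step(B[-1], C))
      print(B)                               # reconstructed biomass, going back in time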

  12. Extension of biomass estimates to pre-assessment periods using density dependent surplus production approach

    PubMed Central

    Horbowy, Jan

    2017-01-01

    Biomass reconstructions to pre-assessment periods for commercially important and exploitable fish species are important tools for understanding long-term processes and fluctuations at the stock and ecosystem level. For some stocks, only fisheries statistics and fishery-dependent data are available for periods before surveys were conducted. Methods for the backward extension of analytical biomass assessments to years for which only total catch volumes are available were developed and tested in this paper. Two of the approaches developed apply the concept of the surplus production rate (SPR), which is shown to be stock-density dependent if stock dynamics are governed by classical stock-production models. The other approach uses a modified form of the Schaefer production model that allows backward biomass estimation. The performance of the methods was tested on the Arctic cod and North Sea herring stocks, for which analytical biomass estimates extend back to the late 1940s. Next, the methods were applied to extend biomass estimates of the North-east Atlantic mackerel from the 1970s (analytical biomass estimates available) to the 1950s, for which only total catch volumes were available. For comparison, a method which employs a constant SPR, estimated as an average of the observed values, was also applied. The analyses showed that the performance of the methods is stock- and data-specific; methods that work well for one stock may fail for others. The constant SPR method is not recommended in cases where the SPR is relatively high and the catch volumes in the reconstructed period are low. PMID:29131850

  13. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high-performance mass spectrometry instrumentation. However, these assays are typically limited to hundreds of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify thousands of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, this strategy raises data analysis challenges concerning the agreement of results between the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges and configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
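
    The selection step described above can be sketched with a simple genetic algorithm over binary transition masks. Everything below is an illustrative assumption rather than the authors' implementation: synthetic intensity data, a fitness defined as the squared correlation between summed selected fragment intensities and the targeted reference values, and generic GA settings.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: per-sample intensities of 10 candidate
        # precursor-to-fragment transitions and targeted reference values
        # (constructed so the first four transitions carry the signal).
        n_samples, n_transitions = 20, 10
        dia = rng.lognormal(mean=2.0, sigma=0.5, size=(n_samples, n_transitions))
        targeted = dia[:, :4].sum(axis=1) * rng.normal(1.0, 0.05, n_samples)

        def fitness(mask):
            """R^2 between summed selected transitions and the targeted
            reference -- the agreement the GA tries to maximize."""
            if mask.sum() == 0:
                return -np.inf
            x = dia[:, mask.astype(bool)].sum(axis=1)
            r = np.corrcoef(x, targeted)[0, 1]
            return r * r

        def evolve(pop_size=50, n_gen=100, p_mut=0.05):
            pop = rng.integers(0, 2, size=(pop_size, n_transitions))
            for _ in range(n_gen):
                scores = np.array([fitness(m) for m in pop])
                # Tournament selection: the fitter of two random rows survives.
                idx = rng.integers(0, pop_size, size=(pop_size, 2))
                parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                                       idx[:, 0], idx[:, 1])]
                # Uniform crossover with the neighboring parent, then mutation.
                cross = rng.integers(0, 2, size=parents.shape).astype(bool)
                children = np.where(cross, parents, np.roll(parents, 1, axis=0))
                flip = rng.random(children.shape) < p_mut
                pop = np.where(flip, 1 - children, children)
            scores = np.array([fitness(m) for m in pop])
            return pop[scores.argmax()], scores.max()

        best_mask, best_r2 = evolve()
        print("selected transitions:", np.flatnonzero(best_mask), "R^2 = %.3f" % best_r2)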

  14. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.

    2004-04-01

    There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling, it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.

  15. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and occasionally in treated water as well. In fact, complete absence of any trace pollutant in treated drinking water is an illusion, as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack the toxicity data needed to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesired because of customer perception. This leads to the questions of how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action, and how effective treatment methods need to be designed to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived by using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals, the target values were set at 0.1 μg/L. The target values for the total sum of genotoxic chemicals, the total sum of steroid hormones, and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary lengths. Copyright © 2013 Elsevier Ltd. All rights reserved.
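
    The Q21 target values quoted above reduce to a small lookup; classifying a substance as genotoxic, steroid endocrine, or other organic is the actual toxicological task and is simply assumed as an input label in this sketch:

        # Q21 target values (ug/L) as quoted in the abstract above.
        Q21_TARGETS = {"genotoxic": 0.01, "steroid_endocrine": 0.01, "other_organic": 0.1}
        Q21_GROUP_SUMS = {"genotoxic": 0.01, "steroid_endocrine": 0.01, "other_organic": 1.0}

        def exceeds_target(conc_ug_per_l, substance_class):
            """True if a single substance exceeds its Q21 target value."""
            return conc_ug_per_l > Q21_TARGETS[substance_class]

        print(exceeds_target(0.05, "other_organic"))   # False: below 0.1 ug/L
        print(exceeds_target(0.05, "genotoxic"))       # True: above 0.01 ug/L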

  16. Finite Element Modeling of the Buckling Response of Sandwich Panels

    NASA Technical Reports Server (NTRS)

    Rose, Cheryl A.; Moore, David F.; Knight, Norman F., Jr.; Rankin, Charles C.

    2002-01-01

    A comparative study of different modeling approaches for predicting sandwich panel buckling response is described. The study considers sandwich panels with anisotropic face sheets and a very thick core. Results from conventional analytical solutions for sandwich panel overall buckling and face-sheet-wrinkling type modes are compared with solutions obtained using different finite element modeling approaches. Finite element solutions are obtained using layered shell element models, with and without transverse shear flexibility; layered shell/solid element models, with shell elements for the face sheets and solid elements for the core; and sandwich models using a recently developed specialty sandwich element. Convergence characteristics of the shell/solid and sandwich element modeling approaches with respect to in-plane and through-the-thickness discretization are demonstrated. Results of the study indicate that the specialty sandwich element provides an accurate and effective modeling approach for predicting both overall and localized sandwich panel buckling response. Furthermore, results indicate that anisotropy of the face sheets, along with the ratio of principal elastic moduli, affects the buckling response, and these effects may not be represented accurately by analytical solutions. Modeling recommendations are also provided.
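
    For context, the analytical face-sheet-wrinkling estimates that finite element results are typically compared against are of the Hoff type, sigma_wr = Q*(Ef*Ec*Gc)**(1/3), with the coefficient Q quoted between roughly 0.5 and 0.9 depending on the derivation. A sketch with hypothetical properties (not taken from the paper):

        # Hoff-type face-sheet wrinkling estimate; all inputs are illustrative.
        Ef = 60e9    # face-sheet in-plane modulus, Pa
        Ec = 120e6   # core through-thickness modulus, Pa
        Gc = 45e6    # core shear modulus, Pa
        Q = 0.5      # conservative value of the wrinkling coefficient

        sigma_wr = Q * (Ef * Ec * Gc) ** (1.0 / 3.0)
        print("wrinkling stress estimate: %.0f MPa" % (sigma_wr / 1e6))  # ~343 MPa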

  17. Targeted proteomics coming of age - SRM, PRM and DIA performance evaluated from a core facility perspective.

    PubMed

    Kockmann, Tobias; Trachsel, Christian; Panse, Christian; Wahlander, Asa; Selevsek, Nathalie; Grossmann, Jonas; Wolski, Witold E; Schlapbach, Ralph

    2016-08-01

    Quantitative mass spectrometry is a rapidly evolving methodology applied in a large number of omics-type research projects. During the past years, new designs of mass spectrometers have been developed and launched as commercial systems, while in parallel new data acquisition schemes and data analysis paradigms have been introduced. Core facilities provide access to such technologies, but also actively support researchers in finding and applying the best-suited analytical approach. In order to implement a solid foundation for this decision-making process, core facilities need to constantly compare and benchmark the various approaches. In this article, we compare the quantitative accuracy and precision of the current state-of-the-art targeted proteomics approaches, selected reaction monitoring (SRM), parallel reaction monitoring (PRM) and data independent acquisition (DIA), across multiple liquid chromatography mass spectrometry (LC-MS) platforms, using a readily available commercial standard sample. All workflows are able to reproducibly generate accurate quantitative data. However, SRM and PRM workflows show higher accuracy and precision than DIA approaches, especially when analyzing analytes at low concentrations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Molecular properties of excited electronic state: Formalism, implementation, and applications of analytical second energy derivatives within the framework of the time-dependent density functional theory/molecular mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeng, Qiao; Liang, WanZhen, E-mail: liangwz@xmu.edu.cn; Liu, Jie

    2014-05-14

    This work extends our previous works [J. Liu and W. Z. Liang, J. Chem. Phys. 135, 014113 (2011); J. Liu and W. Z. Liang, J. Chem. Phys. 135, 184111 (2011)] on the analytical excited-state energy Hessian within the framework of time-dependent density functional theory (TDDFT) to couple with molecular mechanics (MM). The formalism, implementation, and applications of analytical first and second energy derivatives of the TDDFT/MM excited state with respect to nuclear and electric perturbations are presented. Their performance is demonstrated by calculations of adiabatic excitation energies, excited-state geometries, harmonic vibrational frequencies, and infrared intensities for a number of benchmark systems. The results, consistent with those of the full quantum mechanical method and other hybrid theoretical methods, indicate the reliability of the current numerical implementation of the developed algorithms. The computational accuracy and efficiency of the current analytical approach are also checked, and computationally efficient strategies are suggested to speed up calculations of complex systems with many MM degrees of freedom. Finally, we apply the current analytical approach in TDDFT/MM to a realistic system, a red fluorescent protein chromophore together with part of its nearby protein matrix. The calculated results indicate that the rearrangement of the hydrogen bond interactions between the chromophore and the protein matrix is responsible for the large Stokes shift.
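
    Downstream of any analytical Hessian, harmonic frequencies follow from the eigenvalues of the mass-weighted Hessian. A generic post-processing sketch with a toy two-atom Hessian (not the TDDFT/MM implementation itself):

        import numpy as np

        def harmonic_wavenumbers(hessian, masses):
            """Wavenumbers (cm^-1) from a Cartesian Hessian (J/m^2, SI) and
            per-coordinate masses (kg): mass-weight, diagonalize, convert."""
            c = 2.99792458e10                            # speed of light, cm/s
            msqrt = np.sqrt(masses)
            h_mw = hessian / np.outer(msqrt, msqrt)      # mass-weighted Hessian
            evals = np.clip(np.linalg.eigvalsh(h_mw), 0.0, None)  # omega^2
            return np.sqrt(evals) / (2.0 * np.pi * c)    # rad/s -> cm^-1

        # Toy diatomic along one axis: two carbon-like masses and one spring.
        k = 500.0                                        # N/m, single-bond scale
        m = 12.0 * 1.66053906660e-27                     # kg
        H = np.array([[k, -k], [-k, k]])
        print(harmonic_wavenumbers(H, np.array([m, m]))) # [0, ~1189] cm^-1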

  19. Storyline Visualizations of Eye Tracking of Movie Viewing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.

    Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existing spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.

  20. Analytical modeling of the structureborne noise path on a small twin-engine aircraft

    NASA Technical Reports Server (NTRS)

    Cole, J. E., III; Stokes, A. Westagard; Garrelick, J. M.; Martini, K. F.

    1988-01-01

    The structureborne noise path of a six passenger twin-engine aircraft is analyzed. Models of the wing and fuselage structures as well as the interior acoustic space of the cabin are developed and used to evaluate sensitivity to structural and acoustic parameters. Different modeling approaches are used to examine aspects of the structureborne path. These approaches are guided by a number of considerations including the geometry of the structures, the frequency range of interest, and the tractability of the computations. Results of these approaches are compared with experimental data.

  1. Continuum modeling of large lattice structures: Status and projections

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Mikulas, Martin M., Jr.

    1988-01-01

    The status and some recent developments of continuum modeling for large repetitive lattice structures are summarized. Discussion focuses on a number of aspects including definition of an effective substitute continuum; characterization of the continuum model; and the different approaches for generating the properties of the continuum, namely, the constitutive matrix, the matrix of mass densities, and the matrix of thermal coefficients. Also, a simple approach is presented for generating the continuum properties. The approach can be used to generate analytic and/or numerical values of the continuum properties.

  2. Analysis of high-aspect-ratio jet-flap wings of arbitrary geometry

    NASA Technical Reports Server (NTRS)

    Lissaman, P. B. S.

    1973-01-01

    An analytical technique to compute the performance of an arbitrary jet-flapped wing is developed. The solution technique is based on the method of Maskell and Spence in which the well-known lifting-line approach is coupled with an auxiliary equation providing the extra function needed in jet-flap theory. The present method is generalized to handle straight, uncambered wings of arbitrary planform, twist, and blowing (including unsymmetrical cases). An analytical procedure is developed for continuous variations in the above geometric data with special functions to exactly treat discontinuities in any of the geometric and blowing data. A rational theory for the effect of finite wing thickness is introduced as well as simplified concepts of effective aspect ratio for rapid estimation of performance.

  3. Development of lightweight aluminum compression panels reinforced by boron-epoxy infiltrated extrusions

    NASA Technical Reports Server (NTRS)

    Roy, P. A.; Mcelman, J. A.; Henshaw, J.

    1973-01-01

    Analytical and experimental studies were performed to evaluate the structural efficiencies afforded by the selective reinforcement of conventional aluminum compression panels with unidirectional boron-epoxy composite materials. A unique approach to selective reinforcement, called boron/epoxy infiltration, was utilized. This technique uses extruded metal sections with preformed hollow voids into which unidirectional boron filaments are drawn and subsequently infiltrated with resin to form an integral part. Simplified analytical models were developed to investigate the behavior of stiffener webs with reinforced flanges. Theoretical results are presented demonstrating the effects of transverse shear of the reinforcement, flange eccentricity, and torsional stiffness in such construction. A series of 55 tests was conducted on boron-infiltrated rods and extruded structural sections.

  4. Laser-Induced Breakdown Spectroscopy (LIBS) in a Novel Molten Salt Aerosol System.

    PubMed

    Williams, Ammon N; Phongikaroon, Supathorn

    2017-04-01

    In the pyrochemical separation of used nuclear fuel (UNF), fission product, rare earth, and actinide chlorides accumulate in the molten salt electrolyte over time. Measuring this salt composition in near real-time is advantageous for operational efficiency, material accountability, and nuclear safeguards. Laser-induced breakdown spectroscopy (LIBS) has been proposed and demonstrated as a potential analytical approach for molten LiCl-KCl salts. However, all the studies conducted to date have used a static surface approach, which can lead to issues with splashing, low repeatability, and poor sample homogeneity. In this initial study, a novel molten salt aerosol approach has been developed and explored to measure the composition of the salt via LIBS. The functionality of the system has been demonstrated, along with a basic optimization of the laser energy and nebulizer gas pressure used. Initial results show that this molten salt aerosol-LIBS system has great potential as an analytical technique for measuring the molten salt electrolyte used in this UNF reprocessing technology.

  5. Chemoselective synthesis and analysis of naturally occurring phosphorylated cysteine peptides

    PubMed Central

    Bertran-Vicente, Jordi; Penkert, Martin; Nieto-Garcia, Olaia; Jeckelmann, Jean-Marc; Schmieder, Peter; Krause, Eberhard; Hackenberger, Christian P. R.

    2016-01-01

    In contrast to protein O-phosphorylation, study of the less frequent N- and S-phosphorylation events has lagged behind because these modifications have chemical features that prevent their manipulation through standard synthetic and analytical methods. Here we report on the development of a chemoselective synthetic method to phosphorylate Cys side-chains in unprotected peptides. This approach makes use of a reaction between nucleophilic phosphites and electrophilic disulfides accessible by standard methods. We achieve the stereochemically defined phosphorylation of a Cys residue and verify the modification using electron-transfer higher-energy dissociation (EThcD) mass spectrometry. To demonstrate the use of the approach in resolving biological questions, we identify an endogenous Cys phosphorylation site in IICBGlc, which is known to be involved in carbohydrate uptake by the bacterial phosphotransferase system (PTS). This new chemical and analytical approach finally allows further investigation of the functions and significance of Cys phosphorylation in a wide range of crucial cellular processes. PMID:27586301

  6. A Variational Approach to the Analysis of Dissipative Electromechanical Systems

    PubMed Central

    Allison, Andrew; Pearce, Charles E. M.; Abbott, Derek

    2014-01-01

    We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics and apply these powerful analytical methods to electrical and electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package. PMID:24586221
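
    The generalized potentials for dissipative elements are the paper's own contribution and are not reproduced here; the sketch below shows only the standard variational step they build on, deriving the equation of motion of a conservative LC circuit from a Lagrangian in charge coordinates (illustrative, using sympy):

        import sympy as sp

        t = sp.symbols('t')
        L_, C = sp.symbols('L C', positive=True)   # inductance, capacitance
        q = sp.Function('q')(t)                    # charge as the generalized coordinate

        # "Kinetic" magnetic energy minus "potential" electric energy: the
        # conservative core that generalized potentials extend to resistors.
        lagrangian = sp.Rational(1, 2) * L_ * sp.diff(q, t)**2 - q**2 / (2 * C)

        # Euler-Lagrange equation: d/dt(dL/dqdot) - dL/dq = 0
        eom = sp.diff(sp.diff(lagrangian, sp.diff(q, t)), t) - sp.diff(lagrangian, q)
        print(sp.Eq(sp.simplify(eom), 0))          # L*q''(t) + q(t)/C = 0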

  7. Letter: Observation of the 16O/18O exchange during electrospray ionization.

    PubMed

    Kostyukevich, Yury; Kononikhin, Alexey; Popov, Igor; Nikolaev, Eugene

    2015-01-01

    The isotopic exchange approach coupled with high-resolution mass spectrometry has become a powerful analytical approach for a wide range of analytical and bioanalytical applications. Considerable efforts have been dedicated to developing fast exchange techniques directly in the ionization source, but all such methods have been limited to hydrogen/deuterium exchange. In this paper we demonstrate that certain types of oxygen atoms can also be exchanged for (18)O on the time scale of the ionization process. Using HIO(3) and NaIO(4) and infusing the heavy water H(2)(18)O into the ESI source, we demonstrated that it is possible to obtain a high level of oxygen exchange. It was observed that the rate of this exchange depends to a large extent on the temperature of the desolvating capillary of the mass spectrometer. Several other species, such as peptides, oligonucleotides and low-molecular-weight organic molecules, were subjected to in-ESI (16)O/(18)O exchange, but no exchange was observed.

  8. Recent trends in the determination of vitamin D.

    PubMed

    Gomes, Fabio P; Shaw, P Nicholas; Whitfield, Karen; Koorts, Pieter; Hewavitharana, Amitha K

    2013-12-01

    The occurrence of vitamin D deficiency has become an issue of serious concern in the worldwide population. As a result, numerous analytical methods have been developed during the last few years to measure vitamin D analogs and metabolites in a variety of matrices. This review employs a comprehensive search of all vitamin D methods developed during the last 5 years for all applications, using ISI Web of Science(®), Scifinder(®), Science Direct, Scopus and PubMed. Particular emphasis is given to sample-preparation methods and the different forms of vitamin D measured across different fields of application, such as biological fluids, food and pharmaceutical preparations. This review compares and critically evaluates a wide range of approaches and methods; hence, it will enable readers to access developments across a number of applications and to select or develop the optimal analytical method for vitamin D for their particular application.

  9. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.

  10. The Concordance between EFL Learners' Linguistic Sequential Development and the Curricula of Formal and Informal Learning Settings: An Analytical Study

    ERIC Educational Resources Information Center

    Albaqshi, Jalal H.

    2016-01-01

    This research explores the concordance between the sequence of content in ESP curricula, learners' linguistic development, and authentic situations. The study was conducted in Alahsa College of Technology, Saudi Arabia. The methodology used was a corpus-based analysis of an ESP textbook, matching the units of the textbook to students' needs…

  11. A Hybrid Approach to Develop an Analytical Model for Enhancing the Service Quality of E-Learning

    ERIC Educational Resources Information Center

    Wu, Hung-Yi; Lin, Hsin-Yu

    2012-01-01

    The digital content industry is flourishing as a result of the rapid development of technology and the widespread use of computer networks. As has been reported, the market size of the global e-learning (i.e., distance education and telelearning) will reach USD 49.6 billion in 2014. However, to retain and/or increase the market share associated…

  12. Case-Based Learning in Endocrine Physiology: An Approach toward Self-Directed Learning and the Development of Soft Skills in Medical Students

    ERIC Educational Resources Information Center

    Gade, Shubhada; Chari, Suresh

    2013-01-01

    The Medical Council of India, in the recent "Vision 2015" document, recommended curricular reforms for undergraduates. Case-based learning (CBL) is one method where students are motivated toward self-directed learning and to develop analytic and problem-solving skills. An overview of thyroid physiology was given in a didactic lecture. A…

  13. Analytical Planning for University Libraries.

    ERIC Educational Resources Information Center

    Leimkuhler, Ferdinand F.; Cooper, Michael D.

    A survey is made of the more important technological and managerial problems in the planning of university library services and recommendations are made for a positive program of innovation and development. Two approaches are explored in considerable detail. The first is the use of operations research models of the acquisition and storage…

  14. Decision making in prioritization of required operational capabilities

    NASA Astrophysics Data System (ADS)

    Andreeva, P.; Karev, M.; Kovacheva, Ts.

    2015-10-01

    The paper describes an expert heuristic approach to the prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process (AHP) method, a methodology for their prioritization has been developed. It has been applied in practical simulation decision-making games.
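
    The standard AHP machinery behind such a prioritization is compact: a reciprocal pairwise-comparison matrix, priorities from its principal eigenvector, and Saaty's consistency ratio. A sketch with hypothetical judgments for three capabilities:

        import numpy as np

        # Reciprocal pairwise-comparison matrix (illustrative expert judgments).
        A = np.array([[1.0,  3.0, 5.0],
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        evals, evecs = np.linalg.eig(A)
        i = evals.real.argmax()
        lam_max = evals.real[i]
        w = np.abs(evecs[:, i].real)
        w /= w.sum()                          # priority vector

        n = A.shape[0]
        ci = (lam_max - n) / (n - 1)          # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
        print("priorities:", np.round(w, 3), "CR = %.3f" % (ci / ri))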

  15. Partners in research exceed the sum of the parts: partners > parts

    USDA-ARS?s Scientific Manuscript database

    The overriding goal of analytical chemistry research has always been and will always be the same: develop and validate approaches to achieve the needed quality of results that fit the purpose of the analysis in the fastest, easiest, safest, most economical, robust, and environmentally-friendly way ...

  16. Optimizing the Long-Term Retention of Skills: Structural and Analytic Approaches to Skill Maintenance

    DTIC Science & Technology

    1990-08-01

    evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed

  17. VALIDATION STUDIES OF THERMAL EXTRACTION-GC/MS APPLIED TO SOURCE EMISSIONS AEROSOLS: 1. SEMIVOLATILE ANALYTE--NONVOLATILE MATRIX INTERACTIONS

    EPA Science Inventory

    This work develops a novel validation approach for studying how non-volatile aerosol matrices of considerably different chemical composition potentially affect the thermal extraction (TE)/GC/MS quantification of a wide range of trace semivolatile organic markers. The non-volatil...

  18. A Characteristics Approach to the Evaluation of Economics Software Packages.

    ERIC Educational Resources Information Center

    Lumsden, Keith; Scott, Alex

    1988-01-01

    Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluation technique by appraising the much-used software package "Running the British Economy." (KO)

  19. Relational Frame Theory: An Overview of the Controversy

    ERIC Educational Resources Information Center

    Gross, Amy C.; Fox, Eric J.

    2009-01-01

    Although Skinner's "Verbal Behavior" (1957) was published over 50 years ago, behavior-analytic research on human language and cognition has been slow to develop. In recent years, a new behavioral approach to language known as relational frame theory (RFT) has generated considerable attention, research, and debate. The controversy surrounding RFT…

  20. Developing a New Interdisciplinary Lab Course for Undergraduate and Graduate Students: Plant Cells and Proteins

    ERIC Educational Resources Information Center

    Jez, Joseph M.; Schachtman, Daniel P.; Berg, R. Howard; Taylor, Christopher G.; Chen, Sixue; Hicks, Leslie M.; Jaworski, Jan G.; Smith, Thomas J.; Nielsen, Erik; Pikaard, Craig S.

    2007-01-01

    Studies of protein function increasingly use multifaceted approaches that span disciplines including recombinant DNA technology, cell biology, and analytical biochemistry. These studies rely on sophisticated equipment and methodologies including confocal fluorescence microscopy, mass spectrometry, and X-ray crystallography that are beyond the…

  1. Impact of Line 1 on the South African Hereford Population

    USDA-ARS?s Scientific Manuscript database

    The goal of this research was to document the influence of Line 1 Hereford cattle, developed by the United States Department of Agriculture (USDA) at its research facility in Miles City, Montana, on Hereford cattle in South Africa. Analytical approaches made use of both recorded pedigree and microsa...

  2. USEPA'S APPROACH TO THE DEVELOPMENT OF NEW ANALYTICAL METHODS FOR EMERGING CONTAMINANTS IN DRINKING WATER

    EPA Science Inventory

    The 1996 Amendments to the Safe Drinking Water Act require USEPA to perform Unregulated Contaminant Monitoring (UCM) for chemicals of interest to the Agency for possible future regulation. Many of these chemicals fall into the category of "emerging contaminants". An important e...

  3. Corporate Ph.D.: Making the Grade in Business.

    ERIC Educational Resources Information Center

    Groneman, Carol; Lear, Robert N.

    Charting the ever-increasing integration of "academics" into the business world, this book uses specific examples to describe how the research and analytic skills developed in an academic setting can offer new approaches to problem solving in the business arena. Of use to employers, corporate headhunters and recruiters, and academics…

  4. One-Shot Deal? Students' Perceptions of Assessment and Course Placement in California's Community Colleges

    ERIC Educational Resources Information Center

    Venezia, Andrea; Bracco, Kathy Reeves; Nodine, Thad

    2010-01-01

    There is substantial work being done--in California and nationwide--to develop college readiness standards; expand concurrent enrollment programs; communicate clearly about the key cognitive strategies necessary for postsecondary success (e.g., analytical thinking); improve student supports; and implement other approaches to improve students'…

  5. Importance of Preserving Cross-correlation in developing Statistically Downscaled Climate Forcings and in estimating Land-surface Fluxes and States

    NASA Astrophysics Data System (ADS)

    Das Bhowmik, R.; Arumugam, S.

    2015-12-01

    Multivariate downscaling techniques have exhibited superiority over univariate regression schemes in preserving the cross-correlations between multiple variables, precipitation and temperature, from GCMs. This study focuses on two aspects: (a) developing analytical solutions for estimating the biases in cross-correlations from univariate downscaling approaches, and (b) quantifying the uncertainty in land-surface states and fluxes due to biases in the cross-correlations of downscaled climate forcings. Both aspects are evaluated using climate forcings available from historical climate simulations and from CMIP5 hindcasts over the entire US. The analytical solution relates the univariate regression parameters, the coefficient of determination of the regression, and the covariance ratio between GCM and downscaled values. The analytical solutions are compared with downscaled univariate forcings by choosing the desired p-value (Type-1 error) for preserving the observed cross-correlation. To quantify the impact of biases in cross-correlation on estimates of streamflow and groundwater, we corrupt the downscaled climate forcings with different cross-correlation structures.
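
    A toy demonstration (synthetic data, not the paper's derivation) of the bias that the analytical solution targets: regressing two correlated variables separately on a common predictor and adding independent residual noise distorts their cross-correlation, here even flipping its sign.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 5000

        # Synthetic "observed" precipitation and temperature sharing one GCM
        # predictor x plus correlated weather noise (toy setup).
        x = rng.normal(size=n)
        noise = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n)
        precip = 0.8 * x + noise[:, 0]
        temp = -0.6 * x + noise[:, 1]

        def univariate_downscale(y, x):
            """Separate OLS per variable with independent residual noise --
            the scheme whose cross-correlation bias is derived analytically."""
            b, a = np.polyfit(x, y, 1)
            resid_sd = (y - (b * x + a)).std()
            return b * x + a + rng.normal(0.0, resid_sd, size=x.size)

        p_hat, t_hat = univariate_downscale(precip, x), univariate_downscale(temp, x)
        print("observed cross-correlation:   %.2f" % np.corrcoef(precip, temp)[0, 1])
        print("downscaled cross-correlation: %.2f" % np.corrcoef(p_hat, t_hat)[0, 1])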

  6. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  7. Cocontraction of pairs of antagonistic muscles: analytical solution for planar static nonlinear optimization approaches.

    PubMed

    Herzog, W; Binding, P

    1993-11-01

    It has been stated in the literature that static, nonlinear optimization approaches cannot predict coactivation of pairs of antagonistic muscles; however, numerical solutions of such approaches have predicted coactivation of pairs of one-joint and multijoint antagonists. Analytical support for either finding is not available in the literature for systems containing more than one degree of freedom. The purpose of this study was to investigate analytically the possibility of cocontraction of pairs of antagonistic muscles using a static nonlinear optimization approach for a multidegree-of-freedom, two-dimensional system. Analytical solutions were found using the Karush-Kuhn-Tucker conditions, which were necessary and sufficient for optimality in this problem. The results show that cocontraction of pairs of one-joint antagonistic muscles is not possible, whereas cocontraction of pairs of multijoint antagonists is. These findings suggest that cocontraction of pairs of antagonistic muscles may be an "efficient" way to accomplish many movement tasks.
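
    A minimal numerical counterpart of the analytical problem, with hypothetical moment arms and cross-sections: minimizing summed cubed muscle stresses subject to a net joint moment for a one-joint agonist/antagonist pair drives the antagonist force to zero, consistent with the analytical finding for one-joint antagonists.

        import numpy as np
        from scipy.optimize import minimize

        moment_arms = np.array([0.05, -0.04])   # m; the antagonist has a negative arm
        pcsa = np.array([12.0, 8.0])            # cm^2, physiological cross-sections
        target_moment = 10.0                    # N*m, required net joint moment

        cost = lambda f: np.sum((f / pcsa) ** 3)            # cubed muscle stresses
        cons = {"type": "eq",
                "fun": lambda f: moment_arms @ f - target_moment}
        res = minimize(cost, x0=np.array([100.0, 10.0]), method="SLSQP",
                       bounds=[(0.0, None), (0.0, None)], constraints=[cons])
        print(np.round(res.x, 2))   # ~[200, 0]: no one-joint cocontraction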

  8. Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics

    PubMed Central

    Hey, Jody; Nielsen, Rasmus

    2007-01-01

    In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although this likelihood is not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
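
    The genealogy machinery itself is far richer than can be sketched here, but the underlying "integrate analytically what you can, sample the rest" idea carries over to a toy model: below, the mean of a normal sample is integrated out in closed form and a random-walk Metropolis chain samples the remaining scale parameter (illustrative only, not the authors' software).

        import numpy as np

        rng = np.random.default_rng(2)
        y = rng.normal(3.0, 2.0, size=50)            # synthetic data, true sigma = 2
        n, S = y.size, ((y - y.mean()) ** 2).sum()

        def log_marginal(sigma):
            """log p(y | sigma), with the mean integrated out analytically
            under a flat prior (constants dropped)."""
            if sigma <= 0.0:
                return -np.inf
            return -(n - 1) * np.log(sigma) - S / (2.0 * sigma ** 2)

        sigma, chain = 1.0, []                       # random-walk Metropolis over sigma
        for _ in range(20000):
            prop = sigma + rng.normal(0.0, 0.2)
            if np.log(rng.random()) < log_marginal(prop) - log_marginal(sigma):
                sigma = prop
            chain.append(sigma)
        print("posterior mean of sigma: %.2f" % np.mean(chain[5000:]))  # ~2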

  9. Tiered Approach to Resilience Assessment.

    PubMed

    Linkov, Igor; Fox-Lent, Cate; Read, Laura; Allen, Craig R; Arnott, James C; Bellini, Emanuele; Coaffee, Jon; Florin, Marie-Valentine; Hatfield, Kirk; Hyde, Iain; Hynes, William; Jovanovic, Aleksandar; Kasperson, Roger; Katzenberger, John; Keys, Patrick W; Lambert, James H; Moss, Richard; Murdoch, Peter S; Palma-Oliveira, Jose; Pulwarty, Roger S; Sands, Dale; Thomas, Edward A; Tye, Mari R; Woods, David

    2018-04-25

    Regulatory agencies have long adopted a three-tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into the existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and resources available, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts of resilience quantification method development, but instead provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and with existing policies. Published 2018. This article is a U.S. government work and is in the public domain in the USA.

  10. A Social Identity Approach to Sport Psychology: Principles, Practice, and Prospects.

    PubMed

    Rees, Tim; Alexander Haslam, S; Coffee, Pete; Lavallee, David

    2015-08-01

    Drawing on social identity theory and self-categorization theory, we outline an approach to sport psychology that understands groups not simply as features of sporting contexts but rather as elements that can be, and often are, incorporated into a person's sense of self and, through this, become powerful determinants of their sport-related behavior. The underpinnings of this social identity approach are outlined, and four key lessons for sport that are indicative of the analytical and practical power of the approach are presented. These suggest that social identity is the basis for sports group (1) behavior, (2) formation and development, (3) support and stress appraisal, and (4) leadership. Building on recent developments within sport science, we outline an agenda for future research by identifying a range of topics to which the social identity approach could fruitfully contribute.

  11. Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics

    PubMed Central

    Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna

    2016-01-01

    Determining the thawing times of frozen foods is a challenging problem, as the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed, ranging from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper, analytical, empirical and graphical models are presented and critically reviewed. The conditions of solution, limitations and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Using numerical methods is not always practical, as running the calculations takes time and the specialized software and equipment are not always affordable. For these reasons, the application of analytical-empirical models is more useful for engineering. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction. Consequently, simplified methods are needed for the thawing time estimation of agricultural and food products. The review reveals the need for further improvement of the existing solutions, or the development of new ones, that will enable accurate determination of thawing time within a wide range of practical heat transfer conditions during processing. PMID:27904387
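
    The archetype of the simple analytical models reviewed is the Plank-type equation, t = (rho*L/dT)*(P*a/h + R*a**2/k), with shape factors P = 1/2 and R = 1/8 for an infinite slab. A sketch with illustrative property values (not taken from the paper):

        # Plank-type thawing time estimate for an infinite slab; all inputs
        # are illustrative assumptions.
        rho = 1050.0       # kg/m^3, product density
        latent = 250e3     # J/kg, latent heat of the water fraction
        t_medium = 15.0    # deg C, heating medium temperature
        t_freeze = -1.5    # deg C, initial freezing point
        thickness = 0.05   # m, slab thickness
        h = 25.0           # W/m^2K, surface heat transfer coefficient
        k = 0.5            # W/mK, conductivity of the thawed layer

        P, R = 0.5, 0.125  # infinite-slab shape factors
        t_thaw = rho * latent / (t_medium - t_freeze) * (
            P * thickness / h + R * thickness ** 2 / k)
        print("estimated thawing time: %.1f h" % (t_thaw / 3600.0))  # ~7.2 h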

  12. Evaluation of a hydrophilic interaction liquid chromatography design space for sugars and sugar alcohols.

    PubMed

    Hetrick, Evan M; Kramer, Timothy T; Risley, Donald S

    2017-03-17

    Based on a column-screening exercise, a column ranking system was developed for sample mixtures containing any combination of 26 sugar and sugar alcohol analytes using 16 polar stationary phases in the HILIC mode with acetonitrile/water or acetone/water mobile phases. Each analyte was evaluated on the HILIC columns with gradient elution and the subsequent chromatography data was compiled into a statistical software package where any subset of the analytes can be selected and the columns are then ranked by the greatest separation. Since these analytes lack chromophores, aerosol-based detectors, including an evaporative light scattering detector (ELSD) and a charged aerosol detector (CAD) were employed for qualitative and quantitative detection. Example qualitative applications are provided to illustrate the practicality and efficiency of this HILIC column ranking. Furthermore, the design-space approach was used as a starting point for a quantitative method for the trace analysis of glucose in trehalose samples in a complex matrix. Knowledge gained from evaluating the design-space led to rapid development of a capable method as demonstrated through validation of the following parameters: specificity, accuracy, precision, linearity, limit of quantitation, limit of detection, and range. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Designing Flavoprotein-GFP Fusion Probes for Analyte-Specific Ratiometric Fluorescence Imaging.

    PubMed

    Hudson, Devin A; Caplan, Jeffrey L; Thorpe, Colin

    2018-02-20

    The development of genetically encoded fluorescent probes for analyte-specific imaging has revolutionized our understanding of intracellular processes. Current classes of intracellular probes depend on the selection of binding domains that either undergo conformational changes on analyte binding or can be linked to thiol redox chemistry. Here we have designed novel probes by fusing a flavoenzyme, whose fluorescence is quenched on reduction by the analyte of interest, with a GFP domain to allow for rapid and specific ratiometric sensing. Two flavoproteins, Escherichia coli thioredoxin reductase and Saccharomyces cerevisiae lipoamide dehydrogenase, were successfully developed into thioredoxin and NAD+/NADH specific probes, respectively, and their performance was evaluated in vitro and in vivo. A flow cell format, which allowed dynamic measurements, was utilized in both bacterial and mammalian systems. In E. coli the first reported intracellular steady-state of the cytoplasmic thioredoxin pool was measured. In HEK293T mammalian cells, the steady-state cytosolic ratio of NAD+/NADH induced by glucose was determined. These genetically encoded fluorescent constructs represent a modular approach to intracellular probe design that should extend the range of metabolites that can be quantitated in live cells.

  14. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes. That makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content in their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most of the conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and utilising various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during last two decades and has received the increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interaction occurring on the sensing surface. This review provides a critical comparison of modern analytical instrumental techniques with SPR in terms of their analytical capabilities to detect carbohydrates, their conjugates with proteins and lipids and to study the carbohydrate-specific bindings. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Analytical and numerical solutions for heat transfer and effective thermal conductivity of cracked media

    NASA Astrophysics Data System (ADS)

    Tran, A. B.; Vu, M. N.; Nguyen, S. T.; Dong, T. Q.; Le-Nguyen, K.

    2018-02-01

    This paper presents analytical solutions to heat transfer problems around a crack and derives an adaptive model for the effective thermal conductivity of cracked materials based on a singular integral equation approach. The potential solution for heat diffusion through two-dimensional cracked media, where cracks filled by air act as insulators to heat flow, is obtained in singular integral equation form. It is demonstrated that the temperature field can be described as a function of the temperature and rate of heat flow on the boundary and the temperature jump across the cracks. Numerical resolution of this boundary integral equation allows determining the heat conduction and effective thermal conductivity of cracked media. Moreover, writing this boundary integral equation for an infinite medium embedding a single crack under a far-field condition allows deriving the closed-form solution of the temperature discontinuity on the crack and, in particular, the closed-form solution of the temperature field around the crack. These formulas are then used to establish analytical effective medium estimates. Finally, the comparison between the developed numerical and analytical solutions allows developing an adaptive model for the effective thermal conductivity of cracked media. This model takes into account both the interaction between cracks and the percolation threshold.

  16. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or to observe generalized document and term relationships in ranked or visual results. We propose a new approach that allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships; this paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.

  18. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  19. Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.

    PubMed

    Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul

    2015-01-01

    As the volume and variety of healthcare related data continues to grow, the analysis and use of this data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to and highlight our experiences with the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys and self-tracked data from wearable devices from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.

  20. Standardization approaches in absolute quantitative proteomics with mass spectrometry.

    PubMed

    Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo

    2017-07-31

    Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in better quantitative assessment of protein levels as well as in the understanding of post-translational modifications and of protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from the relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more thorough characterization of biological models and comprehension of proteome dynamics, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, which thereby requires the use of specially designed standardization approaches to provide absolute quantification. The most common such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) the use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift and can be used to tag both analyte and standard samples; (iv) label-free approaches, in which the absolute quantitative data are obtained not through any kind of labeling but from computational normalization of the raw data and adequate standards; and (v) elemental mass spectrometry-based workflows able to provide directly the absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical insight, from the Analytical Chemistry perspective, into the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) proteomics is provided in this review. © 2017 Wiley Periodicals, Inc.
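
    The core arithmetic shared by approaches (i) and (ii) above is isotope dilution: a known spiked amount of the labeled standard converts the measured light/heavy peak-area ratio into an absolute amount. A sketch with illustrative numbers, assuming equal response factors for the light and heavy forms:

        def absolute_amount(area_light, area_heavy, spiked_fmol):
            """Absolute analyte amount from isotope-dilution peak areas,
            assuming equal light/heavy response factors."""
            return (area_light / area_heavy) * spiked_fmol

        print(absolute_amount(area_light=8.4e5, area_heavy=2.1e5,
                              spiked_fmol=50.0))   # -> 200.0 fmol on column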
