Sample records for analytical design method

  1. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and pharmaceutical analytical technology (PAT).

  2. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  3. Horizontal lifelines - review of regulations and simple design method considering anchorage rigidity.

    PubMed

    Galy, Bertrand; Lan, André

    2018-03-01

    Among the many occupational risks construction workers encounter every day, falling from a height is the most dangerous. The objective of this article is to propose a simple analytical design method for horizontal lifelines (HLLs) that considers anchorage flexibility. The article presents a short review of the standards, regulations, acts, and codes concerning HLLs in Canada, the USA, and Europe. A static analytical approach considering anchorage flexibility is proposed. The analytical results are compared with a series of 42 dynamic fall tests and a SAP2000 numerical model. The experimental results show that the analytical method is slightly conservative and overestimates the line tension in most cases, by a maximum of 17%. The static SAP2000 results show a maximum difference of 2.1% from the analytical method. The analytical method is accurate enough to safely design HLLs, and quick design abaci are provided to allow the engineer to make quick on-site verifications if needed.
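    The static approach described above can be sketched with textbook cable statics: a midspan point load on a taut line, with each anchorage modelled as a linear spring whose inward deflection deepens the sag. This is an illustrative simplification, not the article's method; the spring stiffness `k_anchor` and all numbers are assumed values.

```python
import math

def hll_tension(span, cable_len, load, k_anchor, tol=1e-9, max_iter=200):
    """Tension in a horizontal lifeline with a midspan point load.

    Each anchor is a linear spring deflecting inward under the horizontal
    tension component, which shortens the effective span and deepens the sag.
    Solved by fixed-point iteration on the anchor deflection x.
    """
    x = 0.0
    tension = sag = 0.0
    for _ in range(max_iter):
        half = (span - 2.0 * x) / 2.0
        # Sag from the (inextensible) cable geometry of one half-span.
        sag = math.sqrt(max((cable_len / 2.0) ** 2 - half ** 2, 1e-12))
        slope_len = math.hypot(half, sag)
        tension = load * slope_len / (2.0 * sag)   # 2 T sin(theta) = P
        h = tension * half / slope_len             # horizontal component
        x_new = h / k_anchor                       # spring deflection
        if abs(x_new - x) < tol:
            break
        x = x_new
    return tension, sag
```

With rigid anchors the classic result T = P·L_slope/(2·sag) is recovered; softening the anchors deepens the sag and lowers the line tension, which is the qualitative effect the article's method captures.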

  4. Design and analysis of tubular permanent magnet linear generator for small-scale wave energy converter

    NASA Astrophysics Data System (ADS)

    Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young

    2017-05-01

    This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave energy converter (WEC). The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on the analytical solutions, a parametric analysis is performed to meet the design specifications of the WEC. Then, 2-D finite element analysis (FEA) is employed to validate the analytical method. Finally, experimental results confirm the predictions of the analytical and FEA methods under regular and irregular wave conditions.

  5. An analytical method for designing low noise helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.

    1978-01-01

    The development and experimental validation of a method for analytically modeling the noise mechanisms in helicopter geared power transmission systems are described. This method can be used within the design process to predict interior noise levels and to investigate the noise-reducing potential of alternative transmission design details. Examples are discussed.

  6. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  7. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    PubMed

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that begin well before the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  8. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools that generate these models from the standpoint of control system designers' needs, and to develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  9. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  10. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components were: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method-comparison analytes that were determined by two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer
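    The spiked-recovery comparison described above reduces to simple arithmetic; a minimal sketch with hypothetical concentrations, using a commonly applied 70-130% acceptance window as an assumed limit (the study's own acceptance criteria are not given here):

```python
def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Spike recovery (%): fraction of the added analyte actually measured."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

def within_limits(recovery, low=70.0, high=130.0):
    """Typical QA/QC acceptance window for recovery (assumed, not the study's)."""
    return low <= recovery <= high

# Hypothetical example: 20 ng/L native level, 100 ng/L spike, 118 ng/L measured.
r = percent_recovery(118.0, 20.0, 100.0)   # 98.0 %
```

Recoveries computed this way across reagent, source, and treated water matrices are what allow the kind of cross-method performance comparison the abstract describes.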

  11. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, Lawrence M.

    1990-01-01

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte.

  12. Analytical Methods of Decoupling the Automotive Engine Torque Roll Axis

    NASA Astrophysics Data System (ADS)

    JEONG, TAESEOK; SINGH, RAJENDRA

    2000-06-01

    This paper analytically examines the multi-dimensional mounting schemes of an automotive engine-gearbox system excited by oscillating torques. In particular, the issue of torque roll axis decoupling is analyzed in significant detail, since it is poorly understood. New dynamic decoupling axioms are presented and compared with the conventional elastic axis mounting and focalization methods. A linear time-invariant, proportionally damped system is assumed. Only rigid-body modes of the powertrain are considered, and the chassis elements are assumed to be rigid. Several simplified physical systems are considered, and new closed-form solutions for symmetric and asymmetric engine-mounting systems are developed. These clearly explain the design concepts for the 4-point mounting scheme. Our analytical solutions match the existing design formulations, which are applicable only to symmetric geometries. Spectra for all six rigid-body motions are predicted using the alternate decoupling methods, and the closed-form solutions are verified. Our method is also validated by comparing modal solutions with prior experimental and analytical studies. Parametric design studies are carried out to illustrate the methodology. Chief contributions of this research include the development of new or refined analytical models and closed-form solutions, along with improved design strategies for torque roll axis decoupling.
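    The decoupling idea can be illustrated on a reduced two-degree-of-freedom bounce/pitch model rather than the paper's full six-DOF formulation: the two coordinates decouple when the mount stiffness moments about the centre of gravity balance (k1·a = k2·b), the classical "focalization" condition. All numbers below are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def bounce_pitch_modes(m, inertia, k1, k2, a, b):
    """Rigid-body bounce/pitch modes of a powertrain on two vertical mounts.

    Mount 1 sits a metres ahead of the c.g., mount 2 sits b metres behind it.
    Returns natural frequencies (Hz), mode shapes, and the stiffness coupling
    term, which vanishes when k1*a == k2*b (decoupled mounting).
    """
    K = np.array([[k1 + k2,         k1 * a - k2 * b],
                  [k1 * a - k2 * b, k1 * a**2 + k2 * b**2]])
    M = np.diag([m, inertia])
    w2, vecs = eigh(K, M)            # generalized eigenproblem K v = w^2 M v
    freqs_hz = np.sqrt(w2) / (2 * np.pi)
    return freqs_hz, vecs, K[0, 1]
```

With k1·a = k2·b the coupling term is zero and the modes are pure bounce, sqrt((k1+k2)/m), and pure pitch, sqrt((k1·a²+k2·b²)/I); any imbalance mixes them, which is the behaviour the paper's closed-form solutions quantify.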

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, A.G.

    The Pacific Northwest Laboratory (PNL)/Analytical Chemistry Laboratory (ACL) and the Westinghouse Hanford Company (WHC)/Process Analytical Laboratory (PAL) provide analytical support services to various environmental restoration and waste management projects/programs at Hanford. In response to a US Department of Energy -- Richland Field Office (DOE-RL) audit, which questioned the comparability of analytical methods employed at each laboratory, the Sample Exchange/Exchange (SEE) program was initiated. The SEE Program is a self-assessment program designed to compare analytical methods of the PAL and ACL laboratories using site-specific waste material. The SEE program is managed by a collaborative, the Quality Assurance Triad (Triad). Triad membership is made up of representatives from the WHC/PAL, PNL/ACL, and WHC Hanford Analytical Services Management (HASM) organizations. The Triad works together to design, evaluate, and implement each phase of the SEE Program.

  14. Permanent Ground Anchors : Stump Design Criteria

    DOT National Transportation Integrated Search

    1982-09-01

    This document summarizes the main design methods used by the principal investigators in the design of permanent ground anchors, including basic concepts, design criteria, and analytical techniques. The application of these design methods is illustra...

  15. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment through operation to final retirement. In recent years, interest in this concept has also grown for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification, and method transfer) and, finally, retirement of the method. Regulatory bodies, too, have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the introduction of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉, "The Analytical Procedure Lifecycle", for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design-based method development results in increased method robustness. This decreases the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, strongly contributing to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, L.M.

    1990-10-16

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte. 5 figs.

  17. FIELD ANALYTICAL SCREENING PROGRAM: PCP METHOD - INNOVATIVE TECHNOLOGY EVALUATION REPORT

    EPA Science Inventory

    The Field Analytical Screening Program (FASP) pentachlorophenol (PCP) method uses a gas chromatograph (GC) equipped with a megabore capillary column and flame ionization detector (FID) and electron capture detector (ECD) to identify and quantify PCP. The FASP PCP method is design...

  18. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  19. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters must be input into the finite element analysis (FEA) model during the preliminary design process and optimization, an equivalent method was developed to analyze the mechanical...

  20. An Analytical Design Method for a Regenerative Braking Control System for DC-electrified Railway Systems under Light Load Conditions

    NASA Astrophysics Data System (ADS)

    Saito, Tatsuhito; Kondo, Keiichiro; Koseki, Takafumi

    A DC-electrified railway system that is fed by diode rectifiers at a substation is unable to return electric power to the AC grid. Accordingly, the braking cars have to restrict regenerative braking power when the power consumption of the powering cars is not sufficient. However, the characteristics of a DC-electrified railway system, including the powering cars, are not well known, and a mathematical model for designing a controller has not yet been established. Hence, the object of this study is to obtain a mathematical model for an analytical design method for the regenerative braking control system. In the first part of this paper, the static characteristics of this system are presented to show the position of the equilibrium point. The system is then linearized at the equilibrium point to describe its dynamic characteristics. An analytical design method is then proposed on the basis of these characteristics. The proposed design method is verified by experimental tests with a 1 kW class miniature model and by numerical simulations.
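    Finding an equilibrium point and linearizing about it, as described above, can be sketched on a stand-in scalar model: a filter capacitor fed through a line resistance and discharged by a constant-power load. The model and all parameter values are illustrative assumptions, not the authors' system.

```python
import numpy as np

def f(v, Vs=1500.0, R=0.5, P=200e3, C=0.05):
    """dv/dt of a capacitor fed from a stiff DC source through resistance R
    while supplying constant power P (illustrative feeding-circuit nonlinearity)."""
    return ((Vs - v) / R - P / v) / C

def equilibrium(Vs=1500.0, R=0.5, P=200e3):
    """Solve (Vs - v)/R = P/v, i.e. v^2 - Vs*v + P*R = 0 (high-voltage root)."""
    return (Vs + np.sqrt(Vs**2 - 4 * P * R)) / 2

def linearize(fun, x0, h=1e-4):
    """Central-difference slope df/dv at the equilibrium point."""
    return (fun(x0 + h) - fun(x0 - h)) / (2 * h)
```

The sign of the linearized slope at the equilibrium indicates local stability, and the same Jacobian is what a controller design would be based on; the numerical slope can be checked against the analytic derivative (-1/R + P/v²)/C.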

  1. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    In view of current regulatory requirements for analytical method development, a reversed-phase high-performance liquid chromatographic method for the routine analysis of etofenamate in dosage forms has been optimized using an analytical quality by design approach. Unlike the routine approach, the present study began with an understanding of the quality target product profile, the analytical target profile, and a risk assessment for method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μ), a binary pump, and a photodiode array detector was used in this work. The experiments were conducted according to a central composite design, which saves time, reagents, and other resources. Sigma Tech software was used to plan and analyse the experimental observations and to obtain a quadratic process model. The process model was used to predict retention time. The retention times predicted from the contour diagram were verified experimentally and agreed with the actual data. The optimized method used a flow rate of 1.2 ml/min and a mobile phase of methanol and 0.2% triethylamine in water (85:15, % v/v), pH adjusted to 6.5. The method was validated and verified for targeted method performance, robustness, and system suitability during method transfer. PMID:26997704
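    Fitting a quadratic process model to central-composite-design runs is ordinary least squares on a second-order design matrix; a sketch with two coded factors and hypothetical coefficients (simulated responses, not the paper's data or the Sigma Tech software):

```python
import numpy as np

def quad_design_matrix(x1, x2):
    """Second-order model terms: intercept, linear, quadratic, interaction."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

# Central composite design for two coded factors (e.g. %organic and pH):
# 4 factorial points, 4 axial points at +/- sqrt(2), 1 center point.
ax = np.sqrt(2)
x1 = np.array([-1.0, -1.0, 1.0, 1.0, -ax, ax, 0.0, 0.0, 0.0])
x2 = np.array([-1.0, 1.0, -1.0, 1.0, 0.0, 0.0, -ax, ax, 0.0])

true = np.array([8.0, -1.2, 0.5, 0.3, 0.2, -0.4])  # hypothetical coefficients
rt = quad_design_matrix(x1, x2) @ true             # simulated retention times

coef, *_ = np.linalg.lstsq(quad_design_matrix(x1, x2), rt, rcond=None)
```

Once fitted, the quadratic model predicts retention time anywhere in the coded region, which is what contour diagrams and MODR-style method optimization are built on.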

  2. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photolabile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality by design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a center point (curvature) was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors studied for their effects on the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Altering the MPC significantly affected retention time, while increasing the FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition was validated over the range 1-16 µg/mL and showed good linearity, precision, and accuracy, and was efficient, with an analysis time within 10 min.
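    Effect estimation in a two-level (2²) full factorial design like the one above is simple contrast arithmetic over the coded runs; a sketch with hypothetical tailing-factor responses (not the study's measurements):

```python
import numpy as np

# Coded 2^2 factorial runs: columns are MPC level and FR level (-1 = low, +1 = high).
runs = np.array([[-1, -1],
                 [ 1, -1],
                 [-1,  1],
                 [ 1,  1]])
y = np.array([1.8, 1.5, 1.4, 1.3])   # hypothetical tailing-factor responses

# Main effect = mean response at the high level minus mean at the low level.
effect_mpc = y[runs[:, 0] == 1].mean() - y[runs[:, 0] == -1].mean()
effect_fr  = y[runs[:, 1] == 1].mean() - y[runs[:, 1] == -1].mean()

# Interaction effect uses the product of the coded columns as its contrast.
prod = runs[:, 0] * runs[:, 1]
interaction = y[prod == 1].mean() - y[prod == -1].mean()
```

A center-point run, as used in the study, would additionally let one compare the center response against the factorial mean to test for curvature before trusting the first-order effects.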

  3. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    PubMed Central

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  4. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
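    The linear-quadratic state-feedback setting referenced above can be reproduced numerically: solve the continuous algebraic Riccati equation, form the optimal gain, and inspect the closed-loop eigenvalues whose analytical expressions the dissertation derives. The 2-state model below is a generic short-period-like example with made-up numbers, not the dissertation's aeroelastic model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative 2-state short-period-like dynamics (hypothetical derivatives).
A = np.array([[-0.7,  1.0],
              [-4.0, -1.2]])
B = np.array([[0.0],
              [-5.0]])

# Cost-function weights: the design knobs whose influence on closed-loop
# eigenvalues the dissertation expresses analytically.
Q = np.diag([2.0, 0.5])
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)       # Riccati solution
K = np.linalg.solve(R, B.T @ P)            # optimal gain K = R^-1 B^T P
cl_eigs = np.linalg.eigvals(A - B @ K)     # closed-loop eigenvalues
```

Sweeping entries of Q and recomputing `cl_eigs` shows numerically how the weights move the modes, which is exactly the relationship the analytical expressions make explicit.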

  5. Improved Design of Tunnel Supports : Executive Summary

    DOT National Transportation Integrated Search

    1979-12-01

    This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...

  6. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    PubMed

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow method validity to be assessed while taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority are not designed to protect against the risk of accepting unsuitable methods, and thus can lead to uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical method is biased, and more conservative when it is unbiased. Therefore, the choice between the generalized pivotal quantity and β-content (0.9) methods for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current methods. Analytical methods are often used to ensure the safety, efficacy, and quality of medicinal products. According to government regulations and regulatory guidelines, these methods need to be validated through well-designed studies to minimize the risk of accepting unsuitable methods. This article describes a novel statistical test for analytical method validation, which provides better protection against the risk of accepting unsuitable analytical methods. © PDA, Inc. 2015.
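    The β-content tolerance-interval comparator mentioned above can be sketched with Howe's classical approximation to the two-sided tolerance factor. This is a textbook approximation, not the paper's generalized pivotal quantity procedure, and the acceptance limit and data below are hypothetical.

```python
import numpy as np
from scipy.stats import norm, chi2

def beta_content_k(n, p=0.9, conf=0.9):
    """Two-sided beta-content tolerance factor, Howe's approximation:
    k = z_{(1+p)/2} * sqrt((n-1)(1 + 1/n) / chi2_{1-conf, n-1})."""
    z = norm.ppf((1 + p) / 2)
    return z * np.sqrt((n - 1) * (1 + 1 / n) / chi2.ppf(1 - conf, n - 1))

def method_acceptable(errors, limit, p=0.9, conf=0.9):
    """Accept if the beta-content tolerance interval for the measurement
    errors lies entirely within +/- limit (total-error acceptance check)."""
    m = np.mean(errors)
    s = np.std(errors, ddof=1)
    k = beta_content_k(len(errors), p, conf)
    return (m - k * s >= -limit) and (m + k * s <= limit)
```

For n = 30, p = 0.9, and 90% confidence the factor is about 2.03, matching published tolerance-factor tables; the validation decision then reduces to whether mean ± k·s fits inside the total-error limit.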

  7. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

    Currently, near infrared spectroscopy (NIRS) is considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing modes, and potential risks that should be avoided. Moreover, the development of related technologies is also presented, including the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry are put forward. Copyright© by the Chinese Pharmaceutical Association.

  8. Designing Studies That Would Address the Multilayered Nature of Health Care

    PubMed Central

    Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.

    2010-01-01

    We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057

  9. A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Lopes, Leonard V.

    2017-01-01

    More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method deriving the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable impedance liners is explored using a multi-objective optimization.

  10. Stress state estimation in multilayer support of vertical shafts, considering off-design cross-sectional deformation

    NASA Astrophysics Data System (ADS)

    Antsiferov, SV; Sammal, AS; Deev, PV

    2018-03-01

    To determine the stress-strain state of multilayer support in vertical shafts, including cross-sectional deformation of the tubing rings relative to the design, the authors propose an analytical method based on the provisions of the mechanics of underground structures, in which the support and surrounding rock mass are treated as elements of an integrated deformable system. The method involves a rigorous solution of the corresponding elasticity problem, obtained using the mathematical apparatus of the theory of analytic functions of a complex variable. The design method is implemented as a software program allowing multivariate applied computation. Examples of the calculation are given.

  11. An analytic model for footprint dispersions and its application to mission design

    NASA Technical Reports Server (NTRS)

    Rao, J. R. Jagannatha; Chen, Yi-Chao

    1992-01-01

    This is the final report on our recent research activities that are complementary to those conducted by our colleagues, Professor Farrokh Mistree and students, in the context of the Taguchi method. We have studied the mathematical model that forms the basis of the Simulation and Optimization of Rocket Trajectories (SORT) program and developed an analytic method for determining mission reliability with a reduced number of flight simulations. This method can be incorporated in a design algorithm to mathematically optimize different performance measures of a mission, thus leading to a robust and easy-to-use methodology for mission planning and design.

  12. Passive Magnetic Bearing With Ferrofluid Stabilization

    NASA Technical Reports Server (NTRS)

    Jansen, Ralph; DiRusso, Eliseo

    1996-01-01

    A new class of magnetic bearings is shown to exist analytically and is demonstrated experimentally. This class of magnetic bearings utilizes a ferrofluid/solid-magnet interaction to stabilize the axial degree of freedom of a permanent magnet radial bearing. Twenty-six permanent magnet bearing designs and twenty-two ferrofluid stabilizer designs are evaluated. Two types of radial bearing designs are tested to determine their force and stiffness using two methods. The first method uses frequency measurements to determine stiffness via an analytical model. The second method consists of loading the system and measuring displacement in order to measure stiffness. Two ferrofluid stabilizers are tested and force-displacement curves are measured. Two experimental test fixtures are designed and constructed in order to conduct the stiffness testing. Polynomial models of the data are generated and used to design the bearing prototype. The prototype was constructed, tested, and shown to be stable. Further testing shows the possibility of using this technology for vibration isolation. The project successfully demonstrated the viability of the passive magnetic bearing with ferrofluid stabilization, both experimentally and analytically.

  13. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    PubMed

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
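
    The probability-based definition of the design space described above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical fitted quadratic model for one critical resolution; the coefficients, standard errors, and specification limit below are invented for the example, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical quadratic model for a critical resolution Rs as a function
    # of coded buffer concentration (x1) and pH (x2). Coefficients and their
    # standard errors are illustrative only.
    beta = np.array([2.0, 0.4, -0.3, -0.25, 0.15])   # b0, b1, b2, b11, b12
    se   = np.array([0.05, 0.04, 0.04, 0.06, 0.05])  # coefficient std. errors

    def p_meeting_spec(x1, x2, spec=1.5, n_sim=20000):
        """Monte Carlo probability that Rs >= spec at operating point (x1, x2)."""
        b = rng.normal(beta, se, size=(n_sim, beta.size))  # sample coefficient sets
        rs = b[:, 0] + b[:, 1]*x1 + b[:, 2]*x2 + b[:, 3]*x1**2 + b[:, 4]*x1*x2
        return np.mean(rs >= spec)

    p_center = p_meeting_spec(0.0, 0.0)
    print(p_center)
    ```

    Scanning `p_meeting_spec` over a grid of coded factor settings and keeping the points whose probability exceeds a chosen threshold (e.g. 0.95) yields the probability-based design space.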

  14. Analysis methods for Kevlar shield response to rotor fragments

    NASA Technical Reports Server (NTRS)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  15. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed the performances of the methods used to be assessed and compared. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds were measured but did not consistently meet predetermined quality standards. Methodologies that were not suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  16. Conservative Analytical Collision Probability for Design of Orbital Formations

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2004-01-01

    The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.

  17. Development and optimization of SPE-HPLC-UV/ELSD for simultaneous determination of nine bioactive components in Shenqi Fuzheng Injection based on Quality by Design principles.

    PubMed

    Wang, Lu; Qu, Haibin

    2016-03-01

    A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolutions and signal-to-noise of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space of SPE-HPLC-UV/ELSD was then constructed by calculated Monte Carlo probability. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight terms of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were accomplished at a selected working point. These results revealed that the QbD principles were suitable in the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
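
    The Plackett-Burman screening step used above for 11 factors has a compact construction. The sketch below builds the standard 12-run, two-level design from its cyclic generating row and checks its balance and orthogonality; it is generic design-of-experiments machinery, not code from the study:

    ```python
    import numpy as np

    # Standard generating row for the 12-run Plackett-Burman design
    # (supports up to 11 two-level factors in coded units).
    gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

    def plackett_burman_12():
        rows = [np.roll(gen, i) for i in range(11)]  # 11 cyclic shifts
        rows.append(-np.ones(11, dtype=int))         # 12th run: all factors low
        return np.array(rows)

    X = plackett_burman_12()
    # Each column is balanced (six +1, six -1) ...
    assert all(col.sum() == 0 for col in X.T)
    # ... and the columns are mutually orthogonal, so main effects are
    # estimated independently of one another.
    G = X.T @ X
    assert np.array_equal(G, 12 * np.eye(11, dtype=int))
    print(X.shape)  # (12, 11)
    ```

    With 12 runs one can screen up to 11 factors for main effects, which is why this design family is a common first step before response surface optimization.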

  18. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
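
    A one-way ANOVA of this kind is a one-liner with SciPy. The data below are invented concentrations for one hypothetical paraffin analyte under three standardization methods, just to show the shape of the computation:

    ```python
    from scipy import stats

    # Hypothetical concentration results (ppm) for one paraffin analyte,
    # obtained with three standardization methods (illustrative data).
    external = [10.1, 10.4, 9.9, 10.2, 10.0]   # external standard
    internal = [10.3, 10.5, 10.2, 10.6, 10.4]  # internal standard
    std_add  = [10.0, 10.2, 10.1, 9.8, 10.3]   # standard addition

    f_stat, p_value = stats.f_oneway(external, internal, std_add)
    # If p_value < 0.05, at least one method mean differs; a
    # multiple-comparison procedure (e.g. Tukey HSD) then locates
    # which pairs of means are different.
    print(f_stat, p_value)
    ```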

  19. Structural analysis at aircraft conceptual design stage

    NASA Astrophysics Data System (ADS)

    Mansouri, Reza

    In the past 50 years, computers have augmented human efforts at a tremendous pace. The aircraft industry is no exception. It is more than ever dependent on computing because of a high level of complexity and the increasing need for excellence to survive a highly competitive marketplace. Designers choose computers to perform almost every analysis task. But while doing so, existing effective, accurate and easy-to-use classical analytical methods are often forgotten, even though they can be very useful, especially in the early phases of aircraft design, where concept generation and evaluation demand physical visibility of design parameters to support decisions [39, 2004]. Structural analysis methods have been used since the earliest civilizations. Centuries before computers were invented, the pyramids were designed and constructed by the Egyptians around 2000 B.C., the Parthenon was built by the Greeks in the fifth century B.C., and Dujiangyan was built by the Chinese around 250 B.C. Persepolis, Hagia Sophia, the Taj Mahal and the Eiffel Tower are only a few more examples of historical buildings, bridges and monuments that were constructed before any advances in computer-aided engineering. The aircraft industry followed the same path. In the first half of the 20th century, engineers used classical methods to design civil transport aircraft such as the Ford Tri-Motor (1926), Lockheed Vega (1927), Lockheed 9 Orion (1931), Douglas DC-3 (1935), Douglas DC-4/C-54 Skymaster (1938), Boeing 307 (1938) and Boeing 314 Clipper (1939), all of which became airborne without difficulty. 
While advanced numerical methods such as finite element analysis are among the most effective structural analysis methods, classical structural analysis methods can be just as useful, especially during the early phase of a fixed-wing aircraft design, where major decisions are made and concept generation and evaluation demand physical visibility of design parameters. Considering the strengths and limitations of both methodologies, the questions to be answered in this thesis are: How valuable and compatible are the classical analytical methods in today's conceptual design environment? And can these methods complement each other? To answer these questions, this thesis investigates the pros and cons of classical analytical structural analysis methods during the conceptual design stage through the following objectives: illustrate the structural design methodology of these methods within the framework of the Aerospace Vehicle Design (AVD) lab's design lifecycle, and demonstrate the effectiveness of the moment distribution method through four case studies, considering and evaluating the strengths and limitations of these methods. In order to objectively quantify the limitations and capabilities of the analytical method at the conceptual design stage, each case study becomes more complex than the one before.

  20. CZAEM USER'S GUIDE: MODELING CAPTURE ZONES OF GROUND-WATER WELLS USING ANALYTIC ELEMENTS

    EPA Science Inventory

    The computer program CZAEM is designed for elementary capture zone analysis, and is based on the analytic element method. CZAEM is applicable to confined and/or unconfined flow in shallow aquifers; the Dupuit-Forchheimer assumption is adopted. CZAEM supports the following analyt...

  1. Integrating a Smartphone and Molecular Modeling for Determining the Binding Constant and Stoichiometry Ratio of the Iron(II)-Phenanthroline Complex: An Activity for Analytical and Physical Chemistry Laboratories

    ERIC Educational Resources Information Center

    de Morais, Camilo de L. M.; Silva, Sérgio R. B.; Vieira, Davi S.; Lima, Kássio M. G.

    2016-01-01

    The binding constant and stoichiometry ratio for the formation of iron(II)-(1,10-phenanthroline) or iron(II)-o-phenanthroline complexes has been determined by a combination of a low-cost analytical method using a smartphone and a molecular modeling method as a laboratory experiment designed for analytical and physical chemistry courses. Intensity…

  2. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
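
    The MEC model itself is machine-specific, but the PSO layer can be sketched generically. Below is a minimal particle swarm optimizer over box constraints, with a stand-in analytic objective in place of the magnetic-equivalent-circuit evaluation; all parameter values and the objective are illustrative, not from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def pso_minimize(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
        """Basic global-best particle swarm optimization over box constraints."""
        lo, hi = np.array(bounds, dtype=float).T
        dim = lo.size
        x = rng.uniform(lo, hi, (n_particles, dim))        # positions
        v = np.zeros_like(x)                               # velocities
        pbest = x.copy()                                   # personal bests
        pbest_f = np.apply_along_axis(f, 1, x)
        g = pbest[pbest_f.argmin()].copy()                 # global best
        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w*v + c1*r1*(pbest - x) + c2*r2*(g - x)    # velocity update
            x = np.clip(x + v, lo, hi)                     # stay in bounds
            fx = np.apply_along_axis(f, 1, x)
            better = fx < pbest_f
            pbest[better], pbest_f[better] = x[better], fx[better]
            g = pbest[pbest_f.argmin()].copy()
        return g, pbest_f.min()

    # Stand-in objective with a known optimum at (0.3, 0.3, 0.3); in the paper
    # each evaluation would instead come from the MEC model of the TFM.
    obj = lambda p: np.sum((p - 0.3)**2)
    best, val = pso_minimize(obj, [(0, 1)] * 3)
    print(best, val)  # best ≈ [0.3, 0.3, 0.3]
    ```

    Because each MEC evaluation is cheap compared to an FEA solve, a swarm of this size can explore the stator pole length / magnet length / rotor thickness space in a fraction of the time an FEA-in-the-loop optimizer would need.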

  3. On analytic design of loudspeaker arrays with uniform radiation characteristics

    PubMed

    Aarts; Janssen

    2000-01-01

    Some notes on analytically derived loudspeaker arrays with uniform radiation characteristics are presented. The array coefficients are derived analytically and compared with the so-called maximal flat sequences known from telecommunications and information theory. It appears that the newly derived array, i.e., the quadratic phase array, has a higher efficiency than the Bessel array and a flatter response than the Barker array. The method discussed admits generalization to the design of arrays with desired nonuniform radiation characteristics.

  4. Application of Analytical Quality by Design concept for bilastine and its degradation impurities determination by hydrophilic interaction liquid chromatographic method.

    PubMed

    Terzić, Jelena; Popović, Igor; Stajić, Ana; Tumpa, Anja; Jančić-Stojanović, Biljana

    2016-06-05

    This paper deals with the development of hydrophilic interaction liquid chromatographic (HILIC) method for the analysis of bilastine and its degradation impurities following Analytical Quality by Design approach. It is the first time that the method for bilastine and its impurities is proposed. The main objective was to identify the conditions where an adequate separation in minimal analysis duration could be achieved within a robust region. Critical process parameters which have the most influence on method performance were defined as acetonitrile content in the mobile phase, pH of the aqueous phase and ammonium acetate concentration in the aqueous phase. Box-Behnken design was applied for establishing a relationship between critical process parameters and critical quality attributes. The defined mathematical models and Monte Carlo simulations were used to identify the design space. Fractional factorial design was applied for experimental robustness testing and the method is validated to verify the adequacy of selected optimal conditions: the analytical column Luna(®) HILIC (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-aqueous phase (50 mM ammonium acetate, pH adjusted to 5.3 with glacial acetic acid) (90.5:9.5, v/v); column temperature 30 °C; mobile phase flow rate 1 mL min(-1); detection wavelength 275 nm. Copyright © 2016 Elsevier B.V. All rights reserved.
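
    The Box-Behnken design used above for three critical process parameters has a simple structure: every pair of factors is run at its four (±1, ±1) combinations while the remaining factor is held at the center level, plus replicated center points. A generic sketch of the construction (not code from the paper):

    ```python
    from itertools import combinations
    import numpy as np

    def box_behnken(n_factors, n_center=3):
        """Box-Behnken design matrix in coded units (-1, 0, +1)."""
        runs = []
        for i, j in combinations(range(n_factors), 2):
            for a in (-1, 1):
                for b in (-1, 1):
                    row = [0] * n_factors
                    row[i], row[j] = a, b      # one factor pair at its corners
                    runs.append(row)
        runs += [[0] * n_factors] * n_center   # replicated center points
        return np.array(runs)

    X = box_behnken(3)
    print(X.shape)  # (15, 3): 12 edge-midpoint runs + 3 center replicates
    ```

    Because no run sits at a corner of the cube, the design avoids extreme factor combinations while still supporting a full quadratic model, which is why it is popular for chromatographic method optimization.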

  5. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g(-1) in sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
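
    In its simplest form, a β-expectation tolerance interval of the kind used for these accuracy profiles coincides with a prediction interval for a single future result. The sketch below assumes that simple form and invented recovery data; a rigorous accuracy profile would also account for multiple series and intermediate precision:

    ```python
    import numpy as np
    from scipy import stats

    def beta_expectation_interval(recoveries, beta=0.95):
        """Simple β-expectation tolerance interval on % recovery
        (one-series case, equivalent to a prediction interval)."""
        x = np.asarray(recoveries, dtype=float)
        n = x.size
        m, s = x.mean(), x.std(ddof=1)
        t = stats.t.ppf((1 + beta) / 2, n - 1)
        half = t * s * np.sqrt(1 + 1/n)      # single-future-result margin
        return m - half, m + half

    # Hypothetical % recoveries at one concentration level
    lo_, hi_ = beta_expectation_interval([99.1, 100.4, 98.8, 101.0, 99.6, 100.2])
    # The method is accepted at this level if the interval stays inside
    # the chosen acceptance limits (e.g. 90-110%).
    print(lo_, hi_)
    ```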

  6. Design and fabrication of planar structures with graded electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Good, Brandon Lowell

    Successfully integrating electromagnetic properties in planar structures offers numerous benefits to the microwave and optical communities. This work aims at formulating new analytic and optimized design methods, creating new fabrication techniques for achieving those methods, and matching appropriate implementation of methods to fabrication techniques. The analytic method consists of modifying an approach that realizes perfect antireflective properties from graded profiles. This method is shown for all-dielectric and magneto-dielectric grading profiles. The optimized design methods are applied to transformer (discrete) or taper (continuous) designs. From these methods, a subtractive and an additive manufacturing technique were established and are described. The additive method, dry powder dot deposition, enables three-dimensionally varying electromagnetic properties in a structural composite. Combining the methods and fabrication is shown in two applied methodologies. The first uses dry powder dot deposition to realize one-dimensionally graded electromagnetic profiles in a planar fiberglass composite. The second simultaneously applies antireflective properties and adjusts directivity through a slab by means of subwavelength structures to achieve a flat antireflective lens. The end result of this work is a complete set of methods, formulations, and fabrication techniques to achieve integrated electromagnetic properties in planar structures.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, W.W.; Sullivan, H.H.

    Electroless nickel-plate characteristics are substantially influenced by the percent phosphorus concentration. Available ASTM analytical methods are designed for phosphorus concentrations of less than one percent, compared to the 4.0 to 20.0% concentrations common in electroless nickel plate. A variety of analytical adaptations are applied across the industry, resulting in poor data continuity. This paper presents a statistical comparison of five analytical methods and recommends accurate and precise procedures for use in percent phosphorus determinations in electroless nickel plate. 2 figures, 1 table.

  8. Development of a validated liquid chromatographic method for quantification of sorafenib tosylate in the presence of stress-induced degradation products and in biological matrix employing analytical quality by design approach.

    PubMed

    Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder

    2018-05-01

    The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to embark upon the analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate, potentially affecting the chosen critical analytical attributes. Systematic optimization of the chosen critical method parameters was carried out using response surface methodology, employing a two-factor, three-level, 13-run, face-centered cubic design. A method operable design region providing optimum method performance was earmarked using numerical and graphical optimization. The optimum method employed a mobile phase composition of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C18 column. Validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in a human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectrometry studies showed that SFN degrades under strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug per se was found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation, with products at retention times of 3.35, 3.65, 4.20 and 5.67 min. 
The absence of any significant change in the retention time of SFN and degradation products, formed under different stress conditions, ratified selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Shape design of an optimal comfortable pillow based on the analytic hierarchy process method

    PubMed Central

    Liu, Shuo-Fang; Lee, Yann-Long; Liang, Jung-Chin

    2011-01-01

    Objective Few studies have analyzed the shapes of pillows. The purpose of this study was to investigate the relationship between the pillow shape design and subjective comfort level for asymptomatic subjects. Methods Four basic pillow designs factors were selected on the basis of literature review and recombined into 8 configurations for testing the rank of degrees of comfort. The data were analyzed by the analytic hierarchy process method to determine the most comfortable pillow. Results Pillow number 4 was the most comfortable pillow in terms of head, neck, shoulder, height, and overall comfort. The design factors of pillow number 4 were using a combination of standard, cervical, and shoulder pillows. A prototype of this pillow was developed on the basis of the study results for designing future pillow shapes. Conclusions This study investigated the comfort level of particular users and redesign features of a pillow. A deconstruction analysis would simplify the process of determining the most comfortable pillow design and aid designers in designing pillows for groups. PMID:22654680
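
    The analytic hierarchy process step can be sketched as follows: factor priorities are the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks that the judgments are not contradictory. The comparison values below are hypothetical, not the study's data:

    ```python
    import numpy as np

    # Illustrative AHP pairwise comparison matrix for three design factors
    # (entry [i, j] says how much factor i is preferred over factor j).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    def ahp_weights(A, ri={3: 0.58, 4: 0.90, 5: 1.12}):
        """Principal-eigenvector priority weights and consistency ratio."""
        eigvals, eigvecs = np.linalg.eig(A)
        k = eigvals.real.argmax()              # index of the largest eigenvalue
        w = eigvecs[:, k].real
        w = w / w.sum()                        # normalize priorities to sum 1
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)   # consistency index
        cr = ci / ri[n]                        # consistency ratio (< 0.1 is acceptable)
        return w, cr

    w, cr = ahp_weights(A)
    print(w, cr)
    ```

    With more factors (or with the pillow configurations ranked under each factor), the same eigenvector computation is repeated at every level of the hierarchy and the weights are multiplied down the tree.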

  10. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, boeing, collaborative analysis.

  11. Propfan experimental data analysis

    NASA Technical Reports Server (NTRS)

    Vernon, David F.; Page, Gregory S.; Welge, H. Robert

    1984-01-01

    A data reduction method, consistent with the performance prediction methods used for the analysis of new aircraft designs, is defined and compared to the method currently used by NASA, using data obtained from an Ames Research Center 11-foot transonic wind tunnel test. Pressure and flow visualization data from the Ames test for both the powered straight underwing nacelle and an unpowered contoured overwing nacelle installation are used to determine the flow phenomena present for a wing-mounted turboprop installation. The test data are compared to analytic methods, showing the analytic methods to be suitable for design and analysis of new configurations. The data analysis indicated that designs with zero interference drag levels are achievable with proper wing and nacelle tailoring. A new overwing contoured nacelle design and a modification to the wing leading edge extension for the current wind tunnel model design are evaluated. Hardware constraints of the current model parts prevent obtaining any significant performance improvement from the modified nacelle contouring. A new aspect ratio wing design for an up-outboard-rotation turboprop installation is defined, and an advanced contoured nacelle is provided.

  12. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    PubMed

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and quality by design (QbD) approaches is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impacts and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as a function of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS) (method operable design region) where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
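    The screening/response-surface/desirability workflow described above can be sketched in miniature. The quadratic response models, factor ranges, and desirability limits below are hypothetical stand-ins, not the paper's fitted models; the point is only the shape of the calculation: assume (or fit) a model for each CQA, map each prediction to a desirability, and search the factor space for the best combined desirability.

```python
def resolution(flow, temp):
    # hypothetical fitted quadratic model for critical-pair resolution
    return 2.0 - 1.5 * (flow - 0.4) ** 2 - 0.8 * (temp / 40.0 - 1.0) ** 2

def solvent_use(flow, temp):
    # hypothetical model: solvent consumption grows with flow rate
    return 10.0 * flow + 0.01 * temp

def clamp01(x):
    return max(0.0, min(1.0, x))

def d_larger(y, lo, hi):
    # larger-is-better desirability: 0 below lo, 1 above hi
    return clamp01((y - lo) / (hi - lo))

def d_smaller(y, lo, hi):
    # smaller-is-better desirability: 1 below lo, 0 above hi
    return clamp01((hi - y) / (hi - lo))

# grid search for the point with the best geometric-mean desirability
best = max(
    ((f / 100.0, t / 2.0,
      (d_larger(resolution(f / 100.0, t / 2.0), 1.5, 2.0)
       * d_smaller(solvent_use(f / 100.0, t / 2.0), 3.0, 7.0)) ** 0.5)
     for f in range(20, 61)      # flow 0.20-0.60 mL/min (hypothetical range)
     for t in range(50, 91)),    # temperature 25.0-45.0 C (hypothetical range)
    key=lambda triple: triple[2],
)
print(f"optimum: flow={best[0]:.2f} mL/min, temp={best[1]:.1f} C, D={best[2]:.3f}")
```

In practice the models come from a designed experiment and the grid search is replaced by a proper optimizer, but the desirability-combination step is the same.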

  13. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  14. Development of an ultra high performance liquid chromatography method for determining triamcinolone acetonide in hydrogels using the design of experiments/design space strategy in combination with process capability index.

    PubMed

    Oliva, Alexis; Monzón, Cecilia; Santoveña, Ana; Fariña, José B; Llabrés, Matías

    2016-07-01

    An ultra high performance liquid chromatography method was developed and validated for the quantitation of triamcinolone acetonide in an injectable ophthalmic hydrogel, to determine the contribution of analytical method error to the content uniformity measurement. During the development phase, the design of experiments/design space strategy was used; for this, the free R environment served as an alternative to commercial software and proved a fast, efficient tool for data analysis. The process capability index was used to find the permitted level of variation for each factor and to define the design space. All these aspects were analyzed and discussed under different experimental conditions by the Monte Carlo simulation method. A pre-study validation procedure was then performed in accordance with the International Conference on Harmonization guidelines. The validated method was applied to the determination of uniformity of dosage units, and the sources of variability (inhomogeneity and analytical method error) were analyzed based on the overall uncertainty. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
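    The combination of Monte Carlo simulation and the process capability index can be illustrated with a toy model. The response model, factor tolerances, and specification limits below are invented for illustration, not the paper's values; the pattern is: perturb the factors around their set points, propagate them through the response model, and compute Cpk from the simulated spread.

```python
import random

random.seed(0)
LSL, USL = 98.0, 102.0   # hypothetical specification limits (% label claim)

def assay(flow, temp):
    # hypothetical response model linking method factors to the assay result
    return 100.0 + 5.0 * (flow - 0.30) - 0.05 * (temp - 30.0)

# Monte Carlo: perturb the factors around their set points
samples = [assay(random.gauss(0.30, 0.03), random.gauss(30.0, 2.0))
           for _ in range(20000)]
mean = sum(samples) / len(samples)
sd = (sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)) ** 0.5
cpk = min(USL - mean, mean - LSL) / (3 * sd)   # process capability index
print(f"mean={mean:.2f}, sd={sd:.3f}, Cpk={cpk:.2f}")
```

A factor tolerance is "permitted" in this sense when the resulting Cpk stays above the chosen capability threshold (often 1.33).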

  15. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  16. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    PubMed

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one using Monte Carlo (MC) derived data, for an 18 MV Varian 2100 Clinac accelerator. High energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutrons has been recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (safety report No. 47) and NCRP report No. 151 were used for the bunker design calculations, and MC-based data were also derived. Two bunkers, one from the protocols and one from the MC data, were designed and discussed. Regarding door thickness, the doors designed by MC simulation and by the Wu-McGinley analytical method were close in both BPE and lead thickness. For the primary and secondary barriers, MC simulation resulted in 440.11 mm of ordinary concrete; a total concrete thickness of 1709 mm was required. Calculating the same parameter values with the recommended analytical methods resulted in a required thickness of 1762 mm, using the recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that MC simulation and the protocol recommendations are in good agreement in the contamination dose calculation. The differences between the analytical and MC simulation methods revealed that applying only one method to bunker design may lead to underestimation or overestimation in dose and shielding calculations.
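    The protocol side of such a comparison follows the NCRP Report No. 151 formalism: compute the required barrier transmission from the workload and shielding design goal, convert it to a number of tenth-value layers (TVLs), and sum the TVL thicknesses. The numbers below (design goal, workload, distance, TVL values) are illustrative placeholders, not the inputs used in the study.

```python
import math

P = 0.02e-3   # shielding design goal, Sv per week (uncontrolled area, illustrative)
W = 500.0     # workload, Gy per week at 1 m (illustrative)
U = 0.25      # use factor for this barrier
T = 1.0       # occupancy factor
d = 6.0       # target-to-point distance, m

B = P * d * d / (W * U * T)          # required barrier transmission
n_tvl = -math.log10(B)               # number of tenth-value layers needed
TVL1, TVLe = 0.445, 0.430            # first/equilibrium concrete TVLs, m (approximate, 18 MV)
thickness = TVL1 + (n_tvl - 1.0) * TVLe
print(f"B={B:.2e}, n_TVL={n_tvl:.2f}, concrete thickness={thickness * 1000:.0f} mm")
```

The study's point is that this analytical route and a full MC transport calculation should be cross-checked against each other before a barrier thickness is committed to.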

  17. On finding the analytic dependencies of the external field potential on the control function when optimizing the beam dynamics

    NASA Astrophysics Data System (ADS)

    Ovsyannikov, A. D.; Kozynchenko, S. A.; Kozynchenko, V. A.

    2017-12-01

    When developing a particle accelerator for generating high-precision beams, the design of the injection system is important, because it largely determines the output characteristics of the beam. In the present paper we consider injection systems consisting of electrodes with given potentials. The design of such systems requires simulation of beam dynamics in electrostatic fields. For external field simulation we use a new approach, proposed by A. D. Ovsyannikov, which is based on analytical approximations, or the finite difference method, taking into account the real geometry of the injection system. Software for beam dynamics simulation and optimization in the injection system for non-relativistic beams has been developed. Both beam dynamics and electric field simulations in the injection system, using the analytical approach and the finite difference method, have been carried out, and the results are presented in this paper.
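    The finite difference route mentioned above can be sketched as a Jacobi relaxation of Laplace's equation with fixed electrode potentials. The geometry below (a 2D box with a 1 kV electrode strip on one wall) is hypothetical and stands in for the real injection-system geometry.

```python
import numpy as np

n = 41
phi = np.zeros((n, n))
phi[0, 10:30] = 1000.0           # electrode strip held at 1 kV (hypothetical)
fixed = np.zeros_like(phi, dtype=bool)
fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True

for _ in range(5000):            # Jacobi relaxation of Laplace's equation
    new = phi.copy()
    new[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                              + phi[1:-1, :-2] + phi[1:-1, 2:])
    new[fixed] = phi[fixed]      # keep electrode/boundary potentials clamped
    if np.max(np.abs(new - phi)) < 1e-8:
        phi = new
        break
    phi = new

# the electric field follows from the potential by finite differences
Ey, Ex = np.gradient(-phi)
print(f"potential at box centre: {phi[n // 2, n // 2]:.1f} V")
```

A production solver would use a faster iteration (SOR, multigrid) and a mesh fitted to the electrode geometry, but the stencil is the same; the field arrays would then drive the beam-dynamics integration.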

  18. Analysis and Experimental Investigation of Optimum Design of Thermoelectric Cooling/Heating System for Car Seat Climate Control (CSCC)

    NASA Astrophysics Data System (ADS)

    Elarusi, Abdulmunaem; Attar, Alaa; Lee, HoSung

    2018-02-01

    The optimum design of a thermoelectric system for car seat climate control has been modeled and its performance evaluated experimentally. The optimum design of the thermoelectric device, combining two heat exchangers, was obtained using a newly developed optimization method based on a dimensional analysis technique. Based on the analytical optimum design results, a commercial thermoelectric cooler and heat sinks were selected to design and construct the climate control heat pump. This work focuses on testing the system performance in both cooling and heating modes to verify the analytical modeling. Although the analytical performance was calculated using the simple ideal thermoelectric equations with effective thermoelectric material properties, it showed very good agreement with experiment for most operating conditions.
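    The "simple ideal thermoelectric equations" referred to above are the standard module-level relations balancing Peltier cooling against Joule heating and back-conduction. The effective module parameters below are illustrative placeholders, not the paper's measured values.

```python
alpha = 0.05   # effective Seebeck coefficient, V/K (module level, illustrative)
R = 2.0        # effective electrical resistance, ohm (illustrative)
K = 0.5        # effective thermal conductance, W/K (illustrative)
Tc, Th = 290.0, 310.0
dT = Th - Tc

def cooling_power(I):
    # Peltier cooling minus half the Joule heat and the back-conduction
    return alpha * I * Tc - 0.5 * I * I * R - K * dT

def electrical_power(I):
    # Seebeck back-EMF work plus Joule dissipation
    return alpha * I * dT + I * I * R

I_opt = alpha * Tc / R          # current that maximizes cooling power
Qc = cooling_power(I_opt)
cop = Qc / electrical_power(I_opt)
print(f"I_opt={I_opt:.2f} A, Qc={Qc:.2f} W, COP={cop:.2f}")
```

The "effective" properties absorb contact and interconnect losses, which is why such a simple model can track a real module; the heating mode adds the Peltier and Joule terms instead of subtracting them.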

  19. A new method for designing dual foil electron beam forming systems. I. Introduction, concept of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor intensive task, as corrections accounting for the effects not included in the analytical models have to be calculated separately and incorporated in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of the system performance as a function of the foil parameters. The new method, while computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real-life design problem, as described in Part II of this work.

  20. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative visual analytics system for biologists, focusing on analyzing the relationship between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable to that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  1. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example of the novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standardized partial regression coefficient method. The probability-based design space was calculated using a Monte Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and it was then applied to the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis objects with complex compositions.
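    A probability-based design space of the kind computed above can be sketched as follows: at each candidate operating point, Monte Carlo samples of the model/measurement uncertainty give the probability that the CMA meets its acceptance criterion, and the design space is the set of points where that probability clears a threshold. The regression model, noise level, and acceptance window below are hypothetical stand-ins.

```python
import random

random.seed(1)

def cma_retention(grad_slope, noise):
    # hypothetical fitted model for a critical peak's retention time (min)
    return 12.0 - 0.8 * grad_slope + noise

def prob_meets_spec(grad_slope, trials=2000):
    # Monte Carlo estimate of P(CMA within its acceptance window)
    ok = 0
    for _ in range(trials):
        rt = cma_retention(grad_slope, random.gauss(0.0, 0.3))
        if 8.0 <= rt <= 11.0:           # hypothetical acceptance window
            ok += 1
    return ok / trials

design_space = {s: prob_meets_spec(s) for s in (1.0, 2.0, 3.0, 4.0, 5.0)}
accepted = [s for s, p in design_space.items() if p >= 0.9]
print("P(meet spec):", design_space)
print("gradient slopes inside the design space:", accepted)
```

With several CMAs and CMPs the same loop runs over a multidimensional grid and requires all CMAs to pass jointly, which is what shrinks the probability-based design space relative to a mean-response design space.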

  2. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations

    PubMed Central

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    Aim The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one using Monte Carlo (MC) derived data, for an 18 MV Varian 2100 Clinac accelerator. Background High energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutrons has been recommended by the new IAEA and NCRP protocols. Materials and methods The latest protocols released by the IAEA (safety report No. 47) and NCRP report No. 151 were used for the bunker design calculations, and MC-based data were also derived. Two bunkers, one from the protocols and one from the MC data, were designed and discussed. Results Regarding door thickness, the doors designed by MC simulation and by the Wu–McGinley analytical method were close in both BPE and lead thickness. For the primary and secondary barriers, MC simulation resulted in 440.11 mm of ordinary concrete; a total concrete thickness of 1709 mm was required. Calculating the same parameter values with the recommended analytical methods resulted in a required thickness of 1762 mm, using the recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Conclusion Our results showed that MC simulation and the protocol recommendations are in good agreement in the contamination dose calculation. The differences between the analytical and MC simulation methods revealed that applying only one method to bunker design may lead to underestimation or overestimation in dose and shielding calculations. PMID:26900357

  3. Nanomaterials-based biosensors for detection of microorganisms and microbial toxins.

    PubMed

    Sutarlie, Laura; Ow, Sian Yang; Su, Xiaodi

    2017-04-01

    Detection of microorganisms and microbial toxins is important for health and safety. Due to their unique physical and chemical properties, nanomaterials have been extensively used to develop biosensors for rapid detection of microorganisms, with microbial cells and toxins as target analytes. In this paper, the design principles of nanomaterials-based biosensors for four selected analyte categories (bacteria cells, toxins, mycotoxins, and protozoa cells), closely associated with the target analytes' properties, are reviewed. Five signal transducing methods that are less equipment intensive (colorimetric, fluorimetric, surface-enhanced Raman scattering, electrochemical, and magnetic relaxometry methods) are described and compared for their sensory performance (in terms of limit of detection, dynamic range, and response time) for all analyte categories. Finally, the suitability of these five sensing principles for on-site or field applications is discussed. With a comprehensive coverage of nanomaterials, design principles, sensing principles, and assessment of the sensory performance and suitability for on-site application, this review offers valuable insight and perspective for designing suitable nanomaterials-based microorganism biosensors for a given application. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Improving Sample Distribution Homogeneity in Three-Dimensional Microfluidic Paper-Based Analytical Devices by Rational Device Design.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Milan, Luis Aparecido; Stockton, Amanda M; Carrilho, Emanuel

    2017-05-02

    Paper-based devices are portable, user-friendly, and affordable, making them one of the best analytical technologies for inexpensive diagnostics. Three-dimensional microfluidic paper-based analytical devices (3D-μPADs) are an evolution of single-layer devices; they permit effective sample dispersion, individual layer treatment, and multiplexed analytical assays. Here, we present the rational design of a wax-printed 3D-μPAD that enables more homogeneous permeation of fluids along the cellulose matrix than other existing designs in the literature. Moreover, we show the importance of the rational design of channels on these devices using glucose oxidase, peroxidase, and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) reactions. We present an alternative method for layer stacking using a magnetic apparatus, which facilitates fluidic dispersion and improves the reproducibility of tests performed on 3D-μPADs. We also provide the optimized designs for printing, facilitating further studies using 3D-μPADs.

  5. An improved method for predicting the lightning performance of high and extra-high-voltage substation shielding

    NASA Astrophysics Data System (ADS)

    Vinh, T.

    1980-08-01

    There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering usage. In this study, analytical lightning models were developed, along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra-high-voltage substations from direct strokes.
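    Analytical substation shielding models of this kind typically build on an electrogeometric relation between stroke current and striking distance. As a generic textbook illustration (not the specific model developed in this study), Love's widely quoted form r_s = 10·I^0.65, with r_s in metres and I in kA, gives:

```python
def striking_distance_m(current_kA):
    # Love's electrogeometric relation (generic textbook form, not this study's model)
    return 10.0 * current_kA ** 0.65

# Shield wires placed so their arcs of radius r_s cover the bus protect against
# strokes at or above the design current; smaller strokes may slip past, which
# is the shielding-failure regime that the statistical analysis quantifies.
for I in (5.0, 10.0, 20.0):
    print(f"I = {I:4.1f} kA -> r_s = {striking_distance_m(I):5.1f} m")
```

Combining such a geometric criterion with the statistical distribution of stroke currents is what turns a shielding layout into a predicted shielding-failure rate.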

  6. Progress and development of analytical methods for gibberellins.

    PubMed

    Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya

    2017-01-01

    Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions within plant growth and development, and have been used to increase crop yields. Many analytical procedures, therefore, have been developed for the determination of the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanograms per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements for gibberellin analyses in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view of the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Analytical design of a hyper-spectral imaging spectrometer utilizing a convex grating

    NASA Astrophysics Data System (ADS)

    Kim, Seo H.; Kong, Hong J.; Ku, Hana; Lee, Jun H.

    2012-09-01

    This paper describes a new design method for hyper-spectral imaging spectrometers utilizing a convex grating. Hyper-spectral imaging (HSI) systems are powerful tools in the field of remote sensing. HSI systems collect at least 100 spectral bands of 10-20 nm width. Because the spectral signature is different and unique for each material, it is possible to discriminate between one material and another based on differences in their spectral signatures. I mathematically analyzed the parameters needed for an informed initial design. The main concept is the derivation of the "ring of minimum aberration without vignetting". This work is a form of analytical design of an Offner imaging spectrometer. Several experimental methods are also devised to evaluate the performance of the imaging spectrometer.

  8. Recent Transonic Flutter Investigations for Wings and External Stores

    DTIC Science & Technology

    1983-01-01

    and difficult method? In the early days of high-speed aircraft design, the aeroelastician realized that non-compressible aerodynamic theory and... experimental aeroelastic model program that would provide insight into the effects of Reynolds number and angle of attack on various airfoil designs regarding... investigation is carried out both experimentally and analytically. The analytic modelling will be described in a later section. The flutter calculations

  9. Calculated and measured fields in superferric wiggler magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blum, E.B.; Solomon, L.

    1995-02-01

    Although Klaus Halbach is widely known and appreciated as the originator of the computer program POISSON for electromagnetic field calculation, Klaus has always believed that analytical methods can give much more insight into the performance of a magnet than numerical simulation. Analytical approximations readily show how the different aspects of a magnet's design, such as pole dimensions, current, and coil configuration, contribute to the performance. These methods yield accuracies of better than 10%. Analytical methods should therefore be used when conceptualizing a magnet design; computer analysis can then be used for refinement. A simple model is presented for the peak on-axis field of an electromagnetic wiggler with iron poles and superconducting coils. The model is applied to the radiator section of the superconducting wiggler for the BNL Harmonic Generation Free Electron Laser. The predictions of the model are compared to the measured field and the results from POISSON.

  10. Design optimization of piezoresistive cantilevers for force sensing in air and water

    PubMed Central

    Doll, Joseph C.; Park, Sung-Jin; Pruitt, Beth L.

    2009-01-01

    Piezoresistive cantilevers fabricated from doped silicon or metal films are commonly used for force, topography, and chemical sensing at the micro- and macroscales. Proper design is required to optimize the achievable resolution by maximizing sensitivity while simultaneously minimizing the integrated noise over the bandwidth of interest. Existing analytical design methods are insufficient for modeling complex dopant profiles, design constraints, and nonlinear phenomena such as damping in fluid. Here we present an optimization method based on an analytical piezoresistive cantilever model. We use an existing iterative optimizer to minimize a performance goal, such as minimum detectable force. The design tool is available as open source software. Optimal cantilever design and performance are found to strongly depend on the measurement bandwidth and the constraints applied. We discuss results for silicon piezoresistors fabricated by epitaxy and diffusion, but the method can be applied to any dopant profile or material which can be modeled in a similar fashion or extended to other microelectromechanical systems. PMID:19865512
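    The "integrated noise over the bandwidth of interest" step can be sketched with the standard Johnson and Hooge (1/f) noise expressions: integrate both spectral densities over the measurement band, then divide the total RMS voltage noise by the force sensitivity. All device numbers below are illustrative, not values from the paper's fabricated cantilevers.

```python
import math

kB, T = 1.380649e-23, 300.0
R = 10e3        # piezoresistor resistance, ohm (illustrative)
Vb = 2.0        # bias across the resistor, V (illustrative)
alpha = 2e-5    # Hooge parameter, dimensionless (illustrative)
N = 1e10        # number of carriers in the resistor (illustrative)
f_lo, f_hi = 10.0, 1000.0   # measurement bandwidth, Hz
S_VF = 100.0    # force sensitivity, V/N (illustrative)

vj2 = 4 * kB * T * R * (f_hi - f_lo)              # integrated Johnson noise, V^2
vh2 = alpha * Vb ** 2 / N * math.log(f_hi / f_lo)  # integrated Hooge 1/f noise, V^2
v_rms = math.sqrt(vj2 + vh2)
F_min = v_rms / S_VF                              # minimum detectable force, N
print(f"integrated noise = {v_rms * 1e9:.0f} nV, "
      f"min detectable force = {F_min * 1e9:.1f} nN")
```

Because the 1/f term grows with bias while the sensitivity does too, the optimum bias, doping, and geometry all shift with the chosen bandwidth, which is why the paper couples this integral to an iterative optimizer.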

  11. Recursive linearization of multibody dynamics equations of motion

    NASA Technical Reports Server (NTRS)

    Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

    The equations of motion of a multibody system are nonlinear in nature, and thus pose a difficult problem in linear control design. One approach is to make a first-order approximation through numerical perturbations at a given configuration, and to design a control law based on the linearized model. Here, a linearized model is generated analytically by following the steps of the recursive derivation of the equations of motion. The equations of motion are first written in a Newton-Euler form, which is systematic and easy to construct; then, they are transformed into a relative coordinate representation, which is more efficient in computation. A new computational method for linearization is obtained by applying a series of first-order analytical approximations to the recursive kinematic relationships. The method has proved to be computationally more efficient because of its recursive nature. It has also turned out to be more accurate because analytical perturbation circumvents numerical differentiation and other associated numerical operations that may accumulate computational error, requiring only analytical operations on matrices and vectors. The power of the proposed linearization algorithm is demonstrated, in comparison to a numerical perturbation method, with a two-link manipulator and a seven-degree-of-freedom robotic manipulator. Its application to control design is also demonstrated.
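    The analytic-versus-numerical-perturbation comparison can be shown on a toy system. The paper treats full recursive multibody equations; a single undamped pendulum is only a stand-in here to show the two linearization routes agreeing at an operating point.

```python
import math

g, L = 9.81, 1.0

def f(theta, omega):
    # state derivative of an undamped pendulum: (theta_dot, omega_dot)
    return (omega, -(g / L) * math.sin(theta))

theta0, omega0 = 0.3, 0.0      # operating point for the linearization

# analytical Jacobian of f at the operating point
A_analytic = [[0.0, 1.0],
              [-(g / L) * math.cos(theta0), 0.0]]

# numerical perturbation (central differences), the route the paper compares against
eps = 1e-6
A_numeric = [[0.0, 0.0], [0.0, 0.0]]
for j, dx in enumerate(((eps, 0.0), (0.0, eps))):
    fp = f(theta0 + dx[0], omega0 + dx[1])
    fm = f(theta0 - dx[0], omega0 - dx[1])
    for i in range(2):
        A_numeric[i][j] = (fp[i] - fm[i]) / (2 * eps)

err = max(abs(A_analytic[i][j] - A_numeric[i][j])
          for i in range(2) for j in range(2))
print(f"max Jacobian discrepancy: {err:.2e}")
```

For this smooth toy system the two Jacobians agree to near machine precision; the paper's point is that on large multibody models the analytical route avoids the step-size tuning and error accumulation of the numerical one.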

  12. Flight Test Experiment Design for Characterizing Stability and Control of Hypersonic Vehicles

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2008-01-01

    A maneuver design method that is particularly well-suited for determining the stability and control characteristics of hypersonic vehicles is described in detail. Analytical properties of the maneuver design are explained. The importance of these analytical properties for maximizing information content in flight data is discussed, along with practical implementation issues. Results from flight tests of the X-43A hypersonic research vehicle (also called Hyper-X) are used to demonstrate the excellent modeling results obtained using this maneuver design approach. A detailed design procedure for generating the maneuvers is given to allow application to other flight test programs.
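    A hallmark of Morelli's maneuver designs is mutually orthogonal multisine inputs: each control effector is assigned a disjoint set of harmonics of a common base period, which makes the inputs orthogonal in time and lets all effectors be excited simultaneously. Whether or not this is the exact design used on Hyper-X, the core property is easy to sketch (phases here are simple placeholders, not the phase-optimized values used to reduce peak factor):

```python
import math

T = 20.0                      # maneuver length / base period, s (hypothetical)
n_samp = 2000
dt = T / n_samp
t = [k * dt for k in range(n_samp)]

def multisine(harmonics):
    # unit-amplitude sum of sines at the given harmonics of 1/T
    return [sum(math.sin(2 * math.pi * m * tau / T) for m in harmonics)
            for tau in t]

u_elevator = multisine((1, 4, 7, 10))   # disjoint harmonic sets per input
u_aileron = multisine((2, 5, 8, 11))
u_rudder = multisine((3, 6, 9, 12))

def inner(u, v):
    # time-domain inner product over the maneuver
    return sum(a * b for a, b in zip(u, v)) * dt

print(f"<elev, ail>  = {inner(u_elevator, u_aileron):.2e}")   # ~0: orthogonal
print(f"<elev, elev> = {inner(u_elevator, u_elevator):.2f}")
```

The orthogonality is what maximizes information content per unit flight time: parameter estimates for each axis can be separated cleanly even though the surfaces move together.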

  13. High Throughput Determination of Ricinine, Abrine, and Alpha ...

    EPA Pesticide Factsheets

    Analytical Method: This document provides the standard operating procedure for determination of ricinine (RIC), abrine (ABR), and α-amanitin (AMAN) in drinking water by isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS). This method is designed to support site-specific cleanup goals of environmental remediation activities following a homeland security incident involving one or a combination of these analytes.

  14. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  15. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  16. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
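    A minimal particle swarm optimizer of the kind used above can be sketched as follows. The objective here is a stand-in test function (a sphere minimum at 1.5 in each dimension), not an MEC torque-density model; in the paper's workflow the objective call would evaluate the magnetic equivalent circuit for a candidate (pole length, magnet length, rotor thickness).

```python
import random

random.seed(42)

def objective(x):
    # stand-in for "negative torque density from the MEC model"
    return sum((xi - 1.5) ** 2 for xi in x)

dim, n_particles, iters = 3, 20, 200
w, c1, c2 = 0.7, 1.5, 1.5       # inertia and cognitive/social acceleration
pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]                      # each particle's best point
gbest = min(pbest, key=objective)[:]             # swarm's best point

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
            if objective(pbest[i]) < objective(gbest):
                gbest = pbest[i][:]

print(f"best design found: {[round(x, 3) for x in gbest]}, "
      f"objective = {objective(gbest):.2e}")
```

Because each objective evaluation is an analytical MEC solve rather than an FEA run, the swarm can afford thousands of evaluations, which is where the reported time savings come from.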

  18. [Continual improvement of quantitative analytical method development of Panax notoginseng saponins based on quality by design].

    PubMed

    Dai, Sheng-Yun; Xu, Bing; Shi, Xin-Yuan; Xu, Xiang; Sun, Ying-Qiang; Qiao, Yan-Jiang

    2017-03-01

    This study proposes a continual improvement strategy based on quality by design (QbD). An ultra high performance liquid chromatography (UPLC) method was developed to accomplish the method transfer from HPLC to UPLC for Panax notoginseng saponins (PNS), serving as an example of QbD-based continual improvement. Plackett-Burman screening and Box-Behnken optimization designs were employed to further understand the relationship between the critical method parameters (CMPs) and critical method attributes (CMAs), and a Bayesian design space was then built. With a method chosen from the design space (20% initial acetonitrile concentration, 10 min isocratic time, and a gradient slope of 6%·min⁻¹), the separation degree of the critical peaks (ginsenoside Rg₁ and ginsenoside Re) exceeded 2.0 and the analysis time was less than 17 min. Finally, the optimized method was validated by an accuracy profile. Based on the same analytical target profile (ATP), a comparison of HPLC and UPLC covering the chromatographic method, CMA identification, the CMP-CMA model, and the system suitability test (SST) indicated that the UPLC method shortened the analysis time, improved the critical separation, and satisfied the SST requirement. In all, the HPLC method can be replaced by UPLC for the quantitative analysis of PNS. Copyright© by the Chinese Pharmaceutical Association.
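    The Box-Behnken design used above has a simple mechanical construction: every pair of factors visits its four corner combinations while the remaining factors sit at the center level, plus replicated center runs. A minimal sketch in coded units (-1, 0, +1); the choice of three center points is an assumption, not taken from the paper:

```python
from itertools import combinations

# Box-Behnken run matrix in coded units (-1, 0, +1): each factor pair at its
# four corner combinations with the rest held at 0, plus center points.
# Three center points is an assumed choice, not taken from the paper.
def box_behnken(k, n_center=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([[0] * k for _ in range(n_center)])
    return runs

design = box_behnken(3)   # 12 edge runs + 3 center runs for 3 CMPs
```

    For three CMPs this yields 15 runs, enough to fit the quadratic CMP-CMA model from which a design space can be derived.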

  19. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1994-01-01

    The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The usage of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e. three dimensional) at transonic flow conditions. (4) The incremental iterative technique has been applied to the three dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; and coupled aeroelastic results for an aspect ratio three wing in transonic flow have been obtained.
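    The quasi-analytical idea in item (1) can be illustrated on a toy implicit model. This is a sketch under a stand-in scalar residual R(u, p) = 0 (not a transonic flow model): implicit differentiation gives du/dp = -(∂R/∂u)⁻¹ ∂R/∂p from a single converged solution, whereas finite differencing must re-converge the flow at perturbed parameter values:

```python
# Toy implicit "flowfield": R(u, p) = u^3 + p*u - 4 = 0, with state u and
# design variable p. Not a transonic model; only the structure of the idea.
def residual(u, p):
    return u ** 3 + p * u - 4.0

def solve(p, u=1.0):
    # Newton iteration drives the residual to machine precision
    for _ in range(50):
        u -= residual(u, p) / (3 * u ** 2 + p)
    return u

def sensitivity_analytic(p):
    # quasi-analytical: du/dp = -(dR/du)^(-1) * dR/dp at the converged state
    u = solve(p)
    return -u / (3 * u ** 2 + p)

def sensitivity_fd(p, h=1e-6):
    # finite difference: needs two extra converged solutions per parameter
    return (solve(p + h) - solve(p - h)) / (2 * h)
```

    The two estimates agree closely, but the analytic route reuses the factored Jacobian from the converged solve, which is where the efficiency advantage over finite differencing comes from as the number of design variables grows.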

  20. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    PubMed

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). 
Total fluorine results may be used to determine if the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery as well as some additional testing would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.

  1. Recursively constructing analytic expressions for equilibrium distributions of stochastic biochemical reaction networks.

    PubMed

    Meng, X Flora; Baetica, Ania-Ariadna; Singhal, Vipul; Murray, Richard M

    2017-05-01

    Noise is often indispensable to key cellular activities, such as gene expression, necessitating the use of stochastic models to capture its dynamics. The chemical master equation (CME) is a commonly used stochastic model of Kolmogorov forward equations that describe how the probability distribution of a chemically reacting system varies with time. Finding analytic solutions to the CME can have benefits, such as expediting simulations of multiscale biochemical reaction networks and aiding the design of distributional responses. However, analytic solutions are rarely known. A recent method of computing analytic stationary solutions relies on gluing simple state spaces together recursively at one or two states. We explore the capabilities of this method and introduce algorithms to derive analytic stationary solutions to the CME. We first formally characterize state spaces that can be constructed by performing single-state gluing of paths, cycles or both sequentially. We then study stochastic biochemical reaction networks that consist of reversible, elementary reactions with two-dimensional state spaces. We also discuss extending the method to infinite state spaces and designing the stationary behaviour of stochastic biochemical reaction networks. Finally, we illustrate the aforementioned ideas using examples that include two interconnected transcriptional components and biochemical reactions with two-dimensional state spaces. © 2017 The Author(s).
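    For the simplest glued state space, a path (a birth-death chain), the stationary CME solution is available in closed form from detailed balance: π(k+1)/π(k) = birth(k)/death(k+1). A minimal sketch with illustrative rates (constant production, linear degradation), not drawn from the paper's examples:

```python
# Detailed balance on a path state space: pi_{k+1}/pi_k = birth(k)/death(k+1).
# Rates are illustrative: constant production 1.0, degradation 2.0 per molecule.
def birth_death_stationary(birth, death, n_states):
    pi = [1.0]
    for k in range(n_states - 1):
        pi.append(pi[-1] * birth(k) / death(k + 1))
    total = sum(pi)
    return [p / total for p in pi]

pi = birth_death_stationary(lambda k: 1.0, lambda k: 2.0 * k, 10)
```

    With these rates the result is a (truncated) Poisson distribution with mean 0.5; the gluing constructions in the paper extend this kind of one-step recursion to richer state spaces.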

  2. Recursively constructing analytic expressions for equilibrium distributions of stochastic biochemical reaction networks

    PubMed Central

    Baetica, Ania-Ariadna; Singhal, Vipul; Murray, Richard M.

    2017-01-01

    Noise is often indispensable to key cellular activities, such as gene expression, necessitating the use of stochastic models to capture its dynamics. The chemical master equation (CME) is a commonly used stochastic model of Kolmogorov forward equations that describe how the probability distribution of a chemically reacting system varies with time. Finding analytic solutions to the CME can have benefits, such as expediting simulations of multiscale biochemical reaction networks and aiding the design of distributional responses. However, analytic solutions are rarely known. A recent method of computing analytic stationary solutions relies on gluing simple state spaces together recursively at one or two states. We explore the capabilities of this method and introduce algorithms to derive analytic stationary solutions to the CME. We first formally characterize state spaces that can be constructed by performing single-state gluing of paths, cycles or both sequentially. We then study stochastic biochemical reaction networks that consist of reversible, elementary reactions with two-dimensional state spaces. We also discuss extending the method to infinite state spaces and designing the stationary behaviour of stochastic biochemical reaction networks. Finally, we illustrate the aforementioned ideas using examples that include two interconnected transcriptional components and biochemical reactions with two-dimensional state spaces. PMID:28566513

  3. Strategic assay deployment as a method for countering analytical bottlenecks in high throughput process development: case studies in ion exchange chromatography.

    PubMed

    Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel

    2012-01-01

    High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96 well plates can place a considerable strain upon assay capabilities, generating a bottleneck that significantly limits the speed of process characterization. This article proposes an approach designed to counter this challenge: Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which set of techniques is the most appropriate to use and how best to deploy these to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
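    The deployment decision in SAD can be viewed as a coverage-versus-cost selection problem. Below is a hedged sketch of one plausible greedy formulation; the assay names, per-plate times, and attribute sets are invented placeholders, and the paper's actual decision procedure may differ:

```python
def select_assays(assays, required):
    """Greedily pick assays until all required attributes are covered.

    assays: {name: (minutes_per_plate, set_of_attributes_measured)}
    Assumes `required` is coverable by the union of the assay sets.
    """
    chosen, covered = [], set()
    while not required <= covered:
        # greedy: best new-attribute coverage per minute of assay time
        name = max((a for a in assays if assays[a][1] - covered),
                   key=lambda a: len(assays[a][1] - covered) / assays[a][0])
        chosen.append(name)
        covered |= assays[name][1]
    return chosen

# Invented example assays: absorbance, size exclusion, fluorescence readout
assays = {"A280": (5, {"conc"}),
          "SEC": (30, {"conc", "aggregates"}),
          "fluor": (10, {"activity"})}
chosen = select_assays(assays, {"conc", "aggregates", "activity"})
```

    The point of the sketch is only the trade-off SAD formalizes: a cheap assay that covers one attribute can displace a slower assay that covers the same attribute, shrinking total analytical time without losing characterization completeness.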

  4. Evaluation of a hydrophilic interaction liquid chromatography design space for sugars and sugar alcohols.

    PubMed

    Hetrick, Evan M; Kramer, Timothy T; Risley, Donald S

    2017-03-17

    Based on a column-screening exercise, a column ranking system was developed for sample mixtures containing any combination of 26 sugar and sugar alcohol analytes using 16 polar stationary phases in the HILIC mode with acetonitrile/water or acetone/water mobile phases. Each analyte was evaluated on the HILIC columns with gradient elution and the subsequent chromatography data was compiled into a statistical software package where any subset of the analytes can be selected and the columns are then ranked by the greatest separation. Since these analytes lack chromophores, aerosol-based detectors, including an evaporative light scattering detector (ELSD) and a charged aerosol detector (CAD) were employed for qualitative and quantitative detection. Example qualitative applications are provided to illustrate the practicality and efficiency of this HILIC column ranking. Furthermore, the design-space approach was used as a starting point for a quantitative method for the trace analysis of glucose in trehalose samples in a complex matrix. Knowledge gained from evaluating the design-space led to rapid development of a capable method as demonstrated through validation of the following parameters: specificity, accuracy, precision, linearity, limit of quantitation, limit of detection, and range. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Analytical display design for flight tasks conducted under instrument meteorological conditions. [human factors engineering of pilot performance for display device design in instrument landing systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1976-01-01

    Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.

  6. Use of LC-HRMS in full scan-XIC mode for multi-analyte urine drug testing - a step towards a 'black-box' solution?

    PubMed

    Stephanson, N N; Signell, P; Helander, A; Beck, O

    2017-08-01

    The influx of new psychoactive substances (NPS) has created a need for improved methods for drug testing in toxicology laboratories. The aim of this work was to design, validate and apply a multi-analyte liquid chromatography-high-resolution mass spectrometry (LC-HRMS) method for screening of 148 target analytes belonging to the NPS class, plant alkaloids and new psychoactive therapeutic drugs. The analytical method used a fivefold dilution of urine with nine deuterated internal standards and injection of 2 μl. The LC system involved a 2.0 μm 100 × 2.0 mm YMC-UltraHT Hydrosphere-C 18 column and gradient elution with a flow rate of 0.5 ml/min and a total analysis time of 6.0 min. Solvent A consisted of 10 mmol/l ammonium formate and 0.005% formic acid, pH 4.8, and Solvent B was methanol with 10 mmol/l ammonium formate and 0.005% formic acid. The HRMS (Q Exactive, Thermo Scientific) used a heated electrospray interface and was operated in positive mode with 70 000 resolution. The scan range was 100-650 Da, and data for extracted ion chromatograms used ± 10 ppm tolerance. Product ion monitoring was applied for confirmation analysis and for some selected analytes also for screening. Method validation demonstrated limited influence from urine matrix, linear response within the measuring range (typically 0.1-1.0 μg/ml) and acceptable imprecision in quantification (CV <15%). A few analytes were found to be unstable in urine upon storage. The method was successfully applied for routine drug testing of 17 936 unknown samples, of which 2715 (15%) contained 52 of the 148 analytes. It is concluded that the method design based on simple dilution of urine and using LC-HRMS in extracted ion chromatogram mode may offer an analytical system for urine drug testing that fulfils the requirement of a 'black box' solution and can replace immunochemical screening applied on autoanalyzers. Copyright © 2017 John Wiley & Sons, Ltd.
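    The ±10 ppm extracted-ion matching at the core of the screening step is easy to sketch. The [M+H]+ target masses below are illustrative stand-ins, not the method's 148-analyte list:

```python
# Illustrative [M+H]+ targets; the real method screens 148 analytes.
TARGETS = {"mephedrone_MH+": 178.1226, "mitragynine_MH+": 399.2278}

def within_ppm(observed, target, ppm=10.0):
    # relative mass error in parts per million
    return abs(observed - target) / target * 1e6 <= ppm

def screen(mz_values, targets=TARGETS, ppm=10.0):
    # extracted-ion style matching: collect observed m/z within tolerance
    return {name: [mz for mz in mz_values if within_ppm(mz, t, ppm)]
            for name, t in targets.items()}
```

    A 10 ppm window at m/z 178 is only about 0.0018 Da wide, which is why the approach needs the 70 000 resolution of the Orbitrap-class instrument described above.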

  7. Big–deep–smart data in imaging for guiding materials design

    DOE PAGES

    Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.

    2015-09-23

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  8. Big-deep-smart data in imaging for guiding materials design.

    PubMed

    Kalinin, Sergei V; Sumpter, Bobby G; Archibald, Richard K

    2015-10-01

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  9. Big-deep-smart data in imaging for guiding materials design

    NASA Astrophysics Data System (ADS)

    Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.

    2015-10-01

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  10. Big–deep–smart data in imaging for guiding materials design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  11. Thermal/structural design verification strategies for large space structures

    NASA Technical Reports Server (NTRS)

    Benton, David

    1988-01-01

    Requirements for space structures of increasing size, complexity, and precision have engendered a search for thermal design verification methods that do not impose unreasonable costs, that fit within the capabilities of existing facilities, and that still adequately reduce technical risk. Meeting these requirements calls for a combination of analytical and testing methods, pursued through two approaches. The first is to limit thermal testing to sub-elements of the total system only in a compact configuration (i.e., not fully deployed). The second is to use a simplified environment to correlate analytical models with test results. These models can then be used to predict flight performance. In practice, a combination of these approaches is needed to verify the thermal/structural design of future very large space systems.

  12. Development of variable LRFD φ factors for deep foundation design due to site variability.

    DOT National Transportation Integrated Search

    2012-04-01

    The current design guidelines for Load and Resistance Factor Design (LRFD) specify constant values for deep foundation design, based on the analytical method selected and the degree of redundancy of the pier. However, investigation of multiple sites in ...

  13. On the calculation of resonances by analytic continuation of eigenvalues from the stabilization graph

    NASA Astrophysics Data System (ADS)

    Haritan, Idan; Moiseyev, Nimrod

    2017-07-01

    Resonances play a major role in a large variety of fields in physics and chemistry. Accordingly, there is a growing interest in methods designed to calculate them. Recently, Landau et al. proposed a new approach to analytically dilate a single eigenvalue from the stabilization graph into the complex plane. This approach, termed Resonances Via Padé (RVP), utilizes the Padé approximant and is based on a unique analysis of the stabilization graph. Yet, analytic continuation of eigenvalues from the stabilization graph into the complex plane is not a new idea. In 1975, Jordan suggested an analytic continuation method based on the branch point structure of the stabilization graph. The method was later modified by McCurdy and McNutt, and it is still being used today. We refer to this method as the Truncated Characteristic Polynomial (TCP) method. In this manuscript, we perform an in-depth comparison between the RVP and the TCP methods. We demonstrate that while both methods are important and complementary, the advantage of one method over the other is problem-dependent. Illustrative examples are provided in the manuscript.

  14. Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students

    ERIC Educational Resources Information Center

    Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.

    2014-01-01

    A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…

  15. Results of an integrated structure-control law design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1988-01-01

    Next generation air and space vehicle designs are driven by increased performance requirements, demanding a high level of design integration between traditionally separate design disciplines. Interdisciplinary analysis capabilities have been developed, for aeroservoelastic aircraft and large flexible spacecraft control for instance, but the requisite integrated design methods are only beginning to be developed. One integrated design method which has received attention is based on hierarchical problem decompositions, optimization, and design sensitivity analyses. This paper highlights a design sensitivity analysis method for Linear Quadratic Gaussian (LQG) optimal control laws, which predicts the change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if a parameter were to take some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing with the predicted response. These results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for computing the equivalent sensitivity information.

  16. Recent Progresses in Nanobiosensing for Food Safety Analysis

    PubMed Central

    Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen

    2016-01-01

    With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes various function types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014–present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to various recognition methods of each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly. PMID:27447636

  17. Recent Progresses in Nanobiosensing for Food Safety Analysis.

    PubMed

    Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen

    2016-07-19

    With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes various function types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014-present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to various recognition methods of each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly.

  18. An Analytic Approach for Optimal Geometrical Design of GaAs Nanowires for Maximal Light Harvesting in Photovoltaic Cells

    PubMed Central

    Wu, Dan; Tang, Xiaohong; Wang, Kai; Li, Xianqiang

    2017-01-01

    Semiconductor nanowires (NWs) with subwavelength-scale diameters have demonstrated superior light trapping features, which unravel a new pathway for low-cost, high-efficiency future-generation solar cells. Unlike other published work, a fully analytic design is proposed for the first time for the optimal geometrical parameters of vertically-aligned GaAs NW arrays for maximal energy harvesting. Using photocurrent density as the light-absorbing evaluation standard, the multiple diameters and periodicity of 2 μm long NW arrays are quantitatively identified, achieving a maximal value of 29.88 mA/cm² under solar illumination. The method also proves widely suitable for arrays with single, double, and four different NW diameters for the highest photon energy harvesting. To validate this analytical method, intensive numerical three-dimensional finite-difference time-domain simulations of the NWs' light harvesting were also carried out. Compared with the simulation results, the predicted maximal photocurrent densities lie within 1.5% tolerance for all cases. Along with its high accuracy, by directly disclosing the exact geometrical dimensions of the NW arrays, this method provides an effective and efficient route for high-performance photovoltaic design. PMID:28425488

  19. Application of factorial designs to study factors involved in the determination of aldehydes present in beer by on-fiber derivatization in combination with gas chromatography and mass spectrometry.

    PubMed

    Carrillo, Génesis; Bravo, Adriana; Zufall, Carsten

    2011-05-11

    With the aim of studying the factors involved in on-fiber derivatization of Strecker aldehydes, furfural, and (E)-2-nonenal with O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine in beer, factorial designs were applied. The effect of temperature, time, and NaCl addition on the analytes' derivatization/extraction efficiency was studied through a 2³ factorial randomized-block design; all of the factors and their interactions were significant at the 95% confidence level for most of the analytes. The effect of temperature and its interactions separated the analytes into two groups. However, a single sampling condition was selected that optimized the response for most aldehydes. The resulting method, combining on-fiber derivatization with gas chromatography-mass spectrometry, was validated. Limits of detection were between 0.015 and 1.60 μg/L, and relative standard deviations were between 1.1 and 12.2%. The efficacy of the internal standardization method was confirmed by recovery percentages (73-117%). The method was applied to the determination of aldehydes in fresh beer and after storage at 28 °C.
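    The 2³ factorial analysis follows a standard recipe: lay out all eight coded runs, then estimate each main effect as the difference between the mean response at the factor's high and low levels. A minimal sketch with synthetic responses (a made-up linear model, not the beer-aldehyde data):

```python
from itertools import product

# Coded-unit 2^k full factorial and main-effect estimation. The responses
# below come from a synthetic linear model, not beer-aldehyde measurements.
def full_factorial_2k(k):
    return [list(levels) for levels in product((-1, 1), repeat=k)]

def main_effects(design, responses):
    k = len(design[0])
    effects = []
    for f in range(k):
        hi = [y for row, y in zip(design, responses) if row[f] == 1]
        lo = [y for row, y in zip(design, responses) if row[f] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

runs = full_factorial_2k(3)                          # temperature, time, NaCl (coded)
responses = [2 * a + 3 * b - c for a, b, c in runs]  # synthetic responses
effects = main_effects(runs, responses)
```

    The same contrasts, computed on products of factor columns, give the interaction effects tested for significance in the study.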

  20. Rapid and continuous analyte processing in droplet microfluidic devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strey, Helmut; Kimmerling, Robert; Bakowski, Tomasz

    The compositions and methods described herein are designed to introduce functionalized microparticles into droplets that can be manipulated in microfluidic devices by fields, including electric (dielectrophoretic) or magnetic fields, and extracted by splitting a droplet to separate the portion of the droplet that contains the majority of the microparticles from the part that is largely devoid of the microparticles. Within the device, channels are variously configured at Y- or T-junctions that facilitate continuous, serial isolation and dilution of analytes in solution. The devices can be limited in the sense that they can be designed to output purified analytes that are then further analyzed in separate machines, or they can include additional channels through which purified analytes can be further processed and analyzed.

  1. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    PubMed

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

    An analytical method of researching the light distribution properties on the output end of a hollow n-sided polygonal light pipe and a light source with a Gaussian distribution is developed. The mirror transformation matrices and a special algorithm of removing void virtual images are created to acquire the location and direction vector of each effective virtual image on the entrance plane. The analytical method is demonstrated by Monte Carlo ray tracing. At the same time, four typical cases are discussed. The analytical results indicate that the uniformity of light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and light source with a Gaussian distribution. The analytical approach will be useful to design and choose the hollow n-sided polygonal light pipe, especially for high-power laser beam homogenization techniques.
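    The mirror transformation underlying the virtual-image construction is an ordinary reflection. A minimal 2-D sketch for a flat wall through point q with unit normal n; the geometry is generic, not the paper's specific transformation matrices:

```python
# Reflection of a point and a direction across a flat wall through point q
# with unit normal n (2-D): x' = x - 2*((x - q)·n)*n.
def reflect_point(p, q, n):
    d = 2.0 * ((p[0] - q[0]) * n[0] + (p[1] - q[1]) * n[1])
    return (p[0] - d * n[0], p[1] - d * n[1])

def reflect_direction(v, n):
    d = 2.0 * (v[0] * n[0] + v[1] * n[1])
    return (v[0] - d * n[0], v[1] - d * n[1])

# virtual image of a source at (1, 2) across the wall y = 0
image = reflect_point((1.0, 2.0), (0.0, 0.0), (0.0, 1.0))
```

    Composing such reflections across the n walls of the pipe generates the set of candidate virtual images, from which the algorithm described above removes the void ones.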

  2. Back analysis of geomechanical parameters in underground engineering using artificial bee colony.

    PubMed

    Zhu, Changxing; Zhao, Hongbo; Zhao, Ming

    2014-01-01

    Accurate geomechanical parameters are critical in tunnel excavation, design, and support. In this paper, a displacement back analysis based on the artificial bee colony (ABC) algorithm is proposed to identify geomechanical parameters from monitored displacements. ABC was used as a global optimization algorithm to search for the unknown geomechanical parameters in problems with an analytical solution. For problems without an analytical solution, optimization-based back analysis is time-consuming, so a least squares support vector machine (LSSVM) was used to model the relationship between the unknown geomechanical parameters and displacements and to improve the efficiency of the back analysis. The proposed method was applied to a tunnel with an analytical solution and to a tunnel without one. The results show the proposed method is feasible.
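
    As an illustrative sketch of the back-analysis idea only (the displacement model, bounds, and parameter names below are hypothetical, not those of the paper), a minimal artificial bee colony optimizer can recover the parameters of a closed-form model from monitored displacements:

```python
import random

def abc_minimize(f, bounds, n_food=20, limit=30, iters=200, seed=0):
    """Minimal artificial bee colony: greedy one-coordinate perturbations
    toward random peers, plus a scout phase that abandons stale sources."""
    rng = random.Random(seed)
    dim = len(bounds)

    def rand_pos():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    foods = [rand_pos() for _ in range(n_food)]
    fits = [f(x) for x in foods]
    trials = [0] * n_food
    best = min(zip(fits, foods))
    for _ in range(iters):
        for i in range(n_food):
            # perturb one coordinate toward a random peer, keep if it improves
            k = rng.randrange(n_food - 1)
            k = k if k < i else k + 1
            j = rng.randrange(dim)
            cand = foods[i][:]
            cand[j] += rng.uniform(-1.0, 1.0) * (foods[i][j] - foods[k][j])
            lo, hi = bounds[j]
            cand[j] = min(max(cand[j], lo), hi)
            fc = f(cand)
            if fc < fits[i]:
                foods[i], fits[i], trials[i] = cand, fc, 0
            else:
                trials[i] += 1
        for i in range(n_food):
            # scout phase: replace sources that stopped improving
            if trials[i] > limit:
                foods[i], trials[i] = rand_pos(), 0
                fits[i] = f(foods[i])
        best = min(best, min(zip(fits, foods)))
    return best

# Toy back analysis: recover parameters of a hypothetical closed-form
# displacement model u(r) = A/r + B/r^2 from "monitored" displacements.
def forward(params, radii):
    A, B = params
    return [A / r + B / r**2 for r in radii]

radii = [1.0, 1.5, 2.0]
monitored = forward((50.0, 10.0), radii)

def misfit(params):
    return sum((u - m) ** 2 for u, m in zip(forward(params, radii), monitored))

err, est = abc_minimize(misfit, bounds=[(10.0, 100.0), (1.0, 20.0)])
```

    The LSSVM surrogate in the paper replaces the expensive forward model inside `misfit` when no analytical solution exists; the ABC loop itself is unchanged.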

  3. Transient analysis of an adaptive system for optimization of design parameters

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.

    1992-01-01

    Averaging methods are applied to analyzing and optimizing the transient response associated with the direct adaptive control of an oscillatory second-order minimum-phase system. The analytical design methods developed for a second-order plant can be applied with some approximation to a MIMO flexible structure having a single dominant mode.

  4. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  5. Results of an integrated structure/control law design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1989-01-01

    A design sensitivity analysis method for Linear Quadratic Gaussian (LQG) optimal control laws, which predicts the change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations, is discussed. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if a parameter were to have some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing with the predicted response. These results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for the computation of the equivalent sensitivity information.

  6. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)

    NASA Astrophysics Data System (ADS)

    Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications are highlighted in this paper, including electromagnetic, thermal, and stress analyses. We discuss here the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results display the stable operation of BETA at 1 T. These results are compared to both analytical design and finite element calculations. Experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are explored in this paper.
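
    For a rough sense of the field levels involved, a hedged back-of-the-envelope estimate can be made with the ideal long-solenoid formula; the coil numbers below are invented to land near BETA's 1 T operating point and are not the laboratory's actual design equations:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, T*m/A

def solenoid_field(n_turns, length_m, current_a):
    """Ideal long-solenoid approximation: B = mu0 * (N / L) * I."""
    return MU0 * (n_turns / length_m) * current_a

# hypothetical coil: 100 turns over 0.5 m carrying 4 kA
B = solenoid_field(n_turns=100, length_m=0.5, current_a=4000.0)  # ~1 T
```

    Real Bitter-plate designs must additionally minimize ohmic power, peak conductor temperature, and Lorentz-force stresses, which is exactly the coupled optimization the paper validates experimentally.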

  7. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA).

    PubMed

    Bates, E M; Birmingham, W J; Romero-Talamás, C A

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications are highlighted in this paper, including electromagnetic, thermal, and stress analyses. We discuss here the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results display the stable operation of BETA at 1 T. These results are compared to both analytical design and finite element calculations. Experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are explored in this paper.

  8. Multidisciplinary optimization in aircraft design using analytic technology models

    NASA Technical Reports Server (NTRS)

    Malone, Brett; Mason, W. H.

    1991-01-01

    An approach to multidisciplinary optimization is presented which combines the Global Sensitivity Equation method, parametric optimization, and analytic technology models. The result is a powerful yet simple procedure for identifying key design issues. It can be used both to investigate technology integration issues very early in the design cycle, and to establish the information flow framework between disciplines for use in multidisciplinary optimization projects using much more computationally intensive representations of each technology. To illustrate the approach, an examination of the optimization of a short takeoff heavy transport aircraft is presented for numerous combinations of performance and technology constraints.

  9. Proposed method for determining the thickness of glass in solar collector panels

    NASA Technical Reports Server (NTRS)

    Moore, D. M.

    1980-01-01

    An analytical method was developed for determining the minimum thickness for simply supported, rectangular glass plates subjected to uniform normal pressure environmental loads such as wind, earthquake, snow, and deadweight. The method consists of comparing an analytical prediction of the stress in the glass panel to a glass breakage stress determined from fracture mechanics considerations. Based on extensive analysis using the nonlinear finite element structural analysis program ARGUS, design curves for the structural analysis of simply supported rectangular plates were developed. These curves yield the center deflection, center stress and corner stress as a function of a dimensionless parameter describing the load intensity. A method of estimating the glass breakage stress as a function of a specified failure rate, degree of glass temper, design life, load duration time, and panel size is also presented.
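
    The core of the method is comparing a predicted plate stress to an allowable breakage stress. As a hedged illustration of that step only (using the linear small-deflection plate-bending formula with hypothetical inputs, not the nonlinear ARGUS-based design curves of the paper):

```python
import math

def min_thickness(q, b, beta, sigma_allow):
    """Solve sigma = beta * q * b**2 / t**2 <= sigma_allow for the thickness t."""
    return math.sqrt(beta * q * b * b / sigma_allow)

# hypothetical inputs: 2.4 kPa wind load on a 1.2 m square simply supported
# pane, bending coefficient beta ~0.287 (square plate), 20 MPa allowable stress
t = min_thickness(q=2400.0, b=1.2, beta=0.287, sigma_allow=20e6)  # ~7 mm
```

    In the paper the allowable stress itself is not a fixed number but a function of failure rate, temper, design life, and load duration, which is what the fracture-mechanics treatment supplies.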

  10. Evaluation of an Approximate Method for Synthesizing Covariance Matrices for Use in Meta-Analytic SEM

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Furlow, Carolyn F.

    2006-01-01

    Meta-analytic structural equation modeling (MA-SEM) is increasingly being used to assess model-fit for variables' interrelations synthesized across studies. MA-SEM researchers have analyzed synthesized correlation matrices using structural equation modeling (SEM) estimation that is designed for covariance matrices. This can produce incorrect…

  11. Culturally Sensitive Interventions and Substance Use: A Meta-Analytic Review of Outcomes among Minority Youths

    ERIC Educational Resources Information Center

    Hodge, David R.; Jackson, Kelly F.; Vaughn, Michael G.

    2012-01-01

    This study assessed the effectiveness of culturally sensitive interventions (CSIs) ("N" = 10) designed to address substance use among minority youths. Study methods consisted of systematic search procedures, quality of study ratings, and meta-analytic techniques to gauge effects and evaluate publication bias. The results, across all measures and…

  12. Analytical Study of 90Sr Betavoltaic Nuclear Battery Performance Based on p-n Junction Silicon

    NASA Astrophysics Data System (ADS)

    Rahastama, Swastya; Waris, Abdul

    2016-08-01

    Previously, an analytical calculation of a 63Ni p-n junction betavoltaic battery has been published. As the basic approach, we reproduced the analytical simulation of the 63Ni betavoltaic battery and then compared it to previous results using the same battery design. Furthermore, we calculated its maximum power output and radiation-electricity conversion efficiency using a semiconductor analysis method. Then, the same method was applied to calculate and analyse the performance of a 90Sr betavoltaic battery. The aim of this project is to compare the analytical performance of the 90Sr betavoltaic battery to that of the 63Ni betavoltaic battery, and to assess the influence of source activity on performance. Since it has a higher power density, the 90Sr betavoltaic battery yields more power than the 63Ni betavoltaic battery but less radiation-electricity conversion efficiency. However, beta particles emitted from the 90Sr source travel further inside the silicon, corresponding to the stopping range of the beta particles; thus the 90Sr betavoltaic battery could be designed thicker than the 63Ni betavoltaic battery to achieve higher conversion efficiency.
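
    A hedged back-of-the-envelope comparison of the two sources: the mean beta energies below are approximate literature values, the 100 mCi activity is a hypothetical example, and real cell output is further reduced by the conversion efficiency the paper computes.

```python
EV = 1.602e-19  # joules per electronvolt

def source_power_watts(activity_bq, mean_energy_kev):
    """Power carried by the beta flux: activity times mean beta energy."""
    return activity_bq * mean_energy_kev * 1e3 * EV

# 3.7e9 Bq = 100 mCi (hypothetical); mean beta energies ~17.4 keV (63Ni)
# and ~195.8 keV (90Sr) are approximate literature values
p_ni63 = source_power_watts(3.7e9, 17.4)    # ~10 microwatts
p_sr90 = source_power_watts(3.7e9, 195.8)   # ~116 microwatts
```

    The order-of-magnitude gap in source power is consistent with the abstract's finding that 90Sr yields more output power even at lower conversion efficiency.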

  13. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    PubMed

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization, and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
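
    As a minimal sketch of the desirability approach the review covers (the response names, ranges, and values below are hypothetical): each response is mapped to a [0, 1] desirability, and the individual desirabilities are combined by a geometric mean.

```python
def desirability_max(y, lo, hi, s=1.0):
    """Larger-the-better desirability: 0 at/below lo, 1 at/above hi."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** s

def overall_desirability(ds):
    # geometric mean: a single unacceptable response (d = 0) rejects the point
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# hypothetical responses: chromatographic resolution (maximize) and
# run time in minutes (minimize, expressed via the complement)
d_res = desirability_max(1.8, lo=1.0, hi=2.0)
d_time = 1.0 - desirability_max(12.0, lo=5.0, hi=20.0)
D = overall_desirability([d_res, d_time])
```

    The complement used for `d_time` is equivalent to the usual linear smaller-the-better transform; RSM then searches for factor settings that maximize D over the fitted response surfaces.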

  14. General design method for three-dimensional potential flow fields. 1: Theory

    NASA Technical Reports Server (NTRS)

    Stanitz, J. D.

    1980-01-01

    A general design method was developed for steady, three dimensional, potential, incompressible or subsonic-compressible flow. In this design method, the flow field, including the shape of its boundary, was determined for arbitrarily specified, continuous distributions of velocity as a function of arc length along the boundary streamlines. The method applied to the design of both internal and external flow fields, including, in both cases, fields with planar symmetry. The analytic problems associated with stagnation points, closure of bodies in external flow fields, and prediction of turning angles in three dimensional ducts were reviewed.

  15. BETA (Bitter Electromagnet Testing Apparatus) Design and Testing

    NASA Astrophysics Data System (ADS)

    Bates, Evan; Birmingham, William; Rivera, William; Romero-Talamas, Carlos

    2016-10-01

    BETA is a 1 T water-cooled Bitter-type magnet system that has been designed and constructed at the Dusty Plasma Laboratory of the University of Maryland, Baltimore County to serve as a prototype of a scaled 10 T version. Currently the system is undergoing magnetic, thermal, and mechanical testing to ensure safe operating conditions and to validate analytical design optimizations. These magnets will function as experimental tools for future dusty plasma and collaborative experiments. An overview of the design methods used for building a custom-made Bitter magnet with user-defined experimental constraints is reviewed. The three main design methods consist of minimizing the following: ohmic power, peak conductor temperatures, and stresses induced by Lorentz forces. We also discuss the design of BETA, which includes: the magnet core, pressure vessel, cooling system, power storage bank, high-power switching system, diagnostics with safety cutoff feedback, and data acquisition (DAQ)/magnet control Matlab code. Furthermore, we present experimental data from diagnostics for validation of our preliminary analytical design methodologies and finite element analysis calculations. BETA will contribute to the knowledge necessary to finalize the 10 T magnet design.

  16. Design and construction of an Offner spectrometer based on geometrical analysis of ring fields.

    PubMed

    Kim, Seo Hyun; Kong, Hong Jin; Lee, Jong Ung; Lee, Jun Ho; Lee, Jai Hoon

    2014-08-01

    A method to obtain an aberration-corrected Offner spectrometer without ray obstruction is proposed. A new, more efficient spectrometer optics design is suggested in order to increase its spectral resolution. The derivation of a new ring equation to eliminate ray obstruction is based on geometrical analysis of the ring fields for various numerical apertures. The analytical design applying this equation was demonstrated using the optical design software Code V in order to manufacture a spectrometer working at wavelengths of 900-1700 nm. The simulation results show that the new concept offers an analytical initial design requiring the least calculation time. The simulated spectrometer exhibited a modulation transfer function over 80% at the Nyquist frequency, root-mean-square spot diameters under 8.6 μm, and a spectral resolution of 3.2 nm. The final design and realization of a high-resolution Offner spectrometer were demonstrated based on the simulation results. The equation and analytical design procedure shown here can be applied to most Offner systems regardless of the wavelength range.

  17. Systematically reviewing and synthesizing evidence from conversation analytic and related discursive research to inform healthcare communication practice and policy: an illustrated guide

    PubMed Central

    2013-01-01

    Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures:
    • reviewing existing systematic review methods and our own prior experience of applying these
    • clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing
    • holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing
    • attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying
    Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects.
We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181

  18. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
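
    The payoff of analytic derivatives over finite differencing can be sketched on a toy scalar problem; this is an illustration of the general idea only, not PyCycle or OpenMDAO code.

```python
def f(x):
    return (x - 3.0) ** 2 + 1.0

def dfdx(x):
    # analytic derivative of f: exact, costs one evaluation
    return 2.0 * (x - 3.0)

def fd_grad(fun, x, h=1e-6):
    # forward finite difference: extra function call, truncation error ~h
    return (fun(x + h) - fun(x)) / h

def descend(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_analytic = descend(dfdx, 0.0)               # exact gradient
x_fd = descend(lambda x: fd_grad(f, x), 0.0)  # approximated gradient
```

    Both runs converge near the minimizer x = 3, but the finite-difference version carries a bias set by the step size h; in a cycle model with many variables the extra evaluations and step-size tuning are what analytic derivatives eliminate.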

  19. A generalized theory for the design of contraction cones and other low speed ducts

    NASA Technical Reports Server (NTRS)

    Barger, R. L.; Bowen, J. T.

    1972-01-01

    A generalization of the Tsien method of contraction cone design is described. The design velocity distribution is expressed in such a form that the required high order derivatives can be obtained by recursion rather than by numerical or analytic differentiation. The method is applicable to the design of diffusers and converging-diverging ducts as well as contraction cones. The computer program is described and a FORTRAN listing of the program is provided.

  20. Simple design of slanted grating with simplified modal method.

    PubMed

    Li, Shubin; Zhou, Changhe; Cao, Hongchao; Wu, Jun

    2014-02-15

    A simplified modal method (SMM) is presented that offers a clear physical image for subwavelength slanted grating. The diffraction characteristic of the slanted grating under Littrow configuration is revealed by the SMM as an equivalent rectangular grating, which is in good agreement with rigorous coupled-wave analysis. Based on the equivalence, we obtained an effective analytic solution for simplifying the design and optimization of a slanted grating. It offers a new approach for design of the slanted grating, e.g., a 1×2 beam splitter can be easily designed. This method should be helpful for designing various new slanted grating devices.

  1. Computer modeling of a two-junction, monolithic cascade solar cell

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.; Abbott, D.

    1979-01-01

    The theory and design criteria for monolithic, two-junction cascade solar cells are described. The departure from the conventional solar cell analytical method and the reasons for using the integral form of the continuity equations are briefly discussed. The results of design optimization are presented. The energy conversion efficiency that is predicted for the optimized structure is greater than 30% at 300 K, AMO and one sun. The analytical method predicts device performance characteristics as a function of temperature. The range is restricted to 300 to 600 K. While the analysis is capable of determining most of the physical processes occurring in each of the individual layers, only the more significant device performance characteristics are presented.

  2. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates the technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each may be modularly formulated by differing departments and be solved by modular analytical services. The result demonstrates that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.

  3. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.

  4. Conservative Analytical Collision Probabilities for Orbital Formation Flying

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2004-01-01

    The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
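
    One generic way to obtain a conservative, fully analytical bound of the kind discussed, not necessarily the paper's own approximation, is to integrate the peak of the relative-position probability density over the combined hard-body disc:

```python
import math

def collision_prob_bound(hard_body_radius, sigma_x, sigma_y):
    """Upper bound on collision probability: disc area times the peak of the
    2D Gaussian miss-distance density; valid for any nominal miss distance,
    but very loose when the miss distance is large."""
    peak_density = 1.0 / (2.0 * math.pi * sigma_x * sigma_y)
    area = math.pi * hard_body_radius ** 2
    return min(1.0, area * peak_density)

# hypothetical numbers: 10 m combined hard-body radius,
# 200 m x 100 m position uncertainty in the encounter plane
pc = collision_prob_bound(10.0, 200.0, 100.0)
```

    Because the bound discards the miss distance entirely, it is exactly the kind of estimate that is usefully conservative in some regions of the trade space and far too conservative in others, as the abstract notes.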

  5. Writing analytic element programs in Python.

    PubMed

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
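
    A minimal sketch in the spirit of the design described, superposition of element contributions behind a common base class; the class and variable names here are invented for illustration and are not the article's code:

```python
import math

class Element:
    """Base class: each analytic element contributes a discharge potential."""
    def potential(self, x, y):
        raise NotImplementedError

class Well(Element):
    """Discharge-specified well: phi = Q/(2*pi) * ln(r)."""
    def __init__(self, xw, yw, Q):
        self.xw, self.yw, self.Q = xw, yw, Q
    def potential(self, x, y):
        r = math.hypot(x - self.xw, y - self.yw)
        return self.Q / (2.0 * math.pi) * math.log(r)

class UniformFlow(Element):
    """Uniform background flow in the +x direction."""
    def __init__(self, qx):
        self.qx = qx
    def potential(self, x, y):
        return -self.qx * x

class Model:
    def __init__(self):
        self.elements = []
    def add(self, element):
        self.elements.append(element)
    def potential(self, x, y):
        # superposition: the total potential is the sum over all elements
        return sum(e.potential(x, y) for e in self.elements)

m = Model()
m.add(Well(0.0, 0.0, Q=100.0))
m.add(UniformFlow(qx=1.0))
phi = m.potential(10.0, 0.0)
```

    New element types plug in by subclassing Element without touching Model, mirroring the extensibility argument made in the abstract.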

  6. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  7. A Graphical Approach to Teaching Amplifier Design at the Undergraduate Level

    ERIC Educational Resources Information Center

    Assaad, R. S.; Silva-Martinez, J.

    2009-01-01

    Current methods of teaching basic amplifier design at the undergraduate level need further development to match today's technological advances. The general class approach to amplifier design is analytical and heavily based on mathematical manipulations. However, the students' mathematical abilities are generally modest, creating a void in which…

  8. Does design matter? Systematic evaluation of the impact of analytical choices on effect estimates in observational studies

    PubMed Central

    Ryan, Patrick B.; Schuemie, Martijn

    2013-01-01

    Background: Clinical studies that use observational databases, such as administrative claims and electronic health records, to evaluate the effects of medical products have become commonplace. These studies begin by selecting a particular study design, such as a case control, cohort, or self-controlled design, and different authors can and do choose different designs for the same clinical question. Furthermore, published papers invariably report the study design but do not discuss the rationale for the specific choice. Studies of the same clinical question with different designs, however, can generate different results, sometimes with strikingly different implications. Even within a specific study design, authors make many different analytic choices and these too can profoundly impact results. In this paper, we systematically study heterogeneity due to the type of study design and due to analytic choices within study design. Methods and findings: We conducted our analysis in 10 observational healthcare databases but mostly present our results in the context of the GE Centricity EMR database, an electronic health record database containing data for 11.2 million lives. We considered the impact of three different study design choices on estimates of associations between bisphosphonates and four particular health outcomes for which there is no evidence of an association. We show that applying alternative study designs can yield discrepant results, in terms of direction and significance of association. We also highlight that while traditional univariate sensitivity analysis may not show substantial variation, systematic assessment of all analytical choices within a study design can yield inconsistent results ranging from statistically significant decreased risk to statistically significant increased risk. 
Our findings show that clinical studies using observational databases can be sensitive both to study design choices and to specific analytic choices within study design. Conclusion: More attention is needed to consider how design choices may be impacting results and, when possible, investigators should examine a wide array of possible choices to confirm that significant findings are consistently identified. PMID:25083251

  9. An analytical study for the design of advanced rotor airfoils

    NASA Technical Reports Server (NTRS)

    Kemp, L. D.

    1973-01-01

    A theoretical study has been conducted to design and evaluate two airfoils for helicopter rotors. The best basic shape, designed with a transonic hodograph design method, was modified to meet subsonic criteria. One airfoil had an additional constraint for low pitching-moment at the transonic design point. Airfoil characteristics were predicted. Results of a comparative analysis of helicopter performance indicate that the new airfoils will produce reduced rotor power requirements compared to the NACA 0012. The hodograph design method, written in CDC Algol, is listed and described.

  10. Design Considerations of ISTAR Hydrocarbon Fueled Combustor Operating in Air Augmented Rocket, Ramjet and Scramjet Modes

    NASA Technical Reports Server (NTRS)

    Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.

    2003-01-01

    The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.

  11. Design Considerations of ISTAR Hydrocarbon Fueled Combustor Operating in Air Augmented Rocket, Ramjet and Scramjet Modes

    NASA Technical Reports Server (NTRS)

    Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.

    2002-01-01

    The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.

  12. Spiral trajectory design: a flexible numerical algorithm and base analytical equations.

    PubMed

    Pipe, James G; Zwart, Nicholas R

    2014-01-01

    Spiral-based trajectories for magnetic resonance imaging can be advantageous, but are often cumbersome to design or create. This work presents a flexible numerical algorithm for designing trajectories based on explicit definition of radial undersampling, and also gives several analytical expressions for characterizing the base (critically sampled) class of these trajectories. Expressions for the gradient waveform, based on slew and amplitude limits, are developed such that a desired pitch in the spiral k-space trajectory is followed. The source code for this algorithm, written in C, is publicly available. Analytical expressions approximating the spiral trajectory (ignoring the radial component) are given to characterize measurement time, gradient heating, maximum gradient amplitude, and off-resonance phase for slew-limited and gradient amplitude-limited cases. Several numerically calculated trajectories are illustrated, and base Archimedean spirals are compared with analytically obtained results. Several different waveforms illustrate that the desired slew and amplitude limits are reached, as are the desired undersampling patterns, using the numerical method. For base Archimedean spirals, the results of the numerical and analytical approaches are in good agreement. A versatile numerical algorithm was developed, and was written in publicly available code. Approximate analytical formulas are given that help characterize spiral trajectories. Copyright © 2013 Wiley Periodicals, Inc.
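
    As a rough illustration of the base (critically sampled) Archimedean spiral this record characterizes, the trajectory itself can be generated in a few lines. The field of view, maximum k-space radius, and sampling density below are illustrative placeholders, not values from the paper, and the published C algorithm additionally produces slew- and amplitude-limited gradient waveforms, which this sketch omits.

```python
import math

def archimedean_spiral(fov_m=0.24, kmax=250.0, pts_per_turn=64):
    """Critically sampled Archimedean spiral k-space trajectory.

    The radial pitch per revolution is 1/FOV (Nyquist), so the radius grows
    linearly with angle: r(theta) = theta / (2*pi*FOV). Returns (kx, ky)
    samples (cycles/m) out to kmax. All parameters are illustrative only.
    """
    pitch = 1.0 / fov_m                    # radial spacing per revolution
    n_turns = kmax / pitch                 # revolutions needed to reach kmax
    theta_max = 2 * math.pi * n_turns
    n = int(n_turns * pts_per_turn)
    traj = []
    for i in range(n + 1):
        theta = theta_max * i / n
        r = pitch * theta / (2 * math.pi)  # Archimedean: r linear in theta
        traj.append((r * math.cos(theta), r * math.sin(theta)))
    return traj

traj = archimedean_spiral()
r_end = math.hypot(*traj[-1])              # trajectory ends at radius kmax
```

    Radial undersampling of the kind the algorithm supports would be introduced by letting the pitch vary with radius instead of holding it fixed at 1/FOV.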

  13. Evaluating the performance of free-formed surface parts using an analytic network process

    NASA Astrophysics Data System (ADS)

    Qian, Xueming; Ma, Yanqiao; Liang, Dezhi

    2018-03-01

    To successfully design parts with a free-formed surface, the critical issue of how to evaluate and select a favourable evaluation strategy before design is raised. The evaluation of free-formed surface parts is a multiple criteria decision-making (MCDM) problem that requires the consideration of a large number of interdependent factors. The analytic network process (ANP) is a relatively new MCDM method that can systematically deal with all kinds of dependences. In this paper, the factors, which come from the life-cycle and influence the design of free-formed surface parts, are proposed. After analysing the interdependence among these factors, a Hybrid ANP (HANP) structure for evaluating the part’s curved surface is constructed. Then, a HANP evaluation of an impeller is presented to illustrate the application of the proposed method.
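
    The eigenvector weighting that ANP builds on can be illustrated with its AHP building block: priorities are the principal eigenvector of a pairwise-comparison matrix, obtained here by power iteration. The 3x3 matrix and its implied criteria are hypothetical; a full (H)ANP assembles such local priority vectors into a supermatrix so that the interdependence among factors is captured.

```python
def priority_vector(pairwise, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix via power iteration.

    This is the AHP step that ANP reuses inside its supermatrix; the input is
    a reciprocal matrix (a_ji = 1/a_ij) of relative criterion importances.
    """
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]         # renormalize so weights sum to 1
    return w

# Hypothetical 3-criteria comparison matrix (not from the paper)
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
weights = priority_vector(A)               # criterion 1 dominates
```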

  14. Linguistics and the Study of Literature. Linguistics in the Undergraduate Curriculum, Appendix 4-D.

    ERIC Educational Resources Information Center

    Steward, Ann Harleman

    Linguistics gives the student of literature an analytical tool whose sole purpose is to describe faithfully the workings of language. It provides a theoretical framework, an analytical method, and a vocabulary for communicating its insights--all designed to serve concerns other than literary interpretation and evaluation, but all useful for…

  15. Design of a portable gas chromatography with a conducting polymer nanocomposite detector device and a method to analyze a gas mixture.

    PubMed

    Pirsa, Sajad

    2017-04-01

    A portable chromatography device and a method were developed to analyze a gas mixture. The device comprises a chromatographic column for separating components of a sample of the gas mixture. It has an air pump coupled to the inlet of the chromatographic column for pumping air and an injector coupled to the inlet of the column for feeding the sample using the air as a carrier gas. A detector is arranged downstream from and coupled to the outlet of the chromatographic column. The detector is a nanostructure semiconductive microfiber. The device further comprises an evaluation unit arranged and configured to evaluate each detected component to determine the concentration. The designed portable system was used for simultaneous detection of amines. The possibility of applying dispersive liquid-liquid microextraction for the determination of analytes at trace levels is demonstrated. The reproducibility of this method is acceptable, and good standard deviations were obtained. The relative standard deviation value is less than 6% for all analytes. Finally, the method was successfully applied to the extraction and determination of analytes in water samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Flexible pavement overlay design procedures. Volume 1: Evaluation and modification of the design methods

    NASA Astrophysics Data System (ADS)

    Majidzadeh, K.; Ilves, G. J.

    1981-08-01

    A ready reference to design procedures for asphaltic concrete overlay of flexible pavements based on elastic layer theory is provided. The design procedures and the analytical techniques presented were formulated to predict the structural fatigue response of asphaltic concrete overlays for various design conditions, including geometrical and material properties, loading conditions and environmental variables.

  17. Use of experimental design in the investigation of stir bar sorptive extraction followed by ultra-high-performance liquid chromatography-tandem mass spectrometry for the analysis of explosives in water samples.

    PubMed

    Schramm, Sébastien; Vailhen, Dominique; Bridoux, Maxime Cyril

    2016-02-12

    A method for the sensitive quantification of trace amounts of organic explosives in water samples was developed by using stir bar sorptive extraction (SBSE) followed by liquid desorption and ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). The proposed method was developed and optimized using a statistical design of experiment approach. Use of experimental designs allowed a complete study of 10 factors and 8 analytes including nitro-aromatics, amino-nitro-aromatics and nitric esters. The liquid desorption study was performed using a full factorial experimental design followed by a kinetic study. Four different variables were tested here: the liquid desorption mode (stirring or sonication), the chemical nature of the stir bar (PDMS or PDMS-PEG), the composition of the liquid desorption phase and finally, the volume of solvent used for the liquid desorption. On the other hand, the SBSE extraction study was performed using a Doehlert design. SBSE extraction conditions such as extraction time profiles, sample volume, modifier addition, and acetic acid addition were examined. After optimization of the experimental parameters, sensitivity was improved by a factor 5-30, depending on the compound studied, due to the enrichment factors reached using the SBSE method. Limits of detection were in the ng/L level for all analytes studied. Reproducibility of the extraction with different stir bars was close to the reproducibility of the analytical method (RSD between 4 and 16%). Extractions in various water sample matrices (spring, mineral and underground water) have shown similar enrichment compared to ultrapure water, revealing very low matrix effects. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. On the design of recursive digital filters

    NASA Technical Reports Server (NTRS)

    Shenoi, K.; Narasimha, M. J.; Peterson, A. M.

    1976-01-01

    A change of variables is described which transforms the problem of designing a recursive digital filter to that of approximation by a ratio of polynomials on a finite interval. Some analytic techniques for the design of low-pass filters are presented, illustrating the use of the transformation. Also considered are methods for the design of phase equalizers.

  19. Implementation of structural response sensitivity calculations in a large-scale finite-element analysis system

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Rogers, J. L., Jr.

    1982-01-01

    The implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of the system are also discussed.

  20. EPA (ENVIRONMENTAL PROTECTION AGENCY) METHOD STUDY 28, PCB'S (POLYCHLORINATED BIPHENYLS) IN OIL

    EPA Science Inventory

    This report describes the experimental design and the results of the validation study for two analytical methods to detect polychlorinated biphenyls in oil. The methods analyzed for four PCB Aroclors (1016, 1242, 1254, and 1260), 2-chlorobiphenyl, and decachlorobiphenyl. The firs...

  1. Study on bending behaviour of nickel–titanium rotary endodontic instruments by analytical and numerical analyses

    PubMed Central

    Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S

    2013-01-01

    Aim To develop analytical models and analyse the stress distribution and flexibility of nickel–titanium (NiTi) instruments subject to bending forces. Methodology The analytical method was used to analyse the behaviours of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived using Euler–Bernoulli nonlinear differential equations that took into account the screw pitch variation of these NiTi instruments. In addition, nonlinear deformation analyses based on the analytical model and on nonlinear finite element analysis were carried out; numerical results were obtained using the finite element method. Results According to analytical results, the maximum curvature of the instrument occurs near the instrument tip. Results of the finite element analysis revealed that the position of maximum von Mises stress was near the instrument tip. Therefore, the proposed analytical model can be used to predict the position of maximum curvature in the instrument, where fracture may occur. Finally, results of the analytical and numerical models were compatible. Conclusion The proposed analytical model was validated by numerical results in analysing the bending deformation of NiTi instruments. The analytical model is useful in the design and analysis of instruments and effective in studying their flexibility. Compared with the finite element method, the analytical model deals conveniently and effectively with the bending behaviour of rotary NiTi endodontic instruments. PMID:23173762

  2. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.

  3. An analytic-numerical method for the construction of the reference law of operation for a class of mechanical controlled systems

    NASA Astrophysics Data System (ADS)

    Mizhidon, A. D.; Mizhidon, K. A.

    2017-04-01

    An analytic-numerical method for the construction of a reference law of operation for a class of dynamic systems describing vibrations in controlled mechanical systems is proposed. By the reference law of operation of a system, we mean a law of the system motion that satisfies all the requirements for the quality and design features of the system under permanent external disturbances. As disturbances, we consider polyharmonic functions with known amplitudes and frequencies of the harmonics but unknown initial phases. For constructing the reference law of motion, an auxiliary optimal control problem is solved in which the cost function depends on a weighting coefficient. The choice of the weighting coefficient ensures the design of the reference law. Theoretical foundations of the proposed method are given.

  4. Two-condition within-participant statistical mediation analysis: A path-analytic framework.

    PubMed

    Montoya, Amanda K; Hayes, Andrew F

    2017-03-01

    Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in both of 2 different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths-the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
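
    A bare-bones sketch of the path-analytic indirect effect for this two-condition within-participant design, with a percentile bootstrap confidence interval. The data are invented, and the published model also adjusts the b path for the mean-centered sum of the mediator scores, which is omitted here for brevity.

```python
import random

random.seed(0)  # reproducible resampling

def indirect_effect(m_diff, y_diff):
    """a*b: a is the mean mediator difference across the two conditions,
    b the OLS slope of the outcome difference on the mediator difference."""
    n = len(m_diff)
    a = sum(m_diff) / n                     # path a (also the centering mean)
    yb = sum(y_diff) / n
    b = (sum((m - a) * (y - yb) for m, y in zip(m_diff, y_diff))
         / sum((m - a) ** 2 for m in m_diff))
    return a * b

def bootstrap_ci(m_diff, y_diff, reps=2000, alpha=0.05):
    """Percentile bootstrap CI for a*b, resampling participants with replacement."""
    n = len(m_diff)
    stats = []
    for _ in range(reps):
        idx = [random.randrange(n) for _ in range(n)]
        stats.append(indirect_effect([m_diff[i] for i in idx],
                                     [y_diff[i] for i in idx]))
    stats.sort()
    return stats[int(reps * alpha / 2)], stats[int(reps * (1 - alpha / 2))]

# Invented per-participant condition differences for n = 10 participants
m = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0, 1.4, 0.6]
y = [2.3, 1.9, 3.0, 1.8, 2.2, 2.6, 1.5, 2.0, 2.9, 2.1]
ab = indirect_effect(m, y)   # about 1.47 for this toy data
ci = bootstrap_ci(m, y)
```

    Because inference rests only on the interval for the product of paths, no separate hypothesis tests on the individual model components are needed.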

  5. Computational thermo-fluid dynamics contributions to advanced gas turbine engine design

    NASA Technical Reports Server (NTRS)

    Graham, R. W.; Adamczyk, J. J.; Rohlik, H. E.

    1984-01-01

    The design practices for the gas turbine are traced throughout history with particular emphasis on the calculational or analytical methods. Three principal components of the gas turbine engine will be considered: namely, the compressor, the combustor and the turbine.

  6. Developments in Cylindrical Shell Stability Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Starnes, James H., Jr.

    1998-01-01

    Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.

  7. BETA (Bitter Electromagnet Testing Apparatus)

    NASA Astrophysics Data System (ADS)

    Bates, Evan M.; Birmingham, William J.; Rivera, William F.; Romero-Talamas, Carlos A.

    2017-10-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) prototype of the 10-T Adjustable Long Pulse High-Field Apparatus (ALPHA). These water-cooled resistive magnets use high DC currents to produce strong uniform magnetic fields. Presented here is the successful completion of the BETA project and experimental results validating analytical magnet design methods developed at the Dusty Plasma Laboratory (DPL). BETA's final design specifications, including electromagnetic, thermal and stress analyses, are highlighted. The magnet core design, comprising Bitter Arcs, helix starters, and clamping annuli, is explained. The final version of the magnet's vessel and cooling system is also presented, as well as the electrical system of BETA, which is composed of a unique solid-state breaker circuit. Experimental results show the operation of BETA at 1 T and are compared to both analytical design methods and finite element analysis calculations. We also explore the steady-state maximums and theoretical limits of BETA's design. The completion of BETA validates the design and manufacturing techniques that will be used in the succeeding magnet, ALPHA.

  8. A Requirements-Driven Optimization Method for Acoustic Treatment Design

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.

    2016-01-01

    Acoustic treatment designers have long been able to target specific noise sources inside turbofan engines. Facesheet porosity and cavity depth are key design variables of perforate-over-honeycomb liners that determine levels of noise suppression as well as the frequencies at which suppression occurs. Layers of these structures can be combined to create a robust attenuation spectrum that covers a wide range of frequencies. Looking to the future, rapidly-emerging additive manufacturing technologies are enabling new liners with multiple degrees of freedom, and new adaptive liners with variable impedance are showing promise. More than ever, there is greater flexibility and freedom in liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. The subject of this paper is an analytical method that derives the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground.

  9. [Systems epidemiology].

    PubMed

    Huang, T; Li, L M

    2018-05-10

    The era of medical big data, translational medicine and precision medicine brings new opportunities for the study of etiology of chronic complex diseases. How to implement evidence-based medicine, translational medicine and precision medicine are the challenges we are facing. Systems epidemiology, a new field of epidemiology, combines medical big data with system biology and examines the statistical model of disease risk, the future risk simulation and prediction using the data at molecular, cellular, population, social and ecological levels. Due to the diversity and complexity of big data sources, the development of study design and analytic methods of systems epidemiology face new challenges and opportunities. This paper summarizes the theoretical basis, concept, objectives, significances, research design and analytic methods of systems epidemiology and its application in the field of public health.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Simonetto, Andrea

    This paper focuses on the design of online algorithms based on prediction-correction steps to track the optimal solution of a time-varying constrained problem. Existing prediction-correction methods have been shown to work well for unconstrained convex problems and for settings where obtaining the inverse of the Hessian of the cost function can be computationally affordable. The prediction-correction algorithm proposed in this paper addresses the limitations of existing methods by tackling constrained problems and by designing a first-order prediction step that relies on the Hessian of the cost function (and does not require the computation of its inverse). Analytical results are established to quantify the tracking error. Numerical simulations corroborate the analytical results and showcase the performance and benefits of the algorithms.
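
    The prediction-correction template this record describes can be illustrated on a toy unconstrained scalar problem: track the minimizer of f(x; t) = 0.5*(x - sin t)**2 as t advances. The cost function, step size, and sampling period below are assumptions for illustration; the paper's actual contributions (handling constraints and avoiding the Hessian inverse in the prediction step) are not reproduced in this sketch.

```python
import math

def track_time_varying_min(h=0.01, steps=500, alpha=0.5):
    """First-order prediction-correction tracking of argmin_x f(x; t).

    Toy cost f(x; t) = 0.5*(x - sin(t))**2, so the true optimizer is sin(t).
    Prediction uses the mixed partial d2f/dxdt = -cos(t) with the (here
    trivial) Hessian d2f/dx2 = 1; correction is one gradient step at the
    new sample time. Returns the tracking error at each step.
    """
    x, t = 0.0, 0.0
    errs = []
    for _ in range(steps):
        # prediction: x <- x - h * (d2f/dx2)^(-1) * d2f/dxdt
        x = x - h * (-math.cos(t))
        t += h
        # correction: one gradient step on f(.; t_new)
        x = x - alpha * (x - math.sin(t))
        errs.append(abs(x - math.sin(t)))
    return errs

errs = track_time_varying_min()   # tracking error stays small throughout
```

    Without the prediction step, the correction alone would lag the moving optimizer by an O(h) error; the prediction removes the first-order lag.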

  11. Green design assessment of electromechanical products based on group weighted-AHP

    NASA Astrophysics Data System (ADS)

    Guo, Jinwei; Zhou, MengChu; Li, Zhiwu; Xie, Huiguang

    2015-11-01

    Manufacturing industry is the backbone of a country's economy while environmental pollution is a serious problem that human beings must face today. The green design of electromechanical products based on enterprise information systems is an important method to solve the environmental problem. The question on how to design green products must be answered by excellent designers via both advanced design methods and effective assessment methods of electromechanical products. Making an objective and precise assessment of green design is one of the problems that must be solved when green design is conducted. An assessment method of green design on electromechanical products based on Group Weighted-AHP (Analytic Hierarchy Process) is proposed in this paper, together with the characteristics of green products. The assessment steps of green design are also established. The results are illustrated via the assessment of a refrigerator design.

  12. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    Continuing studies associated with the development of the quasi-analytical (QA) sensitivity method for three dimensional transonic flow about wings are presented. Furthermore, initial results using the quasi-analytical approach were obtained and compared to those computed using the finite difference (FD) approach. The basic goals achieved were: (1) carrying out various debugging operations pertaining to the quasi-analytical method; (2) addition of section design variables to the sensitivity equation in the form of multiple right hand sides; (3) reconfiguring the analysis/sensitivity package in order to facilitate the execution of analysis/FD/QA test cases; and (4) enhancing the display of output data to allow careful examination of the results and to permit various comparisons of sensitivity derivatives obtained using the FD/QA methods to be conducted easily and quickly. In addition to discussing the above goals, the results of executing subcritical and supercritical test cases are presented.
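
    The trade-off this record examines, analytic versus finite-difference sensitivity derivatives, is easy to see on a toy response function (the cubic and the step size below are illustrative): the analytic derivative is exact, while central differencing needs two perturbed evaluations per design variable and carries O(h**2) truncation error.

```python
def analytic_deriv(x):
    """Exact sensitivity of the toy response f(x) = x**3."""
    return 3.0 * x * x

def fd_deriv(x, h=1e-3):
    """Central finite-difference approximation of the same sensitivity,
    with truncation error exactly h**2 for this cubic response."""
    f = lambda v: v ** 3
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = analytic_deriv(2.0)   # 12.0 exactly
approx = fd_deriv(2.0)        # 12.000001: off by the h**2 truncation error
```

    For a flow solver, each finite-difference sample means a full re-analysis, which is why the quasi-analytical route scales better with the number of design variables.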

  13. Analytical design of an industrial two-term controller for optimal regulatory control of open-loop unstable processes under operational constraints.

    PubMed

    Tchamna, Rodrigue; Lee, Moonyong

    2018-01-01

    This paper proposes a novel optimization-based approach for the design of an industrial two-term proportional-integral (PI) controller for the optimal regulatory control of unstable processes subjected to three common operational constraints related to the process variable, manipulated variable and its rate of change. To derive analytical design relations, the constrained optimal control problem in the time domain was transformed into an unconstrained optimization problem in a new parameter space via an effective parameterization. The resulting optimal PI controller has been verified to yield optimal performance and stability of an open-loop unstable first-order process under operational constraints. The proposed analytical design method explicitly takes into account the operational constraints in the controller design stage and also provides useful insights into the optimal controller design. Practical procedures for designing optimal PI parameters and a feasible constraint set exclusive of complex optimization steps are also proposed. The proposed controller was compared with several other PI controllers to illustrate its performance. The robustness of the proposed controller against plant-model mismatch has also been investigated. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
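
    A minimal simulation of the setting this record addresses: PI regulation of an open-loop unstable first-order process with a hard constraint on the manipulated variable. The plant parameters, gains, and limits below are invented for illustration, not the optimal values derived by the authors.

```python
def simulate_pi(kp=3.0, ki=2.0, a=1.0, b=1.0, setpoint=1.0,
                u_max=5.0, dt=0.01, steps=2000):
    """PI regulation of the unstable first-order plant dx/dt = a*x + b*u (a > 0),
    with a saturation constraint on the manipulated variable u.
    Returns the process variable after the simulation horizon."""
    x, integ = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - x
        integ += e * dt
        u = kp * e + ki * integ
        u = max(-u_max, min(u_max, u))   # manipulated-variable constraint
        x += (a * x + b * u) * dt        # Euler step of the unstable plant
    return x

x_final = simulate_pi()   # settles at the setpoint despite instability
```

    With these gains the unsaturated closed loop has characteristic polynomial s**2 + (b*kp - a)*s + b*ki = s**2 + 2s + 2, so the loop is stable even though the plant alone is not; the paper's contribution is choosing such gains optimally while guaranteeing the constraints are respected.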

  14. Durability predictions of adhesively bonded composite structures using accelerated characterization methods

    NASA Technical Reports Server (NTRS)

    Brinson, H. F.

    1985-01-01

    The utilization of adhesive bonding for composite structures is briefly assessed. The need for a method to determine damage initiation and propagation for such joints is outlined. Methods currently in use to analyze both adhesive joints and fiber reinforced plastics are mentioned, and it is noted that all such methods require as input the mechanical properties of the polymeric adhesive and composite matrix material. The mechanical properties of polymers are viscoelastic and sensitive to environmental effects. A method to analytically characterize environmentally dependent linear and nonlinear viscoelastic properties is given. The methodology can be used to extrapolate short term data to long term design lifetimes; that is, it can be used for long term durability predictions. Experimental results for neat adhesive resins, polymers used as composite matrices, and unidirectional composite laminates are given. The data are fitted well by the analytical durability methodology. Finally, suggestions are outlined for the development of an analytical methodology for durability predictions of adhesively bonded composite structures.

  15. Symbolic Drawings Reveal Changes in Preservice Teacher Mathematics Attitudes after a Mathematics Methods Course

    ERIC Educational Resources Information Center

    Rule, Audrey C.; Harrell, Mary H.

    2006-01-01

    A new method of analyzing mathematics attitudes through symbolic drawings, situated within the field of Jungian-oriented analytical psychology, was applied to 52 preservice elementary teachers before and after a mathematics methods course. In this triangulation mixed methods design study, pretest images related to past mathematics experiences…

  16. A methodology for designing aircraft to low sonic boom constraints

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.; Needleman, Kathy E.

    1991-01-01

    A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.

  17. Photonic crystal-based optical biosensor: a brief investigation

    NASA Astrophysics Data System (ADS)

    Divya, J.; Selvendran, S.; Sivanantha Raja, A.

    2018-06-01

    In this paper, a two-dimensional photonic crystal biosensor for medical applications based on two waveguides and a nanocavity was explored with different shoulder-coupled nanocavity structures. The most important biosensor parameters, like the sensitivity and quality factor, can be significantly improved. By injecting an analyte into a sensing hole, the refractive index of the hole was changed. This refractive index biosensor senses the changes and shifts its operating wavelength accordingly. The transmission characteristics of light in the biosensor under different refractive indices that correspond to the change in the analyte concentration are analyzed by the finite-difference time-domain method. The band gap for each structure is designed and observed by the plane wave expansion method. These proposed structures are designed to obtain an analyte refractive index variation of about 1–1.5 in an optical wavelength range of 1.250–1.640 µm. Accordingly, an improved sensitivity of 136.6 nm RIU‑1 and a quality factor as high as 3915 is achieved. An important feature of this structure is its very small dimensions. Such a combination of attributes makes the designed structure a promising element for label-free biosensing applications.

  18. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  19. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, reuse, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  20. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
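
The paper's own circuit construction is not reproduced here, but the general idea of probabilistic non-unitary computation can be sketched with a standard one-ancilla unitary dilation: a sub-normalized operator M is embedded in a larger unitary, the desired state M|ψ⟩ is obtained when the ancilla is measured in |0⟩, and the success probability is ‖M|ψ⟩‖². The matrix M below is a random stand-in, not an operator from the paper:

```python
import numpy as np

def dilate(M):
    """Embed a non-unitary matrix M with ||M||_2 <= 1 into a unitary on one
    extra ancilla qubit: U = [[M, K], [L, -M^dag]], where
    K = sqrt(I - M M^dag) and L = sqrt(I - M^dag M)."""
    def psd_sqrt(A):
        w, V = np.linalg.eigh(A)                     # A is Hermitian PSD
        return (V * np.sqrt(np.clip(w, 0, None))) @ V.conj().T
    I = np.eye(M.shape[0])
    K = psd_sqrt(I - M @ M.conj().T)
    L = psd_sqrt(I - M.conj().T @ M)
    return np.block([[M, K], [L, -M.conj().T]])

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
M = A / np.linalg.norm(A, 2)          # rescale so the dilation exists
U = dilate(M)

psi = np.array([1.0, 0.0], dtype=complex)
out = U @ np.concatenate([psi, np.zeros(2)])   # ancilla starts in |0>
p_success = np.linalg.norm(out[:2]) ** 2       # prob. of measuring ancilla |0>
```

On success, the normalized first block equals M|ψ⟩/‖M|ψ⟩‖, i.e. the non-unitary map has been applied exactly, at the cost of a finite success probability.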

  1. Current Trends in Nanomaterial-Based Amperometric Biosensors

    PubMed Central

    Hayat, Akhtar; Catanante, Gaëlle; Marty, Jean Louis

    2014-01-01

    The last decade has witnessed an intensive research effort in the field of electrochemical sensors, with a particular focus on the design of amperometric biosensors for diverse analytical applications. In this context, nanomaterial integration in the construction of amperometric biosensors may constitute one of the most exciting approaches. The attractive properties of nanomaterials have paved the way for the design of a wide variety of biosensors based on various electrochemical detection methods to enhance the analytical characteristics. However, many of these nanostructured materials have yet to be explored in the design of amperometric biosensors. This review aims to provide insight into the diverse properties of nanomaterials that could be explored in the construction of amperometric biosensors. PMID:25494347

  2. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
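
The advantage of analytic derivatives over finite differencing can be illustrated on a toy objective (this is not the PyCycle engine model; the function, learning rate, and step size are invented). Analytic gradients avoid the step-size tuning and extra function evaluations that finite differencing requires, while both converge to the same optimum here:

```python
import numpy as np

def f(x):                       # simple smooth test objective (not an engine model)
    return np.sum((x - 3.0) ** 2) + 0.1 * np.sum(x ** 4)

def grad_analytic(x):           # exact derivative of f
    return 2.0 * (x - 3.0) + 0.4 * x ** 3

def grad_fd(x, h=1e-6):         # forward differences: n extra f-evals per call
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x)) / h
    return g

def descent(grad, x0, lr=0.02, steps=500):
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x0 = np.zeros(4)
xa = descent(grad_analytic, x0)   # analytic gradients
xf = descent(grad_fd, x0)         # finite-difference gradients
```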

  3. Evaluation Using Sequential Trials Methods.

    ERIC Educational Resources Information Center

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)
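
One classical sequential trials procedure is Wald's sequential probability ratio test (SPRT); a minimal sketch for binary clinical outcomes, with invented hypothesis values and an invented observation stream:

```python
import math

def sprt_bernoulli(observations, p0=0.5, p1=0.8, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on a stream of 0/1 outcomes.
    Returns (decision, n_used); decision is None if no boundary was hit."""
    upper = math.log((1 - beta) / alpha)     # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))     # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(observations, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return None, len(observations)

# mostly-successful outcomes push the log-likelihood ratio across the H1 boundary
decision, n = sprt_bernoulli([1, 1, 1, 0, 1, 1, 1, 1, 1, 1])
```

The appeal for clinical evaluation is that the trial stops as soon as the evidence is decisive, often with fewer subjects than a fixed-sample design.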

  4. 76 FR 43231 - Receipt of Several Pesticide Petitions Filed for Residues of Pesticide Chemicals in or on Various...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... of food with residues at or above the levels set in these tolerances. The Analytical Chemistry... the limit of detection of the designated method. In plants, the method is aqueous organic solvent...

  5. Workspace Program for Complex-Number Arithmetic

    NASA Technical Reports Server (NTRS)

    Patrick, M. C.; Howell, Leonard W., Jr.

    1986-01-01

    COMPLEX is a workspace program designed to empower APL with complex-number capabilities. Complex-variable methods provide analytical tools that are invaluable for applications in mathematics, science, and engineering. COMPLEX is written in APL.

  6. Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.

    1989-01-01

    The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.

  7. Tire Changes, Fresh Air, and Yellow Flags: Challenges in Predictive Analytics for Professional Racing.

    PubMed

    Tulabandhula, Theja; Rudin, Cynthia

    2014-06-01

    Our goal is to design a prediction and decision system for real-time use during a professional car race. In designing a knowledge discovery process for racing, we faced several challenges that were overcome only when domain knowledge of racing was carefully infused within statistical modeling techniques. In this article, we describe how we leveraged expert knowledge of the domain to produce a real-time decision system for tire changes within a race. Our forecasts have the potential to impact how racing teams can optimize strategy by making tire-change decisions to benefit their rank position. Our work significantly expands previous research on sports analytics, as it is the only work on analytical methods for within-race prediction and decision making for professional car racing.

  8. Surface enhanced Raman spectroscopy based nanoparticle assays for rapid, point-of-care diagnostics

    NASA Astrophysics Data System (ADS)

    Driscoll, Ashley J.

    Nucleotide assays and immunoassays are important tools for disease diagnostics. Many of the current laboratory-based analytical diagnostic techniques require multiple assay steps and long incubation times before results are acquired. In the development of bioassays designed for detecting the emergence and spread of diseases in point-of-care (POC) and remote settings, more rapid and portable analytical methods are necessary. Nanoparticles provide simple and reproducible synthetic methods for the preparation of substrates that can be applied in colloidal assays, providing gains in kinetics due to miniaturization and plasmonic substrates for surface enhanced spectroscopies. Specifically, surface enhanced Raman spectroscopy (SERS) is finding broad application as a signal transduction method in immunological and nucleotide assays due to the production of narrow spectral peaks from the scattering molecules and the potential for simultaneous multiple analyte detection. The application of SERS to a no-wash, magnetic capture assay for the detection of West Nile Virus Envelope and Rift Valley Fever Virus N antigens is described. The platform utilizes colloid based capture of the target antigen in solution, magnetic collection of the immunocomplexes and acquisition of SERS spectra by a handheld Raman spectrometer. The reagents for a core-shell nanoparticle, SERS based assay designed for the capture of target microRNA implicated in acute myocardial infarction are also characterized. Several new, small molecule Raman scatterers are introduced and used to analyze the enhancing properties of the synthesized gold coated-magnetic nanoparticles. Nucleotide and immunoassay platforms have shown improvements in speed and analyte capture through the miniaturization of the capture surface, and particle-based capture systems can provide a route to further surface miniaturization. A reaction-diffusion model of the colloidal assay platform is presented to understand the interplay of system parameters such as particle diameter, initial analyte concentration and dissociation constants. The projected sensitivities over a broad range of assay conditions are examined and the governing regime of particle systems reported. The results provide metrics in the design of more robust analytics that are of particular interest for POC diagnostics.
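
The interplay the reaction-diffusion model captures can be hinted at with a much simpler, well-mixed 1:1 binding sketch (no diffusion term; all rate constants and concentrations below are invented illustrative values, not parameters from the thesis). With analyte in large excess, the dissociation constant and analyte concentration set the equilibrium captured fraction:

```python
# Well-mixed 1:1 binding A + B <-> AB, integrated by forward Euler.
k_on, k_off = 1e5, 1e-3          # 1/(M*s) and 1/s  ->  Kd = 10 nM (invented)
A0, B0 = 50e-9, 1e-9             # analyte (excess) and capture sites, M
dt, T = 0.01, 2000.0             # time step and horizon, s

ab = 0.0                         # bound complex concentration, M
for _ in range(int(T / dt)):
    ab += dt * (k_on * (A0 - ab) * (B0 - ab) - k_off * ab)

bound_fraction = ab / B0
Kd = k_off / k_on
# with analyte in large excess, equilibrium fraction ~ A0 / (A0 + Kd)
pred = A0 / (A0 + Kd)
```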

  9. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work a protocol to validate analytical procedures for the quantification of drug substances formulated in polymeric systems that comprise both drug entrapped into the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test) is developed. This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (ability to quantify dexamethasone phosphate disodium in presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) in the assay:content test procedure and from 0.25 to 10 μg mL(-1) in the assay:dissolution test procedure. The robustness of the analytical method to extract drug from microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Historical review of missile aerodynamic developments

    NASA Technical Reports Server (NTRS)

    Spearman, M. Leroy

    1989-01-01

    A comprehensive development history to about 1970 is presented for missile technologies and their associated capabilities and difficulties. Attention is given to the growth of an experimental data base for missile design, as well as to the critical early efforts to develop analytical methods applicable to missiles. Most of the important missile development efforts made during the period from the end of the Second World War to the early 1960s were based primarily on experiences gained through wind tunnel and flight testing; analytical techniques began to demonstrate their usefulness in the design process only in the late 1960s.

  11. Study designs appropriate for the workplace.

    PubMed

    Hogue, C J

    1986-01-01

    Carlo and Hearn have called for "refinement of old [epidemiologic] methods and an ongoing evaluation of where methods fit in the overall scheme as we address the multiple complexities of reproductive hazard assessment." This review is an attempt to bring together the current state-of-the-art methods for problem definition and hypothesis testing available to the occupational epidemiologist. For problem definition, meta analysis can be utilized to narrow the field of potential causal hypotheses. Passive and active surveillance may further refine issues for analytic research. Within analytic epidemiology, several methods may be appropriate for the workplace setting. Those discussed here may be used to estimate the risk ratio in either a fixed or dynamic population.

  12. A Digital Mixed Methods Research Design: Integrating Multimodal Analysis with Data Mining and Information Visualization for Big Data Analytics

    ERIC Educational Resources Information Center

    O'Halloran, Kay L.; Tan, Sabine; Pham, Duc-Son; Bateman, John; Vande Moere, Andrew

    2018-01-01

    This article demonstrates how a digital environment offers new opportunities for transforming qualitative data into quantitative data in order to use data mining and information visualization for mixed methods research. The digital approach to mixed methods research is illustrated by a framework which combines qualitative methods of multimodal…

  13. Designing a Double-Pole Nanoscale Relay Based on a Carbon Nanotube: A Theoretical Study

    NASA Astrophysics Data System (ADS)

    Mu, Weihua; Ou-Yang, Zhong-can; Dresselhaus, Mildred S.

    2017-08-01

    We theoretically investigate a novel and powerful double-pole nanoscale relay based on a carbon nanotube, a nanoelectromechanical switch able to operate under strong nuclear radiation, and analyze the physical mechanism of its operating stages, including "pull in," "connection," and "pull back," as well as the key factors influencing the efficiency of the device. We explicitly provide analytical expressions for the two important operation voltages, V_pull-in and V_pull-back, clearly showing their dependence on the material properties and geometry of the present device by an analytical method from basic physics, avoiding complex numerical calculations. Our method is easy to use in preparing the design guide for fabricating the present device and other nanoelectromechanical devices.

  14. Design optimization of an axial-field eddy-current magnetic coupling based on magneto-thermal analytical model

    NASA Astrophysics Data System (ADS)

    Fontchastagner, Julien; Lubin, Thierry; Mezani, Smaïl; Takorabet, Noureddine

    2018-03-01

    This paper presents a design optimization of an axial-flux eddy-current magnetic coupling. The design procedure is based on a torque formula derived from a 3D analytical model and a population algorithm method. The main objective of this paper is to determine the best design in terms of magnets volume in order to transmit a torque between two movers, while ensuring a low slip speed and a good efficiency. The torque formula is very accurate and computationally efficient, and is valid for any slip speed values. Nevertheless, in order to solve more realistic problems, and then, take into account the thermal effects on the torque value, a thermal model based on convection heat transfer coefficients is also established and used in the design optimization procedure. Results show the effectiveness of the proposed methodology.

  15. Designing Glass Panels for Economy and Reliability

    NASA Technical Reports Server (NTRS)

    Moore, D. M.

    1983-01-01

    Analytical method determines probability of failure of rectangular glass plates subjected to uniformly distributed loads such as those from wind, earthquake, snow, and deadweight. Developed as aid in design of protective glass covers for solar-cell arrays and solar collectors, method is also useful in estimating the reliability of large windows in buildings exposed to high winds and is adapted to nonlinear stress analysis of simply supported plates of any elastic material.

  16. 77 FR 15722 - Southern California Hook and Line Survey; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ... meeting to evaluate the Southern California Shelf Rockfish Hook and Line Survey which was designed to... and Line survey design and protocols; (2) examine the analytical methods used to generate rockfish... California Hook and Line Survey; Public Meeting AGENCY: National Marine Fisheries Service (NMFS), National...

  17. Vortex-Lattice Utilization. [in aeronautical engineering and aircraft design

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The many novel, innovative, and unique implementations and applications of the vortex-lattice method to aerodynamic design and analysis which have been performed by Industry, Government, and Universities were presented. Although this analytical tool is not new, it continues to be utilized and refined in the aeronautical community.

  18. A Novel Analysis Method for Paired-Sample Microbial Ecology Experiments.

    PubMed

    Olesen, Scott W; Vora, Suhani; Techtmann, Stephen M; Fortney, Julian L; Bastidas-Oyanedel, Juan R; Rodríguez, Jorge; Hazen, Terry C; Alm, Eric J

    2016-01-01

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
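
The Poisson lognormal compound distribution the method builds on is easy to simulate: each taxon's latent abundance is lognormal, and the observed sequencing count is Poisson around it. A minimal sketch (the parameters below are arbitrary, not fitted to the oil microcosm data):

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, n_taxa = 1.0, 0.8, 200_000

# latent abundances ~ lognormal; observed counts ~ Poisson(latent)
latent = rng.lognormal(mean=mu, sigma=sigma, size=n_taxa)
counts = rng.poisson(latent)

# marginal mean of the compound distribution is E[latent] = exp(mu + sigma^2/2)
expected_mean = np.exp(mu + sigma ** 2 / 2)
```

The heavy lognormal tail gives the characteristic excess of rare and very abundant taxa relative to a plain Poisson model of 16S rRNA counts.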

  19. Utilizing global data to estimate analytical performance on the Sigma scale: A global comparative analysis of methods, instruments, and manufacturers through external quality assurance and proficiency testing programs.

    PubMed

    Westgard, Sten A

    2016-06-01

    To assess the analytical performance of instruments and methods through external quality assessment and proficiency testing data on the Sigma scale. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal. Then Sigma-metrics were calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of the chemistry analytes that are expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers in three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities. Quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself. It is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult this data as part of their decision-making process. To confirm that these assessments are stable and reliable, a longer term study should be conducted that examines more results over a longer time period. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
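
The Sigma-metric used throughout is the standard (TEa − |bias|)/CV calculation; a minimal sketch with invented bias and CV figures (the TEa values are in the style of CLIA-type allowable-total-error criteria, and the instrument-group CVs stand in for the EQA/PT group standard deviations):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, with all inputs in percent of the target."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical illustrative figures per analyte: (TEa %, bias %, CV %)
analytes = {
    "glucose": (10.0, 2.0, 1.3),
    "albumin": (10.0, 1.0, 2.5),
    "sodium":  (2.86, 0.5, 0.6),   # CLIA-style limit of +/- 4 mmol/L at 140
}
for name, (tea, bias, cv) in analytes.items():
    s = sigma_metric(tea, bias, cv)
    flag = "(optimized QC design possible)" if s >= 5 else ""
    print(f"{name}: {s:.1f} sigma {flag}")
```

Methods at or above Five Sigma tolerate simple, low-frequency QC rules; those below need more stringent QC, which is why the percentage of analytes above Five Sigma is a useful summary of an instrument's performance.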

  20. Cylindrical optical resonators: fundamental properties and bio-sensing characteristics

    NASA Astrophysics Data System (ADS)

    Khozeymeh, Foroogh; Razaghi, Mohammad

    2018-04-01

    In this paper, detailed theoretical analysis of cylindrical resonators is demonstrated. As illustrated, these kinds of resonators can be used as optical bio-sensing devices. The proposed structure is analyzed using an analytical method based on Lam's approximation. This method is systematic and has simplified the tedious process of whispering-gallery mode (WGM) wavelength analysis in optical cylindrical biosensors. By this method, analysis of higher radial orders of high angular momentum WGMs has been possible. Using closed-form analytical equations, resonance wavelengths of higher radial and angular order WGMs of TE and TM polarization waves are calculated. It is shown that high angular momentum WGMs are more appropriate for bio-sensing applications. Some of the calculations are done using a numerical non-linear Newton method. A match of 99.84% between the analytical and the numerical methods has been achieved. In order to verify the validity of the calculations, Meep simulations based on the finite difference time domain (FDTD) method are performed. In this case, a match of 96.70% between the analytical and FDTD results has been obtained. The analytical predictions are also in good agreement with other experimental work (99.99% match). These results validate the proposed analytical modelling for the fast design of optical cylindrical biosensors. It is shown that by extending the proposed two-layer resonator structure analyzing scheme, it is possible to study a three-layer cylindrical resonator structure as well. Moreover, by this method, fast sensitivity optimization in cylindrical resonator-based biosensors has been possible. Sensitivity of the WGM resonances is analyzed as a function of the structural parameters of the cylindrical resonators. Based on the results, fourth radial order WGMs, with a resonator radius of 50 μm, display the highest bulk refractive index sensitivity, 41.50 nm/RIU.

  1. A method to design blended rolled edges for compact range reflectors

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1989-01-01

    A method to design blended rolled edges for arbitrary rim shape compact range reflectors is presented. The reflectors may be center-fed or offset-fed. The method leads to rolled edges with minimal surface discontinuities. It is shown that the reflectors designed using the prescribed method can be defined analytically using simple expressions. A procedure to obtain optimum rolled-edge parameters is also presented. The procedure leads to blended rolled edges that minimize the diffracted fields emanating from the junction between the paraboloid and the rolled edge surface while satisfying certain constraints regarding the reflector size and the minimum operating frequency of the system.

  2. A method to design blended rolled edges for compact range reflectors

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Ericksen, Kurt P.; Burnside, Walter D.

    1990-01-01

    A method to design blended rolled edges for arbitrary rim shape compact range reflectors is presented. The reflectors may be center-fed or offset-fed. The method leads to rolled edges with minimal surface discontinuities. It is shown that the reflectors designed using the prescribed method can be defined analytically using simple expressions. A procedure to obtain optimum rolled edges parameters is also presented. The procedure leads to blended rolled edges that minimize the diffracted fields emanating from the junction between the paraboloid and the rolled edge surface while satisfying certain constraints regarding the reflector size and the minimum operating frequency of the system.

  3. Simultaneous determination of macronutrients, micronutrients and trace elements in mineral fertilizers by inductively coupled plasma optical emission spectrometry

    NASA Astrophysics Data System (ADS)

    de Oliveira Souza, Sidnei; da Costa, Silvânio Silvério Lopes; Santos, Dayane Melo; dos Santos Pinto, Jéssica; Garcia, Carlos Alexandre Borges; Alves, José do Patrocínio Hora; Araujo, Rennan Geovanny Oliveira

    2014-06-01

    An analytical method for simultaneous determination of macronutrients (Ca, Mg, Na and P), micronutrients (Cu, Fe, Mn and Zn) and trace elements (Al, As, Cd, Pb and V) in mineral fertilizers was optimized. Two-level full factorial design was applied to evaluate the optimal proportions of reagents used in the sample digestion on hot plate. A Doehlert design for two variables was used to evaluate the operating conditions of the inductively coupled plasma optical emission spectrometer in order to accomplish the simultaneous determination of the analyte concentrations. The limits of quantification (LOQs) ranged from 2.0 mg kg⁻¹ for Mn to 77.3 mg kg⁻¹ for P. The accuracy and precision of the proposed method were evaluated by analysis of standard reference materials (SRMs) of Western phosphate rock (NIST 694), Florida phosphate rock (NIST 120C) and Trace elements in multi-nutrient fertilizer (NIST 695), considered to be adequate for simultaneous determination. Twenty-one samples of mineral fertilizers collected in Sergipe State, Brazil, were analyzed. For all samples, the As, Ca, Cd and Pb concentrations were below the LOQ values of the analytical method. For As, Cd and Pb the obtained LOQ values were below the maximum limit allowed by the Brazilian Ministry of Agriculture, Livestock and Food Supply (Ministério da Agricultura, Pecuária e Abastecimento - MAPA). The optimized method presented good accuracy and was effectively applied to quantitative simultaneous determination of the analytes in mineral fertilizers by inductively coupled plasma optical emission spectrometry (ICP OES).
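
A two-level full factorial design such as the one used for the digestion reagents simply enumerates every low/high combination of the coded factors; a minimal sketch with hypothetical factor names (the Doehlert design used for the spectrometer settings is a response-surface design and is not sketched here):

```python
from itertools import product

def two_level_full_factorial(factors):
    """All 2^k coded runs (-1 = low level, +1 = high level) for the factors."""
    return [dict(zip(factors, levels))
            for levels in product((-1, +1), repeat=len(factors))]

# e.g. proportions of three digestion reagents (hypothetical factor names)
design = two_level_full_factorial(["HNO3", "HCl", "H2O2"])
for run in design:
    print(run)        # 8 runs covering every low/high combination
```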

  4. Non-imaging ray-tracing for sputtering simulation with apodization

    NASA Astrophysics Data System (ADS)

    Ou, Chung-Jen

    2018-04-01

    Although apodization patterns have been adopted for the analysis of sputtering sources, analytical solutions for the film thickness equations are still limited to simple conditions. Because empirical formulations for thin-film sputtering lack the flexibility to deal with multi-substrate conditions, a suitable cost-effective procedure is required to estimate the film thickness distribution. This study reports a cross-discipline simulation program, based on discrete-particle Monte-Carlo methods, that has been successfully applied to a non-imaging design to solve problems associated with sputtering uniformity. The robustness of the present method is first proved by comparison with a typical analytical solution. This report also investigates the overall effects caused by the size of the deposited substrate, so that the determination of the distance to the target surface and of the apodization index can be completed. This verifies the capability of the proposed method for solving sputtering film thickness problems. The benefit is that an optical thin-film engineer can, using the same optical software, design a specific optical component and consider the possible coating qualities with thickness tolerance during the design stage.
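
One common way to model an apodized sputtering source in a discrete-particle Monte-Carlo code is a cos^n emission profile; a minimal sketch of the polar-angle sampling by CDF inversion (the apodization exponent and target-substrate distance below are assumed values, not parameters from the paper):

```python
import numpy as np

def sample_cos_power(n_exp, size, rng):
    """Draw polar emission angles with flux ∝ cos^n(θ) sin(θ) dθ
    (an over-cosine / apodized source model) by inverting the CDF:
    cos(θ) = (1 - u)^(1/(n+1)) for uniform u in [0, 1)."""
    u = rng.random(size)
    return np.arccos((1.0 - u) ** (1.0 / (n_exp + 1.0)))

rng = np.random.default_rng(7)
n_exp = 2.0                       # apodization exponent (assumed)
theta = sample_cos_power(n_exp, 200_000, rng)

# Landing radius on a flat substrate at height h above a point source
h = 0.10                          # m, assumed target-substrate distance
r = h * np.tan(theta)             # histogram of r gives the thickness profile
```

Larger exponents concentrate the flux on axis, which is exactly the uniformity trade-off such a simulation is used to explore. For the pdf above, E[cos θ] = (n+1)/(n+2), a convenient sanity check on the sampler.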

  5. Non-imaging ray-tracing for sputtering simulation with apodization

    NASA Astrophysics Data System (ADS)

    Ou, Chung-Jen

    2018-06-01

    Although apodization patterns have been adopted for the analysis of sputtering sources, analytical solutions for the film thickness equations are still limited to simple conditions. Because empirical formulations for thin-film sputtering lack the flexibility to deal with multi-substrate conditions, a suitable cost-effective procedure is required to estimate the film thickness distribution. This study reports a cross-discipline simulation program, based on discrete-particle Monte-Carlo methods, that has been successfully applied to a non-imaging design to solve problems associated with sputtering uniformity. The robustness of the present method is first proved by comparison with a typical analytical solution. This report also investigates the overall effects caused by the size of the deposited substrate, so that the determination of the distance to the target surface and of the apodization index can be completed. This verifies the capability of the proposed method for solving sputtering film thickness problems. The benefit is that an optical thin-film engineer can, using the same optical software, design a specific optical component and consider the possible coating qualities with thickness tolerance during the design stage.

  6. Research education: findings of a study of teaching-learning research using multiple analytical perspectives.

    PubMed

    Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle

    2014-12-01

    This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. By combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), a course was designed that would place doctoral students in the midst of multiparadigmatic discussions while they learned fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Using content analysis and interpretive phenomenological approaches, and working together with one of the students, the authors analyzed data from field notes recorded in real time over the period the course was offered. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage nurse educators of doctoral nursing students to focus educational practice on multiple methodological perspectives. Copyright 2014, SLACK Incorporated.

  7. Immunoanalysis Methods for the Detection of Dioxins and Related Chemicals

    PubMed Central

    Tian, Wenjing; Xie, Heidi Qunhui; Fu, Hualing; Pei, Xinhui; Zhao, Bin

    2012-01-01

    With the development of biotechnology, approaches based on antibodies, such as enzyme-linked immunosorbent assay (ELISA), active aryl hydrocarbon immunoassay (Ah-I) and other multi-analyte immunoassays, have been utilized as alternatives to the conventional techniques based on gas chromatography and mass spectrometry for the analysis of dioxin and dioxin-like compounds in environmental and biological samples. These screening methods have been verified as rapid, simple and cost-effective. This paper provides an overview of the development and application of antibody-based approaches, such as ELISA, Ah-I, and multi-analyte immunoassays, covering sample extraction and cleanup, antigen design, antibody preparation and immunoanalysis. However, in order to meet the requirements for on-site fast detection and relative quantification of dioxins in the environment, further optimization is needed to make these immuno-analytical methods more sensitive and easy to use. PMID:23443395

  8. An analytical model to design circumferential clasps for laser-sintered removable partial dentures.

    PubMed

    Alsheghri, Ammar A; Alageel, Omar; Caron, Eric; Ciobanu, Ovidiu; Tamimi, Faleh; Song, Jun

    2018-06-21

    Clasps of removable partial dentures (RPDs) often suffer from plastic deformation and failure by fatigue, a common complication of RPDs. A new technology for processing metal frameworks for dental prostheses based on laser-sintering, which allows for precise fabrication of clasp geometry, has recently been developed. This study sought to propose a novel method for designing circumferential clasps for laser-sintered RPDs to avoid plastic deformation or fatigue failure. An analytical model for designing clasps with semicircular cross-sections was derived based on mechanics. The Euler-Bernoulli elastic curved beam theory and Castigliano's energy method were used to relate the stress and undercut with the clasp length, cross-sectional radius, alloy properties, tooth type, and retention force. Finite element analysis (FEA) was conducted on a case study and the resultant tensile stress and undercut were compared with the analytical model predictions. Pull-out experiments were conducted on laser-sintered cobalt-chromium (Co-Cr) dental prostheses to validate the analytical model results. The proposed circumferential clasp design model yields results in good agreement with FEA and experiments. The results indicate that Co-Cr circumferential clasps in molars that are 13 mm long engaging undercuts of 0.25 mm should have a cross-section radius of 1.2 mm to provide a retention force of 10 N and to avoid plastic deformation or fatigue failure. However, shorter circumferential clasps such as those in premolars present high stresses and cannot avoid plastic deformation or fatigue failure. Laser-sintered Co-Cr circumferential clasps in molars are safe, whereas they are susceptible to failure in premolars. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
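An order-of-magnitude check on the reported numbers can be made by treating the clasp as a straight elastic cantilever with a semicircular cross-section. This is a deliberate simplification of the paper's curved-beam-plus-Castigliano model (a straight beam overestimates stiffness), and the modulus value below is an assumption, not taken from the paper.

```python
import math

# Order-of-magnitude sketch of clasp retention, treating the clasp as a
# straight elastic cantilever with a semicircular cross-section. The paper's
# model uses curved-beam theory plus Castigliano's method, so this straight
# beam overestimates stiffness; the modulus value is an assumption.
E = 220e3        # Co-Cr elastic modulus, N/mm^2 (~220 GPa, assumed)
L = 13.0         # clasp length, mm (molar case from the abstract)
r = 1.2          # semicircular cross-section radius, mm
undercut = 0.25  # tip deflection while riding over the undercut, mm

# Second moment of area of a semicircle about its centroidal axis.
I = (math.pi / 8 - 8 / (9 * math.pi)) * r**4

# Tip force required to deflect the cantilever by the undercut depth.
F = 3 * E * I * undercut / L**3   # newtons
```

The straight-beam estimate lands in the tens-of-newtons range, the same order as the approximately 10 N retention force the abstract reports for the more compliant curved geometry.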

  9. Quantitative Electron Probe Microanalysis: State of the Art

    NASA Technical Reports Server (NTRS)

    Carpenter, P. K.

    2005-01-01

    Quantitative electron-probe microanalysis (EPMA) has improved due to better instrument design and X-ray correction methods. Design improvements in the electron column and X-ray spectrometer have resulted in measurement precision that exceeds analytical accuracy. Wavelength-dispersive spectrometers (WDS) have layered-dispersive diffraction crystals with improved light-element sensitivity. Newer energy-dispersive spectrometers (EDS) have Si-drift detector elements, thin-window designs, and digital processing electronics with X-ray throughput approaching that of WDS systems. Using these systems, digital X-ray mapping coupled with spectrum imaging is a powerful compositional mapping tool. Improvements in analytical accuracy are due to better X-ray correction algorithms, mass-absorption-coefficient data sets, and analysis methods for complex geometries. ZAF algorithms have been superseded by Phi(pz) algorithms that better model the depth distribution of primary X-ray production. Complex thin-film and particle geometries are treated using Phi(pz) algorithms, and results agree well with Monte Carlo simulations. For geological materials, X-ray absorption dominates the corrections and depends on the accuracy of mass absorption coefficient (MAC) data sets. However, few MACs have been experimentally measured, and the use of fitted coefficients continues due to the general success of the analytical technique. A polynomial formulation of the Bence-Albee alpha-factor technique, calibrated using Phi(pz) algorithms, is used to critically evaluate accuracy issues. Accuracy approaches 2% relative and is limited by measurement precision for ideal cases, but for many elements the analytical accuracy is unproven. The EPMA technique has improved to the point where it is frequently used instead of the petrographic microscope for reconnaissance work. Examples of stagnant research areas are: WDS detector design, characterization of calibration standards, and the need for more complete treatment of the continuum X-ray fluorescence correction.

  10. Optimal optical filters of fluorescence excitation and emission for poultry fecal detection

    USDA-ARS?s Scientific Manuscript database

    Purpose: An analytic method to design the excitation and emission filters of a multispectral fluorescence imaging system is proposed and demonstrated in an application to poultry fecal inspection. Methods: A mathematical model of a multispectral imaging system is proposed and its system parameters, ...

  11. To Demonstrate the Specificity of an Enzymatic Method for Plasma Paracetamol Estimation.

    ERIC Educational Resources Information Center

    O'Mullane, John A.

    1987-01-01

    Describes an experiment designed to introduce biochemistry students to the specificity of an analytical method which uses an enzyme to quantitate its substrate. Includes the use of toxicity charts together with the concept of the biological half-life of a drug. (TW)

  12. Three-dimensional nonsteady heat-transfer analysis of an indirect heating furnace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ito, H.; Umeda, Y.; Nakamura, Y.

    1991-01-01

    This paper reports on an accurate design method for industrial furnaces from the viewpoint of heat transfer. The authors carried out a three-dimensional nonsteady heat-transfer analysis for a practical-size heat-treatment furnace equipped with radiant heaters. The authors applied three software package programs, STREAM, MORSE, and TRUMP, for the analysis of the combined heat-transfer problems of radiation, conduction, and convection. The authors also carried out experiments on the heating of a charge consisting of packed bolts. The authors found that the air swirled inside the furnace. As for the temperature in each part of the furnace, analytical results were generally in close agreement with the experimental ones. This suggests that the analytical method is useful for a fundamental heat-transfer-based design of a practical-size industrial furnace with an actual charge such as packed bolts. As for the temperature distribution inside the bolt charge (work), the analytical results were also in close agreement with the experimental ones. Consequently, it was found that the heat transfer in the bolt charge could be described with an effective thermal conductivity.

  13. Mass-based design and optimization of wave rotors for gas turbine engine enhancement

    NASA Astrophysics Data System (ADS)

    Chan, S.; Liu, H.

    2017-03-01

    An analytic method aiming at mass properties was developed for the preliminary design and optimization of wave rotors. In the present method, we introduce the mass-balance principle into the design and thus can predict and optimize the mass properties as well as the performance of wave rotors. A dedicated least-squares method with artificial weighting coefficients was developed to solve the over-constrained system in the mass-based design. This method and the adoption of the coefficients were validated by numerical simulation. Moreover, the problem of fresh air exhaustion (FAE) was put forward and analyzed, and exhaust gas recirculation (EGR) was investigated. Parameter analyses and optimization elucidated which designs would not only achieve the best performance, but also operate with minimum EGR and no FAE.
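The numerical core of such an approach, a weighted least-squares solve of an over-constrained linear system, can be sketched generically. The system matrix and the artificial weights below are placeholders, not the wave-rotor design equations.

```python
import numpy as np

# Minimal weighted least-squares sketch for an over-constrained linear
# system, the numerical core the abstract describes. The system and the
# artificial weights below are placeholders, not the wave-rotor equations.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 3))               # 8 design constraints, 3 unknowns
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + rng.normal(scale=0.01, size=8)   # slightly inconsistent RHS

w = np.array([1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 1.0, 1.0])  # artificial weights

# Row-scaling by sqrt(w) turns the weighted problem into an ordinary lstsq:
# minimize sum_i w_i * (A_i x - b_i)^2.
sw = np.sqrt(w)
x_hat, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
```

Raising a constraint's weight forces the solution to honor it more closely at the expense of the others, which is how such coefficients let a designer trade off competing, mutually inconsistent design conditions.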

  14. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is important for their continued growth and study, however. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw assisted participants to analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  15. Magnetic ionic liquids in analytical chemistry: A review.

    PubMed

    Clark, Kevin D; Nacham, Omprakash; Purslow, Jeffrey A; Pierson, Stephen A; Anderson, Jared L

    2016-08-31

    Magnetic ionic liquids (MILs) have recently generated a cascade of innovative applications in numerous areas of analytical chemistry. By incorporating a paramagnetic component within the cation or anion, MILs exhibit a strong response toward external magnetic fields. Careful design of the MIL structure has yielded magnetoactive compounds with unique physicochemical properties including high magnetic moments, enhanced hydrophobicity, and the ability to solvate a broad range of molecules. The structural tunability and paramagnetic properties of MILs have enabled magnet-based technologies that can easily be added to the analytical method workflow, complement needed extraction requirements, or target specific analytes. This review highlights the application of MILs in analytical chemistry and examines the important structural features of MILs that largely influence their physicochemical and magnetic properties. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Statistical process control in nursing research.

    PubMed

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal random fluctuations (common-cause variation) and statistically significant special-cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
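The control-chart logic can be sketched in a few lines: establish limits from a baseline period, then flag later points that fall outside them as special-cause signals. The data are invented for illustration, and real individuals charts usually estimate sigma from the moving range rather than the sample standard deviation used here for brevity.

```python
import statistics

# Minimal Shewhart individuals-chart sketch: flag special-cause variation as
# points outside mean +/- 3*sigma computed from a baseline period. Data are
# invented; individuals charts conventionally estimate sigma from the moving
# range rather than the sample standard deviation used here for brevity.
baseline = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 12.3, 11.7, 12.0, 12.1]
center_line = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl = center_line + 3 * sigma   # upper control limit
lcl = center_line - 3 * sigma   # lower control limit

# Post-intervention observations; points beyond the limits are signals.
post_intervention = [12.0, 12.2, 13.5, 11.9, 13.8]
special_cause = [x for x in post_intervention if not (lcl <= x <= ucl)]
```

Points within the limits are treated as common-cause noise; the two out-of-limit points are the kind of signal that, sustained, would suggest the intervention shifted the process.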

  17. Comparison of Analysis with Test for Static Loading of Two Hypersonic Inflatable Aerodynamic Decelerator Concepts

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2015-01-01

    Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology demonstration via flight-testing. Hypersonic Inflatable Aerodynamic Decelerator (HIAD) architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. This publication summarizes results comparing analytical results with test data for two concepts subjected to static loading representative of entry conditions. The level of agreement and ability to predict the load distribution is considered sufficient to enable analytical predictions to be used in the design process.

  18. Long-term cryogenic space storage system

    NASA Technical Reports Server (NTRS)

    Hopkins, R. A.; Chronic, W. L.

    1973-01-01

    Discussion of the design, fabrication, and testing of a 225-cu-ft spherical cryogenic storage system for long-term storage of subcritical cryogens under zero-g conditions for space-vehicle propulsion systems. The insulation system design, the analytical methods used, and the correlation between the performance test results and analytical predictions are described. The best available multilayer insulation materials and state-of-the-art thermal protection concepts were applied in the design, providing a boiloff rate of 0.152 lb/hr, or 0.032% per day, and an overall heat flux of 0.066 Btu/sq ft hr based on a 200-sq-ft surface area. The system provides six to eighteen months of cryogenic storage for space applications.

  19. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

    A large number of full-scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Research was also conducted on seat dynamic performance, load-limiting seats, load-limiting subfloor designs, and emergency locator transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full-scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact-load-attenuating concepts. Analytical simulations of metal and composite aircraft crash dynamics are addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  20. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
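The penalty for neglecting transverse shear can be illustrated with a classical Engesser-type correction to the Euler column load, P = P_euler / (1 + P_euler/(k·G·A)). The property values below are invented placeholders, not those of the fluted-core test article; the point is only the direction and rough size of the effect.

```python
import math

# Sketch of why neglecting transverse shear inflates global buckling loads:
# classical Euler load vs. an Engesser-type shear-corrected load
# P = P_euler / (1 + P_euler / (k*G*A)). All property values are invented
# placeholders, not the test article's.
E = 70e9    # effective axial modulus, Pa
G = 5e9     # much softer transverse (core) shear stiffness, Pa
I = 2e-4    # section second moment of area, m^4
A = 1e-2    # cross-sectional area, m^2
k = 0.5     # shear correction factor
L = 3.0     # effective column length, m

P_euler = math.pi**2 * E * I / L**2            # no shear deformation
P_shear = P_euler / (1 + P_euler / (k * G * A))  # shear-corrected
```

With a compliant shear path, the corrected load is tens of percent below the shear-rigid prediction, which is why an analytical preliminary-design method for fluted-core cylinders must carry the transverse-shear term.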

  1. Analysis of Variance in the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2010-01-01

    This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
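As a minimal illustration of the one-way fixed-effects ANOVA the tutorial introduces, the F statistic can be computed by hand from the between-group and within-group variance partition. The three treatment groups below are invented for illustration.

```python
import statistics

# One-way fixed-effects ANOVA computed by hand on three invented treatment
# groups, illustrating the between/within variance partition.
groups = [[4.1, 4.5, 4.3, 4.4], [5.0, 5.2, 4.9, 5.3], [4.2, 4.0, 4.4, 4.1]]
all_obs = [x for g in groups for x in g]
grand_mean = statistics.mean(all_obs)

# Between-group sum of squares: how far group means sit from the grand mean.
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
# Within-group sum of squares: scatter of observations about their own mean.
ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

df_between = len(groups) - 1
df_within = len(all_obs) - len(groups)
f_stat = (ss_between / df_between) / (ss_within / df_within)
```

A large F (compared against the F distribution with the two degrees of freedom above) indicates that the spread among group means exceeds what within-group noise alone would produce.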

  2. A method for determining spiral-bevel gear tooth geometry for finite element analysis

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Litvin, Faydor L.

    1991-01-01

    An analytical method was developed to determine gear tooth surface coordinates of face-milled spiral bevel gears. The method uses the basic gear design parameters in conjunction with the kinematical aspects of spiral bevel gear manufacturing machinery. A computer program, SURFACE, was developed. The computer program calculates the surface coordinates and outputs 3-D model data that can be used for finite element analysis. Development of the modeling method and an example case are presented. This analysis method could also find application for gear inspection and near-net-shape gear forging die design.

  3. Simultaneous determination of glucose, triglycerides, urea, cholesterol, albumin and total protein in human plasma by Fourier transform infrared spectroscopy: direct clinical biochemistry without reagents.

    PubMed

    Jessen, Torben E; Höskuldsson, Agnar T; Bjerrum, Poul J; Verder, Henrik; Sørensen, Lars; Bratholm, Palle S; Christensen, Bo; Jensen, Lene S; Jensen, Maria A B

    2014-09-01

    Direct measurement of chemical constituents in complex biologic matrices without the use of analyte-specific reagents could be a step toward the simplification of clinical biochemistry. Problems related to reagents, such as production errors, improper handling, and lot-to-lot variations, would be eliminated, as would errors occurring during assay execution. We describe and validate a reagent-free method for direct measurement of six analytes in human plasma based on Fourier-transform infrared spectroscopy (FTIR). Blood plasma is analyzed without any sample preparation. The FTIR spectrum of the raw plasma is recorded in a sampling cuvette specially designed for measurement of aqueous solutions. For each analyte, a mathematical calibration process is performed by stepwise selection of the wavelengths giving the optimal least-squares correlation between the measured FTIR signal and the analyte concentration measured by conventional clinical reference methods. The developed calibration algorithms are subsequently evaluated for their capability to predict the concentrations of the six analytes in blinded patient samples. The correlations between the six FTIR methods and the corresponding reference methods were 0.87
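The stepwise wavelength-selection idea can be sketched as a greedy search: repeatedly add the spectral channel that most reduces the least-squares residual against the reference concentrations. The "spectra" below are synthetic stand-ins, not FTIR data, and the greedy criterion is one simple choice among several possible selection rules.

```python
import numpy as np

# Sketch of stepwise wavelength selection for a least-squares calibration:
# greedily add the spectral channel that most reduces the residual against
# a reference concentration. Synthetic stand-in data, not FTIR spectra.
rng = np.random.default_rng(1)
n_samples, n_channels = 40, 60
spectra = rng.normal(size=(n_samples, n_channels))
true_w = np.zeros(n_channels)
true_w[[5, 17, 42]] = [2.0, -1.0, 0.5]     # three informative channels
conc = spectra @ true_w + rng.normal(scale=0.05, size=n_samples)

def greedy_select(X, y, n_keep=3):
    chosen = []
    for _ in range(n_keep):
        best_j, best_rss = None, np.inf
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cols = X[:, chosen + [j]]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = float(np.sum((y - cols @ coef) ** 2))
            if rss < best_rss:
                best_j, best_rss = j, rss
        chosen.append(best_j)
    return sorted(chosen)

selected = greedy_select(spectra, conc)
```

On this synthetic example the search recovers the informative channels; on real spectra the selected wavelengths would then be validated on blinded samples, as the abstract describes.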

  4. SU-E-T-569: Neutron Shielding Calculation Using Analytical and Multi-Monte Carlo Method for Proton Therapy Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, S; Shin, E H; Kim, J

    2015-06-15

    Purpose: To evaluate the shielding-wall design that protects patients, staff, and members of the general public from secondary neutrons, using a simple analytic solution and the multi-Monte-Carlo codes MCNPX, ANISN, and FLUKA. Methods: Analytical and multi-Monte-Carlo calculations were performed for the proton facility (Sumitomo Heavy Industry Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation method, which produces conservative estimates of the dose-equivalent values for shielding, was used for the analytical evaluations. The radiation transport was then simulated with the multi-Monte-Carlo codes, and the neutron dose at each evaluation point was obtained as the product of the simulated value and the neutron dose coefficients introduced in ICRP-74. Results: The evaluation points in the accelerator control room and at the control-room entrance are mainly influenced by the location of proton-beam loss. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912, and 0.943 mSv/yr, and at the cyclotron-room entrance it is 0.465, 0.790, 0.522, and 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA, and MCNP, respectively. Most of the MCNPX and FLUKA results, which used the complicated geometry, were smaller than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility was evaluated by the analytic model and multi-Monte-Carlo methods. We confirmed that the shielding adequately protects the areas readily accessible to people while the proton facility is operated.
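The dose-evaluation step described above, folding a simulated neutron fluence with fluence-to-dose-equivalent conversion coefficients in the style of ICRP-74, reduces to a per-energy-bin product and sum. All numbers below are invented placeholders, not the facility's values or the actual ICRP-74 coefficients.

```python
# Sketch of the dose-folding step: the neutron dose equivalent at an
# evaluation point is the simulated fluence per energy bin folded with
# fluence-to-dose-equivalent conversion coefficients (ICRP-74 style).
# All numbers are invented placeholders.
fluence = [1.2e2, 3.4e1, 8.0e0]     # n/cm^2 per source particle, per energy bin
coeff = [4.0e-10, 2.5e-9, 4.1e-9]   # Sv*cm^2 per neutron, per energy bin

# Fold fluence with the conversion coefficients, bin by bin.
dose_per_particle = sum(f * c for f, c in zip(fluence, coeff))  # Sv per particle
```

Multiplying by the annual number of lost protons would then give an annual dose equivalent comparable to the mSv/yr figures reported.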

  5. Development Of Antibody-Based Fiber-Optic Sensors

    NASA Astrophysics Data System (ADS)

    Tromberg, Bruce J.; Sepaniak, Michael J.; Vo-Dinh, Tuan

    1988-06-01

    The speed and specificity characteristic of immunochemical complex formation has encouraged the development of numerous antibody-based analytical techniques. The scope and versatility of these established methods can be enhanced by combining the principles of conventional immunoassay with laser-based fiber-optic fluorimetry. This merger of spectroscopy and immunochemistry provides the framework for the construction of highly sensitive and selective fiber-optic devices (fluoroimmuno-sensors) capable of in-situ detection of drugs, toxins, and naturally occurring biochemicals. Fluoroimmuno-sensors (FIS) employ an immobilized reagent phase at the sampling terminus of a single quartz optical fiber. Laser excitation of antibody-bound analyte produces a fluorescence signal which is either directly proportional (as in the case of natural fluorophor and "antibody sandwich" assays) or inversely proportional (as in the case of competitive-binding assays) to analyte concentration. Factors which influence analysis time, precision, linearity, and detection limits include the nature (solid or liquid) and amount of the reagent phase, the method of analyte delivery (passive diffusion, convection, etc.), and whether equilibrium or non-equilibrium assays are performed. Data will be presented for optical fibers whose sensing termini utilize: (1) covalently-bound solid antibody reagent phases, and (2) membrane-entrapped liquid antibody reagents. Assays for large-molecular weight proteins (antigens) and small-molecular weight, carcinogenic, polynuclear aromatics (haptens) will be considered. In this manner, the influence of a system's chemical characteristics and measurement requirements on sensor design, and the consequence of various sensor designs on analytical performance will be illustrated.

  6. Unlocking Proteomic Heterogeneity in Complex Diseases through Visual Analytics

    PubMed Central

    Bhavnani, Suresh K.; Dang, Bryant; Bellala, Gowtham; Divekar, Rohit; Visweswaran, Shyam; Brasier, Allan; Kurosky, Alex

    2015-01-01

    Despite years of preclinical development, biological interventions designed to treat complex diseases like asthma often fail in phase III clinical trials. These failures suggest that current methods to analyze biomedical data might be missing critical aspects of biological complexity such as the assumption that cases and controls come from homogeneous distributions. Here we discuss why and how methods from the rapidly evolving field of visual analytics can help translational teams (consisting of biologists, clinicians, and bioinformaticians) to address the challenge of modeling and inferring heterogeneity in the proteomic and phenotypic profiles of patients with complex diseases. Because a primary goal of visual analytics is to amplify the cognitive capacities of humans for detecting patterns in complex data, we begin with an overview of the cognitive foundations for the field of visual analytics. Next, we organize the primary ways in which a specific form of visual analytics called networks have been used to model and infer biological mechanisms, which help to identify the properties of networks that are particularly useful for the discovery and analysis of proteomic heterogeneity in complex diseases. We describe one such approach called subject-protein networks, and demonstrate its application on two proteomic datasets. This demonstration provides insights to help translational teams overcome theoretical, practical, and pedagogical hurdles for the widespread use of subject-protein networks for analyzing molecular heterogeneities, with the translational goal of designing biomarker-based clinical trials, and accelerating the development of personalized approaches to medicine. PMID:25684269

  7. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors describe the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  8. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a stability-indicating method based on liquid chromatography (LC) coupled to mass spectrometry (MS) by means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Comparative study of solar optics for paraboloidal concentrators

    NASA Technical Reports Server (NTRS)

    Wen, L.; Poon, P.; Carley, W.; Huang, L.

    1979-01-01

    Different analytical methods for computing the flux distribution on the focal plane of a paraboloidal solar concentrator are reviewed. An analytical solution in algebraic form is also derived for an idealized model. The effects resulting from using different assumptions in the definition of optical parameters used in these methodologies are compared and discussed in detail. These parameters include solar irradiance distribution (limb darkening and circumsolar), reflector surface specular spreading, surface slope error, and concentrator pointing inaccuracy. The type of computational method selected for use depends on the maturity of the design and the data available at the time the analysis is made.

  10. 49 CFR Appendix B to Part 236 - Risk Assessment Criteria

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... results of the application of safety design principles as noted in Appendix C to this part. The MTTHE is... fault/failure analysis must be based on the assessment of the design and implementation of all safety... associated device drivers, as well as historical performance data, analytical methods and experimental safety...

  11. 49 CFR Appendix B to Part 236 - Risk Assessment Criteria

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... results of the application of safety design principles as noted in Appendix C to this part. The MTTHE is... fault/failure analysis must be based on the assessment of the design and implementation of all safety... associated device drivers, as well as historical performance data, analytical methods and experimental safety...

  12. Systematic Development and Validation of a Thin-Layer Densitometric Bioanalytical Method for Estimation of Mangiferin Employing Analytical Quality by Design (AQbD) Approach

    PubMed Central

    Khurana, Rajneet Kaur; Rao, Satish; Beg, Sarwar; Katare, O.P.; Singh, Bhupinder

    2016-01-01

    The present work aims at the systematic development of a simple, rapid and highly sensitive densitometry-based thin-layer chromatographic method for the quantification of mangiferin in bioanalytical samples. Initially, the quality target method profile was defined and critical analytical attributes (CAAs) earmarked, namely, retardation factor (Rf), peak height, capacity factor, theoretical plates and separation number. A face-centered cubic design was selected to optimize the loaded volume and plate dimensions, the critical method parameters identified in screening studies employing D-optimal and Plackett–Burman designs, and their effects on the CAAs were evaluated. The mobile phase containing a mixture of ethyl acetate : acetic acid : formic acid : water in a 7 : 1 : 1 : 1 (v/v/v/v) ratio was finally selected as the optimized solvent for apt chromatographic separation of mangiferin at 262 nm with Rf 0.68 ± 0.02 and all other parameters within the acceptance limits. Method validation studies revealed high linearity in the concentration range of 50–800 ng/band for mangiferin. The developed method showed high accuracy, precision, ruggedness, robustness, specificity, sensitivity, selectivity and recovery. In a nutshell, the bioanalytical method for analysis of mangiferin in plasma revealed the presence of well-resolved peaks and high recovery of mangiferin. PMID:26912808
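The run layout of a two-factor face-centered design can be generated generically in coded units: factorial corners, axial points pushed to the faces (alpha = 1), and a center point. This is the generic construction for the design type the abstract names, not the paper's actual run table.

```python
from itertools import product

# Sketch of a two-factor face-centered design in coded units: factorial
# corners, axial points on the faces (alpha = 1), and a center point.
# Generic construction, not the paper's actual run table.
def face_centered_design(n_factors=2):
    corners = list(product([-1, 1], repeat=n_factors))
    faces = []
    for i in range(n_factors):
        for level in (-1, 1):
            pt = [0] * n_factors
            pt[i] = level          # one factor at a face, the rest centered
            faces.append(tuple(pt))
    return corners + faces + [tuple([0] * n_factors)]

runs = face_centered_design(2)   # 4 corners + 4 face centers + 1 center
```

Each coded level (-1, 0, +1) is then mapped to the low, mid, and high settings of the physical factors, here the loaded volume and the plate dimensions.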

  13. Low/Medium Volatile Data Validation

    EPA Pesticide Factsheets

    This document is designed to offer data reviewers guidance in determining the validity of analytical data generated through the US EPA Contract Laboratory Program Statement of Work ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration).

  14. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship-Quasi-Experimental Designs.

    PubMed

    Schweizer, Marin L; Braun, Barbara I; Milstone, Aaron M

    2016-10-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt, nonrandomized interventions. Quasi-experimental studies can be categorized into 3 major types: interrupted time-series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship, including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. Infect Control Hosp Epidemiol 2016;1-6.
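    The interrupted time-series design named above is commonly analyzed with segmented regression. The sketch below is illustrative only (not from the paper): an invented, noise-free series with a known level change and slope change at interruption point t0, fitted by ordinary least squares via the normal equations.

```python
# Segmented regression for an interrupted time series:
#   y_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t
# where post_t = 1 after the interruption. All numbers are invented.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_its(y, t0):
    """Least-squares fit of the segmented model via the normal equations."""
    rows = [[1.0, float(t), float(t >= t0), (t - t0) * float(t >= t0)]
            for t in range(len(y))]
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(4)] for i in range(4)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(4)]
    return solve(XtX, Xty)

# Synthetic series: level drop of -3 and slope change of +1 at t0 = 10
b_true = [10.0, 0.5, -3.0, 1.0]
t0 = 10
y = [b_true[0] + b_true[1] * t + b_true[2] * (t >= t0)
     + b_true[3] * (t - t0) * (t >= t0) for t in range(20)]
b_hat = fit_its(y, t0)
```

    With noise-free data the fit recovers the level-change and slope-change coefficients exactly, which is the quantity of interest in an interrupted time-series analysis.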

  15. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship – Quasi-Experimental Designs

    PubMed Central

    Schweizer, Marin L.; Braun, Barbara I.; Milstone, Aaron M.

    2016-01-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt non-randomized interventions. Quasi-experimental studies can be categorized into three major types: interrupted time series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. PMID:27267457

  16. Analysis and optimization of hybrid excitation permanent magnet synchronous generator for stand-alone power system

    NASA Astrophysics Data System (ADS)

    Wang, Huijun; Qu, Zheng; Tang, Shaofei; Pang, Mingqi; Zhang, Mingju

    2017-08-01

    In this paper, the electromagnetic design and permanent magnet shape optimization of a permanent magnet synchronous generator with hybrid excitation are investigated. Based on the generator structure and principle, a design outline is presented for obtaining high efficiency and low voltage fluctuation. To enable rapid design, equivalent magnetic circuits for the permanent magnets and iron poles are developed, and finite element analysis is employed in parallel. Furthermore, the permanent magnet is optimized by the design of experiment (DOE) method to reduce voltage waveform distortion. Finally, the proposed design methods are validated by analytical and experimental results.

  17. Optical characterization of nonimaging dish concentrator for the application of dense-array concentrator photovoltaic system.

    PubMed

    Tan, Ming-Hui; Chong, Kok-Keong; Wong, Chee-Woon

    2014-01-20

    Optimization of the design of a nonimaging dish concentrator (NIDC) for a dense-array concentrator photovoltaic system is presented. A new algorithm has been developed to determine the configuration of facet mirrors in a NIDC. Analytical formulas were derived to analyze the optical performance of a NIDC and were then compared with simulated results obtained from a numerical method. A comprehensive analysis of optical performance via the analytical method was carried out based on the facet dimension and focal distance of a concentrator with a total reflective area of 120 m². The results show that a facet dimension of 49.8 cm, a focal distance of 8 m and a solar concentration ratio of 411.8 suns give the optimal design for the lowest cost per output power, US$1.93 per watt.

  18. Analysis of Volatile Organic Compounds in Air Contained in Canisters by Method TO-15, SOP No. HW-31 Revision 6

    EPA Pesticide Factsheets

    This document is designed to offer the data reviewer guidance in determining the validity of analytical data from the analysis of Volatile Organic Compounds in air samples taken in canisters and analyzed by method TO-15.

  19. Triangulation and Mixed Methods Designs: Data Integration with New Research Technologies

    ERIC Educational Resources Information Center

    Fielding, Nigel G.

    2012-01-01

    Data integration is a crucial element in mixed methods analysis and conceptualization. It has three principal purposes: illustration, convergent validation (triangulation), and the development of analytic density or "richness." This article discusses such applications in relation to new technologies for social research, looking at three…

  20. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet it is still common practice to develop individual solutions for different instruments. In contrast, we present here a single LabVIEW-based program that can be applied directly to various analytical tasks without changing the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. The capabilities of the program are demonstrated by using it to control both a sequential injection analysis-capillary electrophoresis (SIA-CE) system with UV detection and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  1. Optimal design of a thermally stable composite optical bench

    NASA Technical Reports Server (NTRS)

    Gray, C. E., Jr.

    1985-01-01

    The Lidar Atmospheric Sensing Experiment will be performed aboard an ER-2 aircraft; the lidar system used will be mounted on a lightweight, thermally stable graphite/epoxy optical bench whose design is presently subjected to analytical study and experimental validation. Attention is given to analytical methods for the selection of such expected laminate properties as the thermal expansion coefficient, the apparent in-plane moduli, and ultimate strength. For a symmetric laminate in which one of the lamina angles remains variable, an optimal lamina angle is selected to produce a design laminate with a near-zero coefficient of thermal expansion. Finite elements are used to model the structural concept of the design, with a view to the optical bench's thermal structural response as well as the determination of the degree of success in meeting the experiment's alignment tolerances.
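    The near-zero thermal expansion idea above can be illustrated with a deliberately simplified sketch: a single off-axis ply whose axial CTE follows the standard transformation alpha_x(theta) = alpha_1 cos²θ + alpha_2 sin²θ. The material values are typical graphite/epoxy numbers assumed for illustration, and laminate Poisson/shear coupling is ignored; the actual study selects the angle from full laminate analysis.

```python
# Toy search for a lamina angle giving near-zero axial thermal expansion.
# Assumed values: carbon-fiber direction CTE slightly negative, transverse
# CTE strongly positive (typical graphite/epoxy, for illustration only).
import math

A1 = -0.5e-6   # fiber-direction CTE, 1/K
A2 = 28.0e-6   # transverse CTE, 1/K

def alpha_x(theta_deg):
    """Axial CTE of a single ply rotated by theta (degrees)."""
    t = math.radians(theta_deg)
    return A1 * math.cos(t) ** 2 + A2 * math.sin(t) ** 2

def zero_cte_angle(lo=0.0, hi=90.0, tol=1e-9):
    """Bisect for the angle where alpha_x crosses zero (it is increasing)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if alpha_x(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta0 = zero_cte_angle()   # roughly 7-8 degrees for these assumed values
```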

  2. Method for improving the limit of detection in a data signal

    DOEpatents

    Synovec, Robert E.; Yueng, Edward S.

    1989-10-17

    A method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal.
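    The two core steps of the claimed method, re-centering the baseline so the noise averages to zero and then integrating so that correlated signal accumulates while zero-mean noise tends to cancel, can be sketched as follows. The data and the choice of noise region are made up for illustration.

```python
# Sketch of the patent's steps: baseline centering, then cumulative integration.

def enhance(signal, noise_region):
    """Center the baseline on zero using a signal-free region, then integrate."""
    baseline = sum(signal[i] for i in noise_region) / len(noise_region)
    centered = [v - baseline for v in signal]
    integrated, total = [], 0.0
    for v in centered:
        total += v
        integrated.append(total)   # each point keeps its own running value
    return integrated

# Flat baseline of 2.0 with a rectangular "peak" of height 1 on points 10-14
data = [2.0] * 30
for i in range(10, 15):
    data[i] += 1.0
out = enhance(data, range(0, 10))   # first 10 points assumed signal-free
```

    After centering, only the peak contributes to the integral, so the integrated trace ends at the peak area (5.0 here) while uncorrelated noise would largely cancel.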

  3. Method for improving the limit of detection in a data signal

    DOEpatents

    Synovec, R.E.; Yueng, E.S.

    1989-10-17

    Disclosed is a method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal. 8 figs.

  4. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry.

    PubMed

    Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.

  5. Analytical chemistry in water quality monitoring during manned space missions

    NASA Astrophysics Data System (ADS)

    Artemyeva, Anastasia A.

    2016-09-01

    Water quality monitoring during human spaceflights is essential. However, most traditional methods require sample collection and subsequent ground analysis because of limitations in volume, power, safety and gravity. As space missions become longer, methods suitable for in-flight monitoring are needed. Since 2009, water quality has been monitored in-flight with colorimetric methods that detect iodine and ionic silver. Organic compounds have been monitored with a second-generation total organic carbon analyzer, which has provided information on the amount of carbon in water at both the U.S. and Russian segments of the International Space Station since 2008. The disadvantage of this approach is the lack of compound-specific information. Recently developed methods and tools may allow more detailed in-flight information on water quality. In particular, microanalyzers based on potentiometric measurements were designed for online detection of chloride, potassium and nitrate ions and ammonia. Applying the highly developed air quality monitoring system to water analysis was a logical step, because most target analytes are the same in air and water. An electro-thermal vaporizer was designed, manufactured and coupled with the air quality control system; this development liberates the analytes from the aqueous matrix for compound-specific analysis in the gas phase.

  6. Net analyte signal-based simultaneous determination of ethanol and water by quartz crystal nanobalance sensor.

    PubMed

    Mirmohseni, A; Abdollahi, H; Rostamizadeh, K

    2007-02-28

    A net analyte signal (NAS)-based method called HLA/GO was applied to the selective determination of a binary mixture of ethanol and water using a quartz crystal nanobalance (QCN) sensor. A full factorial design was applied for the formation of calibration and prediction sets in the concentration ranges 5.5-22.2 µg mL⁻¹ for ethanol and 7.01-28.07 µg mL⁻¹ for water. An optimal time range was selected by a procedure based on calculating the net analyte signal regression plot in each considered time window for each test sample. A moving-window strategy was used to search for the region with maximum linearity of the NAS regression plot (minimum error indicator) and minimum PRESS value. On the basis of the results, differences in the adsorption profiles in the time range between 1 and 600 s were used to determine mixtures of both compounds by the HLA/GO method. Calculation of the net analyte signal with the HLA/GO method allows determination of several figures of merit, such as selectivity, sensitivity, analytical sensitivity and limit of detection, for each component. To check the ability of the proposed method to select linear regions of the adsorption profile, a test for detecting non-linear regions of adsorption profile data in the presence of methanol was also described. The results showed that the method was successfully applied to the determination of ethanol and water.

  7. Aerothermodynamic shape optimization of hypersonic blunt bodies

    NASA Astrophysics Data System (ADS)

    Eyi, Sinan; Yumuşak, Mine

    2015-07-01

    The aim of this study is to develop a reliable and efficient design tool that can be used in hypersonic flows. The flow analysis is based on the axisymmetric Euler/Navier-Stokes and finite-rate chemical reaction equations. The equations are coupled simultaneously and solved implicitly using Newton's method. The Jacobian matrix is evaluated analytically. A gradient-based numerical optimization is used. The adjoint method is utilized for sensitivity calculations. The objective of the design is to generate a hypersonic blunt geometry that produces the minimum drag with low aerodynamic heating. Bezier curves are used for geometry parameterization. The performances of the design optimization method are demonstrated for different hypersonic flow conditions.

  8. Screening for and validated quantification of amphetamines and of amphetamine- and piperazine-derived designer drugs in human blood plasma by gas chromatography/mass spectrometry.

    PubMed

    Peters, Frank T; Schaefer, Simone; Staack, Roland F; Kraemer, Thomas; Maurer, Hans H

    2003-06-01

    The classical stimulants amphetamine, methamphetamine, ethylamphetamine and the amphetamine-derived designer drugs MDA, MDMA ('ecstasy'), MDEA, BDB and MBDB have been widely abused for a relatively long time. In recent years, a number of newer designer drugs have entered the illicit drug market. 4-Methylthioamphetamine (MTA), p-methoxyamphetamine (PMA) and p-methoxymethamphetamine (PMMA) are also derived from amphetamine. Other designer drugs are derived from piperazine, such as benzylpiperazine (BZP), methylenedioxybenzylpiperazine (MDBP), trifluoromethylphenylpiperazine (TFMPP), m-chlorophenylpiperazine (mCPP) and p-methoxyphenylpiperazine (MeOPP). A number of severe or even fatal intoxications involving these newer substances, especially PMA, have been reported. This paper describes a method for screening for and simultaneous quantification of the above-mentioned compounds and the metabolites p-hydroxyamphetamine and p-hydroxymethamphetamine (pholedrine) in human blood plasma. The analytes were analyzed by gas chromatography/mass spectrometry in the selected-ion monitoring mode after mixed-mode solid-phase extraction (HCX) and derivatization with heptafluorobutyric anhydride. The method was fully validated according to international guidelines. It was linear from 5 to 1000 µg l⁻¹ for all analytes. Data for accuracy and precision were within required limits with the exception of those for MDBP. The limit of quantification was 5 µg l⁻¹ for all analytes. The applicability of the assay was proven by analysis of authentic plasma samples and of a certified reference sample. This procedure should also be suitable for confirmation of immunoassay results positive for amphetamines and/or designer drugs of the ecstasy type. Copyright 2003 John Wiley & Sons, Ltd.
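    The linearity claim above rests on a calibration line fitted over the working range. As a generic sketch (the response values are fabricated and this is the standard least-squares calibration, not the validated GC/MS procedure itself):

```python
# Linear calibration: fit response vs. concentration, then back-calculate
# an unknown from its response. All response values are invented.

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [5.0, 50.0, 100.0, 250.0, 500.0, 1000.0]  # micrograms per litre
resp = [0.9 + 0.02 * c for c in conc]            # fake, noise-free responses
slope, intercept = linfit(conc, resp)

# Back-calculate an "unknown" from its measured response
unknown = (resp[2] - intercept) / slope          # should recover 100 ug/L
```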

  9. APPLIED REMOTE SENSING

    EPA Science Inventory

    Remote Sensing is a scientific discipline of non-contact monitoring. It includes a range of technologies that span from aerial photography to advanced spectral imaging and analytical methods. This Session is designed to demonstrate contemporary practical applications of remote se...

  10. Analytical Methods for Interconnection | Distributed Generation

    Science.gov Websites


  11. Integrated analysis and design of thick composite structures for optimal passive damping characteristics

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.

    1993-01-01

    The development of novel composite mechanics for the analysis of damping in composite laminates and structures is summarized, along with the more significant results of this effort. Laminate mechanics based on piecewise-continuous in-plane displacement fields are described that can represent both intralaminar stresses and interlaminar shear stresses and their effects on the stiffness and damping characteristics of a composite laminate. Among other features, the mechanics can accurately model the static and damped dynamic response of either thin or thick composite laminates, as well as specialty laminates with embedded compliant damping layers. The discrete laminate damping theory is further incorporated into structural analysis methods. In this context, an exact semi-analytical method for the simulation of the damped dynamic response of composite plates was developed. A finite-element-based method and a specialty four-node plate element were also developed for the analysis of composite structures of variable shape and boundary conditions. Numerous evaluations and applications demonstrate the quality and superiority of the mechanics in predicting the damped dynamic characteristics of composite structures. Finally, additional effort focused on the development of optimal tailoring methods for the design of thick composite structures based on the developed analytical capability. Applications on composite plates illustrate the influence of the composite mechanics on the optimal design and the potential for significant deviations in the resulting designs when more simplified (classical) laminate theories are used.

  12. Label-free functional nucleic acid sensors for detecting target agents

    DOEpatents

    Lu, Yi; Xiang, Yu

    2015-01-13

    A general methodology to design label-free fluorescent functional nucleic acid sensors using a vacant site approach and an abasic site approach is described. In one example, a method for designing label-free fluorescent functional nucleic acid sensors (e.g., those that include a DNAzyme, aptamer or aptazyme) that have a tunable dynamic range through the introduction of an abasic site (e.g., dSpacer) or a vacant site into the functional nucleic acids. Also provided is a general method for designing label-free fluorescent aptamer sensors based on the regulation of malachite green (MG) fluorescence. A general method for designing label-free fluorescent catalytic and molecular beacons (CAMBs) is also provided. The methods demonstrated here can be used to design many other label-free fluorescent sensors to detect a wide range of analytes. Sensors and methods of using the disclosed sensors are also provided.

  13. Analytic solution of field distribution and demagnetization function of ideal hollow cylindrical field source

    NASA Astrophysics Data System (ADS)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-09-01

    The Halbach-type hollow cylindrical permanent magnet array (HCPMA) is a compact, energy-conserving field source that has attracted intense interest in many practical applications. Here, using a complex-variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in the HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
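    For the special case of an ideal dipolar (k = 2) Halbach cylinder, the interior field has the well-known closed form B = B_r ln(R_o/R_i); the quick numeric check below uses illustrative dimensions and remanence (the paper itself treats general multipoles and the field inside the magnet material).

```python
# Closed-form interior flux density of an ideal Halbach dipole cylinder.
import math

def halbach_dipole_field(b_r, r_in, r_out):
    """Uniform field (tesla) inside an ideal k=2 Halbach cylinder:
    B = B_r * ln(r_out / r_in)."""
    return b_r * math.log(r_out / r_in)

# Illustrative values: 1.2 T remanence, 20 mm bore, 60 mm outer radius
B = halbach_dipole_field(b_r=1.2, r_in=0.02, r_out=0.06)
```

    Note the field depends only on the radius ratio, which is why such arrays can exceed the remanence of the magnet material itself for large enough R_o/R_i.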

  14. A novel analysis method for paired-sample microbial ecology experiments

    DOE PAGES

    Olesen, Scott W.; Vora, Suhani; Techtmann, Stephen M.; ...

    2016-05-06

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Furthermore, our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".

  15. A novel analysis method for paired-sample microbial ecology experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olesen, Scott W.; Vora, Suhani; Techtmann, Stephen M.

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Furthermore, our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".

  16. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    NASA Astrophysics Data System (ADS)

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-12-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work seeks to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interactions, hydrogen-bonding interaction, electrostatic interaction) for the first time. Application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment of trace analytes from environmental, biological and clinical samples under theory-based experimental design.

  17. Designing stellarator coils by a modified Newton method using FOCUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Caoxiang; Hudson, Stuart R.; Song, Yuntao

    To find the optimal coils for stellarators, nonlinear optimization algorithms are applied in existing coil design codes. However, none of these codes have used the information from the second-order derivatives. In this paper, we present a modified Newton method in the recently developed code FOCUS. The Hessian matrix is calculated with analytically derived equations. Its inverse is approximated by a modified Cholesky factorization and applied in the iterative scheme of a classical Newton method. Using this method, FOCUS is able to recover the W7-X modular coils starting from a simple initial guess. Results demonstrate significant advantages.

  18. Designing stellarator coils by a modified Newton method using FOCUS

    NASA Astrophysics Data System (ADS)

    Zhu, Caoxiang; Hudson, Stuart R.; Song, Yuntao; Wan, Yuanxi

    2018-06-01

    To find the optimal coils for stellarators, nonlinear optimization algorithms are applied in existing coil design codes. However, none of these codes have used the information from the second-order derivatives. In this paper, we present a modified Newton method in the recently developed code FOCUS. The Hessian matrix is calculated with analytically derived equations. Its inverse is approximated by a modified Cholesky factorization and applied in the iterative scheme of a classical Newton method. Using this method, FOCUS is able to recover the W7-X modular coils starting from a simple initial guess. Results demonstrate significant advantages.

  19. Designing stellarator coils by a modified Newton method using FOCUS

    DOE PAGES

    Zhu, Caoxiang; Hudson, Stuart R.; Song, Yuntao; ...

    2018-03-22

    To find the optimal coils for stellarators, nonlinear optimization algorithms are applied in existing coil design codes. However, none of these codes have used the information from the second-order derivatives. In this paper, we present a modified Newton method in the recently developed code FOCUS. The Hessian matrix is calculated with analytically derived equations. Its inverse is approximated by a modified Cholesky factorization and applied in the iterative scheme of a classical Newton method. Using this method, FOCUS is able to recover the W7-X modular coils starting from a simple initial guess. Results demonstrate significant advantages.
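    The modified-Newton idea in the three FOCUS records above can be sketched on a toy problem. The code below is illustrative only: it minimizes the Rosenbrock function (not a coil objective) with analytic gradient and Hessian, and when the Hessian fails a Cholesky test it applies a simple diagonal shift as a stand-in for the modified Cholesky factorization used in FOCUS.

```python
# Damped Newton minimization with a positive-definiteness fix on the Hessian.
import math

def rosen(x, y):
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def grad(x, y):
    return [-400 * x * (y - x * x) - 2 * (1 - x), 200 * (y - x * x)]

def hess(x, y):
    return [[1200 * x * x - 400 * y + 2, -400 * x], [-400 * x, 200.0]]

def chol2(H):
    """Cholesky factor of a 2x2 matrix, or None if not positive definite."""
    if H[0][0] <= 0:
        return None
    l00 = math.sqrt(H[0][0])
    l10 = H[1][0] / l00
    d = H[1][1] - l10 * l10
    if d <= 0:
        return None
    return l00, l10, math.sqrt(d)

def newton_step(H, g):
    """Solve H p = -g via Cholesky, shifting the diagonal until H is PD."""
    L = chol2(H)
    while L is None:
        H = [[H[0][0] + 1.0, H[0][1]], [H[1][0], H[1][1] + 1.0]]
        L = chol2(H)
    l00, l10, l11 = L
    z0 = -g[0] / l00                      # forward substitution: L z = -g
    z1 = (-g[1] - l10 * z0) / l11
    p1 = z1 / l11                          # back substitution: L^T p = z
    p0 = (z0 - l10 * p1) / l00
    return p0, p1

def minimize(x, y, iters=100):
    for _ in range(iters):
        p0, p1 = newton_step(hess(x, y), grad(x, y))
        t, fx = 1.0, rosen(x, y)
        while rosen(x + t * p0, y + t * p1) > fx and t > 1e-8:
            t *= 0.5                       # backtracking line search
        x, y = x + t * p0, y + t * p1
    return x, y

x, y = minimize(0.0, 0.0)                  # converges to the minimum at (1, 1)
```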

  20. First Order Reliability Application and Verification Methods for Semistatic Structures

    NASA Technical Reports Server (NTRS)

    Verderaime, Vincent

    1994-01-01

    Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored by conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments, its stress audits are shown to be arbitrary and incomplete, and it compromises high strength materials performance. A reliability method is proposed which combines first order reliability principles with deterministic design variables and conventional test technique to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety index expression. The application is reduced to solving for a factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and with the pace of semistatic structural designs.
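    The classical safety-index expression the paper builds on can be stated concretely: for independent, normally distributed strength R and stress S, beta = (mu_R - mu_S) / sqrt(sigma_R² + sigma_S²), and the reliability is Phi(beta). The numbers below are illustrative only, not from the paper.

```python
# First-order safety index and the corresponding reliability.
import math

def safety_index(mu_r, sig_r, mu_s, sig_s):
    """beta for independent normal strength R and stress S."""
    return (mu_r - mu_s) / math.sqrt(sig_r ** 2 + sig_s ** 2)

def reliability(beta):
    """Standard normal CDF, Phi(beta), via the error function."""
    return 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))

# Illustrative values in consistent stress units
beta = safety_index(mu_r=50.0, sig_r=4.0, mu_s=35.0, sig_s=3.0)  # beta = 3.0
R = reliability(beta)                                            # ~0.99865
```

    Solving this relation in reverse, for the margin that achieves a specified beta, is the "factor" idea the abstract describes: the factor replaces the conventional safety factor in the stress analysis.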

  1. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.

  2. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  3. Strategies for dealing with missing data in clinical trials: from design to analysis.

    PubMed

    Dziura, James D; Post, Lori A; Zhao, Qing; Fu, Zhixuan; Peduzzi, Peter

    2013-09-01

    Randomized clinical trials are the gold standard for evaluating interventions as randomized assignment equalizes known and unknown characteristics between intervention groups. However, when participants miss visits, the ability to conduct an intent-to-treat analysis and draw conclusions about a causal link is compromised. As guidance to those performing clinical trials, this review is a non-technical overview of the consequences of missing data and a prescription for its treatment that extends beyond the typical analytic approaches to the entire research process. Examples of bias from incorrect analysis with missing data and discussion of the advantages/disadvantages of analytic methods are given. As no single analysis is definitive when data are missing, strategies for prevention throughout the course of a trial are presented. We aim to convey an appreciation for how missing data influence results and an understanding of the need for careful consideration of missing data during the design, planning, conduct, and analytic stages.

  4. Achieving optimal SERS through enhanced experimental design

    PubMed Central

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J.

    2016-01-01

    One of the current limitations surrounding surface‐enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal‐based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd. PMID:27587905

  5. Achieving optimal SERS through enhanced experimental design.

    PubMed

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J; Goodacre, Royston

    2016-01-01

    One of the current limitations surrounding surface-enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal-based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd.
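    The contrast the authors draw between one-factor-at-a-time searches and design of experiments can be illustrated with a two-level full factorial design, in which every combination of coded factor levels is run. A minimal sketch; the factor names are illustrative and not taken from the review:

```python
# Two-level full factorial design sketch: every combination of coded factor
# levels is run, so main effects and interactions can both be estimated,
# unlike one-factor-at-a-time searches. Factor names are illustrative.
from itertools import product

factors = {
    "colloid_volume": (-1, +1),  # coded low/high levels
    "salt_conc":      (-1, +1),
    "pH":             (-1, +1),
}

design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for run in design:
    print(run)
print(f"{len(design)} runs for {len(factors)} factors")
```

Three factors give 2^3 = 8 runs; evolutionary methods of the kind the review discusses become attractive when the factor count makes full enumeration impractical.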

  6. Limited Qualities Evaluation of Longitudinal Flight Control Systems Designed Using Multiobjective Control Design Techniques (HAVE INFINITY II)

    DTIC Science & Technology

    1998-06-01

    analytical phase of this research. Finally, the mixed H2/H-Infinity method optimally tradeoff the different benefits offered by the separate H2 and H...potential benefits of the multiobjective design techniques used. Due to the HAVE INFINITY I test results, AFIT made the decision to continue the...sensitivity and complimentary sensitivity weighting, and a mixed H2/H-Infinity design that compromised the benefits of both design techniques optimally. The

  7. Confronting Analytical Dilemmas for Understanding Complex Human Interactions in Design-Based Research from a Cultural-Historical Activity Theory (CHAT) Framework

    ERIC Educational Resources Information Center

    Yamagata-Lynch, Lisa C.

    2007-01-01

    Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…

  8. Reviews of Single Subject Research Designs: Applications to Special Education and School Psychology

    ERIC Educational Resources Information Center

    Nevin, Ann I., Ed.

    2004-01-01

    The authors of this collection of research reviews studied how single subject research designs might be a useful method to apply as part of being accountable to clients. The single subject research studies were evaluated in accordance with the following criteria: Was the study applied, behavioral, reliable, analytic, effective, and generalizable?…

  9. Survey of NASA research on crash dynamics

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Carden, H. D.; Hayduk, R. J.

    1984-01-01

    Ten years of structural crash dynamics research activities conducted on general aviation aircraft by the National Aeronautics and Space Administration (NASA) are described. Thirty-two full-scale crash tests were performed at Langley Research Center, and pertinent data on airframe and seat behavior were obtained. Concurrent with the experimental program, analytical methods were developed to help predict structural behavior during impact. The effects of flight parameters at impact on cabin deceleration pulses at the seat/occupant interface, experimental and analytical correlation of data on load-limiting subfloor and seat configurations, airplane section test results for computer modeling validation, and data from emergency-locator-transmitter (ELT) investigations to determine probable cause of false alarms and nonactivations are assessed. Computer programs which provide designers with analytical methods for predicting accelerations, velocities, and displacements of collapsing structures are also discussed.

  10. Neural Network and Regression Methods Demonstrated in the Design Optimization of a Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Lavelle, Thomas M.; Patnaik, Surya

    2003-01-01

    The neural network and regression methods of NASA Glenn Research Center's COMETBOARDS design optimization testbed were used to generate approximate analysis and design models for a subsonic aircraft operating at Mach 0.85 cruise speed. The analytical model is defined by nine design variables: wing aspect ratio, engine thrust, wing area, sweep angle, chord-thickness ratio, turbine temperature, pressure ratio, bypass ratio, fan pressure; and eight response parameters: weight, landing velocity, takeoff and landing field lengths, approach thrust, overall efficiency, and compressor pressure and temperature. The variables were adjusted to optimally balance the engines to the airframe. The solution strategy included a sensitivity model and the soft analysis model. Researchers generated the sensitivity model by training the approximators to predict an optimum design. The trained neural network predicted all response variables within 5-percent error. This was reduced to 1 percent by the regression method. The soft analysis model was developed to replace aircraft analysis as the reanalyzer in design optimization. Soft models have been generated for a neural network method, a regression method, and a hybrid method obtained by combining the approximators. The performance of the models is graphed for aircraft weight versus thrust as well as for wing area and turbine temperature. The regression method followed the analytical solution with little error. The neural network exhibited 5-percent maximum error over all parameters. Performance of the hybrid method was intermediate in comparison to the individual approximators. Error in the response variable is smaller than that shown in the figure because of a distortion scale factor.
The overall performance of the approximators was considered to be satisfactory because aircraft analysis with NASA Langley Research Center's FLOPS (Flight Optimization System) code is a synthesis of diverse disciplines: weight estimation, aerodynamic analysis, engine cycle analysis, propulsion data interpolation, mission performance, airfield length for landing and takeoff, noise footprint, and others.
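    The surrogate idea described above (sample an expensive analysis, fit a cheap response model, and use the fit in its place) can be sketched for the regression case. A minimal example in which a notional one-variable weight model stands in for the FLOPS analysis; the coefficients and variable scaling are illustrative:

```python
# "Soft" regression surrogate sketch: an expensive analysis is sampled, a
# quadratic response model is fitted by least squares, and the cheap fit
# stands in for the analysis during optimization. The analysis below is a
# notional one-variable weight model, not the FLOPS code.
def expensive_analysis(t):
    # notional weight as a function of a normalized design variable t
    return 3.0 * t ** 2 - 2.0 * t + 5.0

def fit_quadratic(xs, ys):
    """Least-squares fit of y = c0 + c1*x + c2*x^2 via the normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]      # moment sums
    A = [[S[i + j] for j in range(3)] for i in range(3)]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for i in range(3):                                   # Gaussian elimination
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p], b[i], b[p] = A[p], A[i], b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [ar - f * ai for ar, ai in zip(A[r], A[i])]
            b[r] -= f * b[i]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                  # back-substitution
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
    return c

xs = [i / 10 for i in range(11)]
c0, c1, c2 = fit_quadratic(xs, [expensive_analysis(x) for x in xs])
surrogate = lambda x: c0 + c1 * x + c2 * x ** 2
print(f"fit: {c0:.3f} + {c1:.3f} x + {c2:.3f} x^2")
```

Because the sampled data are exactly quadratic here, the surrogate reproduces the analysis to machine precision; with real analysis codes the fit error is what the 5-percent and 1-percent figures above quantify.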

  11. Optimal clinical trial design based on a dichotomous Markov-chain mixed-effect sleep model.

    PubMed

    Steven Ernest, C; Nyberg, Joakim; Karlsson, Mats O; Hooker, Andrew C

    2014-12-01

    D-optimal designs for discrete-type responses have been derived using generalized linear mixed models, simulation-based methods and analytical approximations for computing the Fisher information matrix (FIM) of non-linear mixed effect models with homogeneous probabilities over time. In this work, D-optimal designs using an analytical approximation of the FIM for a dichotomous, non-homogeneous, Markov-chain phase advanced sleep non-linear mixed effect model were investigated. The non-linear mixed effect model consisted of transition probabilities of dichotomous sleep data estimated as logistic functions using piecewise linear functions. Theoretical linear and nonlinear dose effects were added to the transition probabilities to modify the probability of being in either sleep stage. D-optimal designs were computed by determining an analytical approximation of the FIM for each Markov component (one where the previous state was awake and another where the previous state was asleep). Each Markov component FIM was weighted either equally or by the average probability of response being awake or asleep over the night and summed to derive the total FIM (FIM(total)). The reference designs were placebo, 0.1-, 1-, 6-, 10- and 20-mg dosing for a 2- to 6-way crossover study in six dosing groups. Optimized design variables were dose and number of subjects in each dose group. The designs were validated using stochastic simulation/re-estimation (SSE). Contrary to expectations, the predicted parameter uncertainty obtained via FIM(total) was larger than the uncertainty in parameter estimates computed by SSE. Nevertheless, the D-optimal designs decreased the uncertainty of parameter estimates relative to the reference designs. Additionally, the improvement for the D-optimal designs was more pronounced using SSE than predicted via FIM(total).
Through the use of an approximate analytic solution and weighting schemes, the FIM(total) for a non-homogeneous, dichotomous Markov-chain phase advanced sleep model was computed and provided more efficient trial designs and increased nonlinear mixed-effects modeling parameter precision.
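    The D-optimality criterion the study builds on can be sketched in its simplest setting: for a linear model the FIM is X'X, and a D-optimal design maximizes its determinant. The candidate designs below are illustrative, not the sleep-trial designs:

```python
# D-optimality sketch: for the linear model y = b0 + b1*x with unit error
# variance, the Fisher information matrix is X'X and the D-criterion is its
# determinant; larger means more precise parameter estimates.
def d_criterion(points):
    n, s1 = len(points), sum(points)
    s2 = sum(x * x for x in points)
    return n * s2 - s1 * s1          # det [[n, s1], [s1, s2]]

clustered = [0.4, 0.5, 0.5, 0.6]     # runs bunched mid-range
spread = [0.0, 0.0, 1.0, 1.0]        # runs pushed to the endpoints
print(d_criterion(clustered), d_criterion(spread))
```

Spreading the runs to the endpoints maximizes the determinant for this model, which is the same logic the study applies to the far richer Markov-chain FIM.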

  12. Portal scatter to primary dose ratio of 4 to 18 MV photon spectra incident on heterogeneous phantoms

    NASA Astrophysics Data System (ADS)

    Ozard, Siobhan R.

    Electronic portal imagers designed and used to verify the positioning of a cancer patient undergoing radiation treatment can also be employed to measure the in vivo dose received by the patient. This thesis investigates the ratio of the dose from patient-scattered particles to the dose from primary (unscattered) photons at the imaging plane, called the scatter to primary dose ratio (SPR). The composition of the SPR according to the origin of scatter is analyzed more thoroughly than in previous studies. A new analytical method for calculating the SPR is developed and experimentally verified for heterogeneous phantoms. A novel technique that applies the analytical SPR method for in vivo dosimetry with a portal imager is evaluated. Monte Carlo simulation was used to determine the imager dose from patient-generated electrons and photons that scatter one or more times within the object. The database of SPRs reported from this investigation is new since the contribution from patient-generated electrons was neglected by previous Monte Carlo studies. The SPR from patient-generated electrons was found here to be as large as 0.03. The analytical SPR method relies on the established result that the scatter dose is uniform for an air gap between the patient and the imager that is greater than 50 cm. This method also applies the hypothesis that first-order Compton scatter alone is sufficient for scatter estimation. A comparison of analytical and measured SPRs for neck, thorax, and pelvis phantoms showed that the maximum difference was within +/-0.03, and the mean difference was less than +/-0.01 for most cases. This accuracy was comparable to similar analytical approaches that are limited to homogeneous phantoms. The analytical SPR method could replace lookup tables of measured scatter doses that can require significant time to measure. In vivo doses were calculated by combining our analytical SPR method and the convolution/superposition algorithm.
Our calculated in vivo doses agreed within +/-3% with the doses measured in the phantom. The present in vivo method was faster compared to other techniques that use convolution/superposition. Our method is a feasible and satisfactory approach that contributes to on-line patient dose monitoring.

  13. Evaluation of structural design concepts for an arrow-wing supersonic cruise aircraft

    NASA Technical Reports Server (NTRS)

    Sakata, I. F.; Davis, G. W.

    1977-01-01

    An analytical study was performed to determine the best structural approach for design of primary wing and fuselage structure of a Mach 2.7 arrow wing supersonic cruise aircraft. Concepts were evaluated considering near term start of design. Emphasis was placed on the complex interactions between thermal stress, static aeroelasticity, flutter, fatigue and fail safe design, static and dynamic loads, and the effects of variations in structural arrangements, concepts and materials on these interactions. Results indicate that a hybrid wing structure incorporating low profile convex beaded and honeycomb sandwich surface panels of titanium alloy 6Al-4V was the most efficient. The substructure includes titanium alloy spar caps reinforced with boron polyimide composites. The fuselage shell consists of hat stiffened skin and frame construction of titanium alloy 6Al-4V. A summary of the study effort is presented, and a discussion of the overall logic, design philosophy and interaction between the analytical methods for supersonic cruise aircraft design are included.

  14. Using FTIR-ATR Spectroscopy to Teach the Internal Standard Method

    ERIC Educational Resources Information Center

    Bellamy, Michael K.

    2010-01-01

    The internal standard method is widely applied in quantitative analyses. However, most analytical chemistry textbooks either omit this topic or only provide examples of a single-point internal standardization. An experiment designed to teach students how to prepare an internal standard calibration curve is described. The experiment is a modified…
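    A sketch of the calibration the experiment teaches: the curve is built from the analyte-to-internal-standard signal ratio, so that a constant amount of internal standard in every sample cancels run-to-run variation. All data values below are invented for illustration:

```python
# Internal-standard calibration sketch: the calibration curve is the
# analyte/IS peak-area ratio versus analyte concentration; quantitation of
# an unknown inverts the fitted line. Data values are illustrative.
concs = [1.0, 2.0, 4.0, 8.0]             # analyte concentrations
analyte_area = [10.2, 20.5, 40.1, 81.0]  # analyte peak areas
is_area = [50.0, 51.0, 49.5, 50.5]       # internal-standard peak areas
ratios = [a / s for a, s in zip(analyte_area, is_area)]

# least-squares line: ratio = m * conc + b
n = len(concs)
mx, my = sum(concs) / n, sum(ratios) / n
m = (sum((x - mx) * (y - my) for x, y in zip(concs, ratios))
     / sum((x - mx) ** 2 for x in concs))
b = my - m * mx

unknown_ratio = 0.60                     # measured on an unknown sample
print(f"unknown concentration = {(unknown_ratio - b) / m:.2f}")
```

This multi-point curve is exactly what distinguishes the experiment from the single-point internal standardization most textbooks show.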

  15. Optimal cure cycle design of a resin-fiber composite laminate

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.; Sheen, Jeenson

    1987-01-01

    A unified computer-aided design method was studied for the cure cycle design that incorporates an optimal design technique with the analytical model of a composite cure process. The preliminary results of using this proposed method for optimal cure cycle design are reported and discussed. The cure process of interest is the compression molding of a polyester which is described by a diffusion reaction system. The finite element method is employed to convert the initial boundary value problem into a set of first order differential equations which are solved simultaneously by the DE program. The equations for thermal design sensitivities are derived by using the direct differentiation method and are solved by the DE program. A recursive quadratic programming algorithm with an active set strategy called a linearization method is used to optimally design the cure cycle, subject to the given design performance requirements. The difficulty of casting the cure cycle design process into a proper mathematical form is recognized. Various optimal design problems are formulated to address these aspects. The optimal solutions of these formulations are compared and discussed.

  16. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1991-01-01

    Some of the many analytical models in human-computer interface design that are currently being developed are described. The usefulness of analytical models for human-computer interface design is evaluated. Can the use of analytical models be recommended to interface designers? The answer, based on the empirical research summarized here, is: not at this time. There are too many unanswered questions concerning the validity of models and their ability to meet the practical needs of design organizations.

  17. Coupled rotor/fuselage dynamic analysis of the AH-1G helicopter and correlation with flight vibrations data

    NASA Technical Reports Server (NTRS)

    Corrigan, J. C.; Cronkhite, J. D.; Dompka, R. V.; Perry, K. S.; Rogers, J. P.; Sadler, S. G.

    1989-01-01

    Under a research program designated Design Analysis Methods for VIBrationS (DAMVIBS), existing analytical methods are used for calculating coupled rotor-fuselage vibrations of the AH-1G helicopter for correlation with flight test data from an AH-1G Operational Load Survey (OLS) test program. The analytical representation of the fuselage structure is based on a NASTRAN finite element model (FEM), which has been developed, extensively documented, and correlated with ground vibration test. One procedure that was used for predicting coupled rotor-fuselage vibrations using the advanced Rotorcraft Flight Simulation Program C81 and NASTRAN is summarized. Detailed descriptions of the analytical formulation of rotor dynamics equations, fuselage dynamic equations, coupling between the rotor and fuselage, and solutions to the total system of equations in C81 are included. Analytical predictions of hub shears for main rotor harmonics 2p, 4p, and 6p generated by C81 are used in conjunction with 2p OLS measured control loads and a 2p lateral tail rotor gearbox force, representing downwash impingement on the vertical fin, to excite the NASTRAN model. NASTRAN is then used to correlate with measured OLS flight test vibrations. Blade load comparisons predicted by C81 showed good agreement. In general, the fuselage vibration correlations show good agreement between analysis and test in vibration response through 15 to 20 Hz.

  18. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with a commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment in conjunction with the response surface method provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem or a system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  19. Merits and limitations of optimality criteria method for structural optimization

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Guptill, James D.; Berke, Laszlo

    1993-01-01

    The merits and limitations of the optimality criteria (OC) method for the minimum weight design of structures subjected to multiple load conditions under stress, displacement, and frequency constraints were investigated by examining several numerical examples. The examples were solved utilizing the Optimality Criteria Design Code that was developed for this purpose at NASA Lewis Research Center. This OC code incorporates OC methods available in the literature with generalizations for stress constraints, fully utilized design concepts, and hybrid methods that combine both techniques. Salient features of the code include multiple choices for Lagrange multiplier and design variable update methods, design strategies for several constraint types, variable linking, displacement and integrated force method analyzers, and analytical and numerical sensitivities. The performance of the OC method, on the basis of the examples solved, was found to be satisfactory for problems with few active constraints or with small numbers of design variables. For problems with large numbers of behavior constraints and design variables, the OC method appears to follow a subset of active constraints that can result in a heavier design. The computational efficiency of OC methods appears to be similar to some mathematical programming techniques.
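    The flavor of an optimality-criteria update can be shown with the classic stress-ratio (fully stressed design) rule, in which each member area is scaled by its stress ratio until every member reaches the allowable stress. A minimal sketch for statically determinate members, where member forces do not change with the areas; the forces and allowable are illustrative:

```python
# Stress-ratio (fully stressed design) sketch, a classic optimality-criteria
# update: each member area is scaled by stress/allowable until every member
# sits at the allowable stress. Forces are taken as fixed for simplicity
# (statically determinate case).
def stress_ratio_resize(areas, forces, allowable, iters=5):
    for _ in range(iters):
        stresses = [abs(f) / a for f, a in zip(forces, areas)]
        areas = [a * s / allowable for a, s in zip(areas, stresses)]
    return areas

areas = stress_ratio_resize([1.0, 1.0], [2000.0, 500.0], allowable=250.0)
print(areas)
```

In the determinate case the rule converges in one step; the indeterminate, multi-constraint case is where the heavier-design behavior noted above can appear.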

  20. Grain Propellant Optimization Using Real Code Genetic Algorithm (RCGA)

    NASA Astrophysics Data System (ADS)

    Farizi, Muhammad Farraz Al; Oktovianus Bura, Romie; Fajar Junjunan, Soleh; Jihad, Bagus H.

    2018-04-01

    Grain propellant design is important in rocket motor design: the total impulse and specific impulse (Isp) of the rocket motor are influenced by the grain propellant design. One way to obtain a grain propellant shape that generates the maximum total impulse is to use the Real Code Genetic Algorithm (RCGA) method. In this paper, RCGA is applied to the star grain of the Rx-450. The burn area of the propellant is found with an analytical method, while the combustion chamber pressures are obtained from zero-dimensional equations. The optimization result reaches the desired target and increases the total impulse by 3.3% over the initial design of the Rx-450.
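    A minimal real-coded GA of the kind named above can be sketched with BLX-alpha crossover and Gaussian mutation on real-valued genes. The one-gene objective below is a toy stand-in for the total-impulse model, not the Rx-450 grain code:

```python
# Minimal real-coded genetic algorithm (RCGA) sketch: truncation selection,
# BLX-alpha crossover, and Gaussian mutation on a single real-valued gene.
# The objective is a toy stand-in with its maximum at x = 2.0.
import random

def objective(x):
    return -(x - 2.0) ** 2

def rcga(pop_size=20, gens=60, bounds=(0.0, 5.0), alpha=0.5, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=objective, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            span = abs(p1 - p2)
            c = rng.uniform(min(p1, p2) - alpha * span,
                            max(p1, p2) + alpha * span)   # BLX-alpha crossover
            c += rng.gauss(0.0, 0.05)                     # Gaussian mutation
            children.append(min(max(c, lo), hi))          # clamp to bounds
        pop = parents + children
    return max(pop, key=objective)

best = rcga()
print(f"best gene: {best:.3f}")
```

Real coding avoids the discretization error of binary-coded GAs, which matters when the design variables (here, grain geometry parameters) are continuous.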

  1. Guidance for Data Useability in Risk Assessment (Part B-4), Final, May, 1992

    EPA Pesticide Factsheets

    This chapter provides guidance to the RPM and the risk assessor for designing an effective sampling plan and selecting suitable analytical methods to collect environmental data for use in baseline risk assessments.

  2. Development and optimization of an energy-regenerative suspension system under stochastic road excitation

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Hsieh, Chen-Yu; Golnaraghi, Farid; Moallem, Mehrdad

    2015-11-01

    In this paper a vehicle suspension system with energy harvesting capability is developed, and an analytical methodology for the optimal design of the system is proposed. The optimization technique provides design guidelines for determining the stiffness and damping coefficients aimed at the optimal performance in terms of ride comfort and energy regeneration. The corresponding performance metrics are selected as root-mean-square (RMS) of sprung mass acceleration and expectation of generated power. The actual road roughness is considered as the stochastic excitation defined by ISO 8608:1995 standard road profiles and used in deriving the optimization method. An electronic circuit is proposed to provide variable damping in real time based on the optimization rule. A test-bed is utilized and the experiments under different driving conditions are conducted to verify the effectiveness of the proposed method. The test results suggest that the analytical approach is credible in determining the optimality of system performance.
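    The ride-comfort metric used here is simple to state: the root mean square of the sprung-mass acceleration samples. A sketch with illustrative data:

```python
# Ride-comfort metric sketch: root mean square (RMS) of sprung-mass
# acceleration samples. The sample values are illustrative.
import math

def rms(samples):
    return math.sqrt(sum(a * a for a in samples) / len(samples))

accel = [0.3, -0.4, 0.0, 0.0]  # m/s^2
print(f"RMS acceleration: {rms(accel):.3f} m/s^2")
```

The optimization then trades this RMS value against the expected harvested power over the stochastic road profile.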

  3. Establishment and reliability evaluation of the design space for HPLC analysis of six alkaloids in Coptis chinensis (Huanglian) using Bayesian approach.

    PubMed

    Dai, Sheng-Yun; Xu, Bing; Zhang, Yi; Li, Jian-Yu; Sun, Fei; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2016-09-01

    Coptis chinensis (Huanglian) is a commonly used traditional Chinese medicine (TCM) herb and alkaloids are the most important chemical constituents in it. In the present study, an isocratic reverse phase high performance liquid chromatography (RP-HPLC) method allowing the separation of six alkaloids in Huanglian was for the first time developed under the quality by design (QbD) principles. First, five chromatographic parameters were identified to construct a Plackett-Burman experimental design. The critical resolution, analysis time, and peak width were responses modeled by multivariate linear regression. The results showed that the percentage of acetonitrile, concentration of sodium dodecyl sulfate, and concentration of potassium phosphate monobasic were statistically significant parameters (P < 0.05). Then, the Box-Behnken experimental design was applied to further evaluate the interactions between the three parameters on selected responses. Full quadratic models were built and used to establish the analytical design space. Moreover, the reliability of design space was estimated by the Bayesian posterior predictive distribution. The optimal separation was predicted at 40% acetonitrile, 1.7 g·mL⁻¹ of sodium dodecyl sulfate and 0.03 mol·mL⁻¹ of potassium phosphate monobasic. Finally, the accuracy profile methodology was used to validate the established HPLC method. The results demonstrated that the QbD concept could be efficiently used to develop a robust RP-HPLC analytical method for Huanglian. Copyright © 2016 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
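    The Box-Behnken layout used in the study is easy to generate: each pair of factors is run through a 2x2 factorial at coded levels of plus/minus one while the remaining factor is held at its center level, plus replicated center points. A sketch for three factors, which gives the standard 15-run design; the factor count and number of center runs are parameters:

```python
# Box-Behnken design sketch: for each pair of factors, run a 2x2 factorial
# at coded levels +/-1 with every other factor at its center level, then
# append replicated center points. Three factors give the standard 15 runs.
from itertools import combinations, product

def box_behnken(n_factors=3, center_runs=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * n_factors for _ in range(center_runs)]
    return runs

design = box_behnken()
print(f"{len(design)} runs")
```

Because no run combines extreme levels of all factors at once, the design supports a full quadratic model while avoiding corner conditions that may be chromatographically infeasible.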

  4. Design considerations of a hollow microneedle-optofluidic biosensing platform incorporating enzyme-linked assays

    NASA Astrophysics Data System (ADS)

    Ranamukhaarachchi, Sahan A.; Padeste, Celestino; Häfeli, Urs O.; Stoeber, Boris; Cadarso, Victor J.

    2018-02-01

    A hollow metallic microneedle is integrated with microfluidics and photonic components to form a microneedle-optofluidic biosensor suitable for therapeutic drug monitoring (TDM) in biological fluids, like interstitial fluid, that can be collected in a painless and minimally-invasive manner. The microneedle inner lumen surface is bio-functionalized to trap and bind target analytes on-site in a sample volume as small as 0.6 nl, and houses an enzyme-linked assay on its 0.06 mm2 wall. The optofluidic components are designed to rapidly quantify target analytes present in the sample and collected in the microneedle using a simple and sensitive absorbance scheme. This contribution describes how the biosensor components were optimized to detect in vitro streptavidin-horseradish peroxidase (Sav-HRP) as a model analyte over a large detection range (0-7.21 µM) with a very low limit of detection (60.2 nM). This biosensor utilizes the lowest analyte volume reported for TDM with microneedle technology, and presents significant avenues to improve current TDM methods for patients, by potentially eliminating blood draws for several drug candidates.
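    The absorbance readout described above reduces, in the idealized case, to the Beer-Lambert law A = log10(I0/I) = eps*l*c. A sketch assuming illustrative values for the molar absorptivity and path length, not the device's actual parameters:

```python
# Beer-Lambert sketch for an absorbance readout: concentration follows from
# A = log10(I0/I) = eps * l * c. The eps and path-length values passed in
# below are illustrative, not the biosensor's measured parameters.
import math

def concentration(i_incident, i_transmitted, eps, path_cm):
    absorbance = math.log10(i_incident / i_transmitted)
    return absorbance / (eps * path_cm)

c = concentration(i_incident=100.0, i_transmitted=10.0, eps=2.0, path_cm=0.5)
print(f"c = {c} (arbitrary units)")
```

In practice the assay is calibrated against standards rather than computed from first principles, but the linear absorbance-concentration relation is what makes the simple scheme quantitative.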

  5. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure developments to follow a similar approach. While development and optimization of analytical procedure following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, control strategy can be set.

  6. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    PubMed

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

    Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase. Especially small organic acids and nitrogen-containing compounds are of interest. The efficient derivatization reagent methyl chloroformate was used to make analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles possible. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes of small organic acids, pyrazines, phenol, and cyclic ketones were quantified. An additional three analytes were pseudoquantified with use of standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. Method validation was evaluated with repeatability, and spike recoveries of all 29 analytes were obtained. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the steady-state condition of the flow reactor was obtained and the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process.

  7. Implementation of structural response sensitivity calculations in a large-scale finite-element analysis system

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Rogers, J. L., Jr.

    1982-01-01

    The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.
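    The advantage of analytical derivatives over finite differences that the sample results indicate can be seen in one line of calculus: the analytic sensitivity is exact, while a forward difference carries a truncation error proportional to the step size. A sketch with a notional stress-thickness relation, not SPAR output:

```python
# Sensitivity sketch: the analytical derivative is exact, while a forward
# finite difference carries truncation error proportional to the step h.
# The response is a notional stress-vs-thickness relation, not SPAR output.
def response(t):
    return 500.0 / t            # notional stress for thickness t

def analytic_sens(t):
    return -500.0 / t ** 2      # exact d(response)/dt

def fd_sens(t, h):
    return (response(t + h) - response(t)) / h

t = 2.0
for h in (0.1, 0.01, 0.001):
    err = abs(fd_sens(t, h) - analytic_sens(t))
    print(f"h = {h:<6} error = {err:.4f}")
```

Beyond accuracy, the analytic route also avoids one extra analysis per design variable per step, which is the cost driver in a large finite-element model.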

  8. First-order reliability application and verification methods for semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-11-01

    Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored in conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments; stress audits are shown to be arbitrary and incomplete, and the concept compromises the performance of high-strength materials. A reliability method is proposed that combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety-index expression. The application is reduced to solving for a design factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this design factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the development of semistatic structural designs.
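
    The safety-index idea the proposed method builds on can be illustrated with the classical first-order expression β = (μ_S − μ_L)/√(σ_S² + σ_L²) for normally distributed strength S and load L. The strength and load statistics below are invented for illustration, and the paper's uncertainty-error corrections to the safety index are not reproduced here.

```python
from statistics import NormalDist

def safety_index(mu_strength, sd_strength, mu_load, sd_load):
    """First-order safety index beta for normal strength S and load L:
    beta = (mu_S - mu_L) / sqrt(sd_S^2 + sd_L^2)."""
    return (mu_strength - mu_load) / (sd_strength ** 2 + sd_load ** 2) ** 0.5

# Hypothetical stress-analysis numbers (e.g. ksi); not from the paper.
beta = safety_index(mu_strength=60.0, sd_strength=4.0, mu_load=40.0, sd_load=3.0)
reliability = NormalDist().cdf(beta)  # P(S > L) under the normal assumption
print(round(beta, 2), round(reliability, 5))
```

    Solving this relation in reverse, for the design factor that yields a specified reliability, is the step the abstract describes as replacing the conventional safety factor.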

  9. Handbook of Analytical Methods for Textile Composites

    NASA Technical Reports Server (NTRS)

    Cox, Brian N.; Flanagan, Gerry

    1997-01-01

    The purpose of this handbook is to introduce models and computer codes for predicting the properties of textile composites. The handbook includes several models for predicting the stress-strain response all the way to ultimate failure; methods for assessing work of fracture and notch sensitivity; and design rules for avoiding certain critical mechanisms of failure, such as delamination, by proper textile design. The following textiles received some treatment: 2D woven, braided, and knitted/stitched laminates and 3D interlock weaves, and braids.

  10. Locating bomb factories by detecting hydrogen peroxide.

    PubMed

    Romolo, Francesco Saverio; Connell, Samantha; Ferrari, Carlotta; Suarez, Guillaume; Sauvain, Jean-Jacques; Hopf, Nancy B

    2016-11-01

    The analytical capability to detect hydrogen peroxide vapour can play a key role in localizing a site where a H2O2-based Improvised Explosive (IE) is manufactured. In security operations it is very important to obtain information quickly; for this reason, an analytical method to be used in the field requires portable devices. The authors have developed the first analytical method based on a portable luminometer, specifically designed and validated to locate IE manufacturing sites using quantitative on-site vapour analysis for H2O2. The method was tested both indoors and outdoors. The results demonstrate that the detection of H2O2 vapours could allow police forces to locate the site while terrorists are preparing an attack. The collected data are also very important for developing new sensors able to give an early alarm if located at a proper distance from a site where an H2O2-based IE is prepared. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. The Analytical Chemistry of Drug Monitoring in Athletes

    NASA Astrophysics Data System (ADS)

    Bowers, Larry D.

    2009-07-01

    The detection and deterrence of the abuse of performance-enhancing drugs in sport are important to maintaining a level playing field among athletes and to decreasing the risk to athletes’ health. The World Anti-Doping Program consists of six documents, three of which play a role in analytical development: The World Anti-Doping Code, The List of Prohibited Substances and Methods, and The International Standard for Laboratories. Among the classes of prohibited substances, three have given rise to the most recent analytical developments in the field: anabolic agents; peptide and protein hormones; and methods to increase oxygen delivery to the tissues, including recombinant erythropoietin. Methods for anabolic agents, including designer steroids, have been enhanced through the use of liquid chromatography/tandem mass spectrometry and gas chromatography/combustion/isotope-ratio mass spectrometry. Protein and peptide identification and quantification have benefited from advances in liquid chromatography/tandem mass spectrometry. Incorporation of techniques such as flow cytometry and isoelectric focusing have supported the detection of blood doping.

  12. Circular Functions Based Comprehensive Analysis of Plastic Creep Deformations in the Fiber Reinforced Composites

    NASA Astrophysics Data System (ADS)

    Monfared, Vahid

    2016-12-01

    An analytically based model is presented for analyzing plastic deformations in reinforced materials using circular (trigonometric) functions. The analytical method predicts the creep behavior of fibrous composites from basic and constitutive equations under a tensile axial stress. A new insight of the work is the prediction of several important behaviors of the creeping matrix, and in the present model this prediction is simpler than with available methods. The principal creep strain rate is particularly noteworthy for designing fibrous composites under creep. Analysis of this parameter in reinforced materials is necessary for failure, fracture, and fatigue studies in the creep of short-fiber composites. Shuttles, spaceships, turbine blades and discs, and nozzle guide vanes are commonly subjected to creep effects. Predicting creep behavior is also significant for designing optoelectronic and photonic advanced composites with optical fibers. The results show a uniform behavior with constant gradient in the principal creep strain rate, and indicate that creep rupture may occur at the fiber end. Finally, good agreement is found between the analytical and FEM results.

  13. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    PubMed Central

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859

  14. Microfluidic-Based Robotic Sampling System for Radioactive Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack D. Law; Julia L. Tripp; Tara E. Smith

    A novel microfluidic based robotic sampling system has been developed for sampling and analysis of liquid solutions in nuclear processes. This system couples the use of a microfluidic sample chip with a robotic system designed to allow remote, automated sampling of process solutions in-cell and facilitates direct coupling of the microfluidic sample chip with analytical instrumentation. This system provides the capability for near real time analysis, reduces analytical waste, and minimizes the potential for personnel exposure associated with traditional sampling methods. A prototype sampling system was designed, built and tested. System testing demonstrated operability of the microfluidic based sample system and identified system modifications to optimize performance.

  15. Systematic Development and Validation of a Thin-Layer Densitometric Bioanalytical Method for Estimation of Mangiferin Employing Analytical Quality by Design (AQbD) Approach.

    PubMed

    Khurana, Rajneet Kaur; Rao, Satish; Beg, Sarwar; Katare, O P; Singh, Bhupinder

    2016-01-01

    The present work aims at the systematic development of a simple, rapid and highly sensitive densitometry-based thin-layer chromatographic method for the quantification of mangiferin in bioanalytical samples. Initially, the quality target method profile was defined and critical analytical attributes (CAAs) earmarked, namely, retardation factor (Rf), peak height, capacity factor, theoretical plates and separation number. Face-centered cubic design was selected for optimization of volume loaded and plate dimensions as the critical method parameters selected from screening studies employing D-optimal and Plackett-Burman design studies, followed by evaluating their effect on the CAAs. The mobile phase containing a mixture of ethyl acetate : acetic acid : formic acid : water in a 7 : 1 : 1 : 1 (v/v/v/v) ratio was finally selected as the optimized solvent for apt chromatographic separation of mangiferin at 262 nm with Rf 0.68 ± 0.02 and all other parameters within the acceptance limits. Method validation studies revealed high linearity in the concentration range of 50-800 ng/band for mangiferin. The developed method showed high accuracy, precision, ruggedness, robustness, specificity, sensitivity, selectivity and recovery. In a nutshell, the bioanalytical method for analysis of mangiferin in plasma revealed the presence of well-resolved peaks and high recovery of mangiferin. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Application of advanced control techniques to aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    Lehtinen, B.

    1984-01-01

    Two programs are described which involve the application of advanced control techniques to the design of engine control algorithms. Multivariable control theory is used in the F100 MVCS (multivariable control synthesis) program to design controls which coordinate the control inputs for improved engine performance. A systematic method for handling a complex control design task is given. Methods of analytical redundancy are aimed at increasing the control system reliability. The F100 DIA (detection, isolation, and accommodation) program, which investigates the use of software to replace or augment hardware redundancy for certain critical engine sensors, is described.

  17. A simple method to produce 2D and 3D microfluidic paper-based analytical devices for clinical analysis.

    PubMed

    de Oliveira, Ricardo A G; Camargo, Fiamma; Pesquero, Naira C; Faria, Ronaldo Censi

    2017-03-08

    This paper describes the fabrication of 2D and 3D microfluidic paper-based analytical devices (μPADs) for monitoring glucose, total protein, and nitrite in blood serum and artificial urine. A new method of cutting and sealing filter paper to construct μPADs was demonstrated. Using an inexpensive home cutter printer, soft cellulose-based filter paper was easily and precisely cut to produce patterned hydrophilic microchannels. 2D and 3D μPADs were designed with three detection zones each for the colorimetric detection of the analytes. A small volume of sample was added to the μPADs, which were photographed after 15 min using a digital camera. Both μPADs presented an excellent analytical performance for all analytes. The 2D device was applied in artificial urine samples and reached limits of detection (LODs) of 0.54 mM, 5.19 μM, and 2.34 μM for glucose, protein, and nitrite, respectively. The corresponding LODs of the 3D device applied for detecting the same analytes in artificial blood serum were 0.44 mM, 1.26 μM, and 4.35 μM. Copyright © 2017 Elsevier B.V. All rights reserved.
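
    For context on how limits of detection like those above are commonly derived from colorimetric calibration data, here is a minimal sketch using the widely used ICH-style convention LOD = 3.3·σ/slope. The slope and blank-noise values below are hypothetical, and the authors may have used a different convention.

```python
def lod(sd_blank, slope, k=3.3):
    """Limit of detection as k * (standard deviation of the blank) / slope;
    k = 3.3 is the common ICH Q2 convention."""
    return k * sd_blank / slope

# Hypothetical calibration for a colorimetric glucose zone: slope in color
# intensity units per mM, blank standard deviation in the same intensity units.
print(round(lod(sd_blank=1.2, slope=7.5), 3), "mM")  # -> 0.528 mM
```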

  18. Analytical investigation of aerodynamic characteristics of highly swept wings with separated flow

    NASA Technical Reports Server (NTRS)

    Reddy, C. S.

    1980-01-01

    Many modern aircraft designed for supersonic speeds employ highly swept-back and low-aspect-ratio wings with sharp or thin edges. Flow separation occurs near the leading and tip edges of such wings at moderate to high angles of attack. Attempts have been made over the years to develop analytical methods for predicting the aerodynamic characteristics of such aircraft. Before any method can really be useful, it must be tested against a standard set of data to determine its capabilities and limitations. The present work undertakes such an investigation. Three methods are considered: the free-vortex-sheet method (Weber et al., 1975), the vortex-lattice method with suction analogy (Lamar and Gloss, 1975), and the quasi-vortex lattice method of Mehrotra (1977). Both flat and cambered wings of different configurations, for which experimental data are available, are studied and comparisons made.

  19. (Bio)Sensing Using Nanoparticle Arrays: On the Effect of Analyte Transport on Sensitivity.

    PubMed

    Lynn, N Scott; Homola, Jiří

    2016-12-20

    There has recently been an extensive amount of work regarding the development of optical, electrical, and mechanical (bio)sensors employing planar arrays of surface-bound nanoparticles. The sensor output for these systems is dependent on the rate at which analyte is transported to, and interacts with, each nanoparticle in the array. There has so far been little discussion on the relationship between the design parameters of an array and the interplay of convection, diffusion, and reaction. Moreover, current methods providing such information require extensive computational simulation. Here we demonstrate that the rate of analyte transport to a nanoparticle array can be quantified analytically. We show that such rates are bound by both the rate to a single NP and that to a planar surface (of the same size as the array), with the specific rate determined by the fill fraction: the ratio between the total surface area used for biomolecular capture with respect to the entire sensing area. We characterize analyte transport to arrays with respect to changes in numerous parameters relevant to experiment, including variation of the nanoparticle shape and size, packing density, flow conditions, and analyte diffusivity. We also explore how analyte capture is dependent on the kinetic parameters related to an affinity-based biosensor, and furthermore, we classify the conditions under which the array might be diffusion- or reaction-limited. The results obtained herein are applicable toward the design and optimization of all (bio)sensors based on nanoparticle arrays.
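
    Two of the quantities this abstract discusses are textbook material and can be illustrated directly: the fill fraction of an array and the diffusion-limited (Smoluchowski) capture rate to a single isolated spherical particle. The geometry and concentration below are invented, and the paper's analytical interpolation between the single-particle and planar bounds is not reproduced here.

```python
import math

def fill_fraction(n_particles, particle_radius, sensing_area):
    """Ratio of total capture area (n * pi * r^2) to the full sensing area."""
    return n_particles * math.pi * particle_radius ** 2 / sensing_area

def smoluchowski_rate(diffusivity, radius, concentration):
    """Diffusion-limited capture rate to one isolated spherical particle:
    J = 4 * pi * D * R * c0, in molecules per second."""
    return 4.0 * math.pi * diffusivity * radius * concentration

# Invented example: 10,000 particles of 50 nm radius on a 100 um x 100 um spot,
# analyte diffusivity 1e-10 m^2/s, bulk concentration 6e20 molecules/m^3 (~1 uM).
f = fill_fraction(10_000, 50e-9, (100e-6) ** 2)
j = smoluchowski_rate(1e-10, 50e-9, 6e20)
print(f"fill fraction ~ {f:.4f}, per-particle diffusion limit ~ {j:.0f} molecules/s")
```

    A small fill fraction like this one is the regime where per-particle transport stays close to the isolated-particle limit; as the fill fraction approaches unity, the total rate approaches that of an equivalent planar sensing surface.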

  20. Multiresidue analytical method for pharmaceuticals and personal care products in sewage and sewage sludge by online direct immersion SPME on-fiber derivatization - GCMS.

    PubMed

    López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl

    2018-08-15

    The work presented here aimed to develop an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng L⁻¹ in sewage and 10 ng g⁻¹ in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng L⁻¹ and 900 ng g⁻¹, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulation requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. EPA-ORD MEASUREMENT SCIENCE SUPPORT FOR HOMELAND SECURITY

    EPA Science Inventory

    This presentation will describe the organization and the research and development activities of the ORD National Exposure Measurements Center and will focus on the Center's planned role in providing analytical method development, statistical sampling and design guidance, quality ...

  2. Ion-pairing HPLC methods to determine EDTA and DTPA in small molecule and biological pharmaceutical formulations.

    PubMed

    Wang, George; Tomasella, Frank P

    2016-06-01

    Ion-pairing high-performance liquid chromatography-ultraviolet (HPLC-UV) methods were developed to determine two commonly used chelating agents, ethylenediaminetetraacetic acid (EDTA) in Abilify® (a small molecule drug with aripiprazole as the active pharmaceutical ingredient) oral solution and diethylenetriaminepentaacetic acid (DTPA) in Yervoy® (a monoclonal antibody drug with ipilimumab as the active pharmaceutical ingredient) intravenous formulation. Since the analytes, EDTA and DTPA, do not contain chromophores, transition metal ions (Cu²⁺, Fe³⁺), which generate highly stable metallocomplexes with the chelating agents, were added into the sample preparation to enhance UV detection. The use of metallocomplexes with ion-pairing chromatography provides the ability to achieve the desired sensitivity and selectivity in the development of the method. Specifically, the sample preparation involving metallocomplex formation allowed sensitive UV detection: copper was utilized for the determination of EDTA and iron for the determination of DTPA. In the case of EDTA, a gradient mobile phase separated the components of the formulation from the analyte. In the method for DTPA, the active drug substance, ipilimumab, was eluted in the void. In addition, the optimization of the concentration of the ion-pairing reagent was discussed as a means of enhancing the retention of the aminopolycarboxylic acids (APCAs), including EDTA and DTPA, and the specificity of the method. The analytical method development was designed based on the chromatographic properties of the analytes, the nature of the sample matrix and the intended purpose of the method. Validation data were presented for the two methods. Finally, both methods were successfully utilized in determining the fate of the chelates.

  3. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    PubMed

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% in analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
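
    The screening step described above can be sketched as a 2^(6−2) fractional factorial in coded ±1 levels. The generators E = ABC and F = BCD used here are a standard resolution-IV textbook choice assumed for illustration; the paper does not state which generators were used.

```python
from itertools import product

factors = ["injection volume", "injection temperature", "oven program",
           "detector temperature", "carrier gas flow rate", "solvent ratio"]

# Base factors A-D take all 2^4 sign combinations; the last two factors are
# aliased onto interactions via the generators E = ABC and F = BCD.
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c
    f = b * c * d
    runs.append((a, b, c, d, e, f))

print(len(factors), "factors screened in", len(runs), "runs (vs 64 for a full 2^6)")
```

    Each row is one GC-FID run in coded units; main effects estimated from these 16 runs identify the dominant factors, which are then refined with the central composite design the abstract mentions.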

  4. Shape optimization using a NURBS-based interface-enriched generalized FEM

    DOE PAGES

    Najafi, Ahmad R.; Safdari, Masoud; Tortorelli, Daniel A.; ...

    2016-11-26

    This study presents a gradient-based shape optimization over a fixed mesh using a non-uniform rational B-splines-based interface-enriched generalized finite element method, applicable to multi-material structures. In the proposed method, non-uniform rational B-splines are used to parameterize the design geometry precisely and compactly by a small number of design variables. An analytical shape sensitivity analysis is developed to compute derivatives of the objective and constraint functions with respect to the design variables. Subtle but important new terms involve the sensitivity of shape functions and their spatial derivatives. Verification and illustrative problems are solved to demonstrate the precision and capability of the method.

  5. New vistas in refractive laser beam shaping with an analytic design approach

    NASA Astrophysics Data System (ADS)

    Duerr, Fabian; Thienpont, Hugo

    2014-05-01

    Many commercial, medical and scientific applications of the laser have been developed since its invention. Some of these applications require a specific beam irradiance distribution to ensure optimal performance. Often, it is possible to apply geometrical methods to design laser beam shapers. This common design approach is based on the ray mapping between the input plane and the output beam. Geometric ray mapping designs with two plano-aspheric lenses have been thoroughly studied in the past. Even though analytic expressions for various ray mapping functions do exist, the surface profiles of the lenses are still calculated numerically. In this work, we present an alternative novel design approach that allows direct calculation of the rotationally symmetric lens profiles described by analytic functions. Starting from the example of a basic beam expander, a set of functional differential equations is derived from Fermat's principle. This formalism allows calculating the exact lens profiles described by Taylor series coefficients up to very high orders. To demonstrate the versatility of this new approach, two further cases are solved: a Gaussian to flat-top irradiance beam shaping system, and a beam shaping system that generates a more complex dark-hollow Gaussian (donut-like) irradiance profile with zero intensity in the on-axis region. The presented ray tracing results confirm the high accuracy of all calculated solutions and indicate the potential of this design approach for refractive beam shaping applications.

  6. Solar thermal storage applications program

    NASA Astrophysics Data System (ADS)

    Peila, W. C.

    1982-12-01

    The efforts of the Storage Applications Program are reviewed. The program concentrated on the investigation of storage media and the evaluation of storage methods. Extensive effort was given to experimental and analytical investigations of nitrate salts. Two tasks were the preliminary design of a 1200 MW(th) system and the design, construction, operation, and evaluation of a subsystem research experiment, which utilized the same design. Some preliminary conclusions drawn from the subsystem research experiment are given.

  7. Analysis of Slabs-on-Grade for a Variety of Loading and Support Conditions.

    DTIC Science & Technology

    1984-12-01

    applications, namely the problem of a slab-on-grade, as encountered in the analysis and design of rigid pavements. This is one of the few ... proper design and construction methods are adhered to. There are several additional reasons, entirely due to recent developments, that warrant the ... conservative designs led to almost imperceptible pavement deformations, thus warranting the term "rigid pavements". Modern-day analytical techniques ...

  8. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.

  9. Small Gas Turbine Combustor Primary Zone Study

    NASA Technical Reports Server (NTRS)

    Sullivan, R. E.; Young, E. R.; Miles, G. A.; Williams, J. R.

    1983-01-01

    A development process is described which consists of design, fabrication, and preliminary test evaluations of three approaches to internal aerodynamic primary zone flow patterns: (1) conventional double vortex swirl stabilization; (2) reverse flow swirl stabilization; and (3) a large single vortex flow system. Each concept incorporates special design features aimed at extending the performance capability of the small engine combustor. Since the inherent geometry of these combustors results in a small combustion zone height and a high surface-area-to-volume ratio, the design features focus on internal aerodynamics, fuel placement, and advanced cooling. The combustors are evaluated on a full-scale annular combustor rig. A correlation of the primary zone performance with the overall performance is accomplished using three intrusion-type gas sampling probes located at the exit of the primary zone section. Empirical and numerical methods are used for designing and predicting the performance of the three combustor concepts and their subsequent modifications. The calibration of analytical procedures with actual test results permits an updating of the analytical design techniques applicable to small reverse-flow annular combustors.

  10. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of data analysis algorithms exist (e.g., GPU-based algorithms for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g., with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g., Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g., computational steering on large-scale high-performance computing platforms) to bring human judgement into the analysis loop, or with new approaches to databases designed to support new forms of unstructured or semi-structured data, as opposed to traditional structured databases (e.g., relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
    This talk will provide insights about big data analytics methods in the context of science within various communities and offer different views of how approaches of correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  11. Rapid Method Development in Hydrophilic Interaction Liquid Chromatography for Pharmaceutical Analysis Using a Combination of Quantitative Structure-Retention Relationships and Design of Experiments.

    PubMed

    Taraji, Maryam; Haddad, Paul R; Amos, Ruth I J; Talebi, Mohammad; Szucs, Roman; Dolan, John W; Pohl, Chris A

    2017-02-07

    A design-of-experiment (DoE) model was developed, able to describe the retention times of a mixture of pharmaceutical compounds in hydrophilic interaction liquid chromatography (HILIC) under all possible combinations of acetonitrile content, salt concentration, and mobile-phase pH with R² > 0.95. Further, a quantitative structure-retention relationship (QSRR) model was developed to predict retention times for new analytes, based only on their chemical structures, with a root-mean-square error of prediction (RMSEP) as low as 0.81%. A compound classification based on the concept of similarity was applied prior to QSRR modeling. Finally, we utilized a combined QSRR-DoE approach to propose an optimal design space in a quality-by-design (QbD) workflow to facilitate the HILIC method development. The mathematical QSRR-DoE model was shown to be highly predictive when applied to an independent test set of unseen compounds in unseen conditions with a RMSEP value of 5.83%. The QSRR-DoE computed retention time of pharmaceutical test analytes and subsequently calculated separation selectivity was used to optimize the chromatographic conditions for efficient separation of targets. A Monte Carlo simulation was performed to evaluate the risk of uncertainty in the model's prediction, and to define the design space where the desired quality criterion was met. Experimental realization of peak selectivity between targets under the selected optimal working conditions confirmed the theoretical predictions. These results demonstrate how discovery of optimal conditions for the separation of new analytes can be accelerated by the use of appropriate theoretical tools.
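
    A Monte Carlo risk check of the kind described can be sketched as follows. The toy quadratic retention model, its coefficient uncertainties, and the 0.5 min separation criterion are all invented placeholders, not the authors' fitted QSRR-DoE model; the point is only the mechanism of propagating model uncertainty into a probability that the quality criterion is met.

```python
import random

random.seed(1)

def retention(acn, coeffs):
    """Toy quadratic retention-time model in a coded acetonitrile fraction."""
    b0, b1, b2 = coeffs
    return b0 + b1 * acn + b2 * acn ** 2

def p_criterion_met(acn, n=10_000):
    """Probability that two analytes stay > 0.5 min apart when the model
    coefficients are perturbed by their (assumed) uncertainties."""
    hits = 0
    for _ in range(n):
        c1 = [random.gauss(m, s) for m, s in ((8.0, 0.2), (-2.0, 0.1), (0.5, 0.1))]
        c2 = [random.gauss(m, s) for m, s in ((9.0, 0.2), (-1.5, 0.1), (0.4, 0.1))]
        if abs(retention(acn, c1) - retention(acn, c2)) > 0.5:
            hits += 1
    return hits / n

p = p_criterion_met(acn=0.0)
print(f"criterion met in {p:.1%} of simulated runs")
```

    Sweeping this probability over the candidate operating conditions and keeping the region where it exceeds a chosen threshold is one way to delimit a design space under uncertainty.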

  12. Creating and Evaluating a Hypertext System of Documenting Analytical Test Methods in a Chemical Plant Quality Assurance Laboratory.

    ERIC Educational Resources Information Center

    White, Charles E., Jr.

    The purpose of this study was to develop and implement a hypertext documentation system in an industrial laboratory and to evaluate its usefulness by participative observation and a questionnaire. Existing word-processing test method documentation was converted directly into a hypertext format or "hyperdocument." The hyperdocument was designed and…

  13. Optimal study design with identical power: an application of power equivalence to latent growth curve models.

    PubMed

    von Oertzen, Timo; Brandmaier, Andreas M

    2013-06-01

    Structural equation models have become a broadly applied data-analytic framework. Among them, latent growth curve models have become a standard method in longitudinal research. However, researchers often rely solely on rules of thumb about statistical power in their study designs. The theory of power equivalence provides an analytical answer to the question of how design factors, for example, the number of observed indicators and the number of time points assessed in repeated measures, trade off against each other while holding the power for likelihood-ratio tests on the latent structure constant. In this article, we present applications of power-equivalent transformations on a model with data from a previously published study on cognitive aging, and highlight consequences of participant attrition on power. PsycINFO Database Record (c) 2013 APA, all rights reserved.
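    The core idea of power equivalence, that design factors can be traded off against each other while holding power constant, can be illustrated with a deliberately simplified stand-in: a two-sided z-test on a latent mean, where each of n subjects is averaged over k indicators. The model and numbers below are illustrative assumptions, not the paper's latent growth curve derivation.

    ```python
    from math import erf, sqrt

    def phi(z):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    def power_ztest(effect, sigma_lat2, sigma_err2, n, k, z_crit=1.959964):
        """Two-sided z-test power for detecting a latent mean shift `effect`
        with n subjects, each averaged over k indicators; the per-subject
        variance is latent variance plus measurement error shrunk by k."""
        se = sqrt((sigma_lat2 + sigma_err2 / k) / n)
        delta = effect / se
        return 1 - phi(z_crit - delta) + phi(-z_crit - delta)

    # Design A: 120 subjects, 2 indicators each.
    p_a = power_ztest(effect=0.3, sigma_lat2=1.0, sigma_err2=1.0, n=120, k=2)

    # Power-equivalent Design B: 4 indicators, fewer subjects, chosen so the
    # noncentrality (and hence the power) is identical:
    # n_b = n_a * (sigma_lat2 + sigma_err2/4) / (sigma_lat2 + sigma_err2/2)
    n_b = 120 * (1.0 + 1.0 / 4) / (1.0 + 1.0 / 2)   # = 100 subjects
    p_b = power_ztest(effect=0.3, sigma_lat2=1.0, sigma_err2=1.0, n=n_b, k=4)
    ```

    The two designs differ in cost (indicators versus subjects) yet, by construction, have identical power, which is the trade-off the theory of power equivalence formalizes for likelihood-ratio tests on the latent structure.
    
    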

  14. Analytical methods in the high conversion reactor core design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeggel, W.; Oldekop, W.; Axmann, J.K.

    High conversion reactor (HCR) design methods have been used at the Technical University of Braunschweig (TUBS) with the technological support of Kraftwerk Union (KWU). The present state and objectives of this cooperation between KWU and TUBS in the field of HCRs have been described using existing design models and current activities aimed at further development and validation of the codes. The hard physical and thermal-hydraulic boundary conditions of pressurized water reactor (PWR) cores with a high degree of fuel utilization result from the tight packing of the HCR fuel rods and the high fissionable plutonium content of the fuel. In terms of design, the problem will be solved with rod bundles whose fuel rods are adjusted by helical spacers to the proposed small rod pitches. These HCR properties require novel computational models for neutron physics, thermal hydraulics, and fuel rod design. By means of a survey of the codes, the analytical procedure for present-day HCR core design is presented. The design programs are currently under intensive development, as design tools with a solid, scientific foundation and with essential parameters that are widely valid and are required for a promising optimization of the HCR core. Design results and a survey of future HCR development are given. In this connection, the reoptimization of the PWR core in the direction of an HCR is considered a fascinating scientific task, with respect to both economic and safety aspects.

  15. Attractive design: an elution solvent optimization platform for magnetic-bead-based fractionation using digital microfluidics and design of experiments.

    PubMed

    Lafrenière, Nelson M; Mudrik, Jared M; Ng, Alphonsus H C; Seale, Brendon; Spooner, Neil; Wheeler, Aaron R

    2015-04-07

    There is great interest in the development of integrated tools allowing for miniaturized sample processing, including solid phase extraction (SPE). We introduce a new format for microfluidic SPE relying on C18-functionalized magnetic beads that can be manipulated in droplets in a digital microfluidic platform. This format provides the opportunity to tune the amount (and potentially the type) of stationary phase on-the-fly, and allows the removal of beads after the extraction (to enable other operations in same device-space), maintaining device reconfigurability. Using the new method, we employed a design of experiments (DOE) operation to enable automated on-chip optimization of elution solvent composition for reversed phase SPE of a model system. Further, conditions were selected to enable on-chip fractionation of multiple analytes. Finally, the method was demonstrated to be useful for online cleanup of extracts from dried blood spot (DBS) samples. We anticipate this combination of features will prove useful for separating a wide range of analytes, from small molecules to peptides, from complex matrices.

  16. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
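    A minimal GP calibration sketch in the spirit of the procedure above, with a hand-rolled squared-exponential kernel and exact posterior inference over exposure conditions (concentration plus an environmental factor). The exposure factors, the "true" response surface, the kernel hyperparameters, and the noise level are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def rbf_kernel(A, B, length=1.0, amp=1.0):
        """Squared-exponential kernel between row-wise point sets A and B."""
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return amp * np.exp(-0.5 * d2 / length**2)

    # Hypothetical calibration data: sensor response as a nonlinear function
    # of (analyte concentration, temperature), mimicking a drifting sensor.
    X = rng.uniform(0, 1, size=(30, 2))
    def true_response(X):
        return np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]**2
    y = true_response(X) + rng.normal(0, 0.01, len(X))

    # GP posterior mean and variance at new exposure conditions; the variance
    # is the uncertainty quantification that drives batch-sequential sampling.
    noise = 0.01**2
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Xs = rng.uniform(0.1, 0.9, size=(5, 2))
    Ks = rbf_kernel(Xs, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    var = np.diag(rbf_kernel(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T))
    ```

    In a batch-sequential design, the next calibration batch would be placed where `var` (the posterior uncertainty) is largest.
    
    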

  17. Applications of flight control system methods to an advanced combat rotorcraft

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.

    1989-01-01

    Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling-qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in-depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 ms in all axes). Piloted handling-qualities are found to be desirable or adequate for all low, medium, and high pilot-gain tasks; but handling-qualities are inadequate for ultra-high-gain tasks such as slope and running landings.

  18. Uncertainty Estimation for the Determination of Ni, Pb and Al in Natural Water Samples by SPE-ICP-OES

    NASA Astrophysics Data System (ADS)

    Ghorbani, A.; Farahani, M. Mahmoodi; Rabbani, M.; Aflaki, F.; Waqifhosain, Syed

    2008-01-01

    In this paper we propose an uncertainty estimation for the analytical results we obtained from the determination of Ni, Pb and Al by solid-phase extraction and inductively coupled plasma optical emission spectrometry (SPE-ICP-OES). The procedure is based on the retention of the analytes in the form of 8-hydroxyquinoline (8-HQ) complexes on a mini column of XAD-4 resin and subsequent elution with nitric acid. The influence of various analytical parameters, including the amount of solid phase, pH, elution factors (concentration and volume of the eluting solution), volume of sample solution, and amount of ligand, on the extraction efficiency of the analytes was investigated. To estimate the uncertainty of the analytical results obtained, we propose assessing trueness using spiked samples. Two types of bias are calculated in the assessment of trueness: a proportional bias and a constant bias. We applied a nested design to calculate the proportional bias and the Youden method to calculate the constant bias. The proportional bias is calculated from the spiked samples: the concentration found is plotted against the concentration added, and the slope of the standard-addition curve is an estimate of the method recovery. The estimated average recoveries of the method in Karaj river water are: (1.004±0.0085) for Ni, (0.999±0.010) for Pb and (0.987±0.008) for Al.
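    The two bias estimates described above reduce to a simple linear regression on spiked-sample data: the slope of found-versus-added concentration estimates the recovery (proportional bias), and a non-zero intercept captures a constant offset. The numbers below are invented for illustration, not the paper's Karaj river data.

    ```python
    import numpy as np

    # Hypothetical spiked-sample data: concentration added vs concentration found.
    added = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # e.g. spike levels in ug/L
    found = np.array([2.1, 12.0, 21.9, 31.8, 41.7])   # measured values

    # Proportional bias: slope of the found-vs-added (standard-addition) line
    # estimates the method recovery; ~1.0 means no proportional bias.
    slope, intercept = np.polyfit(added, found, 1)
    recovery = slope

    # Constant offset: the intercept reflects signal present at zero added
    # analyte (constant bias plus any native analyte content of the sample).
    constant_offset = intercept
    ```

    In the Youden approach the constant bias is separated from native content by analyzing varying sample amounts; this sketch only shows the regression step common to both estimates.
    
    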

  19. Analytical and numerical solutions of the potential and electric field generated by different electrode arrays in a tumor tissue under electrotherapy

    PubMed Central

    2011-01-01

    Background: Electrotherapy is a relatively well-established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside a tumor tissue in a two-dimensional model (2D model), generated by means of electrode arrays with the shapes of different conic sections (ellipse, parabola and hyperbola). Methods: Analytical calculations of the potential and electric field distributions, based on 2D models for different electrode arrays, are performed by solving the Laplace equation, while the numerical solution is obtained by means of the finite element method in two dimensions. Results: Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by electrode arrays with the shape of a circle and those of different conic sections (elliptic, parabolic and hyperbolic). Electrode arrays with circular, elliptical and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. Conclusion: The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different conic-section shapes by means of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by the electrode arrays with different conic sections. PMID:21943385
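    For intuition, the interior Laplace problem described above can be approximated on a grid with a five-point finite-difference (Jacobi) relaxation; the rectangular electrode geometry below is a crude stand-in for the conic-section arrays studied in the paper, and the grid size and iteration count are arbitrary choices.

    ```python
    import numpy as np

    # Hypothetical 2-D tissue region with two electrode segments held at +1 V
    # and -1 V on the boundary; interior potential from the Laplace equation.
    n = 41
    phi = np.zeros((n, n))
    phi[0, 10:31] = 1.0     # anode segment on the top edge
    phi[-1, 10:31] = -1.0   # cathode segment on the bottom edge

    for _ in range(2000):
        # Jacobi sweep: each interior node becomes the average of its four
        # neighbours (five-point stencil); boundary values stay fixed.
        phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                                  phi[1:-1, :-2] + phi[1:-1, 2:])

    # Electric field as the negative gradient of the potential.
    Ey, Ex = np.gradient(-phi)
    ```

    The maximum principle for harmonic functions guarantees the interior potential stays between the boundary extremes, and the antisymmetric electrode placement forces the mid-plane potential to zero; both make convenient sanity checks.
    
    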

  20. Determination of 32 cathinone derivatives and other designer drugs in serum by comprehensive LC-QQQ-MS/MS analysis.

    PubMed

    Swortwood, Madeleine J; Boland, Diane M; DeCaprio, Anthony P

    2013-02-01

    Recently, clandestine drug lab operators have attempted to bypass controlled substances laws and regulations with "designer" compounds chemically and pharmacologically similar to controlled substances. For example, "bath salts" have erupted onto the scene as "legal highs" containing cathinone analogs that have produced severe side effects in users worldwide. These products have sparked concern among law enforcement agencies, and emergency bans have been placed on the sale of such items. Despite the increasing number of designer drugs available, there are few comprehensive screening techniques for their detection and quantification in biological specimens. The liquid chromatography triple quadrupole tandem mass spectrometry (LC-QQQ-MS/MS) method presented here encompasses over thirty important compounds within the phenethylamine, tryptamine, and piperazine designer drug classes. Analytes were determined by LC-QQQ-MS/MS in the multiple-reaction monitoring mode after mixed-mode solid-phase extraction. The bioanalytical method was fully validated according to recommended international guidelines. The assay was selective for all analytes with acceptable accuracy and precision. Limits of quantification were in the range of 1-10 ng/mL for each compound with limits of detection near 10 pg/mL. In order to evaluate its applicability in a forensic toxicological setting, the validated method was used to analyze post-mortem specimens from two cases that were suspected of containing designer drugs. The method was able to identify and quantify seven of these compounds at concentrations as low as 11 ng/mL. The method should have wide applicability for rapid screening of important new drugs of abuse at high sensitivity in both post- and ante-mortem forensic analysis.

  1. Extraction, isolation, and purification of analytes from samples of marine origin--a multivariate task.

    PubMed

    Liguori, Lucia; Bjørsvik, Hans-René

    2012-12-01

    The development of a multivariate study for the quantitative analysis of six different polybrominated diphenyl ethers (PBDEs) in tissue of Atlantic Salmo salar L. is reported. An extraction, isolation, and purification process based on an accelerated solvent extraction system was designed, investigated, and optimized by means of statistical experimental design and multivariate data analysis and regression. An accompanying gas chromatography-mass spectrometry analytical method was developed for the identification and quantification of the analytes, BDE 28, BDE 47, BDE 99, BDE 100, BDE 153, and BDE 154. These PBDEs have been used in commercial blends employed as flame retardants for a variety of materials, including electronic devices, synthetic polymers and textiles. The present study revealed that an extracting solvent mixture composed of hexane and CH₂Cl₂ (10:90) provided excellent recoveries of all six PBDEs studied herein. A somewhat less polar extracting solvent, hexane and CH₂Cl₂ (40:60), decreased the analyte recoveries, which nevertheless remained acceptable and satisfactory. The study demonstrates the necessity of a thorough investigation of the extraction and purification process in order to achieve quantitative isolation of the analytes from the specific matrix. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Analyte-driven switching of DNA charge transport: de novo creation of electronic sensors for an early lung cancer biomarker.

    PubMed

    Thomas, Jason M; Chakraborty, Banani; Sen, Dipankar; Yu, Hua-Zhong

    2012-08-22

    A general approach is described for the de novo design and construction of aptamer-based electrochemical biosensors, for potentially any analyte of interest (ranging from small ligands to biological macromolecules). As a demonstration of the approach, we report the rapid development of a made-to-order electronic sensor for a newly reported early biomarker for lung cancer (CTAP III/NAP2). The steps include the in vitro selection and characterization of DNA aptamer sequences, design and biochemical testing of wholly DNA sensor constructs, and translation to a functional electrode-bound sensor format. The working principle of this distinct class of electronic biosensors is the enhancement of DNA-mediated charge transport in response to analyte binding. We first verify such analyte-responsive charge transport switching in solution, using biochemical methods; successful sensor variants were then immobilized on gold electrodes. We show that using these sensor-modified electrodes, CTAP III/NAP2 can be detected with both high specificity and sensitivity (Kd ≈ 1 nM) through a direct electrochemical reading. To investigate the underlying basis of analyte binding-induced conductivity switching, we carried out Förster Resonance Energy Transfer (FRET) experiments. The FRET data establish that analyte binding-induced conductivity switching in these sensors results from very subtle structural/conformational changes, rather than large-scale, global folding events. The implications of this finding are discussed with respect to possible charge transport switching mechanisms in electrode-bound sensors. Overall, the approach we describe here represents a unique design principle for aptamer-based electrochemical sensors; its application should enable rapid, on-demand access to a class of portable biosensors that offer robust, inexpensive, and operationally simplified alternatives to conventional antibody-based immunoassays.

  3. Sustained prediction ability of net analyte preprocessing methods using reduced calibration sets. Theoretical and experimental study involving the spectrophotometric analysis of multicomponent mixtures.

    PubMed

    Goicoechea, H C; Olivieri, A C

    2001-07-01

    A newly developed multivariate method involving net analyte preprocessing (NAP) was tested using central composite calibration designs of progressively decreasing size for the simultaneous multivariate spectrophotometric determination of three active components (phenylephrine, diphenhydramine and naphazoline) and one excipient (methylparaben) in nasal solutions. Its performance was evaluated and compared with that of partial least-squares (PLS-1). Minimisation of the calibration predicted error sum of squares (PRESS) as a function of a moving spectral window helped to select appropriate working spectral ranges for both methods. The comparison of the NAP and PLS results was carried out using two tests: (1) the elliptical joint confidence region for the slope and intercept of a predicted versus actual concentrations plot for a large validation set of samples and (2) the D-optimality criterion concerning the information content of the calibration data matrix. Extensive simulations and experimental validation showed that, unlike PLS, the NAP method is able to furnish highly satisfactory results when the calibration set is reduced from a full four-component central composite to a fractional central composite, as expected from the modelling requirements of net analyte based methods.

  4. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  5. Analytical method for the evaluation of the outdoor air contamination by emerging pollutants using tree leaves as bioindicators.

    PubMed

    Barroso, Pedro José; Martín, Julia; Santos, Juan Luis; Aparicio, Irene; Alonso, Esteban

    2018-01-01

    In this work, an analytical method, based on sonication-assisted extraction, clean-up by dispersive solid-phase extraction and determination by liquid chromatography-tandem mass spectrometry, has been developed and validated for the simultaneous determination of 15 emerging pollutants in leaves from four ornamental tree species. Target compounds include perfluorinated organic compounds, plasticizers, surfactants, a brominated flame retardant, and preservatives. The method was optimized using a Box-Behnken statistical experimental design with response surface methodology and validated in terms of recovery, accuracy, precision, and method detection and quantification limits. Quantification of the target compounds was carried out using matrix-matched calibration curves. The highest recoveries were achieved for the perfluorinated organic compounds (mean values up to 87%) and preservatives (up to 88%). The lowest recoveries were achieved for the plasticizers (51%) and the brominated flame retardant (63%). Method detection and quantification limits were in the ranges 0.01-0.09 ng/g dry matter (dm) and 0.02-0.30 ng/g dm, respectively, for most of the target compounds. The method was successfully applied to the determination of the target compounds on leaves from four tree species used as urban ornamental trees (Citrus aurantium, Celtis australis, Platanus hispanica, and Jacaranda mimosifolia). Graphical abstract: Analytical method for the biomonitoring of emerging pollutants in outdoor air.
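    For reference, a Box-Behnken design of the kind used above places each pair of factors at their coded ±1 levels with all other factors held at the center, plus replicate center points. A small generator under those textbook rules (the paper's actual factor levels and replicate count are not specified here):

    ```python
    from itertools import combinations

    def box_behnken(k, centers=3):
        """Box-Behnken design for k coded factors: all +/-1 combinations on
        each pair of factors (remaining factors at 0), plus center points."""
        runs = []
        for i, j in combinations(range(k), 2):
            for a in (-1, 1):
                for b in (-1, 1):
                    row = [0] * k
                    row[i], row[j] = a, b
                    runs.append(row)
        runs.extend([[0] * k for _ in range(centers)])
        return runs

    # Classic 3-factor Box-Behnken design: 12 edge runs + 3 center runs.
    design = box_behnken(3)
    ```

    The coded rows are then mapped to physical factor ranges (e.g. sonication time, solvent volume, sorbent amount) before fitting the response surface.
    
    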

  6. Application of the Life Cycle Analysis and the Building Information Modelling Software in the Architectural Climate Change-Oriented Design Process

    NASA Astrophysics Data System (ADS)

    Gradziński, Piotr

    2017-10-01

    As the world’s climate is changing (inter alia, under the influence of building activity), the author attempts to reorient design practice toward using and adapting to climatic conditions. Applying, in the early stages of the architectural design process, tools such as Life Cycle Analysis (LCA) and digital analytical BIM (Building Information Modelling) software defines the overriding requirements the designer/architect should meet. The first part of the text characterizes the influence of building activity (consumption, pollution, waste, etc.) and of the use of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.), understood as direct negative environmental impacts. The second part presents a review of the methods and analytical techniques that counteract these negative influences. First, the building is studied through a Life Cycle Analysis of the structure (e.g., materials) and of the functioning (e.g., energy consumption) of the architectural object (stages: before use, use, after use). Second, digital analytical tools are used to run multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building's form. In conclusion, the author's research results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early decisions in the design process of the architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimensions, orienting the building design process with respect to broadly understood climate change.

  7. Methods for the determination of European Union-permitted added natural colours in foods: a review.

    PubMed

    Scotter, M J

    2011-05-01

    In response to increasing consumer demand, food manufacturers have moved towards increased usage of approved natural colours. There is a legal requirement for governments to monitor the consumption of all food additives in the European Union to ensure the acceptable daily intakes (ADIs) are not exceeded, especially by young children. Validated analytical methods are needed to fulfil this requirement. The aim of this paper is to review the available literature on methods of extraction for approved natural colours in food and drink. Available analytical methods for the determination of European Union-permitted natural food colour additives in foods and beverages have been assessed for their fitness for purpose in terms of their key extraction and analysis procedures, selectivity and sensitivity, especially with regard to maximum permitted levels, and their applicability for use in surveillance and in an enforcement role. The advantages and disadvantages of available analytical methods for each of nine designated chemical classes (groups) of natural colours in different food and beverage matrices are given. Other important factors such as technical requirements, cost, transferability and applicability are given due consideration. Gaps in the knowledge and levels of validation are identified and recommendations made on further research to develop suitable methods. The nine designated natural colour classes covered are: 1. Curcumin (E100), 2. Riboflavins (E101i-ii), 3. Cochineal (E120), 4. Chlorophylls--including chlorophyllins and copper analogues (E140-141), 5. Caramel Classes I-IV (E150a-d), 6. Carotenoids (E160a-f, E161b, E161g), 7. Beetroot red (E162), 8. Anthocyanins (E163), and 9. Other colours--Vegetable carbon (E153), Calcium carbonate (E170), Titanium dioxide (E171) and Iron oxides and hydroxides (E172).

  8. Comparison of two microextraction methods based on solidification of floating organic droplet for the determination of multiclass analytes in river water samples by liquid chromatography tandem mass spectrometry using Central Composite Design.

    PubMed

    Asati, Ankita; Satyanarayana, G N V; Patel, Devendra K

    2017-09-01

    Two low-density-organic-solvent liquid-liquid microextraction methods, namely vortex-assisted liquid-liquid microextraction based on solidification of a floating organic droplet (VALLME-SFO) and dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO), have been compared for the determination of multiclass analytes (pesticides, plasticizers, pharmaceuticals and personal care products) in river water samples using liquid chromatography tandem mass spectrometry (LC-MS/MS). The effect of various experimental parameters on the efficiency of the two methods and their optimum values were studied with the aid of Central Composite Design (CCD) and Response Surface Methodology (RSM). Under optimal conditions, VALLME-SFO was validated in terms of limit of detection (0.011-0.219 ng/mL), limit of quantification (0.035-0.723 ng/mL), dynamic linearity range (0.050-0.500 ng/mL), coefficient of determination (R² = 0.992-0.999), enrichment factor (40-56) and extraction recovery (80-106%). For DLLME-SFO validated under optimal conditions, the corresponding values were 0.025-0.377 ng/mL, 0.083-1.256 ng/mL, 0.100-1.000 ng/mL, R² = 0.990-0.999, 35-49 and 69-98%, respectively. Interday and intraday precisions were calculated as percent relative standard deviation (%RSD), with values ≤15% for both the VALLME-SFO and DLLME-SFO methods. Both methods were successfully applied to the determination of multiclass analytes in river water samples. Copyright © 2017 Elsevier B.V. All rights reserved.
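    A central composite design of the kind used above combines a 2^k factorial core, 2k axial (star) points, and center replicates. A generator under the textbook rotatability rule alpha = (2^k)^(1/4) (the paper's exact design settings are not specified here):

    ```python
    from itertools import product

    def central_composite(k, alpha=None, centers=4):
        """Central composite design in coded units: 2^k factorial corners,
        2k axial (star) points at +/-alpha, and replicate center runs."""
        if alpha is None:
            alpha = (2 ** k) ** 0.25     # rotatable design: alpha = (2^k)^(1/4)
        corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
        stars = []
        for i in range(k):
            for s in (-alpha, alpha):
                row = [0.0] * k
                row[i] = s
                stars.append(row)
        return corners + stars + [[0.0] * k for _ in range(centers)]

    # 3-factor CCD: 8 corners + 6 star points + 4 center runs = 18 runs.
    ccd = central_composite(3)
    ```

    The star points let the second-order terms of the response surface be estimated, which is what RSM then optimizes over.
    
    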

  9. Variation in choice of study design: findings from the Epidemiology Design Decision Inventory and Evaluation (EDDIE) survey.

    PubMed

    Stang, Paul E; Ryan, Patrick B; Overhage, J Marc; Schuemie, Martijn J; Hartzema, Abraham G; Welebob, Emily

    2013-10-01

    Researchers using observational data to understand drug effects must make a number of analytic design choices that suit the characteristics of the data and the subject of the study. Review of the published literature suggests that there is a lack of consistency even when addressing the same research question in the same database. The objective was to characterize the degree of similarity or difference in the method and analysis choices made by observational database research experts when presented with research study scenarios. An on-line survey used research scenarios on drug-effect studies to capture method selection and analysis choices, following a dependency branching based on responses to key questions. Voluntary participants experienced in epidemiological study design were solicited through registration on the Observational Medical Outcomes Partnership website, membership in particular professional organizations, or links in relevant newsletters. Results are reported as the proportion of respondents selecting particular methods and making specific analysis choices for individual drug-outcome scenario pairs. The number of questions/decisions differed based on stem questions of study design, time-at-risk, outcome definition, and comparator. There is little consistency across scenarios, by drug or by outcome of interest, in the decisions made for design and analyses in scenarios using large healthcare databases. The most consistent choice was the cohort study design, but variability in the other critical decisions was common. There is great variation among epidemiologists in the design and analytical choices that they make when implementing analyses in observational healthcare databases. These findings confirm that it will be important to generate empiric evidence to inform these decisions and to promote a better understanding of the impact of standardization on research implementation.

  10. Analytical applications of microbial fuel cells. Part II: Toxicity, microbial activity and quantification, single analyte detection and other uses.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells (MFCs) were rediscovered twenty years ago and are now a very active research area. The reasons behind this renewed activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately has a myriad of new uses of this technology been presented by research groups around the world, combining biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the analytical sciences. In Part I, a general introduction to biologically based analytical methods, including bioassays, biosensors, MFC designs and operating principles, as well as perhaps the main and earliest application, the use as a BOD sensor, was reviewed. In Part II, other proposed uses are presented and discussed. Like other microbially based analytical systems, MFCs are well suited to measuring and integrating complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity, microbial metabolism and, of special interest to space exploration, life sensors. Some methods with higher specificity, proposed to detect a single analyte, are also presented. Different possibilities for increasing selectivity and sensitivity using molecular biology or other modern techniques are discussed as well. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Free-form surface design method for a collimator TIR lens.

    PubMed

    Tsai, Chung-Yu

    2016-04-01

    A free-form (FF) surface design method is proposed for a general axial-symmetrical collimator system consisting of a light source and a total internal reflection lens with two coupled FF boundary surfaces. The profiles of the boundary surfaces are designed using a FF surface construction method such that each incident ray is directed (refracted and reflected) in such a way as to form a specified image pattern on the target plane. The light ray paths within the system are analyzed using an exact analytical model and a skew-ray tracing approach. In addition, the validity of the proposed FF design method is demonstrated by means of ZEMAX simulations. It is shown that the illumination distribution formed on the target plane is in good agreement with that specified by the user. The proposed surface construction method is mathematically straightforward and easily implemented in computer code. As such, it provides a useful tool for the design and analysis of general axial-symmetrical optical systems.
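    The per-ray redirection underlying such skew-ray tracing reduces to the vector forms of Snell's law and mirror reflection. The sketch below is a generic implementation of those two textbook formulas, not the paper's free-form surface-construction algorithm.

    ```python
    from math import sqrt

    def refract(d, n_vec, n1, n2):
        """Vector form of Snell's law. d: unit incident direction; n_vec: unit
        surface normal pointing against d; returns the unit refracted direction,
        or None on total internal reflection (the regime a TIR lens exploits)."""
        eta = n1 / n2
        cos_i = -(d[0]*n_vec[0] + d[1]*n_vec[1] + d[2]*n_vec[2])
        sin2_t = eta * eta * (1.0 - cos_i * cos_i)
        if sin2_t > 1.0:
            return None                      # total internal reflection
        cos_t = sqrt(1.0 - sin2_t)
        return tuple(eta * d[i] + (eta * cos_i - cos_t) * n_vec[i]
                     for i in range(3))

    def reflect(d, n_vec):
        """Mirror reflection, used where the lens surface acts as a TIR mirror."""
        dot = d[0]*n_vec[0] + d[1]*n_vec[1] + d[2]*n_vec[2]
        return tuple(d[i] - 2.0 * dot * n_vec[i] for i in range(3))
    ```

    A surface-construction loop would invert these relations: given the desired outgoing ray toward the target pattern, solve for the surface normal (and hence the local profile) at each sample point.
    
    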

  12. Magnet pole shape design for reduction of thrust ripple of slotless permanent magnet linear synchronous motor with arc-shaped magnets considering end-effect based on analytical method

    NASA Astrophysics Data System (ADS)

    Shin, Kyung-Hun; Park, Hyung-Il; Kim, Kwan-Ho; Jang, Seok-Myeong; Choi, Jang-Young

    2017-05-01

    The shape of the magnet is essential to a slotless permanent magnet linear synchronous machine (PMLSM) because it is directly related to desirable machine performance. This paper presents a reduction in the thrust ripple of a PMLSM through the use of arc-shaped magnets, based on electromagnetic field theory. The magnetic field solutions were obtained by considering end effects using a magnetic vector potential in a two-dimensional Cartesian coordinate system. The analytical solution for each subdomain (PM, air gap, coil, and end region) is derived, and the field solution is obtained by applying the boundary and interface conditions between the subdomains. In particular, an analytical method was derived for the instantaneous thrust and thrust ripple reduction of a PMLSM with arc-shaped magnets. To demonstrate the validity of the analytical results, the back electromotive force was compared with finite element analysis results and with experiments on a manufactured prototype model. The optimal point for thrust ripple minimization is suggested.

  13. Analytical Modeling of a Double-Sided Flux Concentrating E-Core Transverse Flux Machine with Pole Windings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muljadi, Eduard; Hasan, Iftekhar; Husain, Tausif

    In this paper, a nonlinear analytical model based on the Magnetic Equivalent Circuit (MEC) method is developed for a double-sided E-Core Transverse Flux Machine (TFM). The proposed TFM has a cylindrical rotor sandwiched between E-core stators on both sides. Ferrite magnets are used in the rotor with a flux-concentrating design to attain high air-gap flux density, better magnet utilization, and higher torque density. The MEC model was developed using a series-parallel combination of flux tubes to estimate the reluctance network for different parts of the machine, including air gaps, permanent magnets, and the stator and rotor ferromagnetic materials, in a two-dimensional (2-D) frame. An iterative Gauss-Seidel method is integrated with the MEC model to capture the effects of magnetic saturation. A single-phase, 1 kW, 400 rpm E-Core TFM is analytically modeled, and its results for flux linkage, no-load EMF, and generated torque are verified with Finite Element Analysis (FEA). The analytical model significantly reduces the computation time while estimating results with less than 10 percent error.
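    The saturation loop in such a model wraps a linear network solve; the inner Gauss-Seidel iteration itself can be sketched as follows (the 3-node permeance network and its values are illustrative, not the machine's actual reluctance network):

    ```python
    import numpy as np

    def gauss_seidel(A, b, tol=1e-10, max_iter=500):
        """Solve A x = b by Gauss-Seidel iteration (A diagonally dominant)."""
        x = np.zeros_like(b, dtype=float)
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(len(b)):
                s = A[i] @ x - A[i, i] * x[i]   # off-diagonal contributions
                x[i] = (b[i] - s) / A[i, i]
            if np.max(np.abs(x - x_old)) < tol:
                break
        return x

    # Toy 3-node network: node equations P @ u = F, where P holds permeances
    # (1/reluctance) and F the MMF sources. In the full model the permeances of
    # saturable branches are updated from a B-H curve and the solve is repeated
    # until the flux distribution converges.
    P = np.array([[ 4.0, -1.0, -1.0],
                  [-1.0,  3.0, -1.0],
                  [-1.0, -1.0,  5.0]])
    F = np.array([10.0, 0.0, 0.0])
    u = gauss_seidel(P, F)   # nodal magnetic scalar potentials
    ```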

  14. Solution of magnetic field and eddy current problem induced by rotating magnetic poles (abstract)

    NASA Astrophysics Data System (ADS)

    Liu, Z. J.; Low, T. S.

    1996-04-01

    The magnetic field and eddy current problems induced by rotating permanent magnet poles occur in electromagnetic dampers, magnetic couplings, and many other devices. Whereas numerical techniques, for example finite element methods, can be exploited to study various features of these problems, such as heat generation and drag torque development, the analytical solution is always of interest to designers since it helps them gain insight into the interdependence of the parameters involved and provides an efficient design tool. Some previous work showed that the solution of the eddy current problem due to linearly moving magnet poles can give a satisfactory approximation for the eddy current problem due to rotating fields. However, in many practical cases, especially when the number of magnet poles is small, there is a significant flux-focusing effect due to the geometry. The above approximation can therefore lead to marked errors in the theoretical predictions of device performance. Bernot et al. recently described an analytical solution in a polar coordinate system where the radial field is excited by a time-varying source. This article discusses an analytical solution of the magnetic field and eddy current problems induced by moving magnet poles in radial field machines. The theoretical predictions obtained from this method are compared with results obtained from finite element calculations. The validity of the method is also checked by comparing the theoretical predictions with measurements from a test machine. It is shown that the introduced solution leads to a significant improvement in air-gap field prediction compared with the analytical solution that models the eddy current problems induced by linearly moving magnet poles.

  15. Dynamics and Control of Flexible Space Vehicles

    NASA Technical Reports Server (NTRS)

    Likins, P. W.

    1970-01-01

    The purpose of this report is twofold: (1) to survey the established analytic procedures for the simulation of controlled flexible space vehicles, and (2) to develop in detail methods that employ a combination of discrete and distributed ("modal") coordinates, i.e., the hybrid-coordinate methods. Analytic procedures are described in three categories: (1) discrete-coordinate methods, (2) hybrid-coordinate methods, and (3) vehicle normal-coordinate methods. Each of these approaches is described and analyzed for its advantages and disadvantages, and each is found to have an area of applicability. The hybrid-coordinate method combines the efficiency of the vehicle normal-coordinate method with the versatility of the discrete-coordinate method, and appears to have the widest range of practical application. The results in this report have practical utility in two areas: (1) complex digital computer simulation of flexible space vehicles of arbitrary configuration subject to realistic control laws, and (2) preliminary control system design based on transfer functions for linearized models of dynamics and control laws.

  16. Anabolic agents: recent strategies for their detection and protection from inadvertent doping

    PubMed Central

    Geyer, Hans; Schänzer, Wilhelm; Thevis, Mario

    2014-01-01

    According to the World Anti-Doping Agency (WADA) Prohibited List, anabolic agents consist of exogenous anabolic androgenic steroids (AAS), endogenous AAS and other anabolic agents such as clenbuterol and selective androgen receptor modulators (SARMs). Currently employed strategies for their improved detection include the prolongation of the detection windows for exogenous AAS, non-targeted and indirect analytical approaches for the detection of modified steroids (designer steroids), the athlete’s biological passport and isotope ratio mass spectrometry for the detection of the misuse of endogenous AAS, as well as preventive doping research for the detection of SARMs. The recent use of these strategies led to 4–80-fold increases of adverse analytical findings for exogenous AAS, to the detection of the misuse of new designer steroids, to adverse analytical findings of different endogenous AAS and to the first adverse analytical findings of SARMs. The strategies of the antidoping research are not only focused on the development of methods to catch the cheating athlete but also to protect the clean athlete from inadvertent doping. Within the past few years several sources of inadvertent doping with anabolic agents have been identified. Among these are nutritional supplements adulterated with AAS, meat products contaminated with clenbuterol, mycotoxin (zearalenone) contamination leading to zeranol findings, and natural products containing endogenous AAS. The protection strategy consists of further investigations in case of reasonable suspicion of inadvertent doping, publication of the results, education of athletes and development of methods to differentiate between intentional and unintentional doping. PMID:24632537

  17. Pulse cleaning flow models and numerical computation of candle ceramic filters.

    PubMed

    Tian, Gui-shan; Ma, Zhen-ji; Zhang, Xin-yi; Xu, Ting-xiang

    2002-04-01

    Analytical and numerical models are developed for the reverse pulse cleaning system of candle ceramic filters. A standard turbulence model is shown to be suitable for the design computation of the reverse pulse cleaning system, based on experimental and one-dimensional computational results. The computed results can be used to guide the design of the reverse pulse cleaning system, in particular the optimum Venturi geometry. From the computed results, general conclusions and design methods are obtained.

  18. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  19. Multi-band transmission color filters for multi-color white LEDs based visible light communication

    NASA Astrophysics Data System (ADS)

    Wang, Qixia; Zhu, Zhendong; Gu, Huarong; Chen, Mengzhu; Tan, Qiaofeng

    2017-11-01

    Light-emitting diode (LED) based visible light communication (VLC) can provide license-free bands, high data rates, and high security levels, making it a promising technique that will be extensively applied in the future. Multi-band transmission color filters with sufficient peak transmittance and suitable bandwidth play a pivotal role in boosting the signal-to-noise ratio in VLC systems. In this paper, multi-band transmission color filters with bandwidths of tens of nanometers are designed by a simple analytical method. Experimental results for one-dimensional (1D) and two-dimensional (2D) tri-band color filters demonstrate the effectiveness of the multi-band transmission color filters and the corresponding analytical method.

  20. Technique for determining the amount of hydrogen diffusing through a steel membrane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kardash, N.V.; Batrakov, V.V.

    1995-07-01

    Hydrogen diffusion through steel membranes still attracts much attention from scientists, and new results have been reported in recent years. Hydrogen diffusion is usually studied in the cell designed by M.A. Devanathan, but there are also other techniques for determining hydrogen permeability, namely: from the change in the solution volume in a horizontal or gas microburette; from the hydrogen ionization current; from the penetration current; and from the buckling of the cathode. The authors developed an analytical method using autocatalytic titration for determining the amount of hydrogen passed through a steel membrane. The method is based on permanganometry, which is widely used in analytical chemistry.

  1. Gear and Transmission Research at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Townsend, Dennis P.

    1997-01-01

    This paper is a review of some of the research work of the NASA Lewis Research Center Mechanical Components Branch. It includes a brief review of the NASA Lewis Research Center and the Mechanical Components Branch. The research topics discussed are crack propagation of gear teeth, gear noise of spiral bevel and other gears, design optimization methods, methods we have investigated for transmission diagnostics, the analytical and experimental study of gear thermal conditions, the analytical and experimental study of split torque systems, the evaluation of several new advanced gear steels and transmission lubricants and the evaluation of various aircraft transmissions. The area of research needs for gearing and transmissions is also discussed.

  2. Verifiable Adaptive Control with Analytical Stability Margins by Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2010-01-01

    This paper presents a verifiable model-reference adaptive control method based on an optimal control formulation for linear uncertain systems. A predictor model is formulated to enable a parameter estimation of the system parametric uncertainty. The adaptation is based on both the tracking error and predictor error. Using a singular perturbation argument, it can be shown that the closed-loop system tends to a linear time invariant model asymptotically under an assumption of fast adaptation. A stability margin analysis is given to estimate a lower bound of the time delay margin using a matrix measure method. Using this analytical method, the free design parameter n of the optimal control modification adaptive law can be determined to meet a specification of stability margin for verification purposes.
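    The matrix-measure computation underlying such a delay-margin bound can be illustrated with a small sketch (the closed-loop matrix here is made up, and the paper's actual delay-margin formula is not reproduced):

    ```python
    import numpy as np

    def matrix_measure_2(A):
        """Induced-2-norm matrix measure: mu_2(A) = max eigenvalue of (A + A^T)/2."""
        return np.max(np.linalg.eigvalsh((A + A.T) / 2.0))

    # Example: an illustrative stable closed-loop matrix (not from the paper).
    A = np.array([[-1.0,  2.0],
                  [ 0.0, -3.0]])
    mu = matrix_measure_2(A)
    # mu_2(A) upper-bounds the real parts of all eigenvalues of A, so mu < 0
    # certifies exponential stability of dx/dt = A x; bounds of this kind feed
    # into lower-bound estimates of the time-delay margin.
    ```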

  3. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    PubMed Central

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-01-01

    A simple, accurate, and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental, and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work establishes theoretically based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interactions, hydrogen-bonding interaction, electrostatic interaction) for the first time. Moreover, application guidelines for supporting materials, surfactants, and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theoretically based experimental design, of trace analytes from environmental, biological, and clinical samples. PMID:27924944

  4. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed

    West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
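    A minimal illustration of one common analytic error, ignoring design weights when strata are sampled at unequal rates, can be sketched with synthetic data (the strata sizes and values are made up, not SESTAT data):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two strata with different means; stratum B is rare but will be oversampled.
    y_A = rng.normal(50.0, 5.0, size=900)   # 90% of the population
    y_B = rng.normal(80.0, 5.0, size=100)   # 10% of the population
    pop_mean = np.concatenate([y_A, y_B]).mean()

    # Draw 100 units from each stratum: B is heavily oversampled by design.
    s_A, s_B = y_A[:100], y_B[:100]
    w_A, w_B = 900 / 100, 100 / 100         # design weights = N_h / n_h

    naive = np.concatenate([s_A, s_B]).mean()
    weighted = (w_A * s_A.sum() + w_B * s_B.sum()) / (w_A * 100 + w_B * 100)
    # `naive` is pulled toward the oversampled stratum; `weighted` recovers
    # the population mean. Variance estimation likewise must account for the
    # stratification, or standard errors will be wrong.
    ```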

  5. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed Central

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817

  6. Vortex maneuver lift for super-cruise configurations

    NASA Technical Reports Server (NTRS)

    Campbell, J. F.; Gloss, B. B.; Lamar, J. E.

    1976-01-01

    Some of the theoretical and experimental research conducted at the NASA Langley Research Center is presented to investigate the subsonic vortex-lift producing capabilities for two classes of Super-Cruise designs: a close-coupled wing-canard arrangement and a slender wing configuration. In addition, several analytical methods are discussed for estimating critical structural design loads for thin, highly swept wings having separated leading-edge vortex flows.

  7. Characterization and modeling of an advanced flexible thermal protection material for space applications

    NASA Technical Reports Server (NTRS)

    Clayton, Joseph P.; Tinker, Michael L.

    1991-01-01

    This paper describes experimental and analytical characterization of a new flexible thermal protection material known as Tailorable Advanced Blanket Insulation (TABI). This material utilizes a three-dimensional ceramic fabric core structure and an insulation filler. TABI is the leading candidate for use in deployable aeroassisted vehicle designs. Such designs require extensive structural modeling, and the most significant in-plane material properties necessary for model development are measured and analytically verified in this study. Unique test methods are developed for damping measurements. Mathematical models are developed for verification of the experimental modulus and damping data, and finally, transverse properties are described in terms of the inplane properties through use of a 12-dof finite difference model of a simple TABI configuration.

  8. HOST turbine heat transfer program summary

    NASA Technical Reports Server (NTRS)

    Gladden, Herbert J.; Simoneau, Robert J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding with the remainder going to analytical efforts. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  9. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion. Methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat, and soil. The methodology for the analysis of the first matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having established that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat, and 25% soil together with fungal mycelium. 
Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect, and LODs/LOQs of each method were studied for all the analytes: the endosulfan isomers (α and β) and its metabolites (endosulfan sulfate, ether, and diol), as well as chlorpyrifos. In the first laboratory evaluation of these biobeds, endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.
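    The recovery and precision figures quoted above come from a standard computation on replicate spiked samples; a minimal sketch (the replicate values are invented for illustration):

    ```python
    import numpy as np

    def recovery_and_rsd(measured, spiked_level):
        """Percent recovery and relative standard deviation (RSD) for
        replicate analyses of samples spiked at a known level."""
        measured = np.asarray(measured, dtype=float)
        recovery = 100.0 * measured.mean() / spiked_level
        rsd = 100.0 * measured.std(ddof=1) / measured.mean()
        return recovery, rsd

    # Five replicates of a sample spiked at 1 mg/kg (made-up numbers).
    rec, rsd = recovery_and_rsd([0.91, 0.88, 0.95, 0.90, 0.93], 1.0)
    # Acceptance in the first validation above was recovery 72-109 % with RSD < 11 %.
    ```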

  10. LOCATING BURIED WORLD WAR 1 MUNITIONS WITH REMOTE SENSING AND GIS

    EPA Science Inventory

    Remote Sensing is a scientific discipline of non-contact monitoring. It includes a range of technologies that span from aerial photography to advanced spectral imaging and analytical methods. This Session is designed to demonstrate contemporary practical applications of remote ...

  11. Ethnographic/Qualitative Research: Theoretical Perspectives and Methodological Strategies.

    ERIC Educational Resources Information Center

    Butler, E. Dean

    This paper examines the metatheoretical concepts associated with ethnographic/qualitative educational inquiry and overviews the more commonly utilized research designs, data collection methods, and analytical approaches. The epistemological and ontological assumptions of this newer approach differ greatly from those of the traditional educational…

  12. HANDBOOK: CONTINUOUS EMISSION MONITORING SYSTEMS FOR NON-CRITERIA POLLUTANTS

    EPA Science Inventory

    This Handbook provides a description of the methods used to continuously monitor non-criteria pollutants emitted from stationary sources. The Handbook contains a review of current regulatory programs, the state-of-the-art sampling system design, analytical techniques, and the use...

  13. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as the development and maintenance of expertise, the maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context that includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of the method and, in the case of a change or modification, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameter will require re-validation. Some typical situations involving changes in methods are discussed and a decision process proposed for the selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  14. Exposure assessment for endocrine disruptors: some considerations in the design of studies.

    PubMed Central

    Rice, Carol; Birnbaum, Linda S; Cogliano, James; Mahaffey, Kathryn; Needham, Larry; Rogan, Walter J; vom Saal, Frederick S

    2003-01-01

    In studies designed to evaluate exposure-response relationships in children's development from conception through puberty, multiple factors that affect the generation of meaningful exposure metrics must be considered. These factors include multiple routes of exposure; the timing, frequency, and duration of exposure; need for qualitative and quantitative data; sample collection and storage protocols; and the selection and documentation of analytic methods. The methods for exposure data collection and analysis must be sufficiently robust to accommodate the a priori hypotheses to be tested, as well as hypotheses generated from the data. A number of issues that must be considered in study design are summarized here. PMID:14527851

  15. Analytical drain current model for symmetric dual-gate amorphous indium gallium zinc oxide thin-film transistors

    NASA Astrophysics Data System (ADS)

    Qin, Ting; Liao, Congwei; Huang, Shengxiang; Yu, Tianbao; Deng, Lianwen

    2018-01-01

    An analytical drain current model based on the surface potential is proposed for amorphous indium gallium zinc oxide (a-InGaZnO) thin-film transistors (TFTs) with a synchronized symmetric dual-gate (DG) structure. The solution of the electric field, surface potential (φS), and central potential (φ0) of the InGaZnO film from the Poisson equation, using the Gaussian method and the Lambert function, is demonstrated in detail. The compact analytical model of the current-voltage behavior, which consists of drift and diffusion components, is obtained by regional integration, and voltage-dependent effective mobility is taken into account. Comparison results demonstrate that the calculations from the derived models match well with simulations from a technology computer-aided design (TCAD) tool. Furthermore, the proposed model is incorporated into SPICE simulations using Verilog-A to verify the feasibility of using DG InGaZnO TFTs for high-performance circuit designs.
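    The role of the Lambert W function in models of this kind is to give closed-form solutions of transcendental equations of the form u·e^u = c, which arise when an exponential charge term is coupled to the potential. A toy instance (not the paper's actual surface-potential equation):

    ```python
    import numpy as np
    from scipy.special import lambertw

    def solve_u(c):
        """Solve u * exp(u) = c for u >= 0 using the principal branch of
        the Lambert W function."""
        return np.real(lambertw(c))

    c = 3.0
    u = solve_u(c)
    # u satisfies the defining relation u * exp(u) = c, giving a closed-form
    # potential where an iterative numerical solve would otherwise be needed.
    ```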

  16. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  17. Strategies for Dealing with Missing Data in Clinical Trials: From Design to Analysis

    PubMed Central

    Dziura, James D.; Post, Lori A.; Zhao, Qing; Fu, Zhixuan; Peduzzi, Peter

    2013-01-01

    Randomized clinical trials are the gold standard for evaluating interventions as randomized assignment equalizes known and unknown characteristics between intervention groups. However, when participants miss visits, the ability to conduct an intent-to-treat analysis and draw conclusions about a causal link is compromised. As guidance to those performing clinical trials, this review is a non-technical overview of the consequences of missing data and a prescription for its treatment beyond the typical analytic approaches to the entire research process. Examples of bias from incorrect analysis with missing data and discussion of the advantages/disadvantages of analytic methods are given. As no single analysis is definitive when missing data occurs, strategies for its prevention throughout the course of a trial are presented. We aim to convey an appreciation for how missing data influences results and an understanding of the need for careful consideration of missing data during the design, planning, conduct, and analytic stages. PMID:24058309

  18. Prediction of pressure and flow transients in a gaseous bipropellant reaction control rocket engine

    NASA Technical Reports Server (NTRS)

    Markowsky, J. J.; Mcmanus, H. N., Jr.

    1974-01-01

    An analytic model is developed to predict pressure and flow transients in a gaseous hydrogen-oxygen reaction control rocket engine feed system. The one-dimensional equations of momentum and continuity are reduced by the method of characteristics from partial derivatives to a set of total derivatives which describe the state properties along the feedline. System components (e.g., valves, manifolds, and injectors) are represented by pseudo-steady-state relations at discrete junctions in the system. Solutions were effected by a FORTRAN IV program on an IBM 360/65. The results indicate the relative effect of manifold volume, combustion lag time, feedline pressure fluctuations, propellant temperature, and feedline length on the chamber pressure transient. The analytical combustion model is verified by good correlation between predicted and observed chamber pressure transients. The developed model enables a rocket designer to vary the design parameters analytically to obtain stable combustion for a particular mode of operation prescribed by mission objectives.
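    The characteristic reduction described above can be sketched in a few lines: along the right- and left-running characteristics dx/dt = ±a, the compatibility relations dp ± ρa dv = 0 hold, and solving the pair at an interior node yields the updated state. The function below is a minimal illustration of that node update, not the report's FORTRAN implementation; all numerical values are hypothetical.

```python
# Minimal method-of-characteristics update for 1D feedline flow.
# Along dx/dt = +a:  dp + rho*a*dv = 0  (C+ characteristic)
# Along dx/dt = -a:  dp - rho*a*dv = 0  (C- characteristic)
# Solving both relations at an interior node gives the new (p, v).
def moc_interior_node(pL, vL, pR, vR, rho, a):
    """New (p, v) at a node from the left/right neighbor states."""
    Z = rho * a  # characteristic impedance
    p = 0.5 * (pL + pR + Z * (vL - vR))
    v = 0.5 * (vL + vR + (pL - pR) / Z)
    return p, v

# Illustrative states: 2 bar / 1 bar neighbors, equal velocities.
p, v = moc_interior_node(2.0e5, 10.0, 1.0e5, 10.0, rho=1.2, a=340.0)
```

One useful check of such an update is that the Riemann-like invariants p ± Z·v are carried unchanged along their respective characteristics.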

  19. The Effectiveness of Circular Equating as a Criterion for Evaluating Equating.

    ERIC Educational Resources Information Center

    Wang, Tianyou; Hanson, Bradley A.; Harris, Deborah J.

    Equating a test form to itself through a chain of equatings, commonly referred to as circular equating, has been widely used as a criterion to evaluate the adequacy of equating. This paper uses both analytical methods and simulation methods to show that this criterion is in general invalid in serving this purpose. For the random groups design done…

  20. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to detect whether the objectives set are being fulfilled and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
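    The statistics mentioned above (mean, standard deviation, coefficient of variation, and systematic/random/total error) are small computations. The sketch below uses hypothetical control-material results and one common convention for total analytical error (|bias| + 1.65·CV); the paper's exact formulas and quality specifications are not given in the abstract.

```python
import statistics

# Hypothetical control-material results for one analyte (mg/dL);
# `target` is the assigned value of the control material.
results = [98.2, 101.5, 99.8, 100.9, 97.6, 100.3, 99.1, 101.0]
target = 100.0

mean = statistics.mean(results)
sd = statistics.stdev(results)            # sample standard deviation
cv = 100.0 * sd / mean                    # random error, as %CV
bias = 100.0 * (mean - target) / target   # systematic error, as %
# One common convention for total analytical error (assumption):
te = abs(bias) + 1.65 * cv
```

In routine use these values would be compared against the laboratory's pre-set quality specifications at each review interval.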

  1. A decision support system using analytical hierarchy process (AHP) for the optimal environmental reclamation of an open-pit mine

    NASA Astrophysics Data System (ADS)

    Bascetin, A.

    2007-04-01

    The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. Also, the decision process is systematic, and using the proposed model can reduce the time taken to select an optimal method.
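    For readers unfamiliar with AHP, the core computation is compact: a pairwise comparison matrix is reduced to a priority vector (its principal eigenvector), and a consistency ratio checks the coherence of the judgments. The 3×3 matrix below is illustrative, not the Seyitomer study's data.

```python
import numpy as np

# Hypothetical pairwise comparisons for three reclamation criteria
# (Saaty scale: a_ij = how much criterion i dominates criterion j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CR = CI / RI, with CI = (lambda_max - n)/(n - 1).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = 0.58  # Saaty's random index for n = 3
CR = CI / RI  # CR < 0.10 is conventionally acceptable
```

In a group setting, each specialist's comparison matrix can be aggregated (e.g., by element-wise geometric mean) before computing the shared priority vector.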

  2. Technology of an adhesive silicone film as drug carrier in transdermal therapy. I: Analytical methods used for characterization and design of the universal elastomer layers.

    PubMed

    Mojsiewicz-Pieńkowska, Krystyna; Jamrógiewicz, Marzena; Zebrowska, Maria; Sznitowska, Małgorzata; Centkowska, Katarzyna

    2011-08-25

    Silicone polymers possess unique properties which make them suitable for many different applications, for example in the pharmaceutical and medical industry. To create an adhesive silicone film, the appropriate silicone components have to be chosen first. From these components two layers were made: an adhesive elastomer applied to the skin, and a non-adhesive elastomer on the other side of the film. The aim of this study was to identify a set of analytical methods that can be used for detailed characterization of the elastomer layers, as needed when designing new silicone films. More specifically, the following methods were combined for detailed identification of the silicone components: Fourier transform infrared spectroscopy (FTIR), proton nuclear magnetic resonance (¹H NMR) and size exclusion chromatography with an evaporative light scattering detector (SEC-ELSD). It was demonstrated that these methods, together with a rheological analysis, are suitable for controlling the cross-linking reaction, thus obtaining the desired properties of the silicone film. Adhesive silicone films can be used as universal materials for medical use, particularly for effective treatment of scars and keloids or as drug carriers in transdermal therapy.

  3. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, covering their advantages, common pitfalls and requirements for their effective use. This is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to the definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve the manufacture of existing commercial products.
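    As a concrete anchor for the principal component analysis half of the review, a minimal mean-center-and-SVD implementation follows; the data are synthetic stand-ins for a batch-record matrix (rows = batches, columns = process variables).

```python
import numpy as np

# Synthetic stand-in for a batch-record matrix; one deliberately
# correlated variable pair so PC1 captures most of the variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
X[:, 1] = 2.0 * X[:, 0] + 0.1 * rng.normal(size=20)

Xc = X - X.mean(axis=0)           # mean-center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                    # batch coordinates on the PCs
loadings = Vt.T                   # variable contributions per PC
explained = s**2 / np.sum(s**2)   # fraction of variance per PC
```

Scores plots (batch vs. batch) and loadings plots (variable vs. variable) built from these arrays are the usual starting point for process-understanding work; in practice variables are often also scaled to unit variance before the decomposition.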

  4. Design of PCB search coils for AC magnetic flux density measurement

    NASA Astrophysics Data System (ADS)

    Ulvr, Michal

    2018-04-01

    This paper presents single-layer, double-layer and ten-layer planar square search coils designed for AC magnetic flux density amplitude measurement up to 1 T in the low frequency range in a 10 mm air gap. The printed-circuit-board (PCB) method was used to produce the search coils. Special attention is given to a full characterization of the PCB search coils, including a comparison between the detailed analytical design method and the finite integration technique (FIT) on the one hand, and experimental results on the other. The results show very good agreement in the resistance, inductance and search coil constant values (the area-turns), and also in the frequency dependence of the search coil constant.
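    The "search coil constant" (area-turns, N·A) relates the induced EMF to the field via Faraday's law: for a sinusoidal field B(t) = B0·sin(2πft), the peak voltage is V_peak = 2πf·(N·A)·B0. The parameters below are illustrative, not those of the PCB coils in the paper.

```python
import math

# Hypothetical planar search coil (values for illustration only).
N = 50                  # number of turns
A = 25e-6               # mean turn area, m^2 (a 5 mm x 5 mm square)
coil_constant = N * A   # the "area-turns", m^2

def peak_emf(B0, f):
    """Peak induced EMF for B(t) = B0*sin(2*pi*f*t)."""
    return 2 * math.pi * f * coil_constant * B0

V = peak_emf(1.0, 50.0)  # 1 T amplitude at 50 Hz
```

This relation is also why calibration of the coil constant against a known field, as done in the paper, fully characterizes the sensor at low frequencies.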

  5. Direct high-performance liquid chromatography method with refractometric detection designed for stability studies of treosulfan and its biologically active epoxy-transformers.

    PubMed

    Główka, Franciszek K; Romański, Michał; Teżyk, Artur; Żaba, Czesław

    2013-01-01

    Treosulfan (TREO) is an alkylating agent registered for the treatment of advanced platin-resistant ovarian carcinoma. Nowadays, TREO is increasingly administered intravenously in high doses as a promising myeloablative agent with low organ toxicity in children. Under physiological conditions it undergoes pH-dependent transformation into epoxy-transformers (S,S-EBDM and S,S-DEB). The mechanism of this reaction is generally known, but not its kinetic details. In order to investigate the kinetics of TREO transformation, an HPLC method with refractometric detection for simultaneous determination of the three analytes in one analytical run has been developed for the first time. The samples containing TREO, S,S-EBDM, S,S-DEB and acetaminophen (internal standard) were directly injected onto the reversed-phase column. To assure stability of the analytes and obtain their complete resolution, a mobile phase composed of acetate buffer pH 4.5 and acetonitrile was applied. The linear ranges of the calibration curves of TREO, S,S-EBDM and S,S-DEB spanned concentrations of 20-6000, 34-8600 and 50-6000 μM, respectively. Intra- and interday precision and accuracy of the developed method fulfilled analytical criteria. The stability of the analytes in experimental samples was also established. The validated HPLC method was successfully applied to the investigation of the kinetics of TREO activation to S,S-EBDM and S,S-DEB. At pH 7.4 and 37 °C the transformation of TREO followed first-order kinetics with a half-life of 1.5 h. Copyright © 2012 Elsevier B.V. All rights reserved.
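    The reported first-order kinetics imply a rate constant k = ln 2 / t½ ≈ 0.46 h⁻¹ for the stated half-life of 1.5 h; a short check of the exponential decay law:

```python
import math

# First-order transformation: C(t) = C0 * exp(-k*t), k = ln(2)/t_half.
# The abstract reports t_half = 1.5 h at pH 7.4 and 37 degrees C.
t_half = 1.5                  # hours
k = math.log(2) / t_half      # first-order rate constant, 1/h

def remaining_fraction(t_hours):
    return math.exp(-k * t_hours)

# After two half-lives (3 h), 25% of the initial TREO remains.
frac = remaining_fraction(3.0)
```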

  6. The cooperative effect of reduced graphene oxide and Triton X-114 on the electromembrane microextraction efficiency of Pramipexole as a model analyte in urine samples.

    PubMed

    Fashi, Armin; Khanban, Fatemeh; Yaftian, Mohammad Reza; Zamani, Abbasali

    2017-01-01

    A new design of electromembrane microextraction coupled with high-performance liquid chromatography was developed for the determination of Pramipexole as a model analyte in urine samples. The presence of reduced graphene oxide in the membrane and Triton X-114 in the donor phase augments the extraction efficiency of Pramipexole by the proposed method. Dispersed reduced graphene oxide in the organic solvent was held in the pores of the fiber wall by capillary forces and sonication. It is possible that the immobilized reduced graphene oxide acts as a sorbent, affording an additional pathway for analyte transportation. Besides, the presence of Triton X-114 in the donor phase promotes effective migration of ionic analytes across the membrane. The parameters influencing the extraction procedure, such as type and concentration of surfactant, type of organic solvent, amount of reduced graphene oxide, sonication time, applied voltage, extraction time, ionic strength, pH of the donor and acceptor solutions, and stirring rate were optimized. The linear working ranges of the method for preconcentration/determination of Pramipexole in water and urine samples were found to be 0.13-1000 and 0.47-1000 ng mL⁻¹, with corresponding detection limits of 0.04 and 0.14 ng mL⁻¹, respectively. The proposed method allows achieving enrichment factors of 301 and 265 for preconcentration of the analyte in water and urine samples, respectively. The method was successfully applied for the determination of Pramipexole in the urine samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Comparison of analytical methods for calculation of wind loads

    NASA Technical Reports Server (NTRS)

    Minderman, Donald J.; Schultz, Larry L.

    1989-01-01

    The following analysis is a comparison of analytical methods for the calculation of wind load pressures. The analytical methods specified in ASCE Paper No. 3269, ANSI A58.1-1982, the Standard Building Code, and the Uniform Building Code were analyzed using various hurricane speeds to determine the differences in the calculated results. The winds used for the analysis ranged from 100 mph to 125 mph, applied inland from the shoreline of a large open body of water (i.e., an enormous lake or the ocean) for a distance of 1500 feet or ten times the height of the building or structure considered. For a building or structure less than or equal to 250 feet in height acted upon by a wind greater than or equal to 115 mph, it was determined that the method specified in ANSI A58.1-1982 calculates a larger wind load pressure than the other methods. For a building or structure between 250 feet and 500 feet tall acted upon by a wind ranging from 100 mph to 110 mph, there is no clear choice of which method to use; for these cases, factors that must be considered are the steady-state or peak wind velocity, the geographic location, the distance from a large open body of water, and the expected design life and its risk factor.
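    For context, the methods compared all build on a velocity-pressure term of the form used in ANSI A58.1-1982, q = 0.00256·Kz·(I·V)² (q in psf, V in mph). The exposure coefficient Kz and importance factor I below are placeholder values, and the full code methods further apply gust and pressure coefficients not shown here.

```python
# Velocity-pressure sketch in the style of ANSI A58.1-1982.
# Kz (exposure coefficient) and I (importance factor) are
# illustrative defaults, not values taken from the paper.
def velocity_pressure(V_mph, Kz=1.0, I=1.0):
    """Velocity pressure in psf for wind speed V in mph."""
    return 0.00256 * Kz * (I * V_mph) ** 2

q100 = velocity_pressure(100.0)  # lower bound of the study's range
q125 = velocity_pressure(125.0)  # upper bound of the study's range
```

Because pressure grows with the square of speed, the 100-125 mph range spanned in the study corresponds to a better than 1.5-fold spread in velocity pressure alone.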

  8. Complete characterization of the spasing (L-L) curve of a three-level quantum coherence enhanced spaser for design optimization

    NASA Astrophysics Data System (ADS)

    Kumarapperuma, Lakshitha; Premaratne, Malin; Jha, Pankaj K.; Stockman, Mark I.; Agrawal, Govind P.

    2018-05-01

    We demonstrate that it is possible to derive an approximate analytical expression to characterize the spasing (L-L) curve of a coherently enhanced spaser with 3-level gain-medium chromophores. The utility of this solution stems from the fact that it enables optimization over the large parameter space associated with spaser design, a functionality not offered by the methods currently available in the literature. This is vital for the advancement of spaser technology towards the level of device realization. Owing to the compact nature of the analytical expressions, our solution also facilitates the grouping and identification of key processes responsible for the spasing action, whilst providing significant physical insights. Furthermore, we show that our expression generates results within 0.1% error compared to numerically obtained results for pumping rates higher than the spasing threshold, thereby drastically reducing the computational cost associated with spaser design.

  9. Sensory evaluation based fuzzy AHP approach for material selection in customized garment design and development process

    NASA Astrophysics Data System (ADS)

    Hong, Y.; Curteza, A.; Zeng, X.; Bruniaux, P.; Chen, Y.

    2016-06-01

    Material selection is the most difficult step in the customized garment product design and development process. This study aims to create a hierarchical framework for material selection. The analytic hierarchy process and fuzzy sets theories have been applied to reconcile the diverse requirements from the customer and the inherent interactions/interdependencies among these requirements. Sensory evaluation ensures a quick and effective selection without complex laboratory tests such as KES and FAST, using the professional knowledge of the designers. A real empirical application for physically disabled people is carried out to demonstrate the proposed method. Both the theoretical and practical background of this paper indicate that the fuzzy analytical network process can capture experts' knowledge existing in the form of incomplete, ambiguous and vague information on the mutual influence among the attributes and criteria of the material selection.

  10. Green approach using monolithic column for simultaneous determination of coformulated drugs.

    PubMed

    Yehia, Ali M; Mohamed, Heba M

    2016-06-01

    Green chemistry and sustainability are now embraced across the majority of pharmaceutical companies and research labs. Researchers' attention is drawn toward implementing the green analytical chemistry principles for more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of an analytical procedure. By using safer solvents, the greenness profile of a methodology can be increased remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using a monolithic column and green solvents as the mobile phase. The use of a monolithic column allows efficient separation protocols at higher flow rates, which results in a short time of analysis. A two-factor, three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using the eco-scale as a green metric, and the method was found to be an excellent green method with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and amount of produced waste. The proposed method improved the environmental impact without compromising the analytical performance criteria and could be used as a safer alternative for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
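    A two-factor, three-level experimental design is simply the 3² = 9-run grid of all level combinations. The factor names and levels below are hypothetical, not those optimized in the paper.

```python
from itertools import product

# Hypothetical chromatographic factors and levels (illustrative only).
factors = {
    "flow_rate_mL_min": [0.8, 1.0, 1.2],
    "organic_percent": [10, 15, 20],
}

# Full 3^2 factorial: every combination of the two factors' levels.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
# 3 levels x 3 levels -> 9 experimental runs
```

Each run's measured responses (retention, resolution, analysis time) would then feed a quadratic response-surface model to locate the optimum conditions.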

  11. QbD-oriented development and validation of a bioanalytical method for nevirapine with enhanced liquid-liquid extraction and chromatographic separation.

    PubMed

    Beg, Sarwar; Chaudhary, Vandna; Sharma, Gajanand; Garg, Babita; Panda, Sagar Suman; Singh, Bhupinder

    2016-06-01

    The present studies describe the systematic quality by design (QbD)-oriented development and validation of a simple, rapid, sensitive and cost-effective reversed-phase HPLC bioanalytical method for nevirapine in rat plasma. Chromatographic separation was carried out on a C18 column using isocratic 68:9:23% v/v elution of methanol, acetonitrile and water (pH 3, adjusted with orthophosphoric acid) at a flow rate of 1.0 mL/min with UV detection at 230 nm. A Box-Behnken design was applied for chromatographic method optimization, taking mobile phase ratio, pH and flow rate as the critical method parameters (CMPs) from screening studies. Peak area, retention time, theoretical plates and peak tailing were measured as the critical analytical attributes (CAAs). Further, the bioanalytical liquid-liquid extraction process was optimized using an optimal design by selecting extraction time, centrifugation speed and temperature as the CMPs for percentage recovery of nevirapine as the CAA. The search for an optimum chromatographic solution was conducted through a numerical desirability function. Validation studies performed as per the US Food and Drug Administration requirements revealed results within the acceptance limits. In a nutshell, the studies successfully demonstrate the utility of the analytical QbD approach for the rational development of a bioanalytical method with enhanced chromatographic separation and recovery of nevirapine in rat plasma. Copyright © 2015 John Wiley & Sons, Ltd.
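    A "numerical desirability function" of the kind used for this optimum search is typically the Derringer-Suich construction: each response is mapped to a score in [0, 1] and the geometric mean of the scores is maximized. The sketch below uses hypothetical responses and targets, not the nevirapine results.

```python
# Derringer-type desirability for a "larger is better" response.
def d_maximize(y, low, high):
    """0 below `low`, 1 above `high`, linear in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical responses: 85% recovery, 9000 theoretical plates.
D = overall_desirability([
    d_maximize(85.0, 50.0, 100.0),     # % recovery
    d_maximize(9000.0, 2000.0, 10000.0),  # theoretical plates
])
```

Because the geometric mean is zero whenever any individual desirability is zero, the combined score automatically rejects settings that fail any one response, which is what makes it useful for multi-response optimization.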

  12. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  13. Assuring the Quality of Test Results in the Field of Nuclear Techniques and Ionizing Radiation. The Practical Implementation of Section 5.9 of the EN ISO/IEC 17025 Standard

    NASA Astrophysics Data System (ADS)

    Cucu, Daniela; Woods, Mike

    2008-08-01

    The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs, and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner. They shall include both quality control and quality assurance measures. When designing the quality assurance programme a laboratory should consider pre-analytical activities (like personnel training, selection and validation of test methods, and qualifying equipment), analytical activities (ranging from sampling and sample preparation to instrumental analysis), and post-analytical activities (like decoding, calculation, use of statistical tests or packages, and management of results). Designed on different levels (analyst, quality manager and technical manager) and including a variety of measures, the programme shall ensure the validity and accuracy of test results and the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and, last but not least, show the comparability of test results. Laboratory management should establish performance targets and periodically review QC/QA results against them, implementing appropriate measures in case of non-compliance.

  14. An approach to design a 90Sr radioisotope thermoelectric generator using analytical and Monte Carlo methods with ANSYS, COMSOL, and MCNP.

    PubMed

    Khajepour, Abolhasan; Rahmani, Faezeh

    2017-01-01

    In this study, a 90Sr radioisotope thermoelectric generator (RTG) with milliwatt-level power was designed to operate in a determined temperature range (300-312 K). For this purpose, a combination of analytical and Monte Carlo methods with ANSYS and COMSOL software, as well as the MCNP code, was used. This designed RTG contains 90Sr as a radioisotope heat source (RHS) and 127 coupled thermoelectric modules (TEMs) based on bismuth telluride. Kapton (2.45 mm in thickness) and Cryotherm sheets (0.78 mm in thickness) were selected as the thermal insulators of the RHS, and a stainless steel container was used as the generator chamber. The initial design of the RHS geometry was performed according to the amount of radioactive material (strontium titanate) as well as heat transfer calculations and mechanical strength considerations. According to the Monte Carlo simulation performed with the MCNP code, approximately 0.35 kCi of 90Sr is sufficient to generate the heat power in the RHS. To determine the optimal design of the RTG, the distribution of temperature as well as the dissipated heat and input power to the module were calculated in different parts of the generator using the ANSYS software. The output voltage according to the temperature distribution on the TEM was calculated using COMSOL. Optimization of the dimensions of the RHS and heat insulator was performed to adapt the average temperature of the hot plate of the TEM to the determined hot temperature value. This designed RTG generates 8 mW of power with an efficiency of 1%. The proposed combination approach can be used for the precise design of various types of RTGs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A graphical approach to radio frequency quadrupole design

    NASA Astrophysics Data System (ADS)

    Turemen, G.; Unel, G.; Yasatekin, B.

    2015-07-01

    The design of a radio frequency quadrupole, an important section of all ion accelerators, and the calculation of its beam dynamics properties can be achieved using existing computational tools. These programs, originally written in the 1980s, show effects of aging in their user interfaces and in their output. The authors believe there is room for improvement both in design techniques using a graphical approach and in the amount of analytical calculation done before going into CPU-burning finite element analysis techniques. Additionally, an emphasis on the graphical method of controlling the evolution of the relevant parameters using the drag-to-change paradigm is bound to be beneficial to the designer. A computer code, named DEMIRCI, has been written in C++ to demonstrate these ideas. This tool has been used in the design of the Turkish Atomic Energy Authority (TAEK)'s 1.5 MeV proton beamline at Saraykoy Nuclear Research and Training Center (SANAEM). DEMIRCI starts with a simple analytical model, calculates the RFQ behavior and produces 3D design files that can be fed to a milling machine. The paper discusses the experience gained during the design process of the SANAEM Project Prometheus (SPP) RFQ and underlines some of DEMIRCI's capabilities.

  16. Analytical solutions to optimal underactuated spacecraft formation reconfiguration

    NASA Astrophysics Data System (ADS)

    Huang, Xu; Yan, Ye; Zhou, Yang

    2015-11-01

    Underactuated systems can generally be defined as systems with fewer control inputs than degrees of freedom to be controlled. In this paper, analytical solutions to optimal underactuated spacecraft formation reconfiguration without either the radial or the in-track control are derived. Using a linear dynamical model of underactuated spacecraft formation in circular orbits, controllability analysis is conducted for either underactuated case. Indirect optimization methods based on the minimum principle are then introduced to generate analytical solutions to optimal open-loop underactuated reconfiguration problems. Both fixed and free final condition constraints are considered for either underactuated case, and comparisons between these two final conditions indicate that the optimal control strategies with free final conditions require less control effort than those with the fixed ones. Meanwhile, closed-loop adaptive sliding mode controllers for both underactuated cases are designed to guarantee optimal trajectory tracking in the presence of unmatched external perturbations, linearization errors, and system uncertainties. The adaptation laws are designed via a Lyapunov-based method to ensure the overall stability of the closed-loop system. The explicit expressions of the terminal convergent regions of each system state have also been obtained. Numerical simulations demonstrate the validity and feasibility of the proposed open-loop and closed-loop control schemes for optimal underactuated spacecraft formation reconfiguration in circular orbits.

  17. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  18. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  19. Efficient Simulation of Wing Modal Response: Application of 2nd Order Shape Sensitivities and Neural Networks

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.; Liu, Youhua

    2000-01-01

    At the preliminary design stage of a wing structure, an efficient simulation, one needing little computation but yielding adequately accurate results for various response quantities, is essential in the search for an optimal design in a vast design space. In the present paper, methods using sensitivities up to second order, as well as the direct application of neural networks, are explored. The example problem is determining the natural frequencies of a wing from the shape variables of the structure. It is shown that when sensitivities cannot be obtained analytically, the finite difference approach is usually more reliable than a semi-analytical approach, provided an appropriate step size is used. The use of second-order sensitivities is shown to yield much better results than the use of first-order sensitivities alone. When neural networks are trained to relate the wing natural frequencies to the shape variables, negligible computational effort is needed to accurately determine the natural frequencies of a new design.
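The second-order sensitivity idea above can be illustrated with a minimal sketch. A cheap quadratic model stands in for the expensive finite element frequency solve (all numbers are invented), first- and second-order sensitivities are computed by central finite differences, and a second-order Taylor expansion predicts the frequency at a perturbed design:

```python
import numpy as np

# Hypothetical quadratic model standing in for an expensive FE frequency solve
A = np.array([[2.0, 0.3], [0.3, 1.5]])
b = np.array([0.8, -0.4])
def freq(x):                      # "true" natural frequency of the design x
    return 10.0 + b @ x + 0.5 * x @ A @ x

x0 = np.zeros(2)                  # baseline shape design variables
h = 1e-4                          # finite difference step size
I = np.eye(2)

# First-order sensitivities df/dx_i by central differences
grad = np.array([(freq(x0 + h * I[i]) - freq(x0 - h * I[i])) / (2 * h)
                 for i in range(2)])

# Second-order sensitivities d2f/dx_i dx_j by central mixed differences
hess = np.array([[(freq(x0 + h * (I[i] + I[j])) - freq(x0 + h * (I[i] - I[j]))
                   - freq(x0 - h * (I[i] - I[j])) + freq(x0 - h * (I[i] + I[j])))
                  / (4 * h * h)
                  for j in range(2)] for i in range(2)])

def taylor2(dx):
    """Second-order Taylor estimate of the frequency at x0 + dx."""
    return freq(x0) + grad @ dx + 0.5 * dx @ hess @ dx

dx = np.array([0.1, -0.2])
print(round(taylor2(dx), 4), round(freq(x0 + dx), 4))  # the two agree here
```

Because the stand-in model is quadratic, the second-order expansion reproduces it essentially exactly; for a real frequency response the expansion is accurate only near the baseline design.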

  20. Integration of electrochemistry in micro-total analysis systems for biochemical assays: recent developments.

    PubMed

    Xu, Xiaoli; Zhang, Song; Chen, Hui; Kong, Jilie

    2009-11-15

    Micro-total analysis systems (microTAS) integrate different analytical operations such as sample preparation, separation and detection into a single microfabricated device. With the outstanding advantages of low cost, satisfactory analytical efficiency and flexibility in design, highly integrated and miniaturized devices based on the microTAS concept have gained widespread application, especially in biochemical assays. Electrochemistry has proven quite compatible with microanalytical systems for biochemical assays because of its attractive merits, such as simplicity, rapidity, high sensitivity, reduced power consumption, and sample/reagent economy. This review presents recent developments in the integration of electrochemistry in microdevices for biochemical assays, covering ingenious microelectrode design and fabrication methods and the versatility of electrochemical techniques. Practical applications of such integrated microsystems in biochemical assays focus on in situ analysis, point-of-care testing and portable devices. Electrochemical techniques are well suited to microsystems, since easy microfabrication of electrochemical elements and a high degree of integration with multiple analytical functions can be achieved at low cost. Such integrated microsystems will play an increasingly important role in the analysis of small-volume biochemical samples. Work is in progress toward new microdevice designs and applications.

  1. Slurry sampling high-resolution continuum source electrothermal atomic absorption spectrometry for direct beryllium determination in soil and sediment samples after elimination of SiO interference by least-squares background correction.

    PubMed

    Husáková, Lenka; Urbanová, Iva; Šafránková, Michaela; Šídová, Tereza

    2017-12-01

    In this work a simple, efficient, and environmentally friendly method is proposed for the determination of Be in soil and sediment samples employing slurry sampling and high-resolution continuum source electrothermal atomic absorption spectrometry (HR-CS-ETAAS). The spectral effects originating from SiO species were identified and successfully corrected by means of a mathematical correction algorithm. Fractional factorial design was employed to assess the parameters affecting the analytical results and, in particular, to guide the development of the slurry preparation and the optimization of measuring conditions. The effects of seven analytical variables, including particle size, concentration of glycerol and HNO3 for stabilization and analyte extraction, respectively, ultrasonic agitation for slurry homogenization, concentration of chemical modifier, and pyrolysis and atomization temperatures, were investigated by a 2⁷⁻³ replicate (n = 3) design. Using the optimized experimental conditions, the proposed method allowed the determination of Be with a detection limit of 0.016 mg kg⁻¹ and a characteristic mass of 1.3 pg. Optimum results were obtained by preparing the slurries from 100 mg of sample with particle size < 54 µm and adding 25 mL of 20% w/w glycerol. The use of 1 µg Rh and 50 µg citric acid was found satisfactory for analyte stabilization. Accurate data were obtained with the use of matrix-free calibration. The accuracy of the method was confirmed by analysis of two certified reference materials (NIST SRM 2702 Inorganics in Marine Sediment and IGI BIL-1 Baikal Bottom Silt) and by comparison of the results obtained for ten real samples by slurry sampling with those determined after microwave-assisted extraction by inductively coupled plasma time-of-flight mass spectrometry (TOF-ICP-MS). The reported method has a precision better than 7%. Copyright © 2017 Elsevier B.V. All rights reserved.
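The fractional factorial design mentioned above screens seven two-level factors in only 16 runs. As a hedged sketch (the paper does not state its generators; E = ABC, F = ABD, G = ACD are a common textbook choice assumed here), such a design can be built from a full two-level design in four base factors:

```python
from itertools import product

# Generators assumed for illustration: E = ABC, F = ABD, G = ACD.
runs = []
for a, b, c, d in product((-1, 1), repeat=4):    # full 2^4 base design
    e, f, g = a * b * c, a * b * d, a * c * d    # generated columns
    runs.append((a, b, c, d, e, f, g))

print(len(runs))   # 16 runs for 7 two-level factors, instead of 2^7 = 128
```

Each row is one run's factor settings (−1 = low, +1 = high); with the paper's triplicate measurements, every run would be performed three times.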

  2. Recommendations for choosing an analysis method that controls Type I error for unbalanced cluster sample designs with Gaussian outcomes.

    PubMed

    Johnson, Jacqueline L; Kreidler, Sarah M; Catellier, Diane J; Murray, David M; Muller, Keith E; Glueck, Deborah H

    2015-11-30

    We used theoretical and simulation-based approaches to study Type I error rates for one-stage and two-stage analytic methods for cluster-randomized designs. The one-stage approach uses the observed data as outcomes and accounts for within-cluster correlation using a general linear mixed model. The two-stage model uses the cluster specific means as the outcomes in a general linear univariate model. We demonstrate analytically that both one-stage and two-stage models achieve exact Type I error rates when cluster sizes are equal. With unbalanced data, an exact size α test does not exist, and Type I error inflation may occur. Via simulation, we compare the Type I error rates for four one-stage and six two-stage hypothesis testing approaches for unbalanced data. With unbalanced data, the two-stage model, weighted by the inverse of the estimated theoretical variance of the cluster means, and with variance constrained to be positive, provided the best Type I error control for studies having at least six clusters per arm. The one-stage model with Kenward-Roger degrees of freedom and unconstrained variance performed well for studies having at least 14 clusters per arm. The popular analytic method of using a one-stage model with denominator degrees of freedom appropriate for balanced data performed poorly for small sample sizes and low intracluster correlation. Because small sample sizes and low intracluster correlation are common features of cluster-randomized trials, the Kenward-Roger method is the preferred one-stage approach. Copyright © 2015 John Wiley & Sons, Ltd.
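The two-stage approach described above can be sketched schematically (this is not the authors' exact procedure: here the variance components are taken as known rather than estimated and constrained, and all cluster sizes and parameters are invented). Cluster means are weighted by the inverse of their theoretical variance, sigma_b^2 + sigma_w^2 / n_i:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_b, sigma_w = 0.5, 1.0   # between- and within-cluster SDs (assumed known)

def simulate_arm(sizes, delta=0.0):
    """One Gaussian outcome vector per cluster, with a shared cluster effect."""
    return [delta + rng.normal(0, sigma_b) + rng.normal(0, sigma_w, n)
            for n in sizes]

def cluster_summary(arm):
    """Cluster means and weights = inverse theoretical variance of each mean."""
    means = np.array([y.mean() for y in arm])
    sizes = np.array([len(y) for y in arm])
    weights = 1.0 / (sigma_b**2 + sigma_w**2 / sizes)
    return means, weights

# Unbalanced design: six clusters per arm, unequal sizes, no treatment effect
m1, w1 = cluster_summary(simulate_arm([5, 8, 12, 20, 7, 9]))
m0, w0 = cluster_summary(simulate_arm([6, 15, 4, 11, 10, 25]))

est1, est0 = np.average(m1, weights=w1), np.average(m0, weights=w0)
z = (est1 - est0) / np.sqrt(1.0 / w1.sum() + 1.0 / w0.sum())
print(round(z, 3))   # roughly standard normal under the null
```

In the paper's setting the weights use *estimated* variance components constrained to be positive, and inference uses a reference distribution appropriate for the small number of clusters rather than the normal approximation used in this sketch.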

  3. Innovations in coating technology.

    PubMed

    Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut

    2008-01-01

    Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to sophisticated control of the site and rate of drug release. The high expectations placed on different coating technologies have prompted great efforts in the development of reproducible and controllable production processes. Improvements in coating methods have focused mainly on particle movement, spraying systems, and air and energy transport, so as to achieve homogeneous distribution of coating material and increased drying efficiency, and thereby high end-product quality. Moreover, given the FDA's Quality by Design initiative, which calls for building end-product quality into the manufacturing process, the development of analytical methods for the analysis, management and control of coating processes has attracted special attention in recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology; it first familiarizes the reader with the available procedures and subsequently explains the application of different analytical tools. To structure this comprehensive field, coating technologies are primarily divided into pan and fluidized bed coating methods. Regarding pan coating, pans rotating around inclined, horizontal and vertical axes are reviewed separately, while fluidized bed technologies are subdivided into those involving fluidized and spouted beds. Continuous processing techniques and improvements in spraying systems are then discussed in dedicated sections. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section.

  4. Kinematic synthesis of adjustable robotic mechanisms

    NASA Astrophysics Data System (ADS)

    Chuenchom, Thatchai

    1993-01-01

    Conventional hard automation, such as a linkage-based or cam-driven system, provides high-speed capability and repeatability but not the flexibility required in many industrial applications. Conventional mechanisms, which are typically single-degree-of-freedom systems, are increasingly being replaced by multi-degree-of-freedom, multi-actuator systems driven by logic controllers. Although this new trend toward sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools for designing in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARMs), or 'programmable mechanisms', as a middle ground between high-speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms for cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARMs and lays the theoretical foundation for the synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections, providing an analytical and computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defects, and mechanical errors. An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphics-intensive, user-friendly computer program. A physical prototype of a general-purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.

  5. Arrow-wing supersonic cruise aircraft structural design concepts evaluation. Volume 2: Sections 7 through 11

    NASA Technical Reports Server (NTRS)

    Sakata, I. F.; Davis, G. W.

    1975-01-01

    The materials and advanced producibility methods that offer potential structural mass savings in the design of the primary structure for a supersonic cruise aircraft are identified and reported. A summary of the materials and fabrication techniques selected for this analytical effort is presented. Both metallic and composite material systems were selected for application to a near-term start-of-design technology aircraft. Selective reinforcement of the basic metallic structure was considered as the appropriate level of composite application for the near-term design.

  6. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  7. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using a linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration; all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides realistic insight into mission aspects, and the proposed design is an excellent initial guess for numerical refinement, helping to arrive at the four distinct design options for a given opportunity.

  8. 7 CFR 2902.5 - Item designation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., USDA will use life cycle cost information only from tests using the BEES analytical method. (c... availability of such items and the economic and technological feasibility of using such items, including life cycle costs. USDA will gather information on individual products within an item and extrapolate that...

  9. An experimental and analytical investigation on the response of GR/EP composite I-frames

    NASA Technical Reports Server (NTRS)

    Moas, E., Jr.; Boitnott, R. L.; Griffin, O. H., Jr.

    1991-01-01

    Six-foot diameter, semicircular graphite/epoxy specimens representative of generic aircraft frames were loaded quasi-statically to determine their load response and failure mechanisms for large deflections that occur in an airplane crash. These frame-skin specimens consisted of a cylindrical skin section cocured with a semicircular I-frame. Various frame laminate stacking sequences and geometries were evaluated by statically loading the specimen until multiple failures occurred. Two analytical methods were compared for modeling the frame-skin specimens: a two-dimensional branched-shell finite element analysis and a one-dimensional, closed-form, curved beam solution derived using an energy method. Excellent correlation was obtained between experimental results and the finite element predictions of the linear response of the frames prior to the initial failure. The beam solution was used for rapid parameter and design studies, and was found to be stiff in comparison with the finite element analysis. The specimens were found to be useful for evaluating composite frame designs.

  10. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  11. Benchmark solutions for the galactic heavy-ion transport equations with energy and spatial coupling

    NASA Technical Reports Server (NTRS)

    Ganapol, Barry D.; Townsend, Lawrence W.; Lamkin, Stanley L.; Wilson, John W.

    1991-01-01

    Nontrivial benchmark solutions are developed for the galactic heavy-ion transport equations in the straight-ahead approximation with energy and spatial coupling. Analytical representations of the ion fluxes are obtained for a variety of sources under the assumption that the nuclear interaction parameters are energy independent. The method utilizes an analytical Laplace transform inversion to yield a closed-form representation that is computationally efficient. The flux profiles are then used to predict ion dose profiles, which are important for shield design studies.

  12. MODFLOW equipped with a new method for the accurate simulation of axisymmetric flow

    NASA Astrophysics Data System (ADS)

    Samani, N.; Kompani-Zare, M.; Barry, D. A.

    2004-01-01

    Axisymmetric flow to a well is an important topic of groundwater hydraulics, the simulation of which depends on accurate computation of head gradients. Groundwater numerical models with conventional rectilinear grid geometry such as MODFLOW (in contrast to analytical models) generally have not been used to simulate aquifer test results at a pumping well because they are not designed or expected to closely simulate the head gradient near the well. A scaling method is proposed based on mapping the governing flow equation from cylindrical to Cartesian coordinates, and vice versa. A set of relationships and scales is derived to implement the conversion. The proposed scaling method is then embedded in MODFLOW 2000. To verify the accuracy of the method steady and unsteady flows in confined and unconfined aquifers with fully or partially penetrating pumping wells are simulated and compared with the corresponding analytical solutions. In all cases a high degree of accuracy is achieved.
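The coordinate-mapping idea can be sketched in one dimension (an illustration of the principle, not MODFLOW's actual implementation, with invented parameter values): under the mapping u = ln r, the steady radial flow equation d/dr(r dh/dr) = 0 becomes d²h/du² = 0, so an ordinary uniform-grid Laplace solve in u reproduces the analytical Thiem solution:

```python
import numpy as np

Q, T = 100.0, 50.0                 # pumping rate and transmissivity (assumed units)
r_w, r_e = 0.1, 1000.0             # well and outer-boundary radii
h_w = 10.0                         # head at the well

def thiem(r):                      # analytical steady-state solution
    return h_w + Q / (2.0 * np.pi * T) * np.log(r / r_w)

# Map r -> u = ln r and solve the 1-D Laplace problem h'' = 0 on a uniform
# grid in u with Dirichlet boundary conditions at the well and outer radius.
N = 51
u = np.linspace(np.log(r_w), np.log(r_e), N)
A = (np.diag(-2.0 * np.ones(N - 2))
     + np.diag(np.ones(N - 3), 1) + np.diag(np.ones(N - 3), -1))
rhs = np.zeros(N - 2)
rhs[0] -= thiem(r_w)               # boundary heads enter the right-hand side
rhs[-1] -= thiem(r_e)
h = np.concatenate([[thiem(r_w)], np.linalg.solve(A, rhs), [thiem(r_e)]])

err = np.max(np.abs(h - thiem(np.exp(u))))
print(err)   # tiny: the log-grid solve is exact for Thiem flow
```

The Thiem head is linear in ln r, so the second-difference solution on the log-transformed grid matches it to rounding error; the paper's scaling method extends this kind of mapping to the full 3-D MODFLOW grid.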

  13. Analytical and numerical solutions of the potential and electric field generated by different electrode arrays in a tumor tissue under electrotherapy.

    PubMed

    Bergues Pupo, Ana E; Reyes, Juan Bory; Bergues Cabrales, Luis E; Bergues Cabrales, Jesús M

    2011-09-24

    Electrotherapy is a relatively well established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside a tumor tissue in a two-dimensional model (2D model) generated by electrode arrays shaped as different conic sections (ellipse, parabola and hyperbola). Analytical calculations of the potential and electric field distributions based on 2D models for different electrode arrays are performed by solving the Laplace equation, while the numerical solution is obtained by means of the finite element method in two dimensions. Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by electrode arrays shaped as a circle and as the different conic sections (elliptic, parabolic and hyperbolic). Electrode arrays with circular, elliptical and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different conic-section shapes by means of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by the electrode arrays with different conic sections.
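The analytical side of such a calculation can be sketched by superposition (the elliptical geometry, electrode count, and strengths below are invented, and point-like electrodes replace the paper's finite electrodes): the 2-D potential of each electrode is logarithmic, and the superposed field can be checked numerically against Laplace's equation:

```python
import numpy as np

# Hypothetical elliptical array of 8 point electrodes with alternating polarity
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
ex, ey = 2.0 * np.cos(theta), 1.0 * np.sin(theta)   # ellipse a = 2, b = 1
q = np.where(np.arange(8) % 2 == 0, 1.0, -1.0)      # source/sink strengths

def phi(x, y):
    """2-D potential: superposition of logarithmic line-source solutions."""
    r2 = (x - ex)**2 + (y - ey)**2
    return np.sum(-q * np.log(r2) / (4 * np.pi))

# Discrete 5-point Laplacian at a point inside the array, away from electrodes
h = 1e-3
x0, y0 = 0.3, 0.2
lap = (phi(x0 + h, y0) + phi(x0 - h, y0) + phi(x0, y0 + h) + phi(x0, y0 - h)
       - 4 * phi(x0, y0)) / h**2
print(abs(lap))   # near zero: the superposed field satisfies Laplace's equation
```

The electric field follows as the negative gradient of this potential; changing the array from an ellipse to a circle, parabola or hyperbola only changes the electrode coordinates, which is what makes such shape comparisons straightforward.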

  14. How to determine spiral bevel gear tooth geometry for finite element analysis

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Litvin, Faydor L.

    1991-01-01

    An analytical method was developed to determine gear tooth surface coordinates of face milled spiral bevel gears. The method combines the basic gear design parameters with the kinematical aspects for spiral bevel gear manufacturing. A computer program was developed to calculate the surface coordinates. From this data a 3-D model for finite element analysis can be determined. Development of the modeling method and an example case are presented.

  15. Experimental design-based isotope-dilution SPME-GC/MS method development for the analysis of smoke flavouring products.

    PubMed

    Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas

    2017-12-01

    For the implementation of Regulation (EC) No 2065/2003, related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase microextraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions for quantitative analysis of the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60 °C extraction temperature, 30 min extraction time, 250 °C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products were identified, including furan/pyran derivatives; phenols; guaiacol, syringol, benzenediol and their derivatives; cyclic ketones; and several other heterocyclic compounds. The proposed method was repeatable (RSD < 5%) and the calibration functions were linear for all compounds under study. Nine isotopically labelled internal standards were used to improve quantification of analytes by compensating for matrix effects that might affect the headspace equilibrium and the extractability of compounds. The optimised isotope-dilution SPME-GC/MS analytical method proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.

  16. Development of a dynamic headspace gas chromatography-mass spectrometry method for on-site analysis of sulfur mustard degradation products in sediments.

    PubMed

    Magnusson, R; Nordlander, T; Östin, A

    2016-01-15

    Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis, and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after collection near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. High speed operation of permanent magnet machines

    NASA Astrophysics Data System (ADS)

    El-Refaie, Ayman M.

    This work proposes methods to extend the high-speed operating capabilities of both the interior PM (IPM) and surface PM (SPM) machines. For interior PM machines, this research has developed and presented the first thorough analysis of how a new bi-state magnetic material can be usefully applied to the design of IPM machines. Key elements of this contribution include identifying how the unique properties of the bi-state magnetic material can be applied most effectively in the rotor design of an IPM machine by "unmagnetizing" the magnet cavity center posts rather than the outer bridges. The importance of elevated rotor speed in making the best use of the bi-state magnetic material while recognizing its limitations has been identified. For surface PM machines, this research has provided, for the first time, a clear explanation of how fractional-slot concentrated windings can be applied to SPM machines in order to achieve the necessary conditions for optimal flux weakening. A closed-form analytical procedure for analyzing SPM machines designed with concentrated windings has been developed. Guidelines for designing SPM machines using concentrated windings in order to achieve optimum flux weakening are provided. Analytical and numerical finite element analysis (FEA) results have provided promising evidence of the scalability of the concentrated winding technique with respect to the number of poles, machine aspect ratio, and output power rating. Useful comparisons between the predicted performance characteristics of SPM machines equipped with concentrated windings and both SPM and IPM machines designed with distributed windings are included. Analytical techniques have been used to evaluate the impact of the high pole number on various converter performance metrics. Both analytical techniques and FEA have been used for evaluating the eddy-current losses in the surface magnets due to the stator winding subharmonics. 
Techniques for reducing these losses have been investigated. A 6 kW, 36-slot/30-pole prototype SPM machine has been designed and built. Experimental measurements have been used to verify the analytical and FEA results. These test results have demonstrated that a wide constant-power speed range can be achieved. Other important machine features, such as near-sinusoidal back-EMF, high efficiency, and low cogging torque, have also been demonstrated.
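The optimal flux-weakening condition referred to in this abstract can be stated compactly. This is the textbook criterion with invented machine parameters, not the thesis prototype's values: the characteristic current, magnet flux linkage divided by d-axis inductance, should equal the rated current for a theoretically unlimited constant-power speed range:

```python
# Textbook optimal flux-weakening criterion for an SPM machine
# (all parameter values below are invented for illustration).
psi_m = 0.12       # magnet flux linkage, Wb (assumed)
L_d = 1.5e-3       # d-axis inductance, H (assumed)
I_rated = 80.0     # rated phase current, A (assumed)

I_ch = psi_m / L_d              # characteristic current, A
print(I_ch)                     # equals I_rated: optimal flux weakening
```

Fractional-slot concentrated windings help precisely because they raise the machine inductance enough to bring the characteristic current down to the rated current.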

  18. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  19. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  20. Analysis of potential genotoxic impurities in rabeprazole active pharmaceutical ingredient via Liquid Chromatography-tandem Mass Spectrometry, following quality-by-design principles for method development.

    PubMed

    Iliou, Katerina; Malenović, Anđelija; Loukas, Yannis L; Dotsikas, Yannis

    2018-02-05

    A novel Liquid Chromatography-tandem mass spectrometry (LC-MS/MS) method is presented for the quantitative determination of two potential genotoxic impurities (PGIs) in rabeprazole active pharmaceutical ingredient (API). In order to overcome the analytical challenges in the trace analysis of PGIs, a development procedure supported by Quality-by-Design (QbD) principles was evaluated. The efficient separation between rabeprazole and the two PGIs in the shortest analysis time was set as the defined analytical target profile (ATP), and to this purpose a switching valve allowed the flow to be sent to waste while rabeprazole eluted. The selected critical quality attributes (CQAs) were the separation criterion s between the critical peak pair and the capacity factor k of the last eluted compound. The effect of the following critical process parameters (CPPs) on the CQAs was studied: the %ACN content, the pH and the concentration of the buffer salt in the mobile phase, as well as the stationary phase of the analytical column. A D-optimal design was implemented to set the plan of experiments with a UV detector. In order to define the design space, Monte Carlo simulations with 5000 iterations were performed. Acceptance criteria were met for a C8 column (50 × 4 mm, 5 μm), and the region having probability π ≥ 95% of achieving satisfactory values of all defined CQAs was computed. The working point was selected with a mobile phase consisting of ACN and 11 mM ammonium formate at a ratio of 31/69 v/v, with pH = 6.8 for the aqueous phase. The LC protocol was transferred to LC-MS/MS and validated according to ICH guidelines. Copyright © 2017 Elsevier B.V. All rights reserved.
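
    The design-space step described above (Monte Carlo simulation of the probability of meeting all CQAs) can be sketched as follows; the quadratic response models, their coefficients, the acceptance limits, and the uncertainty magnitudes are all hypothetical stand-ins for illustration, not the paper's fitted models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fitted response models (illustrative coefficients, not the
# paper's): CQAs as quadratic functions of coded CPPs x = (%ACN, pH, buffer mM).
def separation_s(x):
    return 2.0 + 0.8 * x[0] - 0.5 * x[1] - 0.3 * x[0] * x[1] - 0.4 * x[0] ** 2

def capacity_k(x):
    return 5.0 - 2.0 * x[0] + 0.6 * x[1] + 0.3 * x[2]

def prob_criteria_met(center, noise_sd=0.1, n=5000):
    """Monte Carlo estimate of the probability that both CQAs meet their
    (assumed) acceptance criteria, s >= 1.5 and 1 <= k <= 10, at a candidate
    operating point, under CPP uncertainty and model error."""
    hits = 0
    for _ in range(n):
        x = center + rng.normal(0.0, noise_sd, size=3)  # CPP uncertainty
        s = separation_s(x) + rng.normal(0.0, 0.05)     # model/analytical error
        k = capacity_k(x) + rng.normal(0.0, 0.1)
        if s >= 1.5 and 1.0 <= k <= 10.0:
            hits += 1
    return hits / n

pi = prob_criteria_met(np.zeros(3))
print(f"estimated probability of meeting all CQAs: {pi:.3f}")
```

    Points whose estimated probability exceeds the chosen threshold (π ≥ 95% in the paper) together form the design space.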

  1. A field study of selected U.S. Geological Survey analytical methods for measuring pesticides in filtered stream water, June - September 2012

    USGS Publications Warehouse

    Martin, Jeffrey D.; Norman, Julia E.; Sandstrom, Mark W.; Rose, Claire E.

    2017-09-06

    U.S. Geological Survey monitoring programs extensively used two analytical methods, gas chromatography/mass spectrometry and liquid chromatography/mass spectrometry, to measure pesticides in filtered water samples during 1992–2012. In October 2012, the monitoring programs began using direct aqueous-injection liquid chromatography tandem mass spectrometry as a new analytical method for pesticides. The change in analytical methods, however, has the potential to inadvertently introduce bias in analysis of datasets that span the change. A field study was designed to document performance of the new method in a variety of stream-water matrices and to quantify any potential changes in measurement bias or variability that could be attributed to changes in analytical methods. The goals of the field study were to (1) summarize performance (bias and variability of pesticide recovery) of the new method in a variety of stream-water matrices; (2) compare performance of the new method in laboratory blank water (laboratory reagent spikes) to that in a variety of stream-water matrices; (3) compare performance (analytical recovery) of the new method to that of the old methods in a variety of stream-water matrices; (4) compare pesticide detections and concentrations measured by the new method to those of the old methods in a variety of stream-water matrices; (5) compare contamination measured by field blank water samples in old and new methods; (6) summarize the variability of pesticide detections and concentrations measured by the new method in field duplicate water samples; and (7) identify matrix characteristics of environmental water samples that adversely influence the performance of the new method. Stream-water samples and a variety of field quality-control samples were collected at 48 sites in the U.S. Geological Survey monitoring networks during June–September 2012. 
Stream sites were located across the United States and included sites in agricultural and urban land-use settings, as well as sites on major rivers. The results of the field study identified several challenges for the analysis and interpretation of data analyzed by both old and new methods, particularly when data span the change in methods and are combined for analysis of temporal trends in water quality. The main challenges identified are large (greater than 30 percent), statistically significant differences in analytical recovery, detection capability, and (or) measured concentrations for selected pesticides. These challenges are documented and discussed, but specific guidance or statistical methods to resolve these differences in methods are beyond the scope of the report. The results of the field study indicate that the implications of the change in analytical methods must be assessed individually for each pesticide and method. Understanding the possible causes of the systematic differences in concentrations between methods that remain after recovery adjustment might be necessary to determine how to account for the differences in data analysis. Because recoveries for each method are independently determined from separate reference standards and spiking solutions, the differences might be due to an error in one of the reference standards or solutions or some other basic aspect of standard procedure in the analytical process. Further investigation of the possible causes is needed, which will lead to specific decisions on how to compensate for these differences in concentrations in data analysis. In the event that further investigations do not provide insight into the causes of systematic differences in concentrations between methods, the authors recommend continuing to collect and analyze paired environmental water samples by both old and new methods. 
This effort should be targeted to seasons, sites, and expected concentrations to supplement those concentrations already assessed and to compare the ongoing analytical recovery of old and new methods to those observed in the summer and fall of 2012.
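
    A paired statistical comparison of the kind this field study calls for might look like the following sketch; the simulated concentrations and the 30 percent recovery difference are illustrative, and the Wilcoxon signed-rank test is one reasonable choice for paired, non-normal concentration data, not necessarily the report's method.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)

# Hypothetical paired concentrations (ug/L) for one pesticide measured by the
# old (GC/MS) and new (direct aqueous-injection LC-MS/MS) methods on the same
# samples; the new method is simulated with ~30% higher recovery.
old = rng.lognormal(mean=-1.0, sigma=0.8, size=40)
new = old * 1.3 * rng.normal(1.0, 0.05, size=40)

# Paired nonparametric test for a systematic difference between methods.
stat, p = wilcoxon(old, new)
median_ratio = np.median(new / old)
print(f"Wilcoxon p-value: {p:.2e}, median new/old ratio: {median_ratio:.2f}")
```

    A small p-value with a median ratio far from 1 is the signature of the large, statistically significant between-method differences the report describes.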

  2. Development of a capillary electrophoresis method for the determination of the chiral purity of dextromethorphan by a dual selector system using quality by design methodology.

    PubMed

    Krait, Sulaiman; Heuermann, Matthias; Scriba, Gerhard K E

    2018-03-01

    Dextromethorphan is a centrally acting antitussive drug, while its enantiomer levomethorphan is an illicit drug with opioid analgesic effects. As capillary electrophoresis has proven to be a well-suited technique for enantiomer analysis, the present study was conducted in order to develop a capillary electrophoresis-based limit test for levomethorphan. The analytical target profile was defined as a method able to determine levomethorphan with acceptable precision and accuracy at the 0.1% level. From initial scouting experiments, a dual selector system consisting of sulfated β-cyclodextrin and methyl-α-cyclodextrin was identified. The critical process parameters were evaluated in a fractional factorial resolution IV design followed by a central composite face-centered design and Monte Carlo simulations for defining the design space of the method. The selected working conditions consisted of a 30/40.2 cm, 50 μm id fused-silica capillary, 30 mM sodium phosphate buffer, pH 6.5, 16 mg/mL sulfated β-cyclodextrin, and 14 mg/mL methyl-α-cyclodextrin at 20°C and 20 kV. The method was validated according to ICH guideline Q2(R1) and applied to the analysis of a capsule formulation. Furthermore, the apparent binding constants between the enantiomers and the cyclodextrins, as well as the complex mobilities, were determined to understand the migration behavior of the analytes. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
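
    The central composite face-centered design used here can be generated in a few lines; the sketch below builds the coded design points directly (three factors and three center replicates are assumptions for illustration, not the study's run plan).

```python
from itertools import product

def ccf_design(n_factors, n_center=3):
    """Face-centered central composite design (alpha = 1) in coded units:
    2^k factorial corners, 2k axial points on the cube faces, plus center
    replicates."""
    corners = [list(p) for p in product((-1, 1), repeat=n_factors)]
    axial = []
    for i in range(n_factors):
        for a in (-1, 1):
            pt = [0] * n_factors
            pt[i] = a
            axial.append(pt)
    center = [[0] * n_factors for _ in range(n_center)]
    return corners + axial + center

# e.g. three CPPs (selector concentrations and buffer pH, coded -1..1)
design = ccf_design(3)
print(len(design))  # 8 corners + 6 axial + 3 center = 17 runs
```

    The face-centered variant keeps all runs inside the original factor ranges, which is convenient when extreme selector concentrations are impractical.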

  3. Multivariate Approaches for Simultaneous Determination of Avanafil and Dapoxetine by UV Chemometrics and HPLC-QbD in Binary Mixtures and Pharmaceutical Product.

    PubMed

    2016-04-07

    Multivariate UV-spectrophotometric methods and Quality by Design (QbD) HPLC are described for the concurrent estimation of avanafil (AV) and dapoxetine (DP) in the binary mixture and in the dosage form. Chemometric methods have been developed, including classical least-squares, principal component regression, partial least-squares, and multiway partial least-squares. Analytical figures of merit, such as sensitivity, selectivity, analytical sensitivity, LOD, and LOQ, were determined. QbD consists of three steps, starting with a screening approach to determine the critical process parameters and response variables, followed by an understanding of factors and levels, and lastly the application of a Box-Behnken design containing four critical factors that affect the method. From an Ishikawa diagram and a risk assessment tool, four main factors were selected for optimization. Design optimization, statistical calculation, and final-condition optimization of all the reactions were carried out. Twenty-five experiments were done, and a quadratic model was used for all response variables. Desirability plots, surface plots, the design space, and three-dimensional plots were calculated. In the optimized condition, HPLC separation was achieved on a Phenomenex Gemini C18 column (250 × 4.6 mm, 5 μm) using acetonitrile-buffer (ammonium acetate buffer at pH 3.7 with acetic acid) as the mobile phase at a flow rate of 0.7 mL/min. Quantification was done at 239 nm, and the temperature was set at 20°C. The developed methods were validated and successfully applied for the simultaneous determination of AV and DP in the dosage form.
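
    Of the chemometric methods listed, classical least-squares is the simplest to sketch: a binary mixture spectrum is modeled as a linear combination of pure-component spectra (Beer-Lambert), and the concentrations are recovered by least squares. The spectra and concentrations below are synthetic, not the paper's data.

```python
import numpy as np

# Illustrative classical least-squares (CLS) estimation for a binary mixture:
# absorbance A = S @ c, with S the matrix of pure-component spectra.
wavelengths = np.linspace(220, 320, 51)
s_av = np.exp(-((wavelengths - 240) / 15) ** 2)   # hypothetical avanafil band
s_dp = np.exp(-((wavelengths - 290) / 20) ** 2)   # hypothetical dapoxetine band
S = np.column_stack([s_av, s_dp])

c_true = np.array([0.6, 0.4])  # "unknown" concentrations (arbitrary units)
A = S @ c_true + np.random.default_rng(2).normal(0, 1e-3, wavelengths.size)

# CLS: solve the overdetermined system in the least-squares sense.
c_est, *_ = np.linalg.lstsq(S, A, rcond=None)
print(np.round(c_est, 3))
```

    PCR and PLS replace the pure-spectra matrix with latent variables estimated from calibration mixtures, which is why they cope better with unknown interferents.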

  4. Analytic Method for Computing Instrument Pointing Jitter

    NASA Technical Reports Server (NTRS)

    Bayard, David

    2003-01-01

    A new method of calculating the root-mean-square (rms) pointing jitter of a scientific instrument (e.g., a camera, radar antenna, or telescope) is introduced based on a state-space concept. In comparison with the prior method of calculating the rms pointing jitter, the present method involves significantly less computation. The rms pointing jitter of an instrument (the square root of the jitter variance shown in the figure) is an important physical quantity which impacts the design of the instrument, its actuators, controls, sensory components, and sensor- output-sampling circuitry. Using the Sirlin, San Martin, and Lucke definition of pointing jitter, the prior method of computing the rms pointing jitter involves a frequency-domain integral of a rational polynomial multiplied by a transcendental weighting function, necessitating the use of numerical-integration techniques. In practice, numerical integration complicates the problem of calculating the rms pointing error. In contrast, the state-space method provides exact analytic expressions that can be evaluated without numerical integration.
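
    The contrast between the two approaches can be illustrated on a toy first-order model with white-noise input, where the steady-state variance is computed both by a frequency-domain integral and by a Lyapunov (state-space) equation. This sketch uses plain steady-state variance rather than the Sirlin, San Martin, and Lucke jitter definition with its transcendental weighting, so it only illustrates the general idea.

```python
import numpy as np
from scipy.integrate import quad
from scipy.linalg import solve_continuous_lyapunov

# Toy first-order pointing model x' = -a*x + w, with white-noise intensity q.
a, q = 2.0, 0.5

# 1) Frequency domain: (1/2pi) * integral of |H(jw)|^2 * q over all w, with
#    H(s) = 1/(s + a). In general this requires numerical integration.
integrand = lambda w: q / (w ** 2 + a ** 2)
var_freq = quad(integrand, -np.inf, np.inf)[0] / (2.0 * np.pi)

# 2) State space: solve the Lyapunov equation A P + P A^T + Q = 0 exactly.
A = np.array([[-a]])
Q = np.array([[q]])
P = solve_continuous_lyapunov(A, -Q)
var_ss = P[0, 0]

print(var_freq, var_ss, q / (2 * a))  # all equal q/(2a)
```

    Both routes give q/(2a), but the state-space route is a closed-form matrix solve, which is the computational advantage the abstract describes.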

  5. Spectrophotometric determination of sulphate in automotive fuel ethanol by sequential injection analysis using dimethylsulphonazo(III) reaction.

    PubMed

    de Oliveira, Fabio Santos; Korn, Mauro

    2006-01-15

    A sensitive SIA method was developed for sulphate determination in automotive fuel ethanol. The method was based on the reaction of sulphate with barium-dimethylsulphonazo(III), leading to a decrease in the magnitude of the analytical signal monitored at 665 nm. Alcohol fuel samples were combusted beforehand to avoid matrix effects in the sulphate determinations. Binary sampling and stop-flow strategies were used to increase the sensitivity of the method. The optimization of the analytical parameters was performed by the response surface method using Box-Behnken and central composite designs. The proposed sequential flow procedure permits determination of up to 10.0 mg SO4(2-) l(-1) with R.S.D. < 2.5% and a limit of detection of 0.27 mg l(-1). The method has been successfully applied to sulphate determination in automotive fuel alcohol, and the results agreed with the reference volumetric method. In the optimized condition the SIA system processed 27 samples per hour.
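
    The response-surface step rests on fitting a quadratic model to designed experiments and locating its stationary point; a minimal sketch with a synthetic two-factor data set (not the paper's chemistry) follows.

```python
import numpy as np

# Fit a two-factor quadratic response surface (the model form behind
# Box-Behnken / central composite optimization) by ordinary least squares.
# Factors could be, e.g., reagent concentration and stop-flow time in coded
# units; the response values are synthetic.
rng = np.random.default_rng(3)
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
y = 10 + 2 * x1 - 1.5 * x2 - 3 * x1 ** 2 - 2 * x2 ** 2 + 0.5 * x1 * x2
y = y + rng.normal(0, 0.05, y.size)

# Design matrix for the full quadratic model.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface (candidate optimum): solve grad = 0.
H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
g = -np.array([beta[1], beta[2]])
x_opt = np.linalg.solve(H, g)
print(np.round(beta, 2), np.round(x_opt, 3))
```

    When the Hessian H is negative definite, the stationary point is the maximum of the fitted response, i.e. the optimized condition.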

  6. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    PubMed

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, with the aim of building quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least-squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and elimination of a difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.

  7. Effect of vibration on retention characteristics of screen acquisition systems. [for surface tension propellant acquisition

    NASA Technical Reports Server (NTRS)

    Tegart, J. R.; Aydelott, J. C.

    1978-01-01

    The design of surface tension propellant acquisition systems using fine-mesh screen must take into account all factors that influence the liquid pressure differentials within the system. One of those factors is spacecraft vibration. Analytical models to predict the effects of vibration have been developed. A test program to verify the analytical models and to allow a comparative evaluation of the parameters influencing the response to vibration was performed. Screen specimens were tested under conditions simulating the operation of an acquisition system, considering the effects of such parameters as screen orientation and configuration, screen support method, screen mesh, liquid flow and liquid properties. An analytical model, based on empirical coefficients, was most successful in predicting the effects of vibration.

  8. Reduction of movement resistance force of pipeline in horizontal curved well at stage of designing underground passage

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, S. Yu

    2018-05-01

    A method has been developed to reduce the resistance to movement of a pipeline in a horizontal curved well in the construction of underground passages using trenchless technologies. The method can be applied at the design stage. The idea of the proposed method consists of approximating the trajectory of the designed trenchless passage to the equilibrium profile. It has been proved that, in order to reduce the resistance to movement of the pipeline arising from contact with the borehole wall, the profile of its initial and final sections must correspond to a parabola or a hyperbolic cosine curve, depending on the initial conditions. Analytical dependences are obtained which allow the methods of calculating traction effort in trenchless construction to be supplemented for the case when the profile of the well is given by an arbitrary function.

  9. A Critical Look at the Cross Impact Matrix Method. A Research Report.

    ERIC Educational Resources Information Center

    Folk, Michael

    This paper explains some of the problems with the Cross-Impact Matrix (CIM) method and their importance to its application. The CIM is a research method designed to serve as a heuristic device to enhance a person's ability to think about the future and as an analytical device to be used by planners to help in actually forecasting future occurrences.…

  10. Measurement of Henry's Law Constants Using Internal Standards: A Quantitative GC Experiment for the Instrumental Analysis or Environmental Chemistry Laboratory

    ERIC Educational Resources Information Center

    Ji, Chang; Boisvert, Susanne M.; Arida, Ann-Marie C.; Day, Shannon E.

    2008-01-01

    An internal standard method applicable to undergraduate instrumental analysis or environmental chemistry laboratory has been designed and tested to determine the Henry's law constants for a series of alkyl nitriles. In this method, a mixture of the analytes and an internal standard is prepared and used to make a standard solution (organic solvent)…

  11. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  12. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  13. Recent theoretical developments and experimental studies pertinent to vortex flow aerodynamics - With a view towards design

    NASA Technical Reports Server (NTRS)

    Lamar, J. E.; Luckring, J. M.

    1978-01-01

    A review is presented of recent progress in a research program directed towards the development of an improved vortex-flow technology base. It is pointed out that separation induced vortex-flows from the leading and side edges play an important role in the high angle-of-attack aerodynamic characteristics of a wide range of modern aircraft. In the analysis and design of high-speed aircraft, a detailed knowledge of this type of separation is required, particularly with regard to critical wind loads and the stability and performance at various off-design conditions. A description of analytical methods is presented. The theoretical methods employed are divided into two classes which are dependent upon the underlying aerodynamic assumptions. One conical flow method is considered along with three different nonconical flow methods. Comparisons are conducted between the described methods and available aerodynamic data. Attention is also given to a vortex flow drag study and a vortex flow wing design using suction analogy.

  14. An analytical design procedure for the determination of effective leading edge extensions on thick delta wings

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Chaturvedi, S. K.

    1984-01-01

    An analytical design procedure for leading edge extensions (LEE) was developed for thick delta wings. This LEE device is designed to be mounted to a wing along the pseudo-stagnation stream surface associated with the attached flow design lift coefficient of greater than zero. The intended purpose of this device is to improve the aerodynamic performance of high subsonic and low supersonic aircraft at incidences above that of attached flow design lift coefficient, by using a vortex system emanating along the leading edges of the device. The low pressure associated with these vortices would act on the LEE upper surface and the forward facing area at the wing leading edges, providing an additional lift and effective leading edge thrust recovery. The first application of this technique was to a thick, round edged, twisted and cambered wing of approximately triangular planform having a sweep of 58 deg and aspect ratio of 2.30. The panel aerodynamics and vortex lattice method with suction analogy computer codes were employed to determine the pseudo-stagnation stream surface and an optimized LEE planform shape.

  15. Analytical expression for position sensitivity of linear response beam position monitor having inter-electrode cross talk

    NASA Astrophysics Data System (ADS)

    Kumar, Mukesh; Ojha, A.; Garg, A. D.; Puntambekar, T. A.; Senecha, V. K.

    2017-02-01

    According to the quasi-electrostatic model of a linear response capacitive beam position monitor (BPM), the position sensitivity of the device depends only on the aperture of the device and is independent of processing frequency and load impedance. In practice, however, due to the inter-electrode capacitive coupling (cross talk), the actual position sensitivity of the device decreases with increasing frequency and load impedance. We have taken the inter-electrode capacitance into account to derive and propose a new analytical expression for the position sensitivity as a function of frequency and load impedance. The sensitivity of a linear response shoe-box type BPM has been obtained through simulation using CST Studio Suite to verify and confirm the validity of the new analytical equation. Good agreement between the simulation results and the new analytical expression suggests that this method can be exploited for the proper design of BPMs.
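
    A toy signal-mixing model (not the paper's quasi-electrostatic derivation) illustrates why cross talk between electrodes lowers difference-over-sum sensitivity: if a fraction k of each electrode signal leaks to the opposite electrode, the sensitivity scales by (1 - k)/(1 + k).

```python
# Toy illustration of cross-talk degrading BPM sensitivity; all quantities
# are in normalized units and the linear mixing model is an assumption.
def position_signal(x, coupling=0.0):
    """Ideal electrode signals vary linearly with beam offset x (in units of
    the half-aperture); a fraction `coupling` of each signal leaks to the
    opposite electrode before processing."""
    v_right = 1.0 + x
    v_left = 1.0 - x
    vr = v_right + coupling * v_left
    vl = v_left + coupling * v_right
    return (vr - vl) / (vr + vl)  # difference-over-sum position estimate

for k in (0.0, 0.1, 0.3):
    # slope of the response = sensitivity; algebra gives (1 - k) / (1 + k)
    sens = position_signal(0.01, coupling=k) / 0.01
    print(f"coupling={k:.1f}: sensitivity={sens:.3f}")
```

    In the paper the coupling fraction is frequency- and load-dependent, which is what makes the measured sensitivity fall with increasing frequency and load impedance.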

  16. Critical Thinking in Social Work Education: A Research Synthesis

    ERIC Educational Resources Information Center

    Samson, Patricia L.

    2016-01-01

    In a meta-analytic review of critical thinking in social work education, findings revealed variability in research designs, methods, and subsequent findings. The 10 studies reviewed assessed different components of critical thinking and highlighted different potential moderator variables. Although there are significant limitations to all the…

  17. Experimental performance and acoustic investigation of modern, counterrotating blade concepts

    NASA Technical Reports Server (NTRS)

    Hoff, G. E.

    1990-01-01

    The aerodynamic, acoustic, and aeromechanical performance of counterrotating blade concepts were evaluated both theoretically and experimentally. Analytical methods development and design are addressed. Utilizing the analytical methods which evolved during the conduct of this work, aerodynamic and aeroacoustic predictions were developed, which were compared to NASA and GE wind tunnel test results. The detailed mechanical design and fabrication of five different composite shell/titanium spar counterrotating blade set configurations are presented. Design philosophy, analyses methods, and material geometry are addressed, as well as the influence of aerodynamics, aeromechanics, and aeroacoustics on the design procedures. Blade fabrication and quality control procedures are detailed; bench testing procedures and results of blade integrity verification are presented; and instrumentation associated with the bench testing also is identified. Additional hardware to support specialized testing is described, as are operating blade instrumentation and the associated stress limits. The five counterrotating blade concepts were scaled to a tip diameter of 2 feet, so they could be incorporated into MPS (model propulsion simulators). Aerodynamic and aeroacoustic performance testing was conducted in the NASA Lewis 8 x 6 supersonic and 9 x 15 V/STOL (vertical or short takeoff and landing) wind tunnels and in the GE freejet anechoic test chamber (Cell 41) to generate an experimental data base for these counterrotating blade designs. Test facility and MPS vehicle matrices are provided, and test procedures are presented. Effects on performance of rotor-to-rotor spacing, angle-of-attack, pylon proximity, blade number, reduced-diameter aft blades, and mismatched rotor speeds are addressed. Counterrotating blade and specialized aeromechanical hub stability test results are also furnished.

  18. Analytical Solution for Optimum Design of Furrow Irrigation Systems

    NASA Astrophysics Data System (ADS)

    Kiwan, M. E.

    1996-05-01

    An analytical solution for the optimum design of furrow irrigation systems is derived. The non-linear calculus optimization method is used to formulate a general form for designing the optimum system elements, with the objective of maximizing the water application efficiency of the system during irrigation. Different system bases and constraints are considered in the solution. A full irrigation water depth is considered to be achieved at the tail of the furrow line. The solution is based on neglecting the recession and depletion times after off-irrigation. This assumption is valid for open-end (free gradient) furrow systems rather than closed-end (closed dike) systems. Illustrative examples for different systems are presented and the results are compared with the output obtained using an iterative numerical solution method. The final derived solution is expressed as a function of the furrow length ratio (the furrow length to the water travelling distance). The water-travel function developed by Reddy et al. is used to reach the optimum solution. As a practical result of the study, the optimum furrow elements for free-gradient systems, i.e. furrow length, water inflow rate and cutoff irrigation time, can be estimated so as to achieve the maximum application efficiency.

  19. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  20. Analytical and Experimental Evaluation of the Heat Transfer Distribution over the Surfaces of Turbine Vanes

    NASA Technical Reports Server (NTRS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-01-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.
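
    Codes like the modified STAN5 embed empirical heat transfer correlations; a classic flat-plate example of the genre (far simpler than the coupled correlations described above, used here only as an illustration) is the turbulent local Nusselt correlation Nu_x = 0.0296 Re_x^0.8 Pr^(1/3).

```python
def local_nusselt_turbulent(re_x, pr):
    """Classic turbulent flat-plate local Nusselt correlation,
    Nu_x = 0.0296 Re_x^0.8 Pr^(1/3), illustrative of the kind of empirical
    correlation a boundary layer code embeds (STAN5 itself uses far more
    elaborate models accounting for pressure gradient, curvature, etc.)."""
    return 0.0296 * re_x ** 0.8 * pr ** (1.0 / 3.0)

# Heat transfer coefficient at one chordwise station (illustrative numbers).
re_x, pr = 5.0e5, 0.7        # local Reynolds and Prandtl numbers
k_air, x = 0.026, 0.05       # thermal conductivity W/(m K), distance m
nu = local_nusselt_turbulent(re_x, pr)
h = nu * k_air / x           # local heat transfer coefficient, W/(m^2 K)
print(round(nu, 1), round(h, 1))
```

    The improvement the abstract reports comes precisely from replacing such bare correlations with ones sensitized to pressure gradient, curvature, and free-stream turbulence.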

  1. Analytical and experimental evaluation of the heat transfer distribution over the surfaces of turbine vanes

    NASA Astrophysics Data System (ADS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-05-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.

  2. Magnetic Nanoparticles for Antibiotics Detection

    PubMed Central

    Cristea, Cecilia; Tertis, Mihaela; Galatus, Ramona

    2017-01-01

    Widespread use of antibiotics has led to pollution of waterways, potentially creating resistance among freshwater bacterial communities. Microorganisms resistant to commonly prescribed antibiotics (superbugs) have dramatically increased over the last decades. The presence of antibiotics in water, food and beverages, in both their un-metabolized and metabolized forms, is of concern for humans: daily exposure in small quantities can accumulate and lead to the development of drug resistance to antibiotics, or multiply the risk of allergic reaction. Conventional analytical methods used to quantify antibiotics are relatively expensive and generally require long analysis times, along with the difficulty of performing field analyses. In this context, electrochemical and optical sensing devices are of interest, offering great potential for a broad range of analytical applications. This review focuses on the application of magnetic nanoparticles in the design of different analytical methods, mainly sensors, used for the detection of antibiotics in different matrices (human fluids, environmental, food and beverage samples). PMID:28538684

  3. Evaluation on Bending Properties of Biomaterial GUM Metal Meshed Plates for Bone Graft Applications

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiromichi; He, Jianmei

    2017-11-01

    There are three bone graft methods for bone defects caused by diseases such as cancer and accident injuries: autogenous bone grafts, allografts, and artificial bone grafts. In this study, meshed GUM Metal plates with lower elasticity, high strength, and high biocompatibility are introduced to solve the over-stiffness and weight problems of ready-made metal implants. Basic mesh shapes are designed and applied to GUM Metal plates using 3D CAD modeling tools. Bending properties of prototype meshed GUM Metal plates are evaluated experimentally and analytically. Meshed plate specimens of 180°, 120° and 60° axis-symmetrical types were fabricated for 3-point bending tests. The pseudo bending elastic moduli of the meshed plate specimens obtained from the 3-point bending tests range from 4.22 GPa to 16.07 GPa, within the elasticity range of natural cortical bone (2.0 GPa to 30.0 GPa). The analytical approach is validated by comparing analytical and experimental results for the bending properties of the meshed plates.
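
    The pseudo bending elastic modulus reported in the record above can be recovered from 3-point bending data via the standard beam-theory relation E = (F/δ)·L³/(48·I). A minimal sketch follows; the specimen dimensions and load-deflection slope are hypothetical, not taken from the study.

```python
# Pseudo bending elastic modulus from a 3-point bending test.
# E = (F/d) * L^3 / (48 * I) for a simply supported beam loaded at mid-span;
# the specimen numbers below are invented for illustration.

def second_moment_rect(width_mm: float, thickness_mm: float) -> float:
    """I = b*h^3/12 for a rectangular cross-section (mm^4)."""
    return width_mm * thickness_mm**3 / 12.0

def pseudo_bending_modulus(slope_N_per_mm: float, span_mm: float, I_mm4: float) -> float:
    """Apparent elastic modulus (GPa) from the linear load-deflection slope."""
    E_MPa = slope_N_per_mm * span_mm**3 / (48.0 * I_mm4)  # N/mm^2 == MPa
    return E_MPa / 1000.0

I = second_moment_rect(width_mm=10.0, thickness_mm=1.0)
E = pseudo_bending_modulus(slope_N_per_mm=10.0, span_mm=30.0, I_mm4=I)
print(f"pseudo bending modulus = {E:.2f} GPa")
```

    For a meshed plate, the same slope-based calculation yields an "apparent" modulus because the cross-section is not solid, which is why the record calls it a pseudo modulus.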

  4. Experimental results of active control on a large structure to suppress vibration

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1991-01-01

    Three design methods, Linear Quadratic Gaussian with Loop Transfer Recovery (LQG/LTR), H-infinity, and mu-synthesis, are used to obtain compensators for suppressing the vibrations of a 10-bay vertical truss structure, a component typical of what may be used to build a large space structure. For the design process, the plant dynamic characteristics of the structure were determined experimentally using an identification method. The resulting compensators were implemented on a digital computer and tested for their ability to suppress the first bending mode response of the 10-bay vertical truss. Time histories of the measured motion are presented, and the modal damping values obtained during the experiments are compared with analytical predictions. The advantages and disadvantages of the various design methods are discussed.
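
    To illustrate the flavor of these compensator designs, here is a minimal LQR sketch, the state-feedback core of an LQG design (without the Kalman filter or loop-transfer recovery step), applied to a single lightly damped bending mode. The modal frequency, damping, and weighting matrices are hypothetical, not values from the truss experiment.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Minimal LQR sketch for one lightly damped bending mode.
# Modal frequency, damping, and weights are hypothetical.

wn, zeta = 2.0 * np.pi * 2.0, 0.005           # 2 Hz mode with 0.5% damping
A = np.array([[0.0, 1.0],
              [-wn**2, -2.0 * zeta * wn]])    # modal state-space form
B = np.array([[0.0], [1.0]])                  # actuator enters the velocity state

Q = np.diag([wn**2, 1.0])                     # weight displacement and velocity
R = np.array([[0.01]])                        # cheap control -> strong damping
P = solve_continuous_are(A, B, Q, R)          # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)               # optimal gain K = R^-1 B^T P

eig = np.linalg.eigvals(A - B @ K)            # closed-loop eigenvalues
zeta_cl = float(min(-eig.real / np.abs(eig))) # closed-loop damping ratio
print(f"open-loop damping {zeta:.3f} -> closed-loop damping {zeta_cl:.3f}")
```

    The experimental comparison in the record amounts to checking whether damping ratios like `zeta_cl`, predicted analytically, match the damping identified from measured time histories.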

  5. Analytical Method to Evaluate Failure Potential During High-Risk Component Development

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Communicating failure mode information during design and manufacturing is a crucial task for failure prevention. Most processes use Failure Modes and Effects types of analyses, as well as prior knowledge and experience, to determine the potential modes of failure a product might encounter during its lifetime. When new products are being considered and designed, this knowledge is expanded upon to help designers extrapolate from similar existing products and from the potential design tradeoffs. This paper makes use of the similarities and tradeoffs that exist between different failure modes based on the functionality of each component/product. In this light, a function-failure method is developed to help design new products with solutions for functions that eliminate or reduce the potential of a failure mode. The method is applied to a simplified rotating machinery example in this paper, and is proposed as a means to account for helicopter failure modes during design and production, addressing stringent safety and performance requirements for NASA applications.
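
    One way to sketch the linkage the record describes is to connect functions to historically observed failure modes through the components that solve those functions. The matrices, labels, and counts below are invented for illustration and are not from the paper.

```python
import numpy as np

# Hypothetical function-failure linkage via shared components:
# FC maps functions to components, CF maps components to observed failures,
# and their product maps functions directly to failure-mode counts.

functions  = ["transmit torque", "support shaft"]
components = ["gear", "bearing"]
failures   = ["fatigue crack", "spalling"]

# FC[i, j] = 1 if function i is solved by component j.
FC = np.array([[1, 0],
               [1, 1]])
# CF[j, k] = observed count of failure mode k on component j.
CF = np.array([[3, 0],
               [1, 2]])

# FF[i, k]: how often failure mode k occurred on components solving function i.
FF = FC @ CF
for i, fn in enumerate(functions):
    for k, fm in enumerate(failures):
        if FF[i, k]:
            print(f"{fn}: {fm} (count {FF[i, k]})")
```

    A designer choosing a solution for "support shaft" would then see which failure modes dominate that function's history and select components or features that mitigate them.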

  6. Liquid-cooling technology for gas turbines - Review and status

    NASA Technical Reports Server (NTRS)

    Van Fossen, G. J., Jr.; Stepka, F. S.

    1978-01-01

    After a brief review of past efforts involving the forced-convection cooling of gas turbines, the paper surveys the state of the art of the liquid cooling of gas turbines. Emphasis is placed on thermosyphon methods of cooling, including those utilizing closed, open, and closed-loop thermosyphons; other methods, including sweat, spray and stator cooling, are also discussed. The more significant research efforts, design data, correlations, and analytical methods are mentioned and voids in technology are summarized.

  7. Chemometric-assisted QuEChERS extraction method for post-harvest pesticide determination in fruits and vegetables

    NASA Astrophysics Data System (ADS)

    Li, Minmin; Dai, Chao; Wang, Fengzhong; Kong, Zhiqiang; He, Yan; Huang, Ya Tao; Fan, Bei

    2017-02-01

    An effective analysis method was developed based on a chemometric tool for the simultaneous quantification of five different post-harvest pesticides (2,4-dichlorophenoxyacetic acid (2,4-D), carbendazim, thiabendazole, iprodione, and prochloraz) in fruits and vegetables. In the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method, the factors and responses for optimization of the extraction and cleanup analyses were compared using the Plackett-Burman (P-B) screening design. Furthermore, the significant factors (toluene percentage, hydrochloric acid (HCl) percentage, and graphitized carbon black (GCB) amount) were optimized using a central composite design (CCD) combined with Derringer’s desirability function (DF). The limits of quantification (LOQs) were estimated to be 1.0 μg/kg for 2,4-D, carbendazim, thiabendazole, and prochloraz, and 1.5 μg/kg for iprodione in food matrices. The mean recoveries were in the range of 70.4-113.9% with relative standard deviations (RSDs) of less than 16.9% at three spiking levels. The measurement uncertainty of the analytical method was determined using the bottom-up approach, which yielded an average value of 7.6%. Carbendazim was most frequently found in real samples analyzed using the developed method. Consequently, the analytical method can serve as an advantageous and rapid tool for determination of five preservative pesticides in fruits and vegetables.
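
    The optimization step in the record combines several responses into a single score via Derringer's desirability function. A minimal sketch of that idea follows; the response targets, candidate settings, and numbers are hypothetical, not taken from the paper.

```python
import numpy as np

# Derringer's desirability: map each response onto [0, 1], then combine
# with a geometric mean. Targets and candidates below are invented.

def d_maximize(y, low, high, s=1.0):
    """Desirability rises from 0 at `low` to 1 at `high`."""
    return float(np.clip((y - low) / (high - low), 0.0, 1.0) ** s)

def d_minimize(y, low, high, s=1.0):
    """Desirability falls from 1 at `low` to 0 at `high`."""
    return float(np.clip((high - y) / (high - low), 0.0, 1.0) ** s)

def overall_D(ds):
    """Geometric mean; any zero desirability vetoes the setting."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

# Candidate extraction settings with predicted recovery (%) and RSD (%).
candidates = {
    "A": (95.0, 8.0),
    "B": (105.0, 14.0),
    "C": (80.0, 5.0),
}
scores = {
    name: overall_D([d_maximize(rec, 70.0, 110.0), d_minimize(rsd, 2.0, 17.0)])
    for name, (rec, rsd) in candidates.items()
}
best = max(scores, key=scores.get)
print(best, scores)
```

    In the actual study, the "candidates" would be points predicted by the central composite design's response-surface model, and the desirability is maximized over the factor space rather than over a short list.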

  8. Chemometric-assisted QuEChERS extraction method for post-harvest pesticide determination in fruits and vegetables

    PubMed Central

    Li, Minmin; Dai, Chao; Wang, Fengzhong; Kong, Zhiqiang; He, Yan; Huang, Ya Tao; Fan, Bei

    2017-01-01

    An effective analysis method was developed based on a chemometric tool for the simultaneous quantification of five different post-harvest pesticides (2,4-dichlorophenoxyacetic acid (2,4-D), carbendazim, thiabendazole, iprodione, and prochloraz) in fruits and vegetables. In the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method, the factors and responses for optimization of the extraction and cleanup analyses were compared using the Plackett–Burman (P–B) screening design. Furthermore, the significant factors (toluene percentage, hydrochloric acid (HCl) percentage, and graphitized carbon black (GCB) amount) were optimized using a central composite design (CCD) combined with Derringer’s desirability function (DF). The limits of quantification (LOQs) were estimated to be 1.0 μg/kg for 2,4-D, carbendazim, thiabendazole, and prochloraz, and 1.5 μg/kg for iprodione in food matrices. The mean recoveries were in the range of 70.4–113.9% with relative standard deviations (RSDs) of less than 16.9% at three spiking levels. The measurement uncertainty of the analytical method was determined using the bottom-up approach, which yielded an average value of 7.6%. Carbendazim was most frequently found in real samples analyzed using the developed method. Consequently, the analytical method can serve as an advantageous and rapid tool for determination of five preservative pesticides in fruits and vegetables. PMID:28225030

  9. Chemometric-assisted QuEChERS extraction method for post-harvest pesticide determination in fruits and vegetables.

    PubMed

    Li, Minmin; Dai, Chao; Wang, Fengzhong; Kong, Zhiqiang; He, Yan; Huang, Ya Tao; Fan, Bei

    2017-02-22

    An effective analysis method was developed based on a chemometric tool for the simultaneous quantification of five different post-harvest pesticides (2,4-dichlorophenoxyacetic acid (2,4-D), carbendazim, thiabendazole, iprodione, and prochloraz) in fruits and vegetables. In the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method, the factors and responses for optimization of the extraction and cleanup analyses were compared using the Plackett-Burman (P-B) screening design. Furthermore, the significant factors (toluene percentage, hydrochloric acid (HCl) percentage, and graphitized carbon black (GCB) amount) were optimized using a central composite design (CCD) combined with Derringer's desirability function (DF). The limits of quantification (LOQs) were estimated to be 1.0 μg/kg for 2,4-D, carbendazim, thiabendazole, and prochloraz, and 1.5 μg/kg for iprodione in food matrices. The mean recoveries were in the range of 70.4-113.9% with relative standard deviations (RSDs) of less than 16.9% at three spiking levels. The measurement uncertainty of the analytical method was determined using the bottom-up approach, which yielded an average value of 7.6%. Carbendazim was most frequently found in real samples analyzed using the developed method. Consequently, the analytical method can serve as an advantageous and rapid tool for determination of five preservative pesticides in fruits and vegetables.

  10. Engineering of a miniaturized, robotic clinical laboratory

    PubMed Central

    Nourse, Marilyn B.; Engel, Kate; Anekal, Samartha G.; Bailey, Jocelyn A.; Bhatta, Pradeep; Bhave, Devayani P.; Chandrasekaran, Shekar; Chen, Yutao; Chow, Steven; Das, Ushati; Galil, Erez; Gong, Xinwei; Gessert, Steven F.; Ha, Kevin D.; Hu, Ran; Hyland, Laura; Jammalamadaka, Arvind; Jayasurya, Karthik; Kemp, Timothy M.; Kim, Andrew N.; Lee, Lucie S.; Liu, Yang Lily; Nguyen, Alphonso; O'Leary, Jared; Pangarkar, Chinmay H.; Patel, Paul J.; Quon, Ken; Ramachandran, Pradeep L.; Rappaport, Amy R.; Roy, Joy; Sapida, Jerald F.; Sergeev, Nikolay V.; Shee, Chandan; Shenoy, Renuka; Sivaraman, Sharada; Sosa‐Padilla, Bernardo; Tran, Lorraine; Trent, Amanda; Waggoner, Thomas C.; Wodziak, Dariusz; Yuan, Amy; Zhao, Peter; Holmes, Elizabeth A.

    2018-01-01

    The ability to perform laboratory testing near the patient and with smaller blood volumes would benefit patients and physicians alike. We describe our design of a miniaturized clinical laboratory system with three components: a hardware platform (i.e., the miniLab) that performs preanalytical and analytical processing steps using miniaturized sample manipulation and detection modules, an assay‐configurable cartridge that provides consumable materials and assay reagents, and a server that communicates bidirectionally with the miniLab to manage assay‐specific protocols and analyze, store, and report results (i.e., the virtual analyzer). The miniLab can detect analytes in blood using multiple methods, including molecular diagnostics, immunoassays, clinical chemistry, and hematology. Analytical performance results show that our qualitative Zika virus assay has a limit of detection of 55 genomic copies/ml. For our anti‐herpes simplex virus type 2 immunoglobulin G, lipid panel, and lymphocyte subset panel assays, the miniLab has low imprecision, and method comparison results agree well with those from the United States Food and Drug Administration‐cleared devices. With its small footprint and versatility, the miniLab has the potential to provide testing of a range of analytes in decentralized locations. PMID:29376134

  11. Electro-thermal vaporization direct analysis in real time-mass spectrometry for water contaminant analysis during space missions.

    PubMed

    Dwivedi, Prabha; Gazda, Daniel B; Keelor, Joel D; Limero, Thomas F; Wallace, William T; Macatangay, Ariel V; Fernández, Facundo M

    2013-10-15

    The development of a direct analysis in real time-mass spectrometry (DART-MS) method and first prototype vaporizer for the detection of low molecular weight (∼30-100 Da) contaminants representative of those detected in water samples from the International Space Station is reported. A temperature-programmable, electro-thermal vaporizer (ETV) was designed, constructed, and evaluated as a sampling interface for DART-MS. The ETV facilitates analysis of water samples with minimum user intervention while maximizing analytical sensitivity and sample throughput. The integrated DART-ETV-MS methodology was evaluated in both positive and negative ion modes to (1) determine experimental conditions suitable for coupling DART with ETV as a sample inlet and ionization platform for time-of-flight MS, (2) to identify analyte response ions, (3) to determine the detection limit and dynamic range for target analyte measurement, and (4) to determine the reproducibility of measurements made with the method when using manual sample introduction into the vaporizer. Nitrogen was used as the DART working gas, and the target analytes chosen for the study were ethyl acetate, acetone, acetaldehyde, ethanol, ethylene glycol, dimethylsilanediol, formaldehyde, isopropanol, methanol, methylethyl ketone, methylsulfone, propylene glycol, and trimethylsilanol.

  12. Olive oil authentication: A comparative analysis of regulatory frameworks with especial emphasis on quality and authenticity indices, and recent analytical techniques developed for their assessment. A review.

    PubMed

    Bajoub, Aadil; Bendini, Alessandra; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría

    2018-03-24

    Over the last decades, olive oil quality and authenticity control has become an issue of great importance to consumers, suppliers, retailers, and regulators in both traditional and emerging olive oil producing countries, mainly due to the increasing worldwide popularity and the trade globalization of this product. Thus, in order to ensure olive oil authentication, various national and international laws and regulations have been adopted, although some of them are actually causing an enormous debate about the risk that they can represent for the harmonization of international olive oil trade standards. Within this context, this review was designed to provide a critical overview and comparative analysis of selected regulatory frameworks for olive oil authentication, with special emphasis on the quality and purity criteria considered by these regulation systems, their thresholds and the analytical methods employed for monitoring them. To complete the general overview, recent analytical advances to overcome drawbacks and limitations of the official methods to evaluate olive oil quality and to determine possible adulterations were reviewed. Furthermore, the latest trends on analytical approaches to assess the olive oil geographical and varietal origin traceability were also examined.

  13. Engineering of a miniaturized, robotic clinical laboratory.

    PubMed

    Nourse, Marilyn B; Engel, Kate; Anekal, Samartha G; Bailey, Jocelyn A; Bhatta, Pradeep; Bhave, Devayani P; Chandrasekaran, Shekar; Chen, Yutao; Chow, Steven; Das, Ushati; Galil, Erez; Gong, Xinwei; Gessert, Steven F; Ha, Kevin D; Hu, Ran; Hyland, Laura; Jammalamadaka, Arvind; Jayasurya, Karthik; Kemp, Timothy M; Kim, Andrew N; Lee, Lucie S; Liu, Yang Lily; Nguyen, Alphonso; O'Leary, Jared; Pangarkar, Chinmay H; Patel, Paul J; Quon, Ken; Ramachandran, Pradeep L; Rappaport, Amy R; Roy, Joy; Sapida, Jerald F; Sergeev, Nikolay V; Shee, Chandan; Shenoy, Renuka; Sivaraman, Sharada; Sosa-Padilla, Bernardo; Tran, Lorraine; Trent, Amanda; Waggoner, Thomas C; Wodziak, Dariusz; Yuan, Amy; Zhao, Peter; Young, Daniel L; Robertson, Channing R; Holmes, Elizabeth A

    2018-01-01

    The ability to perform laboratory testing near the patient and with smaller blood volumes would benefit patients and physicians alike. We describe our design of a miniaturized clinical laboratory system with three components: a hardware platform (i.e., the miniLab) that performs preanalytical and analytical processing steps using miniaturized sample manipulation and detection modules, an assay-configurable cartridge that provides consumable materials and assay reagents, and a server that communicates bidirectionally with the miniLab to manage assay-specific protocols and analyze, store, and report results (i.e., the virtual analyzer). The miniLab can detect analytes in blood using multiple methods, including molecular diagnostics, immunoassays, clinical chemistry, and hematology. Analytical performance results show that our qualitative Zika virus assay has a limit of detection of 55 genomic copies/ml. For our anti-herpes simplex virus type 2 immunoglobulin G, lipid panel, and lymphocyte subset panel assays, the miniLab has low imprecision, and method comparison results agree well with those from the United States Food and Drug Administration-cleared devices. With its small footprint and versatility, the miniLab has the potential to provide testing of a range of analytes in decentralized locations.

  14. Combined Numerical/Analytical Perturbation Solutions of the Navier-Stokes Equations for Aerodynamic Ejector/Mixer Nozzle Flows

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence Justin

    1998-01-01

    In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development are reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary-layer methods but provides important two-dimensional information not available using quasi-1-D approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier-Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast that it may be used either as a subroutine or called by a design optimization routine. The models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M. These experiments were performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements, and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.

  15. Validation of the enthalpy method by means of analytical solution

    NASA Astrophysics Data System (ADS)

    Kleiner, Thomas; Rückamp, Martin; Bondzio, Johannes; Humbert, Angelika

    2014-05-01

    In recent years, numerical ice-flow simulations have moved from describing the cold-temperate transition surface (CTS) explicitly towards an enthalpy description, which avoids incorporating a singular surface inside the model (Aschwanden et al., 2012). In enthalpy methods the CTS is represented as a level set of the enthalpy state variable. This method has several numerical and practical advantages (e.g., representation of the full energy by one scalar field, no restriction on the topology and shape of the CTS). The method is rather new in glaciology and, to our knowledge, has not been verified and validated against analytical solutions. Unfortunately, analytical solutions are still lacking for sufficiently complex thermo-mechanically coupled polythermal ice flow. However, we present two experiments to test the implementation of the enthalpy equation and the corresponding boundary conditions. The first experiment tests in particular the functionality of the boundary condition scheme and the corresponding basal melt rate calculation. Depending on the thermal situation at the base, the numerical code may have to switch to another boundary type (from Neumann to Dirichlet or vice versa). The main idea of this set-up is to test reversibility during transients: a formerly cold ice body that has run through a warmer period, with an associated build-up of a liquid water layer at the base, must be able to return to its initial steady state. Since we impose several assumptions on the experiment design, analytical solutions can be formulated for different quantities during distinct stages of the simulation. The second experiment tests the positioning of the internal CTS in a parallel-sided polythermal slab. We compare our simulation results to the analytical solution proposed by Greve and Blatter (2009). Results from three different ice flow models (COMIce, ISSM, TIMFD3) are presented.
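
    The level-set idea in the record can be sketched in one dimension: given a discrete enthalpy profile through an ice column, the CTS is located where the enthalpy crosses the pressure-melting enthalpy. The profile and material constants below are illustrative, not from the paper, and the pressure dependence of the melting point is ignored.

```python
import numpy as np

# Locate the cold-temperate transition surface (CTS) as the level set where
# the enthalpy field crosses the pressure-melting enthalpy.
# The linear profile below is invented for illustration.

c_ice = 2009.0          # specific heat of ice, J kg^-1 K^-1 (typical value)
T_pmp = 273.15          # melting temperature, K (pressure effect ignored)
E_pmp = c_ice * T_pmp   # enthalpy of ice at the melting point with this datum

z = np.linspace(0.0, 200.0, 201)       # height above the bed, m
E = E_pmp + 2000.0 - 40.0 * z          # temperate near the bed, cold above

def cts_position(z, E, E_pmp):
    """Linear interpolation of the first crossing E(z) = E_pmp from below."""
    below = E >= E_pmp                 # temperate nodes
    k = int(np.argmax(~below))         # first cold node above the bed
    z0, z1, e0, e1 = z[k - 1], z[k], E[k - 1], E[k]
    return z0 + (e0 - E_pmp) * (z1 - z0) / (e0 - e1)

print(f"CTS at z = {cts_position(z, E, E_pmp):.1f} m above the bed")
```

    In a full model the enthalpy field comes from solving the enthalpy transport equation, but the CTS extraction step is exactly this kind of level-set interpolation, with no explicit surface tracked by the numerics.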

  16. Interactive program for analysis and design problems in advanced composites technology

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Swedlow, J. L.

    1971-01-01

    During the past year an experimental program in the fracture of advanced fiber composites has been completed. The experimental program has given direction to additional experimental and theoretical work. A synthesis program for designing low weight multifastener joints in composites is proposed, based on extensive analytical background. A number of failed joints have been thoroughly analyzed to evaluate the failure hypothesis used in the synthesis procedure. Finally, a new solution is reported for isotropic and anisotropic laminates using the boundary-integral method. The solution method offers significant savings of computer core and time for important problems.

  17. Conceptualizing the Critical Path Linked by Teacher Commitment

    ERIC Educational Resources Information Center

    Sun, Jingping

    2015-01-01

    Purpose: The purpose of this paper is to propose a critical path through which school leadership travels to students by highlighting the importance of teacher commitment. Design/methodology/approach: Using both meta-analytic and narrative review methods, this paper systematically reviews the evidence in the past 20 years about the…

  18. An Application of Latent Variable Structural Equation Modeling for Experimental Research in Educational Technology

    ERIC Educational Resources Information Center

    Lee, Hyeon Woo

    2011-01-01

    As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…

  19. National Functional Guidelines for Inorganic Superfund Methods Data Review (ISM02.4)

    EPA Pesticide Factsheets

    This document is designed to assist the reviewer in evaluating (a) whether the analytical data meet the technical and Quality Control (QC) criteria specified in the SOW, and (b) the usability and extent of bias of any data that do not meet these criteria.

  20. Talking about Texts: Middle School Students' Engagement in Metalinguistic Talk

    ERIC Educational Resources Information Center

    D'warte, Jacqueline

    2012-01-01

    In this paper, discourse analytical methods are applied to data from two middle school classrooms as a teacher, researcher, and students engage in research-based curricula (Martinez, Orellana, Pacheco, & Carbone, 2008; Orellana & Reynolds, 2008) designed to leverage students' language brokering skills and facilitate discussion about languages.…

  1. Teaching Critical Analytical Methods in the Digital Typography Classroom.

    ERIC Educational Resources Information Center

    Gibson, Michael

    1997-01-01

    Describes a studio project designed to help students (1) utilize the digital environment to organize typography and images that represent the socio-political context their solutions were required to identify; and (2) explore the empirical variables that help readers to access and contemplate the content presented by their text. (PA)

  2. Level of Environmental Awareness of Students in Republic of Serbia

    ERIC Educational Resources Information Center

    Maravic, Milutin; Cvjeticanin, Stanko; Ivkovic, Sonja

    2014-01-01

    This research was designed to determine and analyze the level of environmental awareness of students from primary and secondary schools. Environmental awareness is an essential product of environmental education. The research employed analytical and descriptive methods. The research instrument was a survey designed for…

  3. Effectiveness of Motivational Interviewing Interventions for Adolescent Substance Use Behavior Change: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Jensen, Chad D.; Cushing, Christopher C.; Aylward, Brandon S.; Craig, James T.; Sorell, Danielle M.; Steele, Ric G.

    2011-01-01

    Objective: This study was designed to quantitatively evaluate the effectiveness of motivational interviewing (MI) interventions for adolescent substance use behavior change. Method: Literature searches of electronic databases were undertaken in addition to manual reference searches of identified review articles. Databases searched include…

  4. Deepening Instructional Reform through System Monitoring

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2008-01-01

    This article describes the system of monitoring district instructional-reform efforts and the influences of the system on teachers and school and district leaders in Duval County, Florida. First, the author offers a brief description of the snapshot system. Second, the study's design, data sources, and analytic methods are presented. Third, the…

  5. In-line micro-matrix solid-phase dispersion extraction for simultaneous separation and extraction of Sudan dyes in different spices.

    PubMed

    Rajabi, Maryam; Sabzalian, Sedigheh; Barfi, Behruz; Arghavani-Beydokhti, Somayeh; Asghari, Alireza

    2015-12-18

    A novel, simple, fast, and miniaturized method, termed in-line micro-matrix solid-phase dispersion (in-line MMSPD), coupled with high performance liquid chromatography (HPLC), was developed for the simultaneous extraction and determination of Sudan dyes (i.e. Sudan I-IV, Sudan orange G, Sudan black B, and Sudan red G) with the aid of an experimental design strategy. In this method, a matrix solid-phase dispersion (MSPD) column including a suitable mixture of polar sorbents was inserted in the mobile-phase pathway; while the interfering compounds were retained, the analytes were eluted and entered the analytical column. In this way, the extraction, elution, and separation of the analytes were performed sequentially. Under the optimal experimental conditions (amount of sample, 0.0426 g; amount of dispersant phase, 0.0216 g of florisil, 0.0227 g of silica, and 0.0141 g of alumina; blending time, 112 s), the limits of detection (LODs), limits of quantification, linear dynamic ranges, and recoveries were 0.3-15.3 μg/kg, 1-50 μg/kg, 50-28,000 μg/kg, and 94.5-99.1%, respectively. The results showed that in-line MMSPD, a sufficiently sensitive, simple, and analytically validated method, may offer a suitable screening approach for the determination of the selected Sudan dyes in food samples, useful for food analysis and adulteration detection. Copyright © 2015 Elsevier B.V. All rights reserved.
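
    Figures of merit like the LODs and LOQs quoted in the record are commonly estimated from the calibration line via the ICH-style relations LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. A minimal sketch, with invented calibration data:

```python
import numpy as np

# ICH-style LOD/LOQ estimation from a calibration line.
# The concentration levels and responses below are invented.

conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])          # spiked level, ug/kg
signal = np.array([12.1, 55.8, 110.4, 276.0, 549.2])   # detector response

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual SD, 2 fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.2f} ug/kg, LOQ = {loq:.2f} ug/kg")
```

    Other estimation routes exist (signal-to-noise, blank-based), which is one reason reported LODs for the same method can differ between laboratories.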

  6. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    PubMed

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has governed, since that time, the transfer of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternate strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. It is conducted with the concern to control the most important consumer's risks involved at two levels in analytical decisions within transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards an increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
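
    The equivalence-testing comparison at the heart of the comparative testing strategy can be sketched as two one-sided t-tests (TOST) on the difference between sending- and receiving-laboratory means. The assay results and the ±2% acceptance limit below are invented for the example, not values from the article.

```python
import numpy as np
from scipy import stats

# TOST equivalence test between two laboratories' assay results.
# Data and the +/-2% acceptance limit are invented for illustration.

sending   = np.array([99.2, 100.1, 99.8, 100.4, 99.6, 100.0])   # % label claim
receiving = np.array([99.9, 100.6, 100.2, 100.8, 100.1, 100.4])
theta = 2.0   # equivalence margin, % label claim

n1, n2 = len(sending), len(receiving)
diff = receiving.mean() - sending.mean()
sp2 = ((n1 - 1) * sending.var(ddof=1)
       + (n2 - 1) * receiving.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
df = n1 + n2 - 2

# Both one-sided tests must reject at alpha = 5%; equivalently, the 90%
# confidence interval for the difference must lie inside (-theta, +theta).
p_lower = stats.t.sf((diff + theta) / se, df)   # H0: diff <= -theta
p_upper = stats.t.cdf((diff - theta) / se, df)  # H0: diff >= +theta
p_tost = max(p_lower, p_upper)
print(f"difference = {diff:.3f}%, TOST p-value = {p_tost:.5f}")
```

    The asymmetry the article highlights is visible here: a conventional difference test rewards noisy data (harder to reject "no difference"), whereas the equivalence formulation puts the burden of proof on demonstrating that the laboratories agree.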

  7. On Nonlinear Combustion Instability in Liquid Propellant Rocket Motors

    NASA Technical Reports Server (NTRS)

    Sims, J. D. (Technical Monitor); Flandro, Gary A.; Majdalani, Joseph; Sims, Joseph D.

    2004-01-01

    All liquid propellant rocket instability calculations in current use have limited value in the predictive sense and serve mainly as a correlating framework for the available data sets. The well-known n-τ (interaction index-time lag) model first introduced by Crocco and Cheng in 1956 is still used as the primary analytical tool of this type. A multitude of attempts to establish practical analytical methods have achieved only limited success. These methods usually produce only stability boundary maps that are of little use in making critical design decisions in new motor development programs. Recent progress in understanding the mechanisms of combustion instability in solid propellant rockets provides a firm foundation for a new approach to prediction, diagnosis, and correction of the closely related problems in liquid motor instability. For predictive tools to be useful in the motor design process, they must have the capability to accurately determine: 1) the time evolution of the pressure oscillations and limit amplitude, 2) the critical triggering pulse amplitude, and 3) unsteady heat transfer rates at injector surfaces and chamber walls. The method described in this paper relates these critical motor characteristics directly to system design parameters. Inclusion of mechanisms such as wave steepening, vorticity production and transport, and unsteady detonation wave phenomena greatly enhances the representation of key features of motor chamber oscillatory behavior. The basic theoretical model is described and preliminary computations are compared to experimental data. A plan to develop the new predictive method into a comprehensive analysis tool is also described.

  8. Design Criteria for Low Profile Flange Calculations

    NASA Technical Reports Server (NTRS)

    Leimbach, K. R.

    1973-01-01

    An analytical method and a design procedure to develop flanged separable pipe connectors are discussed. A previously established algorithm is the basis for calculating low profile flanges. The characteristics and advantages of the low profile flange are analyzed. The use of aluminum, titanium, and plastics for flange materials is described. Mathematical models are developed to show the mechanical properties of various flange configurations. A computer program for determining the structural stability of the flanges is described.

  9. An Exploration of Function Analysis and Function Allocation in the Commercial Flight Domain

    DTIC Science & Technology

    1991-11-01

    therefore, imperative that designers apply the most effective analytical methods available to optimize the baseline crew system design, prior to the test and...International Airport (LAX) to John F. Kennedy International Airport (JFK) in New York. This mission was also selected because a detailed task-timeline (TIL...International Airport (LAX) and terminating at New York International Airport (JFK). The weather at LAX is fair with temperature at 60° Fahrenheit

  10. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a data base and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  11. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

  12. Do placebo based validation standards mimic real batch products behaviour? Case studies.

    PubMed

    Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E

    2011-06-01

    Analytical methods validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or a reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability that comes from the manufacturing process. The question that remains at the end of the validation step concerns the transferability of quantitative performance from validation standards to authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels as well as samples coming from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were varying degrees of difference in the accuracy of the results obtained with the two types of samples. Nonetheless, the use of spiked placebo validation standards was shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples to the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. PLS and first derivative of ratio spectra methods for determination of hydrochlorothiazide and propranolol hydrochloride in tablets.

    PubMed

    Vignaduzzo, Silvana E; Maggio, Rubén M; Castellano, Patricia M; Kaufman, Teodoro S

    2006-12-01

    Two new analytical methods have been developed as convenient and useful alternatives for simultaneous determination of hydrochlorothiazide (HCT) and propranolol hydrochloride (PRO) in pharmaceutical formulations. The methods are based on the first derivative of ratio spectra (DRS) and on partial least squares (PLS) analysis of the ultraviolet absorption spectra of the samples in the 250-350-nm region. The methods were calibrated between 8.7 and 16.0 mg L(-1) for HCT and between 14.0 and 51.5 mg L(-1) for PRO. An asymmetric full-factorial design and wavelength selection (277-294 nm for HCT and 297-319 nm for PRO) were used for the PLS method, and signal intensities at 276 and 322 nm were used in the DRS method for HCT and PRO, respectively. Performance characteristics of the analytical methods were evaluated by use of validation samples; both methods were shown to be accurate and precise, furnishing near-quantitative analyte recoveries (100.4 and 99.3% for HCT and PRO by use of PLS) and relative standard deviations below 2%. For PLS the lower limits of quantification were 0.37 and 0.66 mg L(-1) for HCT and PRO, respectively, whereas for DRS they were 1.15 and 3.05 mg L(-1) for HCT and PRO, respectively. The methods were used for quantification of HCT and PRO in synthetic mixtures and in two commercial tablet preparations containing different proportions of the analytes. The results of the drug content assay and the tablet dissolution test were in statistical agreement (p < 0.05) with those furnished by the official procedures of the USP 29. Dissolution profiles of the combined tablet formulations were also prepared with the aid of the proposed methods. The methods are easy to apply, use relatively simple equipment, require minimum sample pre-treatment, enable high sample throughput, and generate less solvent waste than other procedures.
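
    The ratio-spectra derivative step can be illustrated numerically. Below is a toy sketch, not the authors' data: synthetic Gaussian bands stand in for the real HCT and PRO spectra, and all band centers, widths, and concentrations are invented. Dividing the mixture spectrum by the divisor analyte's unit spectrum and differentiating yields a signal that is linear in the other analyte's concentration and independent of the divisor analyte.

```python
import numpy as np

wl = np.arange(250.0, 351.0)  # wavelength axis, nm

def band(center, width, eps):
    # synthetic Gaussian absorption band (stand-in for a measured spectrum)
    return eps * np.exp(-0.5 * ((wl - center) / width) ** 2)

spec_hct = band(272.0, 12.0, 1.0)   # unit-concentration spectrum, analyte 1
spec_pro = band(305.0, 15.0, 0.8)   # unit-concentration spectrum, analyte 2

def drs_signal(c_hct, c_pro):
    """First derivative of the ratio spectrum, using spec_pro as divisor."""
    mixture = c_hct * spec_hct + c_pro * spec_pro
    return np.gradient(mixture / spec_pro, wl)

# The divisor analyte contributes only a constant to the ratio spectrum,
# so its derivative vanishes and the remaining signal is linear in c_hct.
```

    Calibration then reduces to regressing the derivative amplitude at a chosen wavelength against known concentrations, as done at 276 and 322 nm in the abstract.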

  14. Use of fractional factorial design for optimization of digestion procedures followed by multi-element determination of essential and non-essential elements in nuts using ICP-OES technique.

    PubMed

    Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A

    2007-01-15

    Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO(3)/H(2)SO(4) and HNO(3)/H(2)SO(4)/H(2)O(2). The latter is recommended for better analyte recoveries (relative error <11%). Two calibration procedures (aqueous standard and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO(3), H(2)SO(4) and H(2)O(2) volumes, digestion time, pre-digestion time, temperature of the hot plate, and sample weight) were used for optimization of the sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The factors HNO(3) and H(2)O(2) volume and the digestion time were found to be the most important parameters. The instrumental conditions were also optimized (using a peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate, and sample uptake flow rate. The analytical performance, such as limits of detection (LOD < 0.74 μg g(-1)), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%), and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error <11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
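
    The eight-run Plackett-Burman screening design for seven two-level factors can be written down directly from its cyclic generator. A sketch in coded −1/+1 units follows; the construction is the standard N = 8 Plackett-Burman design, and the assignment of the abstract's factors to columns is illustrative, not the authors' actual run order.

```python
import numpy as np

# Standard 8-run Plackett-Burman design for up to 7 two-level factors:
# the first 7 runs are cyclic shifts of the N = 8 generator row, and
# the last run sets every factor to its low level.
generator = np.array([1, 1, 1, -1, 1, -1, -1])
design = np.vstack([np.roll(generator, i) for i in range(7)]
                   + [-np.ones(7, dtype=int)])

# One possible assignment of the seven screened factors to columns
factors = ["HNO3 volume", "H2SO4 volume", "H2O2 volume", "digestion time",
           "pre-digestion time", "hot-plate temperature", "sample weight"]
# The main effect of factor j is estimated as design[:, j] @ y / 4,
# where y holds the eight measured responses.
```

    The columns are balanced (four high, four low) and mutually orthogonal, which is what lets eight runs screen seven factors for main effects.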

  15. TH-C-19A-01: Analytic Design Method to Make a 2D Planar, Segmented Ion Chamber Water-Equivalent for Proton Dose Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, W; Hollebeek, R; Teo, B

    2014-06-15

    Purpose: Quality Assurance (QA) measurements of proton therapy fields must accurately measure steep longitudinal dose gradients as well as characterize the dose distribution laterally. Currently, available devices for two-dimensional field measurements perturb the dose distribution such that routine QA measurements performed at multiple depths require multiple field deliveries and are time consuming. Methods: A design procedure for a two-dimensional detector array is introduced whereby the proton energy loss and scatter are adjusted so that the downstream dose distribution is maintained to be equivalent to that which would occur in uniform water. Starting with the design for an existing, functional two-dimensional segmented ion chamber prototype, a compensating material is introduced downstream of the detector to simultaneously equate the energy loss and lateral scatter in the detector assembly to the values in water. An analytic formalism and procedure is demonstrated to calculate the properties of the compensating material in the general case of multiple layers of arbitrary material. The resulting design is validated with Monte Carlo simulations. Results: With respect to the specific prototype design considered, the results indicate that a graphite compensating layer of the proper dimensions can yield proton beam range perturbation less than 0.1 mm and beam sigma perturbation less than 2% across the energy range of therapeutic proton beams. Conclusion: We have shown that, for a 2D gas-filled detector array, a graphite compensating layer can balance the energy loss and multiple Coulomb scattering relative to uniform water. We have demonstrated an analytic formalism and procedure to determine a compensating material in the general case of multiple layers of arbitrary material. This work was supported by the US Army Medical Research and Materiel Command under Contract Agreement No. DAMD17-W81XWH-04-2-0022. Opinions, interpretations, conclusions and recommendations are those of the author and are not necessarily endorsed by the US Army.

  16. Design optimization of transmitting antennas for weakly coupled magnetic induction communication systems

    PubMed Central

    2017-01-01

    This work focuses on the design of transmitting coils in weakly coupled magnetic induction communication systems. We propose several optimization methods that reduce the active, reactive and apparent power consumption of the coil. These problems are formulated as minimization problems, in which the power consumed by the transmitting coil is minimized, under the constraint of providing a required magnetic field at the receiver location. We develop efficient numeric and analytic methods to solve the resulting problems, which are of high dimension, and in certain cases non-convex. For the objective of minimal reactive power an analytic solution for the optimal current distribution in flat disc transmitting coils is provided. This problem is extended to general three-dimensional coils, for which we develop an expression for the optimal current distribution. Considering the objective of minimal apparent power, a method is developed to reduce the computational complexity of the problem by transforming it to an equivalent problem of lower dimension, allowing a quick and accurate numeric solution. These results are verified experimentally by testing a number of coil geometries. The results obtained allow reduced power consumption and increased performance in magnetic induction communication systems. Specifically, for wideband systems, an optimal design of the transmitter coil reduces the peak instantaneous power provided by the transmitter circuitry, and thus reduces its size, complexity and cost. PMID:28192463
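
    The structure of such constrained power minimizations can be illustrated on a discretized toy problem. This is a sketch under invented assumptions, not the paper's formulation: three resistively uncoupled loops with hypothetical resistances and field-coupling coefficients, minimizing ohmic power under a single linear field constraint via the Lagrange conditions.

```python
import numpy as np

# Hypothetical discretization: three independent loops with resistances R
# and field-coupling coefficients g (tesla per ampere at the receiver).
R = np.diag([1.0, 1.5, 2.0])
g = np.array([3e-6, 2e-6, 1e-6])
B_req = 5e-6  # required field magnitude at the receiver (T)

# Minimize ohmic power I^T R I subject to g^T I = B_req.
# Stationarity of the Lagrangian gives 2 R I = lam * g, hence
# I = R^{-1} g * B_req / (g^T R^{-1} g).
Rinv_g = np.linalg.solve(R, g)
I = Rinv_g * B_req / (g @ Rinv_g)
```

    The optimal distribution weights each loop by its field contribution per unit resistance; any other current vector meeting the constraint dissipates more power.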

  17. Double-multiple streamtube model for studying vertical-axis wind turbines

    NASA Astrophysics Data System (ADS)

    Paraschivoiu, Ion

    1988-08-01

    This work describes the present state-of-the-art in the double-multiple streamtube method for modeling the Darrieus-type vertical-axis wind turbine (VAWT). Comparisons of the analytical results with other predictions and available experimental data show good agreement. This method, which incorporates dynamic-stall and secondary effects, can be used for generating a suitable aerodynamic-load model for structural design analysis of the Darrieus rotor.

  18. MSFC Advanced Concepts Office and the Iterative Launch Vehicle Concept Method

    NASA Technical Reports Server (NTRS)

    Creech, Dennis

    2011-01-01

    This slide presentation reviews the work of the Advanced Concepts Office (ACO) at Marshall Space Flight Center (MSFC), with particular emphasis on the method used to model launch vehicles with INTegrated ROcket Sizing (INTROS), a modeling system that assists in establishing the launch concept design and stage sizing, and facilitates the integration of external analysis efforts, vehicle architecture studies, and technology and system trades and parameter sensitivities.

  19. Fuzzy and neural control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1992-01-01

    Fuzzy logic and neural networks provide new methods for designing control systems. Fuzzy logic controllers do not require a complete analytical model of a dynamic system and can provide knowledge-based heuristic controllers for ill-defined and complex systems. Neural networks can be used for learning control. In this chapter, we discuss hybrid methods using fuzzy logic and neural networks which can start with an approximate control knowledge base and refine it through reinforcement learning.

  20. Immobilized aptamer paper spray ionization source for ion mobility spectrometry.

    PubMed

    Zargar, Tahereh; Khayamian, Taghi; Jafari, Mohammad T

    2017-01-05

    A selective thin-film microextraction based on an aptamer immobilized on cellulose paper was used as a paper spray ionization source for ion mobility spectrometry (PSI-IMS) for the first time. In this method, the paper is used not only as an ionization source but also for the selective extraction of the analyte, based on the immobilized aptamer. This combination integrates both sample preparation and analyte ionization in a Whatman paper. To that end, an appropriate sample introduction system with a novel design was constructed for the paper spray ionization source. Using this system, a continuous solvent flow works simultaneously as elution and spray solvent. In this method, the analyte is adsorbed on a triangular paper with immobilized aptamer and then desorbed and ionized by the elution solvent and the high voltage applied to the paper, respectively. The effects of different experimental parameters such as applied voltage, angle of the paper tip, distance between the paper tip and counter electrode, elution solvent type, and solvent flow rate were optimized. The proposed method was exhaustively validated in terms of sensitivity and reproducibility by analyzing standard solutions of codeine and acetamiprid. The analytical results obtained are promising enough to support the use of immobilized-aptamer paper spray as both the extraction and ionization technique in IMS for direct analysis of biomedicines. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  2. Preliminary Design of a SP-100/Stirling Radiatively Coupled Heat Exchanger

    NASA Technical Reports Server (NTRS)

    Schmitz, Paul; Tower, Leonard; Dawson, Ronald; Blue, Brian; Dunn, Pat

    1995-01-01

    Several methods for coupling the SP-100 space nuclear reactor to the NASA Lewis Research Center's Free Piston Stirling Power Convertor (FPSPC) are presented. A 25 kWe, dual opposed Stirling convertor configuration is used in these designs. The concepts use radiative coupling between the SP-100 lithium loop and the sodium heat pipe of the Stirling convertor to transfer the heat from the reactor to the convertor. Four separate configurations are presented. Masses for the four designs vary from 41 to 176 kg. Each design's structure, heat transfer characteristics, and heat pipe performance are analytically modeled.

  3. Preliminary design of a SP-100/Stirling radiatively coupled heat exchanger

    NASA Astrophysics Data System (ADS)

    Schmitz, Paul; Tower, Leonard; Dawson, Ronald; Blue, Brian; Dunn, Pat

    1995-10-01

    Several methods for coupling the SP-100 space nuclear reactor to the NASA Lewis Research Center's Free Piston Stirling Power Convertor (FPSPC) are presented. A 25 kWe, dual opposed Stirling convertor configuration is used in these designs. The concepts use radiative coupling between the SP-100 lithium loop and the sodium heat pipe of the Stirling convertor to transfer the heat from the reactor to the convertor. Four separate configurations are presented. Masses for the four designs vary from 41 to 176 kg. Each design's structure, heat transfer characteristics, and heat pipe performance are analytically modeled.

  4. Computational aspects of sensitivity calculations in linear transient structural analysis. Ph.D. Thesis - Virginia Polytechnic Inst. and State Univ.

    NASA Technical Reports Server (NTRS)

    Greene, William H.

    1990-01-01

    A study was performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal of the study was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of the number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semi-analytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models. In several cases this fixed-mode approach resulted in very poor approximations of the stress sensitivities. Almost all of the original modes were required for accurate sensitivities, and for small numbers of modes the accuracy was extremely poor. To overcome this poor accuracy, two semi-analytical techniques were developed. The first technique accounts for the change in eigenvectors through approximate eigenvector derivatives. The second technique applies the mode acceleration method of transient analysis to the sensitivity calculations. Both result in accurate values of the stress sensitivities with a small number of modes and much lower computational costs than if the vibration modes were recalculated and then used in an overall finite difference method.
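
    The semi-analytical idea — differentiate the governing equations analytically while approximating only the coefficient-matrix derivatives by finite differences — can be shown on a static toy problem (the thesis treats the transient case). The two-spring system and all values below are invented for illustration.

```python
import numpy as np

# Toy two-spring system (all values hypothetical): K(k1, k2) u = F
def stiffness(k1, k2):
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]], dtype=float)

def semi_analytic_sensitivity(k1, k2, F, h=1e-6):
    """du/dk1 from dK u + K du = 0, with dK/dk1 by central differences."""
    K = stiffness(k1, k2)
    u = np.linalg.solve(K, F)
    dK = (stiffness(k1 + h, k2) - stiffness(k1 - h, k2)) / (2.0 * h)
    # differentiating K u = F (F independent of k1) gives K du = -dK u
    return np.linalg.solve(K, -dK @ u)

F = np.array([0.0, 1.0])                        # unit load at the tip DOF
du = semi_analytic_sensitivity(100.0, 50.0, F)  # both entries equal -F/k1**2
```

    For this series arrangement the exact result is du/dk1 = −F/k1² for both degrees of freedom, so the finite-difference step introduces essentially no error; in large finite element models the same structure avoids repeating the full analysis for each perturbed design.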

  5. Ultrasound-assisted low-density solvent dispersive liquid-liquid microextraction for the determination of 4 designer benzodiazepines in urine samples by gas chromatography-triple quadrupole mass spectrometry.

    PubMed

    Meng, Liang; Zhu, Binling; Zheng, Kefang; Fu, Shanlin

    2017-05-15

    A novel microextraction technique based on ultrasound-assisted low-density solvent dispersive liquid-liquid microextraction (UA-LDS-DLLME) was applied for the determination of 4 designer benzodiazepines (phenazepam, diclazepam, flubromazepam and etizolam) in urine samples by gas chromatography-triple quadrupole mass spectrometry (GC-QQQ-MS). Ethyl acetate (168 μL) was added to the urine samples after adjusting the pH to 11.3. The samples were sonicated in an ultrasonic bath for 5.5 min to form a cloudy suspension. After centrifugation at 10000 rpm for 3 min, the supernatant extractant was withdrawn and injected into the GC-QQQ-MS for analysis. Parameters affecting the extraction efficiency were investigated and optimized by means of single-factor experiments and response surface methodology (Box-Behnken design). Under the optimum extraction conditions, recoveries of 73.8-85.5% were obtained for all analytes. The analytical method was linear for all analytes in the range from 0.003 to 10 μg/mL, with correlation coefficients ranging from 0.9978 to 0.9990. The LODs were estimated to be 1-3 ng/mL. The accuracy (expressed as mean relative error, MRE) was within ±5.8% and the precision (expressed as relative standard deviation, RSD) was less than 5.9%. The UA-LDS-DLLME technique has the advantage of a shorter extraction time and is suitable for simultaneous pretreatment of samples in batches. The combination of UA-LDS-DLLME with GC-QQQ-MS offers an alternative analytical approach for the sensitive detection of these designer benzodiazepines in urine matrix for clinical and medico-legal purposes. Copyright © 2017 Elsevier B.V. All rights reserved.
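
    The Box-Behnken design used in the response-surface step can be generated programmatically. This is a generic sketch in coded units; the abstract does not list which extraction parameters entered the design, so the factor names in the comment are illustrative.

```python
from itertools import combinations
import numpy as np

def box_behnken(k, center_runs=3):
    """Box-Behnken design in coded (-1, 0, +1) units for k factors:
    a 2x2 factorial on each pair of factors with the rest held at 0,
    plus replicated center points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k] * center_runs
    return np.array(runs)

# e.g. three coded factors: sample pH, extraction-solvent volume,
# sonication time (factor names are illustrative)
design = box_behnken(3)
```

    For three factors this gives 12 edge-midpoint runs plus center replicates, enough to fit the quadratic response surface that locates the optimum extraction conditions.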

  6. Simplified, inverse, ejector design tool

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1993-01-01

    A simple lumped parameter based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparison with experimental and analogous one-dimensional methods shows good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.

  7. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, equipment, and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  8. Rapid and sensitive analysis of 27 underivatized free amino acids, dipeptides, and tripeptides in fruits of Siraitia grosvenorii Swingle using HILIC-UHPLC-QTRAP(®)/MS (2) combined with chemometrics methods.

    PubMed

    Zhou, Guisheng; Wang, Mengyue; Li, Yang; Peng, Ying; Li, Xiaobo

    2015-08-01

    In the present study, a new strategy based on chemical analysis and chemometrics methods was proposed for the comprehensive analysis and profiling of underivatized free amino acids (FAAs) and small peptides among various Luo-Han-Guo (LHG) samples. Firstly, the ultrasound-assisted extraction (UAE) parameters were optimized using Plackett-Burman (PB) screening and Box-Behnken designs (BBD), and the following optimal UAE conditions were obtained: ultrasound power of 280 W, extraction time of 43 min, and the solid-liquid ratio of 302 mL/g. Secondly, a rapid and sensitive analytical method was developed for simultaneous quantification of 24 FAAs and 3 active small peptides in LHG at trace levels using hydrophilic interaction ultra-performance liquid chromatography coupled with triple-quadrupole linear ion-trap tandem mass spectrometry (HILIC-UHPLC-QTRAP(®)/MS(2)). The analytical method was validated by matrix effects, linearity, LODs, LOQs, precision, repeatability, stability, and recovery. Thirdly, the proposed optimal UAE conditions and analytical methods were applied to measurement of LHG samples. It was shown that LHG was rich in essential amino acids, which were beneficial nutrient substances for human health. Finally, based on the contents of the 27 analytes, the chemometrics methods of unsupervised principal component analysis (PCA) and supervised counter propagation artificial neural network (CP-ANN) were applied to differentiate and classify the 40 batches of LHG samples from different cultivated forms, regions, and varieties. As a result, these samples were mainly clustered into three clusters, which illustrated the cultivating disparity among the samples. In summary, the presented strategy had potential for the investigation of edible plants and agricultural products containing FAAs and small peptides.
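
    The unsupervised PCA step used to cluster the 40 batches can be sketched with a minimal SVD-based implementation. The data below are synthetic stand-ins for the 40 samples × 27 analytes concentration table (three hypothetical cultivation groups, all values invented), and plain SVD replaces whatever chemometrics software the authors used.

```python
import numpy as np

# Synthetic stand-in for the 40 samples x 27 analytes concentration table
# (three hypothetical cultivation groups; all values invented).
rng = np.random.default_rng(1)
group_means = rng.uniform(1.0, 10.0, size=(3, 27))
X = np.vstack([m + rng.normal(0.0, 0.2, size=(n, 27))
               for m, n in zip(group_means, (14, 13, 13))])

# PCA via SVD of the mean-centered matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # sample coordinates on the PCs
explained = s**2 / (s**2).sum()      # fraction of variance per component
```

    With three well-separated group means the first two components capture nearly all the variance, which is why a two-dimensional score plot is usually enough to reveal clusters like those reported for the LHG batches.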

  9. Models of dyadic social interaction.

    PubMed Central

    Griffin, Dale; Gonzalez, Richard

    2003-01-01

    We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382

  10. High-performance heat pipes for heat recovery applications

    NASA Technical Reports Server (NTRS)

    Saaski, E. W.; Hartl, J. H.

    1980-01-01

    Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high transport capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open tube designs; analytical models were developed for this device and incorporated into a computer program HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.

  11. Maneuver Planning for Conjunction Risk Mitigation with Ground-track Control Requirements

    NASA Technical Reports Server (NTRS)

    McKinley, David

    2008-01-01

    The planning of conjunction Risk Mitigation Maneuvers (RMM) in the presence of ground-track control requirements is analyzed. Past RMM planning efforts on the Aqua, Aura, and Terra spacecraft have demonstrated that only small maneuvers are available when ground-track control requirements are maintained. Assuming small maneuvers, analytical expressions for the effect of a given maneuver on conjunction geometry are derived. The analytical expressions are used to generate a large trade space for initial RMM design. This trade space represents a significant improvement in initial maneuver planning over existing methods that employ high fidelity maneuver models and propagation.

  12. Big data sharing and analysis to advance research in post-traumatic epilepsy.

    PubMed

    Duncan, Dominique; Vespa, Paul; Pitkanen, Asla; Braimah, Adebayo; Lapinlampi, Nina; Toga, Arthur W

    2018-06-01

    We describe the infrastructure and functionality of a centralized preclinical and clinical data repository and analytic platform that supports importing heterogeneous multi-modal data, automatically and manually linking data across modalities and sites, and searching content. We have developed and applied innovative image and electrophysiology processing methods to identify candidate biomarkers from MRI, EEG, and multi-modal data. Based on heterogeneous biomarkers, we present novel analytic tools designed to study epileptogenesis in animal models and humans, with the goal of tracking the probability of developing epilepsy over time. Copyright © 2017. Published by Elsevier Inc.

  13. Development of garlic bioactive compounds analytical methodology based on liquid phase microextraction using response surface design. Implications for dual analysis: Cooked and biological fluids samples.

    PubMed

    Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz

    2017-01-15

    Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several biological properties. Functional food research indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high-performance liquid chromatography with ultraviolet detection (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,2-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function reached its optimal value for 600 μL of chloroform as the extraction solvent with acetonitrile as the dispersant. The method proved to be reliable, precise and accurate, and was successfully applied to determine OSCs in cooked garlic samples as well as in blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.
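
    The desirability-function step used to pick the optimal DLLME conditions can be sketched as follows; the Derringer-style ramp, the 70-110% recovery targets, and the recovery values are illustrative assumptions, not the paper's data:

```python
import math

def d_max(y, lo, hi, s=1.0):
    """Desirability for a response to be maximized: 0 at or below lo,
    1 at or above hi, a power ramp in between (Derringer-Suich form)."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** s

def overall_desirability(ds):
    """Overall desirability = geometric mean of the individual ones;
    any response at zero desirability vetoes the whole setting."""
    if any(d == 0.0 for d in ds):
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))

# Hypothetical recoveries (%) of three target OSCs at one candidate
# extraction-solvent setting:
recoveries = [92.0, 85.0, 101.0]
ds = [d_max(r, 70.0, 110.0) for r in recoveries]
D = overall_desirability(ds)
print(f"overall desirability D = {D:.3f}")
```

    In an optimization, D is evaluated over the design region and the setting with the largest D (here it would be a solvent volume) is chosen.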

  14. Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses

    PubMed Central

    Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.

    2010-01-01

    Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
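
    The photon/electron bookkeeping described above can be sketched as a chain of multiplicative efficiencies from source to detector; every number below (source photons, efficiencies, molar absorptivity) is an illustrative assumption, not a value from the study:

```python
def fluorescence_signal(conc_molar, path_cm, volume_L, params):
    """Sketch of an absolute calibration-curve computation: trace the
    photon/electron budget from source to detector for a dilute
    fluorescent sample."""
    AVOGADRO = 6.022e23
    eps = params["molar_absorptivity"]            # L/(mol*cm)
    # Dilute-limit Beer-Lambert: absorbed fraction ~ 2.303*eps*c*l
    frac_abs = min(1.0, 2.303 * eps * conc_molar * path_cm)
    photons_absorbed = params["source_photons"] * params["excitation_eff"] * frac_abs
    photons_emitted = photons_absorbed * params["quantum_yield"]
    electrons = (photons_emitted
                 * params["collection_eff"]
                 * params["filter_transmission"]
                 * params["detector_qe"])
    molecules = conc_molar * volume_L * AVOGADRO  # number of molecules probed
    return electrons, molecules

# Illustrative system parameters:
params = dict(source_photons=1e12, excitation_eff=0.5,
              molar_absorptivity=8e4, quantum_yield=0.9,
              collection_eff=0.02, filter_transmission=0.8,
              detector_qe=0.6)

# Calibration curve over nano- to micro-molar concentrations in a 1 uL cell:
for c in (1e-9, 1e-8, 1e-7, 1e-6):
    sig, n = fluorescence_signal(c, 0.01, 1e-6, params)
    print(f"{c:.0e} M: {n:.2e} molecules -> {sig:.2e} detected electrons")
```

    In the dilute limit the curve is linear in concentration, which is what makes the absolute signal usable as a calibration curve; comparing detected electrons with the detector noise floor then gives the minimum detection limit.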

  15. Heat Transfer Analysis of Thermal Protection Structures for Hypersonic Vehicles

    NASA Astrophysics Data System (ADS)

    Zhou, Chen; Wang, Zhijin; Hou, Tianjiao

    2017-11-01

    This research aims to develop an analytical approach to study the heat transfer problem of thermal protection systems (TPS) for hypersonic vehicles. Laplace transform and integral method are used to describe the temperature distribution through the TPS subject to aerodynamic heating during flight. Time-dependent incident heat flux is also taken into account. Two different cases with heat flux and radiation boundary conditions are studied and discussed. The results are compared with those obtained by finite element analyses and show a good agreement. Although temperature profiles of such problems can be readily accessed via numerical simulations, analytical solutions give a greater insight into the physical essence of the heat transfer problem. Furthermore, with the analytical approach, rapid thermal analyses and even thermal optimization can be achieved during the preliminary TPS design.
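
    For a constant-heat-flux boundary condition, the Laplace-transform route mentioned above leads to the classical semi-infinite-solid solution, which can be evaluated directly; the flux and material properties below are illustrative stand-ins, not the paper's TPS data:

```python
import math

def temp_rise_const_flux(x, t, q, k, alpha):
    """Classical Laplace-transform solution for a semi-infinite solid
    heated by a constant surface heat flux q (W/m^2): temperature rise
    above the initial temperature at depth x (m) and time t (s).
    k: conductivity W/(m K); alpha: thermal diffusivity m^2/s."""
    if t <= 0:
        return 0.0
    s = math.sqrt(alpha * t)
    term1 = (2.0 * q / k) * s / math.sqrt(math.pi) * math.exp(-x * x / (4.0 * alpha * t))
    term2 = (q * x / k) * math.erfc(x / (2.0 * s))
    return term1 - term2

# Illustrative TPS-like numbers: q = 50 kW/m^2 on a ceramic layer with
# k = 1.5 W/(m K) and alpha = 1e-6 m^2/s.
q, k, alpha = 5.0e4, 1.5, 1.0e-6
for t in (1.0, 10.0, 60.0):
    print(f"t={t:5.0f} s  surface dT = {temp_rise_const_flux(0.0, t, q, k, alpha):8.1f} K")
```

    The surface temperature rise grows as the square root of time, a quick sanity check that such closed forms make available during preliminary TPS sizing without running a finite element model.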

  16. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aimed at designing smart nanocarriers with desired properties; developing novel noninvasive strategies to open the blood-brain barrier (BBB); delivering and releasing single or multiple therapeutic agents across the BBB to eradicate neuro-human immunodeficiency virus (HIV) infection; devising strategies for on-demand, site-specific release of antiretroviral therapy; developing novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs; and developing smart analytical diagnostic tools to detect and monitor HIV infection. Investigation of novel nanoformulations, methodologies for site-specific delivery and release, analytical methods, and diagnostic tools would therefore be of high significance for eradicating and monitoring neuroacquired immunodeficiency syndrome. Overall, these developments will help in creating personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.

  17. Plasma biochemical and PCV ranges for healthy, wild, immature hawksbill (Eretmochelys imbricata) sea turtles.

    PubMed

    Whiting, S D; Guinea, M L; Fomiatti, K; Flint, M; Limpus, C J

    2014-06-14

    In recent years, the use of blood chemistry as a diagnostic tool for sea turtles has been demonstrated, but much of its effectiveness relies on reference intervals. The first comprehensive blood chemistry values for healthy wild hawksbill (Eretmochelys imbricata) sea turtles are presented. Nineteen blood chemistry analytes and packed cell volume were analysed for 40 clinically healthy juvenile hawksbill sea turtles captured from a rocky reef habitat in northern Australia. We used four statistical approaches to calculate reference intervals and to investigate their use with non-normal distributions and small sample sizes, and to compare upper and lower limits between methods. Eleven analytes were correlated with curved carapace length indicating that body size should be considered when designing future studies and interpreting analyte values. British Veterinary Association.
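
    Two of the common approaches for computing reference intervals from a modest sample, a parametric interval and a nonparametric percentile interval, can be sketched as follows; the analyte values are invented for illustration (the study compared four approaches on real hawksbill data):

```python
import statistics

def reference_interval_parametric(values, z=1.96):
    """Parametric 95% reference interval (mean +/- 1.96 SD); appropriate
    only when the analyte is roughly normally distributed."""
    m = statistics.mean(values)
    sd = statistics.stdev(values)
    return m - z * sd, m + z * sd

def reference_interval_percentile(values, lo=0.025, hi=0.975):
    """Nonparametric interval from the 2.5th/97.5th percentiles; makes no
    distributional assumption but is imprecise for small n."""
    xs = sorted(values)
    def pct(p):
        # linear interpolation between order statistics
        idx = p * (len(xs) - 1)
        i = int(idx)
        frac = idx - i
        return xs[i] if i + 1 >= len(xs) else xs[i] * (1 - frac) + xs[i + 1] * frac
    return pct(lo), pct(hi)

# Hypothetical plasma analyte values (mmol/L) for 12 healthy turtles:
vals = [2.1, 2.4, 2.2, 2.8, 2.5, 2.3, 2.6, 2.0, 2.7, 2.4, 2.2, 2.5]
print("parametric :", reference_interval_parametric(vals))
print("percentile :", reference_interval_percentile(vals))
```

    With small samples the two methods can give noticeably different upper and lower limits, which is exactly the comparison the study set out to make.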

  18. Quality by Design in the development of hydrophilic interaction liquid chromatography method with gradient elution for the analysis of olanzapine.

    PubMed

    Tumpa, Anja; Stajić, Ana; Jančić-Stojanović, Biljana; Medenica, Mirjana

    2017-02-05

    This paper deals, for the first time, with the development of a hydrophilic interaction liquid chromatography (HILIC) method with gradient elution in accordance with Analytical Quality by Design (AQbD) methodology. The method is developed for olanzapine and its seven related substances. Following the AQbD methodology step by step, the temperature, starting content of the aqueous phase, and duration of the linear gradient are first identified as critical process parameters (CPPs), and the separation criterion S of critical pairs of substances is investigated as the critical quality attribute (CQA). A Rechtschaffen design is used to create models that describe the dependence of the CQAs on the CPPs. The resulting design space is used to choose the optimal conditions (set point). Finally, the method is fully validated to verify the adequacy of the chosen optimal conditions and applied to real samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Manufacturing error sensitivity analysis and optimal design method of cable-network antenna structures

    NASA Astrophysics Data System (ADS)

    Zong, Yali; Hu, Naigang; Duan, Baoyan; Yang, Guigeng; Cao, Hongjun; Xu, Wanye

    2016-03-01

    Inevitable manufacturing errors and inconsistency between assumed and actual boundary conditions can affect the shape precision and cable tensions of a cable-network antenna, and can even result in failure of the structure in service. In this paper, an analytical method for the sensitivity of the shape precision and cable tensions with respect to parameters carrying uncertainty was studied. Based on the sensitivity analysis, an optimal design procedure was proposed to alleviate the effects of these uncertain parameters. The validity of the calculated sensitivities was examined against those computed by a finite difference method. Comparison with a traditional design method shows that the presented procedure can remarkably reduce the influence of the uncertainties on antenna performance. Moreover, the results suggest that slender front-net cables, thick tension ties, relatively slender boundary cables, and a high tension level in particular improve the ability of cable-network antenna structures to resist the effects of the uncertainties.
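
    The finite-difference check used to validate the analytical sensitivities can be sketched generically: compare each analytical derivative with a central difference. The structural model is replaced here by an arbitrary toy function, so only the verification pattern is the paper's:

```python
def check_sensitivities(f, grad_f, x, h=1e-6, tol=1e-4):
    """Validate analytical sensitivities df/dx_i against central finite
    differences; returns (all_ok, per-component errors)."""
    analytic = grad_f(x)
    errors = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        fd = (f(xp) - f(xm)) / (2.0 * h)  # central difference
        errors.append(abs(fd - analytic[i]))
    return all(e < tol for e in errors), errors

# Toy 'performance' function standing in for shape precision:
def f(x):
    return x[0] ** 2 * x[1] + 3.0 * x[1]

def grad_f(x):
    return [2.0 * x[0] * x[1], x[0] ** 2 + 3.0]

ok, errs = check_sensitivities(f, grad_f, [1.5, -2.0])
print("sensitivities consistent:", ok)
```

    Central differences cost two model evaluations per design variable, which is why an analytical sensitivity formula pays off once the structural model is expensive.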

  20. Spin bearing retainer design optimization

    NASA Technical Reports Server (NTRS)

    Boesiger, Edward A.; Warner, Mark H.

    1991-01-01

    The dynamic behavior of spin bearings for momentum wheels (control-moment gyroscopes, reaction wheel assemblies) is critical to satellite stability and life. Repeated bearing retainer instabilities hasten lubricant deterioration and can lead to premature bearing failure and/or unacceptable vibration. These instabilities are typically marked by increases in torque, temperature, audible noise, and vibration induced into the bearing cartridge. Ball retainer design can be optimized to minimize these occurrences. A retainer was designed using a previously successful smaller retainer as an example. Analytical methods were then employed to predict its behavior and optimize its configuration.

  1. Design and analytical study of a rotor airfoil

    NASA Technical Reports Server (NTRS)

    Dadone, L. U.

    1978-01-01

    An airfoil section for use on helicopter rotor blades was defined and analyzed by means of potential flow/boundary layer interaction and viscous transonic flow methods to meet as closely as possible a set of advanced airfoil design objectives. The design efforts showed that the first priority objectives, including selected low speed pitching moment, maximum lift and drag divergence requirements can be met, though marginally. The maximum lift requirement at M = 0.5 and most of the profile drag objectives cannot be met without some compromise of at least one of the higher order priorities.

  2. Acoustic Treatment Design Scaling Methods. Volume 1; Overview, Results, and Recommendations

    NASA Technical Reports Server (NTRS)

    Kraft, R. E.; Yu, J.

    1999-01-01

    Scale model fan rigs that simulate new generation ultra-high-bypass engines at about 1/5 scale are achieving increased importance as development vehicles for the design of low-noise aircraft engines. Testing at small scale allows the tests to be performed in existing anechoic wind tunnels, which provide an accurate simulation of the important effects of aircraft forward motion on noise generation. The ability to design, build, and test miniaturized acoustic treatment panels on scale model fan rigs representative of the full-scale engine provides not only cost savings but also an opportunity to optimize the treatment by allowing tests of different designs. The primary objective of this study was to develop methods that will allow scale model fan rigs to be successfully used as acoustic treatment design tools. The study focuses on methods to extend the upper limit of the frequency range of impedance prediction models and acoustic impedance measurement methods for subscale treatment liner designs, and to confirm the predictions by correlation with measured data. This phase of the program had as a goal doubling the upper limit of impedance measurement from 6 kHz to 12 kHz. The program utilizes combined analytical and experimental methods to achieve these objectives.

  3. [The validity of radioimmunologic determination of bioavailability of beta-escin in horse chestnut extracts].

    PubMed

    Schrödter, A; Loew, D; Schwankl, W; Rietbrock, N

    1998-09-01

    The bioavailability under steady-state conditions of a standard, slow-release horse chestnut seed extract (HCSE)-containing product was compared with that of an analogous fast-release test preparation (Noricaven novo) in a prospective, randomised, double-blind study with a double cross-over design. The serum concentration of beta-escin (CAS 6805-41-0) was measured by radioimmunoassay. In addition, the biopharmaceutical properties of the HCSEs present in the products were investigated, the amount and composition of the active ingredient, escin, being analysed with a validated HPLC method. The pharmacokinetics of this study were compared with the corresponding data of a similar investigation carried out under analogous conditions with respect to study design, analytical methods and reference preparation. Comparison of the similar studies revealed differences in characteristic pharmacokinetic values of beta-escin, in terms of a shift of the concentration-time curves, as could be demonstrated for the reference product. The total amounts of escin in the two products investigated did not differ significantly. However, quantitative and qualitative differences were detected in the constituents of the two different extract preparations. It is concluded that the high specificity of the validated beta-escin radioimmunoassay leads to analytical imprecision due to the variable constituents of the extract preparations used. It remains to be tested whether this problem can be solved using an analytical approach that is specific for each extract.

  4. Relative tracking control of constellation satellites considering inter-satellite link

    NASA Astrophysics Data System (ADS)

    Fakoor, M.; Amozegary, F.; Bakhtiari, M.; Daneshjou, K.

    2017-11-01

    In this article, two main issues related to the large-scale relative motion of satellites in a constellation, namely the dynamics and control problems, are investigated in order to establish the inter-satellite link (ISL). Regarding dynamics, a detailed and effective analytical solution is first provided for the problem of satellite relative motion considering perturbations, using a direct geometric method in spherical coordinates. Simulations show that the geometric method calculates the relative motion of the satellites with high accuracy, so the proposed analytical solution is applicable and effective. Regarding control, a relative tracking control system between two satellites is designed to establish a communication link, utilizing the analytical solution for the relative motion of the satellites with respect to the reference trajectory. A sliding mode control approach is employed to develop the relative tracking control system for body-to-body and payload-to-payload tracking, and its efficiency is compared with PID and LQR controllers. Two types of payload-to-payload tracking control, with and without a payload degree of freedom, are designed, and the one suitable for practical ISL applications is identified. Also, a fuzzy controller is utilized to eliminate chattering of the control input in the sliding mode controller.
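
    The sliding-mode tracking idea can be illustrated on a double integrator following a sinusoidal reference; the gains, the tanh boundary-layer smoothing, and the disturbance below are illustrative choices standing in for the paper's satellite dynamics:

```python
import math

def simulate_smc(t_end=10.0, dt=0.001):
    """Minimal sliding-mode tracking sketch for a double integrator
    x'' = u + d(t) following the reference x_r(t) = sin(t)."""
    lam, eta, phi = 2.0, 3.0, 0.05   # surface slope, reaching gain, boundary layer
    x, v = 0.5, 0.0                  # start offset from the reference
    t = 0.0
    while t < t_end:
        xr, vr, ar = math.sin(t), math.cos(t), -math.sin(t)
        e, de = x - xr, v - vr
        s = de + lam * e                                   # sliding surface
        u = ar - lam * de - eta * math.tanh(s / phi)       # equivalent + switching term
        d = 0.5 * math.sin(3.0 * t)                        # bounded disturbance
        v += (u + d) * dt                                  # semi-implicit Euler
        x += v * dt
        t += dt
    return abs(x - math.sin(t))

err = simulate_smc()
print(f"tracking error after 10 s: {err:.4f}")
```

    With the reaching gain larger than the disturbance bound, the error is driven into a thin boundary layer around the sliding surface; the tanh term plays the chattering-suppression role that the paper assigns to its fuzzy controller.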

  5. A qualitative/quantitative approach for the detection of 37 tryptamine-derived designer drugs, 5 β-carbolines, ibogaine, and yohimbine in human urine and plasma using standard urine screening and multi-analyte approaches.

    PubMed

    Meyer, Markus R; Caspar, Achim; Brandt, Simon D; Maurer, Hans H

    2014-01-01

    The first synthetic tryptamines entered the designer drug market in the late 1990s and were distributed as psychedelic recreational drugs. In the meantime, several analogs have been brought onto the market, indicating a growing interest in this drug class. So far, only scarce analytical data were available on the detectability of tryptamines in human biosamples. Therefore, the aim of the present study was the development and full validation of a method for their detection in human urine and plasma and their quantification in human plasma. The liquid chromatography-linear ion trap mass spectrometry method presented covered 37 tryptamines as well as five β-carbolines, ibogaine, and yohimbine. Compounds were analyzed after protein precipitation of urine or fast liquid-liquid extraction of plasma using an LXQ linear ion trap coupled to an Accela ultra high-performance liquid chromatography system. Data mining was performed via information-dependent acquisition or targeted product ion scan mode with positive electrospray ionization. The assay was selective for all tested substances, with limits of detection between 10 and 100 ng/mL in urine and between 1 and 100 ng/mL in plasma. A validated quantification in plasma according to international recommendations was demonstrated for 33 of the 44 analytes.

  6. Evaluation of capillary electrophoresis for in-flight ionic contaminant monitoring of SSF potable water

    NASA Technical Reports Server (NTRS)

    Mudgett, Paul D.; Schultz, John R.; Sauer, Richard L.

    1992-01-01

    Until 1989, ion chromatography (IC) was the baseline technology selected for the Specific Ion Analyzer, an in-flight inorganic water quality monitor being designed for Space Station Freedom. Recent developments in capillary electrophoresis (CE) may offer significant savings of consumables, power consumption, and weight/volume allocation, relative to IC technology. A thorough evaluation of CE's analytical capability, however, is necessary before one of the two techniques is chosen. Unfortunately, analytical methods currently available for inorganic CE are unproven for NASA's target list of anions and cations. Thus, CE electrolyte chemistry and methods to measure the target contaminants must be first identified and optimized. This paper reports the status of a study to evaluate CE's capability with regard to inorganic and carboxylate anions, alkali and alkaline earth cations, and transition metal cations. Preliminary results indicate that CE has an impressive selectivity and trace sensitivity, although considerable methods development remains to be performed.

  7. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  8. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  9. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  10. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  11. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  12. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  13. Micromechanical analysis and design of an integrated thermal protection system for future space vehicles

    NASA Astrophysics Data System (ADS)

    Martinez, Oscar

    Thermal protection systems (TPS) are the key features incorporated into a spacecraft's design to protect it from severe aerodynamic heating during high-speed travel through planetary atmospheres. The thermal protection system is the key technology that enables a spacecraft to be lightweight, fully reusable, and easily maintainable. Add-on TPS concepts have been used since the beginning of the space race: the Apollo space capsule used an ablative TPS, and the Space Shuttle Orbiter TPS consisted of ceramic tiles and blankets. Many problems arose from the add-on concept, such as incompatibility, high maintenance costs, inability to bear loads, and lack of robustness and operability. To make the spacecraft's TPS more reliable, robust, and efficient, we investigated the Integrated Thermal Protection System (ITPS) concept, in which the load-bearing structure and the TPS are combined into a single component. The design of an ITPS is a challenging task, because the requirements of a load-bearing structure and a TPS often conflict. Finite element (FE) analysis is often the method of choice for a structural analysis problem; however, as the structure becomes complex, the computational time and effort for an FE analysis increase. New structural analytical tools were therefore developed, or available ones modified, to perform a full structural analysis of the ITPS. With analytical tools, the designer can obtain quick and accurate results and gain a good idea of the response of the structure without having to resort to an FE analysis. A MATLAB® code was developed to analytically determine ITPS performance metrics such as stresses, buckling, deflection, and other failure modes. The analytical models provide fast and accurate results that were within 5% of the FEM results. The optimization procedure typically performs 100 function evaluations for every design variable.
Using the analytical models in the optimization procedure was a time saver: an optimum design was reached in less than an hour, whereas an FE-based optimization study would take hours. Corrugated-core structures were designed for ITPS applications with loads and boundary conditions similar to those of a Space Shuttle-like vehicle. Temperature, buckling, deflection and stress constraints were considered in the design and optimization process, and an optimized design was achieved that satisfied all of them. The ITPS design obtained from the analytical solutions was lighter (4.38 lb/ft2) than the ITPS design obtained from a finite element analysis (4.85 lb/ft2). The ITPS boundary effects added local stresses and compressive loads to the top facesheet that could not be captured by the 2D plate solutions; this inability to fully capture the boundary effects led to a lighter ITPS than the FE solution. Nevertheless, the ITPS can withstand substantially larger mechanical loads than the previous designs. Truss-core structures were found to be unsuitable, as they could not withstand the large thermal gradients frequently encountered in ITPS applications.

  14. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    PubMed

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Development and evaluation of a LOR-based image reconstruction with 3D system response modeling for a PET insert with dual-layer offset crystal design.

    PubMed

    Zhang, Xuezhu; Stortz, Greg; Sossi, Vesna; Thompson, Christopher J; Retière, Fabrice; Kozlowski, Piotr; Thiessen, Jonathan D; Goertzen, Andrew L

    2013-12-07

    In this study we present a method of 3D system response calculation for analytical computer simulation and statistical image reconstruction for a magnetic resonance imaging (MRI) compatible positron emission tomography (PET) insert system that uses a dual-layer offset (DLO) crystal design. The general analytical system response functions (SRFs) for detector geometric response and inter-crystal penetration of coincident crystal pairs are derived first. We implemented a 3D ray-tracing algorithm with 4π sampling for calculating the SRFs of coincident pairs of individual DLO crystals. The determination of which detector blocks are intersected by a gamma ray is made by calculating the intersection of the ray with virtual cylinders whose radii lie just inside the inner surface and just outside the outer edge of each crystal layer of the detector ring. For efficient ray-tracing computation, the detector block and the ray to be traced are then rotated so that the crystals are aligned along the X-axis, facilitating calculation of ray/crystal boundary intersection points. This algorithm can be applied to any system geometry using either a single-layer (SL) or multi-layer array design, with or without offset crystals. For effective data organization, a direct line-of-response (LOR)-based indexed histogram-mode method is also presented in this work. SRF calculation is performed on-the-fly in both the forward- and back-projection procedures during each iteration of image reconstruction, with acceleration through eight-fold geometric symmetry and multi-threaded parallel computation. To validate the proposed methods, we performed a series of analytical and Monte Carlo computer simulations for different system geometries and detector designs. The full-widths-at-half-maximum of the numerical SRFs in both the radial and tangential directions are calculated and compared for the various system designs. 
By inspecting the sinograms obtained for different detector geometries, it can be seen that the DLO crystal design can provide better sampling density than SL or dual-layer no-offset system designs with the same total crystal length. The results of the image reconstruction with SRFs modeling for phantom studies exhibit promising image recovery capability for crystal widths of 1.27-1.43 mm and top/bottom layer lengths of 4/6 mm. In conclusion, we have developed efficient algorithms for system response modeling of our proposed PET insert with DLO crystal arrays. This provides an effective method for both 3D computer simulation and quantitative image reconstruction, and will aid in the optimization of our PET insert system with various crystal designs.
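
    The virtual-cylinder test described above reduces, in 2D, to intersecting a ray with circles at the inner and outer radii of each crystal layer. A sketch with illustrative ring dimensions (not the insert's actual geometry):

```python
import math

def ray_circle_intersections(p, d, radius):
    """Parameters t >= 0 where the ray p + t*d crosses a circle of the
    given radius centred at the origin; solves the quadratic
    |p + t*d|^2 = radius^2."""
    px, py = p
    dx, dy = d
    a = dx * dx + dy * dy
    b = 2.0 * (px * dx + py * dy)
    c = px * px + py * py - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return []                      # ray misses the circle entirely
    sq = math.sqrt(disc)
    ts = [(-b - sq) / (2.0 * a), (-b + sq) / (2.0 * a)]
    return [t for t in sorted(ts) if t >= 0.0]

# A gamma ray leaving the centre of a ring whose crystal layer spans
# radii 30 mm to 40 mm (illustrative dimensions):
inner = ray_circle_intersections((0.0, 0.0), (1.0, 0.0), 30.0)
outer = ray_circle_intersections((0.0, 0.0), (1.0, 0.0), 40.0)
print("path length inside the layer:", outer[0] - inner[0], "mm")
```

    The difference between the entry parameters at the two radii gives the chord length through the layer, the quantity a penetration model needs for attenuation along each candidate LOR.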

  16. Design, ancillary testing, analysis and fabrication data for the advanced composite stabilizer for Boeing 737 aircraft. Volume 1: Technical summary

    NASA Technical Reports Server (NTRS)

    Aniversario, R. B.; Harvey, S. T.; Mccarty, J. E.; Parsons, J. T.; Peterson, D. C.; Pritchett, L. D.; Wilson, D. R.; Wogulis, E. R.

    1983-01-01

    The horizontal stabilizer of the 737 transport was redesigned. Five shipsets were fabricated using composite materials. Weight reduction greater than the 20% goal was achieved. Parts and assemblies were readily produced on production-type tooling. Quality assurance methods were demonstrated. Repair methods were developed and demonstrated. Strength and stiffness analytical methods were substantiated by comparison with test results. Cost data was accumulated in a semiproduction environment. FAA certification was obtained.

  17. QbD-Based Development and Validation of a Stability-Indicating HPLC Method for Estimating Ketoprofen in Bulk Drug and Proniosomal Vesicular System.

    PubMed

    Yadav, Nand K; Raghuvanshi, Ashish; Sharma, Gajanand; Beg, Sarwar; Katare, Om P; Nanda, Sanju

    2016-03-01

    The current studies entail systematic quality by design (QbD)-based development of a simple, precise, cost-effective and stability-indicating high-performance liquid chromatography method for the estimation of ketoprofen. The analytical target profile was defined and critical analytical attributes (CAAs) were selected. Chromatographic separation was accomplished with isocratic, reversed-phase chromatography using a C-18 column, pH 6.8 phosphate buffer-methanol (50:50 v/v) as the mobile phase at a flow rate of 1.0 mL/min, and UV detection at 258 nm. Systematic optimization of the chromatographic method was performed using a central composite design, evaluating theoretical plates and peak tailing as the CAAs. The method was validated per International Conference on Harmonization guidelines, demonstrating high sensitivity and specificity, linearity between 0.05 and 250 µg/mL, a detection limit of 0.025 µg/mL, and a quantification limit of 0.05 µg/mL. Precision was demonstrated with a relative standard deviation of 1.21%. Stress degradation studies performed using acid, base, peroxide, thermal and photolytic methods helped in identifying the degradation products in the proniosome delivery systems. The results successfully demonstrate the utility of QbD for optimizing chromatographic conditions in the development of a highly sensitive liquid chromatographic method for ketoprofen. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
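
    A central composite design of the kind used here consists of factorial corners, axial "star" points, and replicated centre points; generating the coded runs is straightforward. The two-factor, three-centre-point configuration below is an illustrative choice, not necessarily the paper's:

```python
import itertools

def central_composite(k=2, n_center=3):
    """Coded points of a rotatable central composite design: 2^k factorial
    corners, 2k axial ('star') points at +/-alpha with alpha = (2^k)^(1/4),
    and replicated centre points."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
    stars = []
    for i in range(k):
        for s in (-alpha, alpha):
            p = [0.0] * k
            p[i] = s
            stars.append(p)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + stars + centers

pts = central_composite(k=2, n_center=3)
print(f"{len(pts)} runs, star distance alpha = {(2 ** 2) ** 0.25:.3f}")
for p in pts:
    print(p)
```

    The axial points let the design estimate pure quadratic curvature (here in theoretical plates and peak tailing), and the replicated centre points give an estimate of pure error.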

  18. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied epidemiologic research. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Optimal design of damping layers in SMA/GFRP laminated hybrid composites

    NASA Astrophysics Data System (ADS)

    Haghdoust, P.; Cinquemani, S.; Lo Conte, A.; Lecis, N.

    2017-10-01

    This work describes the optimization of the shape profiles of shape memory alloy (SMA) sheets in hybrid layered composite structures, i.e. slender beams or thin plates, designed for the passive attenuation of flexural vibrations. The paper starts with a description of the material and architecture of the investigated hybrid layered composite. An analytical method for evaluating the energy dissipation inside a vibrating cantilever beam is developed. The analytical solution is then followed by a shape-profile optimization of the inserts, using a genetic algorithm to minimize SMA material usage while maintaining a target level of structural damping. The delamination problem at the SMA/glass-fiber-reinforced-polymer interface is discussed. Finally, the proposed methodology is applied to the hybridization of a layered wind-turbine blade structure with SMA material, in order to increase its passive damping.
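
    The constrained minimization described above (least SMA material subject to a damping target) can be sketched with a toy genetic algorithm. The damping model below is a crude additive stand-in, not the paper's energy-dissipation analysis; segment count, weights and target are invented for illustration.

    ```python
    import random

    random.seed(0)

    # Toy GA: choose which of N beam segments carry an SMA insert so that a
    # (hypothetical) damping estimate meets a target while minimizing usage.
    N = 20
    # Assumed curvature weighting of the first bending mode: dissipation is
    # taken to be highest near the clamped root of the cantilever.
    weight = [(N - i) / N for i in range(N)]
    TARGET = 3.0

    def damping(layout):
        # Crude additive damping model over active segments
        return sum(w for w, g in zip(weight, layout) if g)

    def fitness(layout):
        # Minimize usage; penalize layouts that miss the damping target
        usage = sum(layout)
        d = damping(layout)
        return usage + (0.0 if d >= TARGET else 100.0 * (TARGET - d))

    def evolve(pop_size=40, gens=60):
        pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness)
            parents = pop[: pop_size // 2]
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N)        # one-point crossover
                child = a[:cut] + b[cut:]
                child[random.randrange(N)] ^= 1     # point mutation
                children.append(child)
            pop = parents + children
        return min(pop, key=fitness)

    best = evolve()
    print(sum(best), damping(best))
    ```

    In the actual study the fitness evaluation would call the analytical energy-dissipation model for a candidate insert profile rather than this additive surrogate.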

  20. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1989-01-01

    The objectives of the NASA Hot Section Technology (HOST) Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  1. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  2. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  3. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  4. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  5. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  6. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  7. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  8. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...

  9. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  10. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  11. Surface Plasmon Resonance: New Biointerface Designs and High-Throughput Affinity Screening

    NASA Astrophysics Data System (ADS)

    Linman, Matthew J.; Cheng, Quan Jason

    Surface plasmon resonance (SPR) is a surface optical technique that measures minute changes in refractive index at a metal-coated surface. It has become increasingly popular in the study of biological and chemical analytes because of its label-free measurement capability. In addition, SPR allows both quantitative and qualitative assessment of binding interactions in real time, making it ideally suited for probing weak interactions that are often difficult to study with other methods. This chapter presents biosensor developments of roughly the last three years that utilize SPR as the principal analytical technique, along with a concise background of the technique itself. While SPR has demonstrated many advantages, it is a nonselective method, so building reproducible and functional interfaces is vital to sensing applications. This chapter therefore focuses mainly on unique surface chemistries and assay approaches for examining biological interactions with SPR. In addition, SPR imaging for high-throughput screening based on microarrays, and novel hyphenated techniques coupling SPR to other analytical methods, are discussed. The chapter concludes with a commentary on the current state of SPR biosensing technology and the general direction of future biosensor research.

  12. Quality by design (QbD), Process Analytical Technology (PAT), and design of experiment applied to the development of multifunctional sunscreens.

    PubMed

    Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim

    2017-02-01

    Multifunctional formulations are of great importance to ensure better skin protection from harm caused by ultraviolet (UV) radiation. Despite the advantages of the Quality by Design and Process Analytical Technology approaches for the development and optimization of new products, we found in the literature only a few studies concerning their application in the cosmetic product industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of the sunscreens using chemometric analyses. Linear discriminant analysis allowed unknown formulations to be classified, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of the ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to examine the compounds in isolation and in combination and to confirm the antioxidant action of ferulic acid alongside the sunscreen actions, since the presence of this component increased the in vitro antioxidant activity by 90%.
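
    The PLS-regression quantification step mentioned above can be illustrated with scikit-learn. The data below are a synthetic stand-in: three absorbing species with made-up pure spectra, mixed linearly per Beer's law; in the study the predictors would be the measured UV-transmittance spectra.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)

    # Synthetic stand-in: three species (e.g., ethylhexyl triazone,
    # bemotrizinol, ferulic acid) with invented pure-component spectra.
    n_wavelengths, n_samples = 120, 25
    pure = rng.random((3, n_wavelengths))   # pure-component spectra (assumed)
    conc = rng.random((n_samples, 3))       # known calibration concentrations
    spectra = conc @ pure                   # noiseless linear mixtures

    # Fit a 3-component PLS model mapping spectra -> concentrations
    pls = PLSRegression(n_components=3)
    pls.fit(spectra, conc)
    pred = pls.predict(spectra)
    print(np.abs(pred - conc).max())        # near zero on this noiseless toy set
    ```

    With real spectra, the number of latent variables would be chosen by cross-validation rather than fixed at the number of analytes.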

  13. X-ray optics simulation and beamline design for the APS upgrade

    NASA Astrophysics Data System (ADS)

    Shi, Xianbo; Reininger, Ruben; Harder, Ross; Haeffner, Dean

    2017-08-01

    The upgrade of the Advanced Photon Source (APS) to a Multi-Bend Achromat (MBA) will increase the brightness of the APS by between two and three orders of magnitude. The APS upgrade (APS-U) project includes a list of feature beamlines that will take full advantage of the new machine. Many of the existing beamlines will also be upgraded to profit from this significant machine enhancement. Optics simulations are essential in the design and optimization of these new and existing beamlines. In this contribution, the simulation tools used and developed at APS, ranging from analytical to numerical methods, are summarized. Three general optical layouts are compared in terms of their coherence control and focusing capabilities. The concept of zoom optics, where two sets of focusing elements (e.g., CRLs and KB mirrors) are used to provide variable beam sizes at a fixed focal plane, is optimized analytically. The effects of figure errors on the vertical spot size and on the local coherence along the vertical direction of the optimized design are investigated.
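
    The zoom-optics idea, two focusing elements whose separation sets the combined focal length and hence the demagnified spot at a fixed plane, can be sketched in thin-lens terms. The focal lengths below are illustrative values, not APS-U design parameters.

    ```python
    # Thin-lens sketch of zoom optics: the separation d between two focusing
    # elements tunes the effective focal length of the pair.
    def combined_focal_length(f1, f2, d):
        # Gullstrand's equation for two thin lenses in air:
        # 1/f = 1/f1 + 1/f2 - d/(f1*f2)
        return 1.0 / (1.0 / f1 + 1.0 / f2 - d / (f1 * f2))

    f1, f2 = 10.0, 2.0          # metres (assumed, not design values)
    for d in (0.5, 1.0, 2.0):
        print(d, combined_focal_length(f1, f2, d))
    ```

    Increasing the separation lengthens the effective focal length here, which is the handle a zoom layout uses to vary the focused beam size while the focal plane stays fixed.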

  14. Back-support large laser mirror unit: mounting modeling and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Zhang, Zheng; Long, Kai; Liu, Tianye; Li, Jun; Liu, Changchun; Xiong, Zhao; Yuan, Xiaodong

    2018-01-01

    In high-power laser systems, the surface wavefront of large optics is closely linked to the structural design and mounting method. The back-support transport mirror design is being investigated in China's high-power laser system as a means to hold the optical component firmly while minimizing the distortion of its reflecting surface. We have proposed a comprehensive analytical framework, integrating numerical modeling and precise metrology, for evaluating the mirror's mounting performance, treating the surface distortion as a key decision variable. The combination of numerical simulation and field tests demonstrates that this framework provides a detailed and accurate approach to evaluating the performance of the transport mirror. It is also verified that the back-support transport mirror is compatible with state-of-the-art optical quality specifications. This study paves the way for future research to solidify the design of back-support large laser optics in China's next-generation inertial confinement fusion facility.

  15. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and occasionally in treated water as well. In fact, the complete absence of any trace pollutant in treated drinking water is an illusion, as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack the toxicity data needed to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesired because of customer perception. This leads to the question of how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action, and how effective treatment methods need to be designed to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals the target values were set at 0.1 μg/L. The target values for the total sum of genotoxic chemicals, the total sum of steroid hormones, and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary extremes. Copyright © 2013 Elsevier Ltd. All rights reserved.
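
    The Q21 target values quoted in the abstract lend themselves to a simple screening helper. The sketch below only encodes the numbers stated above; assigning a real substance to one of these classes would require toxicological review, not a table lookup.

    ```python
    # Q21 target values (µg/L) as quoted in the abstract.
    TARGETS_UG_PER_L = {
        "genotoxic": 0.01,
        "steroid_endocrine": 0.01,
        "other_organic": 0.1,
    }
    # Targets for the summed concentration per class.
    SUM_TARGETS_UG_PER_L = {
        "genotoxic": 0.01,
        "steroid_endocrine": 0.01,
        "other_organic": 1.0,
    }

    def exceeds_target(substance_class, measured_ug_per_l):
        """True if an individual measurement exceeds its class target."""
        return measured_ug_per_l > TARGETS_UG_PER_L[substance_class]

    print(exceeds_target("other_organic", 0.05))   # False: below 0.1 µg/L
    print(exceeds_target("genotoxic", 0.05))       # True: above 0.01 µg/L
    ```

    A screening workflow would apply `exceeds_target` per detected substance and additionally compare the per-class sums against `SUM_TARGETS_UG_PER_L`.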

  16. Measured effects of coolant injection on the performance of a film cooled turbine

    NASA Technical Reports Server (NTRS)

    Mcdonel, J. D.; Eiswerth, J. E.

    1977-01-01

    Tests have been conducted on a 20-inch diameter single-stage air-cooled turbine designed to evaluate the effects of film cooling air on turbine aerodynamic performance. The present paper reports the results of five test configurations, including two different cooling designs and three combinations of cooled and solid airfoils. A comparison is made of the experimental results with a previously published analytical method of evaluating coolant injection effects on turbine performance.

  17. A metadata-driven approach to data repository design.

    PubMed

    Harvey, Matthew J; McLean, Andrew; Rzepa, Henry S

    2017-01-01

    The design and use of a metadata-driven data repository for research data management is described. Metadata is collected automatically during the submission process whenever possible and is registered with DataCite in accordance with their current metadata schema, in exchange for a persistent digital object identifier. Two examples of data preview are illustrated, including the demonstration of a method for integration with commercial software that confers rich domain-specific data analytics without introducing customisation into the repository itself.
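
    The automatic metadata collection described above can be illustrated by assembling a record covering DataCite's mandatory properties. The field names below loosely follow the DataCite metadata schema; the repository name, helper functions and values are hypothetical, and a real submission would go through DataCite's registration service.

    ```python
    # Mandatory DataCite properties: identifier, creators, titles,
    # publisher, publicationYear, resourceType.
    REQUIRED = {"identifier", "creators", "titles", "publisher",
                "publicationYear", "resourceType"}

    def build_record(title, authors, year, doi):
        """Assemble a minimal DataCite-style metadata record (illustrative)."""
        return {
            "identifier": {"identifier": doi, "identifierType": "DOI"},
            "creators": [{"creatorName": a} for a in authors],
            "titles": [{"title": title}],
            "publisher": "Example Data Repository",   # assumed value
            "publicationYear": str(year),
            "resourceType": {"resourceTypeGeneral": "Dataset"},
        }

    def is_complete(record):
        """Check that all mandatory properties are present."""
        return REQUIRED <= set(record)

    rec = build_record("NMR spectra of compound 1", ["Doe, Jane"], 2017,
                       "10.0000/example.1")
    print(is_complete(rec))  # True
    ```

    In the repository described above, such a record would be registered with DataCite in exchange for a persistent DOI at submission time.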

  18. Experimental demonstration of the control of flexible structures

    NASA Technical Reports Server (NTRS)

    Schaechter, D. B.; Eldred, D. B.

    1984-01-01

    The Large Space Structure Technology Flexible Beam Experiment employs a pinned-free flexible beam to demonstrate required methods such as dynamic and adaptive control, as well as various control-law design approaches and hardware requirements. An attempt is made to define the mechanization difficulties that may be inherent in flexible structures. Attention is given to the analytical work performed in support of the test facility's development, the final design's specifications, the synthesis of the control laws, and the experimental results obtained.

  19. 40 CFR 161.180 - Enforcement analytical method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Enforcement analytical method. 161.180... DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements § 161.180 Enforcement analytical method. An analytical method suitable for enforcement purposes must be...

  20. Guidance from an NIH Workshop on Designing, Implementing, and Reporting Clinical Studies of Soy Interventions

    PubMed Central

    Klein, Marguerite A.; Nahin, Richard L.; Messina, Mark J.; Rader, Jeanne I.; Thompson, Lilian U.; Badger, Thomas M.; Dwyer, Johanna T.; Kim, Young S.; Pontzer, Carol H.; Starke-Reed, Pamela E.; Weaver, Connie M.

    2010-01-01

    The NIH sponsored a scientific workshop, “Soy Protein/Isoflavone Research: Challenges in Designing and Evaluating Intervention Studies,” July 28–29, 2009. The workshop goal was to provide guidance for the next generation of soy protein/isoflavone human research. Session topics included population exposure to soy; the variability of the human response to soy; product composition; methods, tools, and resources available to estimate exposure and protocol adherence; and analytical methods to assess soy in foods and supplements and analytes in biologic fluids and other tissues. The intent of the workshop was to address the quality of soy studies, not the efficacy or safety of soy. Prior NIH workshops and an evidence-based review questioned the quality of data from human soy studies. If clinical studies are pursued, investigators need to ensure that the experimental designs are optimal and the studies properly executed. The workshop participants identified methodological issues that may confound study results and interpretation. Scientifically sound and useful options for dealing with these issues were discussed. The resulting guidance is presented in this document with a brief rationale. The guidance is specific to soy clinical research and does not address nonsoy-related factors that should also be considered in designing and reporting clinical studies. This guidance may be used by investigators, journal editors, study sponsors, and protocol reviewers for a variety of purposes, including designing and implementing trials, reporting results, and interpreting published epidemiological and clinical studies. PMID:20392880
