Sample records for develop analytical methodology

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    A new field of research, visual analytics, has recently been introduced. This has been defined as “the science of analytical reasoning facilitated by visual interfaces.” Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.

  2. A methodology to enhance electromagnetic compatibility in joint military operations

    NASA Astrophysics Data System (ADS)

    Buckellew, William R.

    The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
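
    The progressive-refinement idea described above — culling noninterfering equipment pairs and deferring pairs without operational significance before detailed analysis — can be sketched as a simple screening pipeline. This is an illustrative sketch only; the pair records, frequency bands, and screening criteria below are invented, not taken from the methodology in the paper.

```python
# Sketch of progressive refinement for EMI pair screening (illustrative only;
# the pair data, bands, and criteria here are hypothetical).

def screen_pairs(pairs):
    """Partition candidate transmitter/receiver pairs into three bins:
    culled (no spectral overlap), deferred (no operational significance),
    and retained for detailed interference analysis and testing."""
    culled, deferred, retained = [], [], []
    for p in pairs:
        tx_lo, tx_hi = p["tx_band_mhz"]
        rx_lo, rx_hi = p["rx_band_mhz"]
        overlaps = tx_lo <= rx_hi and rx_lo <= tx_hi
        if not overlaps:
            culled.append(p)          # coarse cull: no frequency overlap
        elif not p["operationally_significant"]:
            deferred.append(p)        # defer: pair never co-employed in practice
        else:
            retained.append(p)        # needs engineering analysis / testing
    return culled, deferred, retained

pairs = [
    {"name": "radar/uhf-radio", "tx_band_mhz": (225, 400),
     "rx_band_mhz": (960, 1215), "operationally_significant": True},
    {"name": "hf-radio/hf-receiver", "tx_band_mhz": (2, 30),
     "rx_band_mhz": (2, 30), "operationally_significant": False},
    {"name": "jammer/datalink", "tx_band_mhz": (960, 1215),
     "rx_band_mhz": (1000, 1100), "operationally_significant": True},
]
culled, deferred, retained = screen_pairs(pairs)
print([p["name"] for p in retained])  # → ['jammer/datalink']
```

Each successive bin costs more to analyze, so the cheap spectral-overlap test runs first and detailed modeling is reserved for the retained pairs.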

  3. Force 2025 and Beyond Strategic Force Design Analytic Model

    DTIC Science & Technology

    2017-01-12

    depiction of the core ideas of our force design model. Figure 1: Description of Force Design Model Figure 2 shows an overview of our methodology ...the F2025B Force Design Analytic Model research conducted by TRAC- MTRY and the Naval Postgraduate School. Our research develops a methodology for...designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach. We

  4. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high-performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  5. Eco-analytical Methodology in Environmental Problems Monitoring

    NASA Astrophysics Data System (ADS)

    Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.

    2017-01-01

    Among the problems common to all mankind, whose solutions will influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists is developing quite spontaneously, as a complement to productive activities. Therefore, the challenge that environmental problems pose for science is the formation of eco-analytical reasoning and the monitoring of global problems common to all humanity. It is thus expected to find the optimal trajectory of industrial development that prevents irreversible damage to the biosphere that could halt the progress of civilization.

  6. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
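
    The pairwise-comparison step at the core of the AHP can be sketched briefly. The snippet below is a generic illustration of the geometric-mean weighting method, not the paper's school-inspection framework; the 3x3 judgment matrix is invented.

```python
import math

# Generic AHP criteria weighting via the geometric-mean (logarithmic least
# squares) method. The reciprocal pairwise matrix A below is hypothetical,
# not taken from the school-inspection model described in the paper.

def ahp_weights(A):
    """Return normalized priority weights from a reciprocal pairwise matrix."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in A]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Saaty-scale judgments: criterion 1 is moderately (3) to strongly (5) more
# important than criteria 2 and 3; lower-triangle entries are reciprocals.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
w = ahp_weights(A)
print([round(x, 3) for x in w])  # weights sum to 1; criterion 1 dominates
```

In a full AHP application a consistency ratio would also be computed to check that the judgments are not self-contradictory before the weights are used.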

  7. Validated analytical methodology for the simultaneous determination of a wide range of pesticides in human blood using GC-MS/MS and LC-ESI/MS/MS and its application in two poisoning cases.

    PubMed

    Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D

    2015-09-01

    Pesticides are frequently responsible for human poisoning, and information on the substance involved is often lacking. The great variety of pesticides that could be responsible for an intoxication necessitates the development of powerful and versatile analytical methodologies that allow identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme helps minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology presented here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  8. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis over the past half-decade, including emerging techniques and analytical trends. Ginseng analysis draws on all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  9. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Low-recovery methods could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices in which the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having established that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate gave recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: the endosulfan isomers (α and β) and its metabolites (endosulfan sulfate, ether and diol) as well as chlorpyrifos. In the first laboratory evaluation of these biobeds, endosulfan was bioconverted up to 87% and chlorpyrifos by more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.
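
    The recovery and RSD figures quoted in validation studies like this one come from a standard calculation over replicate spiked samples, which can be sketched in a few lines. The spike level and replicate values below are invented for illustration, not data from the study.

```python
import statistics

# Recovery (%) and relative standard deviation (%) as commonly computed in
# analytical method validation from replicate spiked-matrix measurements.
# The spike level and replicate results below are invented for illustration.

def recovery_pct(measured, spiked):
    """Mean found amount as a percentage of the spiked amount."""
    return 100.0 * statistics.mean(measured) / spiked

def rsd_pct(measured):
    """Sample standard deviation as a percentage of the mean."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

spike_level = 25.0                         # mg/kg spiked into blank matrix
measured = [22.8, 24.1, 23.5, 22.2, 23.9]  # mg/kg found in 5 replicates

print(f"recovery = {recovery_pct(measured, spike_level):.1f}%")  # → 93.2%
print(f"RSD      = {rsd_pct(measured):.1f}%")                    # → 3.4%
```

Both values would then be checked against the acceptance windows adopted by the method (here, recoveries of 72-109% with RSDs below 11%).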

  10. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    DOT National Transportation Integrated Search

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  11. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques which cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  12. Shuttle payload bay dynamic environments: Summary and conclusion report for STS flights 1-5 and 9

    NASA Technical Reports Server (NTRS)

    Oconnell, M.; Garba, J.; Kern, D.

    1984-01-01

    The vibration, acoustic, and low-frequency loads data from the first five shuttle flights are presented, together with the engineering analysis of those data. Vibroacoustic data from STS-9 are also included because they represent the only data taken on a large payload. Payload dynamic environment predictions developed by various NASA and industry centers are presented, along with a comparison of analytical loads methodology predictions with flight data, including a brief description of the methodologies employed in developing those predictions for payloads. The review of prediction methodologies illustrates how different centers have approached the problem of developing shuttle dynamic environment predictions and criteria. Ongoing research activities related to the shuttle dynamic environments are also described, as is analytical software recently developed for the prediction of payload acoustic and vibration environments.

  13. Analytical and Numerical Results for an Adhesively Bonded Joint Subjected to Pure Bending

    NASA Technical Reports Server (NTRS)

    Smeltzer, Stanley S., III; Lundgren, Eric

    2006-01-01

    A one-dimensional, semi-analytical methodology that was previously developed for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives that exhibit inelastic material behavior is further verified in the present paper. A summary of the first-order differential equations and applied joint loading used to determine the adhesive response from the methodology is also presented. The method was previously verified against a variety of single-lap joint configurations from the literature that subjected the joints to cases of axial tension and pure bending. Using the same joint configuration and applied bending load presented in a study by Yang, the finite element analysis software ABAQUS was used to further verify the semi-analytical method. Linear static ABAQUS results are presented for two models, one with a coarse and one with a fine element mesh, that were used to verify convergence of the finite element analyses. Close agreement between the finite element results and the semi-analytical methodology was found for both the shear and normal stress responses of the adhesive bondline. Thus, the semi-analytical methodology was successfully verified using the ABAQUS finite element software and a single-lap joint configuration subjected to pure bending.

  14. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed; these include "-omics" approaches and target-based approaches. Such methodologies allow the study of the intended and unintended effects that result from the genetic transformation. This information is considered essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  15. PCB congener analysis with Hall electrolytic conductivity detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edstrom, R.D.

    1989-01-01

    This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple-retention-marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260, and 1262, along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies. Homolog distributions acquired by the HECD methodology showed good correlation with the previously mentioned methodologies. The developed analytical methodology was used in the analysis of bluefish (Pomatomus saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the fish samples collected from the lower James River and lower Chesapeake Bay during the fall of 1985.
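
    The homolog-distribution step can be sketched generically: peak areas for each homolog group are converted to amounts with per-group response factors (the HECD responds to chlorine content, so the factors track chlorine number) and then normalized to percent of total PCB. The areas and factors below are invented for illustration, not values from the dissertation.

```python
# Generic homolog-distribution calculation: area-to-amount conversion with
# per-homolog response factors, then normalization to percent of total.
# All peak areas and factors below are hypothetical.

def homolog_distribution(areas, factors):
    """Return percent-of-total distribution across homolog groups."""
    amounts = {h: areas[h] * factors[h] for h in areas}
    total = sum(amounts.values())
    return {h: 100.0 * a / total for h, a in amounts.items()}

areas = {"tri-CB": 1200.0, "tetra-CB": 900.0, "penta-CB": 400.0}   # peak areas
factors = {"tri-CB": 1.00, "tetra-CB": 0.80, "penta-CB": 0.66}     # area→amount
dist = homolog_distribution(areas, factors)
print({h: round(p, 1) for h, p in dist.items()})  # percentages summing to 100
```

Comparing such distributions against those of Aroclor standards is what allows a total-PCB estimate and a rough source attribution for environmental samples.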

  16. Building a Three-Dimensional Nano-Bio Interface for Aptasensing: An Analytical Methodology Based on Steric Hindrance Initiated Signal Amplification Effect.

    PubMed

    Du, Xiaojiao; Jiang, Ding; Hao, Nan; Qian, Jing; Dai, Liming; Zhou, Lei; Hu, Jianping; Wang, Kun

    2016-10-04

    The development of novel detection methodologies in electrochemiluminescence (ECL) aptasensor fields with simplicity and ultrasensitivity is essential for constructing biosensing architectures. Herein, a facile, specific, and sensitive methodology was developed for the quantitative detection of microcystin-LR (MC-LR), based on three-dimensional boron- and nitrogen-codoped graphene hydrogels (BN-GHs) that assist a steric-hindrance amplifying effect between the aptamer and target analytes. The recognition reaction was monitored by quartz crystal microbalance (QCM) to validate the possible steric hindrance effect. First, the BN-GHs were synthesized via a self-assembly hydrothermal method and then applied as the Ru(bpy)3(2+) immobilization platform for further loading of the aptamer biomolecules, owing to their nanoporous structure and large specific surface area. Interestingly, we discovered for the first time that, without the aid of a conventional double-stranded DNA configuration, such three-dimensional nanomaterials can directly amplify the steric hindrance effect between the aptamer and target analytes to a detectable level, and this facile methodology could be used for an exquisite assay. With MC-LR as a model, this novel ECL biosensor showed high sensitivity and a wide linear range. This strategy supplies a simple and versatile platform for specific and sensitive determination of a wide range of aptamer-related targets, implying that three-dimensional nanomaterials could play a crucial role in engineering and developing novel detection methodologies for ECL aptasensing fields.

  17. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
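
    The quantitation principle behind qHNMR with external calibration fits in one relation: signal integral per contributing proton is proportional to molar concentration. A minimal sketch follows; the integrals, proton counts, and calibrant concentration are invented for illustration.

```python
# Core qHNMR relation with an external calibrant of known concentration:
#   c_analyte = (I_a / N_a) / (I_cal / N_cal) * c_cal
# where I is the signal integral and N the number of protons giving rise to
# the signal. All numeric values below are hypothetical.

def qhnmr_conc(I_a, N_a, I_cal, N_cal, c_cal):
    """Analyte concentration from integrals, proton counts, and the
    calibrant concentration (result is in the same units as c_cal)."""
    return (I_a / N_a) / (I_cal / N_cal) * c_cal

# e.g. an analyte 3-proton methyl singlet integrating 2.40 against a
# 1-proton calibrant signal integrating 1.00 at 10.0 mM:
print(qhnmr_conc(I_a=2.40, N_a=3, I_cal=1.00, N_cal=1, c_cal=10.0))  # → 8.0 (mM)
```

Because only integrals and proton counts enter the relation, no identical reference material of the analyte itself is needed, which is one of the advantages the review highlights.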

  18. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final two years of the project, and details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies, which can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could support micro-level assessments (e.g., improving individual license assessments) or macro-level assessments, which focus on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher on uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests developing a Bayes net for micro-level analysis and continuing to focus on Bayes net, system dynamics, and economic input/output models for assessing macro-level problems. Simultaneously, metrics are needed for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  19. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means of assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called the Green Analytical Procedure Index (GAPI), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  1. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    PubMed Central

    Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan

    2013-01-01

    Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329

  2. Transport composite fuselage technology: Impact dynamics and acoustic transmission

    NASA Technical Reports Server (NTRS)

    Jackson, A. C.; Balena, F. J.; Labarge, W. L.; Pei, G.; Pitman, W. A.; Wittlin, G.

    1986-01-01

    A program was performed to develop and demonstrate the impact dynamics and acoustic transmission technology for a composite fuselage which meets the design requirements of a 1990 large transport aircraft without substantial weight and cost penalties. The program developed the analytical methodology for the prediction of the acoustic transmission behavior of advanced composite stiffened shell structures. The methodology predicted that the interior noise level in a composite fuselage due to the turbulent boundary layer will be lower than in a comparable aluminum fuselage. The verification of these analyses will be performed by NASA Langley Research Center using a composite fuselage shell fabricated by filament winding. The program also developed analytical methodology for the prediction of the impact dynamics behavior of lower fuselage structure constructed with composite materials. Development tests were performed to demonstrate that a composite structure designed to the same operating load requirements can have at least the same energy absorption capability as an aluminum structure.

  3. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play: design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is itself an analytical procedure design space, from which a control strategy can be set.

  4. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation on Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their ability to address current regulatory issues. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and in the different instrumental approaches developed to meet this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. Recent research on this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches.
Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Analytical methodology for safety validation of computer controlled subsystems. Volume 1: State-of-the-art and assessment of safety verification/validation methodologies

    DOT National Transportation Integrated Search

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  6. Determination of glycols in air: development of sampling and analytical methodology and application to theatrical smokes.

    PubMed

    Pendergrass, S M

    1999-01-01

    Glycol-based fluids are used in the production of theatrical smokes in theaters, concerts, and other stage productions. The fluids are heated and dispersed in aerosol form to create the effect of a smoke, mist, or fog. There have been reports of adverse health effects such as respiratory irritation, chest tightness, shortness of breath, asthma, and skin rashes. Previous attempts to collect and quantify the aerosolized glycols used in fogging agents have been plagued by inconsistent results, both in the efficiency of collection and in the chromatographic analysis of the glycol components. The development of improved sampling and analytical methodology for aerosolized glycols was required to assess workplace exposures more effectively. An Occupational Safety and Health Administration versatile sampler tube was selected for the collection of ethylene glycol, propylene glycol, 1,3-butylene glycol, diethylene glycol, triethylene glycol, and tetraethylene glycol aerosols. Analytical methodology for the separation, identification, and quantitation of the six glycols using gas chromatography/flame ionization detection is described. Limits of detection of the glycol analytes ranged from 7 to 16 micrograms/sample. Desorption efficiencies for all glycol compounds were determined over the range of study and averaged greater than 90%. Storage stability results were acceptable after 28 days for all analytes except ethylene glycol, which was stable at ambient temperature for 14 days. Based on the results of this study, the new glycol method was published in the NIOSH Manual of Analytical Methods.
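    The desorption-efficiency figure reported above is simply the fraction of a known spiked mass recovered from the sampler after solvent desorption. A minimal sketch of that calculation, using made-up spike levels and recoveries rather than the study's actual data:

```python
# Desorption efficiency (DE): percent of a known spiked mass recovered after
# solvent desorption of the sampler. All values below are illustrative only,
# not NIOSH data.
spiked_ug = {"ethylene glycol": 50.0, "propylene glycol": 50.0, "diethylene glycol": 50.0}
recovered_ug = {"ethylene glycol": 46.5, "propylene glycol": 48.0, "diethylene glycol": 47.1}

def desorption_efficiency(spiked, recovered):
    """Return per-analyte DE (%) and the mean DE across analytes."""
    de = {a: 100.0 * recovered[a] / spiked[a] for a in spiked}
    return de, sum(de.values()) / len(de)

de, mean_de = desorption_efficiency(spiked_ug, recovered_ug)
```

In a real method evaluation, DE is determined at several spike levels across the studied range and each analyte's average is checked against an acceptance criterion (greater than 90% in the study above).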

  7. Passenger rail vehicle safety assessment methodology. Volume II, Detailed analyses and simulation results.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...

  8. Advanced Crash Avoidance Technologies (ACAT) Program - Final Report of the Volvo-Ford-UMTRI Project: Safety Impact Methodology for Lane Departure Warning - Method Development and Estimation of Benefits

    DOT National Transportation Integrated Search

    2010-10-01

    The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...

  9. SociAL Sensor Analytics: Measuring Phenomenology at Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.

    The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline the phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).
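    The time-series side of such an event-detection system often reduces to flagging bursts in post volume against a trailing baseline. A minimal sketch of that idea, with synthetic hourly counts and an illustrative z-score threshold (not the SALSA system's actual models or parameters):

```python
import numpy as np

# Toy hourly post-count series with an injected burst.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=20, size=168).astype(float)  # one week of hourly counts
counts[100:104] += 80                                  # simulated event burst

def detect_bursts(series, window=24, z_thresh=3.0):
    """Flag hours whose count exceeds the trailing-window mean by z_thresh sigmas."""
    flagged = []
    for t in range(window, len(series)):
        hist = series[t - window:t]
        mu, sigma = hist.mean(), hist.std(ddof=1)
        if sigma > 0 and (series[t] - mu) / sigma > z_thresh:
            flagged.append(t)
    return flagged

events = detect_bursts(counts)
```

A production system would pair this volume signal with topic clustering over the flagged interval to characterize what the burst is about.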

  10. COMPARISON OF ANALYTICAL METHODS FOR THE MEASUREMENT OF NON-VIABLE BIOLOGICAL PM

    EPA Science Inventory

    The paper describes a preliminary research effort to develop a methodology for the measurement of non-viable biologically based particulate matter (PM), analyzing for mold, dust mite, and ragweed antigens and endotoxins. Using a comparison of analytical methods, the research obj...

  11. Cognitive-Developmental and Behavior-Analytic Theories: Evolving into Complementarity

    ERIC Educational Resources Information Center

    Overton, Willis F.; Ennis, Michelle D.

    2006-01-01

    Historically, cognitive-developmental and behavior-analytic approaches to the study of human behavior change and development have been presented as incompatible alternative theoretical and methodological perspectives. This presumed incompatibility has been understood as arising from divergent sets of metatheoretical assumptions that take the form…

  12. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach to landing task incorporating appropriate models for UH-1H aircraft, the environmental disturbances and the human pilot was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well known performance-work-load relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  13. Selected analytical challenges in the determination of pharmaceuticals in drinking/marine waters and soil/sediment samples.

    PubMed

    Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr

    2016-03-20

    Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic because one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center-of-mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well-known Gay-Berne ellipsoid-of-revolution liquid crystal model and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and time scales. PMID:19281167

  15. Structural Sizing Methodology for the Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) System

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Dorsey, John T.; Doggett, William R.

    2015-01-01

    The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.

  16. Design, Implementation, and Operational Methodologies for Sub-arcsecond Attitude Determination, Control, and Stabilization of the Super-pressure Balloon-Borne Imaging Telescope (SuperBIT)

    NASA Astrophysics Data System (ADS)

    Javier Romualdez, Luis

    Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered on the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT, coupled with its observational efficiency, image quality, and accessibility, rival those of state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision-pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application.
SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
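    As a toy illustration of the closed-loop pointing control that such a design methodology develops at far greater fidelity, here is a minimal discrete PID loop driving a simulated pointing error toward zero. The gains, unit-inertia plant, and time scales are invented for the sketch and bear no relation to SuperBIT's actual controllers or estimators:

```python
# Minimal discrete PID loop on a unit-inertia plant (torque -> angular accel),
# integrated with semi-implicit Euler. All numbers are illustrative only.
def run_pid(kp, ki, kd, setpoint=0.0, x0=50.0, dt=0.01, steps=5000):
    """Return the final pointing coordinate after closed-loop settling."""
    x, v, integ, prev_err = x0, 0.0, 0.0, setpoint - x0
    for _ in range(steps):
        err = setpoint - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # control torque
        prev_err = err
        v += u * dt    # unit inertia: torque integrates into rate
        x += v * dt    # rate integrates into pointing coordinate
    return x

final_error = abs(run_pid(kp=40.0, ki=2.0, kd=12.0))
```

The real problem adds flexible-body dynamics, sensor noise, state estimation, and actuator limits, which is precisely why the simulation-based design phase described above is needed.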

  17. Analysis of Environmental Contamination resulting from Catastrophic Incidents: Part two: Building Laboratory Capability by Selecting and Developing Analytical Methodologies

    EPA Science Inventory

    Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...

  18. The Literature Review of Analytical Support to Defence Transformation: Lessons Learned from Turkish Air Force Transformation Activities

    DTIC Science & Technology

    2010-04-01

    available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12] (RTO-MP-SAS-081). ... Table 3: DMAIC Methodology (5-Phase Methodology), with phases Define, Measure, Analyze, Improve, and Control, and entries such as Project Charter, Prioritization Matrix, and 5 Whys Analysis. ... Methodology Scope [13]: DMAIC, PDCA. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim

  19. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
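    The standard-addition methodology described above reads the endogenous concentration off the x-intercept of a response-versus-added-concentration fit. A minimal sketch with synthetic numbers (not the study's data):

```python
import numpy as np

# Standard addition: spike known amounts of analyte into aliquots of the
# biological matrix, fit response vs. added concentration, and read the
# endogenous concentration from the magnitude of the x-intercept.
added = np.array([0.0, 5.0, 10.0, 20.0])      # spiked concentration (arbitrary units)
true_endog, true_slope = 8.0, 3.5             # simulated "unknowns"
response = true_slope * (added + true_endog)  # idealized, noise-free detector response

slope, intercept = np.polyfit(added, response, 1)
endogenous = intercept / slope                # x-intercept magnitude = endogenous conc.
```

With real data the fit carries noise, and parallelism between the spiked and unspiked responses must hold for the extrapolation to be valid, which is the check the standard-addition workflow above formalizes.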

  20. ANALYTICAL METHODOLOGY FOR THE DETERMINATION OF KEPONE (TRADEMARK) RESIDUES IN FISH, SHELLFISH, AND HI-VOL AIR FILTERS

    EPA Science Inventory

    The recent discovery of the pollution of the environment with Kepone has resulted in a tremendous interest in the development of residue methodology for the compound. Current multiresidue methods for the determination of the common organochlorinated pesticides do not yield good q...

  1. Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures

    EPA Pesticide Factsheets

    The composition of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations, and, by analogy, fracking fluids, is proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.

  2. Application of Characterization, Modeling, and Analytics Towards Understanding Process Structure Linkages in Metallic 3D Printing (Postprint)

    DTIC Science & Technology

    2017-08-01

    of metallic additive manufacturing processes and show that combining experimental data with modelling and advanced data processing and analytics methods will accelerate that ... geometries, we develop a methodology that couples experimental data and modelling to convert the scan paths into spatially resolved local thermal histories

  3. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before the chromatographic determination of analytes in samples of complex composition. These techniques allow several operations to be integrated, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of sorptive extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis, as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Learning Analytics in Higher Education Development: A Roadmap

    ERIC Educational Resources Information Center

    Adejo, Olugbenga; Connolly, Thomas

    2017-01-01

    The increase in education data and advances in technology are bringing about enhanced teaching and learning methodologies. The emerging field of Learning Analytics (LA) continues to seek ways to improve the different methods of gathering, analysing, managing and presenting learners' data with the sole aim of using it to improve the student learning…

  5. Horizon Missions Methodology - Using new paradigms to overcome conceptual blocks to innovation

    NASA Technical Reports Server (NTRS)

    Anderson, John L.

    1993-01-01

    The Horizon Mission Methodology was developed to provide a systematic analytical approach for evaluating and identifying technological requirements for breakthrough technology options (BTOs) and for assessing their potential to provide revolutionary capabilities for advanced space missions. Here, attention is given to the further use of the methodology as a new tool for a broader range of studies dealing with technology innovation and new technology paradigms.

  6. An overview of key technology thrusts at Bell Helicopter Textron

    NASA Technical Reports Server (NTRS)

    Harse, James H.; Yen, Jing G.; Taylor, Rodney S.

    1988-01-01

    Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion of advanced rotors highlights developments on the composite bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion of methodology development concentrates on analytical developments in aeromechanics, including correlation studies and design applications. The discussion of new configurations presents the results of some advanced configuration studies, including hardware development.

  7. Durability predictions of adhesively bonded composite structures using accelerated characterization methods

    NASA Technical Reports Server (NTRS)

    Brinson, H. F.

    1985-01-01

    The utilization of adhesive bonding for composite structures is briefly assessed. The need for a method to determine damage initiation and propagation for such joints is outlined. Methods currently in use to analyze both adhesive joints and fiber-reinforced plastics are mentioned, and it is indicated that all of them require as input the mechanical properties of the polymeric adhesive and composite matrix material. The mechanical properties of polymers are noted to be viscoelastic and sensitive to environmental effects. A method to analytically characterize environmentally dependent linear and nonlinear viscoelastic properties is given. It is indicated that the methodology can be used to extrapolate short-term data to long-term design lifetimes; that is, the method can be used for long-term durability predictions. Experimental results for neat adhesive resins, polymers used as composite matrices, and unidirectional composite laminates are given. The data are fitted well by the analytical durability methodology. Finally, suggestions are outlined for the development of an analytical methodology for the durability prediction of adhesively bonded composite structures.

  8. Designing Evaluations. 2012 Revision. Applied Research and Methods. GAO-12-208G

    ERIC Educational Resources Information Center

    US Government Accountability Office, 2012

    2012-01-01

    GAO assists congressional decision makers in their deliberations by furnishing them with analytical information on issues and options. Many diverse methodologies are needed to develop sound and timely answers to the questions the Congress asks. To provide GAO evaluators with basic information about the more commonly used methodologies, GAO's…

  9. Coprecipitation-assisted coacervative extraction coupled to high-performance liquid chromatography: An approach for determining organophosphorus pesticides in water samples.

    PubMed

    Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C

    2017-05-01

    An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for the determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables were studied and optimized with respect to the analytical response of each OPP. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube, and 0.5 mL of 0.1 M sodium citrate (pH 4), 0.08 mL of 0.1 M Al2(SO4)3, and 0.7 mL of 0.1 M SDS were added and homogenized. After centrifugation the supernatant was discarded. A 700 μL aliquot of the coacervate-rich phase obtained was dissolved with 300 μL of methanol, and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL, and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied to the analysis of five OPPs in water samples intended for human consumption from different locations in Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
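    LODs such as those reported above are commonly estimated from the calibration slope and the variability of blank measurements (an ICH-style 3.3σ/S estimate; the paper does not state which formula its authors used). A sketch with illustrative numbers:

```python
import numpy as np

# LOD estimate: LOD = 3.3 * s_blank / slope, where s_blank is the standard
# deviation of blank responses and slope comes from the calibration line.
# Calibration points and blank replicates below are illustrative values only.
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])       # standards, ng/mL
area = np.array([12.1, 60.3, 119.8, 300.5, 601.0])  # peak areas
blanks = np.array([0.9, 1.2, 0.7, 1.1, 1.0, 0.8])   # blank responses

slope, _intercept = np.polyfit(conc, area, 1)
lod = 3.3 * blanks.std(ddof=1) / slope              # ng/mL
```

The LOQ is obtained the same way with a factor of 10 instead of 3.3.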

  10. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    ERIC Educational Resources Information Center

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
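    The analytic hierarchy process step in such an integrated model derives criterion weights from a pairwise comparison matrix and checks its consistency. A minimal sketch with an illustrative 3x3 matrix (not the study's data):

```python
import numpy as np

# AHP: priority weights from a pairwise comparison matrix (Saaty 1-9 scale),
# plus the consistency ratio CR = CI / RI. The matrix below is illustrative.
A = np.array([
    [1.0,  3.0,  5.0],   # criterion 1 judged vs. criteria 1, 2, 3
    [1/3., 1.0,  3.0],
    [1/5., 1/3., 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                 # normalized priority vector

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)          # consistency index
cr = ci / 0.58                        # random index RI = 0.58 for n = 3
```

A CR below 0.1 is conventionally taken to mean the judgments are consistent enough to use the weights.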

  11. Understanding Fluorescence Measurements through a Guided-Inquiry and Discovery Experiment in Advanced Analytical Laboratory

    ERIC Educational Resources Information Center

    Wilczek-Vera, Grazyna; Salin, Eric Dunbar

    2011-01-01

    An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for the student in the process of visualizing and discovering concepts.…

  12. THE DETERMINATION OF NON-PESTICIDAL AND PESTICIDAL ORGANOTIN COMPOUNDS IN WATER BY GAS CHROMATOGRAPHY WITH [PULSED] FLAME PHOTOMETRIC DETECTION (GS/PFPD): THE EFFECTS OF "MASS" DISCRIMINATION

    EPA Science Inventory

    Capillary gas chromatography with GC/PFPD was used in the development of analytical methodology for determining both non-pesticidal and pesticidal organotin compounds in drinking water and other aqueous matrices. The method involves aqueous ethylation of organotin analytes with ...

  13. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    PubMed Central

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  14. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  15. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work to develop a unified control/structures modeling and design capability for large space structures modeling are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model and control design are all optimized simultaneously. Parallel research done by other researchers is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.

  16. Using the Photovoice Methodology to Increase Engagement and Sharpen Students' Analytical Skills Regarding Cultures, Lifestyles, and Markets Internationally

    ERIC Educational Resources Information Center

    Kelly, Kathleen; Lee, Seung Hwan; Bowen Ray, Heather; Kandaurova, Maria

    2018-01-01

    Barriers to cross-cultural instruction challenge even experienced educators and their students. To increase cross-cultural competence and bridge learning gaps, professors in two countries adapted the Photovoice methodology to develop shared visual vocabularies with students and unearth hidden assumptions. Results from an anonymous evaluation…

  17. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, are composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools used to acquire these geochemical data. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been solved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Reduction of adverse aerodynamic effects of large trucks, Volume I. Technical report

    DOT National Transportation Integrated Search

    1978-09-01

    The overall objective of this study has been to develop methods of minimizing three aerodynamic-related phenomena: truck-induced aerodynamic disturbances, splash, and spray. An analytical methodology has been developed and used to characterize aerody...

  19. Evaluation of capillary zone electrophoresis for the quality control of complex biologic samples: Application to snake venoms.

    PubMed

    Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine

    2017-08-01

    Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This variability is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis (CZE) for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has rarely been applied to venom QC so far. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species, and different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  1. Prashant Sharan | NREL

    Science.gov Websites

    Prashant.Sharan@nrel.gov | 303-275-3067. Prashant Sharan joined the Thermal Systems Group at NREL; his work includes solar thermal systems. Prashant developed analytical methodologies for optimal integration of…

  2. Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng

    2016-05-01

    Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model, grounded in physical understanding, is developed to quantify the number of pulses (#Pulse) a cell can tolerate before disturbance occurs under various sub-switching voltage stresses. An evaluation methodology combining the analytical model with SPICE simulation is proposed to assess the disturbance behavior of half-selected (HS) cells in cross-point RRAM arrays. Array characteristics such as energy consumption, reliable operating cycles, and total error bits are evaluated with this methodology, and a possible solution to mitigate disturbance is proposed.
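    Models of this kind are often written as a voltage-accelerated pulse budget. The abstract does not give the paper's actual equations, so the sketch below assumes a simple exponential acceleration law; the parameters `N0`, `V0`, and `v_write` are hypothetical, chosen purely for illustration.

```python
import math

# Hypothetical parameters, not taken from the paper: N0 is the tolerable
# pulse count at zero stress, V0 the voltage acceleration constant.
N0 = 1e9
V0 = 0.05

def pulses_to_disturb(v_stress):
    """Illustrative model: tolerable sub-switching pulses (#Pulse) before
    the cell state is disturbed, decreasing exponentially with stress."""
    return N0 * math.exp(-v_stress / V0)

# In a common V/2 bias scheme, half-selected cells see half the write
# voltage, so their pulse budget far exceeds that of a fully selected cell.
v_write = 2.0
hs_budget = pulses_to_disturb(v_write / 2)
```

    Under this assumed law, the half-selected budget `hs_budget` is exponentially larger than the fully selected one, which is why half-select disturbance only accumulates over many operating cycles.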

  3. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation)

    PubMed Central

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-01-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing, and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA), in which all modules involved are physically linked by some kind of track system that moves samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-tandem mass spectrometry (LC-MS/MS). LC-MS/MS has been automated, but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software controls the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology.
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760

  4. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation).

    PubMed

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-08-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing, and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA), in which all modules involved are physically linked by some kind of track system that moves samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-tandem mass spectrometry (LC-MS/MS). LC-MS/MS has been automated, but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software controls the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology.
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists.

  5. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    NASA Technical Reports Server (NTRS)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

    In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on recent high-quality acoustic databases is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development for this method to become useful. Nonetheless, the authors propose that methods based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be pursued. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  6. Probabilistic assessment methodology for continuous-type petroleum accumulations

    USGS Publications Warehouse

    Crovelli, R.A.

    2003-01-01

    The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. ?? 2003 Elsevier B.V. All rights reserved.
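    The abstract notes that ACCESS rests on equations derived from probability theory in spreadsheet form. As a hedged illustration of the analytic (as opposed to Monte Carlo) style of such a calculation, the sketch below aggregates per-cell resource estimates: for independent cells, means add and variances add. The cell values are invented for the example and are not from the FORSPAN/ACCESS assessments.

```python
# Hypothetical (mean, standard deviation) of recoverable resource per
# assessment cell; the numbers are illustrative only.
cells = [(1.2, 0.4), (0.8, 0.3), (2.1, 0.9)]

def aggregate(cells):
    """Analytic aggregation for independent cells: the total mean is the
    sum of means, and the total variance is the sum of variances."""
    mean = sum(m for m, _ in cells)
    var = sum(s * s for _, s in cells)
    return mean, var ** 0.5
```

    The appeal of the analytic approach is exactly this: totals and their uncertainties follow from closed-form identities that a spreadsheet can evaluate directly, with no simulation runs.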

  7. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. POLLUTION PREVENTION AND ENHANCEMENT OF BIODEGRADABILITY VIA ISOMER ELIMINATION IN CONSUMER PRODUCTS

    EPA Science Inventory

    The purpose of this project is to develop novel methodologies for the analysis and detection of chiral environmental contaminants. Conventional analytical techniques do not discriminate between enantiomers. By using newly developed enantioselective methods, the environmental pers...

  9. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation of constantly developing diagnostic techniques before their implementation. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluating diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).

  10. Electrochemical concentration measurements for multianalyte mixtures in simulated electrorefiner salt

    NASA Astrophysics Data System (ADS)

    Rappleye, Devin Spencer

    The development of electroanalytical techniques for multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization, and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures, with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even in the presence of an additional signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)

  11. Journal Benchmarking for Strategic Publication Management and for Improving Journal Positioning in the World Ranking Systems

    ERIC Educational Resources Information Center

    Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.

    2014-01-01

    Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…

  12. The role of chromatographic and chiroptical spectroscopic techniques and methodologies in support of drug discovery for atropisomeric drug inhibitors of Bruton's tyrosine kinase.

    PubMed

    Dai, Jun; Wang, Chunlei; Traeger, Sarah C; Discenza, Lorell; Obermeier, Mary T; Tymiak, Adrienne A; Zhang, Yingru

    2017-03-03

    Atropisomers are stereoisomers resulting from hindered bond rotation. From synthesis of pure atropisomers, characterization of their interconversion thermodynamics to investigation of biological stereoselectivity, the evaluation of drug candidates subject to atropisomerism creates special challenges and can be complicated in both early drug discovery and later drug development. In this paper, we demonstrate an array of analytical techniques and systematic approaches to study the atropisomerism of drug molecules to meet these challenges. Using a case study of Bruton's tyrosine kinase (BTK) inhibitor drug candidates at Bristol-Myers Squibb, we present the analytical strategies and methodologies used during drug discovery including the detection of atropisomers, the determination of their relative composition, the identification of relative chirality, the isolation of individual atropisomers, the evaluation of interconversion kinetics, and the characterization of chiral stability in the solid state and in solution. In vivo and in vitro stereo-stability and stereo-selectivity were investigated as well as the pharmacological significance of any changes in atropisomer ratios. Techniques applied in these studies include analytical and preparative enantioselective supercritical fluid chromatography (SFC), enantioselective high performance liquid chromatography (HPLC), circular dichroism (CD), and mass spectrometry (MS). Our experience illustrates how atropisomerism can be a very complicated issue in drug discovery and why a thorough understanding of this phenomenon is necessary to provide guidance for pharmaceutical development. Analytical techniques and methodologies facilitate key decisions during the discovery of atropisomeric drug candidates by characterizing time-dependent physicochemical properties that can have significant biological implications and relevance to pharmaceutical development plans. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Industrial Demand Module - NEMS Documentation

    EIA Publications

    2014-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Module. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code.

  14. 2017 Workplace and Gender Relations Survey of Reserve Component Members: Statistical Methodology Report

    DTIC Science & Technology

    2018-04-30

    2017 Workplace and Gender Relations Survey of Reserve Component Members: Statistical Methodology Report. Office of People Analytics (OPA), 4800 Mark Center Drive, Suite… The Office of People Analytics' Center for Health and Resilience (OPA[H&R…

  15. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aimed at designing smart nanocarriers with desired properties; novel noninvasive strategies to open the blood-brain barrier (BBB); delivery and release of single or multiple therapeutic agents across the BBB to eradicate neuro-human immunodeficiency virus (HIV); strategies for on-demand, site-specific release of antiretroviral therapy; novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs; and novel smart analytical diagnostic tools to detect and monitor HIV infection. Investigation of novel nanoformulations, methodologies for site-specific delivery and release, analytical methods, and diagnostic tools would thus be of high significance for eradicating and monitoring neuroacquired immunodeficiency syndrome. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.

  16. Green approach using monolithic column for simultaneous determination of coformulated drugs.

    PubMed

    Yehia, Ali M; Mohamed, Heba M

    2016-06-01

    Green chemistry and sustainability are now embraced across the majority of pharmaceutical companies and research labs. Researchers' attention is drawn toward implementing the green analytical chemistry principles for more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of an analytical procedure: by using safer solvents, the greenness profile of a methodology can be improved remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using a monolithic column with green solvents as the mobile phase. The use of a monolithic column allows efficient separation at higher flow rates, which results in short analysis times. A two-factor, three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using the eco-scale as a green metric, and the method was found to be excellently green with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and amount of produced waste. The proposed method improved the environmental impact without compromising the analytical performance criteria and could be used as a safer alternative for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
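    The eco-scale mentioned above scores a method by subtracting penalty points from an ideal 100. A minimal sketch of that arithmetic follows; the penalty values are hypothetical, since the abstract does not report the paper's actual penalty assignments.

```python
# Penalty points are assigned for hazardous reagents/solvents, energy use,
# occupational hazard, and waste; the values below are hypothetical.
def eco_scale_score(penalties):
    """Eco-scale style score: 100 minus total penalty points. Scores above
    75 are conventionally read as an 'excellent green analysis'."""
    return 100 - sum(penalties.values())

penalties = {
    "solvents": 4,
    "energy": 1,
    "occupational_hazard": 0,
    "waste": 3,
}
score = eco_scale_score(penalties)
```

    The metric's appeal is its transparency: each hazard contributes an auditable penalty, so two methods can be compared line by line rather than by an opaque composite.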

  17. Engine environmental effects on composite behavior

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Smith, G. T.

    1980-01-01

    A series of programs was conducted to investigate and develop the application of composite materials to turbojet engines. A significant part of that effort was directed to establishing the impact resistance and defect growth characteristics of composite materials over the wide range of environmental conditions found in commercial turbojet engine operations. Both analytical and empirical efforts were involved. The experimental programs and the analytical methodology development, as well as an evaluation program for the use of composite materials as fan exit guide vanes, are summarized.

  18. Macroeconomic Activity Module - NEMS Documentation

    EIA Publications

    2016-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Macroeconomic Activity Module (MAM) used to develop the Annual Energy Outlook for 2016 (AEO2016). The report catalogues and describes the module assumptions, computations, methodology, parameter estimation techniques, and mainframe source code.

  19. Modeling energy/economy interactions for conservation and renewable energy-policy analysis

    NASA Astrophysics Data System (ADS)

    Groncki, P. J.

    Energy policy and its implications for policy analysis and methodological tools are discussed. The evolution of one methodological approach is reported: the component models of a combined modeling system, their evolution in response to changing analytic needs, and the development of the integrated framework. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history, and implications for current policy analysis and methodological approaches are drawn.

  20. Synthesis of qualitative linguistic research--a pilot review integrating and generalizing findings on doctor-patient interaction.

    PubMed

    Nowak, Peter

    2011-03-01

    There is a broad range of qualitative linguistic research (sequential analysis) on doctor-patient interaction that has had only a marginal impact on clinical research and practice. At least in part this is due to the lack of qualitative research synthesis in the field; available research summaries are not systematic in their methodology. This paper proposes a synthesis methodology for qualitative, sequential analytic research on doctor-patient interaction. The presented methodology is not new but specifies the standard methodology of qualitative research synthesis for sequential analytic research. This pilot review synthesizes twelve studies on German-speaking doctor-patient interactions, identifies 45 verbal actions of doctors, and structures them into a systematics of eight interaction components. Three interaction components ("Listening", "Asking for information", and "Giving information") appear central and cover two thirds of the identified action types. This pilot review demonstrates that sequential analytic research can be synthesized in a consistent and meaningful way, thus providing a more comprehensive and unbiased integration of research. Future synthesis of qualitative research in the area of health communication research is very much needed. Qualitative research synthesis can support the development of quantitative research and of educational materials in medical training and patient training. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  1. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems, to assess flight readiness and identify risk control measures, is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
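    The statistical structure described, analytical failure models evaluated under parameter uncertainty to yield failure probability distributions, is naturally sketched as a Monte Carlo calculation. Everything below (the Basquin-type life model, the distributions, the parameter values) is an invented illustration, not the PFA models themselves.

```python
import random

random.seed(0)

def fatigue_life(stress, a, b):
    """Basquin-type fatigue life model: cycles to failure N = a * stress**(-b)."""
    return a * stress ** (-b)

def failure_probability(mission_cycles, trials=20000):
    """Monte Carlo estimate of P(life < mission life) under uncertainty in
    both the model parameter and the applied load."""
    failures = 0
    for _ in range(trials):
        a = random.lognormvariate(30.0, 0.5)  # uncertain model parameter
        stress = random.gauss(400.0, 20.0)    # uncertain load (MPa)
        if fatigue_life(stress, a, 3.0) < mission_cycles:
            failures += 1
    return failures / trials
```

    In the full PFA framework the distribution produced this way would then be updated with test and flight experience; the sampling step above only illustrates the forward, model-plus-uncertainty half of the method.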

  2. Fuzzy Current-Mode Control and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2000-01-01

    In this paper a current-mode control (CMC) methodology is developed for a buck converter by using a fuzzy logic controller. Conventional CMC methodologies are based on lead-lag compensation with voltage and inductor current feedback. In this paper the converter lead-lag compensation will be substituted with a fuzzy controller. A small-signal model of the fuzzy controller will also be developed in order to examine the stability properties of this buck converter control system. The paper develops an analytical approach, introducing fuzzy control into the area of CMC.
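    The abstract does not reproduce the rule base, so the following is only a generic sketch of the fuzzy-control idea it names: triangular membership functions over the voltage error, singleton outputs for the duty-cycle correction, and weighted-average defuzzification. All membership breakpoints and output values are hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_correction(error):
    """Map a voltage error to a duty-cycle correction via three fuzzy rules
    and weighted-average (centroid of singletons) defuzzification."""
    rules = [
        (tri(error, -1.0, -0.5, 0.0), -0.1),  # negative error -> decrease duty
        (tri(error, -0.5, 0.0, 0.5), 0.0),    # near-zero error -> hold
        (tri(error, 0.0, 0.5, 1.0), 0.1),     # positive error -> increase duty
    ]
    w = sum(m for m, _ in rules)
    return sum(m * out for m, out in rules) / w if w else 0.0
```

    Overlapping memberships make the output vary smoothly with the error, which is what lets a fuzzy controller stand in for the linear lead-lag compensator in a small-signal analysis.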

  3. Higher Education for Sustainable Development in Japan: Policy and Progress

    ERIC Educational Resources Information Center

    Nomura, Ko; Abe, Osamu

    2010-01-01

    Purpose: The purpose of this paper is to review key developments and the role of governmental support in the field of education for sustainable development (ESD) in higher education in Japan. Design/methodology/approach: This is an analytical review paper on policy and practice, using an evaluative perspective to consider developments, challenges…

  4. Absorption into fluorescence. A method to sense biologically relevant gas molecules

    NASA Astrophysics Data System (ADS)

    Strianese, Maria; Varriale, Antonio; Staiano, Maria; Pellecchia, Claudio; D'Auria, Sabato

    2011-01-01

    In this work we present an innovative optical sensing methodology based on the use of biomolecules as molecular gating nano-systems. Here, as an example, we report on the detection of analytes related to climate change; in particular, we focused our attention on the detection of nitric oxide (NO) and oxygen (O2). Our methodology builds on the possibility of modulating the excitation intensity of a fluorescent probe used as a transducer by a sensor molecule, used as a filter, whose absorption is strongly affected by the binding of an analyte of interest. Two simple conditions have to be fulfilled for the method to work: (a) the absorption spectrum of the sensor placed inside the cuvette, acting as the recognition element for the analyte of interest, should change strongly upon binding of the analyte, and (b) the excitation band of the fluorescent dye transducer should overlap with one or more of the absorption bands of the sensor that are affected by analyte binding. The high sensitivity of fluorescence detection combined with the use of proteins as highly selective sensors makes this method a powerful basis for the development of a new generation of analytical assays. Proof-of-principle results are reported showing that cytochrome c peroxidase (CcP) can be successfully used for NO detection and myoglobin (Mb) for O2 detection by exploiting our new methodology. The proposed technology can be easily expanded to the determination of different target analytes.

  5. Methodology, status and plans for development and assessment of Cathare code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status, and plans for the development, assessment, and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF, and FRAMATOME for PWR safety analysis. The status of the code development and assessment, and the general strategy used for both, are presented first. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code, using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), coupling with many other codes (neutronic codes, severe accident codes), and application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the modeling for the 3-D model.

  6. Fleet management performance monitoring.

    DOT National Transportation Integrated Search

    2013-05-01

    The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...

  7. Validation of urban freeway models.

    DOT National Transportation Integrated Search

    2015-01-01

    This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...

  8. International Natural Gas Model 2011, Model Documentation Report

    EIA Publications

    2013-01-01

    This report documents the objectives, analytical approach and development of the International Natural Gas Model (INGM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  9. Determination of tocopherols and sitosterols in seeds and nuts by QuEChERS-liquid chromatography.

    PubMed

    Delgado-Zamarreño, M Milagros; Fernández-Prieto, Cristina; Bustamante-Rangel, Myriam; Pérez-Martín, Lara

    2016-02-01

    In the present work a simple, reliable and affordable sample treatment method for the simultaneous analysis of tocopherols and free phytosterols in nuts was developed. Analyte extraction was carried out using the QuEChERS methodology, and analyte separation and detection were accomplished using HPLC-DAD. The use of this methodology for the extraction of naturally occurring substances provides advantages such as speed, simplicity and ease of use. The parameters evaluated for the validation of the method developed included the linearity of the calibration plots, the detection and quantification limits, repeatability, reproducibility and recovery. The proposed method was successfully applied to the analysis of tocopherols and free phytosterols in samples of almonds, cashew nuts, hazelnuts, peanuts, tiger nuts, sunflower seeds and pistachios. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Assurance of Learning in the MIS Program

    ERIC Educational Resources Information Center

    Harper, Jeffrey S.; Harder, Joseph T.

    2009-01-01

    This article describes the development of a systematic and practical methodology for assessing program effectiveness and monitoring student development in undergraduate decision sciences programs. The model we present is based on a student's progression through learning stages associated with four key competencies: technical, analytical,…

  11. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  12. Analytical Methodology for Predicting the Onset of Widespread Fatigue Damage in Fuselage Structure

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Newman, James C., Jr.; Piascik, Robert S.; Starnes, James H., Jr.

    1996-01-01

    NASA has developed a comprehensive analytical methodology for predicting the onset of widespread fatigue damage in fuselage structure. The determination of the number of flights and operational hours of aircraft service life that are related to the onset of widespread fatigue damage includes analyses for crack initiation, fatigue crack growth, and residual strength. Therefore, the computational capability required to predict analytically the onset of widespread fatigue damage must be able to represent a wide range of crack sizes from the material (microscale) level to the global structural-scale level. NASA studies indicate that the fatigue crack behavior in aircraft structure can be represented conveniently by the following three analysis scales: small three-dimensional cracks at the microscale level, through-the-thickness two-dimensional cracks at the local structural level, and long cracks at the global structural level. The computational requirements for each of these three analysis scales are described in this paper.

  13. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    PubMed

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
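    As a sketch of the control-chart analysis described in this record, an individuals (XmR) chart places natural process limits at 2.66 times the mean moving range on either side of the mean; points beyond a limit signal a change. The monthly length-of-stay values below are invented for illustration and are not data from the OAR study.

```python
def xmr_limits(values):
    """Center line and natural process limits for an individuals (XmR) chart."""
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 = 3 / d2 for moving ranges of subgroup size 2 (d2 = 1.128)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def out_of_control(values):
    """Indices of points beyond the natural process limits (single-point rule)."""
    center, lcl, ucl = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical monthly mean ED lengths of stay (minutes):
monthly_los = [182, 175, 179, 181, 177, 178, 176, 180, 120, 118]
signals = out_of_control(monthly_los)  # → [8, 9]: the sustained drop is flagged
```

    In a real QI analysis the limits would be computed from a baseline period only, and run rules beyond the single-point rule would also be applied; the time-ordered display is what allows a signal to be seen as soon as it appears.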

  15. Educational Design as Conversation: A Conversation Analytical Perspective on Teacher Dialogue

    ERIC Educational Resources Information Center

    van Kruiningen, Jacqueline F.

    2013-01-01

    The aim of this methodological paper is to expound on and demonstrate the value of conversation-analytical research in the area of (informal) teacher learning. The author discusses some methodological issues in current research on interaction in teacher learning and holds a plea for conversation-analytical research on interactional processes in…

  16. INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING

    PubMed Central

    Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong

    2017-01-01

    Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363

  17. Qualitative carbonyl profile in coffee beans through GDME-HPLC-DAD-MS/MS for coffee preliminary characterization.

    PubMed

    Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A

    2018-05-01

    In this work, an analytical methodology for volatile carbonyl compounds characterization in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction for extraction of the samples' volatiles, followed HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile for carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of reported literature, with different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human's health implication, post-harvest coffee processing and roasting. The applied methodology showed to be a powerful analytical tool to be used for coffee characterization as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  19. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  20. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
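    The forward-propagation step of such an assessment, sampling uncertain analysis parameters through a deterministic failure model to obtain a failure-probability estimate, can be sketched with a toy Monte Carlo calculation. The life model and the parameter distributions below are illustrative assumptions, not the engineering models documented in the report.

```python
import random

def cycles_to_failure(crack_growth_coeff: float, stress: float) -> float:
    """Toy deterministic life model: life falls with growth coefficient and stress."""
    return 1.0e6 / (crack_growth_coeff * stress ** 3)

def failure_probability(service_cycles: float, n_samples: int = 20_000,
                        seed: int = 0) -> float:
    """Fraction of sampled parameter sets whose predicted life < service life."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        # Uncertain inputs (assumed distributions, for illustration only):
        coeff = rng.lognormvariate(0.0, 0.3)   # normalized growth coefficient
        stress = rng.gauss(1.0, 0.1)           # normalized stress amplitude
        if cycles_to_failure(coeff, max(stress, 1e-6)) < service_cycles:
            failures += 1
    return failures / n_samples

p_fail = failure_probability(service_cycles=8.0e5)
```

    In the full methodology these prior distributions would then be updated with test and flight experience; only the forward propagation step is shown here.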

  1. Social Impact Studies: An Expository Analysis

    ERIC Educational Resources Information Center

    Shields, Mark A.

    1975-01-01

    Analyzed are some selected studies on the social impact of resources development and construction projects including dams, highways, nuclear power plants and strip mines. The analytical and methodological problem of assessing differential impacts is stressed. (BT)

  2. AMD NOX REDUCTION IMPACTS

    EPA Science Inventory

    This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...

  3. Residential Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Model Documentation - Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code.

  4. Development of an analytical methodology for two-lane highway facility analysis.

    DOT National Transportation Integrated Search

    2012-11-01

    Florida is experiencing rapid growth and development. This applies not only to urban areas, but to rural areas as well. This growth is now resulting in congestion on facilities that previously did not have any. One area that is becoming a concern, pa...

  5. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    PubMed

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review is presented of the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. The most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
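    The desirability function named in this record maps each response onto [0, 1] and combines the individual desirabilities by a geometric mean (Derringer's scheme). The sketch below shows the larger-is-better transform, with targets chosen purely for illustration.

```python
def desirability_max(y: float, low: float, high: float, weight: float = 1.0) -> float:
    """Larger-is-better Derringer transform: 0 below `low`, 1 above `high`."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds):
    """Geometric mean of individual desirabilities."""
    product = 1.0
    for d in ds:
        product *= d
    return product ** (1.0 / len(ds))

# Hypothetical responses for one candidate set of method conditions:
d_resolution = desirability_max(1.8, low=1.0, high=2.0)    # 0.8
d_recovery = desirability_max(92.0, low=80.0, high=100.0)  # 0.6
D = overall_desirability([d_resolution, d_recovery])       # sqrt(0.48) ≈ 0.69
```

    Because the combination is a geometric mean, a zero on any single response drives the overall desirability to zero, which is what makes the function useful for screening out infeasible operating conditions.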

  6. Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.

    PubMed

    Malá, Zdena; Gebauer, Petr; Boček, Petr

    2017-01-01

    This review brings a survey of papers on analytical ITP published since 2014 until the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view on its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies including use of porous media and new on-chip assays, where ITP is often included in a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. An Analytic Framework to Support E.Learning Strategy Development

    ERIC Educational Resources Information Center

    Marshall, Stephen J.

    2012-01-01

    Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e-learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e-learning in higher education institutions from…

  8. STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative

    PubMed Central

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-01-01

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480

  9. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data, and more novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable; a user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications; user-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging, as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and the tasks often last for months. These difficulties are discussed further in the section on user-centered evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics.
    Researchers and practitioners in HCI who are interested in visual analytics will find this information useful, as well as the discussion of changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. An understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics to conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term "users" in almost all situations, there is a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding; just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well, and of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing.
    There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  10. Analytical Chemistry Developmental Work Using a 243Am Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.

    2015-02-24

    This project seeks to reestablish our analytical capability to characterize Am bulk material and to develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for 241Pu-241Am radiochemistry determinations, and developing and testing a methodology which can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.

  11. The influence of capture-recapture methodology on the evolution of the North American Bird Banding Program

    USGS Publications Warehouse

    Tautin, J.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
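    The simplest estimator underlying such mark-recapture banding studies is the Lincoln-Petersen abundance estimate, shown here in Chapman's bias-corrected form; the counts below are invented for illustration.

```python
def chapman_estimate(n1: int, n2: int, m2: int) -> float:
    """Chapman's bias-corrected Lincoln-Petersen estimator of population size.

    n1: animals marked (banded) in the first session
    n2: animals captured in the second session
    m2: marked animals among the second-session captures
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical banding study: 200 birds banded, 150 caught later, 30 of them banded.
n_hat = chapman_estimate(200, 150, 30)  # ≈ 978
```

    Modern capture-recapture models generalize this closed-population idea to open populations with survival and recapture probabilities estimated jointly, which is where the methodological advances described above have had their impact.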

  12. Application of ion chromatography in pharmaceutical and drug analysis.

    PubMed

    Jenke, Dennis

    2011-08-01

    Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determinations of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.

  13. Analytical control test plan and microbiological methods for the water recovery test

    NASA Technical Reports Server (NTRS)

    Traweek, M. S. (Editor); Tatara, J. D. (Editor)

    1994-01-01

    Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that handling of laboratory samples and analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period, to identify responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and to ensure that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.

  14. World Energy Projection System Plus Model Documentation: Coal Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  15. World Energy Projection System Plus Model Documentation: Transportation Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  16. World Energy Projection System Plus Model Documentation: Residential Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  17. World Energy Projection System Plus Model Documentation: Refinery Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  18. World Energy Projection System Plus Model Documentation: Main Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  19. Transportation Sector Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model.

  20. World Energy Projection System Plus Model Documentation: Electricity Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  1. Assessing Leadership Knowledge in a Principalship Preparation Programme

    ERIC Educational Resources Information Center

    Seong, David Ng Foo

    2013-01-01

    Purpose: The purpose of this paper is to assess leadership learning in a principalship development programme. Design/methodology/approach: This case study adopted Popper's three worlds as an analytical framework to assess leadership learning in a principalship development programme. The unit of assessment of learning is knowledge--more…

  2. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. The analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. The simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that potentially degrade system reliability, and the ways in which factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments, are presented. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  3. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision-making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
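    The Bayes sequential formulation described above has a classical, computable relative in Wald's sequential probability ratio test (SPRT). The sketch below is an illustrative suboptimal rule of that kind, not the thesis's algorithm: residuals are assumed Gaussian with a known failure mean shift `mu`, and the accumulated log-likelihood ratio is compared against thresholds derived from target error rates `alpha` and `beta`.

```python
import math

# Illustrative SPRT sketch (assumed model, not the thesis's method).
# H0: residual ~ N(0, sigma^2) (no failure); H1: residual ~ N(mu, sigma^2).
def sprt(residuals, mu=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # cross above: declare failure
    lower = math.log(beta / (1 - alpha))   # cross below: declare no failure
    llr = 0.0
    for k, r in enumerate(residuals, 1):
        # log N(r; mu, sigma) - log N(r; 0, sigma) simplifies to:
        llr += (mu * r - mu**2 / 2) / sigma**2
        if llr >= upper:
            return "failure", k
        if llr <= lower:
            return "no failure", k
    return "undecided", len(residuals)
```

    With a strongly positive residual sequence the test declares a failure after only a few samples; a strongly negative sequence is cleared even faster, which is the appeal of sequential over fixed-sample decision rules.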

  4. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects based on their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable for comparison of analytical methods, or simply as a tool for optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently, (2) a "distance" to the Reference laboratory and (3) a classification due to the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
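    The core of such an ordinal analysis is the product-order comparison of indicator profiles: one laboratory is at least as good as another only if it is at least as good on every indicator simultaneously; otherwise the pair is incomparable. A minimal sketch, with hypothetical data (not the paper's), using profiles of (|bias from true value|, standard deviation, |skewness|) where lower is better on each indicator:

```python
# Hypothetical laboratory profiles: (|bias|, std dev, |skewness|), lower is better.
labs = {
    "Ref":  (0.0, 0.1, 0.0),
    "LabA": (0.2, 0.2, 0.1),
    "LabB": (0.1, 0.4, 0.3),
    "LabC": (0.3, 0.5, 0.4),
}

def dominates(p, q):
    """p <= q in the product order: p is at least as good on every indicator."""
    return all(a <= b for a, b in zip(p, q))

def order_relations(labs):
    """All comparable ordered pairs (better, worse); incomparable pairs are omitted."""
    return [(x, y)
            for x, px in labs.items()
            for y, py in labs.items()
            if x != y and dominates(px, py)]

rel = order_relations(labs)
```

    Here LabA and LabB are incomparable (LabB has the smaller bias, LabA the smaller spread), which is exactly the information a single aggregated score would hide; the count of laboratories between an object and the reference gives an ordinal "distance".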

  5. Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System

    NASA Astrophysics Data System (ADS)

    Lee, Chang Jae; Yun, Jae Hee

    2017-06-01

    Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.
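    The integrated check at the heart of such a methodology can be reduced to a simple budget: the measured or analyzed delays of each stage of the instrumentation channel must sum to no more than the ART allocated by the safety analysis. A minimal sketch with entirely hypothetical delay figures (the real evaluation spans safety analyses, system designs, response time analyses, and tests):

```python
# Hypothetical response time budget check; all numbers are illustrative.
art_ms = 900.0  # analytical response time (ART) allocated by safety analysis

# Assumed per-stage delays (ms), e.g. worst-case values from analysis or test.
stages = {
    "sensor/transmitter": 250.0,
    "signal processing":  180.0,
    "trip logic":          90.0,
    "actuation":          300.0,
}

total_ms = sum(stages.values())   # total channel response time
margin_ms = art_ms - total_ms     # remaining margin against the ART
meets_art = total_ms <= art_ms    # the integrated acceptance criterion
```

    The value of treating this as one integrated evaluation, rather than verifying stages in isolation, is that margin consumed by one stage is visibly traded against the margin available to the others.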

  6. Study and characterization of a MEMS micromirror device

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2004-08-01

    In this paper, advances in our study and characterization of a MEMS micromirror device are presented. The micromirror device, of 510 μm characteristic length, operates in a dynamic mode with a maximum displacement on the order of 10 μm along its principal optical axis and oscillation frequencies of up to 1.3 kHz. Developments are carried out by analytical, computational, and experimental methods. Analytical and computational nonlinear geometrical models are developed in order to determine the optimal loading-displacement operational characteristics of the micromirror. Due to the operational mode of the micromirror, the experimental characterization of its loading-displacement transfer function requires utilization of advanced optical metrology methods. Optoelectronic holography (OEH) methodologies based on multiple wavelengths that we are developing to perform such characterization are described. It is shown that the combined analytical, computational, and experimental approach is effective in our developments.

  7. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  8. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  9. Analytical Utility of Campylobacter Methodologies

    USDA-ARS?s Scientific Manuscript database

    The National Advisory Committee on Microbiological Criteria for Foods (NACMCF, or the Committee) was asked to address the analytical utility of Campylobacter methodologies in preparation for an upcoming United States Food Safety and Inspection Service (FSIS) baseline study to enumerate Campylobacter...

  10. THE EVOLUTION OF ATOMIC SPECTROSCOPY IN MEASURING TOXIC CONTAMINANTS

    EPA Science Inventory

    Three decades of study of environmental conditions necessary for the protection of freshwater aquatic life have been limited by the development and application of analytical methodology utilizing atomic absorption, atomic fluorescence, and atomic emission spectroscopy. The...

  11. World Energy Projection System Plus Model Documentation: Greenhouse Gases Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  12. World Energy Projection System Plus Model Documentation: Natural Gas Module

    EIA Publications

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  13. World Energy Projection System Plus Model Documentation: District Heat Module

    EIA Publications

    2017-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  14. World Energy Projection System Plus Model Documentation: Industrial Module

    EIA Publications

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  15. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges, including a class of complex molecules with the potential for multiple isomers and variable charge states, and the absence of established standards for acceptable analytical characterization of materials used in drug discovery. In addition, because there is no intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  16. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    PubMed

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have allowed detection limits within the pM and nM ranges. Most of these developments proved their suitability in detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative incidence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. The Shock and Vibration Bulletin. Part 1. Welcome, Keynote Address, Invited Papers, Pyrotechnic Shock, and Shock Testing and Analysis

    DTIC Science & Technology

    1983-05-01

    DESIGN PROCEDURE, M. S. Hundal, University of Vermont, Burlington, VT. Machinery Dynamics: ANALYTICAL AND EXPERIMENTAL INVESTIGATION OF ROTATING BLADE... methodology to accurately predict rotor vibratory loads... has recently been initiated for detail design and bench test... coupled rotor/airframe vibrations... design methodology, concentrating on the basic disciplines of aerodynamics and structural dynamics; a coupled rotor/airframe vibration analysis has been developed.

  18. Features and characterization needs of rubber composite structures

    NASA Technical Reports Server (NTRS)

    Tabaddor, Farhad

    1989-01-01

    Some of the major unique features of rubber composite structures are outlined. The features covered are those related to the material properties, but the analytical features are also briefly discussed. It is essential to recognize these features at the planning stage of any long-range analytical, experimental, or application program. The development of a general and comprehensive program which fully accounts for all the important characteristics of tires, under all the relevant modes of operation, may be a prohibitively expensive and impractical task in the near future. There is therefore a need to develop application methodologies that can utilize less general models, beyond their theoretical limitations and yet with reasonable reliability, through a proper mix of analytical, experimental, and testing activities.

  19. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
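    For context, the second stage of such a pipeline builds on the canonical global-best PSO update: each particle's velocity is pulled toward its personal best and the swarm's global best. The sketch below is plain global-best PSO on a test function, not the paper's consensus-based variant or its Trust-Tech stages:

```python
import random

# Minimal canonical global-best PSO (illustrative; the paper's consensus-based
# PSO and Trust-Tech refinement stages are considerably more elaborate).
def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]           # each particle's best position
    pval = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]    # swarm's best position so far
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = pos[i][:], v
                if v < gval:
                    gbest, gval = pos[i][:], v
    return gbest, gval

sphere = lambda x: sum(t * t for t in x)
best, val = pso(sphere, dim=5)
```

    In the three-stage scheme, swarm-found solutions seed local optimization, and Trust-Tech methods then move between the basins of attraction of distinct local optima rather than relying on the swarm alone.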

  20. e-Research and Learning Theory: What Do Sequence and Process Mining Methods Contribute?

    ERIC Educational Resources Information Center

    Reimann, Peter; Markauskaite, Lina; Bannert, Maria

    2014-01-01

    This paper discusses the fundamental question of how data-intensive e-research methods could contribute to the development of learning theories. Using methodological developments in research on self-regulated learning as an example, it argues that current applications of data-driven analytical techniques, such as educational data mining and its…

  1. STRengthening analytical thinking for observational studies: the STRATOS initiative.

    PubMed

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-12-30

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  2. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work sponsored by JPL and other organizations to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model and control design are all optimized simultaneously. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization. Recommendations are also presented for near-term research activities at JPL. The key recommendation is to continue the development of integrated dynamic modeling/control design techniques, with special attention given to the development of structural models specially tailored to support design.

  3. Measurement-based reliability prediction methodology. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Linn, Linda Shen

    1991-01-01

    In the past, analytical and measurement based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. The issue is addressed here. A combined statistical/analytical approach to use measurements from one environment to model the system failure behavior in a new environment is proposed. A comparison of the predicted results with the actual data from the new environment shows a close correspondence.

  4. Multiresidue analytical method for pharmaceuticals and personal care products in sewage and sewage sludge by online direct immersion SPME on-fiber derivatization - GCMS.

    PubMed

    López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl

    2018-08-15

    The work presented here aimed to develop an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC-MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng/L in sewage and 10 ng/g in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng/L and 900 ng/g, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulation requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot-scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. A solid phase extraction-ion chromatography with conductivity detection procedure for determining cationic surfactants in surface water samples.

    PubMed

    Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek

    2013-11-15

    A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves solid phase extraction (SPE, for sample preparation) coupled with ion chromatography with conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL, were achieved for surface water and CRM samples. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such waters to be confirmed. In addition, it is simpler, less time-consuming and labour-intensive, avoids the use of toxic chloroform, and is significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry). Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Putting the methodological brakes on claims to measure national happiness through Twitter: Methodological limitations in social media analytics.

    PubMed

    Jensen, Eric Allen

    2017-01-01

    With the rapid global proliferation of social media, there has been growing interest in using this existing source of easily accessible 'big data' to develop social science knowledge. However, amidst the big data gold rush, it is important that long-established principles of good social research are not ignored. This article critically evaluates Mitchell et al.'s (2013) study, 'The Geography of Happiness: Connecting Twitter Sentiment and Expression, Demographics, and Objective Characteristics of Place', demonstrating the importance of attending to key methodological issues associated with secondary data analysis.

  7. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented as data analytics applications in themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  8. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
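    The probability computation described above has a textbook special case that makes the idea concrete: when material strength and applied load are both modeled as independent normal variables, the reliability has a closed form that a sampling estimate can cross-check. A minimal sketch with hypothetical numbers (not the report's data; the report uses numerical integration over fitted distributions):

```python
import math
import random

# Hypothetical strength/load statistics (illustrative only).
mu_s, sd_s = 100.0, 10.0   # material strength: mean, std dev
mu_l, sd_l = 70.0, 12.0    # applied load: mean, std dev

# Safety margin M = S - L is normal, so reliability = P(M > 0) in closed form.
beta = (mu_s - mu_l) / math.hypot(sd_s, sd_l)        # reliability index
reliability = 0.5 * (1 + math.erf(beta / math.sqrt(2)))

# Monte Carlo cross-check of the same probability.
rng = random.Random(1)
n = 100_000
hits = sum(rng.gauss(mu_s, sd_s) > rng.gauss(mu_l, sd_l) for _ in range(n))
mc_estimate = hits / n
```

    For non-normal scatter (e.g. Weibull-distributed composite strength) the closed form disappears, which is why numerical integration over the fitted distributions is the general tool.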

  9. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

    Due to the increased threat of chemical and biological agents being used by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in prevention, early detection and an efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs of first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using United States Environmental Protection Agency (US-EPA) methodologies.

  10. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent quality-control regulation because product quality is critical to both the production process and consumer safety. Under the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near-infrared spectroscopy (NIRS) combined with chemometrics has lately proven efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology is proposed for selecting the calibration set (the "process spectrum"), into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space", defined by Hotelling's T² and Q-residual statistics, for outlier identification (inside/outside the defined space) in order to select objectively the factors used in constructing the calibration set. The results confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for implementing this easy and fast methodology in the pharmaceutical industry. Copyright © 2015 Elsevier B.V. All rights reserved.
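    The Hotelling's T² and Q-residual statistics that define such a "model space" can be illustrated with a small PCA sketch in NumPy; the synthetic data, the choice of three components, and the planted outlier below are assumptions made for the example, not the authors' actual calibration data or thresholds:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))   # synthetic "spectra": 60 samples x 8 wavelengths
X[0] += 6.0                    # plant an obvious outlier in sample 0

Xc = X - X.mean(axis=0)                      # mean-center
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                        # retained principal components
T = Xc @ Vt[:k].T                            # scores on the k components
lam = (s[:k] ** 2) / (X.shape[0] - 1)        # variance captured by each PC

t2 = np.sum(T**2 / lam, axis=1)              # Hotelling's T^2 per sample
E = Xc - T @ Vt[:k]                          # residual (unmodeled) part
q = np.sum(E**2, axis=1)                     # Q residuals (squared prediction error)
```

    A sample far outside the model space shows up with a large T² (extreme within the model plane) or a large Q (poorly described by the model); here the planted outlier dominates the T² statistic.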

  11. Towards Reflective Writing Analytics: Rationale, Methodology and Preliminary Results

    ERIC Educational Resources Information Center

    Shum, Simon Buckingham; Sándor, Ágnes; Goldsmith, Rosalie; Bass, Randall; McWilliams, Mindy

    2017-01-01

    When used effectively, reflective writing tasks can deepen learners' understanding of key concepts, help them critically appraise their developing professional identity, and build qualities for lifelong learning. As such, reflective writing is attracting substantial interest from universities concerned with experiential learning, reflective…

  12. Engineering data characterizing the fleet of U.S. railway rolling stock. Volume 2 : methodology and data

    DOT National Transportation Integrated Search

    1981-11-01

    This report contains engineering parameter descriptions of major and distinctive freight vehicle configurations covering approximately 96% of the U.S. freight vehicle fleet. This data has been developed primarily for use in analytical simulation mode...

  13. Engineering data characterizing the fleet of U.S. railway rolling stock. Volume II, Methodology and data.

    DOT National Transportation Integrated Search

    1980-04-01

    This report contains engineering parameter descriptions of major and distinctive freight vehicle configurations covering approximately 96% of the U.S. freight vehicle fleet. This data has been developed primarily for use in analytical simulation mode...

  14. The Influence of State Policies on Critical Infrastructure Resilience: An Approach for Analyzing Transportation and Capital Investment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Thomas; Trail, Jessica; Gevondyan, Erna

    During times of crisis, communities and regions rely heavily on critical infrastructure systems to support their emergency management response and recovery activities. The resilience of critical infrastructure systems to crises is therefore a pivotal factor in a community's overall resilience. Critical infrastructure resilience can be influenced by many factors, including State policies, which are not always uniform in structure or application across the United States and which were identified by the U.S. Department of Homeland Security as an area of particular interest with respect to their influence on the resilience of critical infrastructure systems. This study focuses on developing an analytical methodology to assess links between policy and resilience, and applies that methodology to critical infrastructure in the Transportation Systems Sector. Specifically, it seeks to identify potentially influential linkages between State transportation capital funding policies and the resilience of bridges located on roadways under the management of public agencies. The study yielded notable methodological outcomes, including the general capability of the analytical methodology to produce, for some States, significant results connecting State policies with critical infrastructure resilience, with the suggestion that further refinement of the methodology may be beneficial.

  15. Early warning systems for the management of chronic heart failure: a systematic literature review of cost-effectiveness models.

    PubMed

    Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L

    2018-04-01

    Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients, and assessing the quality of their methodological characteristics, are expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched using combinations of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability in their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely the consideration and discussion of competing theories regarding model structure and disease progression, the identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.

  16. Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology

    PubMed Central

    Landau, Thomas P.; Ledley, Robert S.

    1980-01-01

    This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.

  17. Development of Diagnostic Analytical and Mechanical Ability Tests through Facet Design and Analysis.

    ERIC Educational Resources Information Center

    Guttman, Louis; Schlesinger, I. M.

    Methodology based on facet theory (modified set theory) was used in test construction and analysis to provide an efficient tool of evaluation for vocational guidance and vocational school use. The type of test development undertaken was limited to the use of nonverbal pictorial items. Items for testing ability to identify elements belonging to an…

  18. Checking Equity: Why Differential Item Functioning Analysis Should Be a Routine Part of Developing Conceptual Assessments

    ERIC Educational Resources Information Center

    Martinková, Patricia; Drabinová, Adéla; Liaw, Yuan-Ling; Sanders, Elizabeth A.; McFarland, Jenny L.; Price, Rebecca M.

    2017-01-01

    We provide a tutorial on differential item functioning (DIF) analysis, an analytic method useful for identifying potentially biased items in assessments. After explaining a number of methodological approaches, we test for gender bias in two scenarios that demonstrate why DIF analysis is crucial for developing assessments, particularly because…

  19. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  20. A micromechanics-based strength prediction methodology for notched metal matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1992-01-01

    An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stresses in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and post-fatigue residual strengths when fiber-matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage at the constituent level, thus explicitly including matrix nonlinearity, fiber-matrix debonding, and matrix cracking.

  1. A micromechanics-based strength prediction methodology for notched metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1993-01-01

    An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stresses in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and post-fatigue residual strengths when fiber-matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage at the constituent level, thus explicitly including matrix nonlinearity, fiber-matrix debonding, and matrix cracking.

  2. AN INTERDISCIPLINARY APPROACH TO VALUING WATER FROM BRUSH CONTROL

    EPA Science Inventory

    An analytical methodology utilizing models from three disciplines is developed to assess the viability of brush control for wate yield in the Frio River Basin, TX. Ecological, hydrologic, and economic models are used to portray changes in forage production and water supply result...

  3. METHODOLOGY TO EVALUATE THE POTENTIAL FOR GROUND WATER CONTAMINATION FROM GEOTHERMAL FLUID RELEASES

    EPA Science Inventory

    This report provides analytical methods and graphical techniques to predict potential ground water contamination from geothermal energy development. Overflows and leaks from ponds, pipe leaks, well blowouts, leaks from well casing, and migration from injection zones can be handle...

  4. Design Considerations of ISTAR Hydrocarbon Fueled Combustor Operating in Air Augmented Rocket, Ramjet and Scramjet Modes

    NASA Technical Reports Server (NTRS)

    Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.

    2003-01-01

    The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.

  5. Design Considerations of Istar Hydrocarbon Fueled Combustor Operating in Air Augmented Rocket, Ramjet and Scramjet Modes

    NASA Technical Reports Server (NTRS)

    Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.

    2002-01-01

    The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.

  6. Analytical simulation and PROFAT II: a new methodology and a computer automated tool for fault tree analysis in chemical process industries.

    PubMed

    Khan, F I; Abbasi, S A

    2000-07-10

    Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It is a powerful technique traditionally used to identify hazards in nuclear installations and power industries. Because the systematic articulation of the fault tree involves assigning a probability to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also cumbersome and costly, which limits its area of application. We have developed a new algorithm based on analytical simulation (named AS-II) that makes the application of FTA simpler, quicker and cheaper, thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
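    The abstract does not detail the AS-II algorithm itself, but the basic probability propagation that any fault tree evaluation performs can be sketched as follows; the tree and the base-event probabilities are hypothetical, and independence of base events is assumed:

```python
def gate_or(probs):
    """P(at least one event occurs), assuming independent base events."""
    p_none = 1.0
    for x in probs:
        p_none *= (1.0 - x)
    return 1.0 - p_none

def gate_and(probs):
    """P(all events occur), assuming independent base events."""
    p = 1.0
    for x in probs:
        p *= x
    return p

# hypothetical tree: the accident (top event) happens if
# (pump fails OR valve fails) AND the alarm also fails
p_pump, p_valve, p_alarm = 0.02, 0.01, 0.05
p_top = gate_and([gate_or([p_pump, p_valve]), p_alarm])
```

    Real FTA tools additionally derive minimal cut sets and handle repeated or dependent events; this sketch shows only the gate-by-gate probability roll-up from base events to the top event.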

  7. Combining analytical hierarchy process and agglomerative hierarchical clustering in search of expert consensus in green corridors development management.

    PubMed

    Shapira, Aviad; Shoshany, Maxim; Nir-Goldenberg, Sigal

    2013-07-01

    Environmental management and planning are instrumental in resolving conflicts between societal needs for economic development on the one hand and for open green landscapes on the other. Allocating green corridors between fragmented core green areas may provide a partial solution to these conflicts. Decisions regarding green corridor development require the assessment of alternative allocations based on multiple-criteria evaluations. The Analytical Hierarchy Process provides a methodology both for a structured and consistent extraction of such evaluations and for the search for consensus among experts regarding the weights assigned to the different criteria. Implementing this methodology with 15 Israeli experts (landscape architects, regional planners, and geographers) revealed inherent differences in expert opinion in this field beyond professional divisions. The use of Agglomerative Hierarchical Clustering made it possible to identify clusters representing common decisions regarding criterion weights. Aggregating the evaluations of these clusters revealed an important dichotomy between a pragmatist approach that emphasizes the weight of statutory criteria and an ecological approach that emphasizes the role of natural conditions in allocating green landscape corridors.
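    The core AHP step (deriving criterion weights from a pairwise comparison matrix) can be sketched briefly. The criteria and judgments below are hypothetical, and the sketch uses the geometric-mean approximation to the principal eigenvector rather than a full eigen-decomposition:

```python
import numpy as np

# hypothetical pairwise comparisons on Saaty's 1-9 scale for three criteria,
# e.g. statutory vs. ecological vs. economic considerations
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

n = A.shape[0]
g = np.prod(A, axis=1) ** (1.0 / n)   # row geometric means
w = g / g.sum()                       # normalized priority weights

# consistency check: lambda_max, consistency index, consistency ratio
lam_max = float((A @ w / w).mean())
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58                        # 0.58 = Saaty's random index for n = 3
```

    A consistency ratio below 0.1 is the conventional threshold for accepting an expert's judgments; clustering (as in the study) would then group experts by the similarity of their weight vectors `w`.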

  8. Combining Analytical Hierarchy Process and Agglomerative Hierarchical Clustering in Search of Expert Consensus in Green Corridors Development Management

    NASA Astrophysics Data System (ADS)

    Shapira, Aviad; Shoshany, Maxim; Nir-Goldenberg, Sigal

    2013-07-01

    Environmental management and planning are instrumental in resolving conflicts between societal needs for economic development on the one hand and for open green landscapes on the other. Allocating green corridors between fragmented core green areas may provide a partial solution to these conflicts. Decisions regarding green corridor development require the assessment of alternative allocations based on multiple-criteria evaluations. The Analytical Hierarchy Process provides a methodology both for a structured and consistent extraction of such evaluations and for the search for consensus among experts regarding the weights assigned to the different criteria. Implementing this methodology with 15 Israeli experts (landscape architects, regional planners, and geographers) revealed inherent differences in expert opinion in this field beyond professional divisions. The use of Agglomerative Hierarchical Clustering made it possible to identify clusters representing common decisions regarding criterion weights. Aggregating the evaluations of these clusters revealed an important dichotomy between a pragmatist approach that emphasizes the weight of statutory criteria and an ecological approach that emphasizes the role of natural conditions in allocating green landscape corridors.

  9. High Bandwidth Rotary Fast Tool Servos and a Hybrid Rotary/Linear Electromagnetic Actuator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montesanti, Richard Clement

    2005-09-01

    This thesis describes the development of two high-bandwidth short-stroke rotary fast tool servos and the hybrid rotary/linear electromagnetic actuator developed for one of them. Design insights, trade-off methodologies, and analytical tools are developed for precision mechanical systems, power and signal electronic systems, control systems, normal-stress electromagnetic actuators, and the dynamics of the combined systems.

  10. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function, owing to the variation of failure modes at different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled-hole and unnotched test data.

  11. Fuzzy Linear Programming and its Application in Home Textile Firm

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Ganesan, T.; Elamvazuthi, I.

    2011-06-01

    In this paper, a new fuzzy linear programming (FLP) methodology using a specific membership function, the modified logistic membership function, is proposed. The modified logistic membership function is first formulated, and its flexibility in capturing vagueness in parameters is established by an analytical approach. The developed FLP methodology provides confidence in its application to real-life industrial production planning problems. This approach to solving industrial production planning problems allows feedback among the decision maker, the implementer and the analyst.
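    The abstract does not give the exact form of the modified logistic membership function, but the general shape of a logistic membership function used in FLP can be sketched as follows; the normalization to [0, 1] and the steepness parameter `gamma` are assumptions made for illustration, not the paper's actual formulation:

```python
import math

def logistic_membership(x, lo, hi, gamma=13.8):
    """Degree of satisfaction for a fuzzy resource constraint:
    close to 1 near the fully acceptable level `lo`, decaying
    smoothly (S-shaped) toward 0 at the barely tolerable level `hi`."""
    t = (x - lo) / (hi - lo)                      # normalize x into [0, 1]
    return 1.0 / (1.0 + math.exp(gamma * (t - 0.5)))

# e.g. a resource whose use between 0 (ideal) and 10 (tolerable limit)
mu_ideal = logistic_membership(0.0, 0.0, 10.0)    # near 1
mu_limit = logistic_membership(10.0, 0.0, 10.0)   # near 0
```

    In an FLP formulation, each fuzzy constraint contributes such a membership value, and the solver maximizes the overall degree of satisfaction; `gamma` controls how sharply acceptability degrades, which is the "vagueness" knob the paper analyzes.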

  12. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    The component element method was used to develop a transient dynamic analysis computer program that is essentially based on modal synthesis combined with a central finite difference numerical integration scheme. The methodology leads to a modular, building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modeling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.
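    The central finite difference time integration named above can be sketched for a single modal equation m·x″ + c·x′ + k·x = f(t). This is a generic textbook scheme, not the TETRA code itself, and the oscillator parameters below are chosen only to permit a check against the exact solution:

```python
import math

def central_difference(m, c, k, f, x0, v0, h, steps):
    """Explicit central-difference integration of m*x'' + c*x' + k*x = f(t).
    Discretization: m*(x[n+1]-2x[n]+x[n-1])/h^2 + c*(x[n+1]-x[n-1])/(2h) + k*x[n] = f(t_n)."""
    a0 = (f(0.0) - c * v0 - k * x0) / m
    x_prev = x0 - h * v0 + 0.5 * h * h * a0   # fictitious start-up value x[-1]
    x = x0
    for n in range(steps):
        t = n * h
        rhs = f(t) - k * x + (2.0 * m / h**2) * x - (m / h**2 - c / (2.0 * h)) * x_prev
        x_next = rhs / (m / h**2 + c / (2.0 * h))
        x_prev, x = x, x_next
    return x

# undamped oscillator at 1 Hz: exact solution is x(t) = cos(2*pi*t),
# so after integrating to t = 1.0 the displacement should return to 1
x_end = central_difference(m=1.0, c=0.0, k=(2 * math.pi) ** 2,
                           f=lambda t: 0.0, x0=1.0, v0=0.0,
                           h=1e-3, steps=1000)
```

    In a modal-synthesis program each retained mode is advanced by such a recurrence, with the modal forces recomputed at every step; the scheme is conditionally stable, requiring the step h to be small relative to the highest retained modal period.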

  13. Gas chromatography coupled to tunable pulsed glow discharge time-of-flight mass spectrometry for environmental analysis.

    PubMed

    Solà-Vázquez, Auristela; Lara-Gonzalo, Azucena; Costa-Fernández, José M; Pereiro, Rosario; Sanz-Medel, Alfredo

    2010-05-01

    A tunable microsecond-pulsed direct current glow discharge (GD) time-of-flight mass spectrometer (TOFMS) developed in our laboratory was coupled to a gas chromatograph (GC) to obtain sequential collection of mass spectra, at the different temporal regimes occurring in the GD pulses, during elution of the analytes. The capabilities of this set-up were explored using a mixture of volatile organic compounds of environmental concern: BrClCH, Cl3CH, Cl4C, BrCl2CH, Br2ClCH, Br3CH. The experimental parameters of the GC-pulsed GD-TOFMS prototype were optimized to appropriately separate and analyze the six selected organic compounds, and two GC carrier gases, helium and nitrogen, were evaluated. Mass spectra for all analytes were obtained in the prepeak, plateau and afterpeak temporal regimes of the pulsed GD. Results showed that helium offered the best elemental sensitivity, while nitrogen provided higher signal intensities for fragment and molecular peaks. The analytical performance characteristics were also worked out for each analyte; absolute detection limits were on the order of nanograms. In a second step, headspace solid-phase microextraction (HS-SPME) was evaluated as a sample preparation and preconcentration technique for quantifying the compounds under study, in order to achieve the analytical sensitivity required by European Union (EU) environmental legislation on trihalomethanes. The analytical figures of merit obtained with the proposed methodology showed rather good detection limits (between 2 and 13 µg L⁻¹, depending on the analyte). In fact, the developed methodology met the EU legislative requirement that "total trihalomethanes" in tap water not exceed 100 µg L⁻¹. Real analyses of drinking water and river water were successfully carried out. To our knowledge, this is the first application of GC-pulsed GD-TOFMS to the analysis of real samples. Its ability to provide elemental, fragment and molecular information on organic compounds is demonstrated.

  14. Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dang, L. D.; Coats, D. E.

    1985-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.

  15. Introductory Guide to the Statistics of Molecular Genetics

    ERIC Educational Resources Information Center

    Eley, Thalia C.; Rijsdijk, Fruhling

    2005-01-01

    Background: This introductory guide presents the main two analytical approaches used by molecular geneticists: linkage and association. Methods: Traditional linkage and association methods are described, along with more recent advances in methodologies such as those using a variance components approach. Results: New methods are being developed all…

  16. Analyzing Agricultural Technology Systems: A Research Report.

    ERIC Educational Resources Information Center

    Swanson, Burton E.

    The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…

  17. Characteristics of Effective Leadership Networks

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Azah, Vera Ndifor

    2016-01-01

    Purpose: The purpose of this paper is to inquire about the characteristics of effective school leadership networks and the contribution of such networks to the development of individual leaders' professional capacities. Design/methodology/approach: The study used path-analytic techniques with survey data provided by 450 school and district leaders…

  18. Focal Event, Contextualization, and Effective Communication in the Mathematics Classroom

    ERIC Educational Resources Information Center

    Nilsson, Per; Ryve, Andreas

    2010-01-01

    The aim of this article is to develop analytical tools for studying mathematical communication in collaborative activities. The theoretical construct of contextualization is elaborated methodologically in order to study diversity in individual thinking in relation to effective communication. The construct of contextualization highlights issues of…

  19. Crewmember Performance before, during, and after Spaceflight

    ERIC Educational Resources Information Center

    Kelly, Thomas H.; Hienz, Robert D.; Zarcone, Troy J.; Wurster, Richard M.; Brady, Joseph V.

    2005-01-01

    The development of technologies for monitoring the welfare of crewmembers is a critical requirement for extended spaceflight. Behavior analytic methodologies provide a framework for studying the performance of individuals and groups, and brief computerized tests have been used successfully to examine the impairing effects of sleep, drug, and…

  20. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  1. Contamination in food from packaging material.

    PubMed

    Lau, O W; Wong, S K

    2000-06-16

    Packaging has become an indispensable element in the food manufacturing process, and different types of additives, such as antioxidants, stabilizers, lubricants, and anti-static and anti-blocking agents, have been developed to improve the performance of polymeric packaging materials. Packaging has recently been found to be a source of contamination itself, through the migration of substances from the packaging into the food. Various analytical methods have been developed to analyze these migrants in foodstuffs, and migration evaluation procedures based on theoretical prediction of migration from plastic food-contact materials have also been introduced. In this paper, the regulatory control, analytical methodology, factors affecting migration, and migration evaluation are reviewed.

  2. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    PubMed

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Critical review of dog detection and the influences of physiology, training, and analytical methodologies.

    PubMed

    Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M

    2018-08-01

    Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training practices concerning detection dogs remain ambiguous, partly because the assessment of detection-dog effectiveness is still entrenched in a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. The future of detection dogs is also strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies offer both an alternative to and a complement for the detection dog industry; however, the interrelationship between the two detection paradigms requires clarification. Taken together, these factors illustrate the need to address research gaps, to formalise the detection dog industry and its research process, and to consider analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of these factors in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Development of garlic bioactive compounds analytical methodology based on liquid phase microextraction using response surface design. Implications for dual analysis: Cooked and biological fluids samples.

    PubMed

    Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz

    2017-01-15

    Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several of its biological properties. Functional-food research indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high-performance liquid chromatography with ultraviolet detection (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,3-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function reached its optimum with 600 μL of chloroform as the extraction solvent and acetonitrile as the dispersant. The method proved to be reliable, precise and accurate, and it was successfully applied to determine OSCs in cooked garlic samples as well as in blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to accounting for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, when resources are limited for an ideal exploration of uncertainty, or when evidence to inform the model is unavailable or unreliable. Methods for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
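    Purely as an illustrative sketch (not drawn from the guide itself), parameter uncertainty of the kind discussed above is commonly propagated by probabilistic sensitivity analysis: draw each parameter from an assumed distribution, re-run the model, and report the share of draws in which one strategy is cost-effective. The two-strategy model, all distributions, and the willingness-to-pay threshold below are hypothetical:

```python
import random

def psa(n_draws=10_000, wtp=50_000, seed=1):
    """Monte Carlo probabilistic sensitivity analysis for a toy
    two-strategy decision model (all parameters are illustrative)."""
    random.seed(seed)
    cost_effective = 0
    for _ in range(n_draws):
        # Effectiveness uncertainty as Beta, cost uncertainty as Gamma.
        p_cure_new = random.betavariate(80, 20)   # mean ~0.80
        p_cure_old = random.betavariate(70, 30)   # mean ~0.70
        cost_new = random.gammavariate(100, 120)  # mean ~12,000
        cost_old = random.gammavariate(100, 80)   # mean ~8,000
        d_qaly = (p_cure_new - p_cure_old) * 2.0  # assume 2 QALYs per cure
        d_cost = cost_new - cost_old
        # Incremental net monetary benefit at the willingness-to-pay threshold.
        if wtp * d_qaly - d_cost > 0:
            cost_effective += 1
    return cost_effective / n_draws

prob_cost_effective = psa()
```

    Methodological and structural uncertainty, by contrast, are usually explored by re-running such an analysis under alternative model structures or assumptions and comparing the resulting distributions.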

  6. The Identification of Factors Affecting the Development and Practice of School-Based Counseling in Different National Contexts: A Grounded Theory Study Using a Worldwide Sample of Descriptive Journal Articles and Book Chapters

    ERIC Educational Resources Information Center

    Martin, Ian; Lauterbach, Alexandra; Carey, John

    2015-01-01

    A grounded theory methodology was used to analyze articles and book chapters describing the development and practice of school-based counseling in 25 different countries in order to identify the factors that affect development and practice. An 11-factor analytic framework was developed. Factors include: Cultural Factors, National Needs, Larger…

  7. Decision making in prioritization of required operational capabilities

    NASA Astrophysics Data System (ADS)

    Andreeva, P.; Karev, M.; Kovacheva, Ts.

    2015-10-01

    The paper describes an expert heuristic approach to the prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process (AHP), a methodology for prioritizing these capabilities has been developed and applied in practical simulated decision-making games.
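    As a minimal sketch of how an AHP prioritization step can be computed (the capability comparisons, pairwise judgments on Saaty's 1-9 scale, and matrix values below are hypothetical, not taken from the paper):

```python
import math

def ahp_weights(matrix):
    """AHP priority weights via the row geometric-mean approximation,
    plus the consistency ratio (CR) from Saaty's random indices."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    weights = [g / sum(gm) for g in gm]
    # Estimate the principal eigenvalue lambda_max from A @ w.
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)                      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
    return weights, ci / ri

# Three hypothetical capabilities compared pairwise (1-9 scale).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
```

    A consistency ratio above 0.1 would normally send the experts back to revise their pairwise judgments before the weights are used for prioritization.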

  8. Developing a New Interdisciplinary Lab Course for Undergraduate and Graduate Students: Plant Cells and Proteins

    ERIC Educational Resources Information Center

    Jez, Joseph M.; Schachtman, Daniel P.; Berg, R. Howard; Taylor, Christopher G.; Chen, Sixue; Hicks, Leslie M.; Jaworski, Jan G.; Smith, Thomas J.; Nielsen, Erik; Pikaard, Craig S.

    2007-01-01

    Studies of protein function increasingly use multifaceted approaches that span disciplines including recombinant DNA technology, cell biology, and analytical biochemistry. These studies rely on sophisticated equipment and methodologies including confocal fluorescence microscopy, mass spectrometry, and X-ray crystallography that are beyond the…

  9. The Search for Meaning in Factor Analytically Derived Dimensions.

    ERIC Educational Resources Information Center

    Hagekull, Berit

    This paper discusses the development of instruments to measure individual differences in behavior during infancy. The Infant Temperament Questionnaire (ITQ), which was designed to measure the temperament dimensions identified by the New York Longitudinal Study (NYLS), constituted the methodological starting point in the search for a dimensional…

  10. GC-MS (GAS CHROMATOGRAPHIC-MASS SPECTROMETRIC) SUITABILITY TESTING OF RCRA APPENDIX VIII AND MICHIGAN LIST ANALYTES

    EPA Science Inventory

    As a first step in a hierarchical scheme to demonstrate the suitability of present U.S. Environmental Protection Agency (USEPA) analysis methods and/or develop new methodology, the gas chromatographic (GC) separation and mass spectrometric (MS) detection characteristics of 328 to...

  11. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  12. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
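    The predictive step described above can be pictured, in highly simplified form, as fitting a calibration model that maps an attribute measured after the single affinity step to the value expected after the full purification train. The paired numbers below are invented for illustration, and an ordinary least-squares line stands in for whatever predictive model the authors actually used:

```python
def fit_line(x, y):
    """Ordinary least squares for a single predictor: y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical paired measurements: an attribute (e.g., charge-variant %)
# after affinity capture (x) versus after the multi-step train (y).
x = [2.1, 3.0, 3.8, 4.4, 5.2]
y = [1.8, 2.6, 3.3, 3.9, 4.6]
a, b = fit_line(x, y)
predicted = a + b * 4.0  # predict the drug-substance value from affinity data
```

    Once such a model is validated, quality-attribute decisions can be made from the single-step purification alone, which is where the throughput gain comes from.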

  13. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness: these are the typical requirements for state-of-the-art green analytical techniques. Liquid phase microextraction (LPME) denotes a family of elegant sample pretreatment and analyte preconcentration techniques that preserve these principles in numerous applications. By using only fractions of the solvent and sample required by classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can all be improved. Significant further improvements can be made by automation, which remains a hot topic in analytical chemistry. This two-part review comprehensively surveys developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages, limitations, and reported analytical performances are discussed and put into perspective with the corresponding manual procedures, and the automation strategies and techniques are described together with their operational advantages and potential. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the approach used to protect the extraction solvent: a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are reviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. RE-EVALUATION OF APPLICABILITY OF AGENCY SAMPLE HOLDING TIMES

    EPA Science Inventory

    Holding times are the length of time a sample can be stored after collection and prior to analysis without significantly affecting the analytical results. Holding times vary with the analyte, sample matrix, and analytical methodology used to quantify the analytes concentration. ...

  15. Analysis and Purification of Bioactive Natural Products: The AnaPurNa Study

    PubMed Central

    2012-01-01

    Based on a meta-analysis of data mined from almost 2000 publications on bioactive natural products (NPs) from >80 000 pages of 13 different journals published in 1998–1999, 2004–2005, and 2009–2010, the aim of this systematic review is to provide both a survey of the status quo and a perspective for analytical methodology used for isolation and purity assessment of bioactive NPs. The study provides numerical measures of the common means of sourcing NPs, the chromatographic methodology employed for NP purification, and the role of spectroscopy and purity assessment in NP characterization. A link is proposed between the observed use of various analytical methodologies, the challenges posed by the complexity of metabolomes, and the inescapable residual complexity of purified NPs and their biological assessment. The data provide inspiration for the development of innovative methods for NP analysis as a means of advancing the role of naturally occurring compounds as a viable source of biologically active agents with relevance for human health and global benefit. PMID:22620854

  16. Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.

    1989-01-01

    The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.

  17. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.
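    As a minimal sketch of the contrast drawn above between analytic methodology and Monte Carlo simulation (all numbers hypothetical): for a play whose field count is binomial (n prospects, success probability p) and whose field sizes are lognormal, the mean and variance of total resources can be written down analytically and cross-checked against a simulation.

```python
import math
import random

def analytic_play_moments(mean_n, var_n, mu, sigma):
    """Analytic mean/variance of total resources T = sum of N lognormal
    field sizes, via the law of total variance -- no simulation needed."""
    es = math.exp(mu + sigma ** 2 / 2)         # E[field size]
    vs = (math.exp(sigma ** 2) - 1) * es ** 2  # Var[field size]
    return mean_n * es, mean_n * vs + var_n * es ** 2

def monte_carlo_play(n_prospects, p_success, mu, sigma, trials=20_000, seed=7):
    """Monte Carlo cross-check: simulate prospect successes and field sizes."""
    random.seed(seed)
    totals = []
    for _ in range(trials):
        t = sum(random.lognormvariate(mu, sigma)
                for _ in range(n_prospects) if random.random() < p_success)
        totals.append(t)
    m = sum(totals) / trials
    v = sum((x - m) ** 2 for x in totals) / (trials - 1)
    return m, v

# 20 prospects, 30% success chance, lognormal(mu=2, sigma=0.8) field sizes.
et, vt = analytic_play_moments(20 * 0.3, 20 * 0.3 * 0.7, 2.0, 0.8)
mc_m, mc_v = monte_carlo_play(20, 0.3, 2.0, 0.8)
```

    The analytic route runs in constant time and aggregates cleanly across areas assessed with different models, which is the microcomputer-speed requirement the abstract refers to.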

  18. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying video analytics (VA) in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented, and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house-built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which automatically detects a "Visual Event", and EventBrowser, which displays and allows perusal of the "Visual Details" captured at the "Visual Event". To deal with both open-architecture and closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  19. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  20. Magnetic scavengers as carriers of analytes for flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS).

    PubMed

    Cegłowski, Michał; Kurczewska, Joanna; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz

    2015-09-07

    In this paper, a procedure for the preconcentration and transport of mixtures of acids, bases, and drug components to a mass spectrometer using magnetic scavengers is presented. Flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS) was used as an analytical method for identification of the compounds by thermal desorption from the scavengers. The proposed procedure is fast and cheap, and does not involve time-consuming purification steps. The developed methodology can be applied for trapping harmful substances in minute quantities, to transport them to specialized, remotely located laboratories.

  1. Wind-tunnel evaluation of NASA developed control laws for flutter suppression on a DC-10 derivative wing

    NASA Technical Reports Server (NTRS)

    Abel, I.; Newsom, J. R.

    1981-01-01

    Two flutter suppression control laws were synthesized, implemented, and tested on a low speed aeroelastic wing model of a DC-10 derivative. The methodology used to design the control laws is described. Both control laws demonstrated increases in flutter speed in excess of 25 percent above the passive wing flutter speed. The effect of variations in gain and phase on the closed loop performance was measured and compared with analytical predictions. The analytical results are in good agreement with experimental data.

  2. A methodology for the assessment of manned flight simulator fidelity

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; Malsbury, Terry N.

    1989-01-01

    A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.

  3. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; hide

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have already provided simple demonstrations of the methodology, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  4. The Concordance between EFL Learners' Linguistic Sequential Development and the Curricula of Formal and Informal Learning Settings: An Analytical Study

    ERIC Educational Resources Information Center

    Albaqshi, Jalal H.

    2016-01-01

    This research explores how the sequence of content in ESP curricula corresponds to learners' linguistic development and to authentic situations. The study was conducted at Alahsa College of Technology, Saudi Arabia. The methodology was a corpus-based analysis of an ESP textbook, matching the units of the textbook to students' needs…

  5. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
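    As a hedged illustration of the concepts above (the numbers are invented, not taken from the report): steady-state operational availability for a single repairable system is MTBF / (MTBF + MTTR), and a Monte Carlo simulation of alternating up/down cycles converges to the same figure.

```python
import random

def availability_analytic(mtbf, mttr):
    """Steady-state operational availability: uptime share of total time."""
    return mtbf / (mtbf + mttr)

def availability_sim(mtbf, mttr, cycles=50_000, seed=3):
    """Monte Carlo: alternate exponentially distributed up and down
    times and measure the observed uptime fraction."""
    random.seed(seed)
    up = down = 0.0
    for _ in range(cycles):
        up += random.expovariate(1 / mtbf)    # time to failure
        down += random.expovariate(1 / mttr)  # time to repair
    return up / (up + down)

# Hypothetical system: MTBF = 100 h, MTTR = 5 h.
a_exact = availability_analytic(100.0, 5.0)
a_sim = availability_sim(100.0, 5.0)
```

    The simulation route becomes useful precisely where the report goes beyond this closed form, e.g., for scenario-specific availability states or launch-on-time probabilities.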

  6. Two-factor theory – at the intersection of health care management and patient satisfaction

    PubMed Central

    Bohm, Josef

    2012-01-01

    Using data obtained from the 2004 Joint Canadian/United States Survey of Health, an analytic model using principles derived from Herzberg’s motivational hygiene theory was developed for evaluating patient satisfaction with health care. The analysis sought to determine whether survey variables associated with consumer satisfaction act as Herzberg factors and contribute to survey participants’ self-reported levels of health care satisfaction. To validate the technique, data from the survey were analyzed using logistic regression methods and then compared with results obtained from the two-factor model. The findings indicate a high degree of correlation between the two methods. The two-factor analytical methodology offers advantages due to its ability to identify whether a factor assumes a motivational or hygienic role and to assess the influence of a factor within select populations. Its ease of use makes this methodology well suited for assessment of multidimensional variables. PMID:23055755

  8. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    PubMed

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  9. Application of Haddon's matrix in qualitative research methodology: an experience in burns epidemiology.

    PubMed

    Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza

    2012-01-01

    Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) within a qualitative research methodology to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and analytical tool for qualitative burn research. Both child and adult burn injury victims were enrolled in a qualitative study conducted using focus group discussions. Haddon's matrix was used to develop the interview guide and again in the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event level/object, pre-event level/environment, and the event and post-event phases (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). The research yielded results that may be useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases, and its use here yielded substantially rich information about burn injuries that may be useful for prevention or for future quantitative research.

  10. Video analysis for insight and coding: Examples from tutorials in introductory physics

    NASA Astrophysics Data System (ADS)

    Scherr, Rachel E.

    2009-12-01

    The increasing ease of video recording offers new opportunities to create richly detailed records of classroom activities. These recordings, in turn, call for research methodologies that balance generalizability with interpretive validity. This paper shares methodology for two practices of video analysis: (1) gaining insight into specific brief classroom episodes and (2) developing and applying a systematic observational protocol for a relatively large corpus of video data. These two aspects of analytic practice are illustrated in the context of a particular research interest but are intended to serve as general suggestions.
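    When a systematic observational protocol of the kind described above is applied by multiple coders, agreement is commonly quantified with Cohen's kappa. This small sketch is not from the paper; the codes and segment counts are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical codes."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_chance = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    if p_chance == 1.0:  # both raters used a single shared code throughout
        return 1.0
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical codes assigned to video segments by two raters.
k_perfect = cohens_kappa(list("ABABABAB"), list("ABABABAB"))
k_chance = cohens_kappa(["A", "A", "B", "B"], ["A", "B", "A", "B"])
```

    A kappa near 1 indicates agreement well beyond chance; a kappa near 0 means the protocol's categories are not being applied consistently and need refinement.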

  11. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses analytical techniques and probability theory, decision analysis of management science, and computer simulation, in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing information for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
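
    A minimal sketch of the core selection problem described above, choosing the subset of tasks that maximizes expected value within a budget, might look like the following (task names, costs, and values are invented for illustration; SIMRAND itself layers simulation and decision analysis on top of such a search):

```python
from itertools import combinations

# Hypothetical tasks: (name, cost, expected_value).  Names and numbers
# are invented for illustration, not taken from the SIMRAND report.
tasks = [("A", 4.0, 10.0), ("B", 3.0, 7.0), ("C", 5.0, 12.0), ("D", 2.0, 4.0)]
budget = 9.0

def best_subset(tasks, budget):
    """Enumerate every subset of tasks, discard those over budget,
    and return the feasible subset with the highest total expected value."""
    best_names, best_value = [], 0.0
    for r in range(1, len(tasks) + 1):
        for subset in combinations(tasks, r):
            cost = sum(t[1] for t in subset)
            value = sum(t[2] for t in subset)
            if cost <= budget and value > best_value:
                best_names, best_value = [t[0] for t in subset], value
    return best_names, best_value

names, value = best_subset(tasks, budget)  # -> (['A', 'C'], 22.0)
```

    Exhaustive enumeration is only feasible for small task sets; with personnel and facility constraints added, each candidate subset simply gains extra feasibility checks.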

  12. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, the company has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a suite of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper describes the ISTI-developed methodology and how CASE tools are used in its support. Case studies are discussed.

  13. Determination of insoluble soap in agricultural soil and sewage sludge samples by liquid chromatography with ultraviolet detection.

    PubMed

    Cantarero, Samuel; Zafra-Gómez, Alberto; Ballesteros, Oscar; Navalón, Alberto; Vílchez, José L; Crovetto, Guillermo; Verge, Coral; de Ferrer, Juan A

    2010-11-01

    We have developed a new analytical procedure for determining insoluble Ca and Mg fatty acid salts (soaps) in agricultural soil and sewage sludge samples. The number of analytical methodologies that focus on the determination of insoluble soap salts in different environmental compartments is very limited. In this work, we propose a methodology that involves a sample clean-up step with petroleum ether to remove soluble salts and a conversion of Ca and Mg insoluble salts into soluble potassium salts using tripotassium ethylenediaminetetraacetate salt and potassium carbonate, followed by the extraction of analytes from the samples using microwave-assisted extraction with methanol. An improved esterification procedure using 2,4-dibromoacetophenone before the liquid chromatography with ultraviolet detection analysis has also been developed. The absence of matrix effect was demonstrated with two fatty acid Ca salts that are not commercial and are never detected in natural samples (C₁₃:₀ and C₁₇:₀). It was thus possible to evaluate the matrix effect because both standards have environmental behavior (adsorption and precipitation) similar to that of commercial soaps (C₁₀:₀ to C₁₈:₀). We also studied the effect of the different variables on the clean-up, the conversion of Ca soap, and the extraction and derivatization procedures. The quantification limits found ranged from 0.4 to 0.8 mg/kg. The proposed method was satisfactorily applied in a study on soap behavior in agricultural soil and sewage sludge samples. © 2010 SETAC.

  14. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  15. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. The report discusses the development of the data sets and uncertainty estimates used in the new model, presents the application of uncertainty analysis to analytical models, including the conservation-of-mass and energy balance relations, and introduces a new methodology for assessing the uncertainty associated with linear regressions.
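
    As context for the regression-uncertainty methodology mentioned above, the textbook baseline is the standard error of an ordinary least-squares slope; a minimal sketch with made-up data (not from the SSME report) is:

```python
import math

def slope_with_uncertainty(x, y):
    """Ordinary least-squares slope b and its standard error se(b),
    a textbook baseline for assessing regression uncertainty."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    a = ybar - b * xbar
    # residual variance with n - 2 degrees of freedom
    s2 = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2)
    return b, math.sqrt(s2 / sxx)

# Made-up calibration-style data:
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
b, se = slope_with_uncertainty(x, y)  # b = 1.94, se ~ 0.091
```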

  16. Multi-scaling allometric analysis for urban and regional development

    NASA Astrophysics Data System (ADS)

    Chen, Yanguang

    2017-01-01

    The concept of allometric growth is based on scaling relations, and it has long been applied to urban and regional analysis. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system; few studies focus on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studying the spatio-temporal evolution of complex systems. By means of linear algebra and general system theory, and by analogy with the analytic hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension, yielding a new methodology of geo-spatial analysis and related theoretical models. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. The conclusion is that multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.
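
    The elementary computation underlying allometric analysis, estimating the scaling exponent b in y = a·x^b, reduces to least-squares regression on log-transformed data; a minimal sketch with synthetic data (the multiscaling method of the paper generalizes this to many elements at once) is:

```python
import math

def allometric_exponent(x, y):
    """Estimate the scaling exponent b in y = a * x**b by least-squares
    regression on the log-log transformed data."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((u - mx) * (v - my) for u, v in zip(lx, ly))
    den = sum((u - mx) ** 2 for u in lx)
    return num / den

# Synthetic data obeying y = 2 * x**1.5 exactly:
x = [1.0, 2.0, 4.0, 8.0]
y = [2.0 * v ** 1.5 for v in x]
b = allometric_exponent(x, y)  # -> 1.5 (up to floating-point error)
```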

  17. Antimony in the environment as a global pollutant: a review on analytical methodologies for its determination in atmospheric aerosols.

    PubMed

    Smichowski, Patricia

    2008-03-15

    This review summarizes and discusses research on the determination of antimony and its predominant chemical species in atmospheric aerosols. Environmental matrices such as airborne particulate matter, fly ash, and volcanic ash present a number of complex analytical challenges, as very sensitive analytical techniques and highly selective separation methodologies are required for speciation studies. Given the diversity of instrumental approaches and methodologies employed for the determination of antimony and its species in environmental matrices, the objective of this review is to briefly discuss the most relevant findings reported in recent years for this remarkable element and to identify future needs and trends. The survey includes 92 references and covers principally the literature published over the last decade.

  18. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a large number of accidents, most of them due to roof fall. The development of methodologies to evaluate roof fall susceptibility (RFS) therefore seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main defect of that method is its neglect of subjective uncertainties arising from linguistic input values for some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this defect and improve the earlier method, this paper presents a novel methodology for assessing RFS using a fuzzy approach, which provides an effective tool for handling subjective uncertainties. Furthermore, fuzzy analytic hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during development of the method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that the methodology is effective and efficient in assessing RFS.
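
    As background, the crisp AHP step of prioritizing risk factors derives weights from a pairwise comparison matrix. A common approximation, shown here with an invented 3x3 matrix (the paper's fuzzy AHP extends this idea with fuzzy numbers), uses row geometric means:

```python
from math import prod

def ahp_weights(M):
    """Priority weights from a pairwise comparison matrix using the
    row geometric-mean method, a standard approximation to the
    principal-eigenvector weights of classical AHP."""
    n = len(M)
    gm = [prod(row) ** (1.0 / n) for row in M]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]

# Invented, perfectly consistent comparison matrix: factor 1 is twice as
# important as factor 2 and four times as important as factor 3.
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = ahp_weights(M)  # -> approximately [0.571, 0.286, 0.143]
```

    For a perfectly consistent matrix the geometric-mean weights coincide with the eigenvector weights; for real, slightly inconsistent judgments they remain a close and widely used approximation.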

  19. Keeping Connected: A Review of the Research Relationship

    ERIC Educational Resources Information Center

    Moss, Julianne; Hay, Trevor

    2014-01-01

    In this paper, some key findings of the Keeping Connected project are discussed in light of the methodological challenges of developing an analytical approach in a large-scale study, particularly in starting with open-ended, participant-selected, digital still visual images as part of 31 longitudinal case studies. The paper works to clarify the…

  20. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  1. Consumer Learning for University Students: A Case for a Curriculum

    ERIC Educational Resources Information Center

    Crafford, Sharon; Bitzer, Eli

    2009-01-01

    This article indicates how the application of a simplified version of the analytical abstraction method (AAM) was used in curriculum development for consumer learning at one higher education institution in South Africa. We used a case study design and qualitative research methodology to generate data through semi-structured interviews with eight…

  2. Who Owns Educational Theory? Big Data, Algorithms and the Expert Power of Education Data Science

    ERIC Educational Resources Information Center

    Williamson, Ben

    2017-01-01

    "Education data science" is an emerging methodological field which possesses the algorithm-driven technologies required to generate insights and knowledge from educational big data. This article consists of an analysis of the Lytics Lab, Stanford University's laboratory for research and development in learning analytics, and the Center…

  3. Tool to Prioritize Energy Efficiency Investments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farese, P.; Gelman, R.; Hendron, R.

    2012-08-01

    To provide analytic support to the U.S. Department of Energy's Building Technology Program (BTP), NREL developed a Microsoft Excel-based tool that provides an open and objective comparison of the hundreds of investment opportunities available to BTP. The tool uses established methodologies to evaluate the energy savings of each investment and the cost of those savings.
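
    A minimal sketch of one established comparison metric such a tool might apply, cost per unit of energy saved (the option names and figures below are invented, not taken from the NREL tool):

```python
def rank_by_cost_of_saved_energy(options):
    """Rank efficiency investments by cost per unit of energy saved
    (lower is better).  options: list of (name, annualized_cost_usd,
    annual_savings_kwh) tuples."""
    return sorted(options, key=lambda o: o[1] / o[2])

# Invented example options (not from the NREL tool):
options = [
    ("attic insulation", 120.0, 800.0),   # $0.150 per kWh saved
    ("LED retrofit",      40.0, 500.0),   # $0.080 per kWh saved
    ("HVAC tune-up",      90.0, 450.0),   # $0.200 per kWh saved
]
ranked = rank_by_cost_of_saved_energy(options)  # LED retrofit ranks first
```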

  4. Science, Social Work, and Intervention Research: The Case of "Critical Time Intervention"

    ERIC Educational Resources Information Center

    Jenson, Jeffrey M.

    2014-01-01

    Intervention research is an important, yet often neglected, focus of social work scholars and investigators. The purpose of this article is to review significant milestones and recent advances in intervention research. Methodological and analytical developments in intervention research are discussed in the context of science and social work.…

  5. A Methodology in the Teaching Process of Calculus and Its Motivation.

    ERIC Educational Resources Information Center

    Vasquez-Martinez, Claudio-Rafael

    The development of calculus and science by being permanent, didactic, demands on one part an analytical, deductive study and on another an application of methods, rhochrematics, resources, within calculus, which allows to dialectically conform knowledge in its different phases and to test the results. For the purpose of this study, the motivation…

  6. Educational Approaches to Entrepreneurship in Higher Education: A View from the Swedish Horizon

    ERIC Educational Resources Information Center

    Hoppe, Magnus; Westerberg, Mats; Leffler, Eva

    2017-01-01

    Purpose: The purpose of this paper is to present and develop models of educational approaches to entrepreneurship that can provide complementary analytical structures to better study, enact and reflect upon the role of entrepreneurship in higher education. Design/methodology/approach A general framework for entrepreneurship education is developed…

  7. The Pedagogical Foundations of Primary School Inspector Leonor Serrano (1914-1939)

    ERIC Educational Resources Information Center

    Ortells Roca, Miguel; Traver Martí, Juan

    2018-01-01

    This article aims to reconstruct the pedagogy of Leonor Serrano, a Spanish school inspector working and developing her theories between 1914 and 1939. We use an interrogative-analytical methodology based on content analysis of her texts to reconstruct her educational theory. The theoretical deductive elements are uncovered in the analysis of the…

  8. Integrating Developmental Theory and Methodology: Using Derivatives to Articulate Change Theories, Models, and Inferences

    ERIC Educational Resources Information Center

    Deboeck, Pascal R.; Nicholson, Jody; Kouros, Chrystyna; Little, Todd D.; Garber, Judy

    2015-01-01

    Matching theories about growth, development, and change to appropriate statistical models can present a challenge, which can result in misuse, misinterpretation, and underutilization of different analytical approaches. We discuss the use of "derivatives": the change of a construct with respect to the change in another construct.…

  9. 'Coxiella Burnetii' Vaccine Development: Lipopolysaccharide Structural Analysis

    DTIC Science & Technology

    1991-02-20

    Analytical instrumentation and methodology are presented for the determination of endotoxin-related structures at much improved sensitivity and specificity, covering endotoxin characterization by supercritical fluid chromatography (SFC) and Coxiella burnetii LPS characterization. Reports, and their applications, are listed in the document.

  10. "Great Classroom Teaching" and More: Awards for Outstanding Teaching Evaluated

    ERIC Educational Resources Information Center

    Jackson, Michael

    2006-01-01

    Purpose: In this paper teaching excellence awards are evaluated, with an eye to improving them. Design/methodology/approach: Literature is reviewed and an analytic framework developed in Canada is modified to apply to the University of Sydney's Vice Chancellor Outstanding Teaching Award. Data come from 60 respondents familiar with the Sydney award…

  11. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airline operations, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the airline's operational profit, while the demand aspect focuses on incorporating mode choice modeling into the developed model. The proposed methodology is outlined in two stages: Fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand over the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling can affect the airline's operational profit and fleet management decisions to varying degrees.

  12. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for the determination of persistent organic pollutants in environmental and biological samples. In this light, the current review presents state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and their metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches that lower solvent consumption and accelerate extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are discussed, even though they are scarcely used for the determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster, and simpler sample preparation approaches increased over the review period (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, while liquid-phase microextraction techniques are mostly used for liquid samples. Likewise, the green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Direct trace-elemental analysis of urine samples by laser ablation-inductively coupled plasma mass spectrometry after sample deposition on clinical filter papers.

    PubMed

    Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín

    2012-10-16

    Collection of biological fluids on clinical filter papers offers important logistic advantages, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace-elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is the goal, several problems arise, such as lack of sensitivity or uneven distribution of the analytes on the filter paper, making reliable quantitative results difficult to obtain. In this paper, a novel approach to urine collection is proposed that circumvents many of these problems. The methodology consists of using precut filter paper discs on which large amounts of sample can be retained in a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and allows addition of an adequate internal standard at the clinical lab prior to analysis, making it suitable for a strategy based on unsupervised sample collection and subsequent analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit for purpose for clinical control analysis.

  14. Novel analytical methods to assess the chemical and physical properties of liposomes.

    PubMed

    Kothalawala, Nuwan; Mudalige, Thilak K; Sisco, Patrick; Linder, Sean W

    2018-08-01

    Liposomes are used in commercial pharmaceutical formulations (PFs) and dietary supplements (DSs) as a carrier vehicle to protect the active ingredient from degradation and to increase the half-life of the injectable. Even as the commercialization of liposomal products has rapidly increased, characterization methodologies to evaluate the physical and chemical properties of these products have not been well established. Herein we develop rapid methodologies to evaluate chemical and selected physical properties of liposomal formulations. The chemical properties of liposomes are determined by their lipid composition, which is evaluated by first screening the lipids present in the sample using HPLC-ELSD, followed by HPLC-MS/MS analysis with high mass accuracy (<5 ppm), fragmentation-pattern matching, and lipid structure database searching. Physical properties such as particle size and size distribution were investigated using tunable resistive pulse sensing (TRPS). The developed methods were used to analyze commercially available PFs and DSs. The PFs contained the distinct set of lipids indicated by the manufacturer, whereas the DSs were more complex, containing a large number of lipids belonging to different subclasses. Commercially available liposomes have particles with a wide size distribution, based on size measurements performed by TRPS. The high mass accuracy, together with lipid identification using multiple fragment ions, made it possible to accurately identify the lipids and differentiate them from other lipophilic molecules. The developed analytical methodologies were successfully adapted to measure the physicochemical properties of commercial liposomes. Copyright © 2018. Published by Elsevier B.V.

  15. Quantitation of DNA adducts by stable isotope dilution mass spectrometry

    PubMed Central

    Tretyakova, Natalia; Goggin, Melissa; Janis, Gregory

    2012-01-01

    Exposure to endogenous and exogenous chemicals can lead to the formation of structurally modified DNA bases (DNA adducts). If not repaired, these nucleobase lesions can cause polymerase errors during DNA replication, leading to heritable mutations potentially contributing to the development of cancer. Due to their critical role in cancer initiation, DNA adducts represent mechanism-based biomarkers of carcinogen exposure, and their quantitation is particularly useful for cancer risk assessment. DNA adducts are also valuable in mechanistic studies linking tumorigenic effects of environmental and industrial carcinogens to specific electrophilic species generated from their metabolism. While multiple experimental methodologies have been developed for DNA adduct analysis in biological samples – including immunoassay, HPLC, and ³²P-postlabeling – isotope dilution high performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) generally has superior selectivity, sensitivity, accuracy, and reproducibility. As typical DNA adduct concentrations in biological samples are between 0.01 and 10 adducts per 10⁸ normal nucleotides, ultrasensitive HPLC-ESI-MS/MS methodologies are required for their analysis. Recent developments in analytical separations and biological mass spectrometry – especially nanoflow HPLC, nanospray ionization MS, chip-MS, and high resolution MS – have pushed the limits of analytical HPLC-ESI-MS/MS methodologies for DNA adducts, allowing researchers to accurately measure their concentrations in biological samples from patients treated with DNA alkylating drugs and in populations exposed to carcinogens from urban air, drinking water, cooked food, alcohol, and cigarette smoke. PMID:22827593
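
    The quantitation principle of isotope dilution is simple: the analyte amount follows from the measured peak-area ratio of analyte to its isotope-labeled internal standard and the known spike amount. A minimal sketch (function name and numbers are illustrative, not from the cited work):

```python
def isotope_dilution_amount(area_analyte, area_label, spike_amount, rrf=1.0):
    """Analyte amount from the peak-area ratio of analyte to its
    stable-isotope-labeled internal standard.  Assumes the labeled spike
    behaves identically through sample preparation and ionization;
    rrf is the relative response factor (1.0 for identical response)."""
    return spike_amount * (area_analyte / area_label) / rrf

# Invented example: 50 fmol labeled spike, analyte/label area ratio 0.2
amount = isotope_dilution_amount(2000.0, 10000.0, 50.0)  # -> 10.0 fmol
```

    Because the labeled standard co-elutes and co-ionizes with the analyte, losses during sample preparation and matrix-dependent ionization effects cancel in the ratio, which is what gives the technique its accuracy.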

  16. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  17. HPLC method development for evolving applications in the pharmaceutical industry and nanoscale chemistry

    NASA Astrophysics Data System (ADS)

    Castiglione, Steven Louis

    As scientific research trends toward trace levels and smaller architectures, the analytical chemist is often faced with the challenge of quantitating such species in a variety of matrices. The challenge is heightened when the analytes prove to be potentially toxic or possess physical or chemical properties that make traditional analytical methods problematic. In such cases, the successful development of an acceptable quantitative method plays a critical role in the ability to further develop the species under study. This is particularly true for pharmaceutical impurities and nanoparticles (NPs). The first portion of the research focuses on the development of a part-per-billion level HPLC method for a substituted phenazine-class pharmaceutical impurity. This method was required because a rapid methodology was needed to quantitatively determine levels of a potentially toxic phenazine moiety in order to ensure patient safety. As the synthetic pathway for the active ingredient was continuously refined to produce progressively lower amounts of the phenazine impurity, increasingly sensitive quantitative methods were required. The approaches evolved across four discrete methods, each employing a unique scheme for analyte detection. All developed methods were evaluated with regard to accuracy, precision, and linear adherence, as well as ancillary benefits and detriments; e.g., one method in this evolution demonstrated the ability to resolve and detect other species from the phenazine class. The second portion of the research focuses on the development of an HPLC method for the quantitative determination of NP size distributions. The current methodology for the determination of NP sizes employs transmission electron microscopy (TEM), which requires sample drying without particle size alteration and which, in many cases, may prove infeasible due to cost or availability.
    The feasibility of an HPLC method for NP size characterization was explored across three methods, each employing a different approach to size resolution. These methods were evaluated primarily for sensitivity, which proved to be a substantial hurdle to further development but does not appear to deter future research efforts.

  18. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  19. Evolution of a primary pulse in the granular dimers mounted on a linear elastic foundation: An analytical and numerical study.

    PubMed

    Ahsan, Zaid; Jayaprakash, K R

    2016-10-01

    In this exposition we consider the wave dynamics of a one-dimensional periodic granular dimer (diatomic) chain mounted on a damped or undamped linear elastic foundation (otherwise called the on-site potential). It is well known that periodic granular dimers support solitary wave propagation (similar to that in homogeneous granular chains) for a specific discrete set of mass ratios. In this work we present an analytical investigation of the evolution of solitary waves and primary pulses in granular dimers mounted on an on-site potential, with and without velocity-proportional foundation damping. We invoke a methodology based on multiple time-scale asymptotic analysis and partition the dynamics of the perturbed dimer chain into slow and fast components. The dynamics of the dimer chain in the limit of large mass mismatch (the auxiliary chain), mounted on the on-site potential with foundation damping, is used as the basis for the analysis. A systematic analytical procedure is then developed for the slowly varying response of the beads and for estimating primary pulse amplitude evolution, resulting in a nonlinear map relating the relative displacement amplitudes of two adjacent beads. The methodology is applicable for arbitrary mass ratios between the beads. We present several examples to demonstrate the efficacy of the proposed method. The amplitude evolution predicted by the described methodology is in good agreement with numerical simulation of the original system. This work forms a basis for further application of the considered methodology to weakly coupled granular dimers, which find practical relevance in designing shock-mitigating granular layers.

  20. Phase-0/microdosing studies using PET, AMS, and LC-MS/MS: a range of study methodologies and conduct considerations. Accelerating development of novel pharmaceuticals through safe testing in humans - a practical guide.

    PubMed

    Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T

    2017-05-01

    Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use, and implied safety, of limited exposures to test articles. Use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometry (AMS), positron emission tomography (PET), and liquid chromatography-tandem mass spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade demonstrated the reliability of extrapolation of sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilization of combinations of these analytic techniques increases the versatility of study designs and the power of data obtained.

  1. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  2. Development of a validated liquid chromatographic method for quantification of sorafenib tosylate in the presence of stress-induced degradation products and in biological matrix employing analytical quality by design approach.

    PubMed

    Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder

    2018-05-01

    The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to embark upon the analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate, potentially affecting the chosen critical analytical attributes. Systematic optimization of the chosen critical method parameters using response surface methodology was carried out employing a two-factor, three-level, 13-run, face-centered cubic design. A method operable design region providing optimum method performance was earmarked using numerical and graphical optimization. The optimum method employed a mobile phase composition of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C18 column. Validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectrometry studies showed that SFN degrades under strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug was per se found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation, with products at retention times of 3.35, 3.65, 4.20 and 5.67 min.
The absence of any significant change in the retention time of SFN and degradation products, formed under different stress conditions, ratified selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.

  3. A Methodology for Determining Statistical Performance Compliance for Airborne Doppler Radar with Forward-Looking Turbulence Detection Capability

    NASA Technical Reports Server (NTRS)

    Bowles, Roland L.; Buck, Bill K.

    2009-01-01

    The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew; they were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probability of missed and false hazard indications, taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
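    The noise-limited analysis described above reduces, at its core, to evaluating tail probabilities of a thresholded hazard estimate. A minimal sketch of that calculation for a single-threshold detector with additive Gaussian noise follows; the threshold, hazard levels, and noise figure are illustrative assumptions, not E-Turb or TSO-C134 values.

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """CDF of a Gaussian with mean mu and standard deviation sigma."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def detection_error_probs(threshold, mu_hazard, mu_clear, sigma):
    """Missed-detection and false-alarm probabilities for a threshold
    test on a noisy hazard estimate corrupted by Gaussian noise."""
    # Missed: the estimate falls below the threshold despite a real hazard.
    p_missed = normal_cdf(threshold, mu_hazard, sigma)
    # False: the estimate exceeds the threshold with no hazard present.
    p_false = 1.0 - normal_cdf(threshold, mu_clear, sigma)
    return p_missed, p_false
```

Placing the threshold midway between the clear-air and hazard levels trades the two error probabilities symmetrically; the full certification analysis additionally folds in the hazard algorithms' own transfer characteristics.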

  4. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  5. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  6. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  7. Quantification of endocrine disruptors and pesticides in water by gas chromatography-tandem mass spectrometry. Method validation using weighted linear regression schemes.

    PubMed

    Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P

    2010-10-22

    A multi-residue methodology based on a solid phase extraction followed by gas chromatography-tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticides analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for analytical data, weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the greater concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
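    The weighting scheme described, typically 1/x or 1/x² weights on the calibration points, has a closed-form solution. A minimal sketch follows; the calibration data are invented for illustration.

```python
def weighted_linear_fit(x, y, weights):
    """Weighted least-squares fit of y = a + b*x, minimizing
    sum(w_i * (y_i - a - b*x_i)**2)."""
    sw   = sum(weights)
    swx  = sum(w * xi for w, xi in zip(weights, x))
    swy  = sum(w * yi for w, yi in zip(weights, y))
    swxx = sum(w * xi * xi for w, xi in zip(weights, x))
    swxy = sum(w * xi * yi for w, xi, yi in zip(weights, x, y))
    b = (sw * swxy - swx * swy) / (sw * swxx - swx ** 2)
    a = (swy - b * swx) / sw
    return a, b

# 1/x^2 weighting counteracts the dominance of high-concentration
# standards, improving accuracy at the low end of the curve.
x = [1.0, 5.0, 10.0, 50.0, 100.0]
y = [2.1, 10.4, 20.2, 101.5, 198.0]
w = [1.0 / xi ** 2 for xi in x]
intercept, slope = weighted_linear_fit(x, y, w)
```

Compared with ordinary least squares, the fitted line here tracks the sub-ppb standards more closely, which is exactly the accuracy-at-the-lower-end effect the abstract describes.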

  8. Application of person-centered analytic methodology in longitudinal research: exemplars from the Women's Health Initiative Clinical Trial data.

    PubMed

    Zaslavsky, Oleg; Cochrane, Barbara B; Herting, Jerald R; Thompson, Hilaire J; Woods, Nancy F; Lacroix, Andrea

    2014-02-01

    Despite the variety of available analytic methods, longitudinal research in nursing has been dominated by use of a variable-centered analytic approach. The purpose of this article is to present the utility of person-centered methodology using a large cohort of American women 65 and older enrolled in the Women's Health Initiative Clinical Trial (N = 19,891). Four distinct trajectories of energy/fatigue scores were identified. Levels of fatigue were closely linked to age, socio-demographic factors, comorbidities, health behaviors, and poor sleep quality. These findings were consistent regardless of the methodological framework. Finally, we demonstrated that energy/fatigue levels predicted future hospitalization in non-disabled elderly. Person-centered methods provide unique opportunities to explore and statistically model the effects of longitudinal heterogeneity within a population. © 2013 Wiley Periodicals, Inc.
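    The study itself used specialized trajectory mixture modeling; as a rough illustration of the person-centered idea (grouping people by their whole score trajectory rather than modeling one variable at a time), a naive k-means over per-person trajectories can be sketched. The data, cluster count, and deterministic initialization below are assumptions for illustration, not the article's method.

```python
def kmeans_trajectories(points, k, iters=50):
    """Naive k-means over per-person trajectories.  Each point is a tuple
    of repeated measures (e.g. an energy/fatigue score at each study wave).
    Centers start at evenly spaced points: naive, but deterministic."""
    centers = [points[i * len(points) // k] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each person to the nearest center (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # move each center to the mean trajectory of its cluster
        centers = [tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters
```

Each resulting center is itself a trajectory, analogous to the four distinct energy/fatigue trajectories identified in the cohort.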

  9. Optimal design of piezoelectric transformers: a rational approach based on an analytical model and a deterministic global optimization.

    PubMed

    Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand

    2007-07-01

    This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of the analytical models for describing the corresponding optimization problem and on an exact global optimization software, named IBBA and developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.

  10. Assessment of pesticide contamination in soil samples from an intensive horticulture area, using ultrasonic extraction and gas chromatography-mass spectrometry.

    PubMed

    Gonçalves, C; Alpendurada, M F

    2005-03-15

    In order to reduce the amount of sample to be collected and the time consumed in the analytical process, a broad range of analytes should be preferably considered in the same analytical procedure. A suitable methodology for pesticide residue analysis in soil samples was developed based on ultrasonic extraction (USE) and gas chromatography-mass spectrometry (GC-MS). For this study, different classes of pesticides were selected, both recent and old persistent molecules: parent compounds and degradation products, namely organochlorine, organophosphorous and pyrethroid insecticides, triazine and acetanilide herbicides and other miscellaneous pesticides. Pesticide residues could be detected in the low- to sub-ppb range (0.05-7.0 μg kg(-1)) with good precision (7.5-20.5%, average 13.7% R.S.D.) and extraction efficiency (69-118%, average 88%) for the great majority of analytes. This methodology has been applied in a monitoring program of soil samples from an intensive horticulture area in Póvoa de Varzim, North of Portugal. The pesticides detected in four sampling programs (2001/2002) were the following: lindane, dieldrin, endosulfan, endosulfan sulfate, 4,4'-DDE, 4,4'-DDD, atrazine, desethylatrazine, alachlor, dimethoate, chlorpyrifos, pendimethalin, procymidone and chlorfenvinphos. Pesticide contamination was investigated at three depths and in different soil and crop types to assess the influence of soil characteristics and trends over time.

  11. Use of Cdse/ZnS quantum dots for sensitive detection and quantification of paraquat in water samples.

    PubMed

    Durán, Gema M; Contento, Ana M; Ríos, Ángel

    2013-11-01

    Based on the highly sensitive fluorescence change of water-soluble CdSe/ZnS core-shell quantum dots (QDs) caused by the herbicide paraquat, a simple, rapid and reproducible methodology was developed to selectively determine paraquat (PQ) in water samples. The methodology enabled the use of a simple pretreatment procedure based on water solubilization of CdSe/ZnS QDs with hydrophilic heterobifunctional thiol ligands, such as 3-mercaptopropionic acid (3-MPA), using microwave irradiation. The resulting water-soluble QDs exhibit a strong fluorescence emission at 596 nm with high and reproducible photostability. The proposed analytical method thus satisfies the need for a simple, sensitive and rapid methodology to determine residues of paraquat in water samples, as required by the increasingly strict regulations for health protection introduced in recent years. The sensitivity of the method, expressed as detection limit, was as low as 3.0 ng L(-1). The linear range was 10 to 5×10(3) ng L(-1). Recovery values in the range of 71-102% were obtained. The analytical applicability of the proposed method was demonstrated by analyzing water samples of different origins. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Laboratory Training Manual on the Use of Nuclear Techniques in Pesticide Research. Technical Reports Series No. 225.

    ERIC Educational Resources Information Center

    International Atomic Energy Agency, Vienna (Austria).

    Radiolabelled pesticides are used: in studies involving improved formulations of pesticides, to assist in developing standard residue analytical methodology, and in obtaining metabolism data to support registration of pesticides. This manual is designed to give the scientist involved in pesticide research the basic terms and principles for…

  13. A Task-Based Needs Analysis for Australian Aboriginal Students: Going beyond the Target Situation to Address Cultural Issues

    ERIC Educational Resources Information Center

    Oliver, Rhonda; Grote, Ellen; Rochecouste, Judith; Exell, Michael

    2013-01-01

    While needs analyses underpin the design of second language analytic syllabi, the methodologies undertaken are rarely examined. This paper explores the value of multiple data sources and collection methods for developing a needs analysis model to enable vocational education and training teachers to address the needs of Australian Aboriginal…

  14. Developing a University Contribution to Teacher Education: Creating an Analytical Space for Learning Narratives

    ERIC Educational Resources Information Center

    Hanley, Chris; Brown, Tony

    2017-01-01

    What might a distinct university contribution to teacher education look like? This paper tracks a group of prospective teachers making the transition from undergraduate to teacher on a one-year school-based postgraduate course. The study employs a practitioner research methodological framework where teacher learning is understood as a process of…

  15. A Methodology in the Teaching Process of the Derivative and Its Motivation.

    ERIC Educational Resources Information Center

    Vasquez-Martinez, Claudio-Rafael

    The development of the derivative, being part of calculus in permanent dialectic, demands on the one hand an analytical, deductive study and on the other the application of rochrematic methods and sources of resources within the calculus of the derivative, which allows knowledge to be confronted dialectically in its different phases and the results to be tested.…

  16. Online Learning Era: Exploring the Most Decisive Determinants of MOOCs in Taiwanese Higher Education

    ERIC Educational Resources Information Center

    Hsieh, Ming-Yuan

    2016-01-01

    Because the development of Taiwanese Massive Open Online Course (MOOC) websites is at this moment full of vitality, this research employs a series of analytical cross-measurements combining the Quality Function Deployment-House of Quality (QFD-HOQ) model and Multiple Criteria Decision Making (MCDM) methodology to cross-evaluate the weighted…

  17. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.

    PubMed

    Barhen, J; Toomarian, N; Protopopescu, V

    1987-12-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.

  18. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot

    NASA Technical Reports Server (NTRS)

    Barhen, Jacob; Toomarian, N.; Protopopescu, V.

    1987-01-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
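    The abstract describes fast simulated annealing for mapping tasks onto hypercube processors. A minimal sketch of the general idea follows: minimize the maximum processor load (a simple proxy for processor idling time) with a Metropolis acceptance rule. The task costs, cooling schedule, and single-task move set are illustrative assumptions, not the paper's algorithm.

```python
import math
import random

def anneal_schedule(costs, n_procs, steps=20000, t0=5.0, seed=1):
    """Simulated annealing for task-to-processor assignment.
    Energy = makespan (maximum processor load); a move reassigns one task."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_procs) for _ in costs]

    def makespan(a):
        loads = [0.0] * n_procs
        for task, proc in enumerate(a):
            loads[proc] += costs[task]
        return max(loads)

    energy = makespan(assign)
    best, best_energy = list(assign), energy
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-9  # linear cooling schedule
        task = rng.randrange(len(costs))
        old_proc = assign[task]
        assign[task] = rng.randrange(n_procs)
        new_energy = makespan(assign)
        # Metropolis rule: always accept improvements; accept worse
        # moves with probability exp(-dE / T) so the search can escape
        # local minima while the temperature is high.
        if new_energy <= energy or rng.random() < math.exp(-(new_energy - energy) / temp):
            energy = new_energy
            if energy < best_energy:
                best, best_energy = list(assign), energy
        else:
            assign[task] = old_proc  # reject: undo the move
    return best, best_energy
```

A production version for the robot problem would extend the energy with the precedence-constraint and task-overlap penalty terms the abstract mentions, rather than makespan alone.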

  19. Benchmark Calibration Tests Completed for Stirling Convertor Heater Head Life Assessment

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Halford, Gary R.; Bowman, Randy R.

    2005-01-01

    A major phase of benchmark testing has been completed at the NASA Glenn Research Center (http://www.nasa.gov/glenn/), where a critical component of the Stirling Radioisotope Generator (SRG) is undergoing extensive experimentation to aid the development of an analytical life-prediction methodology. Two special-purpose test rigs subjected SRG heater-head pressure-vessel test articles to accelerated creep conditions, using the standard design temperatures to stay within the wall material's operating creep-response regime, but increasing wall stresses up to 7 times over the design point. This resulted in well-controlled "ballooning" of the heater-head hot end. The test plan was developed to provide critical input to analytical parameters in a reasonable period of time.

  20. Spectral radiation analyses of the GOES solar illuminated hexagonal cell scan mirror back

    NASA Technical Reports Server (NTRS)

    Fantano, Louis G.

    1993-01-01

    A ray tracing analytical tool has been developed for the simulation of spectral radiation exchange in complex systems. Algorithms are used to account for heat source spectral energy, surface directional radiation properties, and surface spectral absorptivity properties. This tool has been used to calculate the effective solar absorptivity of the Geostationary Operational Environmental Satellite (GOES) scan mirror in the calibration position. The development and design of the Sounder and Imager instruments on board GOES are reviewed, and the problem of calculating the effective solar absorptivity associated with the GOES hexagonal cell configuration is presented. The analytical methodology, based on the Monte Carlo ray tracing technique, is described, and results are presented and verified by experimental measurements for selected solar incidence angles.
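    The Monte Carlo idea behind such a tool can be stripped to its essentials: launch many rays and count how many are absorbed over repeated reflections. The sketch below ignores the actual hexagonal-cell geometry and spectral dependence (which the real tool traces) and assumes every ray undergoes a fixed number of bounces; the absorptivity and bounce count are invented for illustration.

```python
import random

def effective_absorptivity_mc(alpha, n_bounces, n_rays=100000, seed=0):
    """Monte Carlo estimate of the effective absorptivity of a cavity in
    which every ray undergoes n_bounces reflections before escaping.
    Each bounce absorbs the ray with probability alpha; multiple bounces
    inside a cell raise the effective absorptivity above alpha."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_rays):
        for _ in range(n_bounces):
            if rng.random() < alpha:
                absorbed += 1
                break  # this ray's energy is absorbed; stop tracing it
    return absorbed / n_rays
```

For this fixed-bounce toy geometry the estimate converges to the analytic value 1 - (1 - alpha)^n, which is a useful sanity check before introducing real geometry.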

  1. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, there were only some sketchy steps focusing on policy assessment in the system of Taiwan. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on the theoretical considerations by systems thinking and the regulative requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on environmental characteristics and management regulations of Taiwan. 
The SEA procedure in the planning phase for this policy was completed but the implementation phase of this policy was not begun because the related legislation procedure could not be arranged due to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001 and then 27 future courses could be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can be used to effectively and efficiently assist the related authorities for SEA.
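    Of the techniques combined in the assessment framework, the analytic hierarchy process step is the most mechanical: criterion weights are the principal eigenvector of a pairwise comparison matrix. A minimal power-iteration sketch follows; the matrix below is an invented, perfectly consistent example, not the study's actual expert judgments.

```python
def ahp_weights(M, iters=100):
    """Priority weights for an AHP pairwise comparison matrix M, where
    M[i][j] estimates how much more important criterion i is than
    criterion j.  Power iteration converges to the principal
    eigenvector, normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]
    return w
```

In practice the Delphi rounds supply the pairwise judgments, and a consistency ratio check on the matrix precedes any use of the resulting weights.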

  2. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  3. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  4. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  5. Analytical Chemistry Division annual progress report for period ending November 30, 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    1978-03-01

    Activities for the year are summarized in sections on analytical methodology, mass and mass emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)

  6. Non-traditional applications of laser desorption/ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    McAlpin, Casey R.

    Seven studies were carried out using laser desorption/ionization mass spectrometry (LDI MS) to develop enhanced methodologies for a variety of analyte systems by investigating analyte chemistries, ionization processes, and elimination of spectral interferences. Applications of LDI and matrix-assisted laser desorption/ionization (MALDI) have previously been limited by poorly understood ionization phenomena and spectral interferences from matrices. MALDI MS is well suited to the analysis of proteins. However, the proteins associated with bacteriophages often form complexes that are too massive for detection with a standard MALDI mass spectrometer. As such, methodologies for pretreatment of these samples are discussed in detail in the first chapter. Pretreatment of bacteriophage samples with reducing agents disrupted disulfide linkages and allowed enhanced detection of bacteriophage proteins. The second chapter focuses on the use of MALDI MS for lipid compounds whose molecular mass is significantly less than that of the proteins for which MALDI is most often applied. The use of MALDI MS for lipid analysis presented unique challenges such as matrix interference and differential ionization efficiencies. It was observed that optimization of the matrix system and addition of cationization reagents mitigated these challenges and resulted in an enhanced methodology for MALDI MS of lipids. One of the challenges commonly encountered in efforts to expand MALDI MS applications is, as mentioned previously, interference introduced by organic matrix molecules. The third chapter focuses on the development of a novel inorganic matrix replacement system called metal oxide laser ionization mass spectrometry (MOLI MS). In contrast to other matrix replacements, considerable effort was devoted to elucidating the ionization mechanism.
It was shown that chemisorption of analytes to the metal oxide surface produced acidic adsorbed species which then protonated free analyte molecules. Expanded applications of MOLI MS were developed following description of the ionization mechanism. A series of experiments were carried out involving treatment of metal oxide surfaces with reagent molecules to expand MOLI MS and develop enhanced MOLI MS methodologies. It was found that treatment of the metal oxide surface with a small molecule acting as a proton source extended MOLI MS to analytes which did not form acidic adsorbed species. Proton-source-pretreated MOLI MS was then used for the analysis of oils obtained from the fast, anoxic pyrolysis of biomass (py-oil). These samples are complex and produce MOLI mass spectra with many peaks. In this experiment, methods of data reduction, including Kendrick mass defects and nominal mass z*-scores, which are commonly used for the study of petroleum fractions, were used to interpret these spectra and identify the major constituents of py-oils. Through data reduction and collision-induced dissociation (CID), homologous series of compounds were rapidly identified. The final chapter involves using metal oxides to catalytically cleave the ester linkage of fatty-acid-containing lipids in addition to providing ionization. The cleavage process results in spectra similar to those observed with saponification/methylation. Fatty acid profiles were generated for a variety of micro-organisms to differentiate between bacterial species. (Abstract shortened by UMI.)

  7. 76 FR 55804 - Dicamba; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... Considerations A. Analytical Enforcement Methodology Adequate enforcement methodologies, Methods I and II--gas chromatography with electron capture detection (GC/ECD), are available to enforce the tolerance expression. The...

  8. FASP, an analytic resource appraisal program for petroleum play analysis

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1986-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. ?? 1986.
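    The conditional-probability bookkeeping described above can be sketched with the laws of total expectation and variance: model each prospect's resource as a Bernoulli presence indicator times a size distribution. This is an illustrative simplification, not the FASP implementation, and the prospect numbers are invented:

```python
def risked_moments(p, mean_size, var_size):
    """Moments of X = B * S with B ~ Bernoulli(p) independent of S.

    E[X]   = p * E[S]
    Var[X] = p * Var[S] + p * (1 - p) * E[S]**2   (law of total variance)
    """
    mean_x = p * mean_size
    var_x = p * var_size + p * (1 - p) * mean_size ** 2
    return mean_x, var_x

# Hypothetical prospects: (chance of presence, mean size, variance of size).
prospects = [(0.3, 50.0, 400.0), (0.6, 20.0, 100.0)]

# Independent prospects aggregate by summing means and variances.
play_mean = sum(risked_moments(*t)[0] for t in prospects)
play_var = sum(risked_moments(*t)[1] for t in prospects)
```

    Note how the risk term p(1 - p)E[S]^2 inflates the variance even when the size distribution itself is certain, which is why risked estimates are wider than unrisked ones.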

  9. Reducing Conservatism of Analytic Transient Response Bounds via Shaping Filters

    NASA Technical Reports Server (NTRS)

    Kwan, Aiyueh; Bedrossian, Nazareth; Jan, Jiann-Woei; Grigoriadis, Karolos; Hua, Tuyen (Technical Monitor)

    1999-01-01

    Recent results show that the peak transient response of a linear system to bounded-energy inputs can be computed using the energy-to-peak gain of the system. However, the analytically computed peak response bound can be conservative for a class of bounded-energy signals, specifically pulse trains generated from jet firings encountered in space vehicles. In this paper, shaping filters are proposed as a methodology to reduce the conservatism of analytic peak response bounds. This methodology was applied to a realistic Space Station assembly operation subject to jet firings. The results indicate that shaping filters indeed reduce the predicted peak response bounds.
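    As a hedged sketch of the underlying bound (not the paper's shaping-filter construction): for a stable first-order system xdot = -a*x + b*u, y = c*x, the energy-to-peak gain is |c|*b/sqrt(2a), i.e. sup_t |y(t)| <= |c|*b/sqrt(2a) * ||u||_2. A forward-Euler simulation of a unit-energy pulse stays below this bound; all numbers are illustrative:

```python
import math

# Stable first-order system: xdot = -a*x + b*u, y = c*x (illustrative numbers).
a, b, c = 2.0, 1.0, 3.0

# Energy-to-peak gain sqrt(c * P * c) with controllability Gramian P = b**2 / (2a).
bound = abs(c) * b / math.sqrt(2.0 * a)

# Unit-energy rectangular pulse: u = h on [0, T] with h**2 * T = 1.
dt, T = 1e-4, 0.5
h = 1.0 / math.sqrt(T)
x, peak = 0.0, 0.0
for k in range(int(5.0 / dt)):          # run well past the end of the pulse
    u = h if k * dt < T else 0.0
    x += dt * (-a * x + b * u)          # forward-Euler integration step
    peak = max(peak, abs(c * x))
```

    The gap between `peak` and `bound` for this pulse is exactly the conservatism the abstract refers to; shaping filters tighten the bound by restricting the admissible input class.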

  10. Building analytic capacity, facilitating partnerships, and promoting data use in state health agencies: a distance-based workforce development initiative applied to maternal and child health epidemiology.

    PubMed

    Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D

    2012-12-01

    The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships, and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.

  11. A methodology to select a wire insulation for use in habitable spacecraft.

    PubMed

    Paulos, T; Apostolakis, G

    1998-08-01

    This paper investigates electrical overheating events aboard a habitable spacecraft. The wire insulation involved in these failures plays a major role in the entire event scenario, from threat development to detection and damage assessment. Ideally, if models of wire overheating events in microgravity existed, the various wire insulations under consideration could be quantitatively compared. However, these models do not exist. In this paper, a methodology is developed that can be used to select a wire insulation that is best suited for use in a habitable spacecraft. The results of this study show that, based upon the Analytic Hierarchy Process, the simplifying assumptions, the criteria selected, and the data used in the analysis, Tefzel is better than Teflon for use in a habitable spacecraft.

  12. Towards Context-Aware and User-Centered Analysis in Assistive Environments: A Methodology and a Software Tool.

    PubMed

    Fontecha, Jesús; Hervás, Ramón; Mondéjar, Tania; González, Iván; Bravo, José

    2015-10-01

    One of the main challenges in Ambient Assisted Living (AAL) is to reach an appropriate level of acceptance of assistive systems, as well as to analyze and monitor end-user tasks in a feasible and efficient way. The development and evaluation of AAL solutions from a user-centered perspective help to achieve these goals. In this work, we have designed a methodology to integrate and develop analytic, user-centered tools within assistive systems. An analysis software tool gathers information about end users from adapted psychological questionnaires and naturalistic observation of their own context. The aim is to enable an in-depth analysis focused on improving the quality of life of elderly people and their caregivers.

  13. MASS SPECTROMETRY FOR RISK MANAGEMENT OF DRINKING WATER TREATMENT; II. DISINFECTION BY-PRODUCTS: HALOACETIC ACIDS

    EPA Science Inventory

    Risk management of drinking water relies on quality analytical data. Analytical methodology can often be adapted from environmental monitoring sources. However, risk management sometimes presents special analytical challenges because data may be needed from a source for which n...

  14. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  15. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  16. Improvements in the analytical methodology for the residue determination of the herbicide glyphosate in soils by liquid chromatography coupled to mass spectrometry.

    PubMed

    Botero-Coy, A M; Ibáñez, M; Sancho, J V; Hernández, F

    2013-05-31

    The determination of glyphosate (GLY) in soils is of great interest due to the widespread use of this herbicide and the need to assess its impact on the soil/water environment. However, its residue determination is very problematic, especially in soils with high organic matter content, where strong interferences are normally observed, and because of the particular physico-chemical characteristics of this polar/ionic herbicide. In the present work, we have improved previously reported LC-MS/MS analytical methodology for GLY and its main metabolite AMPA in order to apply it to "difficult" soils, like those commonly found in South America, where this herbicide is extensively used in large areas devoted to soya or maize, among other crops. The method is based on derivatization with FMOC followed by LC-MS/MS analysis using a triple quadrupole. After extraction with potassium hydroxide, a combination of extract dilution, adjustment to an appropriate pH, and solid-phase extraction (SPE) clean-up was applied to minimize the strong interferences observed. Despite the clean-up performed, the use of isotope-labelled glyphosate as internal standard (ILIS) was necessary to correct for matrix effects and to compensate for any error occurring during sample processing. The analytical methodology was satisfactorily validated in four soils from Colombia and Argentina fortified at 0.5 and 5 mg/kg. In contrast to most LC-MS/MS methods, where the acquisition of two transitions is recommended, monitoring all available transitions was required for confirmation of positive samples, as some of them suffered interferences from unknown soil components. This was observed not only for GLY and AMPA but also for the ILIS. Analysis by QTOF MS was useful to confirm the presence of interfering compounds that shared the same nominal mass as the analytes, as well as some of their main product ions. Therefore, the selection of specific transitions was crucial to avoid interferences. The methodology developed was applied to the analysis of 26 soils from different areas of Colombia and Argentina, and the method robustness was demonstrated by the analysis of quality control samples over 4 months. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    PubMed

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Through the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  18. Application of analytic hierarchy process in a waste treatment technology assessment in Mexico.

    PubMed

    Taboada-González, Paul; Aguilar-Virgen, Quetzalli; Ojeda-Benítez, Sara; Cruz-Sotelo, Samantha

    2014-09-01

    The high per capita generation of solid waste and the environmental problems in major rural communities of Ensenada, Baja California, have prompted authorities to seek alternatives for waste treatment. In the absence of a selection methodology, three waste treatment technologies with energy recovery (an anaerobic digester, a downdraft gasifier, and a plasma gasifier) were evaluated, taking the broader social, political, economic, and environmental issues into consideration. Using the scientific literature as a baseline, interviews with experts, decision makers and the community, and waste stream studies were used to construct a hierarchy that was evaluated by the analytic hierarchy process. In terms of the criteria, judgments, and assumptions made in the model, the anaerobic digester was found to have the highest rating and should consequently be selected as the waste treatment technology for this area. The study results showed low sensitivity, so alternative scenarios were not considered. The methodology developed in this study may be useful for other governments wishing to assess and select waste treatment technologies.
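    The analytic hierarchy process referenced above derives priority weights as the principal eigenvector of a pairwise-comparison matrix. A minimal sketch using power iteration; the Saaty-scale judgments below are hypothetical, not those of the study:

```python
def ahp_weights(M, iters=100):
    """Priority weights = principal eigenvector of the pairwise matrix M,
    computed by power iteration and normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical Saaty-scale judgments for three treatment alternatives:
M = [[1.0,     3.0, 5.0],
     [1 / 3.0, 1.0, 2.0],
     [1 / 5.0, 0.5, 1.0]]
w = ahp_weights(M)

# Consistency check: lambda_max >= n, with equality iff perfectly consistent.
n = len(M)
Mw = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Mw[i] / w[i] for i in range(n)) / n
ci = (lam - n) / (n - 1)  # consistency index; CR = ci / 0.58 for n = 3
```

    A consistency ratio below about 0.1 is the usual threshold for accepting the judgments; above it, the pairwise comparisons are revisited.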

  19. On the pursuit of a nuclear development capability: The case of the Cuban nuclear program

    NASA Astrophysics Data System (ADS)

    Benjamin-Alvarado, Jonathan Calvert

    1998-09-01

    While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts to develop a nuclear energy capability by developing states. Rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology, to analyze nuclear energy development and to utilize this framework for the study of the specific case of Cuba's decision to develop nuclear energy. The analytical framework developed focuses on a qualitative tracing of the process of Cuban policy objectives and implementation to develop a nuclear energy capability, and analyzes the policy in response to three models of modernization offered to explain the trajectory of policy development. These different approaches are the politically motivated modernization model, the economic and technological modernization model, and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective. Each model provides expected behaviors to external stimuli that would result in specific policy responses. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically motivated or economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, perhaps the understanding of energy development schemes may be expanded through future research.

  20. Using soft systems methodology to develop a simulation of out-patient services.

    PubMed

    Lehaney, B; Paul, R J

    1994-10-01

    Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.

  1. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims to provide recommendations concerning the validation of analytical protocols by using routine samples. It is intended as a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work has performed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also estimates the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach. However, for the sediment matrices, the estimation of proportional/constant bias is also included due to their inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. The analysis of routine samples is rarely used to assess the trueness of novel analytical methods, and up to now this approach had not been applied to organochlorine compounds in environmental matrices.
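    Combined uncertainties like those quoted above are conventionally obtained by root-sum-of-squares combination of independent relative uncertainty components (the GUM approach). A minimal sketch; the component values are invented for illustration:

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative uncertainties."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical contributions: run-to-run precision, recovery bias, calibration.
u_c = combined_relative_uncertainty([0.15, 0.10, 0.08])
U = 2.0 * u_c  # expanded uncertainty, coverage factor k = 2
```

    Because the components add in quadrature, the largest contribution dominates: halving the 0.08 term barely changes u_c, while halving the 0.15 term changes it substantially.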

  2. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP: (1) a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; (2) initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; (3) leveraging the Western States Water Council Water Data Exchange database; (4) development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. TOOLS: Education: information on water issues and risks at the local, state, national, and global scale. Visualizations: data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics: accessing publicly available water databases and using machine learning to develop water availability forecasting tools and time-lapse images to support city/urban planning.

  3. 2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report

    DTIC Science & Technology

    2017-03-01

    2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report. Office of People Analytics (OPA), Defense Research, Surveys, and Statistics Center, 4800 Mark Center Drive...

  4. Modeling and control of flexible space platforms with articulated payloads

    NASA Technical Reports Server (NTRS)

    Graves, Philip C.; Joshi, Suresh M.

    1989-01-01

    The first steps in developing a methodology for spacecraft control-structure interaction (CSI) optimization are identification and classification of anticipated missions, and the development of tractable mathematical models in each mission class. A mathematical model of a generic large flexible space platform (LFSP) with multiple independently pointed rigid payloads is considered. The objective is not to develop a general purpose numerical simulation, but rather to develop an analytically tractable mathematical model of such composite systems. The equations of motion for a single payload case are derived, and are linearized about zero steady-state. The resulting model is then extended to include multiple rigid payloads, yielding the desired analytical form. The mathematical models developed clearly show the internal inertial/elastic couplings, and are therefore suitable for analytical and numerical studies. A simple decentralized control law is proposed for fine pointing the payloads and LFSP attitude control, and simulation results are presented for an example problem. The decentralized controller is shown to be adequate for the example problem chosen, but does not, in general, guarantee stability. A centralized dissipative controller is then proposed, requiring a symmetric form of the composite system equations. Such a controller guarantees robust closed loop stability despite unmodeled elastic dynamics and parameter uncertainties.

  5. Development and application of stir bar sorptive extraction with polyurethane foams for the determination of testosterone and methenolone in urine matrices.

    PubMed

    Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F

    2011-04-01

    This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)], followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linear dynamic ranges (correlation coefficients >0.9963) and convenient detection limits (0.2-0.3 μg/L). When the efficiency of SBSE(PU) is compared with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.

  6. "Let's Set Up Some Subgoals": Understanding Human-Pedagogical Agent Collaborations and Their Implications for Learning and Prompt and Feedback Compliance

    ERIC Educational Resources Information Center

    Harley, Jason M.; Taub, Michelle; Azevedo, Roger; Bouchet, Francois

    2018-01-01

    Research on collaborative learning between humans and virtual pedagogical agents represents a necessary extension to recent research on the conceptual, theoretical, methodological, analytical, and educational issues behind co- and socially-shared regulated learning between humans. This study presents a novel coding framework that was developed and…

  7. The Recovery Care and Treatment Center: A Database Design and Development Case

    ERIC Educational Resources Information Center

    Harris, Ranida B.; Vaught, Kara L.

    2008-01-01

    The advantages of active learning methodologies have been suggested and empirically shown by a number of IS educators. Case studies are one such teaching technique that offers students the ability to think analytically, apply material learned, and solve a real-world problem. This paper presents a case study designed to be used in a database design…

  8. Headspace-Solid-Phase Microextraction-Gas Chromatography as Analytical Methodology for the Determination of Volatiles in Wild Mushrooms and Evaluation of Modifications Occurring during Storage

    PubMed Central

    Costa, Rosaria; De Grazia, Selenia; Grasso, Elisa; Trozzi, Alessandra

    2015-01-01

    Mushrooms are sources of food, medicines, and agricultural means. Not much is reported in the literature about wild species of the Mediterranean flora, although many of them are traditionally collected for human consumption. The knowledge of their chemical constituents could represent a valid tool for both taxonomic and physiological characterizations. In this work, a headspace-solid-phase microextraction (HS-SPME) method coupled with GC-MS and GC-FID was developed to evaluate the volatile profiles of ten wild mushroom species collected in South Italy. In addition, in order to evaluate the potential of this analytical methodology for true quantitation of volatiles, samples of the cultivated species Agaricus bisporus were analyzed. The choice of this mushroom was dictated by its ease of availability in the food market, due to the consistent amounts required for SPME method development. For calibration of the main volatile compounds, the standard addition method was chosen. Finally, the assessed volatile composition of A. bisporus was monitored in order to evaluate compositional changes occurring during storage, which represents a relevant issue for such a wide consumption edible product. PMID:25945282
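    The standard addition method chosen in this abstract spikes the sample with known analyte increments, fits a line through (added concentration, signal), and reads the unknown concentration from the x-intercept. A minimal least-squares sketch with invented numbers:

```python
def standard_addition(added, signal):
    """Fit signal = slope * added + intercept by least squares; the unknown
    concentration is the magnitude of the x-intercept, intercept / slope."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signal) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(added, signal))
    sxx = sum((x - mx) ** 2 for x in added)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept / slope

# Hypothetical spike levels (e.g. mg/L added) and detector responses:
c_unknown = standard_addition([0.0, 1.0, 2.0, 3.0], [10.0, 15.0, 20.0, 25.0])
```

    Standard addition is attractive for matrix-heavy samples like mushroom headspace precisely because the calibration is performed inside the sample matrix itself.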

  9. Contribution of economic evaluation to decision making in early phases of product development: a methodological and empirical review.

    PubMed

    Hartz, Susanne; John, Jürgen

    2008-01-01

    Economic evaluation as an integral part of health technology assessment is today mostly applied to established technologies. Evaluating healthcare innovations in their early stages of development has recently attracted attention. Although it offers several benefits, it also holds methodological challenges. The aim of our study was to investigate the possible contributions of economic evaluation to industry's decision making early in product development and to confront the results with the actual use of early data in economic assessments. We conducted a literature search to identify methodological contributions as well as economic evaluations that used data from early phases of product development. Economic analysis can be beneficially used in early phases of product development for various purposes, including early market assessment, R&D portfolio management, and first estimations of pricing and reimbursement scenarios. Analytical tools available for these purposes have been identified. Numerous empirical works were identified, but most do not disclose any concrete decision context and could not be directly matched with the suggested applications. Industry can benefit from starting economic evaluation early in product development in several ways. Empirical evidence suggests that there is still potential left unused.

  10. Determination of residual acetone and acetone related impurities in drug product intermediates prepared as Spray Dried Dispersions (SDD) using gas chromatography with headspace autosampling (GCHS).

    PubMed

    Quirk, Emma; Doggett, Adrian; Bretnall, Alison

    2014-08-05

    Spray Dried Dispersions (SDD) are uniform mixtures of a specific ratio of amorphous active pharmaceutical ingredient (API) and polymer prepared via a spray drying process. Volatile solvents are employed during spray drying to facilitate the formation of the SDD material. Following manufacture, analytical methodology is required to determine residual levels of the spray drying solvent and its associated impurities. Due to the high level of polymer in the SDD samples, direct liquid injection with Gas Chromatography (GC) is not a viable option for analysis. This work describes the development and validation of an analytical approach to determine residual levels of acetone and acetone related impurities, mesityl oxide (MO) and diacetone alcohol (DAA), in drug product intermediates prepared as SDDs using GC with headspace (HS) autosampling. The method development for these analytes presented a number of analytical challenges which had to be overcome before the levels of the volatiles of interest could be accurately quantified. GCHS could be used after two critical factors were implemented: (1) calculation and application of conversion factors to 'correct' for the reactions occurring between acetone, MO and DAA during generation of the headspace volume for analysis, and (2) the addition of an equivalent amount of polymer into all reference solutions used for quantitation to ensure comparability between the headspace volumes generated for both samples and external standards. This work describes the method development and optimisation of the standard preparation, the headspace autosampler operating parameters and the chromatographic conditions, together with a summary of the validation of the methodology. The approach has been demonstrated to be robust and suitable to accurately determine levels of acetone, MO and DAA in SDD materials over the linear concentration range 0.008-0.4 μL/mL, with minimum quantitation limits of 20 ppm for acetone and MO, and 80 ppm for DAA. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. The need for a usable assessment tool to analyse the efficacy of emergency care systems in developing countries: proposal to use the TEWS methodology.

    PubMed

    Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A

    2012-11-01

    Ninety percent of emergency incidents occur in developing countries, and this proportion is only expected to grow as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. This paper analyses current quantitative methods for assessing emergency care in developing countries and proposes a more appropriate method. Currently accepted methods for quantitatively assessing the efficacy of emergency care systems cannot be applied in most developing countries due to weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is rapidly growing, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool that can be handled by low-resource, developing countries. By relying on the most basic universal parameters, the simplest calculations and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.

  12. High-throughput fabrication and screening improves gold nanoparticle chemiresistor sensor performance.

    PubMed

    Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard

    2015-02-09

    Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram per liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which in combination with random forests analysis was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8-17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications to future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.

  13. Evaluation of performance of three different hybrid mesoporous solids based on silica for preconcentration purposes in analytical chemistry: From the study of sorption features to the determination of elements of group IB.

    PubMed

    Kim, Manuela Leticia; Tudino, Mabel Beatríz

    2010-08-15

    Several studies involving the physicochemical interaction of three silica based hybrid mesoporous materials with metal ions of the group IB have been performed in order to employ them for preconcentration purposes in the determination of traces of Cu(II), Ag(I) and Au(III). The three solids were obtained from mesoporous silica functionalized with 3-aminopropyl (APS), 3-mercaptopropyl (MPS) and N-[2-aminoethyl]-3-aminopropyl (NN) groups, respectively. Adsorption capacities for Au, Cu and Ag were calculated using Langmuir's isotherm model and then, the optimal values for the retention of each element onto each one of the solids were found. Physicochemical data obtained under thermodynamic equilibrium and under kinetic conditions - imposed by flow through experiments - allowed the design of simple analytical methodologies where the solids were employed as fillings of microcolumns held in continuous systems coupled on-line to atomic absorption spectrometry. In order to control the interaction between the filling and the analyte at short times (flow through conditions) and thus, its effect on the analytical signal and the presence of interferences, the initial adsorption velocities were calculated using the pseudo-second-order model. All these experiments allowed the comparison of the solids in terms of their analytical behaviour at the moment of facing the determination of the three elements. Under optimized conditions mainly given by the features of the filling, the analytical methodologies developed in this work showed excellent performances with limits of detection of 0.14, 0.02 and 0.025 μg L(-1) and RSD% values of 3.4, 2.7 and 3.1 for Au, Cu and Ag, respectively. A full discussion of the main findings on the interaction metal ions/fillings will be provided. The analytical results for the determination of the three metals will also be presented. Copyright 2010 Elsevier B.V. All rights reserved.
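The Langmuir adsorption-capacity calculation mentioned in this record can be sketched with the standard linearised form of the isotherm, C/q = C/q_max + 1/(K·q_max). The data below are synthetic, generated from assumed parameters purely to illustrate the fit; they are not the paper's measurements.

```python
def langmuir_fit(C, q):
    """Least-squares line through (C, C/q); returns (q_max, K)."""
    y = [c / qi for c, qi in zip(C, q)]
    n = len(C)
    mx, my = sum(C) / n, sum(y) / n
    # slope = 1/q_max, intercept = 1/(K*q_max)
    slope = (sum((c - mx) * (yi - my) for c, yi in zip(C, y))
             / sum((c - mx) ** 2 for c in C))
    intercept = my - slope * mx
    return 1.0 / slope, slope / intercept

# Synthetic equilibrium data generated from q_max = 2.0, K = 0.5:
C = [1.0, 2.0, 4.0, 8.0]
q = [2.0 * 0.5 * c / (1.0 + 0.5 * c) for c in C]
q_max, K = langmuir_fit(C, q)
```

Because the synthetic data satisfy the isotherm exactly, the fit recovers q_max = 2.0 and K = 0.5.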

  14. Development of an improved method of consolidating fatigue life data

    NASA Technical Reports Server (NTRS)

    Leis, B. N.; Sampath, S. G.

    1978-01-01

    A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.

  15. Transient and steady state viscoelastic rolling contact

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Paramadilok, O.

    1985-01-01

    Based on moving total Lagrangian coordinates, a so-called traveling Hughes type contact strategy is developed. Employing the modified contact scheme in conjunction with a traveling finite element strategy, an overall solution methodology is developed to handle transient and steady viscoelastic rolling contact. To verify the scheme, the results of both experimental and analytical benchmarking are presented. The experimental benchmarking includes the handling of rolling tires up to their upper bound behavior, namely the standing wave response.

  16. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing the network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
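The network idea behind this record can be sketched on a toy graph. The facility layout, node names, and detection probabilities below are entirely hypothetical (they are ours, not the authors' tool): edges carry the probability that the adversary is detected while crossing them, and the most critical path is the one with the lowest cumulative detection probability.

```python
EDGES = {  # (from, to): probability of detection on that segment
    ("outside", "fence"): 0.5,
    ("fence", "door"): 0.3,
    ("fence", "window"): 0.1,
    ("door", "vault"): 0.8,
    ("window", "vault"): 0.6,
}

def paths(start, goal, prefix=()):
    """Enumerate all simple paths from start to goal."""
    prefix = prefix + (start,)
    if start == goal:
        yield prefix
        return
    for a, b in EDGES:
        if a == start and b not in prefix:
            yield from paths(b, goal, prefix)

def detection_probability(path):
    """Probability the adversary is detected at least once along the path."""
    p_miss = 1.0
    for edge in zip(path, path[1:]):
        p_miss *= 1.0 - EDGES[edge]
    return 1.0 - p_miss

critical = min(paths("outside", "vault"), key=detection_probability)
```

Here the window route is the critical path: its cumulative detection probability is 1 - (0.5 × 0.9 × 0.4) = 0.82, lower than the 0.93 of the door route.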

  17. Prioritization of engineering support requests and advanced technology projects using decision support and industrial engineering models

    NASA Technical Reports Server (NTRS)

    Tavana, Madjid

    1995-01-01

    The evaluation and prioritization of Engineering Support Requests (ESR's) is a particularly difficult task at the Kennedy Space Center (KSC) -- Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESR's. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESR's.
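One ingredient named above, the Analytic Hierarchy Process, can be sketched in a few lines: criterion weights are derived from a pairwise comparison matrix via power iteration, which converges to the principal eigenvector that AHP prescribes. The matrix below uses hypothetical judgments over three of the listed criteria, not KSC's actual model.

```python
def ahp_priorities(M, iters=100):
    """Normalized principal eigenvector of a pairwise comparison matrix."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [wi / total for wi in w]  # renormalize each iteration
    return w

# Hypothetical pairwise judgments: Safety vs. O&M Cost Savings vs. Reliability
# on Saaty's 1-9 scale (reciprocals below the diagonal).
M = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]
weights = ahp_priorities(M)
```

For this matrix the weights come out roughly (0.65, 0.23, 0.12), i.e. Safety dominates, consistent with how strongly it is preferred in the pairwise judgments.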

  18. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing the network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  19. Fluorescence Spectroscopy for the Monitoring of Food Processes.

    PubMed

    Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd

    Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage, in combination with different chemometric tools. Chemometrics helps to retrieve useful information from spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, vegetables, etc., for qualitative and quantitative analysis with different chemometric approaches.

  20. Implementation of structural response sensitivity calculations in a large-scale finite-element analysis system

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Rogers, J. L., Jr.

    1982-01-01

    The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.
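The advantage of analytic derivatives over finite differences that this record reports can be illustrated with a toy design-sensitivity problem. The example is ours, not one of the SPAR cases: tip displacement of an axially loaded bar, u(A) = P·L/(E·A), differentiated with respect to the cross-sectional area design variable A.

```python
P, L, E = 1.0e4, 2.0, 70.0e9  # load [N], length [m], Young's modulus [Pa]

def u(A):
    """Tip displacement of an axially loaded bar of area A [m^2]."""
    return P * L / (E * A)

def du_dA_analytic(A):
    return -P * L / (E * A ** 2)  # exact derivative of P*L/(E*A)

def du_dA_finite_difference(A, h=1e-6):
    return (u(A + h) - u(A - h)) / (2.0 * h)  # central difference
```

The analytic value is exact at the cost of one expression, while the finite-difference estimate needs two extra analyses and carries truncation error that depends on the step size h.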

  1. An analytic model for footprint dispersions and its application to mission design

    NASA Technical Reports Server (NTRS)

    Rao, J. R. Jagannatha; Chen, Yi-Chao

    1992-01-01

    This is the final report on our recent research activities that are complementary to those conducted by our colleagues, Professor Farrokh Mistree and students, in the context of the Taguchi method. We have studied the mathematical model that forms the basis of the Simulation and Optimization of Rocket Trajectories (SORT) program and developed an analytic method for determining mission reliability with a reduced number of flight simulations. This method can be incorporated in a design algorithm to mathematically optimize different performance measures of a mission, thus leading to a robust and easy-to-use methodology for mission planning and design.

  2. Functionalized xenon as a biosensor

    PubMed Central

    Spence, Megan M.; Rubin, Seth M.; Dimitrov, Ivan E.; Ruiz, E. Janette; Wemmer, David E.; Pines, Alexander; Yao, Shao Qin; Tian, Feng; Schultz, Peter G.

    2001-01-01

    The detection of biological molecules and their interactions is a significant component of modern biomedical research. In current biosensor technologies, simultaneous detection is limited to a small number of analytes by the spectral overlap of their signals. We have developed an NMR-based xenon biosensor that capitalizes on the enhanced signal-to-noise, spectral simplicity, and chemical-shift sensitivity of laser-polarized xenon to detect specific biomolecules at the level of tens of nanomoles. We present results using xenon “functionalized” by a biotin-modified supramolecular cage to detect biotin–avidin binding. This biosensor methodology can be extended to a multiplexing assay for multiple analytes. PMID:11535830

  3. (U) Analytic First and Second Derivatives of the Uncollided Leakage for a Homogeneous Sphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

    2017-04-26

    The second-order adjoint sensitivity analysis methodology (2nd-ASAM), developed by Cacuci, has been applied to derive second derivatives of a response with respect to input parameters for uncollided particles in an inhomogeneous transport problem. In this memo, we present an analytic benchmark for verifying the derivatives of the 2nd-ASAM. The problem is a homogeneous sphere, and the response is the uncollided total leakage. This memo does not repeat the formulas given in Ref. 2. We are preparing a journal article that will include the derivation of Ref. 2 and the benchmark of this memo.

  4. Design and development of molecularly imprinted polymers for the selective extraction of deltamethrin in olive oil: An integrated computational-assisted approach.

    PubMed

    Martins, Nuno; Carreiro, Elisabete P; Locati, Abel; Ramalho, João P Prates; Cabrita, Maria João; Burke, Anthony J; Garcia, Raquel

    2015-08-28

    This work firstly addresses the design and development of molecularly imprinted systems selective for deltamethrin aiming to provide a suitable sorbent for solid phase (SPE) extraction that will be further used for the implementation of an analytical methodology for the trace analysis of the target pesticide in spiked olive oil samples. To achieve this goal, a preliminary evaluation of the molecular recognition and selectivity of the molecularly imprinted polymers has been performed. In order to investigate the complexity of the mechanistic basis for template selective recognition in these polymeric matrices, the use of a quantum chemical approach has been attempted providing new insights about the mechanisms underlying template recognition, and in particular the crucial role of the crosslinker agent and the solvent used. Thus, DFT calculations corroborate the results obtained by experimental molecular recognition assays enabling one to select the most suitable imprinting system for MISPE extraction technique which encompasses acrylamide as functional monomer and ethylene glycol dimethacrylate as crosslinker. Furthermore, an analytical methodology comprising a sample preparation step based on solid phase extraction has been implemented using this "tailor made" imprinting system as sorbent, for the selective isolation/pre-concentration of deltamethrin from olive oil samples. Molecularly imprinted solid phase extraction (MISPE) methodology was successfully applied for the clean-up of spiked olive oil samples, with recovery rates up to 94%. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Reactor safeguards system assessment and design. Volume I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varnado, G.B.; Ericson, D.M. Jr.; Daniel, S.L.

    1978-06-01

    This report describes the development and application of a methodology for evaluating the effectiveness of nuclear power reactor safeguards systems. Analytic techniques are used to identify the sabotage acts which could lead to release of radioactive material from a nuclear power plant, to determine the areas of a plant which must be protected to assure that significant release does not occur, to model the physical plant layout, and to evaluate the effectiveness of various safeguards systems. The methodology was used to identify those aspects of reactor safeguards systems which have the greatest effect on overall system performance and which, therefore, should be emphasized in the licensing process. With further refinements, the methodology can be used by the licensing reviewer to aid in assessing proposed or existing safeguards systems.

  6. Evaluation of Visual Analytics Environments: The Road to the Visual Analytics Science and Technology Challenge Evaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.

    The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation not only involves assessing the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user's infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005] it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult as now one aspect of the test methodology is access to representative end-users to participate in the evaluation.
In many cases the sensitive nature of data and tasks and difficult access to busy analysts put even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to go to work on the type of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed. Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology which would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.

  7. Analytical Methods of Decoupling the Automotive Engine Torque Roll Axis

    NASA Astrophysics Data System (ADS)

    JEONG, TAESEOK; SINGH, RAJENDRA

    2000-06-01

    This paper analytically examines the multi-dimensional mounting schemes of an automotive engine-gearbox system when excited by oscillating torques. In particular, the issue of torque roll axis decoupling is analyzed in significant detail since it is poorly understood. New dynamic decoupling axioms are presented and compared with the conventional elastic axis mounting and focalization methods. A linear time-invariant system assumption is made in addition to a proportionally damped system. Only rigid-body modes of the powertrain are considered and the chassis elements are assumed to be rigid. Several simplified physical systems are considered and new closed-form solutions for symmetric and asymmetric engine-mounting systems are developed. These clearly explain the design concepts for the 4-point mounting scheme. Our analytical solutions match the existing design formulations that are only applicable to symmetric geometries. Spectra for all six rigid-body motions are predicted using the alternate decoupling methods and the closed-form solutions are verified. Also, our method is validated by comparing modal solutions with prior experimental and analytical studies. Parametric design studies are carried out to illustrate the methodology. Chief contributions of this research include the development of new or refined analytical models and closed-form solutions along with improved design strategies for torque roll axis decoupling.

  8. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. 
Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
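The coupling of a response surface with Monte Carlo simulation described in this record can be sketched schematically. Everything numeric below is invented for illustration (coefficients, the 5% uncertainty level, and the operating point are not taken from the thesis): a quadratic-style response surface for regression rate has its coefficients perturbed on every Monte Carlo draw, and the dispersion of the output is summarized.

```python
import random
import statistics

random.seed(1)  # reproducible draws

NOMINAL = {"b0": 1.0, "b1": 0.8, "b2": 0.5, "b12": -0.3}  # hypothetical
REL_SIGMA = 0.05  # assumed 5% relative (1-sigma) coefficient uncertainty

def regression_rate(x1, x2, c):
    # Response surface with single factors and one two-factor interaction.
    return c["b0"] + c["b1"] * x1 + c["b2"] * x2 + c["b12"] * x1 * x2

def monte_carlo(x1, x2, n=10_000):
    """Mean and spread of the rate with coefficients drawn from Gaussians."""
    draws = []
    for _ in range(n):
        c = {k: random.gauss(v, abs(v) * REL_SIGMA) for k, v in NOMINAL.items()}
        draws.append(regression_rate(x1, x2, c))
    return statistics.mean(draws), statistics.stdev(draws)

mean, sigma = monte_carlo(0.6, 0.4)
```

The Monte Carlo mean tracks the nominal surface value at the chosen operating point (1.608 for these toy coefficients), while the standard deviation quantifies the dispersion that the uncertainty bounds in the record's Expanded-Durov visualizations would capture.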

  9. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and highlight opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.
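One building block this record lists, distance estimation between geocoded points, can be sketched with the haversine great-circle formula; the coordinates in the test are illustrative, and the mean Earth radius of 6371 km is the usual approximation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

One degree of longitude at the equator comes out to about 111.2 km, the familiar rule of thumb used in quick epidemiologic distance checks.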

  10. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes.

    PubMed

    Del Piero, Larissa B; Saxbe, Darby E; Margolin, Gayla

    2016-06-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and non-linear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes

    PubMed Central

    Del Piero, Larissa B.; Saxbe, Darby E.; Margolin, Gayla

    2016-01-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and nonlinear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. PMID:27038840

  12. Coupled thermal, electrical, and fluid flow analyses of AMTEC converters, with illustrative application to OSC's cell design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schock, A.; Noravian, H.; Or, C.

    1997-12-31

    This paper presents the background and introduction to the OSC AMTEC (Alkali Metal Thermal-to-Electrical Conversion) studies, which were conducted for the Department of Energy (DOE) and NASA's Jet Propulsion Laboratory (JPL). After describing the basic principle of AMTEC, the paper describes and explains the operation of multi-tube vapor/vapor cells, which have been under development by AMPS (Advanced Modular Power Systems, Inc.) for the Air Force Phillips Laboratory (AFPL) and JPL for possible application to the Europa Orbiter, Pluto Express, and other space missions. It then describes a novel OSC-generated methodology for analyzing the performance of such cells. This methodology consists of an iterative procedure for the coupled solution of the interdependent thermal, electrical, and fluid flow differential and integral equations governing the performance of AMTEC cells and generators, taking proper account of the non-linear axial variations of temperature, pressure, open-circuit voltage, inter-electrode voltages, current density, axial current, sodium mass flow rate, and power density. The paper illustrates that analytical procedure by applying it to OSC's latest cell design and by presenting detailed analytical results for that design. The OSC-developed analytic methodology constitutes a unique and powerful tool for accurate parametric analyses and design optimizations of the multi-tube AMTEC cells and of radioisotope power systems. This is illustrated in two companion papers in these proceedings. The first of those papers applies the OSC-derived program to determine the effect of various design parameters on the performance of single AMTEC cells with adiabatic side walls, culminating in an OSC-recommended revised cell design.
And the second describes a number of OSC-generated AMTEC generator designs consisting of 2 and 3 GPHS heat source modules, 16 multi-tube converter cells, and a hybrid insulation design, and presents the results of applying the above analysis program to determine the applicability of those generators to possible future missions under consideration by NASA.« less
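
    The coupled-solution scheme described above is, at its core, a fixed-point iteration between interdependent physics blocks: solve one block with the other's latest state, then update, until the states stop changing. The toy sketch below illustrates only that iteration pattern; the relations j(T) and T(j), their coefficients, and the function name are hypothetical stand-ins, not OSC's actual AMTEC equations.

```python
def solve_coupled(t_guess=900.0, tol=1e-9, max_iter=200):
    """Toy fixed-point iteration coupling a thermal and an electrical block.

    Hypothetical stand-ins for the coupled AMTEC relations:
      electrical: current density j rises with temperature T
      thermal:    T drops with the heat carried away by the current
    """
    T = t_guess
    for _ in range(max_iter):
        j = 0.5 + 0.001 * (T - 800.0)   # hypothetical electrical relation j(T)
        T_new = 1000.0 - 150.0 * j      # hypothetical thermal relation T(j)
        if abs(T_new - T) < tol:        # converged: both blocks are consistent
            return T_new, j
        T = T_new
    raise RuntimeError("coupled iteration did not converge")
```

    In practice each "block" is itself a differential/integral solve over the cell's axial coordinate; the contraction behaviour of the outer iteration then depends on how strongly the blocks are coupled.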

  13. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  14. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Fantano, Louis

    2015-01-01

    Thermal and Fluids Analysis Workshop, Silver Spring, MD, NCTS 21070-15. The Landsat 8 Data Continuity Mission, which is part of the United States Geological Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess and refine the proposed thermal vacuum test conditions, and the issues encountered in attempting to satisfy this requirement.

  15. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. This work is therefore concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy performs poorly when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified on realistic aeroelastic systems.
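
    The nonlinear block Gauss-Seidel fixed point mentioned above can be sketched with a toy quasi-static problem: each sweep solves the fluid block with the latest structural state, then the structural block with the fresh load, until the displacement stops changing. The load and spring relations below are hypothetical illustrations, not the paper's discretized Navier-Stokes/structure system.

```python
import math

def gauss_seidel_fsi(l0=100.0, k=1000.0, tol=1e-12, max_iter=500):
    """Block Gauss-Seidel fixed point for a toy quasi-static FSI problem.

    Hypothetical model (not the paper's equations):
      fluid:     load L = l0 / (1 + u)   (load relaxes as the structure deflects)
      structure: displacement u = L / k  (linear spring)
    """
    u = 0.0
    for _ in range(max_iter):
        load = l0 / (1.0 + u)   # fluid solve with the current displacement
        u_new = load / k        # structural solve with the freshly updated load
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    raise RuntimeError("Gauss-Seidel sweep did not converge")
```

    For this toy model the fixed point solves u^2 + u - l0/k = 0, so convergence of the sweep can be checked against the closed-form root; strong coupling (large l0/k) slows the iteration, which is the regime where the staggered strategy degrades.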

  16. Tennessee long-range transportation plan : project evaluation system

    DOT National Transportation Integrated Search

    2005-12-01

    The Project Evaluation System (PES) Report is an analytical methodology to aid programming efforts and prioritize multimodal investments. The methodology consists of both quantitative and qualitative evaluation criteria built upon the Guiding Princip...

  17. Determination of trace levels of parabens in real matrices by bar adsorptive microextraction using selective sorbent phases.

    PubMed

    Almeida, C; Nogueira, J M F

    2014-06-27

    In the present work, an analytical methodology combining bar adsorptive microextraction with micro-liquid desorption followed by high performance liquid chromatography-diode array detection (BAμE-μLD/HPLC-DAD) is proposed for the determination of trace levels of four parabens (methyl, ethyl, propyl and butyl paraben) in real matrices. In a comparison of six polymer (P1, P2, P3, P4, P5 and P6) and five activated carbon (AC1, AC2, AC3, AC4 and AC5) coatings through BAμE, AC2 exhibited much higher selectivity and efficiency than all the other sorbent phases tested, even when compared with commercial stir bar sorptive extraction with polydimethylsiloxane. Assays performed through BAμE(AC2, 1.7 mg) on 25 mL of ultrapure water samples spiked at the 8.0 μg/L level yielded recoveries ranging from 85.6±6.3% to 100.6±11.8% under optimized experimental conditions. The analytical performance also showed convenient limits of detection (0.1 μg/L) and quantification (0.3 μg/L), as well as good linear dynamic ranges (0.5-28.0 μg/L) with remarkable determination coefficients (r(2)>0.9982). Excellent repeatability was also achieved in intraday (RSD<10.2%) and interday (RSD<10.0%) assays. By downsizing the analytical device to half-length (BAμE(AC2, 0.9 mg)), similar analytical data were achieved for the four parabens under optimized experimental conditions, showing that this analytical technology can be designed to operate with lower volumes of sample and desorption solvent, thus increasing sensitivity and effectiveness. The application of the proposed analytical approach, using the standard addition methodology, to tap, underground, estuarine, swimming pool and waste water samples, as well as to commercial cosmetic products and urine samples, revealed good sensitivity, an absence of matrix effects, and the occurrence of some parabens at measurable levels. 
    Moreover, the present methodology is easy to implement, reliable and sensitive, requires little sample and minimal desorption solvent volume, and offers the possibility of tuning the most selective sorbent coating to the target compounds involved. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Towards automated human gait disease classification using phase space representation of intrinsic mode functions

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Patra, Sayantani; Pratiher, Souvik

    2017-06-01

    A novel analytical methodology for segregating healthy gait patterns from those of neurological disorders is proposed, employing a set of oscillating components called intrinsic mode functions (IMFs). These IMFs are generated by Empirical Mode Decomposition of the gait time series, and the Hilbert-transformed analytic signal representation forms the complex-plane trace of the elliptically shaped analytic IMFs. The area and the relative change in centroid position of the polygon formed by the convex hull of these analytic IMFs are taken as the discriminative features. A classification accuracy of 79.31% with an ensemble-learning-based AdaBoost classifier validates the adequacy of the proposed methodology for a computer-aided diagnostic (CAD) system for gait pattern identification. The efficacy of several other potential biomarkers, such as the bandwidths of the amplitude-modulation and frequency-modulation IMFs and their mean frequencies from the Fourier-Bessel expansion of each analytic IMF, is also discussed for the diagnosis and classification of gait patterns.
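
    Once the analytic IMFs' complex-plane traces are available (in practice via a Hilbert transform such as `scipy.signal.hilbert`), the features described above reduce to standard computational geometry. A minimal pure-Python sketch of the hull-area and hull-centroid computation, using Andrew's monotone chain and the shoelace formula (the sample points are arbitrary, not gait data):

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; points are (x, y) tuples.

    Returns the hull vertices in counter-clockwise order.
    """
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):  # z-component of (a - o) x (b - o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area_centroid(poly):
    """Shoelace area and centroid of a simple CCW polygon."""
    a = cx = cy = 0.0
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        w = x0 * y1 - x1 * y0
        a += w
        cx += (x0 + x1) * w
        cy += (y0 + y1) * w
    a *= 0.5
    return a, (cx / (6.0 * a), cy / (6.0 * a))
```

    In the proposed scheme, the hull area and the shift of the hull centroid between conditions would be the per-IMF feature values fed to the classifier.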

  19. Measuring solids concentration in stormwater runoff: comparison of analytical methods.

    PubMed

    Clark, Shirley E; Siu, Christina Y S

    2008-01-15

    Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.

  20. What values in design? The challenge of incorporating moral values into design.

    PubMed

    Manders-Huits, Noëmi

    2011-06-01

    Recently, there has been increased attention to the integration of moral values into the conception, design, and development of emerging IT. The most reviewed approach for this purpose in ethics and technology so far is Value-Sensitive Design (VSD). This article considers VSD as the prime candidate for implementing normative considerations into design. Its methodology is considered from a conceptual, analytical, normative perspective. The focus here is on the suitability of VSD for integrating moral values into the design of technologies in a way that joins in with an analytical perspective on ethics of technology. Despite its promising character, it turns out that VSD falls short in several respects: (1) VSD does not have a clear methodology for identifying stakeholders, (2) the integration of empirical methods with conceptual research within the methodology of VSD is obscure, (3) VSD runs the risk of committing the naturalistic fallacy when using empirical knowledge for implementing values in design, (4) the concept of values, as well as their realization, is left undetermined, and (5) VSD lacks a complementary or explicit ethical theory for dealing with value trade-offs. For the normative evaluation of a technology, I claim that an explicit and justified ethical starting point or principle is required. Moreover, explicit attention should be given to the value aims and assumptions of a particular design. The criteria of adequacy for such an approach or methodology follow from the evaluation of VSD as the prime candidate for implementing moral values in design.

  1. Validation of a sampling plan to generate food composition data.

    PubMed

    Sammán, N C; Gimenez, M A; Bassett, N; Lobo, M O; Marcoleri, M E

    2016-02-15

    A methodology for developing systematic food sampling plans was proposed. Long-life whole and skimmed milk and sunflower oil were selected to validate the methodology in Argentina. The fatty acid profile in all foods, proximal composition, and calcium content in milk were determined with AOAC methods. The number of samples (n) was calculated by applying Cochran's formula with coefficients of variation ⩽12% and a maximum permissible estimation error (r) ⩽5% for calcium content in milks and unsaturated fatty acids in oil. The resulting n values were 9, 11 and 21 for long-life whole milk, skimmed milk, and sunflower oil, respectively. Sample units were randomly collected from production sites and sent to labs. The r calculated from the experimental data was ⩽10%, indicating high accuracy in determining the most variable analyte contents and confirming the reliability of the proposed sampling plan. The methodology is an adequate and useful tool for developing sampling plans for food composition analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
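
    The abstract does not state which variant of Cochran's formula was applied; a common form for sizing a sample to estimate a mean within relative error r, given a coefficient of variation CV, is sketched below. The 95% z-value and the example inputs are assumptions for illustration, not the study's actual parameters.

```python
import math

def cochran_n(cv, r, z=1.96):
    """Sample size for estimating a mean to within relative error r.

    A common reading of Cochran's formula: n = (z * CV / r)^2, rounded up,
    where CV is the coefficient of variation and z the standard normal
    quantile for the chosen confidence level (1.96 for ~95%).
    """
    return math.ceil((z * cv / r) ** 2)
```

    For example, `cochran_n(0.12, 0.05)` sizes a plan at the abstract's stated bounds (CV = 12%, r = 5%); tighter error tolerances or larger variability both drive n up quadratically.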

  2. Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.

    PubMed

    Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel

    2017-01-01

    This review focuses on the latest developments in microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for the detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 to May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted laser desorption/ionization, and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples also illustrate applications in proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. Combinations of MS with capillary versions of electrochromatography and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Rapid monitoring of glycerol in fermentation growth media: Facilitating crude glycerol bioprocess development.

    PubMed

    Abad, Sergi; Pérez, Xavier; Planas, Antoni; Turon, Xavier

    2014-04-01

    Recently, the need to valorise crude glycerol from the biodiesel industry has generated many studies of practical and economic applications. Amongst them, fermentations on glycerol media for the production of high-value metabolites are prominent. This has created a need for analytical techniques that allow fast and simple glycerol monitoring during fermentation. The methodology should be fast and inexpensive so that it can be adopted in research as well as in industrial applications. In this study, three different methods were analysed and compared: two common methodologies based on liquid chromatography and enzymatic kits, and a new method based on a DotBlot assay coupled with image analysis. The new methodology is faster and cheaper than the conventional methods, with comparable performance. Good linearity, precision and accuracy were achieved in the lower range (10 or 15 g/L to depletion), the most common range of glycerol concentrations for monitoring fermentations in terms of growth kinetics. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Experimental and analytical investigation of inertial propulsion mechanisms and motion simulation of rigid multi-body mechanical systems

    NASA Astrophysics Data System (ADS)

    Almesallmy, Mohammed

    Methodologies are developed for the dynamic analysis of mechanical systems, with emphasis on inertial propulsion systems. This work adopts the Lagrangian methodology, the most efficient classical computational technique, implemented as what we call the Equations of Motion Code (EOMC). The EOMC is applied to several simple dynamic mechanical systems for easier understanding of the method and to aid other investigators in developing equations of motion for any dynamic system. In addition, it is applied to a rigid multibody system, the Thomson IPS [Thomson 1986]. Furthermore, a simple symbolic algorithm is developed using Maple software, which can convert any nonlinear nth-order ordinary differential equation (ODE) system into a first-order ODE system in a format ready for use in Matlab software. As a side issue, but an equally important one, we have begun corresponding with the U.S. Patent Office to persuade them that patent applications claiming gross linear motion based on inertial propulsion systems should be automatically rejected; the precedent is the rejection of patent applications involving perpetual motion machines.

  5. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    PubMed

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
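
    Step (1) of the quantitation workflow, normalizing each analyte's ion intensity to an external quadratic regression curve and then applying limits of quantitation, amounts to inverting the calibration polynomial for each pixel. The sketch below uses hypothetical calibration coefficients and limits, and omits the isomeric-interference correction of step (2):

```python
import math

def surface_density(intensity, a, b, c, loq, uloq):
    """Invert a quadratic external calibration I = a*s^2 + b*s + c to a
    surface density s, then gate on lower/upper limits of quantitation.

    Coefficients a, b, c and the LOQ/ULOQ bounds are hypothetical; pixels
    falling outside the quantifiable range are returned as None.
    """
    disc = b * b - 4.0 * a * (c - intensity)
    if disc < 0:
        return None                          # intensity outside the curve's range
    s = (-b + math.sqrt(disc)) / (2.0 * a)   # non-negative root is the physical one
    if s < loq or s > uloq:
        return None                          # below LOQ or above ULOQ: not reported
    return s
```

    Mapping this function over every pixel of a normalized ion image would yield the two-dimensional surface-density image described in the abstract.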

  6. Determination of trace level genotoxic impurities in small molecule drug substances using conventional headspace gas chromatography with contemporary ionic liquid diluents and electron capture detection.

    PubMed

    Ho, Tien D; Yehl, Peter M; Chetwyn, Nik P; Wang, Jin; Anderson, Jared L; Zhong, Qiqing

    2014-09-26

    Ionic liquids (ILs) were used as a new class of diluents for the analysis of two classes of genotoxic impurities (GTIs), namely alkyl/aryl halides and nitroaromatics, in small molecule drug substances by headspace gas chromatography (HS-GC) coupled with electron capture detection (ECD). This novel approach using ILs as contemporary diluents greatly broadens the applicability of HS-GC for the determination of high-boiling (≥ 130°C) analytes, including GTIs, with limits of detection (LOD) ranging from 5 to 500 parts-per-billion (ppb) of analyte in a drug substance. This represents up to a tens-of-thousands-fold improvement compared to traditional HS-GC diluents such as dimethyl sulfoxide (DMSO) and dimethylacetamide (DMAC). Various ILs were screened to determine their suitability as diluents for the HS-GC/ECD analysis. Increasing the HS oven temperature resulted in varying responses for alkyl/aryl halides and a significant increase in response for all nitroaromatic GTIs. Linear ranges of up to five orders of magnitude were found for a number of analytes. The technique was validated on two active pharmaceutical ingredients with excellent recovery. This simple and robust methodology offers a key advantage in the ease of method transfer from development laboratories to quality control environments, since conventional validated chromatographic data systems and GC instruments can be used. For many analytes, it is a cost-effective alternative to more complex trace analytical methodologies such as LC/MS and GC/MS, and it significantly reduces the training needed for operation. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. The Impact of Big Data on Chronic Disease Management.

    PubMed

    Bhardwaj, Niharika; Wodajo, Bezawit; Spano, Anthony; Neal, Symaron; Coustasse, Alberto

    Population health management and specifically chronic disease management depend on the ability of providers to prevent development of high-cost and high-risk conditions such as diabetes, heart failure, and chronic respiratory diseases and to control them. The advent of big data analytics has potential to empower health care providers to make timely and truly evidence-based informed decisions to provide more effective and personalized treatment while reducing the costs of this care to patients. The goal of this study was to identify real-world health care applications of big data analytics to determine its effectiveness in both patient outcomes and the relief of financial burdens. The methodology for this study was a literature review utilizing 49 articles. Evidence of big data analytics being largely beneficial in the areas of risk prediction, diagnostic accuracy and patient outcome improvement, hospital readmission reduction, treatment guidance, and cost reduction was noted. Initial applications of big data analytics have proved useful in various phases of chronic disease management and could help reduce the chronic disease burden.

  8. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest for the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound, although alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  10. Combining numerical simulations with time-domain random walk for pathogen risk assessment in groundwater

    NASA Astrophysics Data System (ADS)

    Cvetkovic, V.; Molin, S.

    2012-02-01

    We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. It is also shown that spatial variability of the attachment rate may be significant; however, it appears to affect risk differently depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers at the field scale, the methodology presented here may be useful for screening purposes and may also serve as a basis for future studies that include greater complexity.
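
    The TDRW idea of combining sampled advective travel times with analytical retention can be sketched as a small Monte Carlo calculation: draw a travel time per particle, attenuate it with a filtration-style decay, and feed the expected surviving dose into a dose-response model. Everything below is a hypothetical illustration; the lognormal travel-time distribution, attachment rate, source strength, and exponential dose-response parameter are stand-ins, not values from the paper.

```python
import math
import random

def infection_risk(n_paths=10000, k=0.05, n0=1000.0, r=0.002, seed=1):
    """Monte Carlo sketch of a TDRW-style pathogen risk estimate.

    Each particle draws an advective travel time tau (lognormal stand-in
    for transport through a heterogeneous velocity field), survives
    attachment/inactivation at rate k along the way, and the expected
    surviving dose enters an exponential dose-response model.
    """
    rng = random.Random(seed)
    surviving = 0.0
    for _ in range(n_paths):
        tau = rng.lognormvariate(3.0, 0.5)   # travel time (e.g. days), hypothetical
        surviving += math.exp(-k * tau)      # colloid-filtration-style attenuation
    dose = n0 * surviving / n_paths          # expected surviving pathogen dose
    return 1.0 - math.exp(-r * dose)         # exponential dose-response risk
```

    A spatially variable attachment rate, as discussed above, would replace the constant k with a value sampled per advection segment.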

  11. Integrated corridor management analysis, modeling and simulation (AMS) methodology.

    DOT National Transportation Integrated Search

    2008-03-01

    This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...

  12. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached within which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the analytical limit requirement is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the analytical limit requirement alone. Despite the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated response time evaluation methodology are scarce. For APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. 
    The analysis technique also has the drawback of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test covering the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or on single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. 
    As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
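
    The budgeting step of the analysis technique, allocating a response time to each component on the critical signal path and comparing the sum against the requirement from the safety analysis, can be sketched as follows. The component names and times are hypothetical illustrations, not plant data.

```python
def check_response_time(requirement_s, component_times_s):
    """Sum allocated component response times along the critical signal path
    and check the total against the response time requirement.

    Returns (total_seconds, meets_requirement).
    """
    total = sum(component_times_s.values())
    return total, total <= requirement_s

# Hypothetical critical path from sensor to final actuation device (seconds).
channel = {
    "pressure transmitter": 0.50,
    "signal processing":    0.35,
    "trip logic":           0.10,
    "final actuation":      0.15,
}
```

    In the combined approach, the same path decomposition drives the test side: each entry is either covered by a single end-to-end channel test or by separate component tests whose results are combined.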

  13. Violent Video Game Effects on Aggression, Empathy, and Prosocial Behavior in Eastern and Western Countries: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba

    2010-01-01

    Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…

  14. Quantitative Profiling of Endogenous Fat-Soluble Vitamins and Carotenoids in Human Plasma Using an Improved UHPSFC-ESI-MS Interface.

    PubMed

    Petruzziello, Filomena; Grand-Guillaume Perrenoud, Alexandre; Thorimbert, Anita; Fogwill, Michael; Rezzi, Serge

    2017-07-18

    Analytical solutions enabling the quantification of circulating levels of liposoluble micronutrients such as vitamins and carotenoids are currently limited to single analytes or reduced panels of analytes. The need to use multiple approaches hampers the investigation of biological variability in large numbers of samples in a time- and cost-efficient manner. With the goal of developing high-throughput, robust quantitative methods for profiling micronutrients in human plasma, we introduce a novel, validated workflow for the determination of 14 fat-soluble vitamins and carotenoids in a single run. Automated supported liquid extraction was optimized and implemented to process 48 samples in parallel in 1 h, and the analytes were measured using ultrahigh-performance supercritical fluid chromatography coupled to tandem mass spectrometry in less than 8 min. Improved mass spectrometry interface hardware was built to minimize the post-decompression volume and to allow better control of the chromatographic effluent density on its route toward and into the ion source. In addition, a specific make-up solvent condition was developed to ensure the solubility of both analytes and matrix constituents after mobile phase decompression. The optimized interface resulted in improved spray plume stability and preserved matrix compound solubility, leading to enhanced hyphenation robustness while ensuring suitable analytical repeatability and improved detection sensitivity. The overall developed methodology gives recoveries within 85-115%, as well as within- and between-day coefficients of variation of 2% and 14%, respectively.

  15. Gain weighted eigenspace assignment

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Andrisani, Dominick, II

    1994-01-01

    This report presents the development of the gain weighted eigenspace assignment methodology. This provides a designer with a systematic methodology for trading off eigenvector placement versus gain magnitudes, while still maintaining desired closed-loop eigenvalue locations. This is accomplished by forming a cost function composed of a scalar measure of error between desired and achievable eigenvectors and a scalar measure of gain magnitude, determining analytical expressions for the gradients, and solving for the optimal solution by numerical iteration. For this development the scalar measure of gain magnitude is chosen to be a weighted sum of the squares of all the individual elements of the feedback gain matrix. An example is presented to demonstrate the method. In this example, solutions yielding achievable eigenvectors close to the desired eigenvectors are obtained with significant reductions in gain magnitude compared to a solution obtained using a previously developed eigenspace (eigenstructure) assignment method.
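
    A minimal numerical sketch of the trade-off described above, under assumptions not taken from the report: a hypothetical 3-state, 2-input system, desired eigenvectors fitted within the achievable subspaces (v_i = (λ_i I - A)⁻¹ B m_i), and a scalar cost combining eigenvector error with a Frobenius-norm gain weight. The report derives analytical gradients; this sketch simply uses a general-purpose numerical optimizer:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-state, 2-input system (illustration only, not from the report).
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [-2., -3., -4.]])
B = np.array([[0., 0.],
              [1., 0.],
              [0., 1.]])
lam = [-1.0, -2.0, -3.0]                 # desired closed-loop eigenvalues
V_des = np.array([[1., 1., 0.],          # desired eigenvectors (columns)
                  [-1., 0., 1.],
                  [0., 1., -1.]])

# For eigenvalue lam[i], achievable eigenvectors lie in the range of L_i.
Ls = [np.linalg.inv(l * np.eye(3) - A) @ B for l in lam]

def gains(z):
    """Map design parameters m_i to achieved eigenvectors V and feedback K."""
    M = z.reshape(2, 3)                  # column i = input direction m_i
    V = np.column_stack([Ls[i] @ M[:, i] for i in range(3)])
    K = M @ np.linalg.inv(V)             # u = K x gives (A + B K) v_i = lam[i] v_i
    return V, K

def cost(z, w):
    V, K = gains(z)
    return np.sum((V_des - V) ** 2) + w * np.sum(K ** 2)

# Start from the unweighted least-squares eigenvector fit, then add gain weight.
M0 = np.column_stack([np.linalg.lstsq(Ls[i], V_des[:, i], rcond=None)[0]
                      for i in range(3)])
_, K0 = gains(M0.flatten())
res = minimize(cost, M0.flatten(), args=(0.1,), method="BFGS")
V_opt, K_opt = gains(res.x)
```

    Because the gain is always reconstructed as K = M V⁻¹, the desired eigenvalues are placed exactly for any choice of the m_i; the weight w only trades eigenvector fidelity against gain magnitude, mirroring the report's cost function.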

  16. In-line monitoring of the coffee roasting process with near infrared spectroscopy: Measurement of sucrose and colour.

    PubMed

    Santos, João Rodrigo; Viegas, Olga; Páscoa, Ricardo N M J; Ferreira, Isabel M P L V O; Rangel, António O S S; Lopes, João Almeida

    2016-10-01

    In this work, a real-time, in-situ analytical tool based on near infrared spectroscopy is proposed to predict two of the most relevant coffee parameters during the roasting process: sucrose content and colour. The methodology was developed taking into consideration different coffee varieties (Arabica and Robusta), coffee origins (Brazil, East-Timor, India and Uganda) and roasting procedures (slow and fast). All near infrared spectroscopy-based calibrations were developed using partial least squares regression. The results proved the suitability of this methodology, as demonstrated by a range-error-ratio higher than 10 and a coefficient of determination higher than 0.85 for all modelled parameters. The relationship between sucrose and colour development during the roasting process is further discussed, with a view to designing, in real time, coffee products with similar visual appearance and distinct organoleptic profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
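
    A partial least squares calibration of the kind used here can be sketched with a numpy-only PLS1 (NIPALS) fit on synthetic "spectra"; the data, component count, and noise level are all hypothetical:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 via NIPALS; returns regression coefficients for centered data."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)           # weight vector
        t = Xc @ w                       # scores
        tt = t @ t
        p = Xc.T @ t / tt                # X loadings
        q = (yc @ t) / tt                # y loading
        Xc = Xc - np.outer(t, p)         # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.inv(P.T @ W) @ Q

rng = np.random.default_rng(0)
S = rng.normal(size=(3, 50))                  # three latent "spectral" profiles
C = rng.uniform(0, 1, size=(40, 3))           # latent concentrations
X = C @ S + 0.01 * rng.normal(size=(40, 50))  # synthetic spectra
y = C[:, 0]                                   # "sucrose" = first component
b = pls1_fit(X, y, n_comp=3)
y_hat = (X - X.mean(0)) @ b + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

    In practice the component count would be chosen by cross-validation rather than fixed as here.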

  17. Water level management of lakes connected to regulated rivers: An integrated modeling and analytical methodology

    NASA Astrophysics Data System (ADS)

    Hu, Tengfei; Mao, Jingqiao; Pan, Shunqi; Dai, Lingquan; Zhang, Peipei; Xu, Diandian; Dai, Huichao

    2018-07-01

    Reservoir operations significantly alter the hydrological regime of the downstream river and river-connected lake, which has far-reaching impacts on the lake ecosystem. To facilitate the management of lakes connected to regulated rivers, the following information must be provided: (1) the response of lake water levels to reservoir operation schedules in the near future and (2) the importance of different rivers in terms of affecting the water levels in different lake regions of interest. We develop an integrated modeling and analytical methodology for the water level management of such lakes. A data-driven method is used to model the lake level, as it can produce quick and accurate predictions. A new genetic algorithm-based synchronized search is proposed to optimize input variable time lags and data-driven model parameters simultaneously. The methodology also involves an orthogonal design and range analysis for extracting the influence of an individual river from that of all the rivers. The integrated methodology is applied to the second largest freshwater lake in China, the Dongting Lake. The results show that: (1) the antecedent lake levels are of crucial importance for the current lake level prediction; (2) the selected river discharge time lags reflect the spatial heterogeneity of the rivers' impacts on lake level changes; (3) the predicted lake levels are in very good agreement with the observed data (RMSE ≤ 0.091 m; R2 ≥ 0.9986). This study demonstrates the practical potential of the integrated methodology, which can provide both the lake level responses to future dam releases and the relative contributions of different rivers to lake level changes.
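
    The genetic-algorithm lag selection can be illustrated with a toy elitist GA on a synthetic series; the chromosome encoding, GA settings, and hold-out fitness below are illustrative assumptions, not the paper's actual synchronized search:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "lake level" y driven by its own lag 1 and a "river discharge" x at lag 3.
T = 400
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(3, T):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 3] + 0.05 * rng.normal()

MAX_LAG = 6          # chromosome: bits for y-lags 1..6, then x-lags 1..6

def lagged(s, k):
    return s[MAX_LAG - k : T - k]

def fitness(bits):
    """Negative hold-out MSE of a least-squares model using the chosen lags."""
    cols = [np.ones(T - MAX_LAG)]
    cols += [lagged(y, k + 1) for k in range(MAX_LAG) if bits[k]]
    cols += [lagged(x, k + 1) for k in range(MAX_LAG) if bits[MAX_LAG + k]]
    Xm, tgt, ntr = np.column_stack(cols), y[MAX_LAG:], 250
    beta, *_ = np.linalg.lstsq(Xm[:ntr], tgt[:ntr], rcond=None)
    return -np.mean((tgt[ntr:] - Xm[ntr:] @ beta) ** 2)

# Elitist GA: uniform crossover of random parents plus bit-flip mutation.
pop = rng.integers(0, 2, size=(30, 2 * MAX_LAG))
best = max(pop, key=fitness)
f0 = fitness(best)
for _ in range(25):
    new = [best.copy()]                      # elitism: keep the best so far
    while len(new) < len(pop):
        pa, pb = pop[rng.integers(30)], pop[rng.integers(30)]
        child = np.where(rng.integers(0, 2, size=2 * MAX_LAG) == 1, pa, pb)
        child[rng.random(2 * MAX_LAG) < 0.05] ^= 1
        new.append(child)
    pop = np.array(new)
    best = max(pop, key=fitness)
```

    With elitism the best fitness never degrades across generations, and the GA should converge toward chromosomes that include the truly informative lags.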

  18. A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.

    PubMed

    Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema

    2016-01-01

    A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method used in industry that targets near-zero error (3.4 defects per million opportunities). The five main phases of Six Sigma are define, measure, analyze, improve and control. Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology for error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, the administrative supervisor and the head of the department. Using Six Sigma methodology, the error rate was measured monthly and the distribution of errors across the pre-analytical, analytical and post-analytical phases was analysed. Improvement strategies were proposed in the monthly intradepartmental meetings, and units with high error rates were monitored. Fifty-six (52.4%) of the 107 recorded errors occurred in the pre-analytical phase. Forty-five errors (42%) were recorded as analytical and 6 errors (5.6%) as post-analytical. Two of the 45 analytical errors were major irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, a decrease of 79.77%. The Six Sigma trial in our pathology laboratory reduced error rates, mainly in the pre-analytical and analytical phases.
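
    The arithmetic behind the sigma figures quoted above is standard; a small sketch using only the Python standard library (the specimen-step count is a hypothetical denominator, since the abstract does not report one):

```python
from statistics import NormalDist

def dpmo(defects, opportunities):
    """Defects per million opportunities."""
    return 1e6 * defects / opportunities

def sigma_level(d):
    """Short-term sigma level for a DPMO value, with the usual 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - d / 1e6) + 1.5

six_sigma_target = sigma_level(3.4)       # the "3.4 per million" benchmark

# Hypothetical tally: 107 errors over 1.2 million specimen-handling steps.
rate = dpmo(107, 1_200_000)
level = sigma_level(rate)
```

    The 1.5-sigma shift is the conventional allowance for long-term process drift; without it, 3.4 DPMO corresponds to about 4.5 sigma.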

  19. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions about which subset of systems or tasks should be undertaken to achieve the goal of the project. To support this decision making, SIMRAND (SIMulation of Research ANd Development Projects) provides a methodology for selecting the optimal subset of systems or tasks to be undertaken on an R&D project. The SIMRAND methodology models the alternative subsets of systems or tasks under consideration as alternative networks. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
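
    The SIMRAND idea can be sketched in miniature: each path through an alternative network is a set of tasks whose outcomes are random variables with expert-assessed distributions (triangular distributions stand in here), and paths are compared by expected cardinal utility via Monte Carlo. All paths, distributions, and the risk-aversion parameter are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Each path = a sequence of tasks; each task cost is a triangular random
# variable (low, mode, high) standing in for an expert-assessed CDF.
paths = {
    "A": [(2, 4, 9), (1, 2, 4)],
    "B": [(3, 5, 6), (2, 3, 4)],
    "C": [(1, 8, 20)],
}

def utility(cost, risk_aversion=0.15):
    """Cardinal (exponential) utility over total path cost: lower cost is better."""
    return -np.expm1(risk_aversion * cost)   # = -(e^{r c} - 1), risk averse

def expected_utility(tasks, n=20_000):
    total = sum(rng.triangular(lo, mode, hi, size=n) for lo, mode, hi in tasks)
    return utility(total).mean()

scores = {name: expected_utility(tasks) for name, tasks in paths.items()}
best_path = max(scores, key=scores.get)
```

    The exponential utility penalizes high-variance paths, so path "C", despite having only one task, is disfavored for its wide cost spread.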

  20. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    PubMed

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. In addition, growing interest in monitoring the kinetics and dynamics of miniaturized biosystems has spurred demand for novel methodologies for the analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill these requirements. Nicotine, N,N-diethyl-meta-toluamide, and diclofenac, which span a sufficient diversity of physicochemical features, were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption of the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast, reliable determinations of free and total concentrations without disruption of the system equilibrium.

  1. Determination of As, Se, and Hg in fuel samples by in-chamber chemical vapor generation ICP OES using a Flow Blurring® multinebulizer.

    PubMed

    García, Miriam; Aguirre, Miguel Ángel; Canals, Antonio

    2017-09-01

    In this work, a new and simple analytical methodology based on in-chamber chemical vapor generation has been developed for the spectrochemical analysis of commercial fuel samples. A multiple nebulizer with three nebulization units has been employed for this purpose: one unit was used for sample introduction, while the other two were used for the necessary reagent introduction. In this way, the aerosols were mixed inside the spray chamber. Through this method, analyte transport and, therefore, sensitivity are improved in inductively coupled plasma-optical emission spectrometry. The factors (i.e., variables) influencing chemical vapor generation have been optimized using a multivariate approach. Under optimum chemical vapor generation conditions ([NaBH4] = 1.39%, [HCl] = 2.97 M, total liquid flow = 936 μL min⁻¹), the proposed sample introduction system allowed the determination of arsenic, selenium, and mercury up to 5 μg g⁻¹ with limits of detection of 25, 140, and 13 μg kg⁻¹, respectively. Analyzing spiked commercial fuel samples, recovery values obtained were between 96 and 113%, and expanded uncertainty values ranged from 4 to 16%. The most striking practical conclusion of this investigation is that no carbon deposit appears on the plasma torch after extended periods of operation. Graphical abstract: A new and simple analytical methodology based on in-chamber chemical vapor generation has been developed for the spectrochemical analysis of commercial fuel samples by ICP OES.

  2. HPLC-PFD determination of priority pollutant PAHs in water, sediment, and semipermeable membrane devices

    USGS Publications Warehouse

    Williamson, K.S.; Petty, J.D.; Huckins, J.N.; Lebo, J.A.; Kaiser, E.M.

    2002-01-01

    High performance liquid chromatography coupled with programmable fluorescence detection was employed for the determination of 15 priority pollutant polycyclic aromatic hydrocarbons (PPPAHs) in water, sediment, and semipermeable membrane devices (SPMDs). Chromatographic separation using this analytical method provides selectivity and sensitivity (ppt levels) and can serve as a non-destructive technique for subsequent analysis by other chromatographic and spectroscopic techniques. Extraction and sample cleanup procedures were also developed for water, sediment, and SPMDs using various chromatographic and wet chemical methods. The focus of this publication is to examine the enrichment techniques and the analytical methodologies used in the isolation, characterization, and quantitation of 15 PPPAHs in different sample matrices.

  3. Multi-way chemometric methodologies and applications: a central summary of our research work.

    PubMed

    Wu, Hai-Long; Nie, Jin-Fang; Yu, Yong-Jie; Yu, Ru-Qin

    2009-09-14

    Multi-way data analysis and tensorial calibration are gaining widespread acceptance with the rapid development of modern analytical instruments. In recent years, our group, working in the State Key Laboratory of Chemo/Biosensing and Chemometrics at Hunan University, has carried out extensive research in this area, such as building more canonical symbol systems, seeking the inner mathematical cyclic symmetry property for trilinear or multilinear decomposition, suggesting a series of multi-way calibration algorithms, exploring the rank estimation of three-way trilinear data arrays and analyzing different application systems. In the present paper, an overview of theories and applications in analytical chemistry, ranging from second-order to third-order data, is presented.
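
    The trilinear decomposition at the core of such multi-way calibration can be sketched with a plain alternating-least-squares PARAFAC fit of a synthetic rank-2 array (numpy only; the dimensions and rank are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic noiseless rank-2 trilinear array: X[i,j,k] = sum_r A[i,r]B[j,r]C[k,r]
A0, B0, C0 = (rng.random((d, 2)) for d in (10, 8, 6))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)

def als_step(X, A, B, C):
    """One sweep of alternating least squares for the trilinear model."""
    I, J, K = X.shape
    R = A.shape[1]
    Z = np.einsum("jr,kr->jkr", B, C).reshape(J * K, R)
    A = np.linalg.lstsq(Z, X.reshape(I, J * K).T, rcond=None)[0].T
    Z = np.einsum("ir,kr->ikr", A, C).reshape(I * K, R)
    B = np.linalg.lstsq(Z, X.transpose(1, 0, 2).reshape(J, I * K).T, rcond=None)[0].T
    Z = np.einsum("ir,jr->ijr", A, B).reshape(I * J, R)
    C = np.linalg.lstsq(Z, X.transpose(2, 0, 1).reshape(K, I * J).T, rcond=None)[0].T
    return A, B, C

A, B, C = (rng.random((d, 2)) for d in (10, 8, 6))
hist = []
for _ in range(60):
    A, B, C = als_step(X, A, B, C)
    hist.append(np.linalg.norm(X - np.einsum("ir,jr,kr->ijk", A, B, C)))
```

    Each partial update is a linear least-squares solve, so the residual is non-increasing sweep by sweep; production algorithms add initialization and convergence safeguards that this sketch omits.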

  4. Analytical methodologies based on LC-MS/MS for monitoring selected emerging compounds in liquid and solid phases of the sewage sludge.

    PubMed

    Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F

    2016-01-01

    In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (ECs): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed using Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L⁻¹ for the aqueous phase, and 50, 500 and 2000 μg kg⁻¹ for the solid phase of the sludge. In general, the methods were satisfactorily validated, showing good recoveries (70-120%) and precision (RSD < 20%). The limit of quantification (LOQ) was below 0.1 μg L⁻¹ in the aqueous phase and below 50 μg kg⁻¹ in the solid phase for the majority of the analytes. The applicability of the methods was tested by analysis of samples from a wider study on degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample treatment procedures for extracting the selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.

  5. New methodology for capillary electrophoresis with ESI-MS detection: Electrophoretic focusing on inverse electromigration dispersion gradient. High-sensitivity analysis of sulfonamides in waters.

    PubMed

    Malá, Zdena; Gebauer, Petr; Boček, Petr

    2016-09-07

    This article describes for the first time the combination of electrophoretic focusing on an inverse electromigration dispersion (EMD) gradient, a separation principle first described in 2010, with electrospray-ionization (ESI) mass spectrometric detection. The separation of analytes along the electromigrating EMD profile proceeds so that each analyte is focused and concentrated within the profile at a particular position given by its pKa and ionic mobility. The proposed methodology combines this principle with the transport of the focused zones to the capillary end by superimposed electromigration, electroosmotic flow and ESI suction, and their detection by the MS detector. The designed electrolyte system, based on maleic acid and 2,6-lutidine, is suitable for creating an inverse EMD gradient of the required properties, and its components are volatile enough to be compatible with the ESI interface. The characteristic properties of the proposed electrolyte system and of the formed inverse gradient are discussed in detail using calculated diagrams and computer simulations. It is shown that the system is surprisingly robust and allows sensitive analyses of trace amounts of weak acids in the pKa range between approximately 6 and 9. As a first practical application of electrophoretic focusing on an inverse EMD gradient, the analysis of several sulfonamides in waters is reported. It demonstrates the potential of the developed methodology for fast and high-sensitivity analyses of ionic trace analytes, with LODs of around 3 × 10⁻⁹ M (0.8 ng mL⁻¹) for sulfonamides in spiked drinking water without any sample pretreatment. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Analyzing data from open enrollment groups: current considerations and future directions.

    PubMed

    Morgan-Lopez, Antonio A; Fals-Stewart, William

    2008-07-01

    Difficulties in modeling turnover in treatment-group membership have been cited as one of the major impediments to ecological validity of substance abuse and alcoholism treatment research. In this review, our primary foci are on (a) the discussion of approaches that draw on state-of-the-science analytic methods for modeling open-enrollment group data and (b) highlighting emerging issues that are critical to this relatively new area of methodological research (e.g., quantifying membership change, modeling "holiday" effects, and modeling membership change among group members and leaders). Continuing refinement of new modeling tools to address these analytic complexities may ultimately lead to the development of more federally funded open-enrollment trials. These developments may also facilitate the building of a "community-friendly" treatment research portfolio for funding agencies that support substance abuse and alcoholism treatment research.

  7. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction: Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods: A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results: The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix evaluating model components and methodology. Conclusion: When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimens and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g., HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
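
    The decision-analytic models reviewed here are typically Markov cohort models; a deliberately tiny sketch comparing a hypothetical "screen" versus "no screen" strategy (all transition probabilities, costs, and utilities invented for illustration) shows the mechanics of an ICER calculation:

```python
import numpy as np

# States: undiagnosed infection, diagnosed-and-treated, cirrhosis, dead.
# Hypothetical annual transition matrices (each row sums to 1).
P_no_screen = np.array([
    [0.96, 0.000, 0.030, 0.01],
    [0.00, 0.985, 0.005, 0.01],
    [0.00, 0.000, 0.900, 0.10],
    [0.00, 0.000, 0.000, 1.00],
])
P_screen = P_no_screen.copy()
P_screen[0] = [0.86, 0.10, 0.03, 0.01]    # screening finds 10% of cases per year

q = np.array([0.75, 0.85, 0.50, 0.00])    # hypothetical utility weights per state
c = np.array([200., 8000., 10000., 0.])   # hypothetical annual costs per state

def run(P, screen_cost=0.0, years=30, disc=0.03):
    """Discounted cohort totals of cost and QALYs for one strategy."""
    s = np.array([1.0, 0.0, 0.0, 0.0])    # whole cohort starts undiagnosed
    cost = qaly = 0.0
    for t in range(years):
        d = (1 + disc) ** -t
        cost += d * (s @ c + screen_cost * s[0])   # screen the undiagnosed
        qaly += d * (s @ q)
        s = s @ P
    return cost, qaly

c0, q0 = run(P_no_screen)
c1, q1 = run(P_screen, screen_cost=50.0)
icer = (c1 - c0) / (q1 - q0)              # incremental cost per QALY gained
```

    A static model like this ignores onward transmission; the review's point about dynamic versus static modelling is precisely that a transmission-dynamic model can credit screening with averted infections as well.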

  8. Analytical aspects of plant metabolite profiling platforms: current standings and future aims.

    PubMed

    Seger, Christoph; Sturm, Sonja

    2007-02-01

    Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.
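
    The unsupervised side of such platforms is often a PCA scores plot; a minimal numpy sketch on a synthetic sample-by-metabolite table with a planted group difference (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic profiling table: 20 samples x 50 "metabolite" features, with two
# groups separated along one planted direction.
base = rng.normal(size=(20, 50))
direction = rng.normal(size=50)
direction /= np.linalg.norm(direction)
labels = np.array([0] * 10 + [1] * 10)
Xm = base + 8.0 * labels[:, None] * direction

def pca_scores(X, k=2):
    """Mean-center and project onto the top-k principal axes via SVD."""
    Xc = X - X.mean(0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, s**2 / np.sum(s**2)

scores, explained = pca_scores(Xm)
sep = abs(scores[:10, 0].mean() - scores[10:, 0].mean())
```

    Real workflows add scaling choices (e.g. Pareto scaling) and validated supervised models on top of this unsupervised overview.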

  9. Cytochemical studies of planetary microorganisms - Explorations in exobiology

    NASA Technical Reports Server (NTRS)

    Lederberg, J.

    1972-01-01

    Analytical methodology using gas chromatography and mass spectrography for improved physiological monitoring of astronauts is developed. Reported research covers the following topics: (1) Chlorination of DNA bases; (2) mass fragmentography; (3) mass spectrometry; (4) urine analysis for metabolic constituents; (5) analysis of natural products by mass spectrometry; (6) computer identification of unknown molecular compounds; (7) fluorescent sorter for cell separation; (8) Mariner Mars 1971 orbital photography; and (9) Viking Lander imagery.

  10. Thermal and Chemical Characterization of Composite Materials. MSFC Center Director's Discretionary Fund Final Report, Project No. ED36-18

    NASA Technical Reports Server (NTRS)

    Stanley, D. C.; Huff, T. L.

    2003-01-01

    The purpose of this research effort was to: (1) provide a concise and well-defined property profile of current and developing composite materials using thermal and chemical characterization techniques and (2) optimize analytical testing requirements of materials. This effort applied a diverse array of methodologies to ascertain composite material properties. Often, a single method or technique will provide useful, but nonetheless incomplete, information on material composition and/or behavior. To more completely understand and predict material properties, a broad-based analytical approach is required. By developing a database of information comprising both thermal and chemical properties, material behavior under varying conditions may be better understood. This is even more important in the aerospace community, where new composite materials and those in the development stage have little reference data. For example, Fourier transform infrared (FTIR) spectroscopy spectral databases available for identification of vapor phase spectra, such as those generated during experiments, generally refer to well-defined chemical compounds. Because this method renders a unique thermal decomposition spectral pattern, even larger, more diverse databases, such as those found in solid and liquid phase FTIR spectroscopy libraries, cannot be used. By combining this and other available methodologies, a database specifically for new materials and materials being developed at Marshall Space Flight Center can be generated. In addition, characterizing materials using this approach will be extremely useful in the verification of materials and identification of anomalies in NASA-wide investigations.

  11. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients for the analysis of a certified reference material using the standardized methodologies ranged from 0.90 to 0.93. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for fluoride analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
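
    The intraclass correlation coefficients reported above can be computed from a samples-by-laboratories table; a sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement) with hypothetical fluoride readings:

```python
import numpy as np

def icc2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single rater."""
    n, k = Y.shape
    grand = Y.mean()
    ms_r = k * np.sum((Y.mean(1) - grand) ** 2) / (n - 1)   # between samples
    ms_c = n * np.sum((Y.mean(0) - grand) ** 2) / (k - 1)   # between labs
    sse = np.sum((Y - Y.mean(1, keepdims=True) - Y.mean(0) + grand) ** 2)
    ms_e = sse / ((n - 1) * (k - 1))                        # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical fluoride readings (mg/L): 5 reference samples x 3 laboratories.
Y = np.array([[0.25, 0.26, 0.24],
              [0.50, 0.52, 0.49],
              [0.75, 0.74, 0.76],
              [1.00, 1.03, 0.99],
              [1.25, 1.24, 1.26]])
icc = icc2_1(Y)
```

    With small lab-to-lab deviations relative to the spread of reference concentrations, the coefficient approaches 1, which is the regime the standardized protocols achieved.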

  12. Supercritical fluid extraction and ultra performance liquid chromatography of respiratory quinones for microbial community analysis in environmental and biological samples.

    PubMed

    Hanif, Muhammad; Atsuta, Yoichi; Fujie, Koichi; Daimon, Hiroyuki

    2012-03-05

    Microbial community structure plays a significant role in environmental assessment and animal health management. The development of a superior analytical strategy for the characterization of microbial community structure is an ongoing challenge. In this study, we developed an effective supercritical fluid extraction (SFE) and ultra performance liquid chromatography (UPLC) method for the analysis of bacterial respiratory quinones (RQ) in environmental and biological samples. RQ profile analysis is one of the most widely used culture-independent tools for characterizing microbial community structure. A UPLC equipped with a photo diode array (PDA) detector was successfully applied to the simultaneous determination of ubiquinones (UQ) and menaquinones (MK) without tedious pretreatment. Supercritical carbon dioxide (scCO₂) extraction with a solid-phase cartridge trap proved to be a more effective and rapid method for extracting respiratory quinones than a conventional organic solvent extraction method. This methodology leads to a successful analytical procedure that involves a significant reduction in complexity and sample preparation time. Application of the optimized methodology to characterize microbial communities based on the RQ profile was demonstrated for a variety of environmental samples (activated sludge, digested sludge, and compost) and biological samples (swine and Japanese quail feces).

  13. Enantiomeric separation of non-protein amino acids by electrokinetic chromatography.

    PubMed

    Pérez-Míguez, Raquel; Marina, María Luisa; Castro-Puyana, María

    2016-10-07

    New analytical methodologies enabling the enantiomeric separation of a group of non-protein amino acids of interest in the pharmaceutical and food analysis fields were developed in this work using Electrokinetic Chromatography. The use of FMOC as derivatization reagent and the subsequent separation under acidic conditions (formate buffer at pH 2.0) with anionic cyclodextrins as chiral selectors allowed the chiral separation of eight of the ten non-protein amino acids studied. Pyroglutamic acid, norvaline, norleucine, 3,4-dihydroxyphenylalanine, 2-aminoadipic acid, and selenomethionine were enantiomerically separated using sulfated-α-CD, while sulfated-γ-CD enabled the enantiomeric separation of norvaline, 3,4-dihydroxyphenylalanine, 2-aminoadipic acid, selenomethionine, citrulline, and pipecolic acid. Moreover, the potential of the developed methodologies was demonstrated in the analysis of citrulline and its enantiomeric impurity in food supplements. For that purpose, experimental and instrumental variables were optimized and the analytical characteristics of the proposed method were evaluated. LODs of 2.1 × 10⁻⁷ and 1.8 × 10⁻⁷ M were obtained for d- and l-citrulline, respectively. d-Cit was not detectable in any of the six food supplement samples analyzed, showing that the effect of storage time on the racemization of citrulline was negligible. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Quantitative mass spectrometry of unconventional human biological matrices

    NASA Astrophysics Data System (ADS)

    Dutkiewicz, Ewelina P.; Urban, Pawel L.

    2016-10-01

    The development of sensitive and versatile mass spectrometric methodology has fuelled interest in the analysis of metabolites and drugs in unconventional biological specimens. Here, we discuss the analysis of eight human matrices (hair, nail, breath, saliva, tears, meibum, nasal mucus and skin excretions, including sweat) by mass spectrometry (MS). The use of such specimens brings a number of advantages, the most important being non-invasive sampling, the limited risk of adulteration and the ability to obtain information that complements blood and urine tests. The most often studied matrices are hair, breath and saliva. This review primarily focuses on endogenous (e.g. potential biomarkers, hormones) and exogenous (e.g. drugs, environmental contaminants) small molecules. The majority of analytical methods used chromatographic separation prior to MS; however, such a hyphenated methodology greatly limits analytical throughput. On the other hand, mass spectrometric methods that exclude chromatographic separation are fast but suffer from matrix interferences. To enable the development of quantitative assays for unconventional matrices, it is desirable to standardize the protocols for the analysis of each specimen and create appropriate certified reference materials. Overcoming these challenges will make analysis of unconventional human biological matrices more common in a clinical setting. This article is part of the themed issue 'Quantitative mass spectrometry'.

  15. Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.

    PubMed

    Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M

    2016-09-01

    Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.

  16. Effect of modulation of the particle size distributions in the direct solid analysis by total-reflection X-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Fernández-Ruiz, Ramón; Friedrich K., E. Josue; Redrejo, M. J.

    2018-02-01

    The main goal of this work was to investigate, in a systematic way, the influence of the controlled modulation of the particle size distribution of a representative solid sample on the most relevant analytical parameters of the Direct Solid Analysis (DSA) by Total-reflection X-Ray Fluorescence (TXRF) quantitative method. In particular, accuracy, uncertainty, linearity and detection limits were correlated with the main parameters of the size distributions for the following elements: Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Rb, Sr, Ba and Pb. In all cases strong correlations were found. The main conclusion of this work can be summarized as follows: modulating the particles towards lower average sizes, together with minimizing the width of the particle size distribution, produces a marked improvement in accuracy and a minimization of uncertainties and limits of detection for the DSA-TXRF methodology. These achievements support the future use of the DSA-TXRF analytical methodology in the development of ISO norms and standardized protocols for the direct analysis of solids by means of TXRF.

  17. Calculation of catalyst crust thickness from full elemental laser-induced breakdown spectroscopy images

    NASA Astrophysics Data System (ADS)

    Sorbier, L.; Trichard, F.; Moncayo, S.; Lienemann, C. P.; Motto-Ros, V.

    2018-01-01

    We propose a methodology to compute the crust thickness of an element in an egg-shell catalyst from a two-dimensional elemental map. The methodology handles two important catalyst shapes: infinite extrudates of arbitrary section and spheres. The methodology is validated with synthetic analytical profiles on simple shapes (cylinder and sphere). Its relative accuracy is shown to be within a few percent, with the error decreasing inversely with the square root of the number of sampled pixels. The crust thicknesses obtained by this method from quantitative Pd maps acquired by laser-induced breakdown spectroscopy are comparable with values obtained from electron-probe microanalysis profiles. Some discrepancies are found and are explained by the heterogeneity of the crust thickness within a grain. As a full map is more representative than a single profile, fast mapping and the methodology exposed in this paper are expected to become valuable tools for the development of new generations of egg-shell deposited catalysts.
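    For a rough sense of how a crust thickness can be recovered from a 2D elemental map, consider the simplest shape the paper validates against: a circular (cylindrical) section with a uniform shell. The geometric inversion below is a hedged sketch under that uniform-shell assumption, not the authors' full profile-based procedure; the map, radius and shell values are synthetic.

```python
import numpy as np

def crust_thickness_from_map(mask_element, mask_pellet, radius):
    """Estimate egg-shell crust thickness on a circular (cylinder) section.

    Assumes a uniform shell: area fraction f = 1 - (1 - t/R)^2, hence
    t = R * (1 - sqrt(1 - f)).  Illustrative geometry only.
    """
    f = mask_element.sum() / mask_pellet.sum()
    return radius * (1.0 - np.sqrt(1.0 - f))

# Synthetic elemental map: disc of radius 100 px, element in the outer 10 px shell.
n, R, t_true = 256, 100.0, 10.0
yy, xx = np.mgrid[:n, :n]
r = np.hypot(xx - n / 2, yy - n / 2)
pellet = r <= R
shell = (r <= R) & (r >= R - t_true)
t_est = crust_thickness_from_map(shell, pellet, R)
print(f"true {t_true:.1f} px, estimated {t_est:.1f} px")
```

Pixelation introduces only a sub-pixel error here, consistent with the few-percent accuracy the paper reports for simple shapes.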

  18. Making sense of grounded theory in medical education.

    PubMed

    Kennedy, Tara J T; Lingard, Lorelei A

    2006-02-01

    Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.

  19. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the Phase-0 Controls-Structures-Integration (CSI) Evolutionary Model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and the parameters of these controllers are considered as the control design variables. The sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process to ensure that the redesigned structure can be fabricated effectively. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.

  20. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
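    The paper's calibration uses partial least squares regression on diode-array spectra; as a simplified stand-in, the sketch below deconvolves a synthetic three-protein mixture spectrum by ordinary least squares, assuming known pure-component spectra (Beer-Lambert additivity). All spectra and concentrations here are invented; PLS, unlike this sketch, is trained on mixture standards and needs no pure spectra.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pure-component UV spectra (rows) over 50 wavelengths, stand-ins
# for lysozyme, ribonuclease A and cytochrome c -- not real protein spectra.
wavelengths = np.linspace(240, 300, 50)
def peak(center, width):
    return np.exp(-((wavelengths - center) / width) ** 2)
S = np.vstack([peak(255, 12), peak(270, 9), peak(280, 15)])   # shape (3, 50)

# Mixture spectrum at one elution time point (Beer-Lambert: absorbances add).
c_true = np.array([0.8, 0.3, 0.5])
mixture = c_true @ S + rng.normal(0, 0.002, size=50)

# Deconvolve by least squares: the simpler cousin of the PLS calibration
# used in the paper.
c_est, *_ = np.linalg.lstsq(S.T, mixture, rcond=None)
print(np.round(c_est, 3))
```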

  1. Propellant injection systems and processes

    NASA Technical Reports Server (NTRS)

    Ito, Jackson I.

    1995-01-01

    The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and, ultimately, analytical modeling grounded in basic engineering principles. This methodology is more systematic and far superior to the historical injector design processes of 'trial and error' or blindly 'copying past successes.' The benefit of such an approach is the ability to rank candidate design concepts by relative probability of success, or technical risk, across all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if no single analytical design concept can be shown to satisfy all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower-cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can then be developed based on the physical insight provided by these analyses.

  2. Incorporating Information Literacy Skills into Analytical Chemistry: An Evolutionary Step

    ERIC Educational Resources Information Center

    Walczak, Mary M.; Jackson, Paul T.

    2007-01-01

    The American Chemical Society (ACS) has recently decided to incorporate various information literacy skills into the teaching of analytical chemistry. The methodology has been found to be highly effective, as it gives students a better understanding of the material.

  3. A Modern Approach to College Analytical Chemistry.

    ERIC Educational Resources Information Center

    Neman, R. L.

    1983-01-01

    Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…

  4. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, verification of the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model-Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSN's requirements, structure and behaviour. It then translates the SysML elements into an analytic model, specifically a Deterministic Stochastic Petri Net. The proposed approach allows designers to model WSNs and to study their behaviour and energy performance.
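    To give a flavour of the kind of analytic model the SysML elements are translated into, the sketch below plays the token game on a toy Petri net for one sensor node's duty cycle and tallies energy per firing. The places, transitions and energy costs are invented for illustration (and the toy is untimed and deterministic); they are not the paper's DSPN.

```python
# Minimal token-game sketch of a Petri net for one WSN node's duty cycle.
# Places hold tokens; a transition fires when its input place is marked.
places = {"sleep": 1, "sensing": 0, "transmitting": 0}
ENERGY_COST = {"wake": 0.1, "sense": 0.5, "transmit": 1.5}  # assumed mJ values

transitions = [
    ("wake", "sleep", "sensing"),
    ("sense", "sensing", "transmitting"),
    ("transmit", "transmitting", "sleep"),
]

def fire(name, src, dst):
    """Move a token from src to dst and return the energy spent."""
    places[src] -= 1
    places[dst] += 1
    return ENERGY_COST[name]

energy = 0.0
for _ in range(30):  # 30 firings = 10 full sleep->sense->transmit cycles
    for name, src, dst in transitions:
        if places[src] > 0:
            energy += fire(name, src, dst)
            break

print(f"energy per 10 cycles: {energy:.1f} mJ, marking: {places}")
```

A stochastic (DSPN) version would attach deterministic or exponentially distributed firing delays to the transitions, enabling the performance analysis the paper describes.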

  5. Aeroservoelastic wind-tunnel investigations using the Active Flexible Wing Model: Status and recent accomplishments

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.

    1989-01-01

    The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.

  6. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    PubMed

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs), which can affect the hormonal system of humans and wildlife even at low concentrations. Because these pollutants enter the environment through water, which is the most affected compartment, analytical methods that allow their determination in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and to approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water the most investigated matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy, and its hallmark will be 'team science'.

  8. Proteomics: from hypothesis to quantitative assay on a single platform. Guidelines for developing MRM assays using ion trap mass spectrometers.

    PubMed

    Han, Bomie; Higgs, Richard E

    2008-09-01

    High-throughput HPLC-mass spectrometry (HPLC-MS) is routinely used to profile biological samples for potential protein markers of disease, drug efficacy and toxicity. The discovery technology has advanced to the point where translating hypotheses from proteomic profiling studies into clinical use is the bottleneck to realizing the full potential of these approaches. The first step in this translation is the development and analytical validation of a higher throughput assay with improved sensitivity and selectivity relative to typical profiling assays. Multiple reaction monitoring (MRM) assays are an attractive approach for this stage of biomarker development given their improved sensitivity and specificity, the speed at which the assays can be developed and the quantitative nature of the assay. While the profiling assays are performed with ion trap mass spectrometers, MRM assays are traditionally developed in quadrupole-based mass spectrometers. Development of MRM assays from the same instrument used in the profiling analysis enables a seamless and rapid transition from hypothesis generation to validation. This report provides guidelines for rapidly developing an MRM assay using the same mass spectrometry platform used for profiling experiments (typically ion traps) and reviews methodological and analytical validation considerations. The analytical validation guidelines presented are drawn from existing practices on immunological assays and are applicable to any mass spectrometry platform technology.

  9. Assessment of analytical quality in Nordic clinical chemistry laboratories using data from contemporary national programs.

    PubMed

    Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H

    1984-01-01

    The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.

  10. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, was introduced. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N < or = 24) was elaborated, and guidelines as well as algorithms were given in detail. This strategy has been produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which very often occurs in practice, could be the most appropriate fit to the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of an analytical procedure. The complete prevalidation process included characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step was suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this procedure, expressed as prevalidation figures of merit, confirmed prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) serves a multitude of functions for a clientele that exists both in and outside ORNL. These functions fall into the following general categories: (1) analytical research, development, and implementation; (2) programmatic research, development, and utilization; and (3) technical support. The Division is organized into five major sections, each of which may carry out any type of work falling into the three categories mentioned above. Chapters 1 through 5 of this report highlight progress within the five sections (analytical methodology, mass and emission spectrometry, radioactive materials, bio/organic analysis, and general and environmental analysis) during the period January 1, 1982 to December 31, 1982. A short summary introduces each chapter to indicate work scope. Information about quality assurance and safety programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8. Approximately 61 articles, 32 proceedings publications and 37 reports were published, and 107 oral presentations were given during this reporting period.

  12. Generalized Subset Designs in Analytical Chemistry.

    PubMed

    Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan

    2017-06-20

    Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.

  13. TDS exposure project: application of the analytic hierarchy process for the prioritization of substances to be analyzed in a total diet study.

    PubMed

    Papadopoulos, A; Sioen, I; Cubadda, F; Ozer, H; Basegmez, H I Oktay; Turrini, A; Lopez Esteban, M T; Fernandez San Juan, P M; Sokolić-Mihalak, D; Jurkovic, M; De Henauw, S; Aureli, F; Vin, K; Sirot, V

    2015-02-01

    The objective of this article is to develop a general method based on the analytic hierarchy process (AHP) methodology to rank the substances to be studied in a total diet study (TDS). This method was tested for different substances and groups of substances (N = 113) for which the TDS approach has been considered relevant. This work was performed by a group of 7 experts from different European countries representing their institutes, which are involved in the TDS EXPOSURE project. The AHP methodology is based on a score system that quantifies experts' judgments by assigning comparative scores to the different identified issues. Hence, the 10 substances of highest interest in the framework of a TDS are trace elements (methylmercury, cadmium, inorganic arsenic, lead, aluminum, inorganic mercury), dioxins, furans and polychlorinated biphenyls (PCBs), and some additives (sulfites and nitrites). The priority list depends on both the national situation (geographical variations, consumer concern, etc.) and the availability of data. Thus, the list depends on the objectives of the TDS and on achievable analytical performance. Moreover, such a list is highly variable with time and new data (e.g. social context, vulnerable population groups, emerging substances, new toxicological data or health-based guidance values). Copyright © 2014 Elsevier Ltd. All rights reserved.
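    The core AHP computation, deriving priority weights from a pairwise-comparison matrix and checking judgment consistency, can be sketched as follows. The matrix entries below are hypothetical expert judgments (e.g. comparing toxicity, dietary occurrence and analytical feasibility), not the project's actual scores.

```python
import numpy as np

def ahp_priorities(A, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix (power iteration),
    plus Saaty's consistency ratio.  Illustrative sketch of the AHP step that
    turns expert judgments into priority weights."""
    n = A.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    lam = (A @ w / w).mean()                              # principal eigenvalue
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    CR = 0.0 if RI == 0 else (lam - n) / (n - 1) / RI     # consistency ratio
    return w, CR

# Hypothetical judgments: criterion 1 moderately outranks 2, strongly outranks 3.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, CR = ahp_priorities(A)
print(np.round(w, 3), f"CR = {CR:.3f}")
```

A consistency ratio below 0.1 is the conventional threshold for accepting the judgments as coherent.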

  14. Pressurized hot water extraction followed by miniaturized membrane assisted solvent extraction for the green analysis of alkylphenols in sediments.

    PubMed

    Salgueiro-González, N; Turnes-Carou, I; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2015-02-27

    A novel and green analytical methodology for the determination of alkylphenols (4-tert-octylphenol, 4-n-octylphenol, 4-n-nonylphenol, nonylphenol) in sediments was developed and validated. The method was based on pressurized hot water extraction (PHWE) followed by miniaturized membrane assisted solvent extraction (MASE) and liquid chromatography-electrospray ionization tandem mass spectrometry detection (LC-ESI-MS/MS). The extraction conditions were optimized by a Plackett-Burman design in order to minimize the number of assays, in keeping with Green Chemistry principles. Matrix effect was studied and compensated using deuterated labeled standards as surrogate standards for the quantitation of the target compounds. The analytical features of the method were satisfactory: relative recoveries varied between 92 and 103%, and repeatability and intermediate precision were <9% for all compounds. Quantitation limits of the method (MQL) ranged from 0.061 (4-n-nonylphenol) to 1.7 ng g(-1) dry weight (nonylphenol). Sensitivity, selectivity, automation and speed are the main advantages of the proposed methodology. Reagent consumption, analysis time and waste generation were minimized. The "greenness" of the proposed method was evaluated using an analytical Eco-Scale approach and satisfactory results were obtained. The applicability of the proposed method was demonstrated by analysing sediment samples from the Galician coast (NW Spain), and the ubiquity of alkylphenols in the environment was confirmed. Copyright © 2015 Elsevier B.V. All rights reserved.
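    A Plackett-Burman screening design of the kind used above can be constructed by hand for the common 12-run case. The sketch below builds it from the classic cyclic generator row and verifies the column orthogonality that lets 11 factors be screened in only 12 experiments (this is the generic construction, not the authors' software-generated design).

```python
import numpy as np

# 12-run Plackett-Burman design from the classic cyclic generator row
# (Plackett & Burman, 1946): shift it cyclically 11 times, then append a
# row of minuses.
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
rows = [np.roll(gen, k) for k in range(11)]
design = np.vstack(rows + [-np.ones(11, dtype=int)])

# Orthogonality check: distinct columns are uncorrelated (X.T @ X = 12*I),
# so main effects of up to 11 factors can be estimated independently.
print(design.shape, np.array_equal(design.T @ design, 12 * np.eye(11)))
```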

  15. Report on an Assessment of the Application of EPP Results from the Strain Limit Evaluation Procedure to the Prediction of Cyclic Life Based on the SMT Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetter, R. I.; Messner, M. C.; Sham, T. -L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data based approach for creep-fatigue damage evaluation into the EPP methodology, to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705 °C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950 °C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress-strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.

  16. Combined Numerical/Analytical Perturbation Solutions of the Navier-Stokes Equations for Aerodynamic Ejector/Mixer Nozzle Flows

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence Justin

    1998-01-01

    In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development is reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary layer methods but provides important two-dimensional information not available using quasi-1-d approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier-Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast that it may be used either as a subroutine or called by a design optimization routine. The models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M. These experiments were performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.

  17. Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.

    PubMed

    Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie

    2017-12-01

    Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.

  18. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to investigating the sources of water pollution.

  19. Study on bending behaviour of nickel–titanium rotary endodontic instruments by analytical and numerical analyses

    PubMed Central

    Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S

    2013-01-01

    Aim: To develop analytical models and analyse the stress distribution and flexibility of nickel–titanium (NiTi) instruments subjected to bending forces. Methodology: An analytical method was used to analyse the behaviour of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived using Euler–Bernoulli nonlinear differential equations that take into account the screw pitch variation of these NiTi instruments. In addition, a nonlinear deformation analysis based on the analytical model was carried out, and numerical results were obtained with a nonlinear finite element method. Results: According to the analytical results, the maximum curvature of the instrument occurs near the instrument tip. The finite element analysis likewise located the maximum von Mises stress near the instrument tip. The proposed analytical model can therefore be used to predict the position of maximum curvature in the instrument, where fracture may occur, and the analytical and numerical results were compatible. Conclusion: The proposed analytical model, validated by numerical results for the bending deformation of NiTi instruments, is useful in the design and analysis of instruments and is effective in studying their flexibility. Compared with the finite element method, the analytical model deals conveniently and efficiently with the bending behaviour of rotary NiTi endodontic instruments. PMID:23173762
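    The reported result, maximum curvature near the instrument tip, can be illustrated with a small-deflection Euler–Bernoulli sketch of a linearly tapered cantilever. All dimensions, the modulus and the load are invented stand-ins (real files have non-circular, pitched cross-sections and large deflections, which this sketch omits):

```python
import numpy as np

# Hypothetical tapered cantilever approximating a rotary NiTi file.
# Small-deflection Euler-Bernoulli: curvature kappa(s) = M(s) / (E * I(s)).
L = 25.0e-3              # length (m), assumed
r0, rt = 0.5e-3, 0.1e-3  # base and tip radii (m), assumed taper
E = 40e9                 # NiTi Young's modulus (Pa), order of magnitude
F = 0.5                  # transverse tip load (N), assumed

s = np.linspace(0.0, L, 2001)        # arc length measured from the base
r = r0 + (rt - r0) * s / L           # linear taper
I = np.pi * r**4 / 4.0               # second moment of a circular section
M = F * (L - s)                      # bending moment from the tip load
kappa = M / (E * I)                  # curvature along the file

s_max = s[np.argmax(kappa)]
# Analytic optimum for this taper: distance from tip u* = rt*L / (3*(r0 - rt)),
# i.e. about 2 mm from the tip here -- near, but not at, the tip.
u_star = rt * L / (3.0 * (r0 - rt))
print(f"max curvature {L - s_max:.4e} m from the tip (analytic {u_star:.4e} m)")
```

    The moment vanishes at the tip while the section stiffness shrinks toward it, so the curvature peak sits a short distance behind the tip, consistent with the fracture location the abstract reports.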

  20. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights:
    • Prevailing C&D waste quantification methodologies are identified and compared.
    • No single methodology can fulfill all waste quantification scenarios.
    • A relevance tree for selecting an appropriate quantification methodology is proposed.
    • More attention should be paid to civil and infrastructural works.
    • Classified information is suggested for making an effective waste management plan.

    Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review was conducted. Fifty-seven papers were retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified: site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies are discussed and potential future research directions are suggested.

  1. Treatment effects model for assessing disease management: measuring outcomes and strengthening program management.

    PubMed

    Wendel, Jeanne; Dumitras, Diana

    2005-06-01

    This paper describes an analytical methodology for obtaining statistically unbiased outcomes estimates for programs in which participation decisions may be correlated with variables that impact outcomes. This methodology is particularly useful for intraorganizational program evaluations conducted for business purposes. In this situation, data is likely to be available for a population of managed care members who are eligible to participate in a disease management (DM) program, with some electing to participate while others eschew the opportunity. The most pragmatic analytical strategy for in-house evaluation of such programs is likely to be the pre-intervention/post-intervention design in which the control group consists of people who were invited to participate in the DM program, but declined the invitation. Regression estimates of program impacts may be statistically biased if factors that impact participation decisions are correlated with outcomes measures. This paper describes an econometric procedure, the Treatment Effects model, developed to produce statistically unbiased estimates of program impacts in this type of situation. Two equations are estimated to (a) estimate the impacts of patient characteristics on decisions to participate in the program, and then (b) use this information to produce a statistically unbiased estimate of the impact of program participation on outcomes. This methodology is well-established in economics and econometrics, but has not been widely applied in the DM outcomes measurement literature; hence, this paper focuses on one illustrative application.
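    The two-equation logic can be sketched on simulated data: a probit participation equation produces a selection-correction term (the generalized residual built from inverse Mills ratios) that is added to the outcome regression, removing the bias a naive regression suffers when participation correlates with unobserved outcome drivers. All coefficients and the data below are invented; this is a hand-rolled illustration of the estimator's idea, not the paper's dataset or software:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)                   # observed patient characteristic
z = rng.normal(size=n)                   # drives participation, not outcomes
u = rng.normal(size=n)                   # unobservable behind participation
e = 0.6 * u + 0.8 * rng.normal(size=n)   # outcome error, correlated with u

d = (0.3 + 0.8 * x + 1.0 * z + u > 0).astype(float)  # DM participation
tau = 1.5                                            # true program effect
y = 1.0 + 2.0 * x + tau * d + e                      # health/cost outcome

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive regression: participation correlates with e, so tau is biased.
tau_naive = ols(np.column_stack([np.ones(n), x, d]), y)[2]

# Equation 1: probit for the participation decision (maximum likelihood).
W = np.column_stack([np.ones(n), x, z])
def nll(b):
    idx = W @ b
    return -(d * norm.logcdf(idx) + (1 - d) * norm.logcdf(-idx)).sum()
b_hat = minimize(nll, np.zeros(3), method="BFGS").x

# Equation 2: add the generalized residual (inverse Mills ratio terms)
# as a regressor to absorb the selection correlation.
idx = W @ b_hat
lam = np.where(d == 1, norm.pdf(idx) / norm.cdf(idx),
               -norm.pdf(idx) / norm.cdf(-idx))
tau_2step = ols(np.column_stack([np.ones(n), x, d, lam]), y)[2]

print(f"true {tau}, naive {tau_naive:.2f}, treatment-effects {tau_2step:.2f}")
```

    On this synthetic population the naive estimate overstates the program effect substantially, while the corrected estimate recovers the true value, which is the practical point of the Treatment Effects model for in-house DM evaluations.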

  2. Fast methodology for the reliable determination of nonylphenol in water samples by minimal labeling isotope dilution mass spectrometry.

    PubMed

    Fabregat-Cabello, Neus; Castillo, Ángel; Sancho, Juan V; González, Florenci V; Roig-Navarro, Antoni Francesc

    2013-08-02

    In this work we have developed and validated an accurate and fast methodology for the determination of 4-nonylphenol (technical mixture) in complex-matrix water samples by UHPLC-ESI-MS/MS. The procedure is based on isotope dilution mass spectrometry (IDMS) in combination with isotope pattern deconvolution (IPD), which provides the concentration of the analyte directly from the spiked sample without requiring any methodological calibration graph. To avoid any possible isotopic effect during the analytical procedure, the in-house synthesized (13)C1-4-(3,6-dimethyl-3-heptyl)phenol was used as the labeled compound. This surrogate was able to compensate for the matrix effect even in wastewater samples. An SPE pre-concentration step, together with exhaustive efforts to avoid contamination, was included to reach the signal-to-noise ratio necessary to detect the endogenous concentrations present in environmental samples. Calculations were performed acquiring only three transitions, achieving limits of detection lower than 100 ng/g for all water matrices assayed. Recoveries within 83-108% and coefficients of variation ranging from 1.5% to 9% were obtained. In contrast, a considerable overestimation was obtained with the usual classical calibration procedure using 4-n-nonylphenol as internal standard, demonstrating the suitability of the minimal labeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
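    The isotope pattern deconvolution step amounts to a small linear least-squares problem: the abundances measured across the monitored transitions are modeled as a mixture of the natural and the labeled isotope patterns, and the molar ratio then gives the concentration directly from the spike amount. The patterns and amounts below are made up for illustration (not the actual nonylphenol abundances):

```python
import numpy as np

# Hypothetical isotope patterns over three monitored transitions
# (columns: natural analyte, 13C1-labeled surrogate).
X = np.array([
    [0.89, 0.02],   # M
    [0.10, 0.88],   # M+1
    [0.01, 0.10],   # M+2
])

# Synthetic "measured" pattern: 70% natural, 30% labeled contribution.
measured = X @ np.array([0.7, 0.3])

# Deconvolve: solve measured = X @ [f_nat, f_lab] in least squares.
(f_nat, f_lab), *_ = np.linalg.lstsq(X, measured, rcond=None)

# With a known spike amount, the sample amount follows from the molar
# ratio -- the core of IDMS, with no calibration graph required.
n_spike = 2.0e-9            # mol of labeled surrogate added (assumed)
n_sample = n_spike * f_nat / f_lab
print(f"f_nat={f_nat:.3f}, f_lab={f_lab:.3f}, n_sample={n_sample:.2e} mol")
```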

  3. The harmful chemistry behind "krokodil": Street-like synthesis and product analysis.

    PubMed

    Alves, Emanuele Amorim; Soares, José Xavier; Afonso, Carlos Manuel; Grund, Jean-Paul C; Agonia, Ana Sofia; Cravo, Sara Manuela; Netto, Annibal Duarte Pereira; Carvalho, Félix; Dinis-Oliveira, Ricardo Jorge

    2015-12-01

    "Krokodil" is the street name for a drug that has been attracting media and researchers' attention due to its increasing spread and extreme toxicity. "Krokodil" is a homemade injectable mixture used as a cheap substitute for heroin. Its use began in Russia and Ukraine, but it is spreading to other countries. The starting materials for "krokodil" synthesis are tablets containing codeine, caustic soda, gasoline, hydrochloric acid, iodine from disinfectants and red phosphorus from matchboxes, all of which are easily available in retail markets or drugstores. The resulting product is a light brown liquid that is injected without prior purification. Herein, we aimed to understand the chemistry behind "krokodil" synthesis by mimicking the steps followed by people who use this drug. Successful synthesis was confirmed by the presence of desomorphine and two other morphinans. A gas chromatography-electron impact/mass spectrometry (GC-EI/MS) methodology for the quantification of desomorphine and codeine was also developed and validated. The methodologies presented herein provide a representative synthesis of "krokodil" street samples and an effective analytical methodology for the quantification of desomorphine, the major morphinan found. Further studies are required to identify other possible by-products in "krokodil", since these may help explain the signs and symptoms presented by users. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  5. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    PubMed

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can serve as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.

  7. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  8. An immersed boundary method for modeling a dirty geometry data

    NASA Astrophysics Data System (ADS)

    Onishi, Keiji; Tsubokura, Makoto

    2017-11-01

    We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible high-Reynolds-number flow around highly complex geometries. The method is achieved by dispersing the momentum via an axial linear projection and by an approximate-domain assumption that satisfies mass conservation around the cells containing the wall. This methodology has been verified against analytical theory and wind tunnel experiment data. Next, we simulate the problem of flow around a rotating object and demonstrate the applicability of this methodology to moving-geometry problems. This methodology offers a route to obtaining quick solutions on next-generation large-scale supercomputers. This research was supported by MEXT as ``Priority Issue on Post-K computer'' (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.

  9. Benchmark Tests for Stirling Convertor Heater Head Life Assessment Conducted

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Halford, Gary R.; Bowman, Randy R.

    2004-01-01

    A new in-house test capability has been developed at the NASA Glenn Research Center, where a critical component of the Stirling Radioisotope Generator (SRG) is undergoing extensive testing to support the development of analytical life prediction methodology and to experimentally verify the flight-design component's life. The new facility includes two test rigs that are performing creep testing of the SRG heater head pressure vessel test articles at design temperature and with wall stresses ranging from the operating level to seven times that level (see the following photograph).

  10. Selecting Health Care Improvement Projects: A Methodology Integrating Cause-and-Effect Diagram and Analytical Hierarchy Process.

    PubMed

    Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray

    It is often vital to identify, prioritize, and select quality improvement projects in a hospital. Yet, a methodology, which utilizes experts' opinions with different points of view, is needed for better decision making. The proposed methodology utilizes the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using a weighting scheme of analytical hierarchy process by aggregating experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology is implemented for improving a hospital appointment system. The top-ranked 2 major project categories for improvements were identified to be system- and accessibility-related causes (45%) and capacity-related causes (28%), respectively. For each of the major project category, subprojects were then ranked for selecting the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementations are provided.
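    The analytical hierarchy process weighting scheme described above reduces to an eigenvector computation: experts' pairwise comparisons form a reciprocal matrix whose principal eigenvector gives the priority weights, with a consistency ratio guarding against contradictory judgments. A sketch with an invented 3×3 comparison of improvement-project categories (the matrix is a standard textbook-style example, not the hospital's actual data):

```python
import numpy as np

# Hypothetical pairwise comparisons of three improvement categories
# (Saaty scale; A[i, j] = how strongly category i outranks category j).
A = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                              # priority weights, sum to 1

# Consistency check: CR < 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58                               # random index RI = 0.58 for n = 3
print(f"weights {np.round(w, 3)}, CR = {cr:.3f}")
```

    The resulting weights rank the first category highest, mirroring how the methodology surfaced system- and accessibility-related causes as the top project category.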

  11. Reagentless, Structure-Switching, Electrochemical Aptamer-Based Sensors

    NASA Astrophysics Data System (ADS)

    Schoukroun-Barnes, Lauren R.; Macazo, Florika C.; Gutierrez, Brenda; Lottermoser, Justine; Liu, Juan; White, Ryan J.

    2016-06-01

    The development of structure-switching, electrochemical, aptamer-based sensors over the past ˜10 years has led to a variety of reagentless sensors capable of analytical detection in a range of sample matrices. The crux of this methodology is the coupling of target-induced conformation changes of a redox-labeled aptamer with electrochemical detection of the resulting altered charge transfer rate between the redox molecule and electrode surface. Using aptamer recognition expands the highly sensitive detection ability of electrochemistry to a range of previously inaccessible analytes. In this review, we focus on the methods of sensor fabrication and how sensor signaling is affected by fabrication parameters. We then discuss recent studies addressing the fundamentals of sensor signaling as well as quantitative characterization of the analytical performance of electrochemical aptamer-based sensors. Although the limits of detection of reported electrochemical aptamer-based sensors do not often reach that of gold-standard methods such as enzyme-linked immunosorbent assays, the operational convenience of the sensor platform enables exciting analytical applications that we address. Using illustrative examples, we highlight recent advances in the field that impact important areas of analytical chemistry. Finally, we discuss the challenges and prospects for this class of sensors.

  12. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of the six analytical methods used to measure these 174 contaminants of emerging concern.
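    The spiked-recovery comparison underlying this kind of QA/QC assessment is simple arithmetic; a sketch with invented concentrations and an assumed acceptance window (not values or criteria from the survey):

```python
# Percent recovery of a spiked analyte: how much of a known added
# amount the method actually reports back. All values are hypothetical.
background = 12.0   # ng/L measured in the unspiked sample
spike = 50.0        # ng/L added to the spiked sample
measured = 55.5     # ng/L measured in the spiked sample

recovery_pct = 100.0 * (measured - background) / spike
print(f"recovery = {recovery_pct:.1f}%")   # 87.0%

# An illustrative acceptance window for screening method performance;
# real criteria are method- and matrix-specific.
acceptable = 70.0 <= recovery_pct <= 130.0
```

    Tracking such recoveries per analyte, per matrix, over time is what lets a study flag analytes (like the 48 excluded pharmaceuticals) whose methods do not consistently meet predetermined quality standards.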

  13. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a matter of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. A simple and highly selective molecular imprinting polymer-based methodology for propylparaben monitoring in personal care products and industrial waste waters.

    PubMed

    Vicario, Ana; Aragón, Leslie; Wang, Chien C; Bertolino, Franco; Gomez, María R

    2018-02-05

    In this work, a novel molecularly imprinted polymer (MIP), proposed as a solid phase extraction sorbent, was developed for the determination of propylparaben (PP) in diverse cosmetic samples. The use of parabens (PAs) as microbiological preservatives is authorized by regulatory agencies; however, several recent studies claim that the large-scale use of these preservatives can be a potential health risk and harmful to the environment. Diverse factors that influence polymer synthesis were studied, including the template, functional monomer, porogen and crosslinker used. Morphological characterization of the MIP was performed using SEM and BET analysis. Parameters affecting the molecularly imprinted solid phase extraction (MISPE) and elution efficiency of PP were evaluated. After sample clean-up, the analyte was analyzed by high performance liquid chromatography (HPLC). The whole procedure was validated, showing satisfactory analytical parameters. After applying the MISPE methodology, the extraction recoveries were always better than 86.15%, and the precision, expressed as RSD%, was always lower than 2.19% for the corrected peak areas. A good linear relationship was obtained within the range 8-500 ng/mL of PP (r² = 0.99985). Limits of detection and quantification after the MISPE procedure of 2.4 and 8 ng/mL, respectively, were reached, lower than those of previously reported methodologies. The MISPE-HPLC methodology provides a simple and economical way to accomplish a clean-up/preconcentration step and the subsequent determination of PP in a complex matrix. The performance of the proposed method was compared against C-18 and silica solid phase extraction (SPE) cartridges; the recovery factors obtained were 96.6, 64.8 and 0.79 for the MISPE, C18-SPE and silica-SPE procedures, respectively. The proposed methodology improves the retention capability of the SPE material and adds robustness and the possibility of reuse, enabling it to be used for routine PP monitoring in diverse personal care products (PCP) and environmental samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. The Global War on Terrorism: Analytical Support, Tools and Metrics of Assessment. MORS Workshop

    DTIC Science & Technology

    2005-08-11

    …is the matter of intelligence, as COL(P) Keller pointed out: we need to spend less time in the intelligence cycle on managing information and… models, decision aids: "named things"
    • Methodologies: potentially useful things
    • Resources: databases, people, books?
    • Meta-data on tools
    • Develop a… experience. Only one member (Mr. Garry Greco) had served on the Joint Intelligence Task Force for Counter Terrorism. Although Gary heavily participated…

  16. Tank waste remediation system baseline tank waste inventory estimates for fiscal year 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shelton, L.W., Westinghouse Hanford

    1996-12-06

    A set of tank-by-tank waste inventories is derived from historical waste models, flowsheet records, and analytical data to support the Tank Waste Remediation System flowsheet and retrieval sequence studies. Enabling assumptions and methodologies used to develop the inventories are discussed. These provisional inventories conform to previously established baseline inventories and are meant to serve as an interim basis until standardized inventory estimates are made available.

  17. Modeling of a ring rosen-type piezoelectric transformer by Hamilton's principle.

    PubMed

    Nadal, Clément; Pigache, Francois; Erhart, Jiří

    2015-04-01

    This paper deals with the analytical modeling of a ring Rosen-type piezoelectric transformer. The developed model is based on a Hamiltonian approach, enabling the main parameters and a performance evaluation to be obtained for the first radial vibratory modes. The methodology is detailed, and the final results, both the input admittance and the electric potential distribution on the surface of the secondary part, are compared with numerical and experimental ones for discussion and validation.

  18. Bioinspired Methodology for Artificial Olfaction

    PubMed Central

    Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve

    2008-01-01

    Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409

  19. A General Methodology for the Translation of Behavioral Terms into Vernacular Languages.

    PubMed

    Virues-Ortega, Javier; Martin, Neil; Schnerch, Gabriel; García, Jesús Ángel Miguel; Mellichamp, Fae

    2015-05-01

    As the field of behavior analysis expands internationally, the need for comprehensive and systematic glossaries of behavioral terms in the vernacular languages of professionals and clients becomes crucial. We created a Spanish-language glossary of behavior-analytic terms by developing and employing a systematic set of decision-making rules for the inclusion of terms. We then submitted the preliminary translation to a multi-national advisory committee to evaluate the transnational acceptability of the glossary. This method led to a translated corpus of over 1200 behavioral terms. The end products of this work included the following: (a) a Spanish-language glossary of behavior analytic terms that are publicly available over the Internet through the Behavior Analyst Certification Board and (b) a set of translation guidelines summarized here that may be useful for the development of glossaries of behavioral terms into other vernacular languages.

  20. Development and optimization of an energy-regenerative suspension system under stochastic road excitation

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Hsieh, Chen-Yu; Golnaraghi, Farid; Moallem, Mehrdad

    2015-11-01

    In this paper a vehicle suspension system with energy harvesting capability is developed, and an analytical methodology for the optimal design of the system is proposed. The optimization technique provides design guidelines for determining the stiffness and damping coefficients aimed at optimal performance in terms of ride comfort and energy regeneration. The corresponding performance metrics are the root-mean-square (RMS) of the sprung mass acceleration and the expectation of generated power. Actual road roughness is considered as the stochastic excitation, defined by the ISO 8608:1995 standard road profiles, and used in deriving the optimization method. An electronic circuit is proposed to provide variable damping in real time based on the optimization rule. A test bed is utilized, and experiments under different driving conditions are conducted to verify the effectiveness of the proposed method. The test results suggest that the analytical approach is credible in determining the optimality of system performance.
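    The ride-comfort metric can be sketched in the frequency domain: pass the ISO 8608 road displacement spectrum through the suspension's frequency response and integrate to obtain the RMS sprung-mass acceleration. A single-degree-of-freedom base-excitation sketch (all parameters are invented, not the paper's vehicle, and the real analysis uses a fuller suspension model):

```python
import numpy as np

# 1-DOF sprung mass over an ISO 8608 road: frequency-domain RMS
# acceleration. All parameter values are illustrative assumptions.
m, k, c = 300.0, 20000.0, 1500.0      # kg, N/m, N*s/m
v = 20.0                              # vehicle speed, m/s
Gd_n0, n0 = 256e-6, 0.1               # class C roughness (m^3) at n0 cycles/m

f = np.linspace(0.5, 30.0, 4000)      # temporal frequency band, Hz
w = 2 * np.pi * f

# Displacement transmissibility of the base-excited sprung mass.
T2 = (k**2 + (c * w)**2) / ((k - m * w**2)**2 + (c * w)**2)

# Road displacement PSD mapped to temporal frequency at speed v:
# Gd(f) = Gd(n0) * n0^2 * v / f^2  (from Gd(n) = Gd(n0) * (n/n0)^-2).
S_road = Gd_n0 * n0**2 * v / f**2

S_acc = w**4 * T2 * S_road            # sprung-mass acceleration PSD
rms_acc = np.sqrt(np.sum(S_acc) * (f[1] - f[0]))
print(f"RMS sprung-mass acceleration ~ {rms_acc:.2f} m/s^2")
```

    Sweeping k and c in such a loop, and adding a harvested-power expectation computed from the same spectra, is the kind of trade-off the paper's optimization formalizes analytically.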

  1. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.; Anderson, M. R.

    1985-01-01

    Optimal-control-theoretic modeling and frequency-domain analysis is the methodology proposed to evaluate analytically the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability is then evaluated using frequency-domain techniques. When these techniques were applied to the flight-test data for thirty-two highly-augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.

  2. Hybrid experimental/analytical models of structural dynamics - Creation and use for predictions

    NASA Technical Reports Server (NTRS)

    Balmes, Etienne

    1993-01-01

    An original complete methodology for the construction of predictive models of damped structural vibrations is introduced. A consistent definition of normal and complex modes is given which leads to an original method to accurately identify non-proportionally damped normal mode models. A new method to create predictive hybrid experimental/analytical models of damped structures is introduced, and the ability of hybrid models to predict the response to system configuration changes is discussed. Finally a critical review of the overall methodology is made by application to the case of the MIT/SERC interferometer testbed.

  3. Effects of space environment on composites: An analytical study of critical experimental parameters

    NASA Technical Reports Server (NTRS)

    Gupta, A.; Carroll, W. F.; Moacanin, J.

    1979-01-01

    A generalized methodology, currently employed at JPL, was used to develop an analytical model for the effects of high-energy electrons and for interactions between electron and ultraviolet effects. Chemical kinetic concepts were applied in defining quantifiable parameters; the need for determining short-lived transient species and their concentrations was demonstrated. The results demonstrate a systematic and cost-effective means of addressing the issues and show applicable qualitative and quantitative relationships between space radiation and simulation parameters. An equally important result is the identification of critical initial experiments necessary to further clarify these relationships. Topics discussed include facility and test design; rastered vs. diffuse continuous e-beam; valid acceleration level; simultaneous vs. sequential exposure to different types of radiation; and interruption of test continuity.

  4. Auditing of chromatographic data.

    PubMed

    Mabie, J T

    1998-01-01

    During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.

  5. [Composition of chicken and quail eggs].

    PubMed

    Closa, S J; Marchesich, C; Cabrera, M; Morales, J C

    1999-06-01

    Quality-assured food composition data on lipids are needed to evaluate intakes as a risk factor in the development of heart disease. The proximal composition, cholesterol, and fatty acid content of chicken and quail eggs, as usually consumed or traded, were analysed. Proximal composition was determined using AOAC (1984) specific techniques; lipids were extracted by a modified Folch technique, and cholesterol and fatty acids were determined by gas chromatography. The results corroborate the stability of egg composition. The cholesterol content of quail eggs is similar to that of chicken eggs, but it is almost half the value registered in Handbook 8. The differences may be attributed to the analytical methodology used to obtain them. This study provides data obtained with up-to-date analytical techniques, together with accessory information useful for food composition tables.

  6. Trade-Off Analysis Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); and Aerospace Technology. CNS previously developed a report which applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involved unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis on the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS-developed analytical model that examined performance parameters.

  7. Costs of Addressing Heroin Addiction in Malaysia and 32 Comparable Countries Worldwide

    PubMed Central

    Ruger, Jennifer Prah; Chawarski, Marek; Mazlan, Mahmud; Luekens, Craig; Ng, Nora; Schottenfeld, Richard

    2012-01-01

    Objective: Develop and apply new costing methodologies to estimate costs of opioid dependence treatment in countries worldwide. Data Sources/Study Setting: Micro-costing methodology developed and data collected during randomized controlled trial (RCT) involving 126 patients (July 2003–May 2005) in Malaysia. Gross-costing methodology developed to estimate costs of treatment replication in 32 countries with data collected from publicly available sources. Study Design: Fixed, variable, and societal cost components of Malaysian RCT micro-costed and analytical framework created and employed for gross-costing in 32 countries selected by three criteria relative to Malaysia: major heroin problem, geographic proximity, and comparable gross domestic product (GDP) per capita. Principal Findings: Medication, and urine and blood testing accounted for the greatest percentage of total costs for both naltrexone (29–53 percent) and buprenorphine (33–72 percent) interventions. In 13 countries, buprenorphine treatment could be provided for under $2,000 per patient. For all countries except United Kingdom and Singapore, incremental costs per person were below $1,000 when comparing buprenorphine to naltrexone. An estimated 100 percent of opiate users in Cambodia and Lao People's Democratic Republic could be treated for $8 and $30 million, respectively. Conclusions: Buprenorphine treatment can be provided at low cost in countries across the world. This study's new costing methodologies provide tools for health systems worldwide to determine the feasibility and cost of similar interventions. PMID:22091732

  8. Costs of addressing heroin addiction in Malaysia and 32 comparable countries worldwide.

    PubMed

    Ruger, Jennifer Prah; Chawarski, Marek; Mazlan, Mahmud; Luekens, Craig; Ng, Nora; Schottenfeld, Richard

    2012-04-01

    Develop and apply new costing methodologies to estimate costs of opioid dependence treatment in countries worldwide. Micro-costing methodology developed and data collected during randomized controlled trial (RCT) involving 126 patients (July 2003-May 2005) in Malaysia. Gross-costing methodology developed to estimate costs of treatment replication in 32 countries with data collected from publicly available sources. Fixed, variable, and societal cost components of Malaysian RCT micro-costed and analytical framework created and employed for gross-costing in 32 countries selected by three criteria relative to Malaysia: major heroin problem, geographic proximity, and comparable gross domestic product (GDP) per capita. Medication, and urine and blood testing accounted for the greatest percentage of total costs for both naltrexone (29-53 percent) and buprenorphine (33-72 percent) interventions. In 13 countries, buprenorphine treatment could be provided for under $2,000 per patient. For all countries except United Kingdom and Singapore, incremental costs per person were below $1,000 when comparing buprenorphine to naltrexone. An estimated 100 percent of opiate users in Cambodia and Lao People's Democratic Republic could be treated for $8 and $30 million, respectively. Buprenorphine treatment can be provided at low cost in countries across the world. This study's new costing methodologies provide tools for health systems worldwide to determine the feasibility and cost of similar interventions. © Health Research and Educational Trust.
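    The gross-costing comparison described above reduces to simple per-patient arithmetic. A minimal sketch with entirely hypothetical numbers (the study's actual fixed and variable cost components are not reproduced here):

```python
# Illustrative gross-costing arithmetic -- hypothetical figures, not the
# study's data. Per-patient cost = amortized fixed cost + variable cost;
# the incremental cost compares buprenorphine with naltrexone.
def per_patient_cost(fixed_total, n_patients, variable_per_patient):
    """Total per-patient cost of delivering one intervention."""
    return fixed_total / n_patients + variable_per_patient

bup = per_patient_cost(fixed_total=50_000, n_patients=100, variable_per_patient=1_200)
nal = per_patient_cost(fixed_total=50_000, n_patients=100, variable_per_patient=600)
incremental = bup - nal
print(f"buprenorphine ${bup:,.0f}/patient, incremental cost ${incremental:,.0f}")
```

    Scaling such per-patient figures by the estimated number of opiate users in a country yields the national treatment-cost totals the study reports.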

  9. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  10. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  11. New analytical methodology for analysing S(IV) species at low pH solutions by one stage titration method (bichromatometry) with a clear colour change. Could potentially replace the state-of-art-method iodometry at low pH analysis due higher accuracy

    PubMed Central

    Galfi, Istvan; Virtanen, Jorma; Gasik, Michael M.

    2017-01-01

    A new, faster and more reliable analytical methodology for S(IV) species analysis in low-pH solutions by bichromatometry is proposed. For decades the state-of-the-art methodology has been iodometry, which remains a well-justified method for neutral solutions but suffers from various side reactions in low-pH media that increase inaccuracy. In contrast, the new methodology has no side reactions in low-pH media, requires only one titration step, and provides a clear color change if S(IV) species are present in the solution. The method is validated using model solutions with known concentrations and applied to the analysis of gaseous SO2 purged from low-pH solution samples. The results indicate that bichromatometry can accurately analyze SO2 from liquid samples having a pH even below 0, relevant to metallurgical industrial processes. PMID:29145479
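    The titration arithmetic behind such an analysis can be illustrated. This sketch assumes the conventional 1:3 dichromate-to-S(IV) stoichiometry (the six-electron reduction of Cr2O7^2- balanced against the two-electron oxidation of S(IV) to S(VI)); the example volumes and concentration are hypothetical, not taken from the paper.

```python
# Hypothetical worked example of dichromate titration arithmetic.
# Stoichiometry: Cr2O7^2- + 3 S(IV) + 8 H+ -> 2 Cr3+ + 3 S(VI) + 4 H2O,
# so 1 mol of dichromate oxidizes 3 mol of S(IV).
def s_iv_concentration(c_titrant_M, v_titrant_mL, v_sample_mL):
    """Return the S(IV) concentration [mol/L] from a dichromate titration."""
    mol_dichromate = c_titrant_M * v_titrant_mL / 1000.0
    mol_s_iv = 3.0 * mol_dichromate
    return mol_s_iv / (v_sample_mL / 1000.0)

# e.g. 12.5 mL of 0.01 M K2Cr2O7 consumed titrating a 50 mL sample:
c = s_iv_concentration(0.01, 12.5, 50.0)
print(f"S(IV) = {c:.4f} M")  # 0.0075 M
```

    The single-step endpoint (the clear color change the abstract describes) is what makes this arithmetic directly applicable without the back-titration steps iodometry requires.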

  12. Background for Joint Systems Aspects of AIR 6000

    DTIC Science & Technology

    2000-04-01

    Checkland's Soft Systems Methodology (SSM) [7, 8, 9]. The analytical techniques that are proposed for joint systems work are based on calculating probability [10: Pearl, Judea, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference]. Acronyms from the report's glossary: SLMP, Structural Life Management Plan; SOW, Stand-Off Weapon; SSM, Soft Systems Methodology; UAV, Uninhabited Aerial Vehicle. Also cited: Checkland and Scholes, Soft Systems Methodology in Action, John Wiley & Sons, Chichester, 1990.

  13. Assessment of Environmental Enteropathy in the MAL-ED Cohort Study: Theoretical and Analytic Framework

    PubMed Central

    Kosek, Margaret; Guerrant, Richard L.; Kang, Gagandeep; Bhutta, Zulfiqar; Yori, Pablo Peñataro; Gratz, Jean; Gottlieb, Michael; Lang, Dennis; Lee, Gwenyth; Haque, Rashidul; Mason, Carl J.; Ahmed, Tahmeed; Lima, Aldo; Petri, William A.; Houpt, Eric; Olortegui, Maribel Paredes; Seidman, Jessica C.; Mduma, Estomih; Samie, Amidou; Babji, Sudhir

    2014-01-01

    Individuals in the developing world live in conditions of intense exposure to enteric pathogens due to suboptimal water and sanitation. These environmental conditions lead to alterations in intestinal structure, function, and local and systemic immune activation that are collectively referred to as environmental enteropathy (EE). This condition, although poorly defined, is likely to be exacerbated by undernutrition as well as being responsible for permanent growth deficits acquired in early childhood, vaccine failure, and loss of human potential. This article addresses the underlying theoretical and analytical frameworks informing the methodology proposed by the Etiology, Risk Factors and Interactions of Enteric Infections and Malnutrition and the Consequences for Child Health and Development (MAL-ED) cohort study to define and quantify the burden of disease caused by EE within a multisite cohort. Additionally, we will discuss efforts to improve, standardize, and harmonize laboratory practices within the MAL-ED Network. These efforts will address current limitations in the understanding of EE and its burden on children in the developing world. PMID:25305293

  14. Progress toward the development of an implantable sensor for glucose.

    PubMed

    Wilson, G S; Zhang, Y; Reach, G; Moatti-Sirat, D; Poitout, V; Thévenot, D R; Lemonnier, F; Klein, J C

    1992-09-01

    The development of an electrochemically based implantable sensor for glucose is described. The sensor is needle-shaped, about the size of a 28-gauge needle. It is flexible and must be implanted subcutaneously by using a 21-gauge catheter, which is then removed. When combined with a monitoring unit, this device, based on the glucose oxidase-catalyzed oxidation of glucose, reliably monitors glucose concentrations for as long as 10 days in rats. Various design considerations, including the decision to monitor the hydrogen peroxide produced in the enzymatic reaction, are discussed. Glucose constitutes the most important future target analyte for continuous monitoring, but the basic methodology developed for glucose could be applied to several other analytes such as lactate or ascorbate. The success in implementation of such a device depends on a reaction of the tissue surrounding the implant so as not to interfere with the proper functioning of the sensor. Histochemical evidence indicates that the tissue response leads to enhanced sensor performance.

  15. Development of an analytical scheme for simazine and 2,4-D in soil and water runoff from ornamental plant nursery plots.

    PubMed

    Sutherland, Devon J; Stearman, G Kim; Wells, Martha J M

    2003-01-01

    The transport and fate of pesticides applied to ornamental plant nursery crops are not well documented. Methodology for analysis of soil and water runoff samples concomitantly containing the herbicides simazine (1-chloro-4,6-bis(ethylamino)-s-triazine) and 2,4-D ((2,4-dichlorophenoxy)acetic acid) was developed in this research to investigate the potential for runoff and leaching from ornamental nursery plots. Solid-phase extraction was used prior to analysis by gas chromatography and liquid chromatography. Chromatographic results were compared with determination by enzyme-linked immunoassay analysis. The significant analytical contributions of this research include (1) the development of a scheme using chromatographic mode sequencing for the fractionation of simazine and 2,4-D, (2) optimization of the homogeneous derivatization of 2,4-D using the methylating agent boron trifluoride in methanol as an alternative to in situ generation of diazomethane, and (3) the practical application of these techniques to field samples.

  16. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed

    Lexchin, J; Holbrook, A

    1994-07-01

    To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). Analytic study. All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion.

  17. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed Central

    Lexchin, J; Holbrook, A

    1994-01-01

    OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560

  18. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications. PMID:28895928

  19. Multi-center evaluation of analytical performance of the Beckman Coulter AU5822 chemistry analyzer.

    PubMed

    Zimmerman, M K; Friesen, L R; Nice, A; Vollmer, P A; Dockery, E A; Rankin, J D; Zmuda, K; Wong, S H

    2015-09-01

    Our three academic institutions, Indiana University, Northwestern Memorial Hospital, and Wake Forest, were among the first in the United States to implement the Beckman Coulter AU5822 series chemistry analyzers. We undertook this post-hoc multi-center study by merging our data to determine performance characteristics and the impact of methodology changes on analyte measurement. We independently completed performance validation studies including precision, linearity/analytical measurement range, method comparison, and reference range verification. Complete data sets were available from at least one institution for 66 analytes with the following groups: 51 from all three institutions, and 15 from 1 or 2 institutions, for a total sample size of 12,064. Precision was similar among institutions. Coefficients of variation (CV) were <10% for 97% of analytes. Analytes with CVs >10% included direct bilirubin and digoxin. All analytes exhibited linearity over the analytical measurement range. Method comparison data showed slopes between 0.900-1.100 for 87.9% of the analytes. Slopes for amylase, tobramycin and urine amylase were <0.8; the slope for lipase was >1.5, due to known methodology or standardization differences. Consequently, reference ranges of amylase, urine amylase and lipase required only minor or no modification. The four AU5822 analyzers independently evaluated at three sites showed consistent precision, linearity, and correlation results. Since installation, the test results have been well received by clinicians from all three institutions. Copyright © 2015. Published by Elsevier Inc.
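    The precision and method-comparison statistics reported above can be computed as in the following minimal sketch with synthetic data. The CV and slope definitions are standard, but the regression model is an assumption on my part; the abstract does not state whether ordinary least squares or Deming regression was used (Deming is also common for analyzer comparisons).

```python
import numpy as np

rng = np.random.default_rng(0)

# Within-run precision: coefficient of variation (CV%) from repeated QC measurements
replicates = rng.normal(loc=100.0, scale=3.0, size=20)   # simulated replicate results
cv_percent = 100.0 * replicates.std(ddof=1) / replicates.mean()

# Method comparison: slope of candidate-analyzer results vs. comparator results
reference = rng.uniform(10, 200, size=40)                    # comparator method
candidate = 0.98 * reference + rng.normal(0, 2.0, size=40)   # simulated AU5822 results
slope, intercept = np.polyfit(reference, candidate, 1)       # ordinary least squares

print(f"CV = {cv_percent:.1f}%  slope = {slope:.3f}")
```

    Under acceptance rules like those in the abstract, this simulated analyte would pass both checks: CV below 10% and a comparison slope inside 0.900-1.100.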

  20. Dental and dental hygiene students' diagnostic accuracy in oral radiology: effect of diagnostic strategy and instructional method.

    PubMed

    Baghdady, Mariam T; Carnahan, Heather; Lam, Ernest W N; Woods, Nicole N

    2014-09-01

    There has been much debate surrounding diagnostic strategies and the most appropriate training models for novices in oral radiology. It has been argued that an analytic approach, using a step-by-step analysis of the radiographic features of an abnormality, is ideal. Alternative research suggests that novices can successfully employ non-analytic reasoning. Many of these studies do not take instructional methodology into account. This study evaluated the effectiveness of non-analytic and analytic strategies in radiographic interpretation and explored the relationship between instructional methodology and diagnostic strategy. Second-year dental and dental hygiene students were taught four radiographic abnormalities using basic science instructions or a step-by-step algorithm. The students were tested on diagnostic accuracy and memory immediately after learning and one week later. A total of seventy-three students completed both immediate and delayed sessions and were included in the analysis. Students were randomly divided into two instructional conditions: one group provided a diagnostic hypothesis for the image and then identified specific features to support it, while the other group first identified features and then provided a diagnosis. Participants in the diagnosis-first condition (non-analytic reasoning) had higher diagnostic accuracy than those in the features-first condition (analytic reasoning), regardless of their learning condition. No main effect of learning condition or interaction with diagnostic strategy was observed. Educators should be mindful of the potential influence of analytic and non-analytic approaches on the effectiveness of the instructional method.

  1. Synthetic training sets for the development of discriminant functions for the detection of volatile organic compounds from passive infrared remote sensing data.

    PubMed

    Wan, Boyong; Small, Gary W

    2011-01-21

    A novel synthetic data generation methodology is described for use in the development of pattern recognition classifiers that are employed for the automated detection of volatile organic compounds (VOCs) during infrared remote sensing measurements. The approach used is passive Fourier transform infrared spectrometry implemented in a downward-looking mode on an aircraft platform. A key issue in developing this methodology in practice is the need for example data that can be used to train the classifiers. To replace the time-consuming and costly collection of training data in the field, this work implements a strategy for taking laboratory analyte spectra and superimposing them on background spectra collected from the air. The resulting synthetic spectra can be used to train the classifiers. This methodology is tested by developing classifiers for ethanol and methanol, two prevalent VOCs in wide industrial use. The classifiers are successfully tested with data collected from the aircraft during controlled releases of ethanol and during a methanol release from an industrial facility. For both ethanol and methanol, missed detections in the aircraft data are in the range of 4 to 5%, with false positive detections ranging from 0.1 to 0.3%.
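    The synthetic training-set strategy described above (laboratory analyte spectra superimposed on measured background spectra, then used to train a classifier) can be sketched as below. Everything here is simulated placeholder data and a generic matched-filter-style classifier, not the paper's actual spectra or pattern-recognition method.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_bg = 200, 400

# Simulated stand-ins: a normalized laboratory analyte band, and "airborne"
# background spectra with noise plus slow baseline structure
wav = np.linspace(0.0, 1.0, n_channels)
analyte = np.exp(-((wav - 0.4) / 0.03) ** 2)
background = rng.normal(0.0, 0.05, (n_bg, n_channels))
background += np.sin(2.0 * np.pi * wav) * 0.1

# Build the synthetic training set: half plain backgrounds, half
# backgrounds with the analyte signature added at random strengths
strengths = rng.uniform(0.1, 0.5, n_bg // 2)
positives = background[: n_bg // 2] + strengths[:, None] * analyte
X = np.vstack([background[n_bg // 2 :], positives])
y = np.array([0] * (n_bg // 2) + [1] * (n_bg // 2))

# Simple classifier: project each spectrum onto the analyte signature and
# threshold the score midway between the class means
scores = X @ analyte
threshold = 0.5 * (scores[y == 0].mean() + scores[y == 1].mean())
accuracy = ((scores > threshold).astype(int) == y).mean()
print(f"training-set accuracy: {accuracy:.2f}")
```

    The appeal of the approach is exactly what this sketch mimics: the positive class never has to be released and measured in the field, because it is manufactured by superimposing known laboratory signatures on cheaply collected backgrounds.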

  2. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and a methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized by the solution but by the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.

  3. Online Communication Settings and the Qualitative Research Process: Acclimating Students and Novice Researchers.

    PubMed

    Gregory, Katherine

    2018-06-01

    In the last 20 years, qualitative research scholars have begun to interrogate methodological and analytic issues concerning online research settings as both data sources and instruments for digital methods. This article examines the adaptation of parts of a qualitative research curriculum for understanding online communication settings. I propose methodological best practices for researchers and educators that I developed while teaching research methods to undergraduate and graduate students across disciplinary departments and discuss obstacles faced during my own research while gathering data from online sources. This article confronts issues concerning the disembodied aspects of applying what in practice should be rooted in a humanistic inquiry. Furthermore, as some approaches to online qualitative research as a digital method grow increasingly problematic with the development of new data mining technologies, I will also briefly touch upon borderline ethical practices involving data-scraping-based qualitative research.

  4. A novel method for the determination of chemical purity and assay of menaquinone-7. Comparison with the methods from the official USP monograph.

    PubMed

    Jedynak, Łukasz; Jedynak, Maria; Kossykowska, Magdalena; Zagrodzka, Joanna

    2017-02-20

    An HPLC method with UV detection, using a C30 reversed-phase analytical column, was developed for the determination of the chemical purity and assay of menaquinone-7 (MK7) in one chromatographic run. The method is superior to the methods published in the USP Monograph in terms of selectivity, sensitivity and accuracy, as well as time, solvent and sample consumption. The developed methodology was applied to MK7 samples of active pharmaceutical ingredient (API) purity, MK7 samples of lower quality and crude MK7 samples before purification. The comparison of the results revealed that the use of the USP methodology could lead to serious overestimation (up to a few percent) of both the purity and the MK7 assay in menaquinone-7 samples. Copyright © 2016 Elsevier B.V. All rights reserved.
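
    For illustration, chemical purity from a single chromatographic run is commonly computed by area normalization of the integrated peaks; a minimal sketch of that convention (the peak names and areas below are invented, not values or formulas from the paper):

```python
# Hypothetical illustration: area-normalization purity from HPLC peak areas.
def area_percent_purity(peak_areas, main_peak="MK7"):
    """Return the main peak's share of the total integrated area, in percent."""
    total = sum(peak_areas.values())
    return 100.0 * peak_areas[main_peak] / total

# Made-up integrated areas for the main peak and two impurities:
areas = {"MK7": 9810.0, "imp_A": 95.0, "imp_B": 95.0}
print(round(area_percent_purity(areas), 2))  # prints 98.1
```

    An assay, by contrast, is usually computed against an external reference standard rather than by normalization, which is one reason the two figures can disagree between methods.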

  5. A comprehensive plan for helicopter drag reduction

    NASA Technical Reports Server (NTRS)

    Williams, R. M.; Montana, P. S.

    1975-01-01

    Current helicopters have parasite drag levels 6 to 10 times as great as fixed-wing aircraft. The commensurate poor cruise efficiency results in a substantial degradation of potential mission capability. The paper traces the origins of helicopter drag and shows that the problem (primarily due to bluff body flow separation) can be solved by the adoption of a comprehensive research and development plan. This plan, known as the Fuselage Design Methodology, comprises both nonaerodynamic and aerodynamic aspects. The aerodynamics are discussed in detail, and experimental and analytical programs are described which will lead to a solution of the bluff body problem. Some recent results of work conducted at the Naval Ship Research and Development Center (NSRDC) are presented to illustrate these programs. It is concluded that a 75 percent reduction of helicopter drag is possible by the full implementation of the Fuselage Design Methodology.

  6. Analytical methodology for sampling and analysing eight siloxanes and trimethylsilanol in biogas from different wastewater treatment plants in Europe.

    PubMed

    Raich-Montiu, J; Ribas-Font, C; de Arespacochaga, N; Roig-Torres, E; Broto-Puig, F; Crest, M; Bouchy, L; Cortina, J L

    2014-02-17

    Siloxanes and trimethylsilanol belong to a family of organic silicone compounds that are currently used extensively in industry. Those that are prone to volatilisation become minor compounds in biogas, adversely affecting energetic applications. However, no standard analytical methodologies are available for analysing biogas-based gaseous matrices. To this end, different sampling techniques (adsorbent tubes, impingers and Tedlar bags) were compared using two different configurations: sampling directly from the biogas source or from a 200 L Tedlar bag filled with biogas and homogenised. No significant differences were apparent between the two sampling configurations. The adsorbent tubes performed better than the Tedlar bags and impingers, particularly for quantifying low concentrations. A method for the speciation of silicon compounds in biogas was developed using gas chromatography coupled with mass spectrometry working in dual scan/single ion monitoring mode. The optimised conditions could separate and quantify eight siloxane compounds (L2, L3, L4, L5, D3, D4, D5 and D6) and trimethylsilanol within fourteen minutes. Biogas from five wastewater treatment plants located in Spain, France and England was sampled and analysed using the developed methodology. The siloxane concentrations in the biogas samples were influenced by the anaerobic digestion temperature, as well as the nature and composition of the sewage inlet. Siloxanes D4 and D5 were the most abundant, ranging in concentration from 1.5 to 10.1 and 10.8 to 124.0 mg Nm(-3), respectively, and exceeding the tolerance limit of most energy conversion systems. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Design Evolution and Methodology for Pumpkin Super-Pressure Balloons

    NASA Astrophysics Data System (ADS)

    Farley, Rodger

    The NASA Ultra Long Duration Balloon (ULDB) program has had many technical development issues discovered and solved along its road to success as a new vehicle. It has the promise of being a sub-satellite, a means to launch up to 2700 kg to 33.5 km altitude for 100 days from a comfortable mid-latitude launch point. Current high-lift long duration ballooning is accomplished out of Antarctica with zero-pressure balloons, which cannot cope with the rigors of diurnal cycles. The ULDB design is still evolving, the product of intense analytical effort, scaled testing, improved manufacturing, and engineering intuition. The past technical problems, in particular the s-cleft deformation, their solutions, future challenges, and the methodology of pumpkin balloon design will generally be described.

  8. Nuclear Forensics. Chapter 18

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, Klaus; Glaser, Alexander

    Whenever nuclear material is found out of regulatory control, questions on the origin of the material, on its intended use, and on hazards associated with the material need to be answered. Here, analytical and interpretational methodologies have been developed in order to exploit measurable material properties for gaining information on the history of the nuclear material. This area of research is referred to as nuclear forensic science or, in short, nuclear forensics. This chapter reviews the origins, types, and state-of-the-art of nuclear forensics; discusses the potential roles of nuclear forensics in supporting nuclear security; and examines what nuclear forensics can realistically achieve. Lastly, it also charts a path forward, pointing at potential applications of nuclear forensic methodologies in other areas.

  9. Nuclear Forensics

    DOE PAGES

    Glaser, Alexander; Mayer, Klaus

    2016-06-01

    Whenever nuclear material is found out of regulatory control, questions on the origin of the material, on its intended use, and on hazards associated with the material need to be answered. Analytical and interpretational methodologies have been developed in order to exploit measurable material properties for gaining information on the history of the nuclear material. This area of research is referred to as nuclear forensic science or, in short, nuclear forensics. This chapter reviews the origins, types, and state-of-the-art of nuclear forensics; discusses the potential roles of nuclear forensics in supporting nuclear security; and examines what nuclear forensics can realistically achieve. It also charts a path forward, pointing at potential applications of nuclear forensic methodologies in other areas.

  10. Technical Evaluation Report of the Aerospace Medical Panel Working Group WG-08 on Evaluation of Methods to Assess Workload.

    DTIC Science & Technology

    1980-11-01

    [OCR-damaged fragments of a table classifying workload-assessment methods (occlusion, single and multiple primary-task measures, math modeling) against evaluation criteria; the recoverable text includes: "... modeling methodology; and (4) validation of the analytic/predictive methodology in a system design, development, and test effort." and, from Chapter 9: "A central..."]

  11. Three essays on energy and environmental economics: Empirical, applied, and theoretical

    NASA Astrophysics Data System (ADS)

    Karney, Daniel Houghton

    Energy and environmental economics are closely related fields, as nearly all forms of energy production generate pollution and thus nearly all forms of environmental policy affect energy production and consumption. The three essays in this dissertation are related by their common themes of energy and environmental economics, but they differ in their methodologies. The first chapter is an empirical exercise that examines the relationship between electricity price deregulation and maintenance outages at nuclear power plants. The second chapter is an applied theory paper that investigates environmental regulation in a multiple-pollutants setting. The third chapter develops a new methodology for the construction of analytical general equilibrium models that can be used to study topics in energy and environmental economics.

  12. Review of calcium methodologies.

    PubMed

    Zak, B; Epstein, E; Baginski, E S

    1975-01-01

    A review of calcium methodologies for serum is presented. The analytical systems developed over the past century are classified by type, beginning with gravimetry and extending to isotope dilution-mass spectrometry, covering all of the commonly used techniques that have evolved during that period. Screening and referee procedures are discussed, along with the comparative sensitivities of atomic absorption spectrophotometry and molecular absorption spectrophotometry. A procedure involving a simple direct reaction for serum calcium using cresolphthalein complexone is recommended, in which high blanks are minimized by repressing the ionization of the color reagent through lowering the dielectric constant of the mixture with dimethylsulfoxide. Reaction characteristics, errors that can be encountered, normal ranges and an interpretative resume are included in its discussion.

  13. HiMAT structural development design methodology. [aeroelastic tailoring of the canard and wing box and distributed load tests

    NASA Technical Reports Server (NTRS)

    Price, M. A.

    1979-01-01

    In order to improve aerodynamic performance, a twist criterion was used to design the canard and wing lifting surfaces of two graphite-epoxy research aircraft. To meet that twist criterion, the lifting surfaces were tailored using graphite-epoxy tape. The outer surface of the aircraft is constructed essentially of 95 percent graphite-epoxy materials. The analytical tools and methodology used to design those lifting surfaces are described. One aircraft was subjected to an 8g ground test in order to verify structural integrity and to determine how well the desired twist was achieved. Test results are presented, and the data reduction of both flight and ground-test strain gages and their associated stresses is discussed.

  14. Advances in bioanalytical techniques to measure steroid hormones in serum.

    PubMed

    French, Deborah

    2016-06-01

    Steroid hormones are measured clinically to determine whether a patient has a pathological process occurring in the adrenal gland or other hormone-responsive organs. They are very similar in structure, making them analytically challenging to measure. Additionally, these hormones have vast concentration differences in human serum, adding to the measurement complexity. GC-MS was the gold-standard methodology used to measure steroid hormones clinically, followed by radioimmunoassay, but that was replaced by immunoassay due to ease of use. LC-MS/MS has now become a popular alternative owing to simpler sample preparation than GC-MS requires and greater specificity and sensitivity than immunoassay. This review will discuss these methodologies and some new developments that could simplify and improve steroid hormone analysis in serum.

  15. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  16. Quality by Design in the development of hydrophilic interaction liquid chromatography method with gradient elution for the analysis of olanzapine.

    PubMed

    Tumpa, Anja; Stajić, Ana; Jančić-Stojanović, Biljana; Medenica, Mirjana

    2017-02-05

    This paper deals with the development of a hydrophilic interaction liquid chromatography (HILIC) method with gradient elution, in accordance with Analytical Quality by Design (AQbD) methodology, for the first time. The method is developed for olanzapine and its seven related substances. Following the AQbD methodology step by step, the temperature, starting content of the aqueous phase and duration of the linear gradient are first recognized as critical process parameters (CPPs), and the separation criterion S of critical pairs of substances is investigated as the critical quality attribute (CQA). A Rechtschaffen design is used for the creation of models that describe the dependence between the CPPs and CQAs. The design space obtained is used for choosing the optimal conditions (set point). Finally, the method is fully validated to verify the adequacy of the chosen optimal conditions and applied to real samples. Copyright © 2016 Elsevier B.V. All rights reserved.
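
    The CPP-to-CQA modelling step described above can be illustrated with a generic quadratic response-surface fit; this is a hedged sketch only (the paper uses a Rechtschaffen design and its own model terms, and every number below is invented):

```python
import numpy as np

# Hedged sketch: fit a quadratic response-surface model linking two CPPs
# (e.g., temperature and gradient duration) to a CQA (a separation criterion)
# by ordinary least squares.
def fit_quadratic_surface(x1, x2, y):
    """Return coefficients of y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Invented experimental points (temperature in C, gradient time in min):
x1 = np.array([25.0, 25.0, 35.0, 35.0, 30.0, 30.0, 30.0])
x2 = np.array([10.0, 20.0, 10.0, 20.0, 15.0, 15.0, 15.0])
y = 1.2 + 0.05 * x1 - 0.02 * x2 + 0.001 * x1 * x2  # synthetic response
print(fit_quadratic_surface(x1, x2, y).round(3))
```

    In an AQbD workflow the fitted model is then evaluated over a grid of CPP values; the region where all CQAs meet their acceptance limits with the required probability constitutes the design space from which the set point is chosen.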

  17. Examination of the impact of animal and dairy science journals based on traditional and newly developed bibliometric indices.

    PubMed

    Malesios, C; Abas, Z

    2012-12-01

    Using traditional bibliometric indices such as the well-known journal impact factor (IFAC), as well as other more recently developed measures like the (journal) h-index and its modifications, we assessed the impact of the most prolific scientific journals in the field of animal and dairy science. To this end, we performed a detailed investigation of journal quality, using a total of 50 journals selected from the category "Agriculture, Dairy & Animal Science" of the Thomson Reuters (formerly Institute for Scientific Information, ISI) Web of Science. Our analysis showed that among the top journals in the field are the Journal of Dairy Research, the Journal of Dairy Science, and the Journal of Animal Science. In particular, the Journal of Animal Science, the most productive and frequently cited journal, has shown rapid development, especially in recent years. The majority of the top-tier, highly cited articles are those associated with the description of statistical methodology and standard chemical analytical methodologies.
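
    The journal h-index mentioned above can be sketched as follows, assuming the standard Hirsch definition (the largest n such that n items each have at least n citations); the citation counts are invented:

```python
# Sketch of the h-index: sort citation counts descending and find the
# largest rank i whose count is still >= i.
def h_index(citations):
    cited = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cited, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # prints 4: four items have >= 4 citations
print(h_index([25, 8, 5, 3, 3]))  # prints 3: only three items have >= 3
```

    The two examples show why the h-index penalizes skewed citation profiles: a single heavily cited item cannot raise h on its own.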

  18. ON-SITE SOLID PHASE EXTRACTION AND LABORATORY ...

    EPA Pesticide Factsheets

    Fragrance materials, such as synthetic musks in aqueous samples, are normally analyzed by GC/MS in the selected ion monitoring (SIM) mode to provide maximum sensitivity after liquid-liquid extraction of 1-L samples. A 1-L sample, however, usually provides too little analyte for full-scan data acquisition. An on-site extraction method for extracting synthetic musks from 60 L of wastewater effluent has been developed. Such a large sample volume permits high-quality, full-scan mass spectra to be obtained for various synthetic musk compounds. Quantification of these compounds was conveniently achieved from the full-scan data directly, without preparing SIM descriptors for each compound to acquire SIM data. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-sol

  19. IN SITU SOLID-PHASE EXTRACTION AND ANALYSIS OF ...

    EPA Pesticide Factsheets

    Fragrance materials, such as synthetic musks in aqueous samples, are normally analyzed by GC/MS in the selected ion monitoring (SIM) mode to provide maximum sensitivity after liquid-liquid extraction of 1-L samples. A 1-L sample, however, usually provides too little analyte for full-scan data acquisition. We have developed an on-site extraction method for extracting synthetic musks from 60 L of wastewater effluent. Such a large sample volume permits high-quality, full-scan mass spectra to be obtained for various synthetic musk compounds. Quantification of these compounds was conveniently achieved from the full-scan data directly, without preparing SIM descriptors for each compound to acquire SIM data. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-s

  20. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  1. Helicase-dependent isothermal amplification: a novel tool in the development of molecular-based analytical systems for rapid pathogen detection.

    PubMed

    Barreda-García, Susana; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2018-01-01

    Highly sensitive testing of nucleic acids is essential to improve the detection of pathogens, which pose a major threat for public health worldwide. Currently available molecular assays, mainly based on PCR, have a limited utility in point-of-need control or resource-limited settings. Consequently, there is a strong interest in developing cost-effective, robust, and portable platforms for early detection of these harmful microorganisms. Since its description in 2004, isothermal helicase-dependent amplification (HDA) has been successfully applied in the development of novel molecular-based technologies for rapid, sensitive, and selective detection of viruses and bacteria. In this review, we highlight relevant analytical systems using this simple nucleic acid amplification methodology that takes place at a constant temperature and that is readily compatible with microfluidic technologies. Different strategies for monitoring HDA amplification products are described. In addition, we present technological advances for integrating sample preparation, HDA amplification, and detection. Future perspectives and challenges toward point-of-need use not only for clinical diagnosis but also in food safety testing and environmental monitoring are also discussed. Graphical Abstract Expanding the analytical toolbox for the detection of DNA sequences specific of pathogens with isothermal helicase dependent amplification (HDA).

  2. Determination of Carbonyl Compounds in Cork Agglomerates by GDME-HPLC-UV: Identification of the Extracted Compounds by HPLC-MS/MS.

    PubMed

    Brandão, Pedro Francisco; Ramos, Rui Miguel; Almeida, Paulo Joaquim; Rodrigues, José António

    2017-02-08

    A new approach is proposed for the extraction and determination of carbonyl compounds in solid samples, such as wood or cork materials. Cork products are used as building materials due to their singular characteristics; however, little is known about their aldehyde emission potential and content. Sample preparation was done by using a gas-diffusion microextraction (GDME) device for the direct extraction of volatile aldehydes and derivatization with 2,4-dinitrophenylhydrazine. Analytical determination of the extracts was done by HPLC-UV, with detection at 360 nm. The developed methodology proved to be a reliable tool for aldehyde determination in cork agglomerate samples, with suitable method features. Mass spectrometry studies were performed for each sample, which enabled the identification in the extracts of the derivatization products of a total of 13 aldehydes (formaldehyde, acetaldehyde, furfural, propanal, 5-methylfurfural, butanal, benzaldehyde, pentanal, hexanal, trans-2-heptenal, heptanal, octanal, and trans-2-nonenal) and 4 ketones (3-hydroxy-2-butanone, acetone, cyclohexanone, and acetophenone). This new analytical methodology proved to be at once consistent for the identification and determination of aldehydes in cork agglomerates and a very simple and straightforward procedure.

  3. Simultaneous stable carbon isotopic analysis of wine glycerol and ethanol by liquid chromatography coupled to isotope ratio mass spectrometry.

    PubMed

    Cabañero, Ana I; Recio, Jose L; Rupérez, Mercedes

    2010-01-27

    A novel procedure was established for the simultaneous characterization of the (13)C/(12)C isotope ratios of wine glycerol and ethanol, using liquid chromatography/isotope ratio mass spectrometry (LC-IRMS). Several parameters influencing the separation of glycerol and ethanol from the wine matrix were optimized. Results obtained for 35 Spanish samples showed no significant differences and very strong correlations (r = 0.99) between the glycerol (13)C/(12)C ratios obtained by an alternative method (gas chromatography/isotope ratio mass spectrometry) and the proposed new methodology, and between the ethanol (13)C/(12)C ratios obtained by the official method (elemental analyzer/isotope ratio mass spectrometry) and the proposed new methodology. The accuracy of the proposed method varied from 0.01 to 0.19 per thousand, and the analytical precision was better than 0.25 per thousand. The newly developed LC-IRMS method is the first isotopic method that allows (13)C/(12)C determination of both analytes in the same run directly from a liquid sample, with no previous glycerol or ethanol isolation, overcoming the technical difficulties associated with complex sample treatment and improving simplicity and speed.
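
    The per-thousand figures above refer to the standard per-mil delta notation for carbon isotope ratios, delta13C = (R_sample / R_standard - 1) x 1000 with R = 13C/12C; a sketch of that convention (the VPDB reference ratio below is a commonly cited value, not one taken from the paper):

```python
# Standard per-mil delta notation for carbon isotope ratios (a convention
# sketch, not code from the paper).
VPDB_R = 0.0111802  # commonly cited 13C/12C ratio of the VPDB reference standard

def delta13C_permil(r_sample, r_standard=VPDB_R):
    """delta13C in per mil relative to the reference standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Example: a sample whose ratio is 0.3% below the standard reads about -3 per mil.
print(round(delta13C_permil(VPDB_R * 0.997), 3))  # prints -3.0
```

    Accuracies of 0.01 to 0.19 per thousand therefore correspond to resolving relative changes in the raw 13C/12C ratio on the order of 1 part in 100,000.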

  4. HS-SPME determination of volatile carbonyl and carboxylic compounds in different matrices.

    PubMed

    Stashenko, Elena E; Mora, Amanda L; Cervantes, Martha E; Martínez, Jairo R

    2006-07-01

    Specific chromatographic methodologies are developed for the analysis of carboxylic acids (C(2)-C(6), benzoic) and aldehydes (C(2)-C(10)) of low molecular weight in diverse matrices, such as air, automotive exhaust gases, human breath, and aqueous matrices. For carboxylic acids, the method is based on their reaction with pentafluorobenzyl bromide in aqueous solution, followed by the separation and identification of the resultant pentafluorobenzyl esters by means of headspace (HS)-solid-phase microextraction (SPME) combined with gas chromatography (GC) and electron capture detection (ECD). Detection limits in the microg/m(3) range are reached, with relative standard deviation (RSD) less than 10% and linear response (R(2) > 0.99) over two orders of magnitude. The analytical methodology for aldehydes is based on SPME with simultaneous derivatization of the analytes on the fiber, by reaction with pentafluorophenylhydrazine. The derivatization reagent is previously deposited on the SPME fiber, which is then exposed to the gaseous matrix or the HS of the sample solution. The pentafluorophenyl hydrazones formed on the fiber are analyzed selectively by means of GC-ECD, with detection limits in the ng/m(3) range, RSD less than 10%, and linear response (R(2) > 0.99) over two orders of magnitude.

  5. Mycotoxin Analysis of Human Urine by LC-MS/MS: A Comparative Extraction Study

    PubMed Central

    Escrivá, Laura; Font, Guillermina

    2017-01-01

    The lower mycotoxin levels detected in urine make the development of sensitive and accurate analytical methods essential. Three extraction methods, namely salting-out liquid–liquid extraction (SALLE), miniQuEChERS (quick, easy, cheap, effective, rugged, and safe), and dispersive liquid–liquid microextraction (DLLME), were evaluated and compared based on analytical parameters for the quantitative LC-MS/MS measurement of 11 mycotoxins (AFB1, AFB2, AFG1, AFG2, OTA, ZEA, BEA, EN A, EN B, EN A1 and EN B1) in human urine. DLLME was selected as the most appropriate methodology, as it produced better validation results for recovery (79–113%), reproducibility (RSDs < 12%), and repeatability (RSDs < 15%) than miniQuEChERS (71–109%, RSDs < 14% and < 24%, respectively) and SALLE (70–108%, RSDs < 14% and < 24%, respectively). Moreover, the lowest detection (LODs) and quantitation limits (LOQs) were achieved with DLLME (LODs: 0.005–2 μg L−1, LOQs: 0.1–4 μg L−1). DLLME methodology was used for the analysis of 10 real urine samples from healthy volunteers, showing the presence of ENs B, B1 and A1 at low concentrations. PMID:29048356
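
    As a hedged sketch of the arithmetic behind recovery and RSD figures like those quoted above (all replicate values below are invented, not the study's data):

```python
import statistics

# Recovery compares the measured concentration in a spiked sample to the
# known spike level; RSD is the sample standard deviation as a percent of
# the mean across replicates.
def recovery_percent(measured, spiked):
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

replicates = [9.8, 10.4, 10.1]  # invented replicate results for a 10.0 ug/L spike
print(round(recovery_percent(statistics.mean(replicates), 10.0), 1))  # prints 101.0
print(round(rsd_percent(replicates), 1))  # prints 3.0
```

    Reproducibility and repeatability differ only in what varies between replicates (different days/analysts versus the same run), so both reduce to the same RSD computation.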

  6. Ultra-high-performance liquid chromatography-Time-of-flight high resolution mass spectrometry to quantify acidic drugs in wastewater.

    PubMed

    Becerra-Herrera, Mercedes; Honda, Luis; Richter, Pablo

    2015-12-04

    A novel analytical approach involving an improved rotating-disk sorptive extraction (RDSE) procedure and ultra-high-performance liquid chromatography (UHPLC) coupled to an ultraspray electrospray ionization source (UESI) and time-of-flight mass spectrometry (TOF/MS), in trap mode, was developed to identify and quantify four non-steroidal anti-inflammatory drugs (NSAIDs) (naproxen, ibuprofen, ketoprofen and diclofenac) and two anti-cholesterol drugs (ACDs) (clofibric acid and gemfibrozil) that are widely used and typically found in water samples. The method reduced the amount of both sample and reagents used and also the time required for the whole analysis, resulting in a reliable and green analytical strategy. The analytical eco-scale was calculated, showing that this methodology is an excellent green analysis, increasing its ecological worth. The detection limits (LOD) and precision (%RSD) were lower than 90 ng/L and 10%, respectively. Matrix effects and recoveries were studied using samples from the influent of a wastewater treatment plant (WWTP). All the compounds exhibited suppression of their signals due to matrix effects, and the recoveries were approximately 100%. The applicability and reliability of this methodology were confirmed through the analysis of influent and effluent samples from a WWTP in Santiago, Chile, obtaining concentrations ranging from 1.1 to 20.5 μg/L and from 0.5 to 8.6 μg/L, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Against Simplicity, against Ethics: Analytics of Disruption as Quasi-Methodology

    ERIC Educational Resources Information Center

    Childers, Sara M.

    2012-01-01

    Simplified understandings of qualitative inquiry as mere method overlook the complexity and nuance of qualitative practice. As is the call of this special issue, the author intervenes in the simplification of qualitative inquiry through a discussion of methodology to illustrate how theory and inquiry are inextricably linked and ethically…

  8. The Integration of Project-Based Methodology into Teaching in Machine Translation

    ERIC Educational Resources Information Center

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  9. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Treesearch

    Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  10. Centroid and Theoretical Rotation: Justification for Their Use in Q Methodology Research

    ERIC Educational Resources Information Center

    Ramlo, Sue

    2016-01-01

    This manuscript's purpose is to introduce Q as a methodology before providing clarification about the preferred factor analytical choices of centroid and theoretical (hand) rotation. Stephenson, the creator of Q, designated that only these choices allowed for scientific exploration of subjectivity while not violating assumptions associated with…

  11. The Nature of Educational Research

    ERIC Educational Resources Information Center

    Gillett, Simon G.

    2011-01-01

    The paper is in two parts. The first part of the paper is a critique of current methodology in educational research: scientific, critical and interpretive. The ontological and epistemological assumptions of those methodologies are described from the standpoint of John Searle's analytic philosophy. In the second part two research papers with…

  12. Analytical display design for flight tasks conducted under instrument meteorological conditions. [human factors engineering of pilot performance for display device design in instrument landing systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1976-01-01

    Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.

  13. Strategy to improve the quantitative LC-MS analysis of molecular ions resistant to gas-phase collision induced dissociation: application to disulfide-rich cyclic peptides.

    PubMed

    Ciccimaro, Eugene; Ranasinghe, Asoka; D'Arienzo, Celia; Xu, Carrie; Onorato, Joelle; Drexler, Dieter M; Josephs, Jonathan L; Poss, Michael; Olah, Timothy

    2014-12-02

    Due to observed collision induced dissociation (CID) fragmentation inefficiency, developing sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) assays for CID resistant compounds is especially challenging. As an alternative to traditional LC-MS/MS, we present here a methodology that preserves the intact analyte ion for quantification by selectively filtering ions while reducing chemical noise. Utilizing a quadrupole-Orbitrap MS, the target ion is selectively isolated while interfering matrix components undergo MS/MS fragmentation by CID, allowing noise-free detection of the analyte's surviving molecular ion. In this manner, CID affords additional selectivity during high resolution accurate mass analysis by elimination of isobaric interferences, a fundamentally different concept than the traditional approach of monitoring a target analyte's unique fragment following CID. This survivor-selected ion monitoring (survivor-SIM) approach has allowed sensitive and specific detection of disulfide-rich cyclic peptides extracted from plasma.

  14. Predicting and explaining inflammation in Crohn's disease patients using predictive analytics methods and electronic medical record data.

    PubMed

    Reddy, Bhargava K; Delen, Dursun; Agrawal, Rupesh K

    2018-01-01

    Crohn's disease is among the chronic inflammatory bowel diseases that impact the gastrointestinal tract. Understanding and predicting the severity of inflammation in real-time settings is critical to disease management. Extant literature has primarily focused on studies that are conducted in clinical trial settings to investigate the impact of a drug treatment on the remission status of the disease. This research proposes an analytics methodology where three different types of prediction models are developed to predict and to explain the severity of inflammation in patients diagnosed with Crohn's disease. The results show that machine-learning-based analytic methods such as gradient boosting machines can predict the inflammation severity with a very high accuracy (area under the curve = 92.82%), followed by regularized regression and logistic regression. According to the findings, a combination of baseline laboratory parameters, patient demographic characteristics, and disease location are among the strongest predictors of inflammation severity in Crohn's disease patients.
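    The gradient-boosting step described above can be sketched as follows. This is an illustration only: the study's EMR features and tuning are not public, so synthetic data stands in for the real cohort, and the feature names are hypothetical.

```python
# Hedged sketch: synthetic stand-ins for baseline labs, demographics,
# and disease location; a gradient boosting machine scored by AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"test AUC = {auc:.3f}")
```

    The same pattern (fit on a training split, report area under the ROC curve on a held-out split) applies to the regularized and logistic regression baselines the abstract mentions.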

  15. Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization

    NASA Astrophysics Data System (ADS)

    Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija

    2017-07-01

    Recently, spectral imaging techniques such as Multispectral (MSI) and Hyperspectral Imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This will provide a better understanding of a technique that can provide a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models that served as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm using a mirror scanning setup and halogen illumination. Data were acquired at different measurement conditions and acquisition parameters. Preliminary results demonstrated the potential of the evaluation methodology, indicating that measurement parameters such as the use of different lenses and different scanning backgrounds may not greatly influence the quantitative results. Moreover, the evaluation methodology allowed for the selection of the best pre-treatment method to be applied to the data.

  16. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    PubMed

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
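    The probability-based design-space idea (response surface plus Monte Carlo) can be sketched as below. The quadratic response surface and the disturbance magnitudes are invented for illustration; the paper's fitted model and acceptance criteria differ.

```python
# Hedged illustration: propagate assumed set-point disturbances through a
# hypothetical response surface for a critical resolution Rs, and estimate
# the probability that the Rs >= 1.5 quality requirement is met.
import numpy as np

rng = np.random.default_rng(1)

def resolution(conc_mM, pH):
    # Invented quadratic response surface (not the paper's fitted model).
    return (1.2 + 0.009 * (conc_mM - 100) + 0.8 * (pH - 2.5)
            - 0.004 * (pH - 2.5) * (conc_mM - 100) - 0.5 * (pH - 2.74) ** 2)

def prob_rs_ok(conc_mM, pH, n=20_000):
    c = rng.normal(conc_mM, 2.0, n)   # assumed buffer-preparation variability
    p = rng.normal(pH, 0.05, n)       # assumed pH-adjustment variability
    return float(np.mean(resolution(c, p) >= 1.5))

p_ok = prob_rs_ok(138, 2.74)
print(f"P(Rs >= 1.5) at 138 mM, pH 2.74: {p_ok:.2f}")
```

    Mapping such probabilities over a grid of (concentration, pH) set points is what turns a sweet-spot plot into a probability-based design space.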

  17. Performance enhancement of Pt/TiO2/Si UV-photodetector by optimizing light trapping capability and interdigitated electrodes geometry

    NASA Astrophysics Data System (ADS)

    Bencherif, H.; Djeffal, F.; Ferhati, H.

    2016-09-01

    This paper presents a hybrid approach based on an analytical and metaheuristic investigation to study the impact of interdigitated electrode engineering on both the speed and the optical performance of an Interdigitated Metal-Semiconductor-Metal Ultraviolet Photodetector (IMSM-UV-PD). In this context, analytical models for the speed and optical performance have been developed and validated against experimental results, where good agreement has been recorded. Moreover, the developed analytical models have been used as objective functions to determine the optimized design parameters, including the interdigit configuration effect, via a Multi-Objective Genetic Algorithm (MOGA). The ultimate goal of the proposed hybrid approach is to identify the optimal design parameters associated with maximum electrical and optical device performance. The optimized IMSM-PD not only reveals superior performance in terms of photocurrent and response time, but also exhibits higher optical reliability against the optical losses due to active-area shadowing effects. The advantages offered by the proposed design methodology suggest the possibility of overcoming the most challenging problem of UV optical interconnects, their communication speed and power requirements: achieving high drive current and fast commutation speed in the UV receiver.

  18. Model Analytical Development for Physical, Chemical, and Biological Characterization of Momordica charantia Vegetable Drug

    PubMed Central

    Guimarães, Geovani Pereira; Santos, Ravely Lucena; Júnior, Fernando José de Lima Ramos; da Silva, Karla Monik Alves; de Souza, Fabio Santos

    2016-01-01

    Momordica charantia is a species cultivated throughout the world and widely used in folk medicine, and its medicinal benefits are well documented, especially its pharmacological properties, including antimicrobial activities. Analytical methods have been used to aid in the characterization of compounds derived from plant drug extracts and their products. This paper developed a methodological model to evaluate the integrity of the vegetable drug M. charantia in different particle sizes, using different analytical methods. M. charantia was collected in the semiarid region of Paraíba, Brazil. The herbal medicine raw material derived from the leaves and fruits in different particle sizes was analyzed using thermoanalytical techniques such as thermogravimetry (TG) and differential thermal analysis (DTA), pyrolysis coupled to gas chromatography/mass spectrometry (PYR-GC/MS), and nuclear magnetic resonance (1H NMR), in addition to the determination of antimicrobial activity. Differences in particle surface area among the samples were distinguished by these techniques. DTA and TG were used for assessing thermal and kinetic parameters, and PYR-GC/MS was used for chromatographic identification of degradation products through the pyrograms. The infusions obtained from the fruit and leaves of Momordica charantia showed antimicrobial activity. PMID:27579215

  19. Development of Sample Handling and Analytical Expertise For the Stardust Comet Sample Return

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, J; Bajt, S; Brennan, S

    NASA's Stardust mission returned to Earth in January 2006 with "fresh" cometary particles from a young Jupiter-family comet. The cometary particles were sampled during the spacecraft flyby of comet 81P/Wild 2 in January 2004, when they impacted low-density silica aerogel tiles and aluminum foils on the sample tray assembly at approximately 6.1 km/s. This LDRD project has developed extraction and sample recovery methodologies to maximize the scientific information that can be obtained from the analysis of natural and man-made nano-materials of relevance to the LLNL programs.

  20. Implications of direct protective factors for public health research and prevention strategies to reduce youth violence.

    PubMed

    Hall, Jeffrey E; Simon, Thomas R; Lee, Rosalyn D; Mercy, James A

    2012-08-01

    The development of work on direct protective factors for youth violence has been delayed by conceptual and methodologic problems that have constrained the design, execution, and interpretation of prevention research. These problems are described in detail and actively addressed in review and analytic papers developed by the CDC's Expert Panel on Protective Factors for youth violence. The present paper synthesizes findings from these papers, specifies their implications for public health research and prevention strategies to reduce youth violence, and suggests directions for future research. Published by Elsevier Inc.

  1. Analysis of potential trade-offs in regulation of disinfection by-products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cromwell, J.E.; Zhang, X.; Regli, S.

    1992-11-01

    Executive Order 12291 requires the preparation of a Regulatory Impact Analysis (RIA) on all new major federal regulations. The goal of an RIA is to develop and organize information on benefits, costs, and economic impacts so as to clarify trade-offs among alternative regulatory options. This paper outlines explicit methodology for assessing the technical potential for risk-risk tradeoffs. The strategies used to cope with complexities and uncertainties in developing the Disinfection By-Products Regulatory Analysis Model are explained. Results are presented and discussed in light of uncertainties, and in light of the analytical requirements for regulatory impact analysis.

  2. Mixture modeling methods for the assessment of normal and abnormal personality, part II: longitudinal models.

    PubMed

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.
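    A growth mixture model can be approximated in a simplified two-stage form: fit a growth curve per person, then cluster the growth parameters. True GMMs estimate both stages jointly (e.g., in Mplus or R's lcmm); the numpy/scikit-learn sketch below, with simulated data and an assumed two-class structure, only illustrates the idea.

```python
# Two-stage stand-in for a growth mixture model on simulated repeated measures.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
waves = np.arange(4)                       # four measurement occasions

# Simulate two latent trajectory classes: stable-high vs. increasing.
stable = 4.0 + 0.05 * waves + rng.normal(0, 0.3, (60, 4))
rising = 2.0 + 0.60 * waves + rng.normal(0, 0.3, (40, 4))
scores = np.vstack([stable, rising])

# Stage 1: per-person OLS growth parameters (intercept, slope).
design = np.column_stack([np.ones_like(waves), waves]).astype(float)
coefs, *_ = np.linalg.lstsq(design, scores.T, rcond=None)
growth = coefs.T                           # shape (n_people, 2)

# Stage 2: a mixture over the growth parameters recovers trajectory classes.
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(growth)
print(np.bincount(labels))
```

    With well-separated classes, the mixture recovers the simulated 60/40 split; in real data, class enumeration (how many trajectories) is itself a modeling decision.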

  3. Graph Analytics for Signature Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh

    2013-06-01

    Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in cybersecurity and communication domains. Within cybersecurity we aim to find signatures for perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.

  4. Making Mass Spectrometry See the Light: The Promises and Challenges of Cryogenic Infrared Ion Spectroscopy as a Bioanalytical Technique

    PubMed Central

    Cismesia, Adam P.; Bailey, Laura S.; Bell, Matthew R.; Tesler, Larry F.; Polfer, Nicolas C.

    2016-01-01

    The detailed chemical information contained in the vibrational spectrum of a cryogenically cooled analyte would, in principle, make infrared (IR) ion spectroscopy a gold standard technique for molecular identification in mass spectrometry. Despite this immense potential, there are considerable challenges in both instrumentation and methodology to overcome before the technique is analytically useful. Here, we discuss the promise of IR ion spectroscopy for small molecule analysis in the context of metabolite identification. Experimental strategies to address sensitivity constraints, poor overall duty cycle, and speed of the experiment are intimately tied to the development of a mass-selective cryogenic trap. Therefore, the most likely avenues for success, in the authors' opinion, are presented here, alongside alternative approaches and some thoughts on data interpretation. PMID:26975370

  5. Social Cognitive Predictors of College Students' Academic Performance and Persistence: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.

    2008-01-01

    This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…

  6. Operational Environmental Assessment

    DTIC Science & Technology

    1988-09-01


  7. Analytical Chemistry Division. Annual progress report for period ending December 31, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    1981-05-01

    This report is divided into: analytical methodology; mass and emission spectrometry; technical support; bio/organic analysis; nuclear and radiochemical analysis; quality assurance, safety, and tabulation of analyses; supplementary activities; and presentation of research results. Separate abstracts were prepared for the technical support, bio/organic analysis, and nuclear and radiochemical analysis. (DLC)

  8. A Multidimensional Reappraisal of Language in Autism: Insights from a Discourse Analytic Study

    ERIC Educational Resources Information Center

    Sterponi, Laura; de Kirby, Kenton

    2016-01-01

    In this article, we leverage theoretical insights and methodological guidelines of discourse analytic scholarship to re-examine language phenomena typically associated with autism. Through empirical analysis of the verbal behavior of three children with autism, we engage the question of how prototypical features of autistic language--notably…

  9. An introduction to joint research by the USEPA and USGS on contaminants of emerging concern in source and treated drinking waters of the United States

    EPA Science Inventory

    Improvements in analytical methodology have allowed low-level detection of an ever increasing number of pharmaceuticals, personal care products, hormones, pathogens and other contaminants of emerging concern (CECs). The use of these improved analytical tools has allowed researche...

  10. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  11. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper shows a method to assess the vulnerability of coasts to risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping follows multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at Monika and Sablette beaches. This technical approach relies on the efficiency of Geographic Information System tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
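    The FAHP weighting step can be sketched with Buckley's fuzzy-geometric-mean method on triangular fuzzy numbers (l, m, u). The 3x3 pairwise-comparison matrix and the three criteria below are invented for illustration; the paper's seven criteria and expert judgments differ.

```python
# Buckley's FAHP: fuzzy geometric means -> fuzzy weights -> centroid
# defuzzification -> normalized crisp criteria weights.
import numpy as np

# Invented fuzzy pairwise comparisons for three hypothetical criteria,
# e.g. sea-level rise vs. wave height vs. coastal erosion.
M = np.array([
    [[1, 1, 1],       [2, 3, 4],       [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [2, 3, 4]],
    [[1/6, 1/5, 1/4], [1/4, 1/3, 1/2], [1, 1, 1]],
])

# Fuzzy geometric mean of each row (component-wise over l, m, u).
g = np.prod(M, axis=1) ** (1.0 / M.shape[0])

# Fuzzy weights: g_i divided by the fuzzy total; note the reversed l/u
# order when dividing by a fuzzy number.
total = g.sum(axis=0)            # (sum_l, sum_m, sum_u)
w_fuzzy = g / total[::-1]        # l / sum_u, m / sum_m, u / sum_l

# Defuzzify by the centroid (mean of l, m, u) and normalize.
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print(np.round(w, 3))
```

    The crisp weights then multiply the rasterized factor layers in the GIS overlay to produce the vulnerability map.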

  12. Analysis of thin-walled cylindrical composite shell structures subject to axial and bending loads: Concept development, analytical modeling and experimental verification

    NASA Astrophysics Data System (ADS)

    Mahadev, Sthanu

    Continued research and development efforts devoted in recent years have generated novel avenues towards the advancement of efficient and effective, slender laminated fiber-reinforced composite members. Numerous studies have focused on the modeling and response characterization of composite structures with particular relevance to thin-walled cylindrical composite shells. This class of shell configurations is being actively explored to fully determine their mechanical efficacy as primary aerospace structural members. The proposed research is targeted towards formulating a composite shell theory based prognosis methodology that entails an elaborate analysis and investigation of thin-walled cylindrical shell type laminated composite configurations that are highly desirable in increasing number of mechanical and aerospace applications. The prime motivation to adopt this theory arises from its superior ability to generate simple yet viable closed-form analytical solution procedure to numerous geometrically intense, inherent curvature possessing composite structures. This analytical evaluative routine offers to acquire a first-hand insight on the primary mechanical characteristics that essentially govern the behavior of slender composite shells under typical static loading conditions. Current work exposes the robustness of this mathematical framework via demonstrating its potential towards the prediction of structural properties such as axial stiffness and bending stiffness respectively. Longitudinal ply-stress computations are investigated upon deriving the global stiffness matrix model for composite cylindrical tubes with circular cross-sections. Additionally, this work employs a finite element based numerical technique to substantiate the analytical results reported for cylindrically shaped circular composite tubes. 
Furthermore, this concept development is extended to the study of thin-walled, open cross-sectioned, curved laminated shells that are geometrically distinguished with respect to the circumferential arc angle, thickness-to-mean-radius ratio and total laminate thickness. The potential of this methodology is challenged to analytically determine the location of the centroid. This precise location dictates the decoupling of extension-bending deformational response in tension-loaded composite structures. Upon cross-validation of the centroidal point through the implementation of an ANSYS-based finite element routine, the influence of the centroid is analytically examined under the application of concentrated longitudinal tension and bending loadings on a series of cylindrical shells characterized by three different symmetric-balanced stacking sequences. In-plane ply-stresses are computed and analyzed across the circumferential contour. An experimental investigation has been incorporated by designing an ad-hoc apparatus and test set-up that accommodates the quantification of in-plane strains and the computation of ply-stresses, and addresses the physical characteristics of a set of autoclave-fabricated cylindrical shell articles. Consequently, this work is shown to capture the essential mechanical aspects of cylindrical shells, thus helping structural engineers design and manufacture viable structures.
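    The kind of laminate stiffness build-up described above can be sketched with classical lamination theory. The ply constants (carbon/epoxy-like) and the [0/90]s stack below are assumed for illustration; the thesis's tube geometries and layups differ.

```python
# Classical lamination theory: rotate reduced ply stiffnesses, sum the
# in-plane A matrix, and extract an effective axial modulus.
import numpy as np

E1, E2, G12, v12 = 140e9, 10e9, 5e9, 0.3     # Pa, assumed ply constants
t_ply, layup = 0.125e-3, [0, 90, 90, 0]      # m, symmetric cross-ply

v21 = v12 * E2 / E1
d = 1 - v12 * v21
Q11, Q22, Q12, Q66 = E1 / d, E2 / d, v12 * E2 / d, G12

def Qbar(theta_deg):
    # Transform the reduced stiffness into the laminate axes.
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    q11 = Q11*c**4 + 2*(Q12 + 2*Q66)*c**2*s**2 + Q22*s**4
    q22 = Q11*s**4 + 2*(Q12 + 2*Q66)*c**2*s**2 + Q22*c**4
    q12 = (Q11 + Q22 - 4*Q66)*c**2*s**2 + Q12*(c**4 + s**4)
    q16 = (Q11 - Q12 - 2*Q66)*c**3*s + (Q12 - Q22 + 2*Q66)*c*s**3
    q26 = (Q11 - Q12 - 2*Q66)*c*s**3 + (Q12 - Q22 + 2*Q66)*c**3*s
    q66 = (Q11 + Q22 - 2*Q12 - 2*Q66)*c**2*s**2 + Q66*(c**4 + s**4)
    return np.array([[q11, q12, q16], [q12, q22, q26], [q16, q26, q66]])

# In-plane (membrane) stiffness A; for this symmetric stack, B = 0, so
# extension and bending decouple, as the abstract discusses.
A = sum(Qbar(th) * t_ply for th in layup)
h = t_ply * len(layup)
Ex = (A[0, 0] - A[0, 1]**2 / A[1, 1]) / h    # effective axial modulus
print(f"Ex ~ {Ex/1e9:.1f} GPa")
```

    For a closed cylindrical tube, the same ply-level stiffnesses are integrated around the circumference instead of through a flat stack, but the transformation step is identical.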

  13. Recent advancements in nanoelectrodes and nanopipettes used in combined scanning electrochemical microscopy techniques.

    PubMed

    Kranz, Christine

    2014-01-21

    In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.

  14. Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.

    PubMed

    Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim

    2016-04-01

    Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well-suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validate FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite-materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method, and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF. It also demonstrates that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs and provides a distinctive groundwork able to sustain future refinements. This paper suggests important features that may be included to improve model realism.

  15. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    Determination of analytes at trace levels in complex samples (e.g. biological or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples and, because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), have been discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also, new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of procedure prior to chromatographic determination, are presented. © 2013 Published by Elsevier B.V.

  16. Optimal design of experiments applied to headspace solid phase microextraction for the quantification of vicinal diketones in beer through gas chromatography-mass spectrometric detection.

    PubMed

    Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C

    2015-08-05

    Vicinal diketones, namely diacetyl (DC) and pentanedione (PN), are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer, they are responsible for off-flavors (buttery flavor) and therefore their presence and quantification is of paramount importance to beer producers. Aiming at developing an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through experiments planned in an optimal way and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied for HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: use a CAR/PDMS fiber, 5 ml of sample in a 20 ml vial, 5 min of pre-incubation time followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using a matrix-matched calibration, in order to minimize matrix effects. The following key features were obtained: linearity (R(2) > 0.999, both for diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg L(-1) and 2.80 μg L(-1), and LOQ of 3.30 μg L(-1) and 10.01 μg L(-1), for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100% and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 μg L(-1). Copyright © 2015 Elsevier B.V. All rights reserved.
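    The LOD/LOQ figures above follow the standard ICH-style estimate from a calibration line: LOD = 3.3 σ/S and LOQ = 10 σ/S, with σ the residual standard deviation and S the slope. A minimal sketch, with invented concentration/response pairs rather than the paper's diacetyl data:

```python
# ICH-style LOD/LOQ from a matrix-matched calibration regression.
import numpy as np

conc = np.array([5.0, 10, 25, 50, 100, 200])        # ug/L, spiked in matrix
area = np.array([0.9, 2.1, 5.2, 10.3, 20.8, 41.5])  # normalized peak areas

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)            # n - 2 degrees of freedom for the fit

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"LOD ~ {lod:.2f} ug/L, LOQ ~ {loq:.2f} ug/L")
```

    Matrix-matched calibration means the standards are spiked into blank beer rather than solvent, so the slope already absorbs the matrix effect.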

  17. Conceptual and Analytical Considerations toward the Use of Patient-Reported Outcomes in Personalized Medicine.

    PubMed

    Alemayehu, Demissie; Cappelleri, Joseph C

    2012-07-01

    Patient-reported outcomes (PROs) can play an important role in personalized medicine. PROs can be viewed as an important fundamental tool to measure the extent of disease and the effect of treatment at the individual level, because they reflect the self-reported health state of the patient directly. However, their effective integration in personalized medicine requires addressing certain conceptual and methodological challenges, including instrument development and analytical issues. We evaluate methodological issues, such as multiple comparisons, missing data, and modeling approaches, associated with the analysis of data related to PROs and personalized medicine, to further our understanding of the role of PRO data in personalized medicine. There is a growing recognition of the role of PROs in medical research, but their potential use in customizing healthcare is not widely appreciated. Emerging insights into the genetic basis of PROs could potentially lead to new pathways that may improve patient care. Knowledge of the biologic pathways through which the various genetic predispositions propel people toward negative or away from positive health experiences may ultimately transform healthcare. Understanding and addressing the conceptual and methodological issues in PROs and personalized medicine are expected to enhance the emerging area of personalized medicine and to improve patient care. This article addresses relevant concerns that need to be considered for effective integration of PROs in personalized medicine, with particular reference to conceptual and analytical issues that routinely arise with personalized medicine and PRO data. Some of these issues, including multiplicity problems, handling of missing values, and modeling approaches, are common to both areas. It is hoped that this article will help to stimulate further research to advance our understanding of the role of PRO data in personalized medicine.
A robust conceptual framework to incorporate PROs into personalized medicine can provide fertile opportunity to bring these two areas even closer and to enhance the way a specific treatment is attuned and delivered to address patient care and patient needs.

  18. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    NASA Astrophysics Data System (ADS)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. 
Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.

  19. Millimeter wave satellite concepts, volume 1

    NASA Technical Reports Server (NTRS)

    Hilsen, N. B.; Holland, L. D.; Thomas, R. E.; Wallace, R. W.; Gallagher, J. G.

    1977-01-01

The identification of technologies necessary for development of millimeter spectrum communication satellites was examined from a system point of view. Development of methodology based on the technical requirements of potential services that might be assigned to millimeter wave bands for identifying the viable and appropriate technologies for future NASA millimeter research and development programs, and testing of this methodology with selected user applications and services, were the goals of the program. The entire communications network, both ground and space subsystems, was studied. Cost, weight, and performance models for the subsystems, conceptual design for point-to-point and broadcast communications satellites, and analytic relationships between subsystem parameters and an overall link performance are discussed along with baseline conceptual systems, sensitivity studies, model adjustment analyses, identification of critical technologies and their risks, and brief research and development program scenarios for the technologies judged to be moderate or extensive risks. Identification of technologies for millimeter satellite communication systems, and assessment of the relative risks of these technologies, was accomplished through subsystem modeling and link optimization for both point-to-point and broadcast applications.

  20. Econometric model for age- and population-dependent radiation exposures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandquist, G.M.; Slaughter, D.M.; Rogers, V.C.

    1991-01-01

The economic impact associated with ionizing radiation exposures in a given human population depends on numerous factors including the individual's mean economic status as a function of age, the age distribution of the population, the future life expectancy at each age, and the latency period for the occurrence of radiation-induced health effects. A simple mathematical model has been developed that provides an analytical methodology for estimating the societal econometrics associated with radiation exposures, so that radiation effects can be assessed and compared for economic evaluation.
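The econometric structure described in this record can be sketched in code; the simple product form, the coefficients, and all figures below are purely illustrative assumptions, not values from the paper:

```python
# Illustrative sketch of an age- and population-dependent econometric
# estimate of radiation-exposure cost. The product form and every number
# here are hypothetical, not taken from the record above.

def expected_cost(population_by_age, earnings_by_age, life_expectancy_by_age,
                  dose_sv, risk_per_sv, latency_years):
    """Sum, over age groups, the economic loss from radiation-induced
    health effects that only manifest after a latency period."""
    total = 0.0
    for age, n in population_by_age.items():
        remaining = life_expectancy_by_age[age]
        if remaining <= latency_years:
            continue  # effect would occur beyond the expected lifetime
        productive_years = remaining - latency_years
        total += n * dose_sv * risk_per_sv * productive_years * earnings_by_age[age]
    return total

cost = expected_cost(
    population_by_age={30: 1000, 60: 1000},
    earnings_by_age={30: 40_000.0, 60: 30_000.0},
    life_expectancy_by_age={30: 50.0, 60: 22.0},
    dose_sv=0.1, risk_per_sv=0.05, latency_years=10.0,
)
```

The age dependence enters through both remaining life expectancy and age-specific earnings, which is what makes the exposure cost population-dependent rather than a flat per-person figure.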

  1. Development Approaches Coupled with Verification and Validation Methodologies for Agent-Based Mission-Level Analytical Combat Simulations

    DTIC Science & Technology

    2004-03-01

    When applying experience to new situations, the process is very similar. Faced with a new situation, a human generally looks for ways in which...find the best course of action, the human would compare current goals to those it faced in the previous experiences and choose the path that...154. Saperstein, Alvin (1995) “War and Chaos”. American Scientist, vol. 84. November-December 1995. pp. 548-557. 155. Sargent, Robert G . (1991

  2. A Review of Meta-Analyses in Education: Methodological Strengths and Weaknesses

    ERIC Educational Resources Information Center

    Ahn, Soyeon; Ames, Allison J.; Myers, Nicholas D.

    2012-01-01

    The current review addresses the validity of published meta-analyses in education that determines the credibility and generalizability of study findings using a total of 56 meta-analyses published in education in the 2000s. Our objectives were to evaluate the current meta-analytic practices in education, identify methodological strengths and…

  3. Advanced Analytical Methodologies Based on Raman Spectroscopy to Detect Prebiotic and Biotic Molecules: Applicability in the Study of the Martian Nakhlite NWA 6148 Meteorite

    NASA Astrophysics Data System (ADS)

    Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.

    2018-04-01

    Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, (c) Raman Imaging. Examples for NWA 6148.

  4. Technological Leverage in Higher Education: An Evolving Pedagogy

    ERIC Educational Resources Information Center

    Pillai, K. Rajasekharan; Prakash, Ashish Viswanath

    2017-01-01

    Purpose: The purpose of the study is to analyse the perception of students toward a computer-based exam on a custom-made digital device and their willingness to adopt the same for high-stake summative assessment. Design/methodology/approach: This study followed an analytical methodology using survey design. A modified version of students'…

  5. Inlet design for high-speed propfans

    NASA Technical Reports Server (NTRS)

    Little, B. H., Jr.; Hinson, B. L.

    1982-01-01

A two-part study was performed to design inlets for high-speed propfan installation. The first part was a parametric study to select promising inlet concepts. A wide range of inlet geometries was examined and evaluated - primarily on the basis of cruise thrust and fuel burn performance. Two inlet concepts were then chosen for more detailed design studies - one appropriate to offset engine/gearbox arrangements and the other to in-line arrangements. In the second part of this study, inlet design points were chosen to optimize the net installed thrust, and detailed design of the two inlet configurations was performed. An analytical methodology was developed to account for propfan slipstream effects, transonic flow effects, and three-dimensional geometry effects. Using this methodology, low drag cowls were designed for the two inlets.

  6. Application of low-cost methodologies for mobile phone app development.

    PubMed

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have not been any disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perception of the apps. In this study, we will present two techniques of how to create a mobile app using two of the well-established online mobile app building websites. The costs of development are specified and the methodology of dissemination of the apps will be shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and "Déjà vu" app for postgraduates will be discussed. A questionnaire survey has been administered to undergraduate students collating their perceptions towards the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps.
This is one of the few studies that have demonstrated the low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results obtained have demonstrated that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit for other specialities and disciplines.

  7. Application of Low-Cost Methodologies for Mobile Phone App Development

    PubMed Central

    Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-01-01

Background The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have not been any disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. Objective The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users’ self-rated perception of the apps. Methods In this study, we will present two techniques of how to create a mobile app using two of the well-established online mobile app building websites. The costs of development are specified and the methodology of dissemination of the apps will be shared. The application of the low-cost methodologies in the creation of the “Mastering Psychiatry” app for undergraduates and “Déjà vu” app for postgraduates will be discussed. A questionnaire survey has been administered to undergraduate students collating their perceptions towards the app. Results For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps.
Conclusions This is one of the few studies that have demonstrated the low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results obtained have demonstrated that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit for other specialities and disciplines. PMID:25491323

  8. The temporal structure of pollution levels in developed cities.

    PubMed

    Barrigón Morillas, Juan Miguel; Ortiz-Caraballo, Carmen; Prieto Gajardo, Carlos

    2015-06-01

Currently, the need for mobility can cause significant pollution levels in cities, with important effects on health and quality of life. Any approach to the study of urban pollution and its effects requires an analysis of spatial distribution and temporal variability. A crucial challenge is to establish proven methodologies that improve prediction quality while saving resources in spatial and temporal sampling. This work proposes a new analytical methodology for the study of temporal structure. As a result, a model for estimating annual levels of urban traffic noise was proposed. The average errors are less than one decibel for all acoustic indicators. This opens a new working methodology for urban noise studies. Additionally, the approach has general application to the study of traffic-related pollution impacts, with implications for urban design and possibly for economic and sociological aspects. Copyright © 2015 Elsevier B.V. All rights reserved.
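Annual noise indicators of the kind estimated in this record rest on energetic (logarithmic) averaging of sound levels; the Leq formula below is standard acoustics, while the sampling scheme and values are invented for illustration:

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level: the energetic mean of dB values,
    Leq = 10*log10(mean(10^(L/10))). Standard acoustics formula."""
    energies = [10 ** (l / 10.0) for l in levels_db]
    return 10.0 * math.log10(sum(energies) / len(energies))

# A handful of hourly levels (dBA) from a hypothetical urban street;
# an annual indicator would average many more such samples.
samples = [68.0, 72.0, 75.0, 70.0]
annual_estimate = leq(samples)
```

Because the averaging happens on the energy scale, the loudest samples dominate the result, which is why a sparse sampling strategy must be chosen carefully, as the record's temporal-structure analysis emphasizes.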

  9. Reverse Engineering Validation using a Benchmark Synthetic Gene Circuit in Human Cells

    PubMed Central

    Kang, Taek; White, Jacob T.; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas

    2013-01-01

    Multi-component biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network. PMID:23654266

  10. Reverse engineering validation using a benchmark synthetic gene circuit in human cells.

    PubMed

    Kang, Taek; White, Jacob T; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas

    2013-05-17

    Multicomponent biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network.
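The perturb-and-measure strategy described in these two records can be reduced to a toy sketch: perturb each node in turn, record the steady-state responses of all nodes, and keep the signed influences that clear a noise threshold. The three-node network, the response numbers, and the threshold below are all hypothetical:

```python
# Toy sketch of perturbation-based network reverse engineering.
# Hypothetical 3-node network: node 0 activates node 1; node 1 represses node 2.

def measure_response(perturbed_node):
    """Stand-in for an experiment: steady-state expression change of every
    node after perturbing one node (synthetic numbers with small 'noise')."""
    responses = {
        0: [1.0, 0.8, -0.6],    # perturbing 0 raises 1 and (downstream) lowers 2
        1: [0.02, 1.0, -0.7],   # perturbing 1 lowers 2; no upstream effect
        2: [0.01, -0.03, 1.0],  # perturbing 2 affects nothing upstream
    }
    return responses[perturbed_node]

def reconstruct(n_nodes, threshold=0.1):
    """Infer signed influences i -> j from successive single-node
    perturbations, keeping only responses that clear a noise threshold."""
    influence = [[0] * n_nodes for _ in range(n_nodes)]
    for i in range(n_nodes):
        r = measure_response(i)
        for j in range(n_nodes):
            if i != j and abs(r[j]) > threshold:
                influence[i][j] = 1 if r[j] > 0 else -1
    return influence

inferred = reconstruct(3)
```

A benchmark circuit with known regulatory interactions, as in the records above, lets the inferred matrix be scored against ground truth, including how thresholding confuses direct edges with downstream effects.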

  11. Electro-thermal vaporization direct analysis in real time-mass spectrometry for water contaminant analysis during space missions.

    PubMed

    Dwivedi, Prabha; Gazda, Daniel B; Keelor, Joel D; Limero, Thomas F; Wallace, William T; Macatangay, Ariel V; Fernández, Facundo M

    2013-10-15

The development of a direct analysis in real time-mass spectrometry (DART-MS) method and first prototype vaporizer for the detection of low molecular weight (∼30-100 Da) contaminants representative of those detected in water samples from the International Space Station is reported. A temperature-programmable, electro-thermal vaporizer (ETV) was designed, constructed, and evaluated as a sampling interface for DART-MS. The ETV facilitates analysis of water samples with minimum user intervention while maximizing analytical sensitivity and sample throughput. The integrated DART-ETV-MS methodology was evaluated in both positive and negative ion modes to (1) determine experimental conditions suitable for coupling DART with ETV as a sample inlet and ionization platform for time-of-flight MS, (2) identify analyte response ions, (3) determine the detection limit and dynamic range for target analyte measurement, and (4) determine the reproducibility of measurements made with the method when using manual sample introduction into the vaporizer. Nitrogen was used as the DART working gas, and the target analytes chosen for the study were ethyl acetate, acetone, acetaldehyde, ethanol, ethylene glycol, dimethylsilanediol, formaldehyde, isopropanol, methanol, methylethyl ketone, methylsulfone, propylene glycol, and trimethylsilanol.
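Detection limits and dynamic range of the kind evaluated in step (3) are commonly estimated from a calibration curve, e.g. via the ICH convention LOD = 3.3·σ/slope; a minimal sketch with invented calibration data, not the paper's measurements:

```python
import statistics

# Hypothetical calibration data for one target analyte:
# concentration (ppb) vs. instrument response (arbitrary units).
conc = [0.0, 10.0, 20.0, 40.0, 80.0]
signal = [2.1, 51.8, 102.3, 201.5, 401.2]

# Ordinary least-squares slope and intercept of the calibration line.
n = len(conc)
mx, my = sum(conc) / n, sum(signal) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / \
        sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

# Residual scatter around the line, then ICH-style detection and
# quantitation limits: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope.
residuals = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
sigma = statistics.stdev(residuals)
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
```

The upper end of the dynamic range would be found by extending the calibration until the response departs from linearity; that part is not sketched here.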

  12. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level

    PubMed Central

    Savalei, Victoria; Rhemtulla, Mijke

    2017-01-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data—that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study. PMID:29276371
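The item-level problem in these two records can be made concrete with a small sketch: listwise deletion at the scale level discards any participant with a single skipped item, whereas an item-level approach keeps the case. The simple proration below is a crude stand-in for MI or TSML, and the data are invented:

```python
# Each row: one participant's responses to a 4-item scale (None = skipped item).
# Invented data for illustration.
data = [
    [3, 4, 2, 5],
    [4, None, 3, 4],   # one missing item
    [2, 2, 3, 2],
]

def scale_scores_listwise(rows):
    """Scale-level handling: drop any participant with a missing item."""
    return [sum(r) for r in rows if None not in r]

def scale_scores_prorated(rows, n_items=4):
    """Item-level handling (crude proration): rescale the mean of the
    answered items to the full scale length, retaining every case."""
    scores = []
    for r in rows:
        answered = [v for v in r if v is not None]
        scores.append(sum(answered) / len(answered) * n_items)
    return scores

listwise = scale_scores_listwise(data)   # the second participant is lost
prorated = scale_scores_prorated(data)   # all three cases retained
```

TSML and item-level MI improve on proration by using the observed covariance structure among items rather than treating answered items as exchangeable, but the case-retention contrast is the same.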

  13. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level.

    PubMed

    Savalei, Victoria; Rhemtulla, Mijke

    2017-08-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data-that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study.

  14. Methodology for assessing the safety of Hydrogen Systems: HyRAM 1.1 technical reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina; Hecht, Ethan; Reynolds, John Thomas

The HyRAM software toolkit provides a basis for conducting quantitative risk assessment and consequence modeling for hydrogen infrastructure and transportation systems. HyRAM is designed to facilitate the use of state-of-the-art science and engineering models to conduct robust, repeatable assessments of hydrogen safety, hazards, and risk. HyRAM is envisioned as a unifying platform combining validated, analytical models of hydrogen behavior, a standardized, transparent QRA approach, and engineering models and generic data for hydrogen installations. HyRAM is being developed at Sandia National Laboratories for the U.S. Department of Energy to increase access to technical data about hydrogen safety and to enable the use of that data to support development and revision of national and international codes and standards. This document provides a description of the methodology and models contained in HyRAM version 1.1. HyRAM 1.1 includes generic probabilities for hydrogen equipment failures, probabilistic models for the impact of heat flux on humans and structures, and computationally and experimentally validated analytical and first order models of hydrogen release and flame physics. HyRAM 1.1 integrates deterministic and probabilistic models for quantifying accident scenarios, predicting physical effects, and characterizing hydrogen hazards (thermal effects from jet fires, overpressure effects from deflagrations), and assessing impact on people and structures. HyRAM is a prototype software in active development and thus the models and data may change. This report will be updated at appropriate developmental intervals.
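The core of a quantitative risk assessment of the kind this toolkit supports reduces to a frequency-weighted sum over accident scenarios; a schematic sketch in which the scenario names and all numbers are invented, not HyRAM data or models:

```python
# Schematic quantitative risk assessment: expected harm is the
# frequency-weighted sum over accident scenarios. All values invented.
scenarios = [
    # (label, annual frequency, probability of fatality given scenario)
    ("small leak, jet fire", 1e-3, 0.01),
    ("large leak, jet fire", 1e-5, 0.50),
    ("deflagration",         1e-6, 0.90),
]

fatality_risk_per_year = sum(f * p for _, f, p in scenarios)
```

In a real QRA the conditional consequence term comes from physics models (heat flux, overpressure) and harm criteria rather than a fixed probability, but the aggregation step has this shape.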

  15. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach.

    PubMed

    Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn

    2015-09-01

    With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development and guidance for the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meeting during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. 
A critical appraisal tool with 10 questions to help assess risk of bias in systematic reviews and meta-analyses was also developed and tested. Relevant details to extract from included reviews and how to best present the findings of both quantitative and qualitative systematic reviews in a reader friendly format are provided. Umbrella reviews provide a ready means for decision makers in healthcare to gain a clear understanding of a broad topic area. The umbrella review methodology described here is the first to consider reviews that report other than quantitative evidence derived from randomized controlled trials. The methodology includes an easy to use and informative summary of evidence table to readily provide decision makers with the available, highest level of evidence relevant to the question posed.

  16. Methodological quality of meta-analyses on treatments for chronic obstructive pulmonary disease: a cross-sectional study using the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool.

    PubMed

    Ho, Robin S T; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel Y S; Chung, Vincent C H

    2015-01-08

    Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. To assess the methodological quality of MAs on COPD treatments. A cross-sectional study on MAs of COPD trials. MAs published during 2000-2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods.

  17. Methodological quality of meta-analyses on treatments for chronic obstructive pulmonary disease: a cross-sectional study using the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool

    PubMed Central

    Ho, Robin ST; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel YS; Chung, Vincent CH

    2015-01-01

    Background: Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. Aims: To assess the methodological quality of MAs on COPD treatments. Methods: A cross-sectional study on MAs of COPD trials. MAs published during 2000–2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Results: Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. Conclusions: The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods. PMID:25569783

  18. Answer first: Applying the heuristic-analytic theory of reasoning to examine student intuitive thinking in the context of physics

    NASA Astrophysics Data System (ADS)

    Kryjevskaia, Mila; Stetzer, MacKenzie R.; Grosz, Nathaniel

    2014-12-01

We have applied the heuristic-analytic theory of reasoning to interpret inconsistencies in student reasoning approaches to physics problems. This study was motivated by an emerging body of evidence that suggests that student conceptual and reasoning competence demonstrated on one task often fails to be exhibited on another. Indeed, even after instruction specifically designed to address student conceptual and reasoning difficulties identified by rigorous research, many undergraduate physics students fail to build reasoning chains from fundamental principles even though they possess the required knowledge and skills to do so. Instead, they often rely on a variety of intuitive reasoning strategies. In this study, we developed and employed a methodology that allowed for the disentanglement of student conceptual understanding and reasoning approaches through the use of sequences of related questions. We have shown that the heuristic-analytic theory of reasoning can be used to account for, in a mechanistic fashion, the observed inconsistencies in student responses. In particular, we found that students tended to apply their correct ideas in a selective manner that supported a specific and likely anticipated conclusion while neglecting to employ the same ideas to refute an erroneous intuitive conclusion. The observed reasoning patterns were consistent with the heuristic-analytic theory, according to which reasoners develop a "first-impression" mental model and then construct an argument in support of the answer suggested by this model. We discuss implications for instruction and argue that efforts to improve student metacognition, which serves to regulate the interaction between intuitive and analytical reasoning, are likely to lead to improved student reasoning.

  19. A Two-Stage Approach to Synthesizing Covariance Matrices in Meta-Analytic Structural Equation Modeling

    ERIC Educational Resources Information Center

    Cheung, Mike W. L.; Chan, Wai

    2009-01-01

    Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…
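Stage 1 of the two-stage approach pools correlation matrices across studies before any structural model is fitted; a simplified sketch using sample-size weighting (the actual method's multigroup likelihood machinery is omitted, and the matrices and sample sizes are invented):

```python
# Stage 1 of a (simplified) two-stage MASEM: pool correlation matrices
# element-wise, weighted by sample size. All inputs are invented.

studies = [  # (correlation matrix among 3 variables, sample size)
    ([[1.0, 0.3, 0.2], [0.3, 1.0, 0.5], [0.2, 0.5, 1.0]], 100),
    ([[1.0, 0.4, 0.1], [0.4, 1.0, 0.4], [0.1, 0.4, 1.0]], 200),
    ([[1.0, 0.2, 0.3], [0.2, 1.0, 0.6], [0.3, 0.6, 1.0]], 50),
]

total_n = sum(n for _, n in studies)
k = len(studies[0][0])
pooled = [[sum(n * r[i][j] for r, n in studies) / total_n
           for j in range(k)] for i in range(k)]

# Stage 2 would fit the structural equation model to `pooled`,
# treating total_n as the effective sample size.
```

Fitting the SEM to a pooled matrix in Stage 2, rather than averaging per-study parameter estimates, is what distinguishes the two-stage approach from ad hoc synthesis.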

  20. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    ERIC Educational Resources Information Center

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…
