Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
Owing to the popularity of ginseng in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, hereafter referred to as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis over the past half-decade, including emerging techniques and analytical trends. Ginseng analysis encompasses all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
A new field of research, visual analytics, has recently been introduced. This has been defined as "the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
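The statistical core of PFA, propagating parameter and modeling-accuracy uncertainty through a deterministic failure model to obtain a failure-probability estimate that can later be updated with test or flight experience, can be illustrated with a minimal Monte Carlo sketch. The failure model, distributions and numbers below are hypothetical placeholders, not the engineering models documented in the report:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N = 100_000  # Monte Carlo samples

# Hypothetical uncertain inputs for a stress/capability failure mode:
stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=N)      # applied stress, MPa
capability = rng.lognormal(mean=np.log(420.0), sigma=0.15, size=N)  # material capability, MPa
model_error = rng.normal(loc=1.0, scale=0.05, size=N)               # analysis-accuracy multiplier

# Deterministic failure model evaluated at each sampled parameter set:
# failure occurs when modeled stress exceeds capability.
fails = model_error * stress > capability

print(f"Estimated failure probability: {fails.mean():.2e}")
```

In the actual PFA methodology, the resulting failure-probability distribution would then be modified by the prescribed statistical procedures to reflect accumulated test and flight experience.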
A Modern Approach to College Analytical Chemistry.
ERIC Educational Resources Information Center
Neman, R. L.
1983-01-01
Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…
Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.
ERIC Educational Resources Information Center
Heineman, William R.; Kissinger, Peter T.
1980-01-01
Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)
Analytical and simulator study of advanced transport
NASA Technical Reports Server (NTRS)
Levison, W. H.; Rickard, W. W.
1982-01-01
An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling qualities problems including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.
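For reference, the scalar quadratic performance index associated with the optimal-control pilot model is typically of the following generic form; the particular weightings calibrated in the study are not reproduced here:

```latex
J = E\left\{ \lim_{T \to \infty} \frac{1}{T} \int_0^T
    \left( \mathbf{y}^{\mathsf{T}} Q\,\mathbf{y}
         + \mathbf{u}^{\mathsf{T}} R\,\mathbf{u}
         + \dot{\mathbf{u}}^{\mathsf{T}} G\,\dot{\mathbf{u}} \right) dt \right\}
```

where y collects the displayed variables, u is the pilot's control input, and Q, R and G are weighting matrices chosen to reflect the closed-loop performance requirements.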
Smichowski, Patricia
2008-03-15
This review summarizes and discusses the research carried out on the determination of antimony and its predominant chemical species in atmospheric aerosols. Environmental matrices such as airborne particulate matter, fly ash and volcanic ash present a number of complex analytical challenges, as very sensitive analytical techniques and highly selective separation methodologies are required for speciation studies. Given the diversity of instrumental approaches and methodologies employed for the determination of antimony and its species in environmental matrices, the objective of this review is to briefly discuss the most relevant findings reported in recent years for this remarkable element and to identify future needs and trends. The survey includes 92 references and covers principally the literature published over the last decade.
Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen
2016-04-07
Cosmetic products placed on the market and their ingredients must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their capacity to fulfil current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, in relation both to sample pretreatment and extraction and to the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods have generally been based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba
2010-01-01
Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…
Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M
2018-08-01
Detection dogs serve a plethora of roles within modern society, and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved ambiguity. This is partially because the assessment of detection-dog effectiveness remains entrenched in a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs depend on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and a complement to the detection dog industry; however, the interrelationship between these two detection paradigms requires clarification. These factors, when considering their relative contributions, illustrate a need to address research gaps, formalise the detection dog industry and research process, and take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Jones, Thomas C.; Dorsey, John T.; Doggett, William R.
2015-01-01
The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.
2010-04-01
available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, listing 29 different methods [12]. Table 3: DMAIC Methodology (5-Phase Methodology), with phase columns Define, Measure, Analyze, Improve and Control and entries including Project Charter, Prioritization Matrix and 5 Whys Analysis... Methodology Scope [13]: DMAIC; PDCA. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications. PMID:28895928
Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures
The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.
Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.
Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie
2017-12-01
Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.
A micromechanics-based strength prediction methodology for notched metal matrix composites
NASA Technical Reports Server (NTRS)
Bigelow, C. A.
1992-01-01
An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and post-fatigue residual strengths when fiber-matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber-matrix debonding, and matrix cracking.
A micromechanics-based strength prediction methodology for notched metal-matrix composites
NASA Technical Reports Server (NTRS)
Bigelow, C. A.
1993-01-01
An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and postfatigue residual strengths when fiber-matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber-matrix debonding, and matrix cracking.
Operational Environmental Assessment
1988-09-01
Chemistry Branch - Physical Chemistry Branch - Analytical Research Division - Analytical Systems Branch - Methodology Research Branch - Spectroscopy Branch... electromagnetic frequency spectrum and includes radio frequencies, infrared, visible light, ultraviolet, X-rays and gamma rays (in ascending order of... Verruculogen, Aflatrem, Picrotoxin, Ciguatoxin, Mycotoxins, Simple Trichothecenes, T-2 Toxin, T-2 Tetraol, Neosolaniol, Nivalenol, Deoxynivalenol, Verrucarol
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study aims at providing recommendations concerning the validation of analytical protocols by using routine samples. It is intended as a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also addresses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water matrix was estimated through a conventional approach. However, for the sediment matrix, the estimation of proportional/constant bias is also included because of its inhomogeneity. Results for the sediment matrix are reliable, showing 25-35% analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analyzing routine samples is rarely applied to assess the trueness of novel analytical methods, and until now this methodology had not been focused on organochlorine compounds in environmental matrices.
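Combined uncertainties of the kind quoted here are typically assembled from an intermediate-precision term and a bias/recovery term. A minimal sketch of that standard combination (Nordtest-style; the numbers are invented for illustration):

```python
import math

def combined_uncertainty(rsd_intermediate, bias_pct, u_bias_pct):
    """Combine within-lab reproducibility with a bias contribution.

    rsd_intermediate: relative SD under intermediate conditions (%)
    bias_pct: observed mean bias from spiked/routine samples (%)
    u_bias_pct: uncertainty of the bias estimate (%)
    Returns (combined u, expanded U with k=2), both in %.
    """
    u_bias = math.sqrt(bias_pct**2 + u_bias_pct**2)
    u_c = math.sqrt(rsd_intermediate**2 + u_bias**2)
    return u_c, 2.0 * u_c

u_c, U = combined_uncertainty(rsd_intermediate=12.0, bias_pct=5.0, u_bias_pct=4.0)
print(f"combined u = {u_c:.1f}%, expanded U (k=2) = {U:.1f}%")
```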
Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan
2013-01-01
Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329
Multi-center evaluation of analytical performance of the Beckman Coulter AU5822 chemistry analyzer.
Zimmerman, M K; Friesen, L R; Nice, A; Vollmer, P A; Dockery, E A; Rankin, J D; Zmuda, K; Wong, S H
2015-09-01
Our three academic institutions, Indiana University, Northwestern Memorial Hospital, and Wake Forest, were among the first in the United States to implement the Beckman Coulter AU5822 series chemistry analyzers. We undertook this post-hoc multi-center study by merging our data to determine performance characteristics and the impact of methodology changes on analyte measurement. We independently completed performance validation studies including precision, linearity/analytical measurement range, method comparison, and reference range verification. Complete data sets were available from at least one institution for 66 analytes with the following groups: 51 from all three institutions, and 15 from 1 or 2 institutions for a total sample size of 12,064. Precision was similar among institutions. Coefficients of variation (CV) were <10% for 97%. Analytes with CVs >10% included direct bilirubin and digoxin. All analytes exhibited linearity over the analytical measurement range. Method comparison data showed slopes between 0.900-1.100 for 87.9% of the analytes. Slopes for amylase, tobramycin and urine amylase were <0.8; the slope for lipase was >1.5, due to known methodology or standardization differences. Consequently, reference ranges of amylase, urine amylase and lipase required only minor or no modification. The four AU5822 analyzers independently evaluated at three sites showed consistent precision, linearity, and correlation results. Since installations, the test results had been well received by clinicians from all three institutions. Copyright © 2015. Published by Elsevier Inc.
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
NASA Astrophysics Data System (ADS)
Cvetkovic, V.; Molin, S.
2012-02-01
We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. It is also shown that spatial variability of the attachment rate may be significant; however, it appears to affect risk differently depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers on the field scale, the methodology presented here may be useful for screening purposes, and may also serve as a basis for future studies that would include greater complexity.
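Schematically, the coupling described here, advective travel times from flow simulations (or a TDRW model) combined with colloid-filtration attachment and a dose-response model, can be sketched as follows. The distributions, attachment rate and dose-response parameter are placeholders, not the models or values of the cited work [1,2]:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Advective travel times along trajectories, in days (a lognormal stand-in
# for values that would come from flow simulations or a TDRW model).
tau = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=N)

# Colloid filtration theory: first-order attachment along each trajectory.
k_att = 0.15  # attachment rate (1/day), spatially uniform placeholder
survival = np.exp(-k_att * tau)  # fraction of pathogens surviving transport

# Expected dose at the receptor for a hypothetical source loading, then an
# exponential dose-response model for infection risk.
dose = 1e3 * survival.mean()  # expected pathogens ingested (placeholder scaling)
r = 0.005                     # exponential dose-response parameter (pathogen-specific)
p_infection = 1.0 - np.exp(-r * dose)
print(f"Illustrative infection probability: {p_infection:.3f}")
```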
Measuring solids concentration in stormwater runoff: comparison of analytical methods.
Clark, Shirley E; Siu, Christina Y S
2008-01-15
Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.
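Operationally, the two measures differ only in whether the dry solids mass comes from the whole sample or from an aliquot; a minimal sketch, with invented masses and volumes, of the two calculations the comparison turns on:

```python
def ssc_mg_per_L(total_dry_mass_mg, total_volume_L):
    """Suspended solids concentration: the entire sample is filtered and dried."""
    return total_dry_mass_mg / total_volume_L

def tss_mg_per_L(aliquot_dry_mass_mg, aliquot_volume_L):
    """Total suspended solids: only a subsample is analyzed, so the result
    depends on how representative the aliquot is (PSD, settling, technique)."""
    return aliquot_dry_mass_mg / aliquot_volume_L

# Hypothetical sample: coarse particles settle before the aliquot is drawn,
# so the aliquot under-represents the true solids content.
print(ssc_mg_per_L(total_dry_mass_mg=120.0, total_volume_L=1.0))    # 120 mg/L
print(tss_mg_per_L(aliquot_dry_mass_mg=8.5, aliquot_volume_L=0.1))  # 85 mg/L
```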
Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C
2017-05-01
An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for the determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables were studied and optimized with respect to the analytical response of each OPP. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube, and 0.5 mL of 0.1 M sodium citrate (pH 4), 0.08 mL of 0.1 M Al2(SO4)3, and 0.7 mL of 0.1 M SDS were added and homogenized. After centrifugation the supernatant was discarded. A 700 μL aliquot of the resulting coacervate-rich phase was dissolved in 300 μL of methanol, and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL, and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied to the analysis of the five OPPs in water samples for human consumption from different locations in Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
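Figures of merit like the quoted LODs and recoveries are commonly derived from a calibration line and spiked samples. A generic sketch of those calculations (the calibration data are invented, not the authors' measurements):

```python
import numpy as np

# Hypothetical HPLC-UV calibration for one OPP: concentration (ng/mL) vs peak area.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
area = np.array([210.0, 430.0, 1010.0, 2050.0, 4020.0, 10100.0])

slope, intercept = np.polyfit(conc, area, 1)
residual_sd = np.std(area - (slope * conc + intercept), ddof=2)

# Common estimates: LOD = 3.3*s/slope, LOQ = 10*s/slope.
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope

# Recovery from a sample spiked at 10 ng/mL (hypothetical measured value).
measured_spike = 8.6
recovery_pct = 100.0 * measured_spike / 10.0
print(f"LOD={lod:.2f} ng/mL, LOQ={loq:.2f} ng/mL, recovery={recovery_pct:.0f}%")
```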
An overview of key technology thrusts at Bell Helicopter Textron
NASA Technical Reports Server (NTRS)
Harse, James H.; Yen, Jing G.; Taylor, Rodney S.
1988-01-01
Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion on advanced rotors highlights developments in the composite bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion on methodology development concentrates on analytical developments in aeromechanics, including correlation studies and design applications. The discussion on new configurations presents the results of some advanced configuration studies, including hardware development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Thomas; Trail, Jessica; Gevondyan, Erna
During times of crisis, communities and regions rely heavily on critical infrastructure systems to support their emergency management response and recovery activities. Therefore, the resilience of critical infrastructure systems to crises is a pivotal factor in a community's overall resilience. Critical infrastructure resilience can be influenced by many factors, including State policies, which are not always uniform in their structure or application across the United States and which the U.S. Department of Homeland Security has identified as an area of particular interest with respect to their influence on the resilience of critical infrastructure systems. This study focuses on developing an analytical methodology to assess links between policy and resilience, and applies that methodology to critical infrastructure in the Transportation Systems Sector. Specifically, this study seeks to identify potentially influential linkages between State transportation capital funding policies and the resilience of bridges located on roadways that are under the management of public agencies. This study yielded notable methodological outcomes, including the general capability of the analytical methodology to yield, in the case of some States, significant results connecting State policies with critical infrastructure resilience, with the suggestion that further refinement of the methodology may be beneficial.
MS-based analytical methodologies to characterize genetically modified crops.
García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro
2011-01-01
The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
NASA Technical Reports Server (NTRS)
Phatak, A. V.
1980-01-01
A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach to landing task incorporating appropriate models for UH-1H aircraft, the environmental disturbances and the human pilot was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.
Shuttle payload bay dynamic environments: Summary and conclusion report for STS flights 1-5 and 9
NASA Technical Reports Server (NTRS)
Oconnell, M.; Garba, J.; Kern, D.
1984-01-01
The vibration, acoustic and low-frequency loads data from the first five shuttle flights are presented, together with an engineering analysis of those data. Vibroacoustic data from STS-9 are also presented because they represent the only data taken on a large payload. Payload dynamic environment predictions developed through the participation of various NASA and industrial centers are presented, along with a comparison of analytical loads methodology predictions with flight data, including a brief description of the methodologies employed in developing those predictions for payloads. The review of prediction methodologies illustrates how different centers have approached the problems of developing shuttle dynamic environmental predictions and criteria. Ongoing research activities related to the shuttle dynamic environments are also described. Analytical software recently developed for the prediction of payload acoustic and vibration environments is also described.
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Big Data Analytics Methodology in the Financial Industry
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony
2017-01-01
Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…
Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D
2017-04-15
In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs), which can affect the hormonal system of humans and wildlife, even at low concentrations. Because these pollutants enter the environment through waters, the most affected compartment, analytical methods that allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is given (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and to approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.
Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.
Grdinić, Vladimir; Vuković, Jadranka
2004-05-28
A complete prevalidation scheme, serving as a basic strategy for the quality control and standardization of analytical procedures, is presented. A fast and simple prevalidation methodology based on the mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24) was elaborated, and guidelines as well as algorithms are given in detail. This strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which the linear calibration model, which very often occurs in practice, is the most appropriate fit to the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of an analytical procedure. The complete prevalidation process includes characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step is suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this procedure, as prevalidation figures of merit, establish prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
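For the linear-calibration case the paper targets, the core prevalidation computations, fitting the analytical function from a small experiment set (N ≤ 24), checking linearity, and flagging outliers, might look like the following sketch; the thresholds and data are illustrative, not those of the published guidelines:

```python
import numpy as np

def prevalidate_linear(x, y, outlier_z=3.0):
    """Minimal prevalidation of a linear calibration y = a*x + b.

    Returns the fitted analytical function, residual SD, correlation
    coefficient, and indices of suspected outliers (standardized residuals).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    a, b = np.polyfit(x, y, 1)
    resid = y - (a * x + b)
    s_res = np.std(resid, ddof=2)            # residual SD of the fit
    r = np.corrcoef(x, y)[0, 1]              # linearity check
    outliers = np.where(np.abs(resid / s_res) > outlier_z)[0]
    return {"slope": a, "intercept": b, "s_res": s_res, "r": r,
            "outliers": outliers.tolist()}

# Hypothetical tannin calibration (concentration vs absorbance), N = 8 <= 24.
report = prevalidate_linear([0, 5, 10, 15, 20, 25, 30, 35],
                            [0.01, 0.11, 0.20, 0.31, 0.40, 0.52, 0.59, 0.71])
print(report)
```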
Analytical Methodology for Predicting the Onset of Widespread Fatigue Damage in Fuselage Structure
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Newman, James C., Jr.; Piascik, Robert S.; Starnes, James H., Jr.
1996-01-01
NASA has developed a comprehensive analytical methodology for predicting the onset of widespread fatigue damage in fuselage structure. The determination of the number of flights and operational hours of aircraft service life that are related to the onset of widespread fatigue damage includes analyses for crack initiation, fatigue crack growth, and residual strength. Therefore, the computational capability required to predict analytically the onset of widespread fatigue damage must be able to represent a wide range of crack sizes from the material (microscale) level to the global structural-scale level. NASA studies indicate that the fatigue crack behavior in aircraft structure can be represented conveniently by the following three analysis scales: small three-dimensional cracks at the microscale level, through-the-thickness two-dimensional cracks at the local structural level, and long cracks at the global structural level. The computational requirements for each of these three analysis scales are described in this paper.
Challenges and perspectives in quantitative NMR.
Giraudeau, Patrick
2017-01-01
This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... during conference calls and via email discussions. Member duties include prioritizing topics, designing... their expertise in methodological issues such as meta-analysis, analytic modeling or clinical...
Social Impact Studies: An Expository Analysis
ERIC Educational Resources Information Center
Shields, Mark A.
1975-01-01
Analyzed are some selected studies on the social impact of resources development and construction projects including dams, highways, nuclear power plants and strip mines. The analytical and methodological problem of assessing differential impacts is stressed. (BT)
2018-04-30
2017 Workplace and Gender Relations Survey of Reserve Component Members: Statistical Methodology Report. Office of People Analytics (OPA), 4800 Mark Center Drive, Suite... Introduction: The Office of People Analytics' Center for Health and Resilience (OPA[H&R
Mihura, Joni L; Meyer, Gregory J; Dumitrascu, Nicolae; Bombel, George
2016-01-01
We respond to Tibon Czopp and Zeligman's (2016) critique of our systematic reviews and meta-analyses of 65 Rorschach Comprehensive System (CS) variables published in Psychological Bulletin (2013). The authors endorsed our supportive findings but critiqued the same methodology when used for the 13 unsupported variables. Unfortunately, their commentary was based on significant misunderstandings of our meta-analytic method and results, such as thinking we used introspectively assessed criteria in classifying levels of support and reporting only a subset of our externally assessed criteria. We systematically address their arguments that our construct label and criterion variable choices were inaccurate and, therefore, meta-analytic validity for these 13 CS variables was artificially low. For example, the authors created new construct labels for these variables that they called "the customary CS interpretation," but did not describe their methodology nor provide evidence that their labels would result in better validity than ours. They cite studies they believe we should have included; we explain how these studies did not fit our inclusion criteria and that including them would have actually reduced the relevant CS variables' meta-analytic validity. Ultimately, criticisms alone cannot change meta-analytic support from negative to positive; Tibon Czopp and Zeligman would need to conduct their own construct validity meta-analyses.
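For readers unfamiliar with the mechanics at issue, the pooling step used in construct-validity meta-analyses of this kind is commonly a random-effects estimate. A minimal sketch (DerSimonian-Laird; the effect sizes below are invented, not the published Rorschach data):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate of study effect sizes."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe) ** 2)               # heterogeneity statistic
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird(
    effects=[0.21, 0.35, 0.10, 0.28], variances=[0.010, 0.020, 0.015, 0.008])
print(f"pooled r = {pooled:.2f} +/- {1.96 * se:.2f} (tau^2 = {tau2:.3f})")
```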
ERIC Educational Resources Information Center
Jez, Joseph M.; Schachtman, Daniel P.; Berg, R. Howard; Taylor, Christopher G.; Chen, Sixue; Hicks, Leslie M.; Jaworski, Jan G.; Smith, Thomas J.; Nielsen, Erik; Pikaard, Craig S.
2007-01-01
Studies of protein function increasingly use multifaceted approaches that span disciplines including recombinant DNA technology, cell biology, and analytical biochemistry. These studies rely on sophisticated equipment and methodologies including confocal fluorescence microscopy, mass spectrometry, and X-ray crystallography that are beyond the…
Quantifying construction and demolition waste: An analytical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin
2014-09-15
Highlights: • Prevailing C&D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
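Of the six method categories identified, the waste generation rate method is the most mechanical: multiply the size of the works by a per-unit generation rate. A minimal sketch with invented rates (real studies derive them from site records or regional statistics):

```python
# Hypothetical per-unit C&D waste generation rates (kg per m2 of gross floor area).
GENERATION_RATES = {
    "new_construction": 50.0,
    "renovation": 120.0,
    "demolition": 1300.0,
}

def estimate_waste_tonnes(activity: str, gross_floor_area_m2: float) -> float:
    """Waste generation rate method: waste = area x rate, converted to tonnes."""
    return GENERATION_RATES[activity] * gross_floor_area_m2 / 1000.0

print(estimate_waste_tonnes("demolition", 2_000))  # 2600.0 tonnes
```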
Application of ion chromatography in pharmaceutical and drug analysis.
Jenke, Dennis
2011-08-01
Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determination of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses, and thus serves as a resource for investigators looking for insights into the use of IC methodology in this field of application.
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.
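Haddon's matrix itself is simply a grid of event phases against contributing factors. As a sketch of how the study's analysis clusters might be organized for coding (cluster labels taken from the abstract; the representation itself is hypothetical):

```python
# Haddon's matrix as a phase x factor grid; cells hold coded themes from
# the focus-group transcripts (labels follow the abstract's clusters).
haddon_matrix = {
    ("pre-event", "human"): ["risky behaviors", "beliefs and cultural factors",
                             "knowledge and education"],
    ("pre-event", "object"): [],
    ("pre-event", "environment"): [],
    ("event/post-event", "human"): ["fire control",
                                    "emergency scald and burn wound management",
                                    "traditional remedies", "medical consultation",
                                    "severity indicators"],
}

for (phase, factor), themes in haddon_matrix.items():
    print(f"{phase:>16} / {factor:<12}: {', '.join(themes) or '-'}")
```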
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning-to-end. A newer and very powerful, analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to laboratory information systems (LIS) and/or hospital information systems (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques integrate several operations, such as sample collection, extraction, enrichment of analytes above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information is presented on novel methodological and instrumental solutions for different variants of solid-phase extraction techniques: solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid-phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse-initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage-nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
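A minimal sketch of the control-chart analysis described, assuming hypothetical monthly mean length-of-stay values: an individuals (XmR) chart with limits computed from a pre-change baseline, plus a common run rule (six or more consecutive points on one side of the centre line) that preserves the time sequence of the data.

```python
# Sketch: individuals (XmR) control chart on monthly mean ED length of stay.
# Baseline months set the limits; a run rule flags a sustained shift.
# All data values are hypothetical.

los = [112, 115, 110, 118, 114, 96, 94, 98, 91, 95, 93]  # monthly means, minutes

baseline = los[:5]                          # pre-change months
centre = sum(baseline) / len(baseline)
mr = [abs(a - b) for a, b in zip(baseline[1:], baseline)]
mr_bar = sum(mr) / len(mr)
ucl = centre + 2.66 * mr_bar                # XmR limits: centre +/- 2.66 * mean MR
lcl = centre - 2.66 * mr_bar

# Run rule: flag a sustained shift below the centre line (shorter LOS)
# once six or more consecutive points fall on that side.
run = 0
for month, x in enumerate(los, start=1):
    run = run + 1 if x < centre else 0
    flag = "  <- shift signal" if run >= 6 else ""
    print(f"month {month:2d}: {x:5.1f}  (LCL {lcl:.1f}, UCL {ucl:.1f}){flag}")
```

Because the chart is read point by point in time order, the signal appears as soon as the run rule is met, rather than only after a fixed sample size is reached.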
Educational Design as Conversation: A Conversation Analytical Perspective on Teacher Dialogue
ERIC Educational Resources Information Center
van Kruiningen, Jacqueline F.
2013-01-01
The aim of this methodological paper is to expound on and demonstrate the value of conversation-analytical research in the area of (informal) teacher learning. The author discusses some methodological issues in current research on interaction in teacher learning and makes a case for conversation-analytical research on interactional processes in…
Dai, Jun; Wang, Chunlei; Traeger, Sarah C; Discenza, Lorell; Obermeier, Mary T; Tymiak, Adrienne A; Zhang, Yingru
2017-03-03
Atropisomers are stereoisomers resulting from hindered bond rotation. From the synthesis of pure atropisomers and the characterization of their interconversion thermodynamics to the investigation of biological stereoselectivity, the evaluation of drug candidates subject to atropisomerism creates special challenges and can be complicated in both early drug discovery and later drug development. In this paper, we demonstrate an array of analytical techniques and systematic approaches to study the atropisomerism of drug molecules to meet these challenges. Using a case study of Bruton's tyrosine kinase (BTK) inhibitor drug candidates at Bristol-Myers Squibb, we present the analytical strategies and methodologies used during drug discovery, including the detection of atropisomers, the determination of their relative composition, the identification of relative chirality, the isolation of individual atropisomers, the evaluation of interconversion kinetics, and the characterization of chiral stability in the solid state and in solution. In vivo and in vitro stereo-stability and stereo-selectivity were investigated, as well as the pharmacological significance of any changes in atropisomer ratios. Techniques applied in these studies include analytical and preparative enantioselective supercritical fluid chromatography (SFC), enantioselective high performance liquid chromatography (HPLC), circular dichroism (CD), and mass spectrometry (MS). Our experience illustrates how atropisomerism can be a very complicated issue in drug discovery and why a thorough understanding of this phenomenon is necessary to provide guidance for pharmaceutical development. Analytical techniques and methodologies facilitate key decisions during the discovery of atropisomeric drug candidates by characterizing time-dependent physicochemical properties that can have significant biological implications and relevance to pharmaceutical development plans. Copyright © 2017 Elsevier B.V. All rights reserved.
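One quantitative step behind "evaluation of interconversion kinetics" can be sketched as follows: given a measured first-order interconversion rate constant, the Eyring equation yields the rotational barrier, and for a degenerate pair (equal forward and reverse rates) the half-life of the approach to a 1:1 mixture follows directly. The rate constant and temperature below are illustrative, not taken from the BTK case study.

```python
# Sketch: rotational barrier from a measured interconversion rate constant
# (Eyring equation, transmission coefficient = 1) and the half-life of the
# approach to a 1:1 atropisomer mixture when k_forward = k_reverse = k.
import math

KB, H, R = 1.380649e-23, 6.62607015e-34, 8.314462618  # SI constants

def barrier_kcal_mol(k_per_s: float, temp_k: float) -> float:
    """Eyring: dG_act = R*T*ln(kB*T / (h*k)), returned in kcal/mol."""
    return R * temp_k * math.log(KB * temp_k / (H * k_per_s)) / 4184.0

k, T = 1.0e-5, 310.15                        # illustrative: 1/s at 37 degrees C
t_half_h = math.log(2) / (2.0 * k) / 3600.0  # observed rate is k_f + k_r = 2k
print(f"barrier ~ {barrier_kcal_mol(k, T):.1f} kcal/mol; "
      f"t1/2 to equilibrium ~ {t_half_h:.1f} h")
```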
Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A
2017-05-10
Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges, including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states, and the absence of established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that the successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.
Baghdady, Mariam T; Carnahan, Heather; Lam, Ernest W N; Woods, Nicole N
2014-09-01
There has been much debate surrounding diagnostic strategies and the most appropriate training models for novices in oral radiology. It has been argued that an analytic approach, using a step-by-step analysis of the radiographic features of an abnormality, is ideal. Alternative research suggests that novices can successfully employ non-analytic reasoning. Many of these studies do not take instructional methodology into account. This study evaluated the effectiveness of non-analytic and analytic strategies in radiographic interpretation and explored the relationship between instructional methodology and diagnostic strategy. Second-year dental and dental hygiene students were taught four radiographic abnormalities using basic science instruction or a step-by-step algorithm. The students were tested on diagnostic accuracy and memory immediately after learning and one week later. A total of seventy-three students completed both immediate and delayed sessions and were included in the analysis. Students were randomly divided into two diagnostic conditions: one group provided a diagnostic hypothesis for the image and then identified specific features to support it, while the other group first identified features and then provided a diagnosis. Participants in the diagnosis-first condition (non-analytic reasoning) had higher diagnostic accuracy than those in the features-first condition (analytic reasoning), regardless of their learning condition. No main effect of learning condition or interaction with diagnostic strategy was observed. Educators should be mindful of the potential influence of analytic and non-analytic approaches on the effectiveness of the instructional method.
Multimodal system planning technique : an analytical approach to peak period operation
DOT National Transportation Integrated Search
1995-11-01
The multimodal system planning technique described in this report is an improvement of the methodology used in the Dallas System Planning Study. The technique includes a spreadsheet-based process to identify the costs of congestion, construction, and...
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English and assessed relevant clinical endpoints, and we summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
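The sketch below illustrates the most common of the reviewed approaches, a Markov state-transition cohort model, accruing discounted quality-adjusted life years over a fixed horizon. The states, transition probabilities, utilities and discount rate are invented for illustration and are not from any myeloma study.

```python
# Sketch: Markov state-transition cohort model accruing discounted QALYs.
# All parameters are illustrative placeholders.

states = ("progression-free", "progressed", "dead")
P = [  # annual transition probabilities; each row sums to 1
    [0.70, 0.20, 0.10],
    [0.00, 0.75, 0.25],
    [0.00, 0.00, 1.00],
]
utility = (0.80, 0.60, 0.00)   # QALY weight per year spent in each state
discount = 0.03                # annual discount rate

cohort = [1.0, 0.0, 0.0]       # everyone starts progression-free
qalys = 0.0
for year in range(1, 31):      # 30-year horizon, one-year cycles
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]
    qalys += sum(c * u for c, u in zip(cohort, utility)) / (1 + discount) ** year

print(f"discounted QALYs per patient: {qalys:.2f}")
```

Comparing two treatment strategies then amounts to running the model with each strategy's transition probabilities and taking the difference in costs and QALYs.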
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl
2018-08-15
The work presented here aimed to develop an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng L-1 in sewage and 10 ng g-1 in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng L-1 and 900 ng g-1, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulation requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot-scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.
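A sketch of the validation arithmetic behind figures such as "relative recoveries above 90%" and "%RSD below 10%", assuming hypothetical replicate peak areas for a spiked sewage sample measured against a standard at the same nominal level:

```python
# Sketch: relative recovery and repeatability (%RSD) from replicate peak
# areas. All areas and acceptance criteria below are illustrative.
from statistics import mean, stdev

spiked_matrix = [9.1, 9.4, 8.8, 9.0, 9.3]   # areas, spiked sewage sample
standard      = [9.8, 10.1, 9.9, 10.0, 9.7] # areas, standard at same level

rel_recovery = 100 * mean(spiked_matrix) / mean(standard)
rsd = 100 * stdev(spiked_matrix) / mean(spiked_matrix)

print(f"relative recovery {rel_recovery:.0f}%  (criterion here: >90% for sewage)")
print(f"repeatability {rsd:.1f}% RSD           (criterion here: <10%)")
```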
Evaluation of analytical performance based on partial order methodology.
Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin
2015-01-01
Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects on the basis of their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the reference laboratory and (3) a classification due to the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
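The ordinal core of the approach can be sketched in a few lines: one laboratory sits above another in the partial order only if its whole data profile is at least as good, indicator by indicator; otherwise the two are incomparable. The indicator values below (|bias|, standard deviation, |skewness|; lower is better) are invented.

```python
# Sketch: pairwise dominance check underlying a Hasse diagram of laboratories.
# Profiles are (|bias from reference|, SD, |skewness|); values are invented.

labs = {
    "Lab1": (0.2, 0.5, 0.1),
    "Lab2": (0.3, 0.4, 0.2),
    "Lab3": (0.4, 0.6, 0.3),
}

def dominates(a, b):
    """True if profile a is <= profile b on every indicator (lower = better)."""
    return all(x <= y for x, y in zip(a, b))

for na, pa in labs.items():
    for nb, pb in labs.items():
        if na != nb and dominates(pa, pb):
            print(f"{na} <= {nb}: better or equal on all indicators")
# Lab1 and Lab2 trade bias against SD, so they remain incomparable:
# no single linear score is imposed on them.
```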
Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.
Malá, Zdena; Gebauer, Petr; Boček, Petr
2017-01-01
This review surveys papers on analytical ITP published from 2014 through the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view of its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies, including the use of porous media and new on-chip assays, where ITP often serves a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum, and the organic matter from which petroleum is derived, is composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring such geochemical data. Owing to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
Analytical Utility of Campylobacter Methodologies
USDA-ARS?s Scientific Manuscript database
The National Advisory Committee on Microbiological Criteria for Foods (NACMCF, or the Committee) was asked to address the analytical utility of Campylobacter methodologies in preparation for an upcoming United States Food Safety and Inspection Service (FSIS) baseline study to enumerate Campylobacter...
PDTRT special section: Methodological issues in personality disorder research.
Widiger, Thomas A
2017-10-01
Personality Disorders: Theory, Research, and Treatment includes a rolling, ongoing Special Section concerned with methodological issues in personality disorder research. This third installment of the series includes two articles. The first is by Brian Hicks, Angus Clark, and Emily Durbin: "Person-Centered Approaches in the Study of Personality Disorders." The second article is by Steve Balsis: "Item Response Theory Applications in Personality Disorder Research." Both articles should be excellent resources for future research, and certainly for manuscripts submitted to this journal that use these analytic tools. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Research Applications of Magnetic Resonance Spectroscopy (MRS) to Investigate Psychiatric Disorders
Dager, SR; Oskin, NM; Richards, TL; Posse, S
2009-01-01
Advances in magnetic resonance spectroscopy (MRS) methodology and related analytic strategies allow sophisticated testing of neurobiological models of disease pathology in psychiatric disorders. An overview of the principles underlying MRS, methodological considerations and investigative approaches is presented, followed by a review of recent research highlighting innovative approaches that apply MRS, in particular 1H MRS, to systematically investigate specific psychiatric disorders, including autism spectrum disorders, schizophrenia, panic disorder, major depression and bipolar disorder. PMID:19363431
Quality assurance for health and environmental chemistry: 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gautier, M.A.; Gladney, E.S.; Koski, N.L.
1991-10-01
This report documents the continuing quality assurance efforts of the Health and Environmental Chemistry Group (HSE-9) at the Los Alamos National Laboratory. The philosophy, methodology, computing resources, and laboratory information management system used by the quality assurance program to encompass the diversity of analytical chemistry practiced in the group are described. Included in the report are all quality assurance reference materials used, along with their certified or consensus concentrations, and all analytical chemistry quality assurance measurements made by HSE-9 during 1990.
4-Nonylphenol (NP) in food-contact materials: analytical methodology and occurrence.
Fernandes, A R; Rose, M; Charlton, C
2008-03-01
Nonylphenol is a recognized environmental contaminant, but it is unclear whether its occurrence in food arises only through environmental pathways or also during the processing or packaging of food, as there are reports indicating that materials in contact with food, such as rubber products and polyvinylchloride wraps, can contain nonylphenol. A review of the literature has highlighted the scarcity of robust analytical methodology and of data on the occurrence of nonylphenol in packaging materials. This paper describes a methodology for the determination of nonylphenol in a variety of packaging materials, including plastics, paper and rubber. The method uses either Soxhlet extraction or dissolution followed by solvent extraction (depending on the material type), followed by purification using adsorption chromatography. Procedures were internally standardized using 13C-labelled nonylphenol and the analytes were measured by gas chromatography-mass spectrometry. The method is validated, and data relating to quality parameters such as limits of detection, recovery, precision and linearity of measurement are provided. Analysis of a range of 25 food-contact materials found nonylphenol at concentrations of 64-287 microg g(-1) in some polystyrene and polyvinylchloride samples. Far lower concentrations (<0.03-1.4 microg g(-1)) were detected in the other materials. It is possible that occurrence at the higher levels has the potential for migration to food.
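A sketch of the internal standardization step: because a known amount of 13C-labelled nonylphenol is carried through extraction and clean-up, the native analyte is quantified from its area ratio to the label, correcting for losses along the way. The areas, response factor and sample mass below are illustrative.

```python
# Sketch: isotope-dilution quantitation against a 13C-labelled internal
# standard. All numeric inputs are illustrative placeholders.

def quantify(area_native: float, area_label: float,
             ng_label_added: float, rrf: float, sample_g: float) -> float:
    """Analyte concentration in ug/g from the native/label area ratio."""
    ng_native = (area_native / area_label) * ng_label_added / rrf
    return ng_native / sample_g / 1000.0   # ng/g -> ug/g

# rrf = relative response of native vs labelled analyte, from calibration
conc = quantify(area_native=5.2e5, area_label=2.0e5,
                ng_label_added=500.0, rrf=1.05, sample_g=0.01)
print(f"nonylphenol ~ {conc:.0f} ug/g")   # falls within the reported range
```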
Ho, Robin ST; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel YS; Chung, Vincent CH
2015-01-01
Background: Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. Aims: To assess the methodological quality of MAs on COPD treatments. Methods: A cross-sectional study on MAs of COPD trials. MAs published during 2000–2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Results: Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. Conclusions: The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods. PMID:25569783
NASA Technical Reports Server (NTRS)
Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.
2003-01-01
Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity-induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
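For orientation, the deterministic core of fatigue-crack-growth prediction can be sketched with a plain Paris-law integration for a through crack in a wide plate. The actual methodology uses small-crack theory and a plasticity-induced closure model, so the constants and the simple stress-intensity expression below are illustrative only.

```python
# Sketch: Paris-law cycle counting for a through crack in a wide plate,
# dK = S * sqrt(pi * a). Constants and geometry are illustrative.
import math

C, m = 1.0e-11, 3.0        # Paris constants (da/dN in m/cycle, dK in MPa*sqrt(m))
stress_range = 100.0       # MPa, remote cyclic stress range
a = 0.5e-3                 # initial half-crack length, m (e.g. at a rivet hole)
a_final = 10.0e-3          # half-crack length at which we stop, m

cycles = 0
while a < a_final:
    dk = stress_range * math.sqrt(math.pi * a)   # stress-intensity range
    a += C * dk ** m                             # growth over one cycle
    cycles += 1

print(f"~{cycles:,} cycles to grow from 0.5 mm to 10 mm")
```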
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are outlined. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer
Determination of tocopherols and sitosterols in seeds and nuts by QuEChERS-liquid chromatography.
Delgado-Zamarreño, M Milagros; Fernández-Prieto, Cristina; Bustamante-Rangel, Myriam; Pérez-Martín, Lara
2016-02-01
In the present work a simple, reliable and affordable sample treatment method for the simultaneous analysis of tocopherols and free phytosterols in nuts was developed. Analyte extraction was carried out using the QuEChERS methodology, and analyte separation and detection were accomplished using HPLC-DAD. The use of this methodology for the extraction of naturally occurring substances provides advantages such as speed, simplicity and ease of use. The parameters evaluated for the validation of the method included the linearity of the calibration plots, the detection and quantification limits, repeatability, reproducibility and recovery. The proposed method was successfully applied to the analysis of tocopherols and free phytosterols in samples of almonds, cashew nuts, hazelnuts, peanuts, tiger nuts, sunflower seeds and pistachios. Copyright © 2015 Elsevier Ltd. All rights reserved.
Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)
NASA Technical Reports Server (NTRS)
Nickerson, G. R.; Dang, L. D.; Coats, D. E.
1985-01-01
The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.
Analytical and Numerical Results for an Adhesively Bonded Joint Subjected to Pure Bending
NASA Technical Reports Server (NTRS)
Smeltzer, Stanley S., III; Lundgren, Eric
2006-01-01
A one-dimensional, semi-analytical methodology that was previously developed for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives that exhibit inelastic material behavior is further verified in the present paper. A summary of the first-order differential equations and the applied joint loading used to determine the adhesive response is also presented. The method was previously verified against a variety of single-lap joint configurations from the literature that subjected the joints to cases of axial tension and pure bending. The finite element analysis software ABAQUS was used to further verify the semi-analytical method, using the same joint configuration and applied bending load presented in a study by Yang. Linear static ABAQUS results are presented for two models, one with a coarse and one with a fine element mesh, that were used to verify convergence of the finite element analyses. Close agreement between the finite element results and the semi-analytical methodology was found for both the shear and normal stress responses of the adhesive bondline. Thus, the semi-analytical methodology was successfully verified using the ABAQUS finite element software and a single-lap joint configuration subjected to pure bending.
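As a sketch of the kind of closed-form bondline solution such first-order ODE systems reduce to in the simplest case, the classical Volkersen shear-lag result for identical elastic adherends under axial load is computed below. The paper's method generalises this to anisotropic, inelastic adherends and pure bending; all dimensions, moduli and the load here are illustrative.

```python
# Sketch: Volkersen shear-lag distribution of adhesive shear stress in a
# single-lap joint with identical elastic adherends under axial load.
import math

G_a, t_a = 1.0e9, 0.2e-3        # adhesive shear modulus (Pa), bondline (m)
E, t = 70.0e9, 2.0e-3           # adherend modulus (Pa) and thickness (m)
c, b = 12.5e-3, 25.0e-3         # half overlap length and joint width (m)
P = 5.0e3                       # axial load (N)

omega = math.sqrt((G_a / t_a) * 2.0 / (E * t))    # shear-lag parameter, 1/m
tau_avg = P / (2.0 * c * b)                       # average shear stress
for xi in (0.0, 0.5, 1.0):                        # position x/c on half overlap
    tau = (P * omega / (2.0 * b)) * math.cosh(omega * xi * c) / math.sinh(omega * c)
    print(f"x/c = {xi:.1f}: tau = {tau / 1e6:5.1f} MPa "
          f"(average {tau_avg / 1e6:.1f} MPa)")
# The peak at the overlap ends is several times the average, which is why
# bondline stress distributions, not average stresses, govern joint strength.
```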
Watts, R R; Langone, J J; Knight, G J; Lewtas, J
1990-01-01
A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812
Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D
2015-09-01
Pesticides are frequently responsible for human poisonings, and information on the substance involved is often lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow the identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help to minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology presented here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
Force 2025 and Beyond Strategic Force Design Analytic Model
2017-01-12
Figure 1 depicts the core ideas of the force design model, and Figure 2 shows an overview of the methodology ... the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. Our research develops a methodology for ... designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and the resulting data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or zero solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS covering its formats (on- and off-line), sorbents and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Big data analytics to improve cardiovascular care: promise and challenges.
Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M
2016-06-01
The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.
Cabrera-Barona, Pablo; Ghorbanzadeh, Omid
2018-01-16
Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.
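The AHP weighting step referred to above can be sketched as follows: indicator weights are the normalised principal eigenvector of a pairwise-comparison matrix on Saaty's 1-9 scale. The 3x3 matrix and indicator names below are illustrative, not the matrix actually used for Quito.

```python
# Sketch: AHP priority weights as the normalised principal eigenvector of a
# pairwise-comparison matrix. The matrix entries are illustrative.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # e.g. housing vs education vs health access
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalised indicator weights

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
print("weights:", np.round(w, 3), " CI:", round(ci, 3))
```

The interval-AHP variant discussed in the paper replaces the point judgments in the matrix with intervals, so the resulting weights carry the spread of expert opinion rather than a single consensus value.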
Big–deep–smart data in imaging for guiding materials design
Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.
2015-09-23
Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.
Crovelli, R.A.
1988-01-01
The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, the basic geologic assumptions of the area, the type of available data, the time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis, and the corresponding quantitative methodologies usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) include a variety of geologic models, (2) use an analytic methodology instead of Monte Carlo simulation, (3) possess the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) run quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
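The contrast with Monte Carlo simulation can be sketched simply: when play estimates are probabilistically independent, aggregate means and variances follow in closed form, with no sampling error. The play-level figures below are invented.

```python
# Sketch: analytic (closed-form) aggregation of independent play estimates,
# in place of Monte Carlo simulation. Play means/SDs are invented.

plays = {          # mean and standard deviation of undiscovered oil, MMbbl
    "Play A": (120.0, 40.0),
    "Play B": (80.0, 30.0),
    "Play C": (45.0, 20.0),
}

total_mean = sum(m for m, s in plays.values())
# Independence: variances add. Positive correlation between plays would
# require adding 2*covariance terms as well.
total_var = sum(s * s for m, s in plays.values())

print(f"aggregate: {total_mean:.0f} MMbbl mean, sd {total_var ** 0.5:.0f} MMbbl")
```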
ERIC Educational Resources Information Center
Azevedo, Roger
2015-01-01
Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…
NASA Astrophysics Data System (ADS)
Penkov, V. B.; Levina, L. V.; Novikova, O. S.; Shulmin, A. S.
2018-03-01
Herein we propose a methodology for constructing a fully parametric analytical solution to problems in elastostatics, based on state-of-the-art computing facilities that support computer algebra. The methodology includes: direct and reverse application of the P-Theorem; methods of accounting for the physical properties of media; and accounting for variable geometrical parameters of bodies, parameters of boundary states, independent parameters of volume forces, and remote stress factors. An efficient tool for the task is the method of boundary states, originally designed for computer algebra and based on the isomorphism of the Hilbert spaces of internal states and boundary states of bodies. We obtained fully parametric solutions of basic problems featuring a ball with a nonconcentric spherical cavity, a ball with a near-surface flaw, and an unbounded medium with two spherical cavities.
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in the search for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge that environmental problems pose for science is the formation of eco-analytical reasoning and the monitoring of global problems common to all humanity. The aim is thus to find the optimal trajectory of industrial development, one that prevents irreversible problems in the biosphere that could halt the progress of civilization.
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
A methodology to enhance electromagnetic compatibility in joint military operations
NASA Astrophysics Data System (ADS)
Buckellew, William R.
The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
Analytical group decision making in natural resources: methodology and application
Daniel L. Schmoldt; David L. Peterson
2000-01-01
Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups...
Three Interaction Patterns on Asynchronous Online Discussion Behaviours: A Methodological Comparison
ERIC Educational Resources Information Center
Jo, I.; Park, Y.; Lee, H.
2017-01-01
An asynchronous online discussion (AOD) is one instructional format that facilitates student-centered learning. Within the wealth of AOD research, this study evaluated how students' behavior in AODs influences their academic outcomes. This case study compared differential analytic methods, including web log mining and social network analysis…
Making sense of grounded theory in medical education.
Kennedy, Tara J T; Lingard, Lorelei A
2006-02-01
Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.
Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.
Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M
2016-09-01
Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.
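The following is a minimal sketch of the GD idea using standard Python plotting; the simulated data and the particular panels are illustrative, not the authors' toolset at www.graphicaldescriptives.org.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = rng.normal(50, 10, 300)          # hypothetical predictor
y = 0.5 * x + rng.gamma(2, 4, 300)   # skewed errors violate normality

fig, axes = plt.subplots(1, 3, figsize=(10, 3))
axes[0].hist(x, bins=30)                      # distribution check
axes[0].set_title("Distribution of x")
axes[1].scatter(x, y, s=8, alpha=0.5)         # bivariate relation check
axes[1].set_title("y vs. x")
resid = y - np.poly1d(np.polyfit(x, y, 1))(x)
axes[2].hist(resid, bins=30)                  # residual check before OLS
axes[2].set_title("OLS residuals")
fig.tight_layout()
plt.show()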
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This report provides a detailed summary of the activities carried out to sample groundwater at Waste Area Grouping (WAG) 6. The analytical results for samples collected during Phase 1, Activity 2 of the WAG 6 Resource Conservation and Recovery Act Facility Investigation (RFI) are also presented. In addition, analytical results for Phase 1 activity sampling events for which data were not previously reported are included in this TM. A summary of the groundwater sampling activities at WAG 6, to date, is given in the Introduction. The Methodology section describes the sampling procedures and analytical parameters. Six attachments are included. Attachments 1 and 2 provide analytical results for selected RFI groundwater samples and ORNL sampling events. Attachment 3 provides a summary of the contaminants detected in each well sampled for all sampling events conducted at WAG 6. Bechtel National Inc. (BNI)/IT Corporation Contract Laboratory (IT) RFI analytical methods and detection limits are given in Attachment 4. Attachment 5 provides the Oak Ridge National Laboratory (ORNL)/Analytical Chemistry Division (ACD) analytical methods and detection limits and Resource Conservation and Recovery Act (RCRA) quarterly compliance monitoring (1988-1989). Attachment 6 provides ORNL/ACD groundwater analytical methods and detection limits (for the 1990 RCRA semi-annual compliance monitoring).
RE-EVALUATION OF APPLICABILITY OF AGENCY SAMPLE HOLDING TIMES
Holding times are the length of time a sample can be stored after collection and prior to analysis without significantly affecting the analytical results. Holding times vary with the analyte, sample matrix, and analytical methodology used to quantify the analyte's concentration. ...
Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T
2017-05-01
Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use, and implied safety, of limited exposures to test articles. Use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometry (AMS), positron emission tomography (PET), and liquid chromatography-tandem mass spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade demonstrated the reliability of extrapolation of sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilization of combinations of these analytic techniques increases the versatility of study designs and the power of data obtained.
Morales, Daniel R.; Pacurariu, Alexandra; Kurz, Xavier
2017-01-01
Aims: Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. Methods: We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. Results: From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Conclusion: Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. PMID:29105853
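Interrupted time series regression, the most common analytical approach found by the review, can be sketched as a segmented regression with level-change and trend-change terms. The data, intervention date, and coefficients below are simulated; a real analysis would also address autocorrelation and seasonality.

import numpy as np

rng = np.random.default_rng(0)
t = np.arange(48)                      # months of observation
t0 = 24                                # month of the regulatory intervention
post = (t >= t0).astype(float)
y = 100 + 0.5 * t - 8 * post - 0.6 * (t - t0) * post + rng.normal(0, 2, t.size)

# Design matrix: intercept, pre-existing trend, level change at t0,
# and trend change after t0.
X = np.column_stack([np.ones_like(t, dtype=float), t, post, (t - t0) * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"level change: {beta[2]:.2f}, trend change: {beta[3]:.2f}")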
The added value of thorough economic evaluation of telemedicine networks.
Le Goff-Pronost, Myriam; Sicotte, Claude
2010-02-01
This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
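A minimal sketch of the core NPV and break-even calculations in such a framework; the cash flows and the 5% discount rate are invented for illustration, and the stakeholder and externality analyses are not captured here.

def npv(rate, cashflows):
    """Net present value of yearly cash flows, cashflows[0] at year 0."""
    return sum(cf / (1 + rate) ** yr for yr, cf in enumerate(cashflows))

# Hypothetical telemedicine network: up-front equipment cost, then yearly
# net savings (avoided transfers minus operating costs).
flows = [-250_000, 40_000, 55_000, 60_000, 60_000]
print(f"NPV over 4 years: {npv(0.05, flows):,.0f}")   # negative -> not yet viable

# Break-even horizon: extend use and find the first year NPV turns positive.
extended = flows + [60_000] * 6
for horizon in range(1, len(extended)):
    if npv(0.05, extended[:horizon + 1]) > 0:
        print(f"break-even in year {horizon}")
        break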
System engineering toolbox for design-oriented engineers
NASA Technical Reports Server (NTRS)
Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.
1994-01-01
This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.
Johnson, Blair T; Low, Robert E; MacDonald, Hayley V
2015-01-01
Systematic reviews now routinely assess methodological quality to gauge the validity of the included studies and of the synthesis as a whole. Although trends from higher quality studies should be clearer, it is uncertain how often meta-analyses incorporate methodological quality in models of study results either as predictors, or, more interestingly, in interactions with theoretical moderators. We survey 200 meta-analyses in three health promotion domains to examine when and how meta-analyses incorporate methodological quality. Although methodological quality assessments commonly appear in contemporary meta-analyses (usually as scales), they are rarely incorporated in analyses, and still more rarely analysed in interaction with theoretical determinants of the success of health promotions. The few meta-analyses (2.5%) that did include such an interaction analysis showed that moderator results remained significant in higher quality studies or were present only among higher quality studies. We describe how to model quality interactively with theoretically derived moderators and discuss strengths and weaknesses of this approach and in relation to current meta-analytic practice. In large literatures exhibiting heterogeneous effects, meta-analyses can incorporate methodological quality and generate conclusions that enable greater confidence not only about the substantive phenomenon but also about the role that methodological quality itself plays.
Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C.; Gross, Alden L.; Hofer, Scott M.; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M. Maria; Dufouil, Carole
2015-01-01
Clinical and population research on dementia and related neurologic conditions, including Alzheimer’s disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on “best practices.” We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. PMID:26397878
DOE Office of Scientific and Technical Information (OSTI.GOV)
Senesac, Larry R; Datskos, Panos G; Sepaniak, Michael J
2006-01-01
In the present work, we have performed analyte species and concentration identification using an array of ten differentially functionalized microcantilevers coupled with a back-propagation artificial neural network pattern recognition algorithm. The array consists of ten nanostructured silicon microcantilevers functionalized by polymeric and gas chromatography phases and macrocyclic receptors as spatially dense, differentially responding sensing layers for identification and quantitation of individual analyte(s) and their binary mixtures. The array response (i.e. cantilever bending) to analyte vapor was measured by an optical readout scheme and the responses were recorded for a selection of individual analytes as well as several binary mixtures. An artificial neural network (ANN) was designed and trained to recognize not only the individual analytes and binary mixtures, but also to determine the concentration of individual components in a mixture. To the best of our knowledge, ANNs have not been applied to microcantilever array responses previously to determine concentrations of individual analytes. The trained ANN correctly identified the eleven test analyte(s) as individual components, most with probabilities greater than 97%, whereas it did not misidentify an unknown (untrained) analyte. Demonstrated unique aspects of this work include an ability to measure binary mixtures and provide both qualitative (identification) and quantitative (concentration) information with array-ANN-based sensor methodologies.
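A rough sketch of the pattern recognition step, substituting scikit-learn's backpropagation-trained MLPRegressor for the authors' custom ANN and simulating the ten-cantilever responses as a noisy linear map of a binary mixture; the sensitivities and noise levels are invented.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n_cantilevers, n_analytes = 10, 2
S = rng.uniform(0.2, 1.0, (n_cantilevers, n_analytes))  # hypothetical sensitivities

# Simulated bending responses of the 10-element array to binary mixtures.
conc = rng.uniform(0, 100, (400, n_analytes))            # ppm, two analytes
resp = conc @ S.T + rng.normal(0, 0.5, (400, n_cantilevers))

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(resp[:300], conc[:300])                          # train on 300 mixtures
pred = net.predict(resp[300:])                           # predict held-out ones
rmse = np.sqrt(np.mean((pred - conc[300:]) ** 2))
print(f"held-out concentration RMSE: {rmse:.1f} ppm")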
Analytical redundancy and the design of robust failure detection systems
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The Failure Detection and Identification (FDI) process is viewed as consisting of two stages: residual generation and decision making. It is argued that a robust FDI system can be achieved by designing a robust residual generation process. Analytical redundancy, the basis for residual generation, is characterized in terms of a parity space. Using the concept of parity relations, residuals can be generated in a number of ways, and the design of a robust residual generation process can be formulated as a minimax optimization problem. An example is included to illustrate this design methodology. Previously announced in STAR as N83-20653
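The parity-relation construction can be sketched for a small discrete-time model: a parity vector in the left null space of the stacked observability matrix yields a residual that is zero for healthy data and nonzero under a sensor fault. The model, window length, and fault below are illustrative, not from the paper.

import numpy as np
from scipy.linalg import null_space

# Hypothetical discrete-time model (2 states, 1 input, 1 output).
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
d = 0.0                                   # direct feedthrough

s = 2                                     # parity window length
O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(s + 1)])
H = np.zeros((s + 1, s + 1))              # stacked input-to-output map
for i in range(s + 1):
    H[i, i] = d
    for j in range(i):
        H[i, j] = (C @ np.linalg.matrix_power(A, i - j - 1) @ B).item()

v = null_space(O.T)[:, 0]                 # parity vector: v @ O = 0

# Simulate healthy operation, then a constant sensor bias from step 30.
x, ys, us = np.zeros(2), [], []
for k in range(60):
    u = np.sin(0.2 * k)
    ys.append((C @ x).item() + d * u + (0.5 if k >= 30 else 0.0))
    us.append(u)
    x = A @ x + B.ravel() * u

r = [v @ (np.array(ys[k:k + s + 1]) - H @ np.array(us[k:k + s + 1]))
     for k in range(60 - s)]
print(f"|residual| healthy: {abs(r[10]):.1e}, faulted: {abs(r[40]):.1e}")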
Comparative study of solar optics for paraboloidal concentrators
NASA Technical Reports Server (NTRS)
Wen, L.; Poon, P.; Carley, W.; Huang, L.
1979-01-01
Different analytical methods for computing the flux distribution on the focal plane of a paraboloidal solar concentrator are reviewed. An analytical solution in algebraic form is also derived for an idealized model. The effects resulting from using different assumptions in the definition of optical parameters used in these methodologies are compared and discussed in detail. These parameters include solar irradiance distribution (limb darkening and circumsolar), reflector surface specular spreading, surface slope error, and concentrator pointing inaccuracy. The type of computational method selected for use depends on the maturity of the design and the data available at the time the analysis is made.
Analytical Chemistry Developmental Work Using a 243Am Solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.
2015-02-24
This project seeks to reestablish our analytical capability to characterize Am bulk material and develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and developing and testing a methodology which can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.
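The isotope dilution principle behind the 243Am spike reduces, in the simplest case, to one line of arithmetic. The sketch below assumes an isotopically pure spike and an essentially mono-isotopic sample, which is a simplification of the general IDMS equation used in practice; the numbers are invented.

# Minimal isotope-dilution sketch (illustrative numbers, not from the report).
# Assumes an isotopically pure 243Am spike and a sample whose Am is
# essentially 241Am, so the measured 241/243 ratio gives the sample amount.

n_spike_243 = 1.00e-9          # mol of 243Am added (known from the spike)
ratio_241_243 = 4.23           # 241Am/243Am atom ratio measured by TIMS

n_sample_241 = ratio_241_243 * n_spike_243
print(f"241Am in sample: {n_sample_241:.2e} mol")

# With impure spikes or mixed isotopics, the general IDMS equation with
# both materials' isotope abundances must be used instead.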
USDA-ARS GRACEnet Project Protocols, Chapter 3. Chamber-based trace gas flux measurements
USDA-ARS?s Scientific Manuscript database
This protocol addresses N2O, CO2 and CH4 flux measurement by soil chamber methodology. The reactivities of other gases of interest, such as NOx, O3, CO, and NH3, will require different chambers and associated instrumentation. Carbon dioxide is included as an analyte with this protocol; however, when p...
ERIC Educational Resources Information Center
Rossholt, Nina
2009-01-01
This article discusses theoretical, methodological and analytical strategies for researching the material subject. The discussion relates to discursive practices in a preschool setting with children of one and two years of age, where the material subject includes both bodily and discursive practices. Using critical ethnography research, the author…
Quantifying construction and demolition waste: an analytical review.
Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen
2014-09-01
Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level, and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
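As an illustration of the waste generation rate method, one of the six identified categories, the sketch below multiplies an activity's gross floor area by a per-area generation rate. The rates are placeholders, not values endorsed by the review.

# Sketch of the waste-generation-rate method: waste = activity size x unit rate.
# Rates below (kg per m2 of gross floor area) are placeholders only.

RATES_KG_PER_M2 = {"new construction": 40.0, "demolition": 1300.0}

def estimate_waste_tonnes(activity: str, gross_floor_area_m2: float) -> float:
    return RATES_KG_PER_M2[activity] * gross_floor_area_m2 / 1000.0

print(estimate_waste_tonnes("new construction", 12_000))  # 480.0 t
print(estimate_waste_tonnes("demolition", 3_500))         # 4550.0 t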
Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P
2010-10-22
A multi-residue methodology based on solid phase extraction followed by gas chromatography-tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the greater concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
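The weighted least squares step can be sketched directly: with heteroscedastic calibration data, weighting each point by 1/x^2 (a common empirical choice; the paper's exact weighting scheme may differ) prevents the high standards from dominating the fit. The data below are invented.

import numpy as np

# Hypothetical calibration data: response variance grows with concentration,
# so ordinary least squares would be dominated by the high standards.
conc = np.array([1, 5, 10, 50, 100, 500], dtype=float)       # ng/L
resp = np.array([0.9, 5.3, 9.6, 52.0, 97.0, 510.0])          # peak area (a.u.)

w = 1.0 / conc**2                           # empirical 1/x^2 weighting
W = np.diag(w)
X = np.column_stack([np.ones_like(conc), conc])
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ resp)          # weighted LS fit
print(f"intercept {beta[0]:.3f}, slope {beta[1]:.4f}")

# Back-calculate the lowest standard: accuracy at the low end improves vs OLS.
print((resp[0] - beta[0]) / beta[1])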
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached within which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are scarce. For APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a plant-specific procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique has the demerit of producing extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel.
When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
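The analysis side of the combined technique reduces to an allocation-and-sum check against the analytical response time, as in this sketch; the component names, times, and the requirement are illustrative, not plant design values.

# Sketch of the analysis technique: allocate a response time to each component
# on the critical signal path and compare the sum with the analytical limit.

requirement_s = 1.10   # assumed response time from the safety analysis
path = {
    "pressure transmitter (ramp-tested)": 0.50,
    "signal processing equipment (step-tested)": 0.35,
    "final actuation device (step-tested)": 0.15,
}

total = sum(path.values())
margin = requirement_s - total
print(f"total {total:.2f} s vs requirement {requirement_s:.2f} s "
      f"-> margin {margin:+.2f} s ({'OK' if margin >= 0 else 'FAIL'})")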
Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica
2016-06-01
To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Low recovery yielding methods could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its main three metabolites, and chlorpyrifos in increasingly complex matrices where the bioconverter basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. From the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: the endosulfan isomers (α and β) and its metabolites (endosulfan sulfate, ether and diol) as well as chlorpyrifos. In the first laboratory evaluation of these biobeds endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.
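The validation arithmetic used throughout (percent recovery and relative standard deviation against acceptance windows such as 70-120% recovery and RSD below 20%) can be sketched as follows; the replicate values are invented.

import numpy as np

def recovery_and_rsd(measured, spiked_level):
    """Percent recovery and relative standard deviation for spiked replicates."""
    measured = np.asarray(measured, dtype=float)
    rec = 100.0 * measured.mean() / spiked_level
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    return rec, rsd

# Hypothetical replicate results for endosulfan sulfate spiked at 1 mg/kg.
rec, rsd = recovery_and_rsd([0.81, 0.88, 0.84, 0.90, 0.86], spiked_level=1.0)
print(f"recovery {rec:.0f}%, RSD {rsd:.1f}%")   # e.g. acceptance: 70-120%, <20%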
How effects on health equity are assessed in systematic reviews of interventions.
Welch, Vivian; Tugwell, Peter; Petticrew, Mark; de Montigny, Joanne; Ueffing, Erin; Kristjansson, Betsy; McGowan, Jessie; Benkhalti Jandu, Maria; Wells, George A; Brand, Kevin; Smylie, Janet
2010-12-08
Enhancing health equity has now achieved international political importance with endorsement from the World Health Assembly in 2009. The failure of systematic reviews to consider effects on health equity is cited by decision-makers as a limitation to their ability to inform policy and program decisions. To systematically review methods to assess effects on health equity in systematic reviews of effectiveness. We searched the following databases up to July 2, 2010: MEDLINE, PsychINFO, the Cochrane Methodology Register, CINAHL, Education Resources Information Center, Education Abstracts, Criminal Justice Abstracts, Index to Legal Periodicals, PAIS International, Social Services Abstracts, Sociological Abstracts, Digital Dissertations and the Health Technology Assessment Database. We searched SCOPUS on October 7, 2010 to identify articles that cited any of the included studies. We included empirical studies of cohorts of systematic reviews that assessed methods for measuring effects on health inequalities. Data were extracted using a pre-tested form by two independent reviewers. Risk of bias was appraised for included studies according to the potential for bias in selection and detection of systematic reviews. Thirty-four methodological studies were included. The methods used by these included studies were: 1) targeted approaches (n=22); 2) gap approaches (n=12); and 3) the gradient approach (n=1). Gender or sex was assessed in eight of the 34 studies, socioeconomic status in ten studies, race/ethnicity in seven studies, age in seven studies, and low- and middle-income countries in 14 studies; two studies assessed multiple factors across which health inequity may exist. Only three studies provided a definition of health equity. Four methodological approaches to assessing effects on health equity were identified: 1) descriptive assessment of reporting and analysis in systematic reviews (all 34 studies used a type of descriptive method); 2) descriptive assessment of reporting and analysis in original trials (12/34 studies); 3) analytic approaches (10/34 studies); and 4) applicability assessment (11/34 studies). Neither the analytic nor the applicability approaches were reported transparently or in sufficient detail to judge their credibility. There is a need for improvement in conceptual clarity about the definition of health equity, in describing sufficient detail about analytic approaches (including subgroup analyses), and in transparent reporting of the judgments required for applicability assessments in order to assess and report effects on health equity in systematic reviews.
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not appear suitable for these analytes are discussed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
Summary: This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the utilization of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775
NASA Technical Reports Server (NTRS)
Pieper, Jerry L.; Walker, Richard E.
1993-01-01
During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high-performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.
Zaslavsky, Oleg; Cochrane, Barbara B; Herting, Jerald R; Thompson, Hilaire J; Woods, Nancy F; Lacroix, Andrea
2014-02-01
Despite the variety of available analytic methods, longitudinal research in nursing has been dominated by use of a variable-centered analytic approach. The purpose of this article is to present the utility of person-centered methodology using a large cohort of American women 65 and older enrolled in the Women's Health Initiative Clinical Trial (N = 19,891). Four distinct trajectories of energy/fatigue scores were identified. Levels of fatigue were closely linked to age, socio-demographic factors, comorbidities, health behaviors, and poor sleep quality. These findings were consistent regardless of the methodological framework. Finally, we demonstrated that energy/fatigue levels predicted future hospitalization in non-disabled elderly. Person-centered methods provide unique opportunities to explore and statistically model the effects of longitudinal heterogeneity within a population. © 2013 Wiley Periodicals, Inc.
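A person-centered analysis groups whole individual trajectories rather than fitting one population-average curve. As a hedged stand-in for the group-based trajectory models typically used in such work, the sketch below clusters simulated energy/fatigue trajectories with k-means; the sample, time points, and group structure are invented.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n, t = 600, 6                              # 600 participants, 6 assessments
base = rng.choice([70, 55, 40, 25], n)     # four latent energy levels
traj = base[:, None] + rng.normal(0, 5, (n, t))   # individual trajectories

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(traj)
for g in range(4):
    print(f"group {g}: n={np.sum(labels == g)}, "
          f"mean energy {traj[labels == g].mean():.0f}")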
Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)
NASA Technical Reports Server (NTRS)
1983-01-01
The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of database management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.
Ruiz-Cordell, Karyn D; Joubin, Kathy; Haimowitz, Steven
2016-01-01
The goal of this study was to add a predictive modeling approach to the meta-analysis of continuing medical education curricula to determine whether this technique can be used to better understand clinical decision making. Using the education of rheumatologists on rheumatoid arthritis management as a model, this study demonstrates how the combined methodology has the ability to not only characterize learning gaps but also identify those proficiency areas that have the greatest impact on clinical behavior. The meta-analysis included seven curricula with 25 activities. Learners who identified as rheumatologists were evaluated across multiple learning domains, using a uniform methodology to characterize learning gains and gaps. A performance composite variable (called the treatment individualization and optimization score) was then established as a target upon which predictive analytics were conducted. Significant predictors of the target included items related to the knowledge of rheumatologists and confidence concerning 1) treatment guidelines and 2) tests that measure disease activity. In addition, a striking demographic predictor related to geographic practice setting was also identified. The results demonstrate the power of advanced analytics to identify key predictors that influence clinical behaviors. Furthermore, the ability to provide an expected magnitude of change if these predictors are addressed has the potential to substantially refine educational priorities to those drivers that, if targeted, will most effectively overcome clinical barriers and lead to the greatest success in achieving treatment goals.
Goedecke, Thomas; Morales, Daniel R; Pacurariu, Alexandra; Kurz, Xavier
2018-03-01
Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. © 2017 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.
Set this house on fire: the self-analysis of Raymond Carver.
Tutter, Adele
2011-10-01
The convergence of features of Raymond Carver's short-story oeuvre and of psychoanalytic methodology suggests that Carver's writing served as the fulcrum and focus of a self-analytic experience. Within this model, his stories function as container and mirror of myriad aspects of the writer's self. Tracing the developmental arc of the contextual meanings of one motif--fire--through six stories and their ur-texts demonstrates gains comparable to certain analytic goals, including enhanced integration, accountability, and self-awareness. Over time, Carver's narratives of rage, impotence, and despair give way to a new story: of mourning, forgiveness, and the rekindling of hope.
NASA Technical Reports Server (NTRS)
Giles, G. L.; Rogers, J. L., Jr.
1982-01-01
The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.
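The advantage over finite differences can be seen on a one-element example: for an axial rod the displacement derivative with respect to cross-sectional area is available in closed form, while the finite-difference estimate depends on the step size. The rod and load values are arbitrary.

import numpy as np

# Analytic vs. finite-difference design sensitivity for an axial rod:
# tip displacement u(A) = F*L/(E*A), so du/dA = -F*L/(E*A**2).
F, L, E, A = 1.0e4, 2.0, 70e9, 4.0e-4     # N, m, Pa, m^2 (illustrative)

u = lambda a: F * L / (E * a)
exact = -F * L / (E * A**2)

for h in (1e-5 * A, 1e-2 * A):            # step size as a fraction of A
    fd = (u(A + h) - u(A)) / h
    print(f"step {h:.1e}: FD {fd:.6e}, analytic {exact:.6e}")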
(U) Analytic First and Second Derivatives of the Uncollided Leakage for a Homogeneous Sphere
DOE Office of Scientific and Technical Information (OSTI.GOV)
Favorite, Jeffrey A.
2017-04-26
The second-order adjoint sensitivity analysis methodology (2nd-ASAM), developed by Cacuci, has been applied to derive second derivatives of a response with respect to input parameters for uncollided particles in an inhomogeneous transport problem. In this memo, we present an analytic benchmark for verifying the derivatives of the 2nd-ASAM. The problem is a homogeneous sphere, and the response is the uncollided total leakage. This memo does not repeat the formulas given in Ref. 2. We are preparing a journal article that will include the derivation of Ref. 2 and the benchmark of this memo.
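Since the memo's formulas are not reproduced here, the sketch below uses the simplest illustrative stand-in, a central point source, for which the uncollided leakage is exp(-Sigma*R), and lets sympy produce the first and second derivatives that such a benchmark verifies.

import sympy as sp

# Illustrative stand-in (not the memo's actual response): for a point source
# at the center of a homogeneous sphere of radius R with total cross section
# Sigma, the uncollided leakage is exp(-Sigma*R).
Sigma, R = sp.symbols("Sigma R", positive=True)
leak = sp.exp(-Sigma * R)

dL_dSigma = sp.diff(leak, Sigma)            # first derivative
d2L_dSigma2 = sp.diff(leak, Sigma, 2)       # second derivative
d2L_dSigmadR = sp.diff(leak, Sigma, R)      # mixed second derivative
print(dL_dSigma, d2L_dSigma2, d2L_dSigmadR, sep="\n")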
SociAL Sensor Analytics: Measuring Phenomenology at Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.
The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).
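One minimal time-series "sensor" in this spirit is a trailing-baseline burst detector over post volumes; the counts, window, and threshold below are simulated and illustrative, far simpler than the paper's models.

import numpy as np
import pandas as pd

# Treat hourly post counts on a topic as a sensor time series and flag bursts
# that exceed the trailing baseline.
rng = np.random.default_rng(5)
counts = pd.Series(rng.poisson(50, 24 * 14))    # two weeks of hourly counts
counts.iloc[200:204] += 300                     # injected "event"

roll = counts.rolling(48, min_periods=48)
z = (counts - roll.mean().shift(1)) / roll.std(ddof=0).shift(1)
events = z[z > 4].index.tolist()
print(events)    # hours flagged as anomalous bursts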
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
Analytical Chemistry Division annual progress report for period ending November 30, 1977
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1978-03-01
Activities for the year are summarized in sections on analytical methodology, mass and emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)
CREATE-IP and CREATE-V: Data and Services Update
NASA Astrophysics Data System (ADS)
Carriere, L.; Potter, G. L.; Hertz, J.; Peters, J.; Maxwell, T. P.; Strong, S.; Shute, J.; Shen, Y.; Duffy, D.
2017-12-01
The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center and the Earth System Grid Federation (ESGF) are working together to build a uniform environment for the comparative study and use of a group of reanalysis datasets of particular importance to the research community. This effort is called the Collaborative REAnalysis Technical Environment (CREATE) and it contains two components: the CREATE-Intercomparison Project (CREATE-IP) and CREATE-V. This year's efforts included generating and publishing an atmospheric reanalysis ensemble mean and spread and improving the analytics available through CREATE-V. Related activities included adding access to subsets of the reanalysis data through ArcGIS and expanding the visualization tool to GMAO forecast data. This poster will present the access mechanisms for these data and use cases, including example Jupyter Notebook code. The reanalysis ensemble was generated using two methods: first, using standard Python tools for regridding, extracting levels, and creating the ensemble mean and spread on a virtual server in the NCCS environment; and second, using a new analytics software suite, the Earth Data Analytics Services (EDAS), coupled with a high-performance Data Analytics and Storage System (DASS) developed at the NCCS. Results were compared to validate the EDAS methodologies, and the results, including time to process, will be presented. The ensemble includes selected 6-hourly and monthly variables, regridded to 1.25 degrees, with 24 common levels used for the 3D variables. Use cases for the new data and services will be presented, including the use of EDAS for the backend analytics on CREATE-V, the use of the GMAO forecast aerosol and cloud data in CREATE-V, and the ability to connect CREATE-V data to NCCS ArcGIS services.
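After regridding to the common 1.25-degree grid, the ensemble mean and spread are pointwise statistics over the reanalysis axis, as in this sketch (random arrays stand in for the regridded fields; the actual pipeline used standard Python tools and EDAS).

import numpy as np

rng = np.random.default_rng(2)
n_reanalyses, nlat, nlon = 4, 145, 288          # 1.25-degree global grid
t2m = 288 + rng.normal(0, 1.5, (n_reanalyses, nlat, nlon))   # stand-in fields, K

ens_mean = t2m.mean(axis=0)                     # pointwise ensemble mean
ens_spread = t2m.std(axis=0, ddof=1)            # inter-reanalysis spread
print(ens_mean.shape, float(ens_spread.mean()))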
76 FR 55804 - Dicamba; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
... Considerations. A. Analytical Enforcement Methodology. Adequate enforcement methodologies, Methods I and II (gas chromatography with electron capture detection, GC/ECD), are available to enforce the tolerance expression. The...
A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales
Ayton, Gary S.; Voth, Gregory A.
2009-01-01
A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic because one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid-of-revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and time scales. PMID:19281167
Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti
NASA Technical Reports Server (NTRS)
Johnson, Jerry
1992-01-01
The Piaggio P-180 Avanti, a twin pusher-prop, nine-passenger business aircraft, was certified in 1990 to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerance methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulating bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implications of damage tolerance for static and fatigue testing are discussed. Good correlation between finite element solutions and experimental test data was observed.
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included, of which 50% (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
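The basic VOI quantity, the expected value of perfect information, is the gap between deciding after and before uncertainty is resolved. Below is a Monte Carlo sketch with an invented two-option net-benefit model; EVSI and population scaling are omitted.

import numpy as np

rng = np.random.default_rng(11)
n = 100_000
wtp = 20_000                                            # willingness to pay/QALY

d_qaly = rng.normal(0.10, 0.05, n)                      # incremental QALYs
d_cost = rng.normal(1_500, 400, n)                      # incremental cost
nb_new = wtp * d_qaly - d_cost                          # incremental net benefit
nb = np.column_stack([np.zeros(n), nb_new])             # comparator vs new option

# EVPI = E[max over decisions] - max over decisions of E[net benefit].
evpi = np.mean(nb.max(axis=1)) - nb.mean(axis=0).max()
print(f"per-patient EVPI: {evpi:,.0f}")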
Alemayehu, Demissie; Cappelleri, Joseph C
2012-07-01
Patient-reported outcomes (PROs) can play an important role in personalized medicine. PROs can be viewed as an important fundamental tool to measure the extent of disease and the effect of treatment at the individual level, because they reflect the self-reported health state of the patient directly. However, their effective integration in personalized medicine requires addressing certain conceptual and methodological challenges, including instrument development and analytical issues. To evaluate methodological issues, such as multiple comparisons, missing data, and modeling approaches, associated with the analysis of data related to PRO and personalized medicine to further our understanding of the role of PRO data in personalized medicine. There is a growing recognition of the role of PROs in medical research, but their potential use in customizing healthcare is not widely appreciated. Emerging insights into the genetic basis of PROs could potentially lead to new pathways that may improve patient care. Knowledge of the biologic pathways through which the various genetic predispositions propel people toward negative or away from positive health experiences may ultimately transform healthcare. Understanding and addressing the conceptual and methodological issues in PROs and personalized medicine are expected to enhance the emerging area of personalized medicine and to improve patient care. This article addresses relevant concerns that need to be considered for effective integration of PROs in personalized medicine, with particular reference to conceptual and analytical issues that routinely arise with personalized medicine and PRO data. Some of these issues, including multiplicity problems, handling of missing values, and modeling approaches, are common to both areas. It is hoped that this article will help to stimulate further research to advance our understanding of the role of PRO data in personalized medicine. A robust conceptual framework to incorporate PROs into personalized medicine can provide fertile opportunity to bring these two areas even closer and to enhance the way a specific treatment is attuned and delivered to address patient care and patient needs.
FASP, an analytic resource appraisal program for petroleum play analysis
Crovelli, R.A.; Balay, R.H.
1986-01-01
An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and total gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and the laws of expectation and variance. © 1986.
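A minimal sketch of the conditional arithmetic the abstract describes: uncertainty about whether hydrocarbons are present at all, combined with uncertainty about the amount if present, propagated analytically via the laws of total expectation and total variance. The chance-of-success and lognormal amount parameters are illustrative assumptions, not FASP inputs.

```python
import numpy as np

p_present = 0.35                 # geologic chance of success (assumed)
mu_log, sigma_log = 3.0, 0.8     # lognormal parameters of oil amount, MMbbl (assumed)

# Conditional moments of the amount given presence (lognormal identities).
m = np.exp(mu_log + sigma_log**2 / 2)          # E[X | present]
v = (np.exp(sigma_log**2) - 1) * m**2          # Var[X | present]

# Unconditional moments, with X = 0 when hydrocarbons are absent:
# law of total expectation and law of total variance.
mean_uncond = p_present * m
var_uncond = p_present * v + p_present * (1 - p_present) * m**2
print(f"unconditional mean = {mean_uncond:.1f} MMbbl, "
      f"std = {var_uncond**0.5:.1f} MMbbl")
```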
Reducing Conservatism of Analytic Transient Response Bounds via Shaping Filters
NASA Technical Reports Server (NTRS)
Kwan, Aiyueh; Bedrossian, Nazareth; Jan, Jiann-Woei; Grigoriadis, Karolos; Hua, Tuyen (Technical Monitor)
1999-01-01
Recent results show that the peak transient response of a linear system to bounded-energy inputs can be computed using the energy-to-peak gain of the system. However, the analytically computed peak response bound can be conservative for a class of bounded-energy signals, specifically pulse trains generated from jet firings encountered in space vehicles. In this paper, shaping filters are proposed as a methodology to reduce the conservatism of analytic peak response bounds. This methodology was applied to a realistic Space Station assembly operation subject to jet firings. The results indicate that shaping filters indeed reduce the predicted peak response bounds.
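A minimal sketch of the energy-to-peak bound the abstract builds on: for a stable LTI system with zero feedthrough, the peak output over all unit-energy inputs is sqrt(lambda_max(C Wc Cᵀ)), where Wc is the controllability Gramian. The state-space matrices below are an arbitrary stable example, not the Space Station model.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0], [-4.0, -0.4]])   # lightly damped oscillator (assumed)
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Controllability Gramian: A Wc + Wc A^T + B B^T = 0.
Wc = solve_continuous_lyapunov(A, -B @ B.T)

energy_to_peak_gain = np.sqrt(np.max(np.linalg.eigvalsh(C @ Wc @ C.T)))
input_energy = 2.0                          # assumed bound on ||u||_2^2
peak_bound = energy_to_peak_gain * np.sqrt(input_energy)
print(f"peak |y(t)| <= {peak_bound:.3f}")

# A shaping filter would be appended in series (augmenting A, B, C with the
# filter's realization) to restrict the admissible input class to pulse-train-
# like signals and thereby tighten this bound.
```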
Risk management of drinking water relies on quality analytical data. Analytical methodology can often be adapted from environmental monitoring sources. However, risk management sometimes presents special analytical challenges because data may be needed from a source for which n...
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
Kemp, Candace L.; Ball, Mary M.; Morgan, Jennifer Craft; Doyle, Patrick J.; Burgess, Elisabeth O.; Dillard, Joy A.; Barmon, Christina E.; Fitzroy, Andrea F.; Helmly, Victoria E.; Avent, Elizabeth S.; Perkins, Molly M.
2018-01-01
In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents’ care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building. PMID:27651072
Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.
Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel
2017-01-01
This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 till May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted desorption/ionization and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples illustrate also applications in the proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography, and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo
2013-10-24
In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving fast and reliable analytical responses, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have allowed detection limits within the pM and nM ranges to be reached. Most of these developments proved their suitability in detecting and quantifying mercury(II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative incidence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min
2009-01-01
Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase in pursuit of sustainable development. However, the Taiwanese system offered only some sketchy steps focusing on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness of assessments associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and the regulative requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, the Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of the golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remediated lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better, based on the environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed, but the implementation phase was not begun because the related legislative procedure could not be arranged, owing to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001, and then 27 future courses could be installed on marginal lands. The assessment value of this policy using data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can effectively and efficiently assist the related authorities in SEA.
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
distributing and collecting survey data, and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. ... B. SURVEY DATA COLLECTION: To effectively access a... Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and...
ERIC Educational Resources Information Center
Cafri, Guy; van den Berg, Patricia; Brannick, Michael T.
2010-01-01
Difference scores are often used as a means of assessing body image satisfaction using silhouette scales. Unfortunately, difference scores suffer from numerous potential methodological problems, including reduced reliability, ambiguity, confounded effects, untested constraints, and dimensional reduction. In this article, the methodological…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
...-nitroguanidine, in or on fruiting vegetables, group 8-10, except pepper/eggplant subgroup 8-10B at 0.2 ppm; and pepper/eggplant subgroup 8-10B at 0.7 ppm. Adequate enforcement methodology (LC-MS/MS analysis) is... validated, including LC-MS/MS methods for use on tomato, pepper, melon, and cucumber. The analytical...
Model verification of large structural systems
NASA Technical Reports Server (NTRS)
Lee, L. T.; Hasselman, T. K.
1977-01-01
A methodology was formulated, and a general computer code implemented for processing sinusoidal vibration test data to simultaneously make adjustments to a prior mathematical model of a large structural system, and resolve measured response data to obtain a set of orthogonal modes representative of the test model. The derivation of estimator equations is shown along with example problems. A method for improving the prior analytic model is included.
2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report
2017-03-01
Office of People Analytics (OPA), Defense Research, Surveys, and Statistics Center, 4800 Mark Center Drive...
Quantitative SIMS Imaging of Agar-Based Microbial Communities.
Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V
2018-05-01
After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
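A minimal sketch of step (1) of the quantitation workflow described above: normalizing per-pixel ion intensity to an external quadratic calibration built from standards of known surface density, then inverting that calibration to produce a surface-density image. The standard values and toy image are hypothetical placeholders.

```python
import numpy as np

density_std = np.array([0.0, 0.5, 1.0, 2.0, 4.0])      # standards, pmol/mm^2 (assumed)
intensity_std = np.array([0.1, 2.2, 4.1, 9.5, 22.0])   # measured ion counts (assumed)

# Quadratic regression: I = a d^2 + b d + c.
a, b, c = np.polyfit(density_std, intensity_std, deg=2)

def intensity_to_density(I):
    """Invert the quadratic calibration, keeping the physical (>= 0) root."""
    disc = np.maximum(b**2 - 4 * a * (c - I), 0.0)
    return (-b + np.sqrt(disc)) / (2 * a)

image = np.array([[3.0, 8.0], [1.5, 20.0]])   # toy 2x2 intensity image
print(intensity_to_density(image))            # per-pixel surface density
```

Steps (2) and (3), isomeric-interference correction and noise/limit filtering, would then be applied pixel-wise to this density image.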
Ho, Tien D; Yehl, Peter M; Chetwyn, Nik P; Wang, Jin; Anderson, Jared L; Zhong, Qiqing
2014-09-26
Ionic liquids (ILs) were used as a new class of diluents for the analysis of two classes of genotoxic impurities (GTIs), namely, alkyl/aryl halides and nitro-aromatics, in small molecule drug substances by headspace gas chromatography (HS-GC) coupled with electron capture detection (ECD). This novel approach using ILs as contemporary diluents greatly broadens the applicability of HS-GC for the determination of high boiling (≥ 130°C) analytes including GTIs with limits of detection (LOD) ranging from 5 to 500 parts-per-billion (ppb) of analytes in a drug substance. This represents up to tens of thousands-fold improvement compared to traditional HS-GC diluents such as dimethyl sulfoxide (DMSO) and dimethylacetamide (DMAC). Various ILs were screened to determine their suitability as diluents for the HS-GC/ECD analysis. Increasing the HS oven temperatures resulted in varying responses for alkyl/aryl halides and a significant increase in response for all nitroaromatic GTIs. Linear ranges of up to five orders of magnitude were found for a number of analytes. The technique was validated on two active pharmaceutical ingredients with excellent recovery. This simple and robust methodology offers a key advantage in the ease of method transfer from development laboratories to quality control environments since conventional validated chromatographic data systems and GC instruments can be used. For many analytes, it is a cost effective alternative to more complex trace analytical methodologies like LC/MS and GC/MS, and significantly reduces the training needed for operation. Copyright © 2014 Elsevier B.V. All rights reserved.
Manufacturing data analytics using a virtual factory representation.
Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun
2017-01-01
Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented both as data analytics applications in themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.
Pendergrass, S M
1999-01-01
Glycol-based fluids are used in the production of theatrical smokes in theaters, concerts, and other stage productions. The fluids are heated and dispersed in aerosol form to create the effect of a smoke, mist, or fog. There have been reports of adverse health effects such as respiratory irritation, chest tightness, shortness of breath, asthma, and skin rashes. Previous attempts to collect and quantify the aerosolized glycols used in fogging agents have been plagued by inconsistent results, both in the efficiency of collection and in the chromatographic analysis of the glycol components. The development of improved sampling and analytical methodology for aerosolized glycols was required to assess workplace exposures more effectively. An Occupational Safety and Health Administration versatile sampler tube was selected for the collection of ethylene glycol, propylene glycol, 1,3-butylene glycol, diethylene glycol, triethylene glycol, and tetraethylene glycol aerosols. Analytical methodology for the separation, identification, and quantitation of the six glycols using gas chromatography/flame ionization detection is described. Limits of detection of the glycol analytes ranged from 7 to 16 micrograms/sample. Desorption efficiencies for all glycol compounds were determined over the range of study and averaged greater than 90%. Storage stability results were acceptable after 28 days for all analytes except ethylene glycol, which was stable at ambient temperature for 14 days. Based on the results of this study, the new glycol method was published in the NIOSH Manual of Analytical Methods.
Versatile electrophoresis-based self-test platform.
Guijt, Rosanne M
2015-03-01
Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus on lithium monitoring, new applications of the platform, including veterinary use and sodium and creatinine measurements, are included. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean
2014-01-01
Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials
NASA Technical Reports Server (NTRS)
Gates, Thomas S.
2003-01-01
A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
A Progressive Approach to Teaching Analytics in the Marketing Curriculum
ERIC Educational Resources Information Center
Liu, Yiyuan; Levin, Michael A.
2018-01-01
With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…
NASA Technical Reports Server (NTRS)
Fantano, Louis
2015-01-01
Thermal and Fluids Analysis Workshop, Silver Spring, MD. NCTS 21070-15. The Landsat 8 Data Continuity Mission, which is part of the United States Geological Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess and refine proposed thermal vacuum test conditions, and the issues encountered in attempting to satisfy this requirement.
NEMS Freight Transportation Module Improvement Study
2015-01-01
The U.S. Energy Information Administration (EIA) contracted with IHS Global, Inc. (IHS) to analyze the relationship between the value of industrial output, physical output, and freight movement in the United States for use in updating analytic assumptions and modeling structure within the National Energy Modeling System (NEMS) freight transportation module, including forecasting methodologies and processes to identify possible alternative approaches that would improve multi-modal freight flow and fuel consumption estimation.
A Methodology for Evaluating the Fidelity of Ground-Based Flight Simulators
NASA Technical Reports Server (NTRS)
Zeyada, Y.; Hess, R. A.
1999-01-01
An analytical and experimental investigation was undertaken to model the manner in which pilots perceive and utilize visual, proprioceptive, and vestibular cues in a ground-based flight simulator. The study was part of a larger research effort which has the creation of a methodology for determining flight simulator fidelity requirements as its ultimate goal. The study utilized a closed-loop feedback structure of the pilot/simulator system which included the pilot, the cockpit inceptor, the dynamics of the simulated vehicle and the motion system. With the exception of time delays which accrued in visual scene production in the simulator, visual scene effects were not included in this study. The NASA Ames Vertical Motion Simulator was used in a simple, single-degree of freedom rotorcraft bob-up/down maneuver. Pilot/vehicle analysis and fuzzy-inference identification were employed to study the changes in fidelity which occurred as the characteristics of the motion system were varied over five configurations. The data from three of the five pilots that participated in the experimental study were analyzed in the fuzzy-inference identification. Results indicate that both the analytical pilot/vehicle analysis and the fuzzy-inference identification can be used to reflect changes in simulator fidelity for the task examined.
Tennessee long-range transportation plan : project evaluation system
DOT National Transportation Integrated Search
2005-12-01
The Project Evaluation System (PES) Report is an analytical methodology to aid programming efforts and prioritize multimodal investments. The methodology consists of both quantitative and qualitative evaluation criteria built upon the Guiding Princip...
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Patra, Sayantani; Pratiher, Souvik
2017-06-01
A novel analytical methodology for discriminating healthy gait patterns from those associated with neurological disorders is proposed, employing a set of oscillating components called intrinsic mode functions (IMFs). These IMFs are generated by empirical mode decomposition of the gait time series, and the Hilbert-transformed analytic signal representation forms the complex-plane trace of the elliptically shaped analytic IMFs. The area measure and the relative change in the centroid position of the polygon formed by the convex hull of these analytic IMFs are taken as the discriminative features. Classification accuracy of 79.31% with an ensemble-learning-based AdaBoost classifier validates the adequacy of the proposed methodology for a computer-aided diagnostic (CAD) system for gait pattern identification. The efficacy of several potential biomarkers, such as the bandwidth of the amplitude-modulation and frequency-modulation IMFs and the mean frequency of the Fourier-Bessel expansion of each analytic IMF, is also discussed for its potency in the diagnosis and classification of gait patterns.
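A minimal sketch of the feature pipeline described above: EMD of a gait time series, the analytic signal of each IMF, and area/centroid features of the convex hull of its complex-plane trace. It assumes the PyEMD package (installed as EMD-signal); the synthetic "gait" signal is a placeholder for real stride-interval data, and the hull-vertex mean is used as a simple centroid proxy.

```python
import numpy as np
from PyEMD import EMD                 # pip install EMD-signal (assumed available)
from scipy.signal import hilbert
from scipy.spatial import ConvexHull

t = np.linspace(0, 10, 2000)
stride = np.sin(2 * np.pi * 1.0 * t) + 0.4 * np.sin(2 * np.pi * 3.1 * t)
signal = stride + 0.05 * np.random.default_rng(0).normal(size=t.size)  # toy gait series

features = []
for imf in EMD()(signal):                        # one row per IMF
    z = hilbert(imf)                             # analytic IMF
    pts = np.column_stack([z.real, z.imag])      # complex-plane trace
    hull = ConvexHull(pts)
    centroid = pts[hull.vertices].mean(axis=0)   # approximate hull centroid
    features.append((hull.volume, *centroid))    # .volume is the area in 2D
print(np.array(features))                        # feature matrix for a classifier
```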
PCB congener analysis with Hall electrolytic conductivity detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edstrom, R.D.
1989-01-01
This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple retention marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260, and 1262, along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies. Homolog distributions acquired by the HECD methodology showed good correlation with the previously mentioned methodologies. The developed analytical methodology was used in the analysis of bluefish (Pomatomus saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River, and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the fish samples collected during the fall of 1985 from the lower James River and lower Chesapeake Bay.
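A minimal sketch of a multiple-retention-marker index calculation of the kind described above: a congener's relative retention index is obtained by interpolating between the markers that bracket its retention time. The marker times and index values are illustrative only, not the study's calibration.

```python
import numpy as np

marker_rt = np.array([5.2, 9.8, 14.6, 19.1, 23.9])          # marker retention times, min (assumed)
marker_index = np.array([100.0, 200.0, 300.0, 400.0, 500.0])  # assigned index values (assumed)

def retention_index(rt):
    """Piecewise-linear interpolation of index versus retention time."""
    return np.interp(rt, marker_rt, marker_index)

print(retention_index(12.3))   # index of a congener eluting at 12.3 min
```

Anchoring indices to several co-injected markers, rather than a single reference, is what makes the computed indices robust to run-to-run retention drift.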
Reynolds, Arthur J; Ou, Suh-Ruu
2016-10-01
This article reviews methodological and analytic approaches and impact evidence for understanding the mechanisms of effects of early childhood interventions, including delinquency and violence prevention. Illustrations from longitudinal studies of preschool preventive interventions are provided. We restrict our attention to preventive interventions for children from birth to age 5, including evidence from the Chicago Longitudinal Study (CLS), which investigates the impact of an established school-based early childhood intervention. Frameworks and evidence will be organized according to the Five-Hypothesis Model (5HM), which postulates that a variety of early childhood interventions impact later well-being through the promotion of cognitive and scholastic advantages, motivational advantages, social adjustment, family support behaviors, and school supports. Recommendations are made for advancing confirmatory approaches for identifying the most effective prevention programs using identification of generative mechanisms as a major methodological criterion.
Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle
2014-12-01
This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. Combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), a course was designed that would place doctoral students in the midst of multiparadigmatic discussions while learning fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Together with one of the students, the authors analyzed data from field notes recorded in real time over the period the course was offered, using content analysis and interpretive phenomenological approaches. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage nurse educators of doctoral nursing students to focus educational practice on multiple methodological perspectives. Copyright 2014, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
Delaunay, Didier; Pignon, Baptiste; Boyard, Nicolas; Sobotka, Vincent
2018-05-01
Heat transfer during the cooling of an injected thermoplastic part directly affects the solidification of the polymer and consequently the quality of the part in terms of mechanical properties, geometric tolerance, and surface aspect. This paper proposes to mold designers a methodology based on analytical models to provide quickly the time to reach the ejection temperature, depending on the temperature and position of the cooling channels. The obtained cooling time is the first step of the thermal design of the mold. The presented methodology is dedicated to the determination of the solidification time of a semi-crystalline polymer slab. It allows the calculation of the crystallization time of the part and is based on the analytical solution of the Stefan problem in a semi-infinite medium. The crystallization is considered as a phase change with an effective crystallization temperature, which is obtained from Fast Scanning Calorimetry (FSC) results. The crystallization time is then corrected to take the finite thickness of the part into account. To check the accuracy of this approach, the solidification time is also calculated by solving the heat conduction equation coupled to the crystallization kinetics of the polymer. The impact of the nature of the contact between the polymer and the mold is evaluated. The thermal contact resistance (TCR) appears as a significant parameter that needs to be taken into account in the cooling time calculation. The results of the simplified model, including TCR or not, are compared with experiments carried out with an instrumented mold in the case of a polypropylene (PP). The methodology is then applied to a part made of polyetheretherketone (PEEK).
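A minimal sketch of the semi-infinite Stefan solution underlying this methodology: the solidified front grows as s(t) = 2λ√(αt), with λ from the classical transcendental equation, so the slab's solidification time follows from s(t) reaching the half-thickness. The property values are generic polypropylene-like placeholders, and the effective crystallization temperature stands in for the FSC-derived one.

```python
import math
from scipy.optimize import brentq

k, rho, cp = 0.20, 900.0, 2100.0   # solid-state W/m.K, kg/m^3, J/kg.K (assumed)
L = 100e3                          # latent heat of crystallization, J/kg (assumed)
T_c, T_mold = 130.0, 40.0          # effective crystallization / mold wall temp, degC (assumed)

alpha = k / (rho * cp)             # thermal diffusivity, m^2/s
Ste = cp * (T_c - T_mold) / L      # Stefan number

# Classical one-phase Stefan condition: lam * exp(lam^2) * erf(lam) = Ste / sqrt(pi).
f = lambda lam: lam * math.exp(lam**2) * math.erf(lam) - Ste / math.sqrt(math.pi)
lam = brentq(f, 1e-6, 5.0)

half_thickness = 1e-3              # a 2 mm part cooled from both faces
t_solid = (half_thickness / (2 * lam)) ** 2 / alpha
print(f"lambda = {lam:.3f}, solidification time ~ {t_solid:.1f} s")
```

The finite-thickness correction and the TCR discussed above would then adjust this semi-infinite estimate.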
Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F
2016-01-01
In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (EC): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed making use of Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L(-1) for the aqueous phase, and 50, 500 and 2000 μg kg(-1) for the solid phase of the sludge. In general, the method was satisfactorily validated, showing good recoveries (70-120%) and precision (RSD < 20%). Regarding the limit of quantification (LOQ), it was below 0.1 μg L(-1) in the aqueous phase and below 50 μg kg(-1) in the solid phase for the majority of the analytes. The method applicability was tested by analysis of samples from a wider study on degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample procedures to extract selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.
Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology
Landau, Thomas P.; Ledley, Robert S.
1980-01-01
This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Generalized Subset Designs in Analytical Chemistry.
Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan
2017-06-20
Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to their high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry, including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) a stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
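The generalized subset construction itself is not reproduced here, but the two-level fractional factorial it generalizes fits in a few lines: a half-fraction 2^(4-1) design built from a full 2^3 factorial plus the generator D = ABC. A minimal sketch for illustration; factor names are arbitrary.

```python
from itertools import product

runs = []
for a, b, c in product((-1, 1), repeat=3):   # full two-level factorial in A, B, C
    d = a * b * c                            # generator D = ABC (resolution IV)
    runs.append((a, b, c, d))

print(" A  B  C  D")
for run in runs:                             # 8 runs instead of the full 16
    print(" ".join(f"{x:+d}" for x in run))
```

Flipping the sign of the generator (D = -ABC) yields the complementary half-fraction, which is the two-level analogue of the balanced, complementary subdesigns described above.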
NASA Technical Reports Server (NTRS)
Brinson, H. F.
1985-01-01
The utilization of adhesive bonding for composite structures is briefly assessed. The need for a method to determine damage initiation and propagation for such joints is outlined. Methods currently in use to analyze both adhesive joints and fiber-reinforced plastics are mentioned, and it is indicated that all methods require as input the mechanical properties of the polymeric adhesive and composite matrix material. The mechanical properties of polymers are indicated to be viscoelastic and sensitive to environmental effects. A method to analytically characterize environmentally dependent linear and nonlinear viscoelastic properties is given. It is indicated that the methodology can be used to extrapolate short-term data to long-term design lifetimes; that is, the method can be used for long-term durability predictions. Experimental results for neat adhesive resins, polymers used as composite matrices, and unidirectional composite laminates are given. The data are fitted well with the analytical durability methodology. Finally, suggestions are outlined for the development of an analytical methodology for durability predictions of adhesively bonded composite structures.
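The extrapolation of short-term data to design lifetimes typically rests on a superposition principle. A minimal sketch of one common realization, time-temperature superposition with a WLF shift factor; the WLF constants and temperatures are textbook-style placeholders, not values from this report.

```python
import math

C1, C2, T_ref = 17.4, 51.6, 100.0   # "universal" WLF constants at T_ref in degC (assumed)

def log_aT(T):
    """WLF equation: log10 a_T = -C1 (T - T_ref) / (C2 + T - T_ref)."""
    return -C1 * (T - T_ref) / (C2 + T - T_ref)

# A 1000 s creep test at 130 degC maps onto this equivalent time at T_ref:
aT = 10 ** log_aT(130.0)
print(f"equivalent time at T_ref: {1000.0 / aT:.3e} s")   # decades longer
```

This is the accelerated-testing idea in miniature: testing hotter (or wetter, or more loaded) compresses decades of reference-condition behavior into a short experiment, provided the underlying mechanisms stay reproducible.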
Integrated corridor management analysis, modeling and simulation (AMS) methodology.
DOT National Transportation Integrated Search
2008-03-01
This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...
DOT National Transportation Integrated Search
1995-09-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Du, Xiaojiao; Jiang, Ding; Hao, Nan; Qian, Jing; Dai, Liming; Zhou, Lei; Hu, Jianping; Wang, Kun
2016-10-04
The development of novel detection methodologies with simplicity and ultrasensitivity in the electrochemiluminescence (ECL) aptasensor field is essential for constructing biosensing architectures. Herein, a facile, specific, and sensitive methodology was developed for quantitative detection of microcystin-LR (MC-LR), based on a steric hindrance amplifying effect between the aptamer and target analytes assisted by three-dimensional boron- and nitrogen-codoped graphene hydrogels (BN-GHs). The recognition reaction was monitored by quartz crystal microbalance (QCM) to validate the possible steric hindrance effect. First, the BN-GHs were synthesized via a self-assembly hydrothermal method and then applied as the Ru(bpy)₃²⁺ immobilization platform for further loading the biomolecule aptamers, owing to their nanoporous structure and large specific surface area. Interestingly, we discovered for the first time that, without the aid of a conventional double-stranded DNA configuration, such three-dimensional nanomaterials can directly amplify the steric hindrance effect between the aptamer and target analytes to a detectable level, making this facile methodology suitable for an exquisite assay. With MC-LR as a model, this novel ECL biosensor showed high sensitivity and a wide linear range. This strategy supplies a simple and versatile platform for specific and sensitive determination of a wide range of aptamer-related targets, implying that three-dimensional nanomaterials could play a crucial role in engineering and developing novel detection methodologies for ECL aptasensing.
Pla-Tolós, J; Serra-Mora, P; Hakobyan, L; Molins-Legua, C; Moliner-Martinez, Y; Campins-Falcó, P
2016-11-01
In this work, in-tube solid-phase microextraction (in-tube SPME) coupled to capillary LC (CapLC) with diode array detection is reported for on-line extraction and enrichment of the booster biocides (irgarol-1051 and diuron) included in the Water Framework Directive 2013/39/EU (WFD). The analytical performance has been successfully demonstrated. Furthermore, in the present work, the environmental friendliness of the procedure has been quantified by means of the carbon footprint calculation of the analytical procedure and comparison with other methodologies previously reported. Under the optimum conditions, the method presents good linearity over the range assayed, 0.05-10 μg/L for irgarol-1051 and 0.7-10 μg/L for diuron. The LODs were 0.015 μg/L and 0.2 μg/L for irgarol-1051 and diuron, respectively. Precision was also satisfactory (relative standard deviation, RSD < 3.5%). The proposed methodology was applied to monitor water samples, taking into account the EQS standards for these compounds. The carbon footprint values for the proposed procedure consolidate the operational efficiency (analytical and environmental performance) of in-tube SPME-CapLC-DAD, in general, and in particular for determining irgarol-1051 and diuron in water samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update
Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.
2012-01-01
Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate function with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
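A minimal sketch of the basic qHNMR quantitation arithmetic: signal area scales with molar concentration times the number of protons producing the signal, so an analyte can be quantified against a calibrant of known concentration without an identical reference material. The integrals, proton counts, and concentrations below are invented for illustration; with external calibration the calibrant spectrum is acquired separately under matched conditions.

```python
# Calibrant: normalized integral, number of contributing protons, concentration (mM).
I_cal, N_cal, c_cal = 1.000, 3, 10.0     # placeholder values
# Analyte signal: integral and number of contributing protons.
I_ana, N_ana = 0.418, 2                  # placeholder values

# Per-proton areas are proportional to molar concentration.
c_ana = c_cal * (I_ana / N_ana) / (I_cal / N_cal)
print(f"analyte concentration = {c_ana:.2f} mM")   # -> 6.27 mM
```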
Absorption into fluorescence. A method to sense biologically relevant gas molecules
NASA Astrophysics Data System (ADS)
Strianese, Maria; Varriale, Antonio; Staiano, Maria; Pellecchia, Claudio; D'Auria, Sabato
2011-01-01
In this work we present an innovative optical sensing methodology based on the use of biomolecules as molecular gating nano-systems. Here, as an example, we report on the detection ofanalytes related to climate change. In particular, we focused our attention on the detection ofnitric oxide (NO) and oxygen (O2). Our methodology builds on the possibility of modulating the excitation intensity of a fluorescent probe used as a transducer and a sensor molecule whose absorption is strongly affected by the binding of an analyte of interest used as a filter. The two simple conditions that have to be fulfilled for the method to work are: (a) the absorption spectrum of the sensor placed inside the cuvette, and acting as the recognition element for the analyte of interest, should strongly change upon the binding of the analyte and (b) the fluorescence dye transducer should exhibit an excitation band which overlaps with one or more absorption bands of the sensor. The absorption band of the sensor affected by the binding of the specific analyte should overlap with the excitation band of the transducer. The high sensitivity of fluorescence detection combined with the use of proteins as highly selective sensors makes this method a powerful basis for the development of a new generation of analytical assays. Proof-of-principle results showing that cytochrome c peroxidase (CcP) for NO detection and myoglobin (Mb) for O2 detection can be successfully used by exploiting our new methodology are reported. The proposed technology can be easily expanded to the determination of different target analytes.
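A minimal sketch of the "molecular gating" idea in numbers: the sensor protein acts as an inner filter at the transducer's excitation wavelength, so analyte binding (which shifts the sensor's absorbance) modulates the excitation reaching the fluorophore via Beer-Lambert attenuation. The spectral values are illustrative, not measured CcP or Mb parameters.

```python
def transmitted_fraction(epsilon, conc, path):
    """Beer-Lambert: T = 10^(-epsilon * c * l)."""
    return 10 ** (-epsilon * conc * path)

eps_free, eps_bound = 1.0e5, 2.0e4   # M^-1 cm^-1 at the excitation wavelength (assumed)
conc, path = 5e-6, 1.0               # 5 uM sensor in a 1 cm cuvette (assumed)

for state, eps in (("analyte-free", eps_free), ("analyte-bound", eps_bound)):
    T = transmitted_fraction(eps, conc, path)
    print(f"{state}: transducer excited with {T:.2f} of I0")
```

The fluorescence signal tracks the transmitted fraction, so the roughly 0.32 versus 0.79 transmission in this toy case translates directly into an analyte-dependent intensity change.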
A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.
Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema
2016-01-01
A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method used in industry that targets near-zero error (3.4 errors per million events). The five main principles of Six Sigma are define, measure, analyze, improve, and control. Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology for error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, the administrative supervisor, and the head of the department. Using Six Sigma methodology, the rate of errors was measured monthly and the distribution of errors across the pre-analytic, analytic, and post-analytic phases was analysed. Improvement strategies were discussed in the monthly intradepartmental meetings, and control of the units with high error rates was provided. Fifty-six (52.4%) of the 107 recorded errors were at the pre-analytic phase. Forty-five errors (42%) were recorded as analytic and 6 errors (5.6%) as post-analytic. Two of the 45 errors were major irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, a decrease of 79.77%. The Six Sigma trial in our pathology laboratory provided a reduction of error rates, mainly in the pre-analytic and analytic phases.
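A minimal sketch of the defects-per-million arithmetic behind the Six Sigma figures quoted above; the mapping to a sigma level uses the conventional 1.5-sigma shift. The counts are placeholders chosen only so the rate reproduces the first-half figure of 6.8 per million, not the laboratory's actual data.

```python
from scipy.stats import norm

defects, opportunities = 45, 6_600_000   # hypothetical error count vs. events
dpmo = defects / opportunities * 1e6     # defects per million opportunities

# Conventional short-term sigma level includes a 1.5-sigma shift.
sigma_level = norm.ppf(1 - dpmo / 1e6) + 1.5
print(f"DPMO = {dpmo:.1f}, sigma level = {sigma_level:.2f}")
```

By this convention, the Six Sigma target of 3.4 DPMO corresponds to a sigma level of 6.0.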
Morales Guerrero, Josefina C; García Zepeda, Rodrigo A; Flores Ruvalcaba, Edgar; Martínez Michel, Lorelei
2012-09-01
We evaluated the two methods accepted by the Mexican norm for the determination of nitrites in infant meat-based food with vegetables. We determined the content of nitrites in the infant food, in the raw materials, and in products from the intermediate stages of production. A reagent blank and a reference sample were included in each analytical run. In addition, we determined the sensitivity, recovery percentage, and accuracy of each methodology. Infant food results indicated an important difference in the nitrite content determined under each methodology, owing to the persistent presence of turbidity in the extracts. Different treatments were proposed to eliminate the turbidity, but these only managed to reduce it. The turbidity was attributed to carbohydrates, whose concentrations exhibited wide dispersion and were below the quantifiable limit under both methodologies; it is therefore not recommended to apply these techniques to food suspected to contain traces of nitrites.
An immersed boundary method for modeling a dirty geometry data
NASA Astrophysics Data System (ADS)
Onishi, Keiji; Tsubokura, Makoto
2017-11-01
We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible high-Reynolds-number flow around highly complex geometries. The method is achieved by dispersing the momentum via an axial linear projection and by an approximate-domain assumption that satisfies mass conservation around wall-containing cells. This methodology has been verified against analytical theory and wind tunnel experiment data. Next, we simulate the problem of flow around a rotating object and demonstrate the applicability of this methodology to moving-geometry problems. This methodology offers the possibility of obtaining quick solutions on a next-generation large-scale supercomputer. This research was supported by MEXT as "Priority Issue on Post-K computer" (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.
A two-parameter family of double-power-law biorthonormal potential-density expansions
NASA Astrophysics Data System (ADS)
Lilley, Edward J.; Sanders, Jason L.; Evans, N. Wyn
2018-07-01
We present a two-parameter family of biorthonormal double-power-law potential-density expansions. Both the potential and density are given in a closed analytic form and may be rapidly computed via recurrence relations. We show that this family encompasses all the known analytic biorthonormal expansions: the Zhao expansions (themselves generalizations of ones found earlier by Hernquist & Ostriker and by Clutton-Brock) and the recently discovered Lilley et al. expansion. Our new two-parameter family includes expansions based around many familiar spherical density profiles as zeroth-order models, including the γ models and the Jaffe model. It also contains a basis expansion that reproduces the famous Navarro-Frenk-White (NFW) profile at zeroth order. The new basis expansions have been found via a systematic methodology which has wide applications in finding other new expansions. In the process, we also uncovered a novel integral transform solution to Poisson's equation.
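A minimal sketch of the zeroth-order double-power-law (Zhao) profile that anchors expansions of this kind: ρ(r) ∝ (r/rs)^(-γ) [1 + (r/rs)^(1/α)]^(-(β-γ)α), which reduces to the NFW profile for (α, β, γ) = (1, 3, 1). The normalization and scale radius below are arbitrary, and this evaluates only the zeroth-order density, not the full biorthonormal basis.

```python
import numpy as np

def zhao_density(r, rs=1.0, rho0=1.0, alpha=1.0, beta=3.0, gamma=1.0):
    """Double-power-law density: inner slope -gamma, outer slope -beta."""
    x = r / rs
    return rho0 * x**(-gamma) * (1.0 + x**(1.0 / alpha))**(-(beta - gamma) * alpha)

r = np.logspace(-2, 2, 5)
print(zhao_density(r))                        # NFW by default: x^-1 (1 + x)^-2
print(zhao_density(r, gamma=2.0, beta=4.0))   # Jaffe-model parameters
```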
NASA Astrophysics Data System (ADS)
Batzias, Dimitris F.
2012-12-01
In this work, we present an analytic estimation of the added value of recycled products in order to provide a means for determining the degree of recycling that maximizes profit, also taking into account the social interest by including the subsidy of the corresponding investment. A methodology has been developed based on the Life Cycle Product (LCP), with emphasis on the added values H, R as fractions of production and recycle cost, respectively (H, R > 1, since profit is included), which decrease at the corresponding rates h, r over the recycle course, due to deterioration of quality. At the macrolevel, the claim that "an increase of exergy price, as a result of available cheap energy sources becoming more scarce, leads to less recovered quantity of any recyclable material" is proved by means of the tradeoff between the partial benefits due to material saving and resources degradation/consumption (assessed in monetary terms).
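Read literally, the stated decay of the added values admits a simple geometric form over successive recycle loops n; this is an illustrative interpretation, not a formula quoted from the abstract:

```latex
H_n = H\,(1-h)^n, \qquad R_n = R\,(1-r)^n, \qquad H, R > 1, \quad 0 < h, r < 1.
```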
ERIC Educational Resources Information Center
Martin, Ian; Lauterbach, Alexandra; Carey, John
2015-01-01
A grounded theory methodology was used to analyze articles and book chapters describing the development and practice of school-based counseling in 25 different countries in order to identify the factors that affect development and practice. An 11-factor analytic framework was developed. Factors include: Cultural Factors, National Needs, Larger…
Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech
2015-01-01
Studies of the distribution of metallic elements in biological samples are currently among the most important issues in analytical science. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of metallic elements in various kinds of biological samples. However, this literature lacks articles reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. A further aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping of metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
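The simplest of the calibration strategies surveyed, an external calibration curve relating measured signal to element concentration, can be sketched as follows; the numbers are invented, and real LA-ICP-MS work adds matrix-matched standards and internal-standard normalization.

```python
# Minimal external-calibration sketch for quantitative element mapping.
# Values are illustrative; real LA-ICP-MS calibration needs matrix-matched
# standards and usually normalization to an internal standard.
import numpy as np

# Known standards: concentration (ug/g) vs. measured intensity (counts/s)
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
signal = np.array([120.0, 980.0, 1890.0, 3750.0, 7430.0])

slope, intercept = np.polyfit(conc, signal, 1)     # linear fit: y = a*x + b

def quantify(intensity: np.ndarray) -> np.ndarray:
    """Convert a map of raw intensities into concentrations."""
    return (intensity - intercept) / slope

pixel_intensities = np.array([[600.0, 2100.0], [150.0, 5000.0]])
print(quantify(pixel_intensities))  # ug/g per pixel
```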
Screening for Triterpenoid Saponins in Plants Using Hyphenated Analytical Platforms.
Khakimov, Bekzod; Tseng, Li Hong; Godejohann, Markus; Bak, Søren; Engelsen, Søren Balling
2016-11-24
Recently the number of studies investigating triterpenoid saponins has increased drastically due to their diverse and potentially attractive biological activities. Currently the literature contains chemical structures of a few hundred triterpenoid saponins of plant and animal origin. Triterpenoid saponins consist of a triterpene aglycone with one or more sugar moieties attached to it. However, due to their similar physico-chemical properties, the isolation and identification of a large diversity of triterpenoid saponins remain challenging. This study demonstrates a methodology for screening saponins using hyphenated analytical platforms, GC-MS, LC-MS/MS, and LC-SPE-NMR/MS, using the example of two phenotypes of the model plant Barbarea vulgaris (winter cress), the glabrous (G) and pubescent (P) types, which are known to differ in their insect resistance. The proposed methodology allows for detailed comparison of saponin profiles from intact plant extracts as well as saponin aglycone profiles from hydrolysed samples. Continuously measured 1D proton NMR data during LC separation, along with mass spectrometry data, revealed significant differences, including contents of saponins, types of aglycones, and numbers of sugar moieties attached to the aglycone. A total of 49 peaks were tentatively identified as saponins from both plants; they are derived from eight types of aglycones and carry 2-5 sugar moieties. Identification of two previously known insect-deterrent saponins, hederagenin cellobioside and oleanolic acid cellobioside, demonstrated the applicability of the methodology for relatively rapid screening of bioactive compounds.
Fabregat-Cabello, Neus; Castillo, Ángel; Sancho, Juan V; González, Florenci V; Roig-Navarro, Antoni Francesc
2013-08-02
In this work we have developed and validated an accurate and fast methodology for the determination of 4-nonylphenol (technical mixture) in complex-matrix water samples by UHPLC-ESI-MS/MS. The procedure is based on isotope dilution mass spectrometry (IDMS) in combination with isotope pattern deconvolution (IPD), which provides the concentration of the analyte directly from the spiked sample without requiring any methodological calibration graph. To avoid any possible isotopic effect during the analytical procedure, the in-house synthesized (13)C1-4-(3,6-dimethyl-3-heptyl)phenol was used as the labeled compound. This proposed surrogate was able to compensate for the matrix effect even in wastewater samples. An SPE pre-concentration step, together with exhaustive efforts to avoid contamination, was included to reach the signal-to-noise ratio necessary to detect the endogenous concentrations present in environmental samples. Calculations were performed acquiring only three transitions, achieving limits of detection lower than 100 ng/g for all water matrices assayed. Recoveries within 83-108% and coefficients of variation ranging from 1.5% to 9% were obtained. In contrast, a considerable overestimation was obtained with the most common classical calibration procedure using 4-n-nonylphenol as internal standard, demonstrating the suitability of the minimal labeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
Results from Alloy 600 And Alloy 690 Caustic SCC Model Boiler Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Frederick D.; Thomas, Larry E.
2009-08-03
A versatile model boiler test methodology was developed and used to compare caustic stress corrosion cracking (SCC) of mill annealed Alloy 600 and thermally treated Alloy 690. The model boiler included simulated crevice devices that efficiently and consistently concentrated Na2CO3, resulting in volatilization of CO2 with the steam and concentration of NaOH at the tube surfaces. The test methodology also included variation in tube stress, either produced by the primary to secondary side pressure differential, or by a novel method that reproducibly yields a higher stress condition on the tube. The significant effect of residual stress on tube SCC was also considered. SCC of both Alloy 600 and Alloy 690 was evaluated as a function of temperature and stress. Analytical transmission electron microscopy (ATEM) evaluations of the cracks and the grain boundaries ahead of the cracks were performed, providing insight into the SCC mechanism. This model boiler test methodology may be applicable to a range of bulkwater secondary chemistries that concentrate to produce aggressive crevice environments.
Methodological considerations in cost of illness studies on Alzheimer disease
2012-01-01
Cost-of-illness studies (COI) can identify and measure all the costs of a particular disease, including the direct, indirect and intangible dimensions. They are intended to provide estimates about the economic impact of costly disease. Alzheimer disease (AD) is a relevant example to review cost of illness studies because of its costliness. The aim of this study was to review relevant published cost studies of AD to analyze the methods used and to identify which dimensions had to be improved from a methodological perspective. First, we described the key points of cost study methodology. Secondly, cost studies relating to AD were systematically reviewed, focusing on an analysis of the different methods used. The methodological choices of the studies were analysed using an analytical grid which contains the main methodological items of COI studies. Seventeen articles were retained. Depending on the studies, annual total costs per patient vary from $2,935 to $52,954. The methods, data sources, and estimated cost categories in each study varied widely. The review showed that cost studies adopted different approaches to estimate costs of AD, reflecting a lack of consensus on the methodology of cost studies. To increase their credibility, closer agreement among researchers on the methodological principles of cost studies would be desirable. PMID:22963680
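In the simplest accounting, the three cost dimensions named above aggregate additively:

```latex
C_{\text{total}} \;=\; C_{\text{direct}} \;+\; C_{\text{indirect}} \;+\; C_{\text{intangible}}.
```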
O'Brien, Kelly K; Colquhoun, Heather; Levac, Danielle; Baxter, Larry; Tricco, Andrea C; Straus, Sharon; Wickerson, Lisa; Nayar, Ayesha; Moher, David; O'Malley, Lisa
2016-07-26
Scoping studies (or reviews) are a method used to comprehensively map evidence across a range of study designs in an area, with the aim of informing future research practice, programs and policy. However, no universal agreement exists on terminology, definition or methodological steps. Our aim was to understand the experiences of, and considerations for conducting scoping studies from the perspective of academic and community partners. Primary objectives were to 1) describe experiences conducting scoping studies including strengths and challenges; and 2) describe perspectives on terminology, definition, and methodological steps. We conducted a cross-sectional web-based survey with clinicians, educators, researchers, knowledge users, representatives from community-based organizations, graduate students, and policy stakeholders with experience and/or interest in conducting scoping studies to gain an understanding of experiences and perspectives on the conduct and reporting of scoping studies. We administered an electronic self-reported questionnaire comprised of 22 items related to experiences with scoping studies, strengths and challenges, opinions on terminology, and methodological steps. We analyzed questionnaire data using descriptive statistics and content analytical techniques. Survey results were discussed during a multi-stakeholder consultation to identify key considerations in the conduct and reporting of scoping studies. Of the 83 invitations, 54 individuals (65 %) completed the scoping questionnaire, and 48 (58 %) attended the scoping study meeting from Canada, the United Kingdom and United States. Many scoping study strengths were dually identified as challenges including breadth of scope, and iterative process. No consensus on terminology emerged, however key defining features that comprised a working definition of scoping studies included the exploratory mapping of literature in a field; iterative process, inclusion of grey literature; no quality assessment of included studies, and an optional consultation phase. We offer considerations for the conduct and reporting of scoping studies for researchers, clinicians and knowledge users engaging in this methodology. Lack of consensus on scoping terminology, definition and methodological steps persists. Reasons for this may be attributed to diversity of disciplines adopting this methodology for differing purposes. Further work is needed to establish guidelines on the reporting and methodological quality assessment of scoping studies.
Analytical aspects of plant metabolite profiling platforms: current standings and future aims.
Seger, Christoph; Sturm, Sonja
2007-02-01
Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.
Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control †
Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob
2017-01-01
Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant’s intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms. PMID:28208697
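A minimal sketch of the hybrid idea, an analytical model plus a learned error model whose outputs are summed, might look as follows; the one-degree-of-freedom dynamics and the ridge-regression error model are placeholder assumptions, not the models used in the paper.

```python
# Hybrid feed-forward sketch: analytical model + learned error model.
# The 1-DOF pendulum dynamics and ridge-regression error model are
# illustrative stand-ins, not the paper's models.
import numpy as np

def analytical_torque(q, qd, qdd, m=1.0, l=0.5, g=9.81, b=0.1):
    """Idealized 1-DOF pendulum inverse dynamics (no unmodeled effects)."""
    return m * l**2 * qdd + b * qd + m * g * l * np.sin(q)

# Training data: states and the torque residual measured on the "real" plant
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))                 # (q, qd, qdd) samples
true_residual = 0.3 * np.tanh(2 * X[:, 1]) + 0.05     # e.g., unmodeled friction
y = true_residual + 0.01 * rng.standard_normal(200)

def features(X):
    """Simple nonlinear features of the state for the error model."""
    return np.column_stack([X, np.tanh(2 * X), np.ones(len(X))])

Phi = features(X)
lam = 1e-3                                            # ridge regularizer
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

def hybrid_torque(q, qd, qdd):
    """Analytical feed-forward plus learned error correction."""
    x = np.array([[q, qd, qdd]])
    return analytical_torque(q, qd, qdd) + features(x) @ w

print(hybrid_torque(0.2, 0.5, 0.1))
```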
NASA Astrophysics Data System (ADS)
Javier Romualdez, Luis
Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT coupled with its observational efficiency, image quality, and accessibility rival state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
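A toy single-axis version of such a pointing loop, a PID regulator on a rigid body, illustrates the control problem; the inertia, gains, and disturbance are hypothetical and unrelated to SuperBIT's actual estimator and controller.

```python
# Illustrative single-axis pointing-control loop (hypothetical parameters,
# not SuperBIT's design): a PID regulator driving a rigid-body axis toward
# zero pointing error under a slowly varying disturbance torque.
import numpy as np

I = 50.0                  # axis moment of inertia [kg m^2] (hypothetical)
kp, ki, kd = 400.0, 60.0, 180.0
dt, T = 0.001, 5.0

theta, omega, integ = np.deg2rad(0.05), 0.0, 0.0   # initial 180-arcsec error
history = []
for k in range(int(T / dt)):
    err = -theta                        # target attitude is zero
    integ += err * dt
    torque = kp * err + ki * integ - kd * omega
    disturbance = 0.02 * np.sin(0.5 * k * dt)       # e.g., wind-induced [N m]
    omega += (torque + disturbance) / I * dt
    theta += omega * dt
    history.append(theta)

rms_arcsec = np.rad2deg(np.sqrt(np.mean(np.square(history[2000:])))) * 3600
print(f"steady-state RMS pointing error ~ {rms_arcsec:.2f} arcsec")
```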
Effects of space environment on composites: An analytical study of critical experimental parameters
NASA Technical Reports Server (NTRS)
Gupta, A.; Carroll, W. F.; Moacanin, J.
1979-01-01
A generalized methodology, currently employed at JPL, was used to develop an analytical model for the effects of high-energy electrons and the interactions between electron and ultraviolet effects. Chemical kinetic concepts were applied in defining quantifiable parameters; the need for determining short-lived transient species and their concentrations was demonstrated. The results demonstrate a systematic and cost-effective means of addressing the issues and show applicable qualitative and quantitative relationships between space radiation and simulation parameters. An equally important result is the identification of critical initial experiments necessary to further clarify these relationships. Topics discussed include facility and test design; rastered vs. diffuse continuous e-beam; valid acceleration level; simultaneous vs. sequential exposure to different types of radiation; and interruption of test continuity.
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
Incorporating Information Literacy Skills into Analytical Chemistry: An Evolutionary Step
ERIC Educational Resources Information Center
Walczak, Mary M.; Jackson, Paul T.
2007-01-01
The American Chemical Society (ACS) has recently decided to incorporate various information literacy skills for teaching analytical chemistry to the students. The methodology has been found to be extremely effective, as it provides better understanding to the students.
Green approach using monolithic column for simultaneous determination of coformulated drugs.
Yehia, Ali M; Mohamed, Heba M
2016-06-01
Green chemistry and sustainability are now entirely encompassed across the majority of pharmaceutical companies and research labs. Researchers' attention is drawn toward implementing the green analytical chemistry principles for more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of an analytical procedure. By using safer solvents, the greenness profile of a methodology can be increased remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using a monolithic column and green solvents as the mobile phase. The use of a monolithic column allows efficient separation protocols at higher flow rates, which results in short analysis times. A two-factor, three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using the eco-scale as a green metric and was found to be an excellent green method with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and amount of produced waste. The proposed method improved the environmental impact without compromising the analytical performance criteria and could be used as a safer alternative for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
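A two-factor, three-level (3^2) design enumerates nine runs; a sketch of generating such a grid, with hypothetical factor names and levels:

```python
# Hypothetical 3^2 full-factorial grid for chromatographic optimization.
# Factor names and level values are illustrative, not the paper's settings.
from itertools import product

flow_rate_ml_min = [1.0, 2.0, 3.0]     # monolithic columns tolerate high flow
ethanol_percent = [10, 20, 30]         # greener organic-modifier levels

design = list(product(flow_rate_ml_min, ethanol_percent))
for run, (flow, etoh) in enumerate(design, start=1):
    print(f"run {run}: flow = {flow} mL/min, ethanol = {etoh}%")
```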
Reynolds, Arthur J.; Ou, Suh-Ruu
2015-01-01
This article reviews methodological and analytic approaches, and impact evidence for understanding the mechanisms of effects of early childhood interventions, including delinquency and violence prevention. Illustrations from longitudinal studies of preschool preventive interventions are provided. We restrict our attention to preventive interventions for children from birth to age 5, including evidence from the Chicago Longitudinal Study (CLS), which investigates the impact of an established school-based early childhood intervention. Frameworks and evidence will be organized according to the 5-Hypothesis Model (5HM), which postulates that a variety of early childhood interventions impact later well-being through the promotion of cognitive and scholastic advantages, motivational advantages, social adjustment, family support behaviors, and school supports. Recommendations are made for advancing confirmatory approaches for identifying the most effective prevention programs using identification of generative mechanisms as a major methodological criterion. PMID:26497315
Measuring allostatic load in the workforce: a systematic review
MAUSS, Daniel; LI, Jian; SCHMIDT, Burkhard; ANGERER, Peter; JARCZOK, Marc N.
2014-01-01
The Allostatic Load Index (ALI) has been used to establish associations between stress and health-related outcomes. This review summarizes the measurement and methodological challenges of allostatic load in occupational settings. Databases of Medline, PubPsych, and Cochrane were searched to systematically explore studies measuring ALI in working adults following the PRISMA statement. Study characteristics, biomarkers and methods were tabulated. Methodological quality was evaluated using a standardized checklist. Sixteen articles (2003–2013) met the inclusion criteria, with a total of 39 (range 6–17) different variables used to calculate ALI. Substantial heterogeneity was observed in the number and type of biomarkers used, the analytic techniques applied and study quality. Particularly, primary mediators were not regularly included in ALI calculation. Consensus on methods to measure ALI in working populations is limited. Research should include longitudinal studies using multi-systemic variables to measure employees at risk for biological wear and tear. PMID:25224337
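One common operationalization, used with variations across the reviewed studies, scores ALI as the count of biomarkers falling in the highest-risk quartile of the sample distribution; a sketch with invented biomarkers:

```python
# Sketch of the quartile-count ALI: one point per biomarker in the
# highest-risk quartile of the sample distribution. Biomarker names,
# values, and risk directions are invented; studies differ on all three.
import numpy as np

rng = np.random.default_rng(1)
n = 500
biomarkers = {                        # (values, risk direction)
    "systolic_bp": (rng.normal(125, 15, n), "high"),
    "hdl_chol":    (rng.normal(55, 12, n), "low"),   # low HDL = high risk
    "cortisol":    (rng.lognormal(2.0, 0.4, n), "high"),
    "crp":         (rng.lognormal(0.0, 0.8, n), "high"),
}

ali = np.zeros(n, dtype=int)
for values, direction in biomarkers.values():
    if direction == "high":
        ali += values >= np.quantile(values, 0.75)
    else:
        ali += values <= np.quantile(values, 0.25)

print("mean ALI:", ali.mean(), "| max possible:", len(biomarkers))
```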
Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H
1984-01-01
The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.
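The three quantities are linked by the standard variance-components identity, sketched here in generic notation:

```latex
s_{\text{total}}^2 \;=\; s_{\text{within}}^2 \;+\; s_{\text{between}}^2,
\qquad \mathrm{CV} \;=\; \frac{s}{\bar{x}} \times 100\,\%.
```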
Nowak, Peter
2011-03-01
There is a broad range of qualitative linguistic research (sequential analysis) on doctor-patient interaction that has had only a marginal impact on clinical research and practice. At least in part this is due to the lack of qualitative research synthesis in the field. Available research summaries are not systematic in their methodology. This paper proposes a synthesis methodology for qualitative, sequential analytic research on doctor-patient interaction. The presented methodology is not new but specifies the standard methodology of qualitative research synthesis for sequential analytic research. This pilot review synthesizes twelve studies on German-speaking doctor-patient interactions, identifies 45 verbal actions of doctors and structures them in a systematics of eight interaction components. Three interaction components ("Listening", "Asking for information", and "Giving information") seem to be central and cover two thirds of the identified action types. This pilot review demonstrates that sequential analytic research can be synthesized in a consistent and meaningful way, thus providing a more comprehensive and unbiased integration of research. Future synthesis of qualitative research in the area of health communication research is very much needed. Qualitative research synthesis can support the development of quantitative research and of educational materials in medical training and patient training. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
DOT National Transportation Integrated Search
2000-04-01
This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...
Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L
2012-10-01
Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
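The standard-addition calculation itself is a straight-line extrapolation: the endogenous concentration is the magnitude of the x-intercept of signal versus added concentration. A sketch with hypothetical numbers:

```python
# Standard-addition sketch: fit signal vs. added concentration and read
# the endogenous level from the x-intercept. Values are hypothetical.
import numpy as np

added = np.array([0.0, 10.0, 20.0, 40.0])       # spiked amino acid, uM
signal = np.array([1.52, 2.49, 3.53, 5.48])     # LC-MS/MS peak-area ratio

slope, intercept = np.polyfit(added, signal, 1)
endogenous = intercept / slope                   # = |x-intercept|
print(f"endogenous concentration ~ {endogenous:.1f} uM")
```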
Review of calcium methodologies.
Zak, B; Epstein, E; Baginski, E S
1975-01-01
A review of calcium methodologies for serum is presented. The analytical systems developed over the past century are classified by type, beginning with gravimetry and extending to isotope dilution-mass spectrometry, covering all of the commonly used techniques which have evolved during that period. Screening and referee procedures are discussed, along with the comparative sensitivities of atomic absorption spectrophotometry and molecular absorption spectrophotometry. A procedure involving a simple direct reaction for serum calcium using cresolphthalein complexone is recommended, in which high blanks are minimized by repressing the ionization of the color reagent through lowering the dielectric constant of the mixture with dimethylsulfoxide. Reaction characteristics, errors which can be encountered, normal ranges and an interpretative resume are included in its discussion.
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the sources of water pollution.
User-Centered Evaluation of Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean C.
Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging, as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI who are interested in visual analytics will find this information useful, as well as a discussion of changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis and analysis techniques and problems is provided as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics to conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view work in user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term "users" in almost all situations, there is a wide variety of users who all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting.
Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.
[Theoretical and methodological uses of research in Social and Human Sciences in Health].
Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein
2012-12-01
The current article aims to map and critically reflect on the theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and in the instrumental use of qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.
Cognitive Foundations for Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Noonan, Christine F.; Franklin, Lyndsey
In this report, we provide an overview of scientific/technical literature on information visualization and VA. Topics discussed include an update and overview of the extensive literature search conducted for this study, the nature and purpose of the field, major research thrusts, and scientific foundations. We review methodologies for evaluating and measuring the impact of VA technologies as well as taxonomies that have been proposed for various purposes to support the VA community. A cognitive science perspective underlies each of these discussions.
Transient and steady state viscoelastic rolling contact
NASA Technical Reports Server (NTRS)
Padovan, J.; Paramadilok, O.
1985-01-01
Based on moving total Lagrangian coordinates, a so-called traveling Hughes-type contact strategy is developed. Employing the modified contact scheme in conjunction with a traveling finite element strategy, an overall solution methodology is developed to handle transient and steady viscoelastic rolling contact. To verify the scheme, the results of both experimental and analytical benchmarking are presented. The experimental benchmarking includes the handling of rolling tires up to their upper-bound behavior, namely the standing wave response.
1990 National Water Quality Laboratory Services Catalog
Pritt, Jeffrey; Jones, Berwyn E.
1989-01-01
PREFACE This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters which are measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232 '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.
A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses
Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert
2011-01-01
Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325
Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray
It is often vital to identify, prioritize, and select quality improvement projects in a hospital. Yet a methodology that utilizes experts' opinions with different points of view is needed for better decision making. The proposed methodology utilizes the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using an analytic hierarchy process weighting scheme that aggregates experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology is implemented for improving a hospital appointment system. The two top-ranked major project categories for improvement were identified to be system- and accessibility-related causes (45%) and capacity-related causes (28%), respectively. For each major project category, subprojects were then ranked for selecting the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementations are provided.
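The AHP weighting step reduces to taking the principal eigenvector of a pairwise-comparison matrix; a sketch with a hypothetical 3x3 judgment matrix on Saaty's 1-9 scale:

```python
# AHP priority weights from a pairwise-comparison matrix (hypothetical
# judgments): weights are the normalized principal eigenvector; the
# consistency index CI measures how self-consistent the judgments are.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # system/accessibility vs. others
              [1/3, 1.0, 2.0],     # capacity-related causes
              [1/5, 1/2, 1.0]])    # remaining causes

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
print("weights:", np.round(w, 3), "| CI:", round(ci, 4))
```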
COMPARISON OF ANALYTICAL METHODS FOR THE MEASUREMENT OF NON-VIABLE BIOLOGICAL PM
The paper describes a preliminary research effort to develop a methodology for the measurement of non-viable biologically based particulate matter (PM), analyzing for mold, dust mite, and ragweed antigens and endotoxins. Using a comparison of analytical methods, the research obj...
Cognitive-Developmental and Behavior-Analytic Theories: Evolving into Complementarity
ERIC Educational Resources Information Center
Overton, Willis F.; Ennis, Michelle D.
2006-01-01
Historically, cognitive-developmental and behavior-analytic approaches to the study of human behavior change and development have been presented as incompatible alternative theoretical and methodological perspectives. This presumed incompatibility has been understood as arising from divergent sets of metatheoretical assumptions that take the form…
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
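A toy version of this coarse-to-fine scheme, first assigning a chemical family and then a species within it, each level using its own feature subset, might be written as follows; all class names, features, and responses are invented:

```python
# Toy hierarchical (coarse-to-fine) recognition: nearest centroid at the
# family level, then nearest centroid within the chosen family, using a
# different feature subset per level. Classes and responses are invented.
import numpy as np

rng = np.random.default_rng(2)

def centroid(samples):
    return samples.mean(axis=0)

# 4-feature "sensor-array" responses for two families, two species each
train = {
    ("hydrocarbon", "ethane"):  rng.normal([1, 1, 5, 2], 0.1, (20, 4)),
    ("hydrocarbon", "propane"): rng.normal([1, 1, 2, 5], 0.1, (20, 4)),
    ("alcohol", "methanol"):    rng.normal([5, 2, 1, 1], 0.1, (20, 4)),
    ("alcohol", "ethanol"):     rng.normal([2, 5, 1, 1], 0.1, (20, 4)),
}

family_feats = slice(0, 2)      # features tuned to family separation
species_feats = slice(2, 4)     # features tuned to within-family separation

groups = {}
for (fam, _), s in train.items():
    groups.setdefault(fam, []).append(s)
family_cent = {f: centroid(np.vstack(v))[family_feats] for f, v in groups.items()}

def classify(x):
    fam = min(family_cent,
              key=lambda f: np.linalg.norm(x[family_feats] - family_cent[f]))
    species = {sp: centroid(s)[species_feats]
               for (f, sp), s in train.items() if f == fam}
    return fam, min(species,
                    key=lambda sp: np.linalg.norm(x[species_feats] - species[sp]))

print(classify(np.array([1.0, 1.1, 4.8, 2.1])))   # -> ('hydrocarbon', 'ethane')
```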
How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.
Levitt, Heidi M
2018-05-01
Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Through the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.
Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda
2013-11-01
Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, and this will inform their choice of review methods. These may range from standard meta-analysis through to more complex mixed-method synthesis and synthesis approaches that incorporate theory and/or user's perspectives. Copyright © 2013 Elsevier Inc. All rights reserved.
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Schmidt, D. K.; Anderson, M. R.
1985-01-01
Optimal-control-theoretic modeling and frequency-domain analysis constitute the methodology proposed to evaluate analytically the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability are then evaluated using frequency-domain techniques. When these techniques were applied to the flight-test data for thirty-two highly augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.
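The stability-robustness side of such an analysis amounts to reading the gain-crossover frequency and phase margin off the open-loop frequency response; the sketch below uses an illustrative crossover-model-like open loop with pilot time delay, not any of the test configurations.

```python
# Frequency-domain sketch: gain-crossover frequency and phase margin of an
# illustrative open-loop pilot/vehicle transfer function
#   L(s) = K * exp(-tau*s) / (s * (T*s + 1)),
# i.e., crossover-model-like dynamics with pilot time delay tau.
import numpy as np

K, tau, T = 2.0, 0.3, 0.2
w = np.logspace(-1, 2, 2000)                 # frequency grid, rad/s
s = 1j * w
L = K * np.exp(-tau * s) / (s * (T * s + 1))

mag = np.abs(L)
i = np.argmin(np.abs(mag - 1.0))             # gain-crossover index
phase_deg = np.angle(L[i], deg=True)
print(f"crossover ~ {w[i]:.2f} rad/s, phase margin ~ {180 + phase_deg:.1f} deg")
```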
Hybrid experimental/analytical models of structural dynamics - Creation and use for predictions
NASA Technical Reports Server (NTRS)
Balmes, Etienne
1993-01-01
An original complete methodology for the construction of predictive models of damped structural vibrations is introduced. A consistent definition of normal and complex modes is given which leads to an original method to accurately identify non-proportionally damped normal mode models. A new method to create predictive hybrid experimental/analytical models of damped structures is introduced, and the ability of hybrid models to predict the response to system configuration changes is discussed. Finally a critical review of the overall methodology is made by application to the case of the MIT/SERC interferometer testbed.
Abramovitch, Amitai; Mittelman, Andrew; Tankersley, Amelia P; Abramowitz, Jonathan S; Schweiger, Avraham
2015-07-30
The inconsistent nature of the neuropsychology literature pertaining to obsessive-compulsive disorder (OCD) has long been recognized. However, individual studies, systematic reviews, and recent meta-analytic reviews were unsuccessful in establishing a consensus regarding a disorder-specific neuropsychological profile. In an attempt to identify methodological factors that may contribute to the inconsistency that is characteristic of this body of research, a systematic review of methodological factors in studies comparing OCD patients and non-psychiatric controls on neuropsychological tests was conducted. This review covered 115 studies that included nearly 3500 patients. Results revealed a range of methodological weaknesses. Some of these weaknesses have been previously noted in the broader neuropsychological literature, while some are more specific to psychiatric disorders, and to OCD. These methodological shortcomings have the potential to hinder the identification of a specific neuropsychological profile associated with OCD as well as to obscure the association between neurocognitive dysfunctions and contemporary neurobiological models. Rectifying these weaknesses may facilitate replicability, and promote our ability to extract cogent, meaningful, and more unified inferences regarding the neuropsychology of OCD. To that end, we present a set of methodological recommendations to facilitate future neuropsychology research in psychiatric disorders in general, and in OCD in particular. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Big data analytics in healthcare: promise and potential.
Raghupathi, Wullianallur; Raghupathi, Viju
2014-01-01
To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.
Galfi, Istvan; Virtanen, Jorma; Gasik, Michael M.
2017-01-01
A new, faster, and more reliable analytical methodology for the analysis of S(IV) species in low-pH solutions by bichromatometry is proposed. For decades the state-of-the-art methodology has been iodometry, which remains well justified for neutral solutions but suffers from various side reactions in low-pH media, increasing inaccuracy. In contrast, the new methodology has no side reactions in low-pH media, requires only one titration step, and provides a clear color change if S(IV) species are present in the solution. The method is validated using model solutions with known concentrations and applied to analyses of gaseous SO2 purged from solution in low-pH media samples. The results indicate that bichromatometry can accurately analyze SO2 in liquid samples having pH even below 0, relevant to metallurgical industrial processes. PMID:29145479
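For reference, the redox stoichiometry underlying such a titration in acidic media is standard chemistry (the paper's exact protocol may differ):

```latex
\mathrm{Cr_2O_7^{2-}} + 3\,\mathrm{SO_3^{2-}} + 8\,\mathrm{H^+}
\;\longrightarrow\; 2\,\mathrm{Cr^{3+}} + 3\,\mathrm{SO_4^{2-}} + 4\,\mathrm{H_2O},
```

so each mole of dichromate consumed corresponds to three moles of S(IV), i.e. $n_{\mathrm{S(IV)}} = 3\,c_{\mathrm{titrant}}\,V_{\mathrm{titrant}}$.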
Background for Joint Systems Aspects of AIR 6000
2000-04-01
[Report excerpt fragments] ...Checkland's Soft Systems Methodology [7, 8, 9]. The analytical techniques that are proposed for joint systems work are based on calculating probability... Glossary entries: SLMP, Structural Life Management Plan; SOW, Stand-Off Weapon; SSM, Soft Systems Methodology; UAV, Uninhabited Aerial Vehicle. References cited include Soft Systems Methodology in Action, John Wiley & Sons, Chichester, 1990, and [10] Pearl, Judea, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference.
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies have a strong association with KM processes for attaining competitive advantage. KM is strongly influenced by human and social factors, which an efficient system run under BI tactics and technologies can turn into the most valuable assets. Predictive analytics, however, is rooted in the field of BI, and extracting tacit knowledge so that it can serve as a new source for BI analysis remains a major challenge. Advanced analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive, and prescriptive outputs. This remains a substantial open challenge, and this paper elaborates on it as initial work.
2017-08-01
[Abstract fragments] ...of metallic additive manufacturing processes, and show that combining experimental data with modelling and advanced data processing and analytics methods will accelerate that... ...geometries, we develop a methodology that couples experimental data and modelling to convert the scan paths into spatially resolved local thermal histories.
Design and analysis of composite structures with stress concentrations
NASA Technical Reports Server (NTRS)
Garbo, S. P.
1983-01-01
An overview of an analytic procedure which can be used to provide comprehensive stress and strength analysis of composite structures with stress concentrations is given. The methodology provides designer/analysts with a user-oriented procedure which, within acceptable engineering accuracy, accounts for the effects of a wide range of application design variables. The procedure permits the strength of arbitrary laminate constructions under general bearing/bypass load conditions to be predicted with only unnotched unidirectional strength and stiffness input data required. Included is a brief discussion of the relevance of this analysis to the design of primary aircraft structure; an overview of the analytic procedure with theory/test correlations; and an example of the use and interaction of this strength analysis relative to the design of high-load transfer bolted composite joints.
NASA Technical Reports Server (NTRS)
Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.;
2007-01-01
The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide field-of-view imaging interferometry using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.
Advances in spatial epidemiology and geographic information systems.
Kirby, Russell S; Delmelle, Eric; Eberth, Jan M
2017-01-01
The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and identify opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.
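Distance estimation is one of the building blocks listed above. A minimal, self-contained example of a great-circle (haversine) distance between two geocoded points follows; the coordinates are arbitrary illustrations, e.g., a residence-to-clinic access metric.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) between two geocoded points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative pair of coordinates (two cities in the southeastern US)
print(round(haversine_km(35.2271, -80.8431, 34.0007, -81.0348), 1))
```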
Quantitation of DNA adducts by stable isotope dilution mass spectrometry
Tretyakova, Natalia; Goggin, Melissa; Janis, Gregory
2012-01-01
Exposure to endogenous and exogenous chemicals can lead to the formation of structurally modified DNA bases (DNA adducts). If not repaired, these nucleobase lesions can cause polymerase errors during DNA replication, leading to heritable mutations potentially contributing to the development of cancer. Due to their critical role in cancer initiation, DNA adducts represent mechanism-based biomarkers of carcinogen exposure, and their quantitation is particularly useful for cancer risk assessment. DNA adducts are also valuable in mechanistic studies linking tumorigenic effects of environmental and industrial carcinogens to specific electrophilic species generated from their metabolism. While multiple experimental methodologies have been developed for DNA adduct analysis in biological samples – including immunoassay, HPLC, and 32P-postlabeling – isotope dilution high performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) generally has superior selectivity, sensitivity, accuracy, and reproducibility. As typical DNA adduct concentrations in biological samples are between 0.01 and 10 adducts per 10^8 normal nucleotides, ultrasensitive HPLC-ESI-MS/MS methodologies are required for their analysis. Recent developments in analytical separations and biological mass spectrometry – especially nanoflow HPLC, nanospray ionization MS, chip-MS, and high resolution MS – have pushed the limits of analytical HPLC-ESI-MS/MS methodologies for DNA adducts, allowing researchers to accurately measure their concentrations in biological samples from patients treated with DNA alkylating drugs and in populations exposed to carcinogens from urban air, drinking water, cooked food, alcohol, and cigarette smoke. PMID:22827593
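Quantitation in stable isotope dilution follows from the analyte-to-internal-standard peak-area ratio. The sketch below shows that arithmetic, normalized to adducts per 10^8 nucleotides; the peak areas, spike amount, and the rule of thumb that 1 ug of DNA corresponds to roughly 3 nmol of nucleotides are illustrative assumptions, not values from the review.

```python
def adducts_per_1e8(area_analyte, area_is, is_spike_fmol,
                    dna_nucleotides_nmol, response_factor=1.0):
    """Adducts per 10^8 normal nucleotides from one ID-HPLC-ESI-MS/MS run.
    The isotope-labeled internal standard (IS) is spiked at a known amount;
    the response factor comes from calibration standards."""
    adduct_mol = (area_analyte / area_is) * is_spike_fmol / response_factor * 1e-15
    nucleotide_mol = dna_nucleotides_nmol * 1e-9
    return adduct_mol / nucleotide_mol * 1e8

# 10 ug DNA ~ 30 nmol nucleotides (rule-of-thumb assumption)
print(adducts_per_1e8(area_analyte=9800.0, area_is=98000.0,
                      is_spike_fmol=10.0, dna_nucleotides_nmol=30.0))
# -> ~3.3 adducts per 10^8 nucleotides, inside the typical range quoted above
```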
Econometric model for age- and population-dependent radiation exposures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandquist, G.M.; Slaughter, D.M.; Rogers, V.C.
1991-01-01
The economic impact associated with ionizing radiation exposures in a given human population depends on numerous factors including the individual's mean economic status as a function of age, the age distribution of the population, the future life expectancy at each age, and the latency period for the occurrence of radiation-induced health effects. A simple mathematical model has been developed that provides an analytical methodology for estimating the societal econometrics associated with radiation exposures, so that radiation effects can be assessed and compared for economic evaluation.
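A hypothetical sketch of the kind of aggregation such a model performs is given below: an age distribution, an age-dependent earnings profile, a latency period, and a linear risk factor are combined into an expected societal cost. The functional form and every number are illustrative assumptions, not the authors' model.

```python
import numpy as np

ages = np.arange(0, 90)
population = np.full(ages.size, 1000.0)   # persons per age bin (assumed)
dose_sv = 0.01                            # uniform dose [Sv] (assumed)
risk_per_sv = 5e-2                        # excess health effects per person-Sv (assumed)
latency = 10                              # years from exposure to health effect (assumed)
life_exp = 85                             # simple fixed life expectancy (assumed)

def annual_earnings(age):
    """Crude earnings profile standing in for mean economic status vs. age."""
    return 40_000.0 if 20 <= age <= 65 else 0.0

cost = 0.0
for a, n in zip(ages, population):
    onset = a + latency
    if onset < life_exp:                  # effect occurs within the lifetime
        years_lost = 5                    # assumed productivity loss per effect
        loss = sum(annual_earnings(onset + k) for k in range(years_lost))
        cost += n * dose_sv * risk_per_sv * loss
print(f"expected societal cost: ${cost:,.0f}")
```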
Static Strength Characteristics of Mechanically Fastened Composite Joints
NASA Technical Reports Server (NTRS)
Fox, D. E.; Swaim, K. W.
1999-01-01
The analysis of mechanically fastened composite joints presents a great challenge to structural analysts because of the large number of parameters that influence strength. These parameters include edge distance, width, bolt diameter, laminate thickness, ply orientation, and bolt torque. The research presented in this report investigates the influence of some of these parameters through testing and analysis. A methodology is presented for estimating the strength of the bolt-hole based on classical lamination theory using the Tsai-Hill failure criterion and typical bolt-hole bearing analytical methods.
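For reference, the Tsai-Hill criterion referred to above reduces, for a ply under plane stress, to a single failure index; a minimal evaluation is sketched below with illustrative strength and stress values that are not the report's data.

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill failure index for a ply under plane stress.
    s1, s2: stresses along/transverse to the fibers; t12: in-plane shear.
    X, Y, S: longitudinal, transverse, and shear strengths.
    Failure is predicted when the index reaches 1."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# Illustrative carbon/epoxy ply stresses (MPa) near a loaded bolt-hole
print(tsai_hill_index(s1=900.0, s2=20.0, t12=40.0, X=1500.0, Y=40.0, S=70.0))
# -> ~0.93 (< 1), so no ply failure is predicted at this load level
```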
Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; Design, PNRP; Committee, Analysis
2013-01-01
Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately, 2) combining data from all projects and performing an individual-level analysis, 3) pooling data from projects having similar study designs, 4) analyzing pooled data using a prospective meta-analytic technique, and 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, in how appropriately they accounted for differing study designs, and in their susceptibility to differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study designs are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
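As a concrete instance of approach 4 above, a fixed-effect inverse-variance pooling of per-project effect estimates takes only a few lines; the effects and standard errors below are invented for illustration and are not PNRP results.

```python
import numpy as np

# Fixed-effect, inverse-variance pooling of per-project effect estimates
# (e.g., log odds ratios for a navigation outcome); illustrative values only.
effects = np.array([0.12, 0.25, 0.08, 0.30, 0.18])
se = np.array([0.10, 0.15, 0.09, 0.20, 0.12])

w = 1.0 / se**2                                  # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(round(pooled, 3), tuple(round(x, 3) for x in ci))
```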
Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik
2009-12-01
The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
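A decision-analytic model of the kind advocated above can be as simple as an expected-utility comparison of "test then treat", "treat all", and "treat none" strategies. In the sketch below, the prevalence, test characteristics, and utilities are illustrative assumptions, not values from the article.

```python
# Minimal decision-analytic comparison for a diagnostic test; inputs are illustrative.
def expected_utility_test(prev, sens, spec, u_tp, u_fp, u_fn, u_tn):
    """Expected utility of 'test then treat positives'."""
    return (prev * (sens * u_tp + (1 - sens) * u_fn)
            + (1 - prev) * ((1 - spec) * u_fp + spec * u_tn))

prev, sens, spec = 0.10, 0.90, 0.85
u_tp, u_fp, u_fn, u_tn = 0.95, 0.90, 0.40, 1.00   # outcome utilities on a 0-1 scale

eu_test = expected_utility_test(prev, sens, spec, u_tp, u_fp, u_fn, u_tn)
eu_treat_all = prev * u_tp + (1 - prev) * u_fp
eu_treat_none = prev * u_fn + (1 - prev) * u_tn
print(max(("test", eu_test), ("treat all", eu_treat_all),
          ("treat none", eu_treat_none), key=lambda kv: kv[1]))
```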
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.
Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review.
Lambe, Kathryn Ann; O'Reilly, Gary; Kelly, Brendan D; Curristan, Sarah
2016-10-01
Diagnostic error incurs enormous human and economic costs. The dual-process model of reasoning provides a framework for understanding the diagnostic process and attributes certain errors to faulty cognitive shortcuts (heuristics). The literature contains many suggestions to counteract these shortcuts and to enhance analytical and non-analytical modes of reasoning. To identify, describe and appraise studies that have empirically investigated interventions to enhance analytical and non-analytical reasoning among medical trainees and doctors, and to assess their effectiveness. Systematic searches of five databases were carried out (Medline, PsycInfo, Embase, Education Resource Information Centre (ERIC) and Cochrane Database of Controlled Trials), supplemented with searches of bibliographies and relevant journals. Included studies evaluated an intervention to enhance analytical and/or non-analytical reasoning among medical trainees or doctors. Twenty-eight studies were included under five categories: educational interventions, checklists, cognitive forcing strategies, guided reflection, instructions at test and other interventions. While many of the studies found some effect of interventions, guided reflection interventions emerged as the most consistently successful across five studies, and cognitive forcing strategies improved accuracy and confidence judgements. Significant heterogeneity of measurement approaches was observed, and existing studies are largely limited to early-career doctors. Results to date are promising, and this relatively young field is now close to a point where these kinds of cognitive interventions can be recommended to educators. Further research with refined methodology and more diverse samples is required before firm recommendations may be made for medical education and policy; however, these results suggest that such interventions hold promise, with much current enthusiasm for new research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
ERIC Educational Resources Information Center
Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew
2015-01-01
Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…
Analytical Methods of Decoupling the Automotive Engine Torque Roll Axis
NASA Astrophysics Data System (ADS)
JEONG, TAESEOK; SINGH, RAJENDRA
2000-06-01
This paper analytically examines the multi-dimensional mounting schemes of an automotive engine-gearbox system when excited by oscillating torques. In particular, the issue of torque roll axis decoupling is analyzed in significant detail since it is poorly understood. New dynamic decoupling axioms are presented and compared with the conventional elastic axis mounting and focalization methods. A linear time-invariant system assumption is made, in addition to a proportionally damped system assumption. Only rigid-body modes of the powertrain are considered and the chassis elements are assumed to be rigid. Several simplified physical systems are considered and new closed-form solutions for symmetric and asymmetric engine-mounting systems are developed. These clearly explain the design concepts for the 4-point mounting scheme. Our analytical solutions match the existing design formulations that are only applicable to symmetric geometries. Spectra for all six rigid-body motions are predicted using the alternate decoupling methods and the closed-form solutions are verified. Also, our method is validated by comparing modal solutions with prior experimental and analytical studies. Parametric design studies are carried out to illustrate the methodology. Chief contributions of this research include the development of new or refined analytical models and closed-form solutions along with improved design strategies for the torque roll axis decoupling.
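The decoupling check itself reduces to a small generalized eigenvalue problem for the rigid powertrain on its mounts. The sketch below sets up an illustrative 6x6 system and reports how strongly the roll (torque-axis) coordinate participates in each mode; all matrix entries are invented for demonstration, not taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh

# Rigid powertrain on elastic mounts: solve the undamped 6-DOF generalized
# eigenproblem K q = w^2 M q and check roll-coordinate participation per mode.
m, Ixx, Iyy, Izz = 250.0, 10.0, 25.0, 22.0           # kg, kg m^2 (assumed)
M = np.diag([m, m, m, Ixx, Iyy, Izz])

K = np.diag([4e5, 6e5, 8e5, 3e4, 6e4, 5e4])           # mount stiffness terms (assumed)
K[3, 1] = K[1, 3] = 2e4                               # roll-lateral coupling (assumed)

w2, modes = eigh(K, M)                                # ascending eigenvalues w^2
freqs_hz = np.sqrt(w2) / (2 * np.pi)

roll = 3                                              # index of the roll coordinate
for f, phi in zip(freqs_hz, modes.T):
    phi = phi / np.max(np.abs(phi))
    print(f"{f:6.2f} Hz  roll participation {abs(phi[roll]):.2f}")
# Decoupling design goal: roll participation ~1 in exactly one mode, ~0 elsewhere.
```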
Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.
Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim
2016-04-01
Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well-suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validate FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method, and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF. It also demonstrates that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs, and provides a distinctive groundwork able to sustain future refinements. This paper suggests important features that may be included to improve model realism.
Green aspects, developments and perspectives of liquid phase microextraction techniques.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2014-02-01
Determination of analytes at trace levels in complex samples (e.g. biological or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples and, because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work, the newest trends and innovations in liquid phase microextraction, like single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention is given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also, new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of procedure prior to chromatographic determination, are presented. © 2013 Published by Elsevier B.V.
Horizon Missions Methodology - Using new paradigms to overcome conceptual blocks to innovation
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission Methodology was developed to provide a systematic analytical approach for evaluating and identifying technological requirements for breakthrough technology options (BTOs) and for assessing their potential to provide revolutionary capabilities for advanced space missions. Here, attention is given to the further use of the methodology as a new tool for a broader range of studies dealing with technology innovation and new technology paradigms.
Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System
NASA Astrophysics Data System (ADS)
Lee, Chang Jae; Yun, Jae Hee
2017-06-01
Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.
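The ART check at the heart of this methodology is ultimately a budget comparison: the sum of measured or analyzed channel delays must not exceed the ART from the safety analysis. A toy bookkeeping sketch follows; the component names, delays, and ART value are placeholders, not APR1400 or OPR1000 data.

```python
# Channel response-time bookkeeping against the analytical response time (ART).
components_ms = {
    "sensor/transmitter": 300,
    "signal conditioning": 50,
    "bistable/trip logic": 100,
    "actuation (breaker + rod release)": 150,
}
measured_total = sum(components_ms.values())
art_ms = 900                      # from the safety analysis (assumed value)
margin = art_ms - measured_total
print(f"total {measured_total} ms vs ART {art_ms} ms -> margin {margin} ms")
assert measured_total <= art_ms, "channel violates the analytical response time"
```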
Kang, Ju-Hee; Vanderstichele, Hugo; Trojanowski, John Q; Shaw, Leslie M
2012-04-01
The xMAP-Luminex multiplex platform for measurement of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers using Innogenetics AlzBio3 immunoassay reagents that are for research use only has been shown to be an effective tool for early detection of an AD-like biomarker signature based on concentrations of CSF Aβ(1-42), t-tau and p-tau(181). Among the several advantages of the xMAP-Luminex platform for AD CSF biomarkers are: a wide dynamic range of ready-to-use calibrators, time savings for the simultaneous analyses of three biomarkers in one analytical run, reduction of human error, potential of reduced cost of reagents, and a modest reduction of sample volume as compared to conventional enzyme-linked immunosorbent assay (ELISA) methodology. Recent clinical studies support the use of CSF Aβ(1-42), t-tau and p-tau(181) measurement using the xMAP-Luminex platform for the early detection of AD pathology in cognitively normal individuals, and for prediction of progression to AD dementia in subjects with mild cognitive impairment (MCI). Studies that have shown the prediction of risk for progression to AD dementia in MCI patients provide the basis for the use of CSF Aβ(1-42), t-tau and p-tau(181) testing to assign risk for progression in patients enrolled in therapeutic trials. Furthermore, emerging study data suggest that these pathologic changes occur in cognitively normal subjects 20 or more years before the onset of clinically detectable memory changes, thus providing an objective measurement for use in the assessment of treatment effects in primary treatment trials. However, numerous previous ELISA and Luminex-based multiplex studies reported a wide range of absolute values of CSF Aβ(1-42), t-tau and p-tau(181), indicative of substantial inter-laboratory variability as well as varying degrees of intra-laboratory imprecision. In order to address these issues, a recent inter-laboratory investigation that included a common set of CSF pool aliquots from controls as well as AD patients over a range of normal and pathological Aβ(1-42), t-tau and p-tau(181) values, as well as agreed-on standard operating procedures (SOPs), assessed the reproducibility of the multiplex methodology and Innogenetics AlzBio3 immunoassay reagents. This study showed within-center precision values of 5% to a little more than 10% and good inter-laboratory %CV values (10-20%). There are several likely factors influencing the variability of CSF Aβ(1-42), t-tau and p-tau(181) measurements. In this review, we describe the pre-analytical, analytical and post-analytical sources of variability, including sources inherent to kits, and describe procedures to decrease the variability. A CSF AD biomarker Quality Control program has been established and funded by the Alzheimer Association, and global efforts are underway to further define optimal pre-analytical SOPs and best practices for the methodologies available or in development, including plans for production of a standard reference material that could provide a common standard against which manufacturers of immunoassay kits would assign calibration standard values. Copyright © 2012 Elsevier Inc. All rights reserved.
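The precision figures quoted above are coefficient-of-variation computations. For concreteness, the sketch below computes within-lab and between-lab %CV for a shared CSF pool; the replicate values are made up for illustration.

```python
import numpy as np

# Within-center and between-center %CV for one analyte measured on a shared
# CSF pool across laboratories (illustrative replicate values, pg/mL).
abeta42 = {
    "lab A": [182, 190, 175, 188],
    "lab B": [205, 198, 210, 202],
    "lab C": [176, 181, 170, 179],
}
for lab, x in abeta42.items():
    x = np.asarray(x, float)
    print(lab, f"within-lab %CV = {100 * x.std(ddof=1) / x.mean():.1f}")

means = np.array([np.mean(v) for v in abeta42.values()])
print(f"between-lab %CV = {100 * means.std(ddof=1) / means.mean():.1f}")
```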
True, Lawrence D
2014-03-01
Paralleling the growth of ever more cost-efficient methods to sequence the whole genome in minute fragments of tissue has been the identification of increasingly numerous molecular abnormalities in cancers--mutations, amplifications, insertions and deletions of genes, and patterns of differential gene expression, i.e., overexpression of growth factors and underexpression of tumor suppressor genes. These abnormalities can be translated into assays to be used in clinical decision making. In general terms, the result of such an assay is subject to a large number of variables regarding the characteristics of the available sample, particularities of the assay used, and the interpretation of the results. This review discusses the effects of these variables on assays of tissue-based biomarkers, classified by macromolecule--DNA, RNA (including microRNA, messenger RNA, and long noncoding RNA), protein, and phosphoprotein. Since the majority of clinically applicable biomarkers are immunohistochemically detectable proteins, this review focuses on protein biomarkers. However, the principles outlined are mostly applicable to any other analyte. A variety of preanalytical variables impacts the results obtained, including analyte stability (which is different for different analytes, i.e., DNA, RNA, or protein), period of warm and of cold ischemia, fixation time, tissue processing, sample storage time, and storage conditions. In addition, assay variables play an important role, including reagent specificity (notably but not uniquely an issue concerning antibodies used in immunohistochemistry), technical components of the assay, quantitation, and assay interpretation. Finally, appropriateness of an assay for clinical application is an important issue. Reference is made to publicly available guidelines to improve on biomarker development in general and requirements for clinical use in particular. Strategic goals are formulated in order to improve the quality of biomarker reporting, including issues of analyte quality, experimental detail, assay efficiency and precision, and assay appropriateness.
Against Simplicity, against Ethics: Analytics of Disruption as Quasi-Methodology
ERIC Educational Resources Information Center
Childers, Sara M.
2012-01-01
Simplified understandings of qualitative inquiry as mere method overlook the complexity and nuance of qualitative practice. As is the call of this special issue, the author intervenes in the simplification of qualitative inquiry through a discussion of methodology to illustrate how theory and inquiry are inextricably linked and ethically…
The Integration of Project-Based Methodology into Teaching in Machine Translation
ERIC Educational Resources Information Center
Madkour, Magda
2016-01-01
This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…
Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey
2014-01-01
We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis," a term and...
Centroid and Theoretical Rotation: Justification for Their Use in Q Methodology Research
ERIC Educational Resources Information Center
Ramlo, Sue
2016-01-01
This manuscript's purpose is to introduce Q as a methodology before providing clarification about the preferred factor analytical choices of centroid and theoretical (hand) rotation. Stephenson, the creator of Q, designated that only these choices allowed for scientific exploration of subjectivity while not violating assumptions associated with…
The recent discovery of the pollution of the environment with Kepone has resulted in a tremendous interest in the development of residue methodology for the compound. Current multiresidue methods for the determination of the common organochlorinated pesticides do not yield good q...
The Nature of Educational Research
ERIC Educational Resources Information Center
Gillett, Simon G.
2011-01-01
The paper is in two parts. The first part of the paper is a critique of current methodology in educational research: scientific, critical and interpretive. The ontological and epistemological assumptions of those methodologies are described from the standpoint of John Searle's analytic philosophy. In the second part two research papers with…
Speciated arsenic in air: measurement methodology and risk assessment considerations.
Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L
2012-01-01
Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air on potential exposure and risks. The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate data more useful for arsenic inhalation risk assessment, and a more robust documentation of quality assurance/quality control (QA/QC) protocols is necessary to ensure accuracy, precision, representativeness, and comparability.
Plant gum identification in historic artworks
Granzotto, Clara; Arslanoglu, Julie; Rolando, Christian; Tokarski, Caroline
2017-01-01
We describe an integrated and straightforward new analytical protocol that identifies plant gums from various sample sources including cultural heritage. Our approach is based on the identification of saccharidic fingerprints using mass spectrometry following controlled enzymatic hydrolysis. We developed an enzyme cocktail suitable for plant gums of unknown composition. Distinctive MS profiles of gums such as arabic, cherry and locust-bean gums were successfully identified. A wide range of oligosaccharidic combinations of pentose, hexose, deoxyhexose and hexuronic acid were accurately identified in gum arabic, whereas cherry and locust bean gums showed Pent_xHex_y and Hex_n profiles, respectively. Optimized for low sample quantities, the analytical protocol was successfully applied to contemporary and historic samples including 'Colour Box Charles Roberson & Co', dating from the 1870s, and drawings by the American painter Arthur Dove (1880–1946). This is the first time that a gum has been accurately identified in a cultural heritage sample using structural information. Furthermore, this methodology is applicable to other domains (food, cosmetic, pharmaceutical, biomedical). PMID:28425501
Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array
NASA Astrophysics Data System (ADS)
Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng
2016-05-01
Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model is developed to quantify the number of pulses (#Pulse) the cell can bear before disturbance occurs under various sub-switching voltage stresses based on physical understanding. An evaluation methodology is proposed to assess the disturb behavior of half-selected (HS) cells in cross-point RRAM arrays by combining the analytical model and SPICE simulation. The characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated by the methodology. A possible solution to mitigate disturbance is proposed.
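The paper's analytical #Pulse model is physics-based; as a generic stand-in, disturb endurance under sub-switching stress is often summarized by an exponential voltage-acceleration law. The functional form and constants below are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

# Generic voltage-accelerated disturb model: the tolerable pulse count falls
# exponentially as the half-selected (HS) cell stress grows. Constants are
# placeholders, not values from the paper.
def pulses_to_disturb(v_stress, v0=0.25, n0=1e9):
    """Expected number of sub-switching pulses before an HS cell disturbs."""
    return n0 * np.exp(-v_stress / v0)

for v in (0.6, 0.9, 1.2):   # e.g., stresses seen under V/3 and V/2 bias schemes
    print(f"{v:.1f} V -> #Pulse ~ {pulses_to_disturb(v):.2e}")
```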
Learning Analytics in Higher Education Development: A Roadmap
ERIC Educational Resources Information Center
Adejo, Olugbenga; Connolly, Thomas
2017-01-01
The increase in education data and advances in technology are bringing about enhanced teaching and learning methodologies. The emerging field of Learning Analytics (LA) continues to seek ways to improve the different methods of gathering, analysing, managing and presenting learners' data with the sole aim of using it to improve the student learning…
ERIC Educational Resources Information Center
Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.
2008-01-01
This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…
Analytical Chemistry Division. Annual progress report for period ending December 31, 1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1981-05-01
This report is divided into: analytical methodology; mass and emission spectrometry; technical support; bio/organic analysis; nuclear and radiochemical analysis; quality assurance, safety, and tabulation of analyses; supplementary activities; and presentation of research results. Separate abstracts were prepared for the technical support, bio/organic analysis, and nuclear and radiochemical analysis. (DLC)
A Multidimensional Reappraisal of Language in Autism: Insights from a Discourse Analytic Study
ERIC Educational Resources Information Center
Sterponi, Laura; de Kirby, Kenton
2016-01-01
In this article, we leverage theoretical insights and methodological guidelines of discourse analytic scholarship to re-examine language phenomena typically associated with autism. Through empirical analysis of the verbal behavior of three children with autism, we engage the question of how prototypical features of autistic language--notably…
Improvements in analytical methodology have allowed low-level detection of an ever increasing number of pharmaceuticals, personal care products, hormones, pathogens and other contaminants of emerging concern (CECs). The use of these improved analytical tools has allowed researche...
ERIC Educational Resources Information Center
Reinholz, Daniel L.; Shah, Niral
2018-01-01
Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process that will be used to verify analytical data generated throughout the test period; identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed; and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping draws on multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology, and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high, and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east, at Monika and Sablette beaches. This technical approach relies on the efficiency of the GIS tool combined with the FAHP to help decision makers find optimal strategies to minimize coastal risks.
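The FAHP step generalizes classical AHP weight derivation with fuzzy pairwise judgments. The sketch below shows the crisp core of that computation, principal-eigenvector weights plus a consistency index, for an invented three-criterion comparison matrix; the fuzzy extension and the actual criteria weights are the paper's, not reproduced here.

```python
import numpy as np

# Classical AHP weights from a pairwise comparison matrix (illustrative
# 3-criterion example: erosion, elevation, distance to urban area).
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)                 # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()
print(np.round(w, 3))                    # criteria weights, sum to 1

ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
print(f"CI = {ci:.3f}")                  # compare against random index (RI = 0.58 for n = 3)
```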
Hogendoom, E A; Huls, R; Dijkman, E; Hoogerbrugge, R
2001-12-14
A screening method has been developed for the determination of acidic pesticides in various types of soils. The methodology is based on the use of microwave-assisted solvent extraction (MASE) for fast and efficient extraction of the analytes from the soils, and coupled-column reversed-phase liquid chromatography (LC-LC) with UV detection at 228 nm for the instrumental analysis of uncleaned extracts. Four types of soils, including sand, clay and peat, with a range in organic matter content of 0.3-13%, and ten acidic pesticides of different chemical families (bentazone, bromoxynil, metsulfuron-methyl, 2,4-D, MCPA, MCPP, 2,4-DP, 2,4,5-T, 2,4-DB and MCPB) were selected as matrices and analytes, respectively. The method developed included the selection of suitable MASE and LC-LC conditions. The latter consisted of the selection of a 5-microm GFF-II internal surface reversed-phase (ISRP, Pinkerton) analytical column (50 x 4.6 mm I.D.) as the first column in the RAM-C18 configuration, in combination with an optimised linear gradient elution including on-line cleanup of sample extracts and reconditioning of the columns. The method was validated with the analysis of freshly spiked samples and samples with aged residues (120 days). The four types of soils were spiked with the ten acidic pesticides at levels between 20 and 200 microg/kg. Weighted regression of the recovery data showed, for most analyte-matrix combinations including freshly spiked samples and aged residues, that the method provides overall recoveries between 60 and 90% with relative standard deviations of intra-laboratory reproducibility between 5 and 25%; LODs were obtained between 5 and 50 microg/kg. Evaluation of the data set with principal component analysis revealed that (i) an increase in the organic matter content of the soil samples and (ii) residue ageing negatively affect the recovery of the analytes.
A literature review of empirical research on learning analytics in medical education
Saqr, Mohammed
2018-01-01
The number of publications in the field of medical education is still markedly low, despite recognition of the value of the discipline in the medical education literature, and exponential growth of publications in other fields. This necessitates raising awareness of the research methods and potential benefits of learning analytics (LA). The aim of this paper was to offer a methodological systematic review of empirical LA research in the field of medical education and a general overview of the common methods used in the field in general. The search was done in the Medline database using the term “LA.” Inclusion criteria included empirical original research articles investigating LA using qualitative, quantitative, or mixed methodologies. Articles were also required to be written in English, published in a scholarly peer-reviewed journal and have a dedicated section for methods and results. The Medline search resulted in only six articles fulfilling the inclusion criteria for this review. Most of the studies collected data about learners from learning management systems or online learning resources. Analysis used mostly quantitative methods, including descriptive statistics, correlation tests, and regression models in two studies. Patterns of online behavior and usage of the digital resources, as well as prediction of achievement, were the outcomes most studies investigated. Research about LA in the field of medical education is still in its infancy, with more questions than answers. The early studies are encouraging and showed that patterns of online learning can be easily revealed and students' performance predicted. PMID:29599699
INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING
Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong
2017-01-01
Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
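A minimal version of the data-analytics step, ranking the parameters that most affect a performance measure before building simulation scenarios around the top-ranked ones, might look as follows; the data, parameter names, and the choice of a random forest are illustrative assumptions, not the paper's case study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Step 1: rank shop-floor parameters by their effect on a performance measure
# (e.g., throughput); step 2 would vary the top-ranked ones in simulation.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 5))                     # sensor-derived parameters
names = ["arrival_rate", "machine_speed", "buffer_size",
         "downtime_freq", "operator_count"]
y = 3*X[:, 0] + 2*X[:, 1] - 1.5*X[:, 3] + rng.normal(0, 0.1, 500)  # synthetic KPI

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranked = sorted(zip(names, model.feature_importances_), key=lambda kv: -kv[1])
top = [name for name, _ in ranked[:3]]
print("parameters to vary in simulation scenarios:", top)
```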
Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A
2018-05-01
In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction, for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile of carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing, and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.
Transport composite fuselage technology: Impact dynamics and acoustic transmission
NASA Technical Reports Server (NTRS)
Jackson, A. C.; Balena, F. J.; Labarge, W. L.; Pei, G.; Pitman, W. A.; Wittlin, G.
1986-01-01
A program was performed to develop and demonstrate the impact dynamics and acoustic transmission technology for a composite fuselage which meets the design requirements of a 1990 large transport aircraft without substantial weight and cost penalties. The program developed the analytical methodology for the prediction of acoustic transmission behavior of advanced composite stiffened shell structures. The methodology predicted that the interior noise level in a composite fuselage due to turbulent boundary layer will be less than in a comparable aluminum fuselage. The verification of these analyses will be performed by NASA Langley Research Center using a composite fuselage shell fabricated by filament winding. The program also developed analytical methodology for the prediction of the impact dynamics behavior of lower fuselage structure constructed with composite materials. Development tests were performed to demonstrate that the composite structure designed to the same operating load requirement can have at least the same energy absorption capability as aluminum structure.
NASA Astrophysics Data System (ADS)
Whitehead, James Joshua
The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
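The coupling of a quadratic response surface with Monte Carlo sampling described above can be condensed into a few lines. The surface coefficients, input distributions, and variable names below are invented for illustration and are not the dissertation's DOE results.

```python
import numpy as np

# Monte Carlo dispersion of a hybrid-fuel regression rate through a quadratic
# response surface with single and two-factor interaction terms:
# rdot = b0 + b1*Go + b2*x_ox + b12*Go*x_ox + b11*Go**2   (illustrative fit)
rng = np.random.default_rng(1)
b = dict(b0=0.8, b1=0.012, b2=0.6, b12=0.004, b11=-2e-5)

def rdot(Go, x_ox):
    return (b["b0"] + b["b1"]*Go + b["b2"]*x_ox
            + b["b12"]*Go*x_ox + b["b11"]*Go**2)

N = 100_000
Go = rng.normal(120.0, 6.0, N)       # oxidizer mass flux, kg/(m^2 s), assumed
x_ox = rng.normal(0.30, 0.02, N)     # secondary-oxidizer mixture fraction, assumed

r = rdot(Go, x_ox)
print(f"mean {r.mean():.3f} mm/s, 95% band "
      f"[{np.percentile(r, 2.5):.3f}, {np.percentile(r, 97.5):.3f}]")
```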
Test-Retest Reliability of Pediatric Heart Rate Variability: A Meta-Analysis.
Weiner, Oren M; McGrath, Jennifer J
2017-01-01
Heart rate variability (HRV), an established index of autonomic cardiovascular modulation, is associated with health outcomes (e.g., obesity, diabetes) and mortality risk. Time- and frequency-domain HRV measures are commonly reported in longitudinal adult and pediatric studies of health. While test-retest reliability has been established among adults, less is known about the psychometric properties of HRV among infants, children, and adolescents. The objective was to conduct a meta-analysis of the test-retest reliability of time- and frequency-domain HRV measures from infancy to adolescence. Electronic searches (PubMed, PsycINFO; January 1970-December 2014) identified studies with nonclinical samples aged ≤ 18 years; ≥ 2 baseline HRV recordings separated by ≥ 1 day; and sufficient data for effect size computation. Forty-nine studies (N = 5,170) met inclusion criteria. Methodological variables coded included factors relevant to study protocol, sample characteristics, electrocardiogram (ECG) signal acquisition and preprocessing, and HRV analytical decisions. Fisher's Z was derived as the common effect size. Analyses were age-stratified (infant/toddler < 5 years, n = 3,329; child/adolescent 5-18 years, n = 1,841) due to marked methodological differences across the pediatric literature. Meta-analytic results revealed HRV demonstrated moderate reliability; child/adolescent studies (Z = 0.62, r = 0.55) had significantly higher reliability than infant/toddler studies (Z = 0.42, r = 0.40). Relative to other reported measures, HF exhibited the highest reliability among infant/toddler studies (Z = 0.42, r = 0.40), while rMSSD exhibited the highest reliability among child/adolescent studies (Z = 1.00, r = 0.76). Moderator analyses indicated greater reliability with shorter test-retest interval length, reported exclusion criteria based on medical illness/condition, lower proportion of males, prerecording acclimatization period, and longer recording duration; differences were noted across age groups. HRV is reliable among pediatric samples. Reliability is sensitive to pertinent methodological decisions that require careful consideration by the researcher. Limited methodological reporting precluded several a priori moderator analyses. Suggestions for future research, including standards specified by Task Force Guidelines, are discussed. PMID:29307951
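The pooling arithmetic behind the reported effect sizes is standard: each correlation is converted to Fisher's Z, averaged with n - 3 weights, and back-transformed. A minimal sketch with invented study values follows.

```python
import numpy as np

# Pooling test-retest correlations via Fisher's Z (illustrative study values).
r = np.array([0.45, 0.60, 0.38, 0.72, 0.55])   # per-study test-retest r
n = np.array([40, 120, 35, 80, 60])            # per-study sample sizes

z = np.arctanh(r)                  # Fisher's Z transform
w = n - 3.0                        # inverse-variance weights, var(z) = 1/(n - 3)
z_bar = np.sum(w * z) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
lo, hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)
print(f"pooled r = {np.tanh(z_bar):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```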
Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn
2015-09-01
With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development of, and guidance for, the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meetings during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. A critical appraisal tool with 10 questions to help assess risk of bias in systematic reviews and meta-analyses was also developed and tested. Relevant details to extract from included reviews, and how best to present the findings of both quantitative and qualitative systematic reviews in a reader-friendly format, are provided. Umbrella reviews provide a ready means for decision makers in healthcare to gain a clear understanding of a broad topic area. The umbrella review methodology described here is the first to consider reviews that report other than quantitative evidence derived from randomized controlled trials. The methodology includes an easy-to-use and informative summary-of-evidence table to readily provide decision makers with the available, highest level of evidence relevant to the question posed.
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.
Evaluative methodology for prioritizing transportation energy conservation strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pang, L.M.G.
An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use, and favorable reception by decision makers.
Predictive simulation of guided-wave structural health monitoring
NASA Astrophysics Data System (ADS)
Giurgiutiu, Victor
2017-04-01
This paper presents an overview of recent developments on the predictive simulation of guided wave structural health monitoring (SHM) with piezoelectric wafer active sensor (PWAS) transducers. The predictive simulation methodology is based on the hybrid global-local (HGL) concept, which allows fast analytical simulation in the undamaged global field and finite element method (FEM) simulation in the local field around and including the damage. The paper reviews the main results obtained in this area by researchers of the Laboratory for Active Materials and Smart Structures (LAMSS) at the University of South Carolina, USA. After a thematic introduction and research motivation, the paper covers four main topics: (i) presentation of the HGL analysis; (ii) analytical simulation in 1D and 2D; (iii) scatter field generation; (iv) HGL examples. The paper ends with a summary, discussion, and suggestions for future work.
Advances in analytical technologies for environmental protection and public safety.
Sadik, O A; Wanekaya, A K; Andreescu, S
2004-06-01
Due to the increased threats of chemical and biological agents of injury by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in the prevention, early detection and the efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs for first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using the United States Environmental Protection Agency (US-EPA) methodologies.
Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth
2017-11-28
The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive) or (2) examined the utility of a supplementary search method (analytical) or (3) identified/explored factors that impact on the utility of a supplementary method, when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods. Further research is required, however, so that a rational choice can be made about which supplementary search strategies should be used, and when.
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-02-25
It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. The aforementioned will be introduced to the findings through workshops, seminars, round table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued such as key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal.
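The protocol describes a two-level, meta-meta-analytic random-effects model. As a simpler, hedged illustration of the random-effects machinery involved, here is a minimal single-level DerSimonian-Laird pooling sketch; the effect sizes and variances below are hypothetical, not drawn from the protocol:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Single-level random-effects pooling via the DerSimonian-Laird
    estimator of between-trial variance (tau^2)."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)              # fixed-effect pooled mean
    q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-trial variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se, tau2

# Hypothetical standardized effects from five trials in one meta-analysis.
mu, se, tau2 = dersimonian_laird([0.2, 0.35, 0.1, 0.5, 0.25],
                                 [0.02, 0.04, 0.03, 0.05, 0.02])
print(f"pooled = {mu:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.3f}")
```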
A Review of Meta-Analyses in Education: Methodological Strengths and Weaknesses
ERIC Educational Resources Information Center
Ahn, Soyeon; Ames, Allison J.; Myers, Nicholas D.
2012-01-01
The current review addresses the validity of published meta-analyses in education that determines the credibility and generalizability of study findings using a total of 56 meta-analyses published in education in the 2000s. Our objectives were to evaluate the current meta-analytic practices in education, identify methodological strengths and…
Designing Evaluations. 2012 Revision. Applied Research and Methods. GAO-12-208G
ERIC Educational Resources Information Center
US Government Accountability Office, 2012
2012-01-01
GAO assists congressional decision makers in their deliberations by furnishing them with analytical information on issues and options. Many diverse methodologies are needed to develop sound and timely answers to the questions the Congress asks. To provide GAO evaluators with basic information about the more commonly used methodologies, GAO's…
NASA Astrophysics Data System (ADS)
Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.
2018-04-01
Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, (c) Raman Imaging. Examples for NWA 6148.
Technological Leverage in Higher Education: An Evolving Pedagogy
ERIC Educational Resources Information Center
Pillai, K. Rajasekharan; Prakash, Ashish Viswanath
2017-01-01
Purpose: The purpose of the study is to analyse the perception of students toward a computer-based exam on a custom-made digital device and their willingness to adopt the same for high-stake summative assessment. Design/methodology/approach: This study followed an analytical methodology using survey design. A modified version of students'…
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.
Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri
2011-06-01
To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children age 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels were statistically significant in the test groups [25(OH)D <10, <15 and <20 ng/mL] compared with the reference group [25(OH)D >25 ng/mL]. ANCOVA also showed that only children with severe vitamin D deficiency [25(OH)D <10 ng/mL] had significantly higher parathyroid hormone levels (Δ = 15; P = .0334). Hockey stick model regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
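The "hockey stick" regression mentioned above fits a breakpoint beyond which the outcome no longer changes with 25(OH)D. A minimal sketch with SciPy; the simulated data, breakpoint parameterization, and starting values are illustrative assumptions, not the study's actual model:

```python
import numpy as np
from scipy.optimize import curve_fit

def hockey_stick(x, breakpoint, plateau, slope):
    """Linear below the breakpoint, flat at `plateau` above it
    (one common parameterization; not necessarily the study's)."""
    return plateau + slope * np.minimum(x - breakpoint, 0.0)

# Hypothetical 25(OH)D (ng/mL) vs. systolic BP data, illustration only:
# SBP rises as vitamin D falls below ~27 ng/mL and is flat above it.
rng = np.random.default_rng(0)
vit_d = rng.uniform(5, 45, 140)
sbp = hockey_stick(vit_d, 27.0, 112.0, -0.6) + rng.normal(0, 2.0, 140)

popt, _ = curve_fit(hockey_stick, vit_d, sbp, p0=(20.0, 110.0, -1.0))
print("estimated breakpoint: %.1f ng/mL" % popt[0])
```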
Sailaukhanuly, Yerbolat; Zhakupbekova, Arai; Amutova, Farida; Carlsen, Lars
2013-01-01
Knowledge of the environmental behavior of chemicals is a fundamental part of the risk assessment process. The present paper discusses various methods of ranking a series of persistent organic pollutants (POPs) according to their persistence, bioaccumulation and toxicity (PBT) characteristics. Traditionally, ranking has been done as an absolute (total) ranking applying various multicriteria data analysis methods, like simple additive ranking (SAR) or rankings based on various utility functions (UFs). An attractive alternative to these ranking methodologies appears to be partial order ranking (POR). The present paper compares different ranking methods like SAR, UF and POR. Significant discrepancies between the rankings are noted, and it is concluded that partial order ranking, as a method without any pre-assumptions concerning possible relations between the single parameters, appears to be the most attractive ranking methodology. In addition to the initial ranking, partial order methodology offers a wide variety of analytical tools to elucidate the interplay between the objects to be ranked and the ranking parameters. The present study also includes an analysis of the relative importance of the single P, B and T parameters. Copyright © 2012 Elsevier Ltd. All rights reserved.
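As a sketch of the core idea behind partial order ranking: one chemical is placed above another only if it is at least as hazardous on every PBT criterion, so no weighting or aggregation pre-assumptions are needed, and incomparable pairs simply remain unranked. The chemicals and scores below are hypothetical:

```python
# Hypothetical P, B, T scores (higher = more hazardous) for five POPs.
chems = {
    "chem_A": (0.9, 0.7, 0.8),
    "chem_B": (0.6, 0.8, 0.5),
    "chem_C": (0.4, 0.3, 0.2),
    "chem_D": (0.9, 0.9, 0.9),
    "chem_E": (0.2, 0.5, 0.6),
}

def ranked_above(a, b):
    """a is above b in the partial order if a >= b on every criterion
    and a > b on at least one (no weighting assumptions needed)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

# Enumerate the order relations; pairs related in neither direction are
# incomparable, which is what distinguishes POR from SAR/UF rankings.
for n1, s1 in chems.items():
    for n2, s2 in chems.items():
        if ranked_above(s1, s2):
            print(f"{n1} > {n2}")
```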
Spatio-Temporal Dimensions of Child Poverty in America, 1990-2010.
Call, Maia A; Voss, Paul R
2016-01-01
The persistence of childhood poverty in the United States, a wealthy and developed country, continues to pose both an analytical dilemma and a public policy challenge, despite many decades of research and remedial policy implementation. In this paper, our goals are twofold, though our primary focus is methodological. We attempt both to examine the relationship between space, time, and previously established factors correlated with childhood poverty at the county level in the continental United States, and to provide an empirical case study demonstrating an underutilized methodological approach. We analyze a spatially consistent dataset built from the 1990 and 2000 U.S. Censuses and the 2006-2010 American Community Survey. Our analytic approach includes cross-sectional spatial models to estimate the reproduction of poverty for each of the reference years as well as a fixed effects panel data model to analyze change in child poverty over time. In addition, we estimate a full space-time interaction model, which adjusts for spatial and temporal variation in these data. These models reinforce our understanding of the strong regional persistence of childhood poverty in the U.S. over time and suggest that the factors impacting childhood poverty remain much the same today as they have in past decades.
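A minimal sketch of the within (fixed effects) estimator on a county-year panel, assuming a balanced panel and a single covariate; the data are simulated and all variable names are hypothetical, so this only illustrates the class of model named above:

```python
import numpy as np
import pandas as pd

# Hypothetical county-year panel; 'poverty' stands in for the child
# poverty rate and 'x' for a single covariate (real models use many).
df = pd.DataFrame({
    "county": np.repeat(np.arange(100), 3),
    "year":   np.tile([1990, 2000, 2010], 100),
    "x":      np.random.default_rng(1).normal(size=300),
})
df["poverty"] = 0.5 * df["x"] + np.random.default_rng(2).normal(size=300)

# Two-way fixed effects via the within transformation: demean by county
# and by year (adding back the grand mean), then OLS on demeaned data.
for col in ("poverty", "x"):
    df[col + "_dm"] = (df[col]
                       - df.groupby("county")[col].transform("mean")
                       - df.groupby("year")[col].transform("mean")
                       + df[col].mean())
beta = np.sum(df["poverty_dm"] * df["x_dm"]) / np.sum(df["x_dm"] ** 2)
print(f"within estimate of beta: {beta:.3f}")
```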
Quantitative mass spectrometry of unconventional human biological matrices
NASA Astrophysics Data System (ADS)
Dutkiewicz, Ewelina P.; Urban, Pawel L.
2016-10-01
The development of sensitive and versatile mass spectrometric methodology has fuelled interest in the analysis of metabolites and drugs in unconventional biological specimens. Here, we discuss the analysis of eight human matrices (hair, nail, breath, saliva, tears, meibum, nasal mucus and skin excretions, including sweat) by mass spectrometry (MS). The use of such specimens brings a number of advantages, the most important being non-invasive sampling, the limited risk of adulteration and the ability to obtain information that complements blood and urine tests. The most often studied matrices are hair, breath and saliva. This review primarily focuses on endogenous (e.g. potential biomarkers, hormones) and exogenous (e.g. drugs, environmental contaminants) small molecules. The majority of analytical methods used chromatographic separation prior to MS; however, such a hyphenated methodology greatly limits analytical throughput. On the other hand, the mass spectrometric methods that exclude chromatographic separation are fast but suffer from matrix interferences. To enable development of quantitative assays for unconventional matrices, it is desirable to standardize the protocols for the analysis of each specimen and create appropriate certified reference materials. Overcoming these challenges will make analysis of unconventional human biological matrices more common in a clinical setting. This article is part of the themed issue 'Quantitative mass spectrometry'.
Yang, Heejung; Kim, Hyun Woo; Kwon, Yong Soo; Kim, Ho Kyong; Sung, Sang Hyun
2017-09-01
Anthocyanins are potent antioxidant agents that protect against many degenerative diseases; however, they are unstable because they are vulnerable to external stimuli including temperature, pH and light. This vulnerability hinders the quality control of anthocyanin-containing berries using classical high-performance liquid chromatography (HPLC) analytical methodologies based on UV or MS chromatograms. To develop an alternative approach for the quality assessment and discrimination of anthocyanin-containing berries, we used MS spectral data acquired in a short analytical time rather than UV or MS chromatograms. Mixtures of anthocyanins were separated from other components in a short gradient time (5 min) due to their higher polarity, and the representative MS spectrum was acquired from the MS chromatogram corresponding to the mixture of anthocyanins. The chemometric data from the representative MS spectra contained reliable information for the identification and relative quantification of anthocyanins in berries with good precision and accuracy. This fast and simple methodology, which consists of a simple sample preparation method and short gradient analysis, could be applied to reliably discriminate the species and geographical origins of different anthocyanin-containing berries. These features make the technique useful for the food industry. Copyright © 2017 John Wiley & Sons, Ltd.
Shuttle TPS thermal performance and analysis methodology
NASA Technical Reports Server (NTRS)
Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.
1983-01-01
Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.
Prioritizing sewer rehabilitation projects using AHP-PROMETHEE II ranking method.
Kessili, Abdelhak; Benmamar, Saadia
2016-01-01
The aim of this paper is to develop a methodology for the prioritization of sewer rehabilitation projects for Algiers (Algeria) sewer networks to support the National Sanitation Office in its challenge to make decisions on prioritization of sewer rehabilitation projects. The methodology applies multiple-criteria decision making. The study includes 47 projects (collectors) and 12 criteria to evaluate them. These criteria represent the different issues considered in the prioritization of the projects, which are structural, hydraulic, environmental, financial, social and technical. The analytic hierarchy process (AHP) is used to determine weights of the criteria and the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE II) method is used to obtain the final ranking of the projects. The model was verified using the sewer data of Algiers. The results have shown that the method can be used for prioritizing sewer rehabilitation projects.
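A compact sketch of the AHP-PROMETHEE II pipeline named above: AHP weights come from the principal eigenvector of a pairwise comparison matrix, and PROMETHEE II ranks alternatives by net outranking flow. The matrices below are hypothetical (the study itself used 12 criteria and 47 collectors), and the usual (0/1) preference function is an illustrative choice:

```python
import numpy as np

# --- AHP: criteria weights from a pairwise comparison matrix ----------
# Hypothetical 3-criteria comparison matrix on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                       # principal-eigenvector weights

# --- PROMETHEE II: net outranking flows --------------------------------
# Hypothetical scores of 4 projects on the 3 criteria (higher = better).
X = np.array([[7.0, 4.0, 6.0],
              [5.0, 8.0, 5.0],
              [6.0, 6.0, 7.0],
              [4.0, 5.0, 8.0]])
n = len(X)
phi = np.zeros(n)
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        # Usual (strict) preference function: 1 if better, else 0.
        pref_ij = w @ (X[i] > X[j]).astype(float)
        pref_ji = w @ (X[j] > X[i]).astype(float)
        phi[i] += (pref_ij - pref_ji) / (n - 1)
print("net flows:", np.round(phi, 3),
      "-> projects ranked best-first:", np.argsort(-phi) + 1)
```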
Integrated active and passive control design methodology for the LaRC CSI evolutionary model
NASA Technical Reports Server (NTRS)
Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.
1994-01-01
A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of-Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed loop bandwidth of 4 Hz, including the six rigid body modes and the first three dominant elastic modes of the CEM, was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the 1-10 Hz frequency range where the open loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 percent to 50 percent in the LOS RMS outputs were achieved compared to an active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
ERIC Educational Resources Information Center
Cheung, Mike W. L.; Chan, Wai
2009-01-01
Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…
Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching
ERIC Educational Resources Information Center
Svensson, Lennart; Doumas, Kyriaki
2013-01-01
The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…
ERIC Educational Resources Information Center
Wilczek-Vera, Grazyna; Salin, Eric Dunbar
2011-01-01
An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for a student in the process of visualization and discovery of concepts.…
About Skinner and Time: Behavior-Analytic Contributions to Research on Animal Timing
ERIC Educational Resources Information Center
Lejeune, Helga; Richelle, Marc; Wearden, J. H.
2006-01-01
The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in "The Behavior of Organisms," through the rate…
Capillary gas chromatography with pulsed flame photometric detection (GC/PFPD) was used in the development of analytical methodology for determining both non-pesticidal and pesticidal organotin compounds in drinking water and other aqueous matrices. The method involves aqueous ethylation of organotin analytes with ...
Analytical Study on Thermal and Mechanical Design of Printed Circuit Heat Exchanger
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Su-Jong; Sabharwall, Piyush; Kim, Eung-Soo
2013-09-01
The analytical methodologies for the thermal design, mechanical design, and cost estimation of printed circuit heat exchangers are presented in this study. Three flow arrangements are taken into account: parallel flow, countercurrent flow, and crossflow. For each flow arrangement, the analytical solution for the temperature profile of the heat exchanger is introduced. The size and cost of printed circuit heat exchangers for advanced small modular reactors, which employ various coolants such as sodium, molten salts, helium, and water, are also presented.
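The paper's closed-form temperature profiles are not reproduced here, but the standard effectiveness-NTU relation for the countercurrent arrangement illustrates the kind of analytical sizing calculation involved; all numbers below are hypothetical:

```python
import numpy as np

def counterflow_effectiveness(ntu, c_r):
    """Effectiveness of a counterflow heat exchanger (standard
    epsilon-NTU relation; c_r = C_min / C_max)."""
    if abs(c_r - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)
    return ((1 - np.exp(-ntu * (1 - c_r)))
            / (1 - c_r * np.exp(-ntu * (1 - c_r))))

# Hypothetical sizing numbers for illustration only.
UA, C_hot, C_cold = 500.0, 120.0, 100.0       # W/K
c_min, c_max = min(C_hot, C_cold), max(C_hot, C_cold)
eps = counterflow_effectiveness(UA / c_min, c_min / c_max)
T_hot_in, T_cold_in = 550.0, 300.0            # K
q = eps * c_min * (T_hot_in - T_cold_in)      # heat duty, W
print(f"effectiveness = {eps:.3f}, duty = {q / 1e3:.2f} kW")
```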
Benchmark Tests for Stirling Convertor Heater Head Life Assessment Conducted
NASA Technical Reports Server (NTRS)
Krause, David L.; Halford, Gary R.; Bowman, Randy R.
2004-01-01
A new in-house test capability has been developed at the NASA Glenn Research Center, where a critical component of the Stirling Radioisotope Generator (SRG) is undergoing extensive testing to aid the development of analytical life prediction methodology and to experimentally verify the flight-design component's life. The new facility includes two test rigs that are performing creep testing of the SRG heater head pressure vessel test articles at design temperature and with wall stresses ranging from the operating level to seven times that level.
Advanced rotorcraft technology: Task force report
NASA Technical Reports Server (NTRS)
1978-01-01
The technological needs and opportunities related to future civil and military rotorcraft were determined and a program plan for NASA research which was responsive to the needs and opportunities was prepared. In general, the program plan places the primary emphasis on design methodology where the development and verification of analytical methods is built upon a sound data base. The four advanced rotorcraft technology elements identified are aerodynamics and structures, flight control and avionic systems, propulsion, and vehicle configurations. Estimates of the total funding levels that would be required to support the proposed program plan are included.
A review of gear housing dynamics and acoustics literature
NASA Technical Reports Server (NTRS)
Lim, Teik Chin; Singh, Rajendra
1989-01-01
A review of the available literature on gear housing vibration and noise radiation is presented. Analytical and experimental methodologies used for bearing dynamics, housing vibration and noise, mounts and suspensions, and the overall gear and housing system are discussed. Typical design guidelines, as outlined by various investigators, are also included. Results of this review indicate that although many attempts were made to characterize the dynamics of gearbox system components, no comprehensive set of design criteria currently exist. Moreover, the literature contains conflicting reports concerning relevant design guidelines.
Architecture for Business Intelligence in the Healthcare Sector
NASA Astrophysics Data System (ADS)
Lee, Sang Young
2018-03-01
Healthcare environments are growing to include not only traditional information systems, but also business intelligence platforms. For executive leaders, consultants, and analysts, there is no longer a need to spend hours designing and developing typical reports or charts; the entire solution can be completed using Business Intelligence software. The current paper highlights the advantages of big data analytics and business intelligence in the healthcare industry. In this paper we focus our discussion around intelligent techniques and methodologies which are recently used for business intelligence in healthcare.
NASA Astrophysics Data System (ADS)
Brandstetter, Gerd; Govindjee, Sanjay
2012-03-01
Existing analytical and numerical methodologies are discussed and then extended in order to calculate critical contamination-particle sizes, which will result in deleterious effects during EUVL E-chucking in the face of an error budget on the image-placement error (IPE). The enhanced analytical models include a gap-dependent clamping pressure formulation, the consideration of a general material law for realistic particle crushing, and the influence of frictional contact. We present a discussion of the defects of the classical de-coupled modeling approach, where particle crushing and mask/chuck indentation are separated from the global computation of mask bending. To repair this defect we present a new analytic approach based on an exact Hankel transform method which allows a fully coupled solution. This will capture the contribution of the mask indentation to the image-placement error (estimated IPE increase of 20%). A fully coupled finite element model is used to validate the analytical models and to further investigate the impact of a mask back-side CrN layer. The models are applied to existing experimental data with good agreement. For a standard material combination, a given IPE tolerance of 1 nm and a 15 kPa closing pressure, we derive bounds for single particles of cylindrical shape (radius × height < 44 μm) and spherical shape (diameter < 12 μm).
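A brief numerical sketch of the order-0 Hankel transform underlying such axisymmetric solutions, validated against a known Gaussian transform pair; the quadrature scheme and truncation are illustrative choices, not the paper's exact method:

```python
import numpy as np
from scipy.special import j0

def trapezoid(y, x):
    """Trapezoidal rule, kept explicit for NumPy-version independence."""
    return float(np.dot(0.5 * (y[1:] + y[:-1]), np.diff(x)))

def hankel0(f, k, r_max=20.0, n=4000):
    """Order-0 Hankel transform H(k) = int_0^inf f(r) J0(k r) r dr,
    evaluated by simple quadrature on a truncated domain."""
    r = np.linspace(0.0, r_max, n)
    return trapezoid(f(r) * j0(k * r) * r, r)

# Validation: f(r) = exp(-a r^2)  <->  H(k) = exp(-k^2 / 4a) / (2a).
a = 0.5
for k in (0.0, 1.0, 2.0):
    numeric = hankel0(lambda r: np.exp(-a * r * r), k)
    exact = np.exp(-k * k / (4 * a)) / (2 * a)
    print(f"k = {k}: numeric = {numeric:.6f}, exact = {exact:.6f}")
```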
Sugden, Nicole A; Marquis, Alexandra R
2017-11-01
Infants show facility for discriminating between individual faces within hours of birth. Over the first year of life, infants' face discrimination shows continued improvement with familiar face types, such as own-race faces, but not with unfamiliar face types, like other-race faces. The goal of this meta-analytic review is to provide an effect size for infants' face discrimination ability overall, with own-race faces, and with other-race faces within the first year of life, how this differs with age, and how it is influenced by task methodology. Inclusion criteria were (a) infant participants aged 0 to 12 months, (b) completing a human own- or other-race face discrimination task, (c) with discrimination being determined by infant looking. Our analysis included 30 works (165 samples; 1,926 participants completed 2,623 tasks). The effect size for infants' face discrimination was small, 6.53% greater than chance (i.e., equal looking to the novel and familiar). There was a significant difference in discrimination by race, overall (own-race, 8.18%; other-race, 3.18%) and between ages (own-race: 0- to 4.5-month-olds, 7.32%; 5- to 7.5-month-olds, 9.17%; and 8- to 12-month-olds, 7.68%; other-race: 0- to 4.5-month-olds, 6.12%; 5- to 7.5-month-olds, 3.70%; and 8- to 12-month-olds, 2.79%). Multilevel linear (mixed-effects) models were used to predict face discrimination; infants' capacity to discriminate faces is sensitive to face characteristics including race, gender, and emotion as well as the methods used, including task timing, coding method, and visual angle. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Connor, Thomas H; Smith, Jerome P
2016-09-01
At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition it will provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.
Methods for Evaluating Natural Experiments in Obesity: A Systematic Review.
Bennett, Wendy L; Wilson, Renee F; Zhang, Allen; Tseng, Eva; Knapp, Emily A; Kharrazi, Hadi; Stuart, Elizabeth A; Shogbesan, Oluwaseun; Bass, Eric B; Cheskin, Lawrence J
2018-06-05
Given the obesity pandemic, rigorous methodological approaches, including natural experiments, are needed. To identify studies that report effects of programs, policies, or built environment changes on obesity prevention and control and to describe their methods. PubMed, CINAHL, PsycINFO, and EconLit (January 2000 to August 2017). Natural experiments and experimental studies evaluating a program, policy, or built environment change in U.S. or non-U.S. populations by using measures of obesity or obesity-related health behaviors. 2 reviewers serially extracted data on study design, population characteristics, data sources and linkages, measures, and analytic methods and independently evaluated risk of bias. 294 studies (188 U.S., 106 non-U.S.) were identified, including 156 natural experiments (53%), 118 experimental studies (40%), and 20 (7%) with unclear study design. Studies used 106 (71 U.S., 35 non-U.S.) data systems; 37% of the U.S. data systems were linked to another data source. For outcomes, 112 studies reported childhood weight and 32 adult weight; 152 had physical activity and 148 had dietary measures. For analysis, natural experiments most commonly used cross-sectional comparisons of exposed and unexposed groups (n = 55 [35%]). Most natural experiments had a high risk of bias, and 63% had weak handling of withdrawals and dropouts. Outcomes restricted to obesity measures and health behaviors; inconsistent or unclear descriptions of natural experiment designs; and imperfect methods for assessing risk of bias in natural experiments. Many methodologically diverse natural experiments and experimental studies were identified that reported effects of U.S. and non-U.S. programs, policies, or built environment changes on obesity prevention and control. The findings reinforce the need for methodological and analytic advances that would strengthen evaluations of obesity prevention and control initiatives. National Institutes of Health, Office of Disease Prevention, and Agency for Healthcare Research and Quality. (PROSPERO: CRD42017055750).
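As a hedged illustration of one common natural-experiment analysis (the review found cross-sectional exposed/unexposed comparisons to be the most frequent design), here is a minimal difference-in-differences sketch on hypothetical pre/post BMI means; none of these numbers come from the review:

```python
import pandas as pd

# Hypothetical pre/post mean BMI for an exposed community (new policy)
# and an unexposed comparison community.
d = pd.DataFrame({
    "group":  ["exposed", "exposed", "control", "control"],
    "period": ["pre", "post", "pre", "post"],
    "bmi":    [27.1, 26.6, 27.0, 27.2],
})
m = d.pivot(index="group", columns="period", values="bmi")

# DiD = (exposed post - exposed pre) - (control post - control pre),
# which nets out shared secular trends under the parallel-trends assumption.
did = ((m.loc["exposed", "post"] - m.loc["exposed", "pre"])
       - (m.loc["control", "post"] - m.loc["control", "pre"]))
print(f"difference-in-differences estimate: {did:+.2f} BMI units")
```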
Moreira, Otacilio C; Yadon, Zaida E; Cupolillo, Elisa
2017-09-29
Cutaneous leishmaniasis (CL) is spread worldwide and is the most common manifestation of leishmaniasis. Diagnosis is performed by combining clinical and epidemiological features, and through the detection of Leishmania parasites (or DNA) in tissue specimens or through parasite isolation in culture medium. Diagnosis of CL is challenging, reflecting the pleomorphic clinical manifestations of this disease. Skin lesions vary in severity, clinical appearance, and duration, and in some cases, they can be indistinguishable from lesions related to other diseases. Over the past few decades, PCR-based methods, including real-time PCR assays, have been developed for Leishmania detection, quantification and species identification, improving the molecular diagnosis of CL. This review provides an overview of many real-time PCR methods reported for the diagnostic evaluation of CL and some recommendations for the application of these methods for quantification purposes for clinical management and epidemiological studies. Furthermore, the use of real-time PCR for Leishmania species identification is also presented. The advantages of real-time PCR protocols are numerous, including increased sensitivity and specificity and simpler standardization of diagnostic procedures. However, despite the numerous assays described, there is still no consensus regarding the methods employed. Furthermore, the analytical and clinical validation of CL molecular diagnosis has not followed international guidelines so far. A consensus methodology comprising a DNA extraction protocol with an exogenous quality control and an internal reference to normalize parasite load is still needed. In addition, the analytical and clinical performance of any consensus methodology must be accurately assessed. This review shows that a standardization initiative is essential to guide researchers and clinical laboratories towards the achievement of a robust and reproducible methodology, which will permit further evaluation of parasite load as a surrogate marker of prognosis and monitoring of aetiological treatment, particularly in multi-centric observational studies and clinical trials. Copyright © 2017 Elsevier B.V. All rights reserved.
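As a sketch of the parasite-load normalization the authors call for, a delta-Ct style calculation against an internal reference gene; the Ct values and amplification efficiencies below are hypothetical, not part of any published consensus:

```python
def normalized_load(ct_target, ct_reference,
                    eff_target=2.0, eff_reference=2.0):
    """Relative parasite load = E_t^-Ct_t / E_r^-Ct_r (delta-Ct style),
    where E is the per-cycle amplification efficiency (2.0 = perfect
    doubling) and Ct the quantification cycle."""
    return (eff_target ** -ct_target) / (eff_reference ** -ct_reference)

# Hypothetical baseline lesion sample vs. post-treatment follow-up.
baseline = normalized_load(ct_target=24.0, ct_reference=20.0)
followup = normalized_load(ct_target=28.0, ct_reference=20.5)
print(f"fold change after treatment: {followup / baseline:.3f}")
```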
Robles-Molina, José; Gilbert-López, Bienvenida; García-Reyes, Juan F; Molina-Díaz, Antonio
2013-12-15
The European Water Framework Directive (WFD) 2000/60/EC establishes guidelines to control the pollution of surface water by sorting out a list of priority substances that involve a significant risk to or via the aquatic systems. In this article, the analytical performance of three different sample preparation methodologies for the GC-MS/MS determination of multiclass organic contaminants (including priority compounds from the WFD) in wastewater samples was evaluated. The methodologies tested were: (a) liquid-liquid extraction (LLE) with n-hexane; (b) solid-phase extraction (SPE) with C18 cartridges and elution with ethyl acetate:dichloromethane (1:1 (v/v)); and (c) headspace solid-phase microextraction (HS-SPME) using two different fibers: polyacrylate and polydimethylsiloxane/carboxen/divinylbenzene. Identification and confirmation of the 57 compounds included in the study (comprising polycyclic aromatic hydrocarbons (PAHs), pesticides and other contaminants) were accomplished using gas chromatography tandem mass spectrometry (GC-MS/MS) with a triple quadrupole instrument operated in the multiple reaction monitoring (MRM) mode. Three MS/MS transitions were selected for unambiguous confirmation of the target chemicals. The advantages and pitfalls of each method are discussed. In the case of both the LLE and SPE procedures, the method was validated at two different concentration levels (15 and 150 ng L(-1)), obtaining recovery rates in the range 70-120% for most of the target compounds. In terms of analyte coverage, results with HS-SPME were not satisfactory, since 14 of the compounds tested were not properly recovered and the overall performance was worse than that of the other two methods tested. The LLE, SPE and HS-SPME (using the polyacrylate fiber) procedures also showed good linearity and precision. Using any of the three methodologies tested, the limits of quantitation obtained for most of the detected compounds were in the low nanogram per liter range. © 2013 Elsevier B.V. All rights reserved.
Analytical solutions for efficient interpretation of single-well push-pull tracer tests
NASA Astrophysics Data System (ADS)
Huang, Junqi; Christ, John A.; Goltz, Mark N.
2010-08-01
Single-well push-pull tracer tests have been used to characterize the extent, fate, and transport of subsurface contamination. Analytical solutions provide one alternative for interpreting test results. In this work, an exact analytical solution to two-dimensional equations describing the governing processes acting on a dissolved compound during a modified push-pull test (advection, longitudinal and transverse dispersion, first-order decay, and rate-limited sorption/partitioning in steady, divergent, and convergent flow fields) is developed. The coupling of this solution with inverse modeling to estimate aquifer parameters provides an efficient methodology for subsurface characterization. Synthetic data for single-well push-pull tests are employed to demonstrate the utility of the solution for determining (1) estimates of aquifer longitudinal and transverse dispersivities, (2) sorption distribution coefficients and rate constants, and (3) non-aqueous phase liquid (NAPL) saturations. Employment of the solution to estimate NAPL saturations based on partitioning and non-partitioning tracers is designed to overcome limitations of previous efforts by including rate-limited mass transfer. This solution provides a new tool for use by practitioners when interpreting single-well push-pull test results.
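The paper's exact 2D solution is not reproduced here, but the inverse-modeling workflow it enables can be sketched by fitting a simplified 1D advection-dispersion pulse to synthetic breakthrough data; the 1D stand-in, parameter values, and noise model are all assumptions for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def ade_pulse(t, v, D, x=2.0, m=1.0):
    """1D advection-dispersion pulse (a simplified stand-in for the
    paper's 2D divergent/convergent-flow solution); observation point x
    and injected mass m are held fixed."""
    return (m / np.sqrt(4 * np.pi * D * t)
            * np.exp(-(x - v * t) ** 2 / (4 * D * t)))

# Hypothetical breakthrough data from the extraction phase of a test.
t_obs = np.linspace(0.1, 10, 60)
noise = 1 + 0.05 * np.random.default_rng(3).normal(size=60)
c_obs = ade_pulse(t_obs, v=0.5, D=0.08) * noise

# Inverse modeling: estimate velocity and dispersion coefficient.
(v_fit, D_fit), _ = curve_fit(ade_pulse, t_obs, c_obs, p0=(0.3, 0.05))
print(f"v = {v_fit:.3f} m/d, D = {D_fit:.3f} m^2/d, "
      f"dispersivity = {D_fit / v_fit:.3f} m")
```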
Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam
2017-01-18
While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of the sensor as well as its relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo relatively nominal shifts in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in cellular media due to strong cross-talk between energetically separated detection channels.
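A minimal sketch of the two spectral parameters the method relies on, computed per region from simulated spectra; the Gaussian band shapes and the Hg(2+)-response trend are illustrative assumptions, not the published calibration:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal rule, kept explicit for NumPy-version independence."""
    return float(np.dot(0.5 * (y[1:] + y[:-1]), np.diff(x)))

def spectral_features(wavelengths, intensities):
    """Integrated emission intensity and intensity-weighted centroid
    (a simple proxy for the spectral shift) for one subcellular region."""
    total = trapezoid(intensities, wavelengths)
    centroid = trapezoid(wavelengths * intensities, wavelengths) / total
    return total, centroid

# Hypothetical per-region spectra: Gaussian bands that blue-shift and
# brighten with increasing Hg(2+), mimicking the described BODIPY response.
wl = np.linspace(480, 640, 200)
rng = np.random.default_rng(4)
for shift, gain in [(0.0, 1.0), (-3.0, 1.2), (-6.0, 1.6), (-12.0, 2.3)]:
    spec = gain * np.exp(-((wl - (560.0 + shift)) / 18.0) ** 2)
    spec += rng.normal(0.0, 0.01, wl.size)
    total, centroid = spectral_features(wl, spec)
    print(f"integrated intensity = {total:7.2f}, centroid = {centroid:6.1f} nm")
# Plotting centroid vs. integrated intensity for many regions yields the
# 2D scatter plot described in the abstract.
```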
Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D
2012-12-01
The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gain insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated, and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
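A toy sketch of the threshold-alerting idea described above: track a rolling statistic of a sensor health metric and raise an early warning when its extrapolated drift is projected to cross a critical limit. The data, window lengths, and thresholds are all hypothetical:

```python
import numpy as np
import pandas as pd

# Hypothetical daily sensor calibration residuals with a slow upward drift.
rng = np.random.default_rng(5)
days = pd.date_range("2016-01-01", periods=120, freq="D")
s = pd.Series(rng.normal(0.0, 1.0, 120) + np.linspace(0.0, 4.0, 120),
              index=days)

CRITICAL = 3.0                             # hypothetical mission limit
roll = s.rolling(14).mean()                # rolling health baseline
slope = roll.diff().rolling(14).mean()     # smoothed drift rate per day
days_to_limit = (CRITICAL - roll) / slope  # linear extrapolation
alerts = days_to_limit[(slope > 0)
                       & (days_to_limit > 0) & (days_to_limit < 30)]
if len(alerts):
    print(f"first early warning on {alerts.index[0]:%Y-%m-%d} "
          f"({alerts.iloc[0]:.0f} days to threshold)")
else:
    print("no degradation predicted")
```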
Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L
2018-04-01
Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients and performing a quality assessment of their methodological characteristics is expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regards to their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely on the consideration and discussion of any competing theories regarding model structure and disease progression, identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development for this method to become useful. Nonetheless, the authors propose that methods based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be pursued. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
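For background, one common statement of the FW-H equation from the standard aeroacoustics literature (not reproduced from this paper) is

$$\left(\frac{1}{c_0^{2}}\frac{\partial^{2}}{\partial t^{2}}-\nabla^{2}\right)p'(\mathbf{x},t)=\frac{\partial}{\partial t}\big[\rho_{0}v_{n}\,\delta(f)\big]-\frac{\partial}{\partial x_{i}}\big[\ell_{i}\,\delta(f)\big]+\frac{\partial^{2}}{\partial x_{i}\partial x_{j}}\big[T_{ij}\,H(f)\big],$$

where f = 0 defines the (possibly penetrable) data surface, v_n is the surface normal velocity, ℓ_i the local loading, T_ij the Lighthill stress tensor, and H and δ the Heaviside and Dirac functions.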
Analytical Round Robin for Elastic-Plastic Analysis of Surface Cracked Plates: Phase I Results
NASA Technical Reports Server (NTRS)
Wells, D. N.; Allen, P. A.
2012-01-01
An analytical round robin for the elastic-plastic analysis of surface cracks in flat plates was conducted with 15 participants. Experimental results from a surface crack tension test in 2219-T8 aluminum plate provided the basis for the inter-laboratory study (ILS). The study proceeded in a blind fashion given that the analysis methodology was not specified to the participants, and key experimental results were withheld. This approach allowed the ILS to serve as a current measure of the state of the art for elastic-plastic fracture mechanics analysis. The analytical results and the associated methodologies were collected for comparison, and sources of variability were studied and isolated. The results of the study revealed that the J-integral analysis methodology using the domain integral method is robust, providing reliable J-integral values without being overly sensitive to modeling details. General modeling choices such as analysis code, model size (mesh density), crack tip meshing, or boundary conditions were not found to be sources of significant variability. For analyses controlled only by far-field boundary conditions, the greatest source of variability in the J-integral assessment is introduced through the constitutive model. This variability can be substantially reduced by using crack mouth opening displacements to anchor the assessment. Conclusions provide recommendations for analysis standardization.
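For background, the J-integral referenced above is classically defined (Rice, 1968) as the contour integral

$$J=\int_{\Gamma}\left(W\,\mathrm{d}y-T_{i}\,\frac{\partial u_{i}}{\partial x}\,\mathrm{d}s\right),$$

where Γ is a path around the crack tip, W the strain energy density, T_i the traction vector, and u_i the displacement; the domain integral method evaluates an equivalent area (or volume) form of this quantity, which is far better conditioned for finite element computation.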
DOT National Transportation Integrated Search
2010-10-01
The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...
ERIC Educational Resources Information Center
Palma, Lisiane Celia; Pedrozo, Eugênio Ávila
2015-01-01
Several papers propose analytical methods relating to the inclusion of sustainability in courses and universities. However, as sustainability is a complex subject, methodological proposals on the topic must avoid making disjointed analyses which focus exclusively on curricula or on organisational strategy, as often seen in the literature.…
ERIC Educational Resources Information Center
Barrett, Paula M.; Cooper, Marita; Stallard, Paul; Zeggio, Larissa; Gallegos-Guajardo, Julia
2017-01-01
This response aims to critically evaluate the methodology and aims of the meta-analytic review written by Maggin and Johnson (2014). The present authors systematically provide responses for each of the original criticisms and highlight concerns regarding Maggin and Johnson's methodology, while objectively describing the current state of evidence…
ERIC Educational Resources Information Center
Kelly, Kathleen; Lee, Seung Hwan; Bowen Ray, Heather; Kandaurova, Maria
2018-01-01
Barriers to cross-cultural instruction challenge even experienced educators and their students. To increase cross-cultural competence and bridge learning gaps, professors in two countries adapted the Photovoice methodology to develop shared visual vocabularies with students and unearth hidden assumptions. Results from an anonymous evaluation…
ERIC Educational Resources Information Center
Geisler, Cheryl
2018-01-01
Coding, the analytic task of assigning codes to nonnumeric data, is foundational to writing research. A rich discussion of methodological pluralism has established the foundational importance of systematicity in the task of coding, but less attention has been paid to the equally important commitment to language complexity. Addressing the interplay…
Muskett, Tom; Body, Richard
2013-01-01
Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.
Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan
2016-10-01
The aim of this study is to guide healthcare instances in applying process analytics on healthcare processes. Process analytics techniques can offer new insights in patient pathways, workflow processes, adherence to medical guidelines and compliance with clinical pathways, but also bring along specific challenges which will be examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied on a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied and clarify the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we take a look at the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient and hospital specific which leads to unique traces and unstructured processes; data is not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics. It is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.
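As an illustration of the pipeline above (log preparation through process mining and validation), the sketch below uses the open-source pm4py library; the library choice and the event-log file name are assumptions, since the paper does not name a specific tool.

```python
# Minimal sketch of the log preparation -> process mining -> validation steps.
# 'diabetes_care.xes' is a hypothetical event log of patient care activities.
import pm4py

log = pm4py.read_xes("diabetes_care.xes")                       # log preparation/inspection
net, initial, final = pm4py.discover_petri_net_inductive(log)   # process mining
pm4py.view_petri_net(net, initial, final)                       # visual validation
```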
Modeling energy/economy interactions for conservation and renewable energy-policy analysis
NASA Astrophysics Data System (ADS)
Groncki, P. J.
Energy policy and the implications for policy analysis and the methodological tools are discussed. The evolution of one methodological approach and the combined modeling system of the component models, their evolution in response to changing analytic needs, and the development of the integrated framework are reported. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.
The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation involves assessing not only the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user's infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005], it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult because one aspect of the test methodology is access to representative end-users to participate in the evaluation. In many cases the sensitive nature of data and tasks and difficult access to busy analysts put even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to go to work on the type of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed.
Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology which would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.
1991-09-01
Table of contents excerpt: III. The Analytic Hierarchy Process (A. Introduction; B. The AHP Process); Implementation of CERTS using AHP (1. Consistency; 2. User Interface). The thesis incorporates the proposed technique into a Decision Support System: Expert Choice implements the Analytic Hierarchy Process (AHP), an approach to multi-criteria decision making.
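For background on the AHP computation implemented by tools such as Expert Choice, the sketch below derives priority weights as the principal eigenvector of a pairwise comparison matrix and computes Saaty's consistency ratio; the 3x3 judgment matrix is invented for illustration.

```python
# Minimal sketch of the AHP weighting step: priority weights from the
# principal eigenvector of a pairwise comparison matrix, plus a consistency
# check. The judgment matrix is an illustrative assumption.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])        # pairwise judgments (Saaty 1-9 scale)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
cr = ci / 0.58                          # Saaty's random index for n=3 is 0.58
print(w, f"CR={cr:.3f}")                # CR < 0.1 is conventionally acceptable
```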
Probabilistic assessment methodology for continuous-type petroleum accumulations
Crovelli, R.A.
2003-01-01
The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. © 2003 Elsevier B.V. All rights reserved.
Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.
Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís
2016-01-08
In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Besides the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data scientific fields together with their data type-specific and generic challenges. Firstly, common data science terms used in different data scientific fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Arena, Matteo P.; Porter, Marc D.; Fritz, James S.
2002-01-01
A new, rapid methodology for trace analysis using solid-phase extraction is described. The two-step methodology is based on the concentration of an analyte onto a membrane disk and on the determination by diffuse reflectance spectroscopy of the amount of analyte extracted on the disk surface. This method, which is adaptable to a wide range of analytes, has been used for monitoring ppm levels of iodine and iodide in spacecraft water; iodine is used as a biocide in spacecraft water. For these determinations, a water sample is passed through a membrane disk by means of a 10-mL syringe that is attached to a disk holder assembly. The disk, which is a polystyrene-divinylbenzene composite, is impregnated with poly(vinylpyrrolidone) (PVP), which exhaustively concentrates iodine as a yellow iodine-PVP complex. The amount of concentrated iodine is then determined in only 2 s by using a hand-held diffuse reflectance spectrometer and comparing the result with a calibration curve based on the Kubelka-Munk function. The same general procedure can be used to determine iodide levels after its facile and exhaustive oxidation to iodine by peroxymonosulfate (i.e., Oxone reagent). For samples containing both analytes, a two-step procedure can be used in which the iodide concentration is calculated from the difference in iodine levels before and after treatment of the sample with peroxymonosulfate. With this methodology, iodine and iodide levels in the 0.1-5.0 ppm range can be determined with a total workup time of approximately 60 s and an RSD of approximately 6%.
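A minimal sketch of the quantitation step described above, assuming a linear Kubelka-Munk calibration: the function F(R) = (1 - R)²/(2R) is computed from the measured diffuse reflectance and compared against a calibration line. The calibration standards and sample reading below are invented for illustration.

```python
# Minimal sketch: Kubelka-Munk conversion of diffuse reflectance R (0-1) and
# read-off against a linear calibration. All numeric values are invented.
import numpy as np

def kubelka_munk(R):
    return (1.0 - R) ** 2 / (2.0 * R)

# Hypothetical calibration standards: ppm iodine vs. measured reflectance.
conc = np.array([0.1, 0.5, 1.0, 2.5, 5.0])
refl = np.array([0.90, 0.78, 0.71, 0.58, 0.47])
slope, intercept = np.polyfit(conc, kubelka_munk(refl), deg=1)

sample_R = 0.60
print((kubelka_munk(sample_R) - intercept) / slope)  # estimated ppm iodine
```

For samples containing both analytes, the same calibration would be applied before and after the peroxymonosulfate oxidation, with the iodide concentration taken from the difference, as the abstract describes.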
A General Methodology for the Translation of Behavioral Terms into Vernacular Languages.
Virues-Ortega, Javier; Martin, Neil; Schnerch, Gabriel; García, Jesús Ángel Miguel; Mellichamp, Fae
2015-05-01
As the field of behavior analysis expands internationally, the need for comprehensive and systematic glossaries of behavioral terms in the vernacular languages of professionals and clients becomes crucial. We created a Spanish-language glossary of behavior-analytic terms by developing and employing a systematic set of decision-making rules for the inclusion of terms. We then submitted the preliminary translation to a multi-national advisory committee to evaluate the transnational acceptability of the glossary. This method led to a translated corpus of over 1200 behavioral terms. The end products of this work included the following: (a) a Spanish-language glossary of behavior analytic terms that are publicly available over the Internet through the Behavior Analyst Certification Board and (b) a set of translation guidelines summarized here that may be useful for the development of glossaries of behavioral terms into other vernacular languages.
Gear noise, vibration, and diagnostic studies at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Zakrajsek, James J.; Oswald, Fred B.; Townsend, Dennis P.; Coy, John J.
1990-01-01
The NASA Lewis Research Center and the U.S. Army Aviation Systems Command are involved in a joint research program to advance the technology of rotorcraft transmissions. This program consists of analytical as well as experimental efforts to achieve the overall goals of reducing weight, noise, and vibration, while increasing life and reliability. Recent analytical activities are highlighted in the areas of gear noise, vibration, and diagnostics performed in-house and through NASA and U.S. Army sponsored grants and contracts. These activities include studies of gear tooth profiles to reduce transmission error and vibration as well as gear housing and rotordynamic modeling to reduce structural vibration transmission and noise radiation, and basic research into current gear failure diagnostic methodologies. Results of these activities are presented along with an overview of near term research plans in the gear noise, vibration, and diagnostics area.
Analytic analysis of auxetic metamaterials through analogy with rigid link systems
NASA Astrophysics Data System (ADS)
Rayneau-Kirkhope, Daniel; Zhang, Chengzhao; Theran, Louis; Dias, Marcelo A.
2018-02-01
In recent years, many structural motifs have been designed with the aim of creating auxetic metamaterials. One area of particular interest in this subject is the creation of auxetic material properties through elastic instability. Such metamaterials switch from conventional behaviour to an auxetic response for loads greater than some threshold value. This paper develops a novel methodology in the analysis of auxetic metamaterials which exhibit elastic instability through analogy with rigid link lattice systems. The results of our analytic approach are confirmed by finite-element simulations for both the onset of elastic instability and post-buckling behaviour including Poisson's ratio. The method gives insight into the relationships between mechanisms within lattices and their mechanical behaviour; as such, it has the potential to allow existing knowledge of rigid link lattices with auxetic paths to be used in the design of future buckling-induced auxetic metamaterials.
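For background, auxetic behaviour is defined by a negative Poisson's ratio,

$$\nu=-\frac{\varepsilon_{\mathrm{trans}}}{\varepsilon_{\mathrm{axial}}}<0,$$

so an auxetic lattice expands laterally when stretched; the threshold load mentioned above marks the buckling-induced switch from a conventional response (ν > 0) to an auxetic one (ν < 0).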
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
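For background, the simplest capture-recapture abundance estimate is the two-sample Lincoln-Petersen estimator, shown below in its Chapman bias-corrected form; the abstract discusses the methodology generally and does not prescribe this estimator, and the counts are illustrative.

```python
# Background sketch: Chapman form of the Lincoln-Petersen estimator, the
# simplest capture-recapture abundance estimate. Counts are invented.
n1 = 200   # birds banded on the first occasion
n2 = 150   # birds captured on the second occasion
m2 = 30    # recaptures (banded birds among the n2)

N_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1  # estimated population size
print(round(N_hat))  # ~978 birds
```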
Chemical regulators of plant hormones and their applications in basic research and agriculture.
Jiang, Kai; Asami, Tadao
2018-04-20
Plant hormones are small molecules that play versatile roles in regulating plant growth, development, and responses to the environment. Classic methodologies, including genetics, analytic chemistry, biochemistry, and molecular biology, have contributed to the progress in plant hormone studies. In addition, chemical regulators of plant hormone functions have been important in such studies. Today, synthetic chemicals, including plant growth regulators, are used to study and manipulate biological systems, collectively referred to as chemical biology. Here, we summarize the available chemical regulators and their contributions to plant hormone studies. We also pose questions that remain to be addressed in plant hormone studies and that might be solved with the help of chemical regulators.
Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven
2012-04-01
The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.
Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1989-12-01
On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the methods are included. In addition, published work on analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of "routine" physical/chemical properties (Chapter 1); preliminary separation of >200°C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).
Control/structure interaction design methodology
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.; Layman, William E.
1989-01-01
The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.
Vicario, Ana; Aragón, Leslie; Wang, Chien C; Bertolino, Franco; Gomez, María R
2018-02-05
In this work, a novel molecularly imprinted polymer (MIP) proposed as a solid phase extraction sorbent was developed for the determination of propylparaben (PP) in diverse cosmetic samples. The use of parabens (PAs) as microbiological preservatives is authorized by regulatory agencies; however, several recent studies claim that large-scale use of these preservatives can be a potential health risk and harmful to the environment. Diverse factors that influence polymer synthesis were studied, including the template, functional monomer, porogen and crosslinker used. Morphological characterization of the MIP was performed using SEM and BET analysis. Parameters affecting the molecularly imprinted solid phase extraction (MISPE) and elution efficiency of PP were evaluated. After sample clean-up, the analyte was analyzed by high performance liquid chromatography (HPLC). The whole procedure was validated, showing satisfactory analytical parameters. After applying the MISPE methodology, the extraction recoveries were always better than 86.15%; the precision expressed as RSD% was always lower than 2.19 for the corrected peak areas. A good linear relationship was obtained within the range 8-500 ng mL⁻¹ of PP, r² = 0.99985. Limits of detection and quantification after the MISPE procedure of 2.4 and 8 ng mL⁻¹, respectively, were reached, lower than those of previously reported methodologies. The development of the MISPE-HPLC methodology provided a simple and economic way of accomplishing a clean-up/preconcentration step and the subsequent determination of PP in a complex matrix. The performance of the proposed method was compared against C-18 and silica solid phase extraction (SPE) cartridges. The recovery factors obtained after applying the extraction methods were 96.6, 64.8 and 0.79 for the MISPE, C18-SPE and silica-SPE procedures, respectively. The proposed methodology improves the retention capability of the SPE material and adds robustness and the possibility of reuse, enabling it to be used for PP routine monitoring in diverse personal care products (PCP) and environmental samples. Copyright © 2017 Elsevier B.V. All rights reserved.
A methodological systematic review of what's wrong with meta-ethnography reporting.
France, Emma F; Ring, Nicola; Thomas, Rebecca; Noyes, Jane; Maxwell, Margaret; Jepson, Ruth
2014-11-19
Syntheses of qualitative studies can inform health policy, services and our understanding of patient experience. Meta-ethnography is a systematic seven-phase interpretive qualitative synthesis approach well-suited to producing new theories and conceptual models. However, there are concerns about the quality of meta-ethnography reporting, particularly the analysis and synthesis processes. Our aim was to investigate the application and reporting of methods in recent meta-ethnography journal papers, focusing on the analysis and synthesis process and output. Methodological systematic review of health-related meta-ethnography journal papers published from 2012-2013. We searched six electronic databases, Google Scholar and Zetoc for papers using key terms including 'meta-ethnography.' Two authors independently screened papers by title and abstract with 100% agreement. We identified 32 relevant papers. Three authors independently extracted data and all authors analysed the application and reporting of methods using content analysis. Meta-ethnography was applied in diverse ways, sometimes inappropriately. In 13% of papers the approach did not suit the research aim. In 66% of papers reviewers did not follow the principles of meta-ethnography. The analytical and synthesis processes were poorly reported overall. In only 31% of papers reviewers clearly described how they analysed conceptual data from primary studies (phase 5, 'translation' of studies) and in only one paper (3%) reviewers explicitly described how they conducted the analytic synthesis process (phase 6). In 38% of papers we could not ascertain if reviewers had achieved any new interpretation of primary studies. In over 30% of papers seminal methodological texts which could have informed methods were not cited. We believe this is the first in-depth methodological systematic review of meta-ethnography conduct and reporting. Meta-ethnography is an evolving approach. Current reporting of methods, analysis and synthesis lacks clarity and comprehensiveness. This is a major barrier to use of meta-ethnography findings that could contribute significantly to the evidence base because it makes judging their rigour and credibility difficult. To realise the high potential value of meta-ethnography for enhancing health care and understanding patient experience requires reporting that clearly conveys the methodology, analysis and findings. Tailored meta-ethnography reporting guidelines, developed through expert consensus, could improve reporting.
Simoens, Steven
2013-01-01
Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474
A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.
Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I
2017-09-01
Prescription drug expenditures represent a significant component of health care costs in Canada, with estimates of $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. To identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each of the individual drivers. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at the base-period value. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug cost. Three broad categories of effects are considered: volume, price and drug-mix effects. For each category, important sub-effects are quantified. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time including step-by-step demonstrations of how the formulas were derived. This methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. The methodology can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
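A minimal sketch of the Laspeyres decomposition described above, for two hypothetical drugs: the price effect is evaluated at base-period quantities and the volume effect at base-period prices, with the remainder an interaction term that a full multi-factorial framework would allocate further.

```python
# Minimal sketch of Laspeyres-style cost decomposition: isolate each factor's
# effect by holding the other factor at its base-period value. All prices and
# quantities are invented for illustration.
import numpy as np

p0 = np.array([10.0, 50.0])   # base-period unit prices
q0 = np.array([1000, 200])    # base-period quantities
p1 = np.array([11.0, 45.0])   # current-period prices
q1 = np.array([1200, 260])    # current-period quantities

total_change  = (p1 * q1).sum() - (p0 * q0).sum()
price_effect  = ((p1 - p0) * q0).sum()   # price change at base quantities
volume_effect = ((q1 - q0) * p0).sum()   # quantity change at base prices
residual = total_change - price_effect - volume_effect  # interaction/mix term
print(total_change, price_effect, volume_effect, residual)
```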
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy performs unsatisfactorily when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified using realistic aeroelastic systems.
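As a toy illustration of the nonlinear block Gauss-Seidel fixed-point strategy described above, the sketch below alternates a scalar "fluid" solve and a scalar "structure" solve until the coupled state converges; both operators are invented stand-ins, not the dissertation's discretized Navier-Stokes and structural blocks.

```python
# Toy sketch of a nonlinear block Gauss-Seidel iteration for fluid-structure
# coupling: alternate the fluid block (loads from deflection) and the
# structural block (deflection from loads) until a fixed point is reached.
def fluid_load(u):            # toy aerodynamic load as a function of deflection
    return 1.0 / (1.0 + u)

def structure_deflection(f):  # toy linear structural response to the load
    return 0.5 * f

u = 0.0
for it in range(100):
    f = fluid_load(u)                  # block 1: fluid solve
    u_new = structure_deflection(f)    # block 2: structural solve
    if abs(u_new - u) < 1e-12:         # converged to the coupled fixed point
        break
    u = u_new
print(it, u_new)   # converges to u ~ 0.366 for this toy system
```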
D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco
2013-01-01
This article presents the methodology of the Italian Total Diet Study 2012-2014, aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (⁴⁰K, ¹³⁴Cs, ¹³⁷Cs, ⁹⁰Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally, the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.
Precision medicine: Towards complexity science age.
Yuan, Bing
2016-04-01
Precision medicine (PM) refers to the tailoring of prevention and treatment strategies to the individual characteristics of each patient. Following the vigorous advocacy of U.S. President Obama and China's President Xi, PM has become a hot topic of common concern worldwide. PM does not merely refer to a skill set but rather to a comprehensive medical methodology. Hence, there is PM that builds on the analytical methodology of the Western medical system as well as PM that builds on Chinese medicine (CM). The differences between the two systems are, fundamentally speaking, differences in the methodologies used to describe the body constitution, based on reductionism and holism respectively. Today, as science advances to complex systems and mainstream analytical reductionism moves toward an era of holistic synthesis, it is imperative to introduce CM's holistic body constitution into the modern medical system in order to progress to PM. PM founded on a holistic body constitution is a medical system that integrates Western medicine and CM, and represents the highest attainment of PM in the future.
2018-01-01
The data collection and reporting approaches of four major altmetric data aggregators are studied. The main aim of this study is to understand how differences in social media tracking and data collection methodologies can have effects on the analytical use of altmetric data. For this purpose, discrepancies in the metrics across aggregators have been studied in order to understand how the methodological choices adopted by these aggregators can explain the discrepancies found. Our results show that different forms of accessing the data from diverse social media platforms, together with different approaches of collecting, processing, summarizing, and updating social media metrics cause substantial differences in the data and metrics offered by these aggregators. These results highlight the importance that methodological choices in the tracking, collecting, and reporting of altmetric data can have in the analytical value of the data. Some recommendations for altmetric users and data aggregators are proposed and discussed. PMID:29772003
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL for surface water and CRM samples, were achieved. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is simpler, less time-consuming and less labour-intensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry), avoids the use of toxic chloroform, and is significantly less expensive. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1977-01-01
Samples of liquid oxygen, high pressure nitrogen, low pressure nitrogen, and missile grade air were studied to determine the hydrocarbon concentrations. Concentration of the samples was achieved by adsorption on a molecular sieve and activated charcoal. The trapped hydrocarbons were then desorbed and transferred to an analytical column in a gas chromatograph. The sensitivity of the method depends on the volume of gas passed through the adsorbent tubes. The value of the method was verified through recoverability and reproducibility studies. The use of this method enables LOX, GN2, and missile grade air systems to be routinely monitored to determine low level increases in specific hydrocarbon concentration that could lead to potentially hazardous conditions.
ERIC Educational Resources Information Center
Osler, James Edward
2015-01-01
This monograph provides a neuroscience-based systemological, epistemological, and methodological rationale for the design of an advanced and novel parametric statistical analytics for the biological sciences referred to as "Biotrichotomy". The aim of this new arena of statistics is to provide dual metrics designed to analyze the…
ERIC Educational Resources Information Center
Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.
2015-01-01
The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…
ERIC Educational Resources Information Center
Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.
2014-01-01
Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…
ERIC Educational Resources Information Center
Porter, Kristin E.; Balu, Rekha; Hendra, Richard
2017-01-01
This post is one in a series highlighting MDRC's methodological work. Contributors discuss the refinement and practical use of research methods being employed across the organization. Across policy domains, practitioners and researchers are benefiting from a trend of greater access to both more detailed and frequent data and the increased…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, M.S.Y.
1990-12-01
The PAGAN code system is a part of the performance assessment methodology developed for use by the U.S. Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. In this methodology, PAGAN is used as one candidate approach for analysis of the ground-water pathway. PAGAN, Version 1.1, has the capability to model the source term, vadose-zone transport, and aquifer transport of radionuclides from a waste disposal unit. It combines the two codes SURFACE and DISPERSE, which are used as semi-analytical solutions to the convective-dispersion equation. The system uses menu-driven input/output for implementing a simple ground-water transport analysis and incorporates statistical uncertainty functions for handling data uncertainties. The output from PAGAN includes a time- and location-dependent radionuclide concentration at a well in the aquifer, or a time- and location-dependent radionuclide flux into a surface-water body.
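For background, semi-analytical solutions of the kind combined in PAGAN are typically written for a one-dimensional convective-dispersion (advection-dispersion) equation of the form

$$R\frac{\partial C}{\partial t}=D\frac{\partial^{2}C}{\partial x^{2}}-v\frac{\partial C}{\partial x}-\lambda R C,$$

where C is radionuclide concentration, v the pore-water velocity, D the dispersion coefficient, λ the radioactive decay constant, and R the retardation factor. This is the standard transport equation from the groundwater literature, not a form quoted from the PAGAN documentation.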
Shaw, Bronwen E; Hahn, Theresa; Martin, Paul J; Mitchell, Sandra A; Petersdorf, Effie W; Armstrong, Gregory T; Shelburne, Nonniekaye; Storer, Barry E; Bhatia, Smita
2017-01-01
The increasing numbers of hematopoietic cell transplantations (HCTs) performed each year, the changing demographics of HCT recipients, the introduction of new transplantation strategies, incremental improvement in survival, and the growing population of HCT survivors demand a comprehensive approach to examining the health and well-being of patients throughout life after HCT. This report summarizes strategies for the conduct of research on late effects after transplantation, including consideration of the study design and analytic approaches; methodologic challenges in handling complex phenotype data; an appreciation of the changing trends in the practice of transplantation; and the availability of biospecimens to support laboratory-based research. It is hoped that these concepts will promote continued research and facilitate the development of new approaches to address fundamental questions in transplantation outcomes. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
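For background, consensus-based PSO variants build on the canonical PSO update shown below; this is the textbook rule with common coefficient choices, not the authors' exact algorithm, and the objective is a simple sphere function chosen for illustration.

```python
# Background sketch of the canonical PSO velocity/position update:
# v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);  x <- x + v
import numpy as np

rng = np.random.default_rng(0)
def f(x): return (x ** 2).sum(axis=1)       # objective: sphere function

n, d = 30, 5                                 # swarm size, dimension
x = rng.uniform(-5, 5, (n, d)); v = np.zeros((n, d))
pbest = x.copy(); pval = f(x)                # personal bests
g = pbest[pval.argmin()].copy()              # global best

w, c1, c2 = 0.72, 1.49, 1.49                 # common coefficient choices
for _ in range(200):
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = x + v
    val = f(x)
    better = val < pval
    pbest[better], pval[better] = x[better], val[better]
    g = pbest[pval.argmin()].copy()
print(pval.min())                            # best objective value found
```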
Insights from two industrial hygiene pilot e-cigarette passive vaping studies.
Maloney, John C; Thompson, Michael K; Oldham, Michael J; Stiff, Charles L; Lilly, Patrick D; Patskan, George J; Shafer, Kenneth H; Sarkar, Mohamadi A
2016-01-01
While several reports have been published using research methods to estimate exposure risk to e-cigarette vapors in nonusers, only two have directly measured indoor air concentrations from vaping using validated industrial hygiene sampling methodology. Our first study was designed to measure indoor air concentrations of nicotine, menthol, propylene glycol, glycerol, and total particulates during the use of multiple e-cigarettes in a well-characterized room over a period of time. Our second study was a repeat of the first and also evaluated levels of formaldehyde. Measurements were collected using active sampling, near real-time, and direct measurement techniques. Air sampling incorporated industrial hygiene sampling methodology using analytical methods established by the National Institute for Occupational Safety and Health and the Occupational Safety and Health Administration. Active samples were collected over a 12-hr period, for 4 days. Background measurements were taken in the same room the day before and the day after vaping. Panelists (n = 185, Study 1; n = 145, Study 2) used menthol and non-menthol MarkTen prototype e-cigarettes. Vaping sessions (six, 1-hr) included 3 prototypes, with the total number of puffs ranging from 36-216 per session. Results of the active samples were below the limit of quantitation of the analytical methods. Near real-time data were below the lowest concentration on the established calibration curves. Data from this study indicate that the majority of chemical constituents sampled were below quantifiable levels. Formaldehyde was detected at consistent levels during all sampling periods. These two studies found that indoor vaping of the MarkTen prototype e-cigarette does not produce chemical constituents at quantifiable levels above background using standard industrial hygiene collection techniques and analytical methods.
Non-traditional applications of laser desorption/ionization mass spectrometry
NASA Astrophysics Data System (ADS)
McAlpin, Casey R.
Seven studies were carried out using laser desorption/ionization mass spectrometry (LDI MS) to develop enhanced methodologies for a variety of analyte systems by investigating analyte chemistries, ionization processes, and the elimination of spectral interferences. Applications of LDI and matrix-assisted laser desorption/ionization (MALDI) have previously been limited by poorly understood ionization phenomena and spectral interferences from matrices. MALDI MS is well suited to the analysis of proteins. However, the proteins associated with bacteriophages often form complexes which are too massive for detection with a standard MALDI mass spectrometer. As such, methodologies for pretreatment of these samples are discussed in detail in the first chapter. Pretreatment of bacteriophage samples with reducing agents disrupted disulfide linkages and allowed enhanced detection of bacteriophage proteins. The second chapter focuses on the use of MALDI MS for lipid compounds whose molecular mass is significantly less than that of the proteins for which MALDI is most often applied. The use of MALDI MS for lipid analysis presented unique challenges such as matrix interference and differential ionization efficiencies. It was observed that optimization of the matrix system and addition of cationization reagents mitigated these challenges and resulted in an enhanced methodology for MALDI MS of lipids. One of the challenges commonly encountered in efforts to expand MALDI MS applications is, as previously mentioned, interference introduced by organic matrix molecules. The third chapter focuses on the development of a novel inorganic matrix replacement system called metal oxide laser ionization mass spectrometry (MOLI MS). In contrast to other matrix replacements, considerable effort was devoted to elucidating the ionization mechanism. It was shown that chemisorption of analytes to the metal oxide surface produced acidic adsorbed species which then protonated free analyte molecules. Expanded applications of MOLI MS were developed following description of the ionization mechanism. A series of experiments was carried out involving treatment of metal oxide surfaces with reagent molecules to expand MOLI MS and develop enhanced MOLI MS methodologies. It was found that treatment of the metal oxide surface with a small molecule acting as a proton source extended MOLI MS to analytes which did not form acidic adsorbed species. Proton-source-pretreated MOLI MS was then used for the analysis of oils obtained from the fast, anoxic pyrolysis of biomass (py-oil). These samples are complex and produce MOLI mass spectra with many peaks. In this experiment, methods of data reduction including Kendrick mass defects and nominal mass z*-scores, which are commonly used for the study of petroleum fractions, were used to interpret these spectra and identify the major constituencies of py-oils. Through data reduction and collision induced dissociation (CID), homologous series of compounds were rapidly identified. The final chapter involves using metal oxides to catalytically cleave the ester linkage of lipids containing fatty acids, in addition to ionization. The cleavage process results in the production of spectra similar to those observed with saponification/methylation. Fatty acid profiles were generated for a variety of micro-organisms to differentiate between bacterial species. (Abstract shortened by UMI.)
Life cycle costing of food waste: A review of methodological approaches.
De Menna, Fabio; Dietershagen, Jana; Loubiere, Marion; Vittuari, Matteo
2018-03-01
Food waste (FW) is a global problem that is receiving increasing attention due to its environmental and economic impacts. Appropriate FW prevention, valorization, and management routes could mitigate or avoid these effects. Life cycle thinking and approaches, such as life cycle costing (LCC), may represent suitable tools to assess the sustainability of these routes. This study analyzes different LCC methodological aspects and approaches to evaluate FW management and valorization routes. A systematic literature review was carried out with a focus on different LCC approaches, their application to food, FW, and waste systems, as well as on specific methodological aspects. The review consisted of three phases: a collection phase, an iterative phase with experts' consultation, and a final literature classification. Journal papers and reports were retrieved from selected databases and search engines. The standardization of LCC methodologies is still in its infancy due to a lack of consensus over definitions and approaches. Research on the life cycle cost of FW is limited and generally focused on FW management, rather than prevention or valorization of specific flows. FW prevention, valorization, and management require a consistent integration of LCC and Life Cycle Assessment (LCA) to avoid tradeoffs between environmental and economic impacts. This entails a proper investigation of methodological differences between attributional and consequential modelling in LCC, especially with regard to the functional unit, system boundaries, multi-functionality, included costs, and assessed impacts. Further efforts could also aim at finding the most effective and transparent categorization of costs, in particular when dealing with multiple stakeholders sustaining costs of FW. Interpretation of results from LCC of FW should take into account the effect on larger economic systems. Additional key performance indicators and analytical tools could be included in consequential approaches. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
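To make the costing mechanics concrete, here is a minimal LCC sketch in Python; the stage names, cost streams, discount rate, and functional unit are all hypothetical. It simply discounts stage costs to present value and normalizes by the functional unit:

```python
# Minimal life cycle costing (LCC) sketch: discounted costs summed over
# life cycle stages and expressed per functional unit. All figures and
# stage names are hypothetical, for illustration only.

def npv(costs_by_year: dict[int, float], rate: float) -> float:
    """Net present value of a stream of costs (year 0 = today)."""
    return sum(c / (1.0 + rate) ** t for t, c in costs_by_year.items())

# Hypothetical cost streams (EUR) for a food-waste valorization route.
stages = {
    "collection":   {0: 12_000, 1: 12_000, 2: 12_000},
    "pretreatment": {0: 30_000},
    "valorization": {1: 8_000, 2: 8_000},
}

DISCOUNT_RATE = 0.03          # assumed 3% discount rate
TONNES_FW_TREATED = 450.0     # functional unit: tonnes of food waste

total = sum(npv(stream, DISCOUNT_RATE) for stream in stages.values())
print(f"Life cycle cost: EUR {total:,.0f}")
print(f"Cost per functional unit: EUR {total / TONNES_FW_TREATED:,.2f}/t")
```

The choice of functional unit and system boundary, which the review flags as a key attributional/consequential difference, enters exactly here: in which stages are listed and what the total is divided by.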
Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba
2014-11-01
Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.
Baker, David R; Kasprzyk-Hordern, Barbara
2011-11-04
Presented is the first comprehensive study of drugs of abuse on suspended particulate matter (SPM) in wastewater. Analysis of SPM is crucial to prevent under-reporting of the levels of analyte that may be present in wastewater. Analytical methods to date analyse only the aqueous part of wastewater samples, removing SPM through filtration or centrifugation. The development of an analytical method to determine 60 compounds on SPM using a combination of pressurised liquid extraction, solid-phase extraction and liquid chromatography coupled with tandem mass spectrometry (PLE-SPE-LC-MS/MS) is reported. The range of compounds monitored included stimulants, opioid and morphine derivatives, benzodiazepines, antidepressants, dissociative anaesthetics, drug precursors, and their metabolites. The method was successfully validated (parameters studied: linearity and range, recovery, accuracy, reproducibility, repeatability, matrix effects, and limits of detection and quantification). The developed methodology was applied to SPM samples collected at three wastewater treatment plants in the UK. The average proportion of analyte on SPM, as opposed to in the aqueous phase, was <5% for several compounds including cocaine, benzoylecgonine, MDMA, and ketamine; whereas the proportion was >10% for methadone, EDDP, EMDP, BZP, fentanyl, nortramadol, norpropoxyphene, sildenafil and all antidepressants (dosulepin, amitriptyline, nortriptyline, fluoxetine and norfluoxetine). Consequently, the lack of SPM analysis in wastewater sampling protocols could lead to under-reporting of the measured concentrations of some compounds. Copyright © 2011 Elsevier B.V. All rights reserved.
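The partitioning arithmetic behind these proportions is simple; the sketch below uses hypothetical concentrations and an assumed SPM load to show how the SPM-borne share of the total analyte mass is obtained:

```python
# Sketch of the partitioning calculation behind the SPM findings: the
# share of the total analyte load carried by suspended particulate
# matter versus the aqueous phase. All input values are hypothetical.

def fraction_on_spm(c_aq_ng_per_L: float,
                    c_spm_ng_per_g: float,
                    spm_mg_per_L: float) -> float:
    """Mass fraction of analyte sorbed to SPM in one litre of wastewater."""
    mass_aq = c_aq_ng_per_L                         # ng in aqueous phase
    mass_spm = c_spm_ng_per_g * spm_mg_per_L / 1e3  # ng on particulates
    return mass_spm / (mass_aq + mass_spm)

# Hypothetical example: a methadone-like compound.
f = fraction_on_spm(c_aq_ng_per_L=120.0,
                    c_spm_ng_per_g=80.0,
                    spm_mg_per_L=250.0)
print(f"Fraction on SPM: {f:.1%}")  # skipping SPM would miss this share
```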
Manterola, Carlos; Busquets, Juli; Pascual, Marta; Grande, Luis
2006-02-01
The aim of this study was to determine the methodological quality of articles on therapeutic procedures published in Cirugía Española and to study its association with publication year, center, and subject matter. A bibliometric study that included all articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 was performed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables analyzed were: year of publication, center, design, and methodological quality. Methodological quality was determined by a valid and reliable scale. Descriptive statistics (calculation of means, standard deviations and medians) and analytical statistics (Pearson's χ², nonparametric, ANOVA and Bonferroni tests) were used. A total of 244 articles were studied (197 case series [81%], 28 cohort studies [12%], 17 clinical trials [7%], 1 cross-sectional study and 1 case-control study [0.8%]). The studies were performed mainly in Catalonia and Murcia (22% and 16%, respectively). The most frequent subject areas were soft tissue and hepatobiliopancreatic surgery (23% and 19%, respectively). The mean and median of the methodological quality score calculated for the entire series were 10.2 ± 3.9 points and 9.5 points, respectively. Methodological quality increased significantly with publication year (p < 0.001). An association between methodological quality and subject area was observed, but no association was detected with the center performing the study. The methodological quality of articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 is low. However, a statistically significant trend toward improvement was observed.
Cardaci, Maurizio; Misuraca, Raffaella
2005-08-01
This paper raises some methodological problems in the dual-process explanation provided by Wada and Nittono for their 2004 results using the Wason selection task. We maintain that Nittono's rethinking approach is weak and should be refined to better capture the evidence of analytic processes.
Silva, Ana Rita M; Portugal, Fátima C M; Nogueira, J M F
2008-10-31
Stir bar sorptive extraction with polyurethane (PU) and polydimethylsiloxane (PDMS) polymeric phases followed by high-performance liquid chromatography with diode array detection [SBSE(PU or PDMS)/HPLC-DAD] was studied for the determination of six acidic pharmaceuticals [o-acetylsalicylic acid (ACA), ibuprofen (IBU), diclofenac sodium (DIC), naproxen (NAP), mefenamic acid (MEF) and gemfibrozil (GEM)], selected as model non-steroidal acidic anti-inflammatory drugs and lipid regulators, in environmental water matrices. The main parameters affecting the efficiency of the proposed methodology are fully discussed. Assays performed on 25 mL water samples spiked at the 10 µg L⁻¹ level under optimized experimental conditions yielded recoveries ranging from 45.3 ± 9.0% (ACA) to 90.6 ± 7.2% (IBU) by SBSE(PU) and 9.8 ± 1.6% (NAP) to 73.4 ± 5.0% (GEM) by SBSE(PDMS), the former polymeric phase presenting a better affinity for extracting these target analytes from water matrices at the trace level. The methodology also showed excellent linear dynamic ranges for the six acidic pharmaceuticals studied, with correlation coefficients higher than 0.9976, limits of detection and quantification between 0.40-1.7 µg L⁻¹ and 1.5-5.8 µg L⁻¹, respectively, and suitable precision (RSD <15%). Moreover, the developed methodology was applied to the determination of these target analytes in several environmental matrices, including river, sea and wastewater samples, achieving good performance and moderate matrix effects. In short, the PU foams proved to be an excellent alternative for the enrichment of the more polar metabolites from water matrices by SBSE, overcoming the limitations of the conventional PDMS phase.
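For readers wanting to reproduce figures of merit of this kind, the following sketch estimates LOD and LOQ from a linear calibration (ICH-style 3.3s/S and 10s/S); the calibration data are hypothetical:

```python
# Sketch of the figures of merit reported above: LOD and LOQ estimated
# from the calibration slope and the residual standard deviation of a
# linear fit (ICH-style 3.3s/S and 10s/S). Data are hypothetical.

import numpy as np

conc = np.array([2.0, 5.0, 10.0, 20.0, 40.0])        # ug/L, spiked levels
signal = np.array([10.5, 26.0, 51.8, 103.2, 208.1])  # detector response

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s = residuals.std(ddof=2)           # residual SD of the regression

print(f"r = {np.corrcoef(conc, signal)[0, 1]:.4f}")
print(f"LOD = {3.3 * s / slope:.2f} ug/L, LOQ = {10 * s / slope:.2f} ug/L")
```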
Rating methodological quality: toward improved assessment and investigation.
Moyer, Anne; Finney, John W
2005-01-01
Assessing methodological quality is considered essential in deciding what investigations to include in research syntheses and in detecting potential sources of bias in meta-analytic results. Quality assessment is also useful in characterizing the strengths and limitations of the research in an area of study. Although numerous instruments to measure research quality have been developed, they have lacked empirically-supported components. In addition, different summary quality scales have yielded different findings when they were used to weight treatment effect estimates for the same body of research. Suggestions for developing improved quality instruments include: distinguishing distinct domains of quality, such as internal validity, external validity, the completeness of the study report, and adherence to ethical practices; focusing on individual aspects, rather than domains of quality; and focusing on empirically-verified criteria. Other ways to facilitate the constructive use of quality assessment are to improve and standardize the reporting of research investigations, so that the quality of studies can be more equitably and thoroughly compared, and to identify optimal methods for incorporating study quality ratings into meta-analyses.
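A small numerical illustration of the scale-dependence problem noted above: weighting the same hypothetical effect sizes by two different quality scales shifts the pooled estimate in different directions.

```python
# Sketch of quality-score weighting in meta-analysis, illustrating the
# point that different quality scales can shift the pooled estimate.
# Effect sizes and scores are hypothetical.

def weighted_mean(effects, weights):
    return sum(e * w for e, w in zip(effects, weights)) / sum(weights)

effects = [0.10, 0.35, 0.52, 0.20]          # per-study effect sizes
scale_a = [9, 4, 3, 8]                      # quality scores, scale A
scale_b = [5, 7, 8, 4]                      # quality scores, scale B

print(f"Unweighted mean:      {sum(effects)/len(effects):.3f}")
print(f"Weighted by scale A:  {weighted_mean(effects, scale_a):.3f}")
print(f"Weighted by scale B:  {weighted_mean(effects, scale_b):.3f}")
```

The two scales pull the pooled effect below and above the unweighted mean, which is exactly why the authors argue for empirically verified, domain-specific quality criteria rather than summary scores.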
A design methodology for neutral buoyancy simulation of space operations
NASA Technical Reports Server (NTRS)
Akin, David L.
1988-01-01
Neutral buoyancy has often been used in the past for EVA development activities, but little has been done to provide an analytical understanding of the environment and its correlation with space. This paper covers a set of related research topics at the MIT Space Systems Laboratory, dealing with the modeling of the space and underwater environments, validation of the models through testing in neutral buoyancy, parabolic flight, and space flight experiments, and applications of the models to gain a better design methodology for creating meaningful neutral buoyancy simulations. Examples covered include simulation validation criteria for human body dynamics, and for applied torques in a beam rotation task, which is the pacing crew operation for EVA structural assembly. Extensions of the dynamics models are presented for powered vehicles in the underwater environment, and examples given from the MIT Space Telerobotics Research Program, including the Beam Assembly Teleoperator and the Multimode Proximity Operations Device. Future expansions of the modeling theory are also presented, leading to remote vehicles which behave in neutral buoyancy exactly as the modeled system would in space.
The IMI PROTECT project: purpose, organizational structure, and procedures.
Reynolds, Robert F; Kurz, Xavier; de Groot, Mark C H; Schlienger, Raymond G; Grimaldi-Bensouda, Lamiae; Tcherny-Lessenot, Stephanie; Klungel, Olaf H
2016-03-01
The Pharmacoepidemiological Research on Outcomes of Therapeutics by a European ConsorTium (PROTECT) initiative was a collaborative European project that sought to address limitations of current methods in the field of pharmacoepidemiology and pharmacovigilance. Initiated in 2009 and ending in 2015, PROTECT was part of the Innovative Medicines Initiative, a joint undertaking by the European Union and pharmaceutical industry. Thirty-five partners, including academics, regulators, small and medium enterprises, and European Federation of Pharmaceutical Industries and Associations companies, contributed to PROTECT. Two work packages within PROTECT implemented research examining the extent to which differences in study design, methodology, and choice of data source can contribute to producing discrepant results from observational studies on drug safety. To evaluate the effect of these differences, the project applied different designs and analytic methodologies to six drug-adverse event pairs across several electronic healthcare databases and registries. This paper introduces the organizational structure and procedures of PROTECT, including how drug-adverse event pairs and data sources were selected, how study design and analysis documents were developed, and how results were managed centrally. Copyright © 2016 John Wiley & Sons, Ltd.
Imprinting Technology in Electrochemical Biomimetic Sensors
Frasco, Manuela F.; Truta, Liliana A. A. N. A.; Sales, M. Goreti F.; Moreira, Felismina T. C.
2017-01-01
Biosensors are a promising tool offering the possibility of low-cost and fast analytical screening in point-of-care diagnostics and for on-site detection in the field. Most biosensors in routine use ensure their selectivity/specificity by including natural receptors as the biorecognition element. These materials are, however, too expensive and hard to obtain for every biochemical molecule of interest in environmental and clinical practice. Molecularly imprinted polymers have emerged over time as an alternative to natural antibodies in biosensors. In theory, these materials are stable and robust, withstanding harsher conditions of pH, temperature, pressure, or organic solvents. In addition, these synthetic materials are much cheaper than their natural counterparts while offering equivalent affinity and sensitivity in the molecular recognition of the target analyte. Imprinting technology and biosensors have converged only recently, relying mostly on electrochemical detection and enabling direct reading of different analytes, while promoting significant advances in various fields of use. Thus, this review encompasses such developments and gives a general overview of how to build promising biomimetic materials as biorecognition elements in electrochemical sensors. It includes different molecular imprinting strategies, such as the choice of polymer material, imprinting methodology, and assembly on the transduction platform. Their interface with the most recent nanostructured supports acting as standard conductive materials within electrochemical biomimetic sensors is pointed out. PMID:28272314
New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.
Shaaban, Heba
2016-10-01
Greening the analytical methods used for the analysis of pharmaceuticals has been receiving great interest, aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss in chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents and of waste disposal has motivated the analytical community to search for alternatives to replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desired. This review gives a comprehensive overview of different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. This review presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices. Graphical abstract: Green pathways of liquid chromatography for more eco-friendly analysis of pharmaceuticals.
A rapid and sensitive analytical method for the determination of 14 pyrethroids in water samples.
Feo, M L; Eljarrat, E; Barceló, D
2010-04-09
A simple, efficient and environmentally friendly analytical methodology is proposed for extracting and preconcentrating pyrethroids from water samples prior to gas chromatography-negative ion chemical ionization mass spectrometry (GC-NCI-MS) analysis. Fourteen pyrethroids were selected for this work: bifenthrin, cyfluthrin, lambda-cyhalothrin, cypermethrin, deltamethrin, esfenvalerate, fenvalerate, fenpropathrin, tau-fluvalinate, permethrin, phenothrin, resmethrin, tetramethrin and tralomethrin. The method is based on ultrasound-assisted emulsification-extraction (UAEE) of a water-immiscible solvent in an aqueous medium. Chloroform was used as the extraction solvent in the UAEE technique. Target analytes were quantitatively extracted, achieving an enrichment factor of 200 when a 20 mL aliquot of pure water spiked with pyrethroid standards was extracted. The method was also evaluated with tap water and river water samples. Method detection limits (MDLs) ranged from 0.03 to 35.8 ng L⁻¹ with RSD values of 3-25% (n=5). The calibration-curve coefficients obtained following the proposed methodology were ≥0.998. Recovery values were in the range of 45-106%, showing satisfactory robustness of the method for analyzing pyrethroids in water samples. The proposed methodology was applied to the analysis of river water samples. Cypermethrin was detected at concentration levels ranging from 4.94 to 30.5 ng L⁻¹. Copyright 2010 Elsevier B.V. All rights reserved.
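The enrichment factor and recovery reported above follow from elementary ratios. In the sketch below, the 20 mL sample volume and the enrichment factor of 200 come from the abstract, while the 0.1 mL extract volume and the spike amounts are assumptions chosen only to be consistent with them:

```python
# Sketch of two figures of merit from the UAEE method above: the
# enrichment factor and the percent recovery. Input values are
# hypothetical, chosen to mirror the scales reported in the abstract.

def enrichment_factor(v_sample_mL: float, v_extract_mL: float,
                      recovery: float) -> float:
    """EF = (analyte conc. in extract) / (conc. in original sample)."""
    return recovery * v_sample_mL / v_extract_mL

def recovery_pct(found_ng: float, spiked_ng: float) -> float:
    return 100.0 * found_ng / spiked_ng

# 20 mL of water extracted into an assumed 0.1 mL of chloroform at full
# recovery would give EF = 200, matching the enrichment reported above.
print(f"EF: {enrichment_factor(20.0, 0.1, recovery=1.0):.0f}")
print(f"Recovery: {recovery_pct(found_ng=8.9, spiked_ng=10.0):.0f}%")
```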
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-01-01
Introduction: It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. Methods and analysis: We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. Ethics and dissemination: The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. The aforementioned will be introduced to the findings through workshops, seminars, round table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued such as key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal. PMID:24568962
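The pooling step underlying such a meta-meta-analytic approach is typically a random-effects model. Below is a minimal DerSimonian-Laird sketch with hypothetical effect sizes and variances (not data from the protocol):

```python
# Minimal random-effects meta-analysis (DerSimonian-Laird), the kind of
# pooling that underlies the two-level meta-meta-analytic approach
# described above. Effect sizes and variances are hypothetical.

def dersimonian_laird(effects, variances):
    """Return (pooled effect, tau^2) under a random-effects model."""
    w = [1.0 / v for v in variances]                     # FE weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # heterogeneity
    w_re = [1.0 / (v + tau2) for v in variances]         # RE weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2

effects = [0.42, 0.18, 0.55, 0.30, 0.10]    # per-trial effect estimates
variances = [0.04, 0.02, 0.09, 0.03, 0.05]  # their sampling variances
pooled, tau2 = dersimonian_laird(effects, variances)
print(f"Pooled effect: {pooled:.3f}, tau^2: {tau2:.4f}")
```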
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
One Controller at a Time (1-CAT): A MIMO design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.
Programmable Nano-Bio-Chip Sensors: Analytical Meets Clinical
Jokerst, Jesse V.; Floriano, Pierre N.; Christodoulides, Nicolaos; McDevitt, John T.; Jacobson, James W.; Bhagwandin, Bryon D.
2010-01-01
Synopsis: There have been many recent advances in the nano-bio-chip (NBC) analysis methodology with implications for a number of high-morbidity diseases including HIV, cancer, and heart disease. In their Feature article, Jesse V. Jokerst of The University of Texas at Austin; Pierre N. Floriano, Nicolaos Christodoulides, and John T. McDevitt of Rice University; and James W. Jacobson and Bryon D. Bhagwandin of LabNow, Inc. discuss the construction, capabilities, and advantages of NBCs. The cover shows arrays of NBCs. Images courtesy of Glennon Simmons/McDevitt Lab and Marcha Miller of The University of Texas at Austin. PMID:20128622
On the singular perturbations for fractional differential equation.
Atangana, Abdon
2014-01-01
The goal of this paper is to examine the possible extension of the singular perturbation differential equation to the concept of fractional order derivative. To achieve this, we present a review of the concept of fractional calculus. We make use of the Laplace transform operator to derive exact solutions of singular perturbation fractional linear differential equations. We apply three analytical methods to present exact and approximate solutions of the singular perturbation fractional, nonlinear, nonhomogeneous differential equation: the regular perturbation method, a new development of the variational iteration method, and the homotopy decomposition method.
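For the linear case, the Laplace-transform route yields a closed form: for the Caputo problem D^a y = -lam*y with y(0) = 1, the solution is the Mittag-Leffler function E_a(-lam*t^a). The sketch below evaluates it by truncated series; the order and rate are illustrative, not taken from the paper:

```python
# Numerical check of the Laplace-transform solution of the linear
# Caputo fractional relaxation equation D^a y = -lam * y, y(0) = 1,
# whose exact solution is the Mittag-Leffler function E_a(-lam * t^a).
# A truncated-series evaluation; parameters are illustrative only.

from math import gamma

def mittag_leffler(z: float, a: float, n_terms: int = 120) -> float:
    """E_a(z) = sum_{k>=0} z^k / Gamma(a*k + 1), truncated series."""
    return sum(z ** k / gamma(a * k + 1) for k in range(n_terms))

a, lam = 0.8, 1.0          # fractional order and rate (illustrative)
for t in (0.0, 0.5, 1.0, 2.0):
    y = mittag_leffler(-lam * t ** a, a)
    print(f"t = {t:3.1f}  y(t) = {y:.6f}")
```

At a = 1 the series reduces to exp(-lam*t), which makes it a convenient sanity check against the classical relaxation equation.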
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, Jean-Paul; Guttromson, Ross; Silva-Monroy, Cesar
This report has been written for the Department of Energy’s Energy Policy and Systems Analysis Office to inform their writing of the Quadrennial Energy Review in the area of energy resilience. The topics of measuring and increasing energy resilience are addressed, including definitions, means of measuring, and analytic methodologies that can be used to make decisions for policy, infrastructure planning, and operations. A risk-based framework is presented which provides a standard definition of a resilience metric. Additionally, a process is identified which explains how the metrics can be applied. Research and development is articulated that will further accelerate the resilience of energy infrastructures.
Thermal barrier coating life prediction model development
NASA Technical Reports Server (NTRS)
Demasi, J. T.; Sheffler, K. D.
1986-01-01
The objective of this program is to establish a methodology to predict Thermal Barrier Coating (TBC) life on gas turbine engine components. The approach involves experimental life measurement coupled with analytical modeling of relevant degradation modes. The coating being studied is a flight qualified two layer system, designated PWA 264, consisting of a nominal ten mil layer of seven percent yttria partially stabilized zirconia plasma deposited over a nominal five mil layer of low pressure plasma deposited NiCoCrAlY. Thermal barrier coating degradation modes being investigated include: thermomechanical fatigue, oxidation, erosion, hot corrosion, and foreign object damage.
Construction of a catalog of colliding galaxy clusters
NASA Astrophysics Data System (ADS)
de los Ríos, M.; Domínguez, M. J.; Paz, D.
2015-08-01
In this work we present first results of the identification of colliding galaxy clusters in galaxy catalogs with redshift measurements (SDSS, 2DF), and introduce the methodology. We calibrated a method by studying the merger trees of clusters in a mock catalog based on a full-blown semi-analytic model of galaxy formation on top of the Millennium cosmological simulation. We also discuss future actions for studying our sample of colliding galaxy clusters, including X-ray observations and mass reconstruction using weak gravitational lensing.
Prediction of Composite Pressure Vessel Failure Location using Fiber Bragg Grating Sensors
NASA Technical Reports Server (NTRS)
Kreger, Steven T.; Taylor, F. Tad; Ortyl, Nicholas E.; Grant, Joseph
2006-01-01
Ten composite pressure vessels were instrumented with fiber Bragg grating sensors in order to assess the strain levels of the vessels under various loading conditions. This paper discusses the testing methodology and test results, compares the test results with the analytical model, and presents a possible methodology for predicting the failure location and strain level of composite pressure vessels.
Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz
2018-01-18
In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled with the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred demand for the development of novel and revolutionary methodologies for the analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin-film SPME was investigated. Results revealed substantial depletion and consequent disruption of the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast, reliable free and total concentration determinations without disruption of system equilibrium.
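The non-depletive logic is easy to sketch: calibrate the coating in buffer to obtain a fiber constant, then infer the free concentration, binding, and depletion in the protein-containing sample. All numbers below are hypothetical:

```python
# Sketch of the non-depletive SPME logic above: calibrate the coating
# in buffer to get the fiber constant, then infer free concentration
# and percent protein binding in plasma. All numbers are hypothetical.

def fiber_constant(n_extracted_ng: float, c_free_ng_mL: float) -> float:
    """K = amount on coating per unit free concentration (mL)."""
    return n_extracted_ng / c_free_ng_mL

# Calibration in protein-free buffer at a known concentration:
K = fiber_constant(n_extracted_ng=0.50, c_free_ng_mL=100.0)

# Plasma sample: same coating, same exposure time.
n_plasma = 0.08                  # ng extracted from the plasma sample
c_total = 100.0                  # total (free + bound) conc., ng/mL
c_free = n_plasma / K            # inferred free concentration
print(f"Free fraction fu = {c_free / c_total:.2f}")
print(f"Protein binding = {1 - c_free / c_total:.0%}")

# Negligible-depletion check for a 100 uL sample:
depletion = n_plasma / (c_total * 0.1)    # extracted / total in sample
print(f"Depletion: {depletion:.1%} (should be only a few % or less)")
```

The depletion check at the end is the quantitative version of the abstract's point: the thin-film format failed it, while the ultra-thin coated fibers keep the extracted amount small enough to leave the binding equilibrium undisturbed.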
Analytical group decision making in natural resources: Methodology and application
Schmoldt, D.L.; Peterson, D.L.
2000-01-01
Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
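The analytic hierarchy process step of this methodology reduces to an eigenvector computation on the pairwise comparison matrix. The sketch below, with a hypothetical 3x3 judgment matrix, also reports Saaty's consistency ratio, which is how the process flags the "irrational judgments" mentioned above:

```python
# Sketch of the analytic hierarchy process (AHP) step in the group
# methodology above: deriving priorities from a pairwise comparison
# matrix and checking judgment consistency. The matrix is hypothetical.

import numpy as np

def ahp_priorities(A: np.ndarray):
    """Principal-eigenvector priorities plus the consistency ratio."""
    eigvals, eigvecs = np.linalg.eig(A)
    i = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, i].real)
    w /= w.sum()                                  # normalized priorities
    n = A.shape[0]
    ci = (eigvals[i].real - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # random index (Saaty)
    return w, ci / ri                             # priorities, CR

# Hypothetical pairwise judgments for three research alternatives.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_priorities(A)
print("Priorities:", np.round(w, 3), " CR:", round(cr, 3))
```

A consistency ratio below roughly 0.1 is conventionally taken as acceptable; larger values signal that a participant's pairwise judgments contradict one another and should be revisited.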
Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.
Fok, Carlotta Ching Ting; Henry, David; Allen, James
2015-10-01
Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section includes two sections. The first section aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.
Perspective: Randomized Controlled Trials Are Not a Panacea for Diet-Related Research
Hébert, James R; Frongillo, Edward A; Adams, Swann A; Turner-McGrievy, Gabrielle M; Hurley, Thomas G; Miller, Donald R; Ockene, Ira S
2016-01-01
Research into the role of diet in health faces a number of methodologic challenges in the choice of study design, measurement methods, and analytic options. Heavier reliance on randomized controlled trial (RCT) designs is suggested as a way to solve these challenges. We present and discuss 7 inherent and practical considerations with special relevance to RCTs designed to study diet: 1) the need for narrow focus; 2) the choice of subjects and exposures; 3) blinding of the intervention; 4) perceived asymmetry of treatment in relation to need; 5) temporal relations between dietary exposures and putative outcomes; 6) strict adherence to the intervention protocol, despite potential clinical counter-indications; and 7) the need to maintain methodologic rigor, including measuring diet carefully and frequently. Alternatives, including observational studies and adaptive intervention designs, are presented and discussed. Given the high noise-to-signal ratios introduced by inaccurate assessment methods in studies with weak or inappropriate designs (including RCTs), it is conceivable and indeed likely that the effects of diet are underestimated. No matter which designs are used, studies will require continued improvement in the assessment of dietary intake. As technology continues to improve, there is potential for enhanced accuracy and reduced user burden of dietary assessments that are applicable to a wide variety of study designs, including RCTs. PMID:27184269
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
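The core computation, a failure probability from fitted load and strength distributions, can be illustrated in a few lines. The distributions below are assumed for illustration only and are checked against the closed form available when both are normal:

```python
# Sketch of the probabilistic failure computation described above:
# P(failure) = P(applied stress > strength) for fitted distributions,
# evaluated by numerical integration. All parameters are hypothetical.

from scipy import integrate, stats

# Assumed fitted distributions (illustrative, not from the paper):
stress = stats.norm(loc=300.0, scale=40.0)     # applied stress, MPa
strength = stats.norm(loc=450.0, scale=50.0)   # material strength, MPa

# P_f = integral over s of f_stress(s) * P(strength < s)
pf, _ = integrate.quad(lambda s: stress.pdf(s) * strength.cdf(s),
                       0.0, 800.0)
print(f"Failure probability: {pf:.2e}")
print(f"Reliability:         {1.0 - pf:.6f}")

# Cross-check: for two normals, stress minus strength is normal too.
margin = stats.norm(loc=300.0 - 450.0, scale=(40**2 + 50**2) ** 0.5)
print(f"Closed-form check:   {margin.sf(0.0):.2e}")
```

With non-normal distributions fitted to scatter in material strength, loads, and fabrication, only the integrand changes; the numerical integration step stays the same.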
Analytical methodologies for aluminium speciation in environmental and biological samples: a review.
Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W
2001-08-01
It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Concerns about the specific problems of Al speciation and highlights of some important methods are elucidated in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.
Ahsan, Zaid; Jayaprakash, K R
2016-10-01
In this exposition we consider the wave dynamics of a one-dimensional periodic granular dimer (diatomic) chain mounted on a damped and an undamped linear elastic foundation (otherwise called the on-site potential). It is very well known that periodic granular dimers support solitary wave propagation (similar to that in the homogeneous granular chains) for a specific discrete set of mass ratios. In this work we present the analytical investigation of the evolution of solitary waves and primary pulses in granular dimers when they are mounted on on-site potential with and without velocity proportional foundation damping. We invoke a methodology based on the multiple time-scale asymptotic analysis and partition the dynamics of the perturbed dimer chain into slow and fast components. The dynamics of the dimer chain in the limit of large mass mismatch (auxiliary chain) mounted on on-site potential and foundation damping is used as the basis for the analysis. A systematic analytical procedure is then developed for the slowly varying response of the beads and in estimating primary pulse amplitude evolution resulting in a nonlinear map relating the relative displacement amplitudes of two adjacent beads. The methodology is applicable for arbitrary mass ratios between the beads. We present several examples to demonstrate the efficacy of the proposed method. It is observed that the amplitude evolution predicted by the described methodology is in good agreement with the numerical simulation of the original system. This work forms a basis for further application of the considered methodology to weakly coupled granular dimers which finds practical relevance in designing shock mitigating granular layers.
Low-Level Analytical Methodology Updates to Support Decontaminant Performance Evaluations
2011-06-01
...from EPDM and tire rubber coupon materials that were spiked with a known amount of the chemical agent VX, treated with bleach decontaminant, and ... to evaluate the performance of bleach decontaminant on EPDM and tire rubber coupons. Dose-confirmation or Tool samples were collected by delivering ... Factors affecting performance include:
• An aging or damaged analytical column
• Dirty detector
• Other factors related to general instrument and/or sample analysis performance
Analytical Model For Fluid Dynamics In A Microgravity Environment
NASA Technical Reports Server (NTRS)
Naumann, Robert J.
1995-01-01
Report presents an analytical approximation methodology for providing coupled fluid-flow, heat, and mass-transfer equations in a microgravity environment. Engineering estimates accurate to within a factor of 2 can be made quickly and easily, eliminating the need for time-consuming and costly numerical modeling. Any proposed experiment can be reviewed to see how it would perform in the microgravity environment. The model has been applied in a commercial setting for the preliminary design of low-Grashof/Rayleigh-number experiments.
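Estimates of this kind start from the governing dimensionless groups. The sketch below computes Grashof and Rayleigh numbers for an illustrative, roughly water-like fluid at 1 g and in microgravity; all property values are assumptions for illustration:

```python
# Quick order-of-magnitude estimate in the spirit of the analytical
# approximation approach above: Grashof and Rayleigh numbers for a
# candidate experiment in microgravity. Property values are illustrative.

def grashof(g, beta, dT, L, nu):
    """Gr = g * beta * dT * L^3 / nu^2 (buoyancy vs. viscous forces)."""
    return g * beta * dT * L ** 3 / nu ** 2

# Illustrative fluid properties (roughly water-like) and geometry.
beta = 2.1e-4      # thermal expansion coefficient, 1/K
nu = 1.0e-6        # kinematic viscosity, m^2/s
alpha = 1.4e-7     # thermal diffusivity, m^2/s
dT, L = 10.0, 0.01 # temperature difference (K) and length scale (m)

for label, g in (("1 g", 9.81), ("microgravity (1e-6 g)", 9.81e-6)):
    gr = grashof(g, beta, dT, L, nu)
    ra = gr * (nu / alpha)             # Ra = Gr * Pr
    print(f"{label:>22}: Gr = {gr:.2e}, Ra = {ra:.2e}")
```

The six-order-of-magnitude drop in Ra is what lets convection-free transport assumptions hold in orbit, and it is the quantity a quick review of any proposed experiment would check first.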
ERIC Educational Resources Information Center
Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying
2011-01-01
Monte Carlo methods can be used in data analytic situations (e.g., validity studies) to make decisions about sample size and to estimate power. The purpose of using Monte Carlo methods in a validity study is to improve the methodological approach within a study where the primary focus is on construct validity issues and not on advancing…
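A minimal version of such a Monte Carlo power study, here for a two-sample t test with an assumed standardized effect of 0.5, might look like this:

```python
# Minimal Monte Carlo power estimate of the kind the article discusses:
# simulate many datasets under an assumed effect, run the planned test,
# and report the rejection rate. Settings are illustrative.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def power_two_sample_t(n_per_group, effect_size, n_sims=10_000,
                       alpha=0.05):
    rejections = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(effect_size, 1.0, n_per_group)   # shifted group
        _, p = stats.ttest_ind(a, b)
        rejections += p < alpha
    return rejections / n_sims

for n in (20, 40, 60, 80):
    print(f"n = {n:3d} per group: power ≈ {power_two_sample_t(n, 0.5):.3f}")
```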
Analytic posteriors for Pearson's correlation coefficient.
Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan
2018-02-01
Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.
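The paper's result is analytic; purely as a numerical illustration, the sketch below grid-approximates the posterior of rho under a flat prior, treating the margins as standard normal for simplicity and using simulated data. It is a cross-check of the idea, not the authors' derivation:

```python
# Numerical illustration only: grid-approximate the posterior of
# Pearson's rho under a uniform prior on (-1, 1), using the bivariate
# normal likelihood of a small simulated dataset with known standard
# normal margins (a simplifying assumption made here, not in the paper).

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_cov = np.array([[1.0, 0.6], [0.6, 1.0]])
data = rng.multivariate_normal([0, 0], true_cov, size=30)  # toy data

rho_grid = np.linspace(-0.99, 0.99, 397)
log_post = np.array([
    stats.multivariate_normal(mean=[0, 0],
                              cov=[[1, r], [r, 1]]).logpdf(data).sum()
    for r in rho_grid
])                                    # flat prior: posterior ∝ likelihood

dx = rho_grid[1] - rho_grid[0]
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx               # normalize on the grid

mean = (rho_grid * post).sum() * dx
print(f"Posterior mean of rho ≈ {mean:.3f}")
```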
CAA Annual Report Fiscal Year 1998.
1998-12-01
Contents include: Studies; Quick Reaction Analyses & Projects; Technology Research and Analysis Support (Technology Research; Methodology Research); Publications, Graphics, and Reproduction; Analytical Efforts Completed Between FY90 and FY98; Appendix A, Annual Study and Work Evaluation ... future. Chapter 2 highlights major studies and analysis activities which occurred in FY98. Chapter 3 is the total package of analytical summaries.
What Is the Methodologic Quality of Human Therapy Studies in ISI Surgical Publications?
Manterola, Carlos; Pineda, Viviana; Vial, Manuel; Losada, Héctor
2006-01-01
Objective: To determine the methodologic quality of therapy articles about humans published in ISI surgical journals, and to explore the association between methodologic quality, origin, and subject matter. Summary Background Data: It is supposed that ISI journals contain the best methodologic articles. Methods: This is a bibliometric study. All journals listed in the 2002 ISI under the subject heading of “Surgery” were included. Simple random sampling was conducted for selected journals (Annals of Surgery, The American Surgeon, Archives of Surgery, British Journal of Surgery, European Journal of Surgery, Journal of the American College of Surgeons, Surgery, and World Journal of Surgery). Published articles in the selected journals related to therapy in humans were reviewed and analyzed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables considered were: place of origin, design, and the methodologic quality of articles, which was determined by applying a valid and reliable scale. The review was performed interchangeably and independently by 2 research teams. Descriptive and analytical statistics were used. Statistical significance was defined as P values less than 1%. Results: A total of 653 articles were studied. Studies came predominantly from the United States and Europe (43.6% and 36.8%, respectively). The subject areas most frequently found were digestive and hepatobiliopancreatic surgery (29.1% and 24.5%, respectively). Average and median methodologic quality scores of the entire series were 11.6 ± 4.9 points and 11 points, respectively. An association between methodologic quality and journal was observed. Also, an association between methodologic quality and origin was observed, but no association with subject area was verified. Conclusions: The methodologic quality of therapy articles published in the journals analyzed is low; however, statistically significant differences among journals were observed. An association was observed between methodologic quality and origin, but not with subject matter. PMID:17060778
Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine
2017-08-01
Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This latter consideration is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has so far rarely been applied to venom QC. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dynamically analyte-responsive macrocyclic host-fluorophore systems.
Ghale, Garima; Nau, Werner M
2014-07-15
CONSPECTUS: Host-guest chemistry commenced to a large degree with the work of Pedersen, who in 1967 first reported the synthesis of crown ethers. The past 45 years have witnessed substantial progress in the field, from the design of highly selective host molecules as receptors to their application in drug delivery and, particularly, analyte sensing. Much effort has been expended on designing receptors and signaling mechanisms for detecting compounds of biological and environmental relevance. Traditionally, the design of a chemosensor comprises one component for molecular recognition, frequently macrocycles of the cyclodextrin, cucurbituril, cyclophane, or calixarene type. The second component, used for signaling, is typically an indicator dye which changes its photophysical properties, preferably its fluorescence, upon analyte binding. A variety of signal transduction mechanisms are available, of which displacement of the dye from the macrocyclic binding site is one of the simplest and most popular. This constitutes the working principle of indicator displacement assays. However, indicator displacement assays have been predominantly exploited in a static fashion, namely, to determine absolute analyte concentrations, or, by using combinations of several reporter pairs, to achieve differential sensing and, thus, identification of specific food products or brands. In contrast, their use in biological systems, for example, with membranes, cells, or with enzymes has been comparably less explored, which led us to the design of the so-called tandem assays, that is, dynamically analyte-responsive host-dye systems, in which the change in analyte concentrations is induced by a biological reaction or process. This methodological variation has practical application potential, because the ability to monitor these biochemical pathways or to follow specific molecules in real time is of paramount interest for both biochemical laboratories and the pharmaceutical industry. We will begin by describing the underlying principles that govern the use of macrocycle-fluorescent dye complexes to monitor time-dependent changes in analyte concentrations. Suitable chemosensing ensembles are introduced, along with their fluorescence responses (switch-on or switch-off). This includes supramolecular tandem assays in their product- and substrate-selective variants, and in their domino and enzyme-coupled modifications, with assays for amino acid decarboxylases, diamine and choline oxidase, proteases, methyl transferases, acetylcholinesterase (including an unpublished direct tandem assay), choline oxidase, and potato apyrase as examples. It also includes the very recently introduced tandem membrane assays in their published influx and unpublished efflux variants, with the outer membrane protein F as channel protein and protamine as bidirectionally translocated analyte. As proof-of-principle for environmental monitoring applications, we describe sensing ensembles for volatile hydrocarbons.
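The displacement principle reduces to a competitive binding equilibrium. The sketch below, with hypothetical dissociation constants, solves the host mass balance and shows the switch-off of the bound-dye signal as analyte is added:

```python
# Sketch of the equilibrium underlying an indicator displacement assay:
# analyte A competes with dye D for host H. Given total concentrations
# and dissociation constants, solve for free host and the bound-dye
# signal. All constants are hypothetical.

from scipy.optimize import brentq

def free_host(H_tot, D_tot, A_tot, Kd_D, Kd_A):
    """Solve the host mass balance for free [H] (all conc. in uM)."""
    def balance(H):
        HD = H * D_tot / (Kd_D + H)   # host-dye complex
        HA = H * A_tot / (Kd_A + H)   # host-analyte complex
        return H + HD + HA - H_tot
    return brentq(balance, 1e-12, H_tot)

H_tot, D_tot, Kd_D, Kd_A = 10.0, 10.0, 1.0, 0.2
for A_tot in (0.0, 5.0, 10.0, 20.0):
    H = free_host(H_tot, D_tot, A_tot, Kd_D, Kd_A)
    bound_dye = H * D_tot / (Kd_D + H)
    print(f"[A] = {A_tot:5.1f} uM -> bound dye {bound_dye:5.2f} uM")
```

In a tandem assay, A_tot becomes a function of time set by the enzymatic or transport process, so the same equilibrium traces the reaction in real time.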
The words we work with that work on us: clinical paradigm and cumulative relational trauma.
Heuer, Birgit
2017-11-01
This paper addresses a gap between analytic clinical theory and practice which emerges when examining the words we work with via textual and narrative research of case histories. Both subject matter and methodology fit with the remit of conceptual research in psychoanalysis, currently ranging from inductive to nomothetical approaches. Research of clinical language reveals an implicit account of human nature and the world which undergirds clinical practice. Based in the critical philosophy of the previous century, this is termed clinical paradigm. Such implicit views are induced rather than explicitly taught during analytic training, and need to be spelled out in order to become available to discourse and difference of opinion. Textual research shows these implicit pre-clinical attitudes to be inherently pessimistic and thus too similar to the views of self and others found in cumulative relational trauma. Moreover, clinical accounts tend to normalize subtly antagonistic forms of relating, recently recognised as micro-trauma. Importantly, this contravenes the agapic orientation of our theories and ethics. Paradigmatic reflection as a form of professional individuation addresses this gap. This includes a more optimistic outlook which can be traced through the philosophical implications of quantum theory. © 2017, The Society of Analytical Psychology.
NASA Astrophysics Data System (ADS)
Bencherif, H.; Djeffal, F.; Ferhati, H.
2016-09-01
This paper presents a hybrid approach based on an analytical and metaheuristic investigation to study the impact of the interdigitated electrodes engineering on both speed and optical performance of an Interdigitated Metal-Semiconductor-Metal Ultraviolet Photodetector (IMSM-UV-PD). In this context, analytical models regarding the speed and optical performance have been developed and validated by experimental results, where a good agreement has been recorded. Moreover, the developed analytical models have been used as objective functions to determine the optimized design parameters, including the interdigit configuration effect, via a Multi-Objective Genetic Algorithm (MOGA). The ultimate goal of the proposed hybrid approach is to identify the optimal design parameters associated with the maximum of electrical and optical device performance. The optimized IMSM-PD not only reveals superior performance in terms of photocurrent and response time, but also illustrates higher optical reliability against the optical losses due to the active area shadowing effects. The advantages offered by the proposed design methodology suggest the possibility to overcome the most challenging problem with the communication speed and power requirements of the UV optical interconnect: high derived current and commutation speed in the UV receiver.
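Independent of the paper's specific models, the multi-objective selection step can be illustrated with a toy Pareto search. The two objective functions below are invented for illustration and do not reproduce the paper's analytical models of the IMSM-UV-PD:

```python
# Toy illustration of the multi-objective search used above: sample
# hypothetical interdigit geometries, score two competing objectives,
# and keep the Pareto-optimal set. The objective functions are made up
# for illustration and do not reproduce the paper's device models.

import random

random.seed(0)

def objectives(width_um: float, gap_um: float):
    """Return (response_time, -photocurrent): both to be minimized."""
    transit = gap_um ** 2 / 50.0                  # toy carrier transit time
    shadow = width_um / (width_um + gap_um)       # toy shadowing loss
    photocurrent = (1.0 - shadow) / (1.0 + gap_um / 10.0)
    return transit, -photocurrent

designs = [(random.uniform(0.5, 5.0), random.uniform(0.5, 10.0))
           for _ in range(500)]
scored = [(objectives(w, g), (w, g)) for w, g in designs]

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

pareto = [d for s, d in scored
          if not any(dominates(s2, s) for s2, _ in scored)]
print(f"{len(pareto)} Pareto-optimal designs, e.g.:")
for w, g in pareto[:5]:
    print(f"  width = {w:.2f} um, gap = {g:.2f} um")
```

A genetic algorithm such as the MOGA used in the paper replaces the random sampling with selection, crossover, and mutation, but the dominance test that defines the trade-off front is the same.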
Modeling and control of flexible space platforms with articulated payloads
NASA Technical Reports Server (NTRS)
Graves, Philip C.; Joshi, Suresh M.
1989-01-01
The first steps in developing a methodology for spacecraft control-structure interaction (CSI) optimization are identification and classification of anticipated missions, and the development of tractable mathematical models in each mission class. A mathematical model of a generic large flexible space platform (LFSP) with multiple independently pointed rigid payloads is considered. The objective is not to develop a general purpose numerical simulation, but rather to develop an analytically tractable mathematical model of such composite systems. The equations of motion for a single payload case are derived, and are linearized about zero steady-state. The resulting model is then extended to include multiple rigid payloads, yielding the desired analytical form. The mathematical models developed clearly show the internal inertial/elastic couplings, and are therefore suitable for analytical and numerical studies. A simple decentralized control law is proposed for fine pointing the payloads and LFSP attitude control, and simulation results are presented for an example problem. The decentralized controller is shown to be adequate for the example problem chosen, but does not, in general, guarantee stability. A centralized dissipative controller is then proposed, requiring a symmetric form of the composite system equations. Such a controller guarantees robust closed loop stability despite unmodeled elastic dynamics and parameter uncertainties.
Guimarães, Geovani Pereira; Santos, Ravely Lucena; Júnior, Fernando José de Lima Ramos; da Silva, Karla Monik Alves; de Souza, Fabio Santos
2016-01-01
Momordica charantia is a species cultivated throughout the world and widely used in folk medicine, and its medicinal benefits are well documented, especially its pharmacological properties, including antimicrobial activities. Analytical methods have been used to aid in the characterization of compounds derived from plant drug extracts and their products. This paper developed a methodological model to evaluate the integrity of the vegetable drug M. charantia in different particle sizes, using different analytical methods. M. charantia was collected in the semiarid region of Paraíba, Brazil. The herbal medicine raw material derived from the leaves and fruits in different particle sizes was analyzed using thermoanalytical techniques such as thermogravimetry (TG) and differential thermal analysis (DTA), pyrolysis coupled to gas chromatography/mass spectrometry (PYR-GC/MS), and nuclear magnetic resonance (1H NMR), in addition to the determination of antimicrobial activity. Differences in particle surface area among the samples were distinguished by these techniques. DTA and TG were used for assessing thermal and kinetic parameters, and PYR-GC/MS was used for chromatographic identification of degradation products through the pyrograms. The infusions obtained from the fruit and leaves of Momordica charantia presented antimicrobial activity. PMID:27579215
Anderson, Craig A; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L; Bushman, Brad J; Sakamoto, Akira; Rothstein, Hannah R; Saleem, Muniba
2010-03-01
Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past meta-analyses; (b) cross-cultural comparisons; (c) longitudinal studies for all outcomes except physiological arousal; (d) conservative statistical controls; (e) multiple moderator analyses; and (f) sensitivity analyses. Social-cognitive models and cultural differences between Japan and Western countries were used to generate theory-based predictions. Meta-analyses yielded significant effects for all 6 outcome variables. The pattern of results for different outcomes and research designs (experimental, cross-sectional, longitudinal) fit theoretical predictions well. The evidence strongly suggests that exposure to violent video games is a causal risk factor for increased aggressive behavior, aggressive cognition, and aggressive affect and for decreased empathy and prosocial behavior. Moderator analyses revealed significant research design effects, weak evidence of cultural differences in susceptibility and type of measurement effects, and no evidence of sex differences in susceptibility. Results of various sensitivity analyses revealed these effects to be robust, with little evidence of selection (publication) bias.
ElMekawy, A; Hegab, H M; Pant, D; Saint, C P
2018-01-01
Globally, sustainable provision of high-quality safe water is a major challenge of the 21st century. Various chemical and biological monitoring techniques are presently utilized to guarantee the availability of high-quality water. However, these techniques still face some challenges, including high costs, complex design, and onsite and online limitations. The recent technology of using microbial fuel cell (MFC)-based biosensors holds outstanding potential for the rapid and real-time monitoring of water source quality. MFCs have the advantages of simplicity in design and efficiency for onsite sensing. Even though some sensing applications of MFCs were previously studied, e.g. as biochemical oxygen demand sensors, numerous research groups around the world have recently presented new practical applications of this technique, which combine multidisciplinary scientific knowledge in the materials science, microbiology and electrochemistry fields. This review presents the most updated research on the utilization of MFCs as potential biosensors for monitoring water quality and considers the range of potentially toxic analytes that have so far been detected using this methodology. The advantages of MFCs over established technology are also considered, as well as future work required to establish their routine use. © 2017 The Society for Applied Microbiology.
The riddle of Siegfried: exploring methods and psychological perspectives in analytical psychology.
Barreto, Marco Heleno
2016-02-01
Jung's dream of the killing of Siegfried poses a riddle: why did the unconscious choose precisely Siegfried as the hero to be murdered? Jung himself declares that he does not know. This paper attempts to decipher this riddle using three distinct methodological approaches accepted by Jung, two of them in fact grounded in his theories of dream interpretation. Besides presenting some possible answers to the riddle of Siegfried, this interpretative reflection brings to light the discrepancy of the psychological perspectives created by the heterogeneity of methods within analytical psychology. © 2016, The Society of Analytical Psychology.
Hand, Rosa K; Perzynski, Adam T
2016-09-01
Retrospective self-reported data have limitations, making it important to evaluate alternative forms of measurement for nutrition behaviors. Ecological momentary assessment (EMA) attempts to overcome the challenges of recalled data with real-time data collection in a subject's natural environment, often leveraging technology. This perspective piece 1) introduces the concepts and terminology of EMA, 2) provides an overview of the methodological and analytical considerations, 3) gives examples of past research using EMA, and 4) suggests new opportunities (including combining assessment and intervention) and limitations (including the need for technology) for the application of EMA to research and practice regarding nutrition behaviors. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Extending religion-health research to secular minorities: issues and concerns.
Hwang, Karen; Hammer, Joseph H; Cragun, Ryan T
2011-09-01
Claims about religion's beneficial effects on physical and psychological health have received substantial attention in popular media, but empirical support for these claims is mixed. Many of these claims are tenuous because they fail to address basic methodological issues relating to construct validity, sampling methods or analytical problems. A more conceptual problem has to do with the near universal lack of atheist control samples. While many studies include samples of individuals classified as "low spirituality" or religious "nones", these groups are heterogeneous and contain only a fraction of members who would be considered truly secular. We illustrate the importance of including an atheist control group whenever possible in the religiosity/spirituality and health research and discuss areas for further investigation.
Nuclear Forensics: A Methodology Applicable to Nuclear Security and to Non-Proliferation
NASA Astrophysics Data System (ADS)
Mayer, K.; Wallenius, M.; Lützenkirchen, K.; Galy, J.; Varga, Z.; Erdmann, N.; Buda, R.; Kratz, J.-V.; Trautmann, N.; Fifield, K.
2011-09-01
Nuclear Security aims at the prevention and detection of, and response to, theft, sabotage, unauthorized access, illegal transfer or other malicious acts involving nuclear material. Nuclear Forensics is a key element of nuclear security. Nuclear Forensics is defined as a methodology that aims at re-establishing the history of nuclear material of unknown origin. It is based on indicators that arise from known relationships between material characteristics and process history. Thus, nuclear forensics analysis includes the characterization of the material and correlation with production history. To this end, we can make use of parameters such as the isotopic composition of the nuclear material and accompanying elements, chemical impurities, macroscopic appearance and microstructure of the material. In the present paper, we discuss the opportunities for attribution of nuclear material offered by nuclear forensics as well as its limitations. Particular attention will be given to the role of nuclear reactions. Such reactions include the radioactive decay of the nuclear material, but also reactions with neutrons. When uranium (of natural composition) is exposed to neutrons, plutonium is formed, as well as 236U. We will illustrate the methodology using the example of a piece of uranium metal that dates back to the German nuclear program in the 1940s. A combination of different analytical techniques and model calculations enables a nuclear forensics interpretation, thus correlating the material characteristics with the production history.
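A minimal, hypothetical sketch of the decay-based reasoning the abstract describes: one standard chronometer infers the time since last chemical purification of plutonium from the ingrowth of Am-241 produced by Pu-241 decay. The half-lives are standard literature values; the closed form assumes no Am-241 at separation and a measured atom ratio.

```python
# Illustrative nuclear-forensics "chronometer": sample age from the
# Bateman ingrowth of Am-241 out of Pu-241 decay, assuming complete
# Am removal at separation.  R is the atom ratio N(Am241)/N(Pu241).
import math

T_PU241 = 14.35    # Pu-241 half-life, years
T_AM241 = 432.6    # Am-241 half-life, years

def age_from_ratio(R):
    lp = math.log(2) / T_PU241    # parent decay constant
    ld = math.log(2) / T_AM241    # daughter decay constant
    # Bateman ingrowth, R(t) = lp/(ld-lp) * (1 - exp((lp-ld)*t)),
    # inverted for t:
    return math.log(1.0 + R * (lp - ld) / lp) / (lp - ld)

print(f"measured Am/Pu atom ratio 0.05 -> age {age_from_ratio(0.05):.1f} y")
```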
Hybrid perturbation methods based on statistical time series models
NASA Astrophysics Data System (ADS)
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators, formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
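The prediction half of such a hybrid propagator can be illustrated with a plain additive Holt-Winters filter applied to the residuals left by the analytical theory. This is a sketch under assumed smoothing constants and seasonal period; the synthetic residual series below merely stands in for real propagation errors.

```python
# Additive Holt-Winters (level + trend + seasonal) fitted to propagator
# residuals, then used to forecast corrections to the analytical theory.
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=10):
    level, trend = y[0], y[1] - y[0]
    season = list(y[:m] - y[:m].mean())        # crude seasonal initialisation
    for t in range(m, len(y)):
        s_old = season[t - m]
        level_old = level
        level = alpha * (y[t] - s_old) + (1 - alpha) * (level + trend)
        trend = beta * (level - level_old) + (1 - beta) * trend
        season.append(gamma * (y[t] - level) + (1 - gamma) * s_old)
    n = len(y)
    return np.array([level + (h + 1) * trend + season[n - m + (h % m)]
                     for h in range(horizon)])

# Synthetic residuals: slow drift plus a once-per-revolution oscillation.
t = np.arange(200)
resid = 0.002 * t + 0.5 * np.sin(2 * np.pi * t / 20) + 0.05 * np.random.randn(200)
correction = holt_winters_additive(resid, m=20)
print("next-10-step residual forecast:", np.round(correction, 3))
```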
Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz
2017-01-15
Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several biological properties. Functional food research indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high performance liquid chromatography with ultraviolet detection (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,3-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function presented an optimal value for 600 μL of chloroform as extraction solvent using acetonitrile as dispersant. The method proved to be reliable, precise and accurate. It was successfully applied to determine OSCs in cooked garlic samples as well as blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.
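Two standard DLLME figures of merit can be computed directly from the phase volumes and concentrations; the helper below uses made-up concentrations alongside the 600 μL extraction-solvent volume mentioned in the abstract, purely for illustration.

```python
# Hedged helper for two conventional DLLME metrics (values invented,
# not taken from the paper): enrichment factor and extraction recovery.
def dllme_figures(c_sed, c_0, v_sed_uL, v_aq_mL):
    """c_sed: analyte conc. in the sedimented organic phase,
    c_0: initial conc. in the aqueous sample (same units)."""
    ef = c_sed / c_0                                  # enrichment factor
    er = ef * (v_sed_uL * 1e-3) / v_aq_mL * 100.0     # extraction recovery, %
    return ef, er

ef, er = dllme_figures(c_sed=3.75, c_0=0.5, v_sed_uL=600.0, v_aq_mL=5.0)
print(f"EF = {ef:.1f}, recovery = {er:.0f}%")
```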
Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses
Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.
2010-01-01
Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
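The photon/electron bookkeeping the paper describes reduces, in its simplest form, to a multiplicative efficiency chain from emitted fluorescence photons to detected photoelectrons. The sketch below uses assumed, illustrative efficiencies (not the authors' values) and shows how an absolute calibration curve follows from the chain.

```python
# Multiplicative efficiency chain from emitted photons to detected
# photoelectrons; all efficiencies are assumed illustrations.
N_A = 6.022e23

def photoelectrons(conc_molar, vol_L, photons_per_molecule,
                   collection=0.02,   # fraction of 4*pi sr collected
                   t_optics=0.85,     # lens/window transmission
                   t_filter=0.50,     # emission filter transmission
                   qe=0.60):          # detector quantum efficiency
    n_molecules = conc_molar * vol_L * N_A
    return (n_molecules * photons_per_molecule
            * collection * t_optics * t_filter * qe)

# Calibration curve: detected signal vs concentration; it is linear by
# construction here, the idealisation behind absolute calibration curves.
for c in [1e-9, 1e-8, 1e-7, 1e-6]:
    print(f"{c:8.0e} M -> {photoelectrons(c, 1e-6, 100):.3e} e-")
```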
Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark
2011-01-01
Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
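One technique within the guide's scope, propagating parameter uncertainty by Monte Carlo (probabilistic sensitivity analysis), can be sketched for a hypothetical two-strategy model; the distributions and willingness-to-pay threshold below are invented for illustration.

```python
# Probabilistic sensitivity analysis for an invented two-strategy
# decision model: sample parameters, compute incremental net monetary
# benefit (INMB), report the probability of being cost-effective.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
wtp = 30_000                              # willingness to pay per QALY (assumed)

eff_gain = rng.beta(8, 92, n)             # incremental QALYs, mean ~0.08
extra_cost = rng.gamma(16.0, 100.0, n)    # incremental cost, mean ~1600

inmb = wtp * eff_gain - extra_cost        # incremental net monetary benefit
print(f"mean INMB = {inmb.mean():.0f}")
print(f"P(cost-effective at WTP {wtp}) = {(inmb > 0).mean():.2%}")
```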
Bámaca-Colbert, Mayra Y; Gayles, Jochebed G
2010-11-01
The overall aim of the current study was to identify the methodological approach and corresponding analytic procedure that best elucidated the associations among Mexican-origin mother-daughter cultural orientation dissonance, family functioning, and adolescent adjustment. To do so, we employed, and compared, two methodological approaches (i.e., variable-centered and person-centered) via four analytic procedures (i.e., difference score, interactive, matched/mismatched grouping, and latent profiles). The sample consisted of 319 girls in the 7th or 10th grade and their mother or mother figure from a large Southwestern metropolitan area in the US. Family factors were found to be important predictors of adolescent adjustment in all models. Although some findings were similar across all models, overall, findings suggested that the latent profile procedure best elucidated the associations among the variables examined in this study. In addition, associations were present across early and middle adolescents, with a few findings being only present for one group. Implications for using these analytic procedures in studying cultural and family processes are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pool, K.H.; Evans, J.C.; Olsen, K.B.
1997-08-01
This report presents the results from analyses of samples taken from the headspace of waste storage tank 241-S-102 (Tank S-102) at the Hanford Site in Washington State. Tank headspace samples collected by SGN Eurisys Service Corporation (SESC) were analyzed by Pacific Northwest National Laboratory (PNNL) to determine headspace concentrations of selected non-radioactive analytes. Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Vapor concentrations from sorbent trap samples are based on measured sample volumes provided by SESC. Ammonia was determined to be above the immediate notification limit of 150 ppm as specified by the sampling and analysis plan (SAP). Hydrogen was the principal flammable constituent of the Tank S-102 headspace, determined to be present at approximately 2.410% of its lower flammability limit (LFL). Total headspace flammability was estimated to be <2.973% of the LFL. Average measured concentrations of targeted gases, inorganic vapors, and selected organic vapors are provided in Table S.1. A summary of experimental methods, including sampling methodology, analytical procedures, and quality assurance and control methods, is presented in Section 2.0. Detailed descriptions of the analytical results are provided in Section 3.0.
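The "% of LFL" figures in reports like this follow standard flammability bookkeeping: each constituent contributes its concentration divided by its lower flammability limit, and the contributions add (Le Chatelier's rule). The sketch below uses the standard 4 vol% hydrogen LFL and illustrative concentrations, not the tank data.

```python
# Le Chatelier flammability bookkeeping: total flammability as percent
# of the lower flammability limit (LFL).  Concentrations are invented;
# LFLs are standard literature values (H2 ~4 vol%, NH3 ~15 vol%).
def percent_of_lfl(concs_ppm, lfls_ppm):
    return 100.0 * sum(c / l for c, l in zip(concs_ppm, lfls_ppm))

# hydrogen at 960 ppm against a 40,000 ppm (4 vol%) LFL -> 2.4% of LFL
h2 = percent_of_lfl([960.0], [40_000.0])
# adding ammonia (LFL ~150,000 ppm) raises the total only slightly
total = percent_of_lfl([960.0, 200.0], [40_000.0, 150_000.0])
print(f"H2 alone: {h2:.2f}% of LFL, mixture: {total:.2f}% of LFL")
```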
Industrial Demand Module - NEMS Documentation
2014-01-01
Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Module. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code.
Botero-Coy, A M; Ibáñez, M; Sancho, J V; Hernández, F
2013-05-31
The determination of glyphosate (GLY) in soils is of great interest due to the widespread use of this herbicide and the need to assess its impact on the soil/water environment. However, its residue determination is very problematic, especially in soils with high organic matter content, where strong interferences are normally observed, and because of the particular physico-chemical characteristics of this polar/ionic herbicide. In the present work, we have improved previously reported LC-MS/MS analytical methodology for GLY and its main metabolite AMPA so that it can be applied to "difficult" soils, like those commonly found in South America, where this herbicide is extensively used in large areas devoted to soya or maize, among other crops. The method is based on derivatization with FMOC followed by LC-MS/MS analysis, using triple quadrupole. After extraction with potassium hydroxide, a combination of extract dilution, adjustment to appropriate pH, and solid phase extraction (SPE) clean-up was applied to minimize the strong interferences observed. Despite the clean-up performed, the use of isotope labelled glyphosate as internal standard (ILIS) was necessary for the correction of matrix effects and to compensate for any error occurring during sample processing. The analytical methodology was satisfactorily validated in four soils from Colombia and Argentina fortified at 0.5 and 5 mg/kg. In contrast to most LC-MS/MS methods, where the acquisition of two transitions is recommended, monitoring all available transitions was required for confirmation of positive samples, as some of them were affected by interferences from unknown soil components. This was observed not only for GLY and AMPA but also for the ILIS. Analysis by QTOF MS was useful to confirm the presence of interfering compounds that shared the same nominal mass as the analytes, as well as some of their main product ions. Therefore, the selection of specific transitions was crucial to avoid interferences. The methodology developed was applied to the analysis of 26 soils from different areas of Colombia and Argentina, and the method robustness was demonstrated by analysis of quality control samples over 4 months. Copyright © 2012 Elsevier B.V. All rights reserved.
Considerations for observational research using large data sets in radiation oncology.
Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B
2014-09-01
The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold substantial promise for advancing our understanding of many unanswered questions of importance to the field of radiation oncology. Copyright © 2014 Elsevier Inc. All rights reserved.
Analysis and Purification of Bioactive Natural Products: The AnaPurNa Study
2012-01-01
Based on a meta-analysis of data mined from almost 2000 publications on bioactive natural products (NPs) from >80 000 pages of 13 different journals published in 1998–1999, 2004–2005, and 2009–2010, the aim of this systematic review is to provide both a survey of the status quo and a perspective for analytical methodology used for isolation and purity assessment of bioactive NPs. The study provides numerical measures of the common means of sourcing NPs, the chromatographic methodology employed for NP purification, and the role of spectroscopy and purity assessment in NP characterization. A link is proposed between the observed use of various analytical methodologies, the challenges posed by the complexity of metabolomes, and the inescapable residual complexity of purified NPs and their biological assessment. The data provide inspiration for the development of innovative methods for NP analysis as a means of advancing the role of naturally occurring compounds as a viable source of biologically active agents with relevance for human health and global benefit. PMID:22620854
Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P
2008-05-20
Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that will provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FIDs) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficient was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. Centralisation of the analyses in a single laboratory is therefore no longer a required condition for comparing samples seized in different countries. This allows collaboration while retaining jurisdictional control over data.
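The winning combination reported, the N+S pre-treatment followed by a cosine or Pearson comparison, is easy to state in code; the peak-area matrix below is synthetic, not seizure data.

```python
# N+S pre-treatment (each peak area divided by that peak's standard
# deviation over the data set) followed by cosine and Pearson
# comparisons of two chromatographic profiles.
import numpy as np

profiles = np.array([[120.0, 40.0, 5.0, 300.0],
                     [118.0, 42.0, 6.0, 290.0],
                     [ 60.0, 90.0, 1.0, 150.0]])   # rows: samples, cols: peaks

weighted = profiles / profiles.std(axis=0)          # N+S pre-treatment

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

print(f"linked pair   cos={cosine(weighted[0], weighted[1]):.4f}  "
      f"r={pearson(weighted[0], weighted[1]):.4f}")
print(f"unlinked pair cos={cosine(weighted[0], weighted[2]):.4f}  "
      f"r={pearson(weighted[0], weighted[2]):.4f}")
```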
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Baker, David R; Kasprzyk-Hordern, Barbara
2011-11-04
The main aim of this manuscript is to provide a comprehensive and critical verification of the methodology commonly used for sample collection, storage and preparation in studies concerning the analysis of pharmaceuticals and illicit drugs in aqueous environmental samples using SPE-LC/MS techniques. This manuscript reports the results of investigations into several sample preparation parameters that, to the authors' knowledge, have not been reported or have received very little attention. These include: (i) the effect of evaporation temperature and (ii) of evaporation solvent on solid phase extraction (SPE) extracts; (iii) the effect of silanising glassware; (iv) the recovery of analytes during vacuum filtration through glass fibre filters and (v) through pre-LC-MS filter membranes. All of these parameters are vital for developing efficient and reliable extraction techniques; an essential factor given that target drug residues are often present in the aqueous environment at ng L⁻¹ levels. Also presented is the first comprehensive review of the stability of illicit drugs and pharmaceuticals in wastewater. Among the parameters studied are: time of storage, temperature and pH. Over 60 analytes were targeted, including stimulants, opioid and morphine derivatives, benzodiazepines, antidepressants, dissociative anaesthetics, drug precursors, human urine indicators and their metabolites. The lack of stability of analytes in raw wastewater was found to be significant for many compounds. For instance, 34% of the compounds studied showed a stability change >15% after only 12 h in raw wastewater stored at 2 °C; a very important finding given that wastewater is typically collected with the use of 24 h composite samplers. The stability of these compounds is also critical given the recent development of so-called 'sewage forensics' or 'sewage epidemiology', in which concentrations of target drug residues in wastewater are used to back-calculate drug consumption. Without an understanding of stability, under- (or over-) reporting of consumption estimations may take place. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Rappleye, Devin Spencer
The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures, with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even in the presence of an additional signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies.
Lu, Dasheng; Jin, Yu'e; Feng, Chao; Wang, Dongli; Lin, Yuanjie; Qiu, Xinlei; Xu, Qian; Wen, Yimin; She, Jianwen; Wang, Guoquan; Zhou, Zhijun
2017-09-01
Commonly, analytical methods measuring brominated flame retardants (BFRs) of different chemical polarities in human serum are labor-intensive and tedious. Our study used acidified diatomaceous earth as solid-phase extraction (SPE) adsorbent and defatting material to simultaneously determine the most abundant BFRs and their metabolites with different polarities in human serum samples. The analytes include three types of commercial BFRs, tetrabromobisphenol A (TBBPA), hexabromocyclododecane (HBCD) isomers, and polybrominated diphenyl ethers (PBDEs), and the dominant hydroxylated BDE (OH-PBDE) and methoxylated BDE (MeO-PBDE) metabolites of PBDEs. The sample eluents were sequentially analyzed for PBDEs and MeO-BDEs by online gel permeation chromatography/gas chromatography-electron capture negative ionization mass spectrometry (online GPC GC-ECNI-MS) and for TBBPA, HBCD, and OH-BDEs by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Method recoveries were 67-134% with a relative standard deviation (RSD) of less than 20%. Method detection limits (MDLs) were 0.30-4.20 pg/mL fresh weight (f.w.) for all analytes, except for BDE-209 at 16 pg/mL f.w. The methodology was also applied in a pilot study, which analyzed ten real samples from healthy donors in China; the majority of target analytes were detected, with detection rates of more than 80%. To our knowledge, this is the first method to effectively determine most types of BFRs in a single aliquot of human serum. This new analytical method is more specific, sensitive, accurate, and time saving for routine biomonitoring of these BFRs and for integrated assessment of the health risk of BFR exposure.
Code of Federal Regulations, 2013 CFR
2013-01-01
... performance review in order to regularly spot check and assess that analytical or test data produced by each... export to the European Community (EC). (g) The granting of acceptance of standardized methodology or new...
Code of Federal Regulations, 2010 CFR
2010-01-01
... performance review in order to regularly spot check and assess that analytical or test data produced by each... export to the European Community (EC). (g) The granting of acceptance of standardized methodology or new...
Code of Federal Regulations, 2012 CFR
2012-01-01
... performance review in order to regularly spot check and assess that analytical or test data produced by each... export to the European Community (EC). (g) The granting of acceptance of standardized methodology or new...
Code of Federal Regulations, 2011 CFR
2011-01-01
... performance review in order to regularly spot check and assess that analytical or test data produced by each... export to the European Community (EC). (g) The granting of acceptance of standardized methodology or new...
Code of Federal Regulations, 2014 CFR
2014-01-01
... performance review in order to regularly spot check and assess that analytical or test data produced by each... export to the European Community (EC). (g) The granting of acceptance of standardized methodology or new...
Spatial and temporal epidemiological analysis in the Big Data era.
Pfeiffer, Dirk U; Stevens, Kim B
2015-11-01
Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and emergence of new infectious pathogens, have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and should take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. This includes relational databases, geographical information systems and most recently, cloud-based data storage such as Hadoop distributed file systems. While the development in analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis where the spectrum of analytical methods ranges from visualisation and exploratory analysis, to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle large datasets faster than classical regression approaches, are now also used to analyse spatial and spatio-temporal data. Multi-criteria decision analysis methods have gained greater acceptance, due in part, to the need to increasingly combine data from diverse sources including published scientific information and expert opinion in an attempt to fill important knowledge gaps. The opportunities for more effective prevention, detection and control of animal health threats arising from these developments are immense, but not without risks given the different types, and much higher frequency, of biases associated with these data. Copyright © 2015 Elsevier B.V. All rights reserved.
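One of the newer approaches the review mentions, machine learning regression on spatio-temporal data, can be sketched with a random forest on synthetic (x, y, week) records; this illustrates the generic approach rather than any particular study's pipeline, and it assumes scikit-learn is available.

```python
# Random forest regression on a synthetic spatio-temporal risk surface:
# a spatial hotspot plus a seasonal term, learned from (x, y, week).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2_000
X = np.column_stack([rng.uniform(0, 100, n),    # easting
                     rng.uniform(0, 100, n),    # northing
                     rng.integers(1, 53, n)])   # week of year
y = (np.exp(-((X[:, 0] - 60)**2 + (X[:, 1] - 40)**2) / 500)   # hotspot
     + 0.3 * np.sin(2 * np.pi * X[:, 2] / 52)                 # seasonality
     + 0.05 * rng.normal(size=n))                             # noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out R^2 = {model.score(X_te, y_te):.2f}")
```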
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches, based on different techniques to study the chemical composition of a food at the molecular level, may allow a 'food fingerprint' to be defined, valuable for assessing the nutritional value, safety and quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we will explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies to produce progress in our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review will be to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette
2018-05-10
Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.
Jensen, Eric Allen
2017-01-01
With the rapid global proliferation of social media, there has been growing interest in using this existing source of easily accessible 'big data' to develop social science knowledge. However, amidst the big data gold rush, it is important that long-established principles of good social research are not ignored. This article critically evaluates Mitchell et al.'s (2013) study, 'The Geography of Happiness: Connecting Twitter Sentiment and Expression, Demographics, and Objective Characteristics of Place', demonstrating the importance of attending to key methodological issues associated with secondary data analysis.
ADM1-based methodology for the characterisation of the influent sludge in anaerobic reactors.
Huete, E; de Gracia, M; Ayesa, E; Garcia-Heras, J L
2006-01-01
This paper presents a systematic methodology to characterise the influent sludge in terms of the ADM1 components from the experimental measurements traditionally used in wastewater engineering. For this purpose, a complete characterisation of the model components in their elemental mass fractions and charge has been used, providing a rigorous mass balance for all the process transformations and enabling future connection with other unit-process models. It also makes possible the application of mathematical algorithms for the optimal characterisation of several components poorly defined in the ADM1 report. Additionally, decay and disintegration have necessarily been uncoupled, so that decay proceeds directly to hydrolysis instead of producing intermediate composites. The proposed methodology has been applied to the particular experimental work of a pilot-scale CSTR treating real sewage sludge, a mixture of primary and secondary sludge. The results obtained have shown a good characterisation of the influent, reflected in good model predictions. However, the limitations of the methodology in predicting alkalinity and carbon percentages in biogas suggest that the elemental characterisation of the process in terms of carbon should be included in the analytical program.
Control-Structure-Interaction (CSI) technologies and trends to future NASA missions
NASA Technical Reports Server (NTRS)
1990-01-01
Control-structure-interaction (CSI) issues which are relevant for future NASA missions are reviewed. This goal was achieved by: (1) reviewing large space structures (LSS) technologies to provide a background and survey of the current state of the art (SOA); (2) analytically studying a focus mission to identify opportunities where CSI technology may be applied to enhance or enable future NASA spacecraft; and (3) expanding a portion of the focus mission, the large antenna, to provide in-depth trade studies, scaling laws, and methodologies which may be applied to other NASA missions. Several sections are presented. Section 1 defines CSI issues and presents an overview of the relevant modeling and control issues for LSS. Section 2 presents the results of the three phases of the CSI study. Section 2.1 gives the results of a CSI study conducted with the Geostationary Platform (Geoplat) as the focus mission. Section 2.2 contains an overview of the CSI control design methodology available in the technical community. Included is a survey of the CSI ground-based experiments which were conducted to verify theoretical performance predictions. Section 2.3 presents and demonstrates a new CSI scaling law methodology for assessing potential CSI with large antenna systems.
Hartz, Susanne; John, Jürgen
2008-01-01
Economic evaluation as an integral part of health technology assessment is today mostly applied to established technologies. Evaluating healthcare innovations in their early stages of development has recently attracted attention. Although it offers several benefits, it also holds methodological challenges. The aim of our study was to investigate the possible contributions of economic evaluation to industry's decision making early in product development and to confront the results with the actual use of early data in economic assessments. We conducted a literature search to detect methodological contributions as well as economic evaluations that used data from early phases of product development. Economic analysis can be beneficially used in early phases of product development for various purposes, including early market assessment, R&D portfolio management, and first estimations of pricing and reimbursement scenarios. Analytical tools available for these purposes have been identified. Numerous empirical works were detected, but most do not disclose any concrete decision context and could not be directly matched with the suggested applications. Industry can benefit in several ways from starting economic evaluation early in product development. Empirical evidence suggests that there is still potential left unused.
Zacharis, Constantinos K; Vastardi, Elli
2018-02-20
In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision-making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between −1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and were found to be 0.02% (corresponding to 48 μg g⁻¹ in sample) for both methyl and isopropyl p-toluenesulfonate. As proof of concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
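The desirability step used to locate the optimum can be sketched with Derringer-type desirability functions: each response is mapped to [0, 1] and the overall desirability is their geometric mean. The response ranges and candidate conditions below are invented, not the validated method's values.

```python
# Derringer-type desirability functions combining two responses
# (separation quality to maximize, run time to minimize).
import numpy as np

def d_larger_is_better(y, lo, hi):     # e.g. resolution between peak pairs
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

def d_smaller_is_better(y, lo, hi):    # e.g. analysis time
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

def overall(ds):
    return float(np.prod(ds) ** (1.0 / len(ds)))   # geometric mean

# Two hypothetical gradient conditions: (resolution, run time in minutes)
for name, (res, t) in {"A": (2.4, 14.0), "B": (1.9, 9.0)}.items():
    D = overall([d_larger_is_better(res, 1.5, 3.0),
                 d_smaller_is_better(t, 8.0, 20.0)])
    print(f"condition {name}: D = {D:.3f}")
```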
Analytical method of waste allocation in waste management systems: Concept, method and case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergeron, Francis C., E-mail: francis.b.c@videotron.ca
Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to be exploited, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the "analytical method of the waste allocation process" (AMWAP), based on the concept of the "waste allocation process", defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP used to develop the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland), demonstrating that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.
Fleet management performance monitoring.
DOT National Transportation Integrated Search
2013-05-01
The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...
Validation of urban freeway models.
DOT National Transportation Integrated Search
2015-01-01
This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...
77 FR 25678 - International Trade Administration
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-01
... dumping or a countervailable subsidy (as the case may be) and of material injury. Upcoming Sunset Reviews... of Sunset Reviews are set forth in 19 CFR 351.218. Guidance on methodological or analytical issues...
International Natural Gas Model 2011, Model Documentation Report
2013-01-01
This report documents the objectives, analytical approach and development of the International Natural Gas Model (INGM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
Bauer, Talya N; Bodner, Todd; Erdogan, Berrin; Truxillo, Donald M; Tucker, Jennifer S
2007-05-01
The authors tested a model of antecedents and outcomes of newcomer adjustment using 70 unique samples of newcomers with meta-analytic and path modeling techniques. Specifically, they proposed and tested a model in which adjustment (role clarity, self-efficacy, and social acceptance) mediated the effects of organizational socialization tactics and information seeking on socialization outcomes (job satisfaction, organizational commitment, job performance, intentions to remain, and turnover). The results generally supported this model. In addition, the authors examined the moderating effects of methodology on these relationships by coding for 3 methodological issues: data collection type (longitudinal vs. cross-sectional), sample characteristics (school-to-work vs. work-to-work transitions), and measurement of the antecedents (facet vs. composite measurement). Discussion focuses on the implications of the findings and suggestions for future research. 2007 APA, all rights reserved
Evaluation of a handheld point-of-care analyser for measurement of creatinine in cats.
Reeve, Jenny; Warman, Sheena; Lewis, Daniel; Watson, Natalie; Papasouliotis, Kostas
2017-02-01
Objectives The aim of the study was to evaluate whether a handheld creatinine analyser (StatSensor Xpress; SSXp), available for human patients, can be used to measure creatinine reliably in cats. Methods Analytical performance was evaluated by determining within- and between-run coefficients of variation (CV, %), total error observed (TEobs, %) and sigma metrics. Fifty client-owned cats presenting for investigation of clinical disease had creatinine measured simultaneously using the SSXp (whole blood and plasma) and a reference instrument (Konelab, serum); 48 paired samples were included in the study. Creatinine correlation between methodologies (SSXp vs Konelab) and sample types (SSXp whole blood vs SSXp plasma) was assessed by Spearman's correlation coefficient, and agreement was determined using Bland-Altman difference plots. Each creatinine value was assigned an IRIS stage (1-4); correlation and agreement between Konelab and SSXp IRIS stages were evaluated. Results Within-run CV (4.23-8.85%), between-run CV (8.95-11.72%), TEobs (22.15-34.92%) and sigma metrics (≤3) did not meet desired analytical requirements. Correlation between sample types was high (SSXp whole blood vs SSXp plasma; r = 0.89), and between instruments was high (SSXp whole blood vs Konelab serum; r = 0.85) to very high (SSXp plasma vs Konelab serum; r = 0.91). Konelab and SSXp whole blood IRIS scores exhibited high correlation (r = 0.76). Packed cell volume did not significantly affect SSXp determination of creatinine. Bland-Altman difference plots identified a positive bias for the SSXp (7.13 μmol/l for SSXp whole blood; 20.23 μmol/l for SSXp plasma) compared with the Konelab. Outliers (1/48 whole blood; 2/48 plasma) occurred exclusively at very high creatinine concentrations. The SSXp failed to identify 2/21 azotaemic cats. Conclusions and relevance The analytical performance of the SSXp in feline patients is not considered acceptable. The SSXp exhibited high to very high correlation compared with the reference methodology, but the two instruments cannot be used interchangeably. Improvements in the SSXp analytical performance are needed before its use can be recommended in feline clinical practice.
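The quality metrics cited in the Results, total error observed and the sigma metric, are simple functions of bias and imprecision; the sketch below assumes an allowable total error (TEa) and the common 1.65 multiplier, neither of which is taken from the paper.

```python
# Conventional method-validation metrics: observed total error and the
# sigma metric.  TEa and the 1.65 multiplier are common conventions,
# assumed here for illustration; the numbers are invented.
def total_error_observed(bias_pct, cv_pct, k=1.65):
    return abs(bias_pct) + k * cv_pct

def sigma_metric(tea_pct, bias_pct, cv_pct):
    return (tea_pct - abs(bias_pct)) / cv_pct

bias, cv, tea = 10.0, 9.0, 20.0    # illustrative %: bias, between-run CV, TEa
print(f"TEobs = {total_error_observed(bias, cv):.1f}%")
print(f"sigma = {sigma_metric(tea, bias, cv):.1f}")   # sigma <= 3 => unacceptable
```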
NASA Astrophysics Data System (ADS)
Armigliato, A.
2008-07-01
In present and future CMOS technology, owing to the ever shrinking geometries of electronic devices, the availability of techniques capable of performing quantitative analyses of the relevant parameters (structural, chemical, mechanical) at the nanoscale is of paramount importance. The influence of these features on the electrical performance of nanodevices is a key issue for the nanoelectronics industry. In recent years, significant progress has been made in this field by a number of techniques, such as X-ray diffraction, in particular with the advent of synchrotron sources, ion-microbeam based Rutherford backscattering and channeling spectrometry, and micro-Raman spectrometry. In addition, secondary ion mass spectrometry (SIMS) has achieved an important role in the determination of the dopant depth profile in ultra-shallow junctions (USJs) in silicon. However, the technique featuring the ultimate spatial resolution (at the nanometer scale) is scanning transmission electron microscopy (STEM). This presentation reports on the STEM nanoanalysis of two very important physical quantities that need to be controlled in the fabrication processes of nanodevices: the dopant profile in USJs and the lattice strain generated in the electrically active Si regions of isolation structures by the different technological steps. The former quantity is investigated by the so-called Z-contrast high-angle annular dark field (HAADF-STEM) method, whereas the mechanical strain can be two-dimensionally mapped by the convergent beam electron diffraction (CBED-STEM) method. Spatial resolutions of less than one nanometer and of a few nanometers can be achieved in the two cases, respectively. To keep pace with scientific and technological progress, an increasingly wide array of analytical techniques is necessary, and their complementary roles in the solution of present and future characterization problems must be exploited. Presently, however, European laboratories with high-level expertise in materials characterization still operate in a largely independent way; this adversely affects the competitiveness of European science and industry at the international level. For this reason the European Commission has started an Integrated Infrastructure Initiative (I3) in the Sixth Framework Programme (now continuing in FP7) and funded a project called ANNA (2006-2010). This acronym stands for European Integrated Activity of Excellence and Networking for Nano and Micro-Electronics Analysis. The consortium includes 12 partners from 7 European countries and is coordinated by the Fondazione B. Kessler (FBK) in Trento (Italy); CNR-IMM is one of the 12 partners. The aim of ANNA is to establish strong, long-term collaboration among the partners, so as to form an integrated multi-site analytical facility able to offer the European community a wide variety of top-level analytical expertise and services in the field of micro- and nano-electronics. These include X-ray diffraction and scattering, SIMS, electron microscopy, medium-energy ion scattering, and optical and electrical techniques.
The project is focused on three main activities: Networking (standardization of samples and methodologies, and the establishment of accredited reference laboratories), Transnational Access to laboratories located on the partners' premises to perform specific analytical experiments (an example is given by the two STEM methodologies discussed above), and Joint Research, targeted at the improvement and extension of the methodologies through continuous instrumental and technical development. It is planned that the European joint analytical laboratory will continue its activity beyond the end of the project in 2010.
Johnston, Lisa G; Hakim, Avi J; Dittrich, Samantha; Burnett, Janet; Kim, Evelyn; White, Richard G
2016-08-01
Reporting key details of respondent-driven sampling (RDS) survey implementation and analysis is essential for assessing the quality of RDS surveys. RDS is both a recruitment and an analytic method and, as such, it is important to describe both aspects adequately in publications. We extracted data from peer-reviewed literature published through September 2013 that reported collecting biological specimens using RDS. We identified 151 eligible peer-reviewed articles describing 222 surveys conducted in seven regions throughout the world. Most published surveys reported basic implementation information such as survey city, country, year, population sampled, interview method, and final sample size. However, many surveys did not report essential methodological and analytical information for assessing RDS survey quality, including the number of recruitment sites, the numbers of seeds at the start and end, the maximum number of waves, and whether data were adjusted for network size. Understanding the quality of data collection and analysis in RDS is useful for effectively planning public health service delivery and funding priorities.
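The network-size adjustment mentioned above is commonly carried out with an inverse-degree weighting such as the Volz-Heckathorn (RDS-II) estimator. A minimal sketch under that assumption, with entirely hypothetical survey data:

```python
import numpy as np

def rds_ii_prevalence(outcomes, degrees):
    """Volz-Heckathorn (RDS-II) estimator: weight each respondent
    by the inverse of their reported network size (degree)."""
    outcomes = np.asarray(outcomes, dtype=float)
    w = 1.0 / np.asarray(degrees, dtype=float)
    return np.sum(w * outcomes) / np.sum(w)

# Hypothetical survey: outcome 1 = positive specimen, degree = network size
print(rds_ii_prevalence([1, 0, 1, 0, 0], [10, 25, 5, 40, 15]))
```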
Determination of biomass burning tracers in air samples by GC/MS
NASA Astrophysics Data System (ADS)
Janoszka, Katarzyna
2018-01-01
Levoglucosan (LG), the main product of cellulose burning at 300°C, is a biomass burning tracer. LG is characterized by a relatively high molar mass and is sorbed by particulate matter. In air pollution monitoring studies, LG is mainly analyzed in particulate matter, PM1 and PM2.5. The tracer forms relatively strong O-H…O bonds and weaker C-H…O bonds; owing to this hydrogen bonding, LG dissolves very well in water. The analytical procedure for LG determination includes extraction, derivatization, and analysis by gas chromatography coupled with a mass spectrometry detector; in water samples, levoglucosan is determined by liquid chromatography. The paper presents a methodology for the preparation of particulate matter samples and their analysis by gas chromatography coupled with a mass spectrometry detector. Determination of the LG content in particulate matter was performed according to an analytical method based on simultaneous pyridine extraction and derivatization using a mixture of N,O-bis(trimethylsilyl)trifluoroacetamide and trimethylchlorosilane (BSTFA:TMCS, 99:1).
Ordoñez, Edgar Y; Rodil, Rosario; Quintana, José Benito; Cela, Rafael
2015-02-15
A new analytical procedure involving the use of water and a low percentage of ethanol, combined with high-temperature liquid chromatography-tandem mass spectrometry, has been developed for the determination of nine high-intensity sweeteners in a variety of drink samples. The method permitted the analysis in 23 min (including column re-equilibration) while consuming only 0.85 mL of a green organic solvent (ethanol). This methodology provided limits of detection (after 50-fold dilution) in the 0.05-10 mg/L range, with recoveries (obtained from five different types of beverages) in the 86-110% range and relative standard deviation values lower than 12%. Finally, the method was applied to 25 different samples purchased in Spain, where acesulfame and sucralose were the most frequently detected analytes (>50% of the samples) and cyclamate was found above the legislation limit set by the European Union in one sample and at the regulation boundary in three others. Copyright © 2014 Elsevier Ltd. All rights reserved.
Kamradt, Jaclyn M; Momany, Allison M; Nikolas, Molly A
2018-06-01
A substantial literature suggests that abnormal cortisol reactivity may be a vulnerability for deleterious mental health outcomes, including ADHD. ADHD has been linked with difficulty in emotion regulation and increased risk of experiencing stressors, both of which may be related to psychobiological abnormalities (e.g., abnormal cortisol reactivity). Research has been mixed regarding the association between cortisol reactivity and ADHD. Therefore, the present meta-analytic review (k = 12) sought to quantify this association and review the relevant methodological issues and theoretical implications of this area of research. Overall, no effect was found between cortisol reactivity and ADHD (r = 0), although significant heterogeneity in the analyses suggested that there might be moderators of this association, if one does exist. Results highlight the importance of addressing limitations of the current literature on cortisol reactivity and ADHD and exploring additional indices of emotion regulation that may be associated with ADHD. Implications for future research efforts are discussed.
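Pooled correlations of this kind are typically computed by Fisher z-transforming the study correlations, weighting by inverse variance, and checking heterogeneity with Cochran's Q. A minimal fixed-effect sketch with hypothetical study data (the review itself may have used a random-effects variant):

```python
import numpy as np

def pooled_correlation(rs, ns):
    """Fixed-effect pooling of correlations via Fisher's z transform,
    with Cochran's Q as a heterogeneity check."""
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    z = np.arctanh(rs)                 # Fisher z transform
    w = ns - 3.0                       # inverse-variance weights (var = 1/(n-3))
    z_bar = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_bar) ** 2)   # ~ chi-square with k-1 df
    return np.tanh(z_bar), q

# Hypothetical study correlations and sample sizes
r_pooled, Q = pooled_correlation([0.10, -0.05, 0.02, -0.12], [40, 55, 62, 38])
print(r_pooled, Q)
```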
Sánchez Morales, Lidia; Eiroa-Orosa, Francisco José; Valls Llagostera, Cristina; González Pérez, Alba; Alberich, Cristina
2018-05-01
Group cohesion, the establishment of hope, and the expression of feelings have been said to be the basic ingredients of group psychotherapy. To date, there is little literature describing therapeutic processes in short-stay settings, such as acute psychiatric wards, or with special patient groups, such as patients with addictions. Our goal with this study is to describe and analyze group processes in such contexts. We used a qualitative methodology combining constant comparative methods and hermeneutical triangulation to analyze therapeutic narratives in the context of a group analytic process carried out following Foulkes' and Yalom's styles. The results provide a picture of the therapeutic process, including the use of norms to strengthen group cohesion and facilitate the expression of emotions in early stages of group development. This analysis is intended to be a guide for practitioners implementing group therapy in contexts involving several constraints, such as acute psychiatric wards.
Progress toward the development of an implantable sensor for glucose.
Wilson, G S; Zhang, Y; Reach, G; Moatti-Sirat, D; Poitout, V; Thévenot, D R; Lemonnier, F; Klein, J C
1992-09-01
The development of an electrochemically based implantable sensor for glucose is described. The sensor is needle-shaped, about the size of a 28-gauge needle. It is flexible and must be implanted subcutaneously using a 21-gauge catheter, which is then removed. When combined with a monitoring unit, this device, based on the glucose oxidase-catalyzed oxidation of glucose, reliably monitors glucose concentrations for as long as 10 days in rats. Various design considerations, including the decision to monitor the hydrogen peroxide produced in the enzymatic reaction, are discussed. Glucose constitutes the most important future target analyte for continuous monitoring, but the basic methodology developed for glucose could be applied to several other analytes, such as lactate or ascorbate. The success of such a device depends on the reaction of the tissue surrounding the implant not interfering with the proper functioning of the sensor. Histochemical evidence indicates that the tissue response leads to enhanced sensor performance.
Sutherland, Devon J; Stearman, G Kim; Wells, Martha J M
2003-01-01
The transport and fate of pesticides applied to ornamental plant nursery crops are not well documented. Methodology for analysis of soil and water runoff samples concomitantly containing the herbicides simazine (1-chloro-4,6-bis(ethylamino)-s-triazine) and 2,4-D ((2,4-dichlorophenoxy)acetic acid) was developed in this research to investigate the potential for runoff and leaching from ornamental nursery plots. Solid-phase extraction was used prior to analysis by gas chromatography and liquid chromatography. Chromatographic results were compared with determination by enzyme-linked immunoassay analysis. The significant analytical contributions of this research include (1) the development of a scheme using chromatographic mode sequencing for the fractionation of simazine and 2,4-D, (2) optimization of the homogeneous derivatization of 2,4-D using the methylating agent boron trifluoride in methanol as an alternative to in situ generation of diazomethane, and (3) the practical application of these techniques to field samples.
Schroder, Kerstin E. E.; Carey, Michael P.; Vanable, Peter A.
2008-01-01
Investigation of sexual behavior involves many challenges, including how to assess sexual behavior and how to analyze the resulting data. Sexual behavior can be assessed using absolute frequency measures (also known as “counts”) or with relative frequency measures (e.g., rating scales ranging from “never” to “always”). We discuss these two assessment approaches in the context of research on HIV risk behavior. We conclude that these two approaches yield non-redundant information and, more importantly, that only data yielding information about the absolute frequency of risk behavior have the potential to serve as valid indicators of HIV contraction risk. However, analyses of count data may be challenging due to non-normal distributions with many outliers. Therefore, we identify new and powerful data analytical solutions that have been developed recently to analyze count data, and discuss limitations of a commonly applied method (viz., ANCOVA using baseline scores as covariates). PMID:14534027
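The modern approaches alluded to are typically Poisson or negative binomial regression models, which accommodate the skew and overdispersion of count outcomes better than ANCOVA on raw scores. A minimal sketch with simulated, purely hypothetical data, using the statsmodels package:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)                 # 0 = control, 1 = intervention
baseline = rng.poisson(5, n)                  # baseline risk-act counts
mu = np.exp(1.2 + 0.08 * baseline - 0.5 * group)
y = rng.negative_binomial(2, 2 / (2 + mu))    # overdispersed follow-up counts

# Negative binomial GLM of follow-up counts on baseline and condition
X = sm.add_constant(np.column_stack([baseline, group]))
fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(fit.summary())
```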
A Variational Approach to the Analysis of Dissipative Electromechanical Systems
Allison, Andrew; Pearce, Charles E. M.; Abbott, Derek
2014-01-01
We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics, and we can apply these powerful analytical methods to electrical and to electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package. PMID:24586221
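For the conservative part of such a system the variational recipe is standard. As a worked illustration (a textbook case, not the paper's generalised-potential construction, which is not reproduced here), the Lagrangian of a series LC circuit in terms of the charge q yields the familiar oscillator equation:

\[
\mathcal{L}(q,\dot q) = \tfrac{1}{2}L\dot q^{2} - \frac{q^{2}}{2C},
\qquad
\frac{\mathrm{d}}{\mathrm{d}t}\,\frac{\partial\mathcal{L}}{\partial\dot q} - \frac{\partial\mathcal{L}}{\partial q}
= L\ddot q + \frac{q}{C} = 0 .
\]

The paper's contribution is to extend this variational route to elements such as resistors, where the standard Lagrangian above no longer suffices.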
Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne
2016-03-01
Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase. Especially small organic acids and nitrogen-containing compounds are of interest. The efficient derivatization reagent methyl chloroformate was used to make possible the analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes among small organic acids, pyrazines, phenol, and cyclic ketones were quantified; an additional three analytes were pseudoquantified using standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. Method validation was evaluated with repeatability, and spike recoveries of all 29 analytes were obtained. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the steady-state condition of the flow reactor was reached and showed the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process.
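A central composite design and the quadratic response-surface fit behind such an optimization can be sketched in a few lines; the two-factor design and simulated responses below are purely illustrative, not the paper's data:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Hypothetical two-factor face-centred central composite design
# (coded levels, e.g. derivatization-reagent and pyridine volumes)
factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]
centre = [(0, 0)] * 3
X = np.array(factorial + axial + centre, dtype=float)

# Simulated peak-area response with curvature plus noise
rng = np.random.default_rng(1)
y = 10 + 2 * X[:, 0] + 1.5 * X[:, 1] - 1.2 * X[:, 0] ** 2 \
    + rng.normal(0, 0.2, len(X))

# Fit the full quadratic response-surface model
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)
print(model.intercept_, model.coef_)
```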
Modeling the drugs' passive transfer in the body based on their chromatographic behavior.
Kouskoura, Maria G; Kachrimanis, Kyriakos G; Markopoulou, Catherine K
2014-11-01
One of the most challenging aims in modern analytical chemistry and pharmaceutical analysis is to create models of drugs' behavior based on simulation experiments. Since drugs' effects are closely related to their molecular properties, numerous characteristics of drugs are used in order to acquire a model of passive absorption and transfer in the human body. Importantly, such innovative bioanalytical methodologies are also in pressing demand in the area of personalized medicine, to implement nanotechnological and genomic advancements. Simulation experiments were carried out by examining and interpreting the chromatographic behavior of 113 analytes/drugs (400 observations) in RP-HPLC. The dataset employed for this purpose included 73 descriptors referring to the physicochemical properties of the mobile-phase mixture in different proportions, the physicochemical properties of the analytes, and the structural characteristics of their molecules. A series of different software packages was used to calculate all the descriptors apart from those referring to the structure of the analytes. The correlation of the descriptors with the retention times of the analytes eluted from a C4 column with an aqueous mobile phase was employed as the dataset for deriving the models of behavior in the human body. Evaluation with Partial Least Squares (PLS) software showed that the chromatographic behavior of a drug on a lipophilic stationary phase with a polar mobile phase is directly related to its drug-ability. At the same time, the behavior of an unknown drug in the human body can be reliably predicted via Artificial Neural Networks (ANNs) software. Copyright © 2014 Elsevier B.V. All rights reserved.
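A PLS model of retention from molecular descriptors can be prototyped quickly with scikit-learn; the sketch below mirrors the data dimensions mentioned in the abstract (400 observations, 73 descriptors) but generates synthetic values, so it illustrates the workflow only:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_obs, n_desc = 400, 73                      # mirrors the paper's dimensions
X = rng.normal(size=(n_obs, n_desc))         # descriptors (synthetic)
beta = np.zeros(n_desc)
beta[:5] = [2.0, -1.0, 0.5, 1.5, -0.8]       # a few informative descriptors
y = X @ beta + rng.normal(0, 0.5, n_obs)     # simulated retention times

# Cross-validated PLS regression of retention on descriptors
pls = PLSRegression(n_components=5)
print(cross_val_score(pls, X, y, cv=5, scoring="r2"))
```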
Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M
2018-03-05
Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework developed to enable integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform to provide this functionality while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate the inclusion of future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering to options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely available as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual from http://toxpi.org .
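At its core a ToxPi profile is a weighted combination of normalized evidence "slices". A simplified, hypothetical scoring sketch follows (the official tool is the Java application described above, and its exact scaling rules may differ):

```python
import numpy as np

def toxpi_scores(data, weights):
    """ToxPi-style score: scale each evidence 'slice' to [0, 1]
    across entities, then combine with normalized slice weights."""
    data = np.asarray(data, dtype=float)
    lo, hi = data.min(axis=0), data.max(axis=0)
    scaled = (data - lo) / np.where(hi > lo, hi - lo, 1.0)
    w = np.asarray(weights, dtype=float)
    return scaled @ (w / w.sum())

# Rows = chemicals, columns = evidence slices (assay, exposure, ...)
data = [[0.2, 30, 1.0], [0.9, 5, 0.4], [0.5, 12, 0.8]]
print(toxpi_scores(data, weights=[2, 1, 1]))   # one score per chemical
```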
1983-05-01
DESIGN PROCEDURE (M. S. Hundal, University of Vermont, Burlington, VT). Machinery Dynamics: analytical and experimental investigation of rotating blade dynamics. A design methodology to accurately predict rotor vibratory loads has recently been initiated for detail design and bench testing; concentrating on the basic disciplines of aerodynamics and structural dynamics, a coupled rotor/airframe vibration analysis has been developed.
Almeida, C; Nogueira, J M F
2014-06-27
In the present work, the development of an analytical methodology combining bar adsorptive microextraction with micro-liquid desorption followed by high-performance liquid chromatography-diode array detection (BAμE-μLD/HPLC-DAD) is proposed for the determination of trace levels of four parabens (methyl, ethyl, propyl and butyl paraben) in real matrices. Comparing six polymer (P1, P2, P3, P4, P5 and P6) and five activated carbon (AC1, AC2, AC3, AC4 and AC5) coatings through BAμE, AC2 exhibited much higher selectivity and efficiency than all the other sorbent phases tested, even when compared with commercial stir bar sorptive extraction with polydimethylsiloxane. Assays performed through BAμE (AC2, 1.7 mg) on 25 mL of ultrapure water samples spiked at the 8.0 μg/L level yielded recoveries ranging from 85.6 ± 6.3% to 100.6 ± 11.8% under optimized experimental conditions. The analytical performance also showed convenient limits of detection (0.1 μg/L) and quantification (0.3 μg/L), as well as good linear dynamic ranges (0.5-28.0 μg/L) with remarkable determination coefficients (r² > 0.9982). Excellent repeatability was also achieved in intraday (RSD < 10.2%) and interday (RSD < 10.0%) assays. By downsizing the analytical device to half-length (BAμE (AC2, 0.9 mg)), similar analytical data were achieved for the four parabens under optimized experimental conditions, showing that this analytical technology can be designed to operate with lower volumes of sample and desorption solvent, thus increasing sensitivity and effectiveness. The application of the proposed analytical approach, using the standard addition methodology, to tap, underground, estuarine, swimming pool and waste water samples, as well as to commercial cosmetic products and urine samples, revealed good sensitivity, an absence of matrix effects, and the occurrence of measurable levels of some parabens. Moreover, the present methodology is easy to implement, reliable and sensitive, requires low sample and minimal desorption solvent volumes, and offers the possibility of tuning the most selective sorbent coating according to the target compounds involved. Copyright © 2014 Elsevier B.V. All rights reserved.
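Figures of merit such as the detection and quantification limits quoted above are commonly derived from the calibration curve. A minimal sketch using the 3.3·s/m and 10·s/m conventions (an ICH-style choice; the paper's exact criterion is not stated here), with hypothetical calibration data:

```python
import numpy as np

def lod_loq(conc, signal):
    """LOD/LOQ from a linear calibration as 3.3*s/m and 10*s/m,
    where s is the residual standard deviation and m the slope."""
    m, b = np.polyfit(conc, signal, 1)
    resid = np.asarray(signal) - (m * np.asarray(conc) + b)
    s = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))
    return 3.3 * s / m, 10 * s / m

# Hypothetical paraben calibration (ug/L vs. peak area)
conc = np.array([0.5, 2.0, 5.0, 10.0, 20.0, 28.0])
signal = np.array([1.1, 4.0, 10.2, 19.8, 40.5, 56.0])
print(lod_loq(conc, signal))
```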
Jacobs, Wura; Goodson, Patricia; Barry, Adam E; McLeroy, Kenneth R
2016-05-01
Despite previous research indicating that adolescents' alcohol, tobacco, and other drug (ATOD) use depends on their sex and on the sex composition of their social network, few social network studies consider sex differences and network sex composition as determinants of adolescents' ATOD use behavior. This systematic literature review examining how social network analytic studies examine adolescent ATOD use behavior is guided by the following research questions: (1) How do studies conceptualize sex and network sex composition? (2) What types of network affiliations are employed to characterize adolescent networks? (3) What is the methodological quality of included studies? After searching several electronic databases (PsycINFO, EBSCO, and Communication Abstract) and applying our inclusion/exclusion criteria, 48 studies were included in the review. Overall, few studies considered the sex composition of the networks in which adolescents are embedded as a determinant of adolescent ATOD use. Although the included studies all exhibited high methodological quality, the majority used only friendship networks to characterize adolescent social networks and consequently failed to capture the influence of other network types, such as romantic networks. School-based prevention programs could be strengthened by (1) selecting and targeting peer leaders based on sex, and (2) leveraging other types of social networks beyond simply friendships. © 2016, American School Health Association.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board (PWB) design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Advanced flight control system study
NASA Technical Reports Server (NTRS)
Hartmann, G. L.; Wall, J. E., Jr.; Rang, E. R.; Lee, H. P.; Schulte, R. W.; Ng, W. K.
1982-01-01
A fly-by-wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture does achieve the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron-bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.
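The reliability payoff of spare elements in such an architecture is often first estimated with a k-out-of-n calculation. A minimal sketch, with purely illustrative redundancy levels and per-element reliabilities (not figures from the study):

```python
from math import comb

def k_of_n_reliability(n, k, p):
    """Probability that at least k of n identical, independent
    elements (e.g. redundant sensors or computers) survive,
    given per-element reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical: quad-redundant computer needing 2 of 4 healthy channels
print(k_of_n_reliability(4, 2, 0.999))
```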
A rotor technology assessment of the advancing blade concept
NASA Technical Reports Server (NTRS)
Pleasants, W. A.
1983-01-01
A rotor technology assessment of the Advancing Blade Concept (ABC) was conducted in support of a preliminary design study. The analytical methodology modifications and inputs, the correlation, and the results of the assessment are documented. The primary emphasis was on the high-speed forward flight performance of the rotor. The correlation data base included both the wind tunnel and the flight test results. An advanced ABC rotor design was examined; the suitability of the ABC for a particular mission was not considered. The objective of this technology assessment was to provide estimates of the performance potential of an advanced ABC rotor designed for high speed forward flight.
On the Singular Perturbations for Fractional Differential Equation
Atangana, Abdon
2014-01-01
The goal of this paper is to examine the possible extension of singular perturbation differential equations to the concept of the fractional-order derivative. To achieve this, we present a review of the concept of fractional calculus. We make use of the Laplace transform operator to derive exact solutions of singular perturbation fractional linear differential equations. We then apply three analytical methods to present exact and approximate solutions of the singular perturbation fractional, nonlinear, nonhomogeneous differential equation: the regular perturbation method, a new development of the variational iteration method, and the homotopy decomposition method. PMID:24683357
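As a standard illustration of the Laplace-transform route mentioned above (a textbook case, not an equation taken from the paper), the Caputo fractional relaxation equation is solved as follows:

\[
{}^{C}D_t^{\alpha}\,y(t) + \lambda\,y(t) = 0,\qquad y(0)=1,\quad 0<\alpha\le 1,
\]
\[
\mathcal{L}:\ s^{\alpha}Y(s) - s^{\alpha-1} + \lambda Y(s) = 0
\ \Longrightarrow\
Y(s)=\frac{s^{\alpha-1}}{s^{\alpha}+\lambda}
\ \Longrightarrow\
y(t)=E_{\alpha}\!\left(-\lambda t^{\alpha}\right),
\]

where \(E_{\alpha}\) is the Mittag-Leffler function; setting \(\alpha = 1\) recovers the classical solution \(y(t)=e^{-\lambda t}\).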
Skin microbiome: genomics-based insights into the diversity and role of skin microbes
Kong, Heidi H.
2011-01-01
Recent advances in DNA sequencing methodology have enabled studies of human skin microbes that circumvent difficulties in isolating and characterizing fastidious microbes. Sequence-based approaches have identified greater diversity of cutaneous bacteria than studies using traditional cultivation techniques. However, improved sequencing technologies and analytical methods are needed to study all skin microbes, including bacteria, archaea, fungi, viruses, and mites, and how they interact with each other and their human hosts. This review discusses current skin microbiome research, with a primary focus on bacteria, and the challenges facing investigators striving to understand how skin micro-organisms contribute to health and disease. PMID:21376666
Hybrid near-optimal aeroassisted orbit transfer plane change trajectories
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Duckeman, Gregory A.
1994-01-01
In this paper, a hybrid methodology is used to determine optimal open loop controls for the atmospheric portion of the aeroassisted plane change problem. The method is hybrid in the sense that it combines the features of numerical collocation with the analytically tractable portions of the problem which result when the two-point boundary value problem is cast in the form of a regular perturbation problem. Various levels of approximation are introduced by eliminating particular collocation parameters and their effect upon problem complexity and required number of nodes is discussed. The results include plane changes of 10, 20, and 30 degrees for a given vehicle.
NASA Technical Reports Server (NTRS)
Humphreys, E. A.
1981-01-01
A computerized, analytical methodology was developed to study damage accumulation during low velocity lateral impact of layered composite plates. The impact event was modeled as perfectly plastic with complete momentum transfer to the plate structure. A transient dynamic finite element approach was selected to predict the displacement time response of the plate structure. Composite ply and interlaminar stresses were computed at selected time intervals and subsequently evaluated to predict layer and interlaminar damage. The effects of damage on elemental stiffness were then incorporated back into the analysis for subsequent time steps. Damage predicted included fiber failure, matrix ply failure and interlaminar delamination.
Analytic institutes: A guide to training in the United States
NASA Astrophysics Data System (ADS)
Blanken, Terry G.
This investigation was inspired by the researcher's desire to pursue psychoanalytic training subsequent to completion of her PhD in clinical psychology and the discovery that no comprehensive resource existed to assist prospective psychoanalytic candidates with identifying or evaluating psychoanalytic training opportunities. This dissertation therefore aspires to provide a comprehensive guide to analytic training in the United States today. The researcher presents the expanding horizons of depth-oriented training leading to certification as an analyst, including training based on those schools of thought that resulted from early splits with Freud (Adlerian and Jungian) as well as training based on thought that has remained within the Freudian theoretical umbrella (e.g., classical, object relations, self psychology, etc.). Employing a heuristic approach and using hermeneutics and systems theory methodologies, the study situates analytic training in its historical context, explores contemporary issues, and considers its future. The study reviews the various analytic schools of thought and traces the history of psychoanalytic theory from its origins with Freud through its many permutations. It then discusses the history of psychoanalytic training and describes political, social, and economic factors influencing the development of training in this country. The centerpiece of the dissertation is a guidebook offering detailed information on each of 107 training institutes in the United States. Tables provide contact data and information which differentiate the institutes in terms of such parameters as size, length of program, theoretical orientation, and accreditation. A narrative of each institute summarizes the unique aspects of the program, including its admissions policy, the requirements for the training analysis and supervised clinical work, and the didactic curriculum, along with lists of courses offered. Child and adolescent psychoanalytic training is also discussed for institutes offering this option. A discussion of the contemporary world of analytic training emerges from the results of the analysis of individual institutes. Both the variations and convergences among institutes are explored. Current problems and issues in training, accreditation, and licensing are addressed. Finally, the future of psychoanalytic training is considered, concluding with an assessment of needed reforms and presentation of a model for the ideal analytic training institute of the future.
Kim, Manuela Leticia; Tudino, Mabel Beatríz
2010-08-15
Several studies of the physicochemical interaction of three silica-based hybrid mesoporous materials with metal ions of group IB have been performed in order to employ these materials for preconcentration purposes in the determination of traces of Cu(II), Ag(I) and Au(III). The three solids were obtained from mesoporous silica functionalized with 3-aminopropyl (APS), 3-mercaptopropyl (MPS) and N-[2-aminoethyl]-3-aminopropyl (NN) groups, respectively. Adsorption capacities for Au, Cu and Ag were calculated using Langmuir's isotherm model, and the optimal values for the retention of each element on each of the solids were found. Physicochemical data obtained under thermodynamic equilibrium and under kinetic conditions imposed by flow-through experiments allowed the design of simple analytical methodologies in which the solids were employed as fillings of microcolumns held in continuous systems coupled on-line to atomic absorption spectrometry. In order to characterize the interaction between the filling and the analyte at short times (flow-through conditions), and thus its effect on the analytical signal and the presence of interferences, the initial adsorption velocities were calculated using the pseudo-second-order model. All these experiments allowed the comparison of the solids in terms of their analytical behaviour for the determination of the three elements. Under optimized conditions, mainly given by the features of the filling, the analytical methodologies developed in this work showed excellent performance, with limits of detection of 0.14, 0.02 and 0.025 μg L⁻¹ and RSD% values of 3.4, 2.7 and 3.1 for Au, Cu and Ag, respectively. A full discussion of the main findings on the metal ion/filling interactions is provided, together with the analytical results for the determination of the three metals. Copyright 2010 Elsevier B.V. All rights reserved.
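Adsorption capacities of the kind reported here are typically obtained by fitting the Langmuir isotherm q = q_max·K·c/(1 + K·c) to equilibrium data. A minimal sketch with hypothetical uptake data:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    """Langmuir isotherm: q = q_max * K * c / (1 + K * c)."""
    return q_max * K * c / (1.0 + K * c)

# Hypothetical equilibrium data: metal concentration (mg/L) vs. uptake (mg/g)
c = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
q = np.array([8.2, 14.1, 22.0, 33.5, 40.8, 45.1, 47.3])

(q_max, K), _ = curve_fit(langmuir, c, q, p0=(50.0, 0.1))
print(q_max, K)   # fitted maximum capacity and affinity constant
```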
Solà-Vázquez, Auristela; Lara-Gonzalo, Azucena; Costa-Fernández, José M; Pereiro, Rosario; Sanz-Medel, Alfredo
2010-05-01
A tuneable microsecond-pulsed direct-current glow discharge (GD)-time-of-flight mass spectrometer (MSTOF) developed in our laboratory was coupled to a gas chromatograph (GC) to obtain sequential collection of mass spectra, at the different temporal regimes occurring in the GD pulses, during elution of the analytes. The capabilities of this set-up were explored using a mixture of volatile organic compounds of environmental concern: BrClCH, Cl(3)CH, Cl(4)C, BrCl(2)CH, Br(2)ClCH and Br(3)CH. The experimental parameters of the GC-pulsed GD-MSTOF prototype were optimized in order to separate and analyze the six selected organic compounds appropriately, and two GC carrier gases, helium and nitrogen, were evaluated. Mass spectra for all analytes were obtained in the prepeak, plateau and afterpeak temporal regimes of the pulsed GD. Results showed that helium offered the best elemental sensitivity, while nitrogen provided higher signal intensities for fragment and molecular peaks. The analytical performance characteristics were also worked out for each analyte; absolute detection limits were of the order of nanograms. In a second step, headspace solid-phase microextraction (HS-SPME) was evaluated as a sample preparation and preconcentration technique for the quantification of the compounds under study, in order to achieve the analytical sensitivity required by European Union (EU) environmental legislation on trihalomethanes. The analytical figures of merit obtained using the proposed methodology showed rather good detection limits (between 2 and 13 μg L⁻¹, depending on the analyte). In fact, the developed methodology met the EU legislation requirements (the maximum level permitted in tap water for total trihalomethanes is set at 100 μg L⁻¹). Analyses of real drinking water and river water samples were successfully carried out. To our knowledge, this is the first application of GC-pulsed GD-MSTOF to the analysis of real samples. Its ability to provide elemental, fragment and molecular information on organic compounds is demonstrated.
Measurement-based reliability prediction methodology. M.S. Thesis
NASA Technical Reports Server (NTRS)
Linn, Linda Shen
1991-01-01
In the past, analytical and measurement-based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. That issue is addressed here: a combined statistical/analytical approach is proposed that uses measurements from one environment to model the system failure behavior in a new environment. A comparison of the predicted results with actual data from the new environment shows a close correspondence.
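A minimal example of the measurement-to-model step, assuming a constant-failure-rate (exponential) model; the failure times below are hypothetical:

```python
import numpy as np

def mle_failure_rate(times_to_failure):
    """MLE of a constant failure rate from observed times to failure
    (exponential model): lambda_hat = n / total observed time."""
    t = np.asarray(times_to_failure, dtype=float)
    return len(t) / t.sum()

# Hypothetical inter-failure times (hours) measured in one environment
lam = mle_failure_rate([120.0, 340.0, 95.0, 410.0, 250.0])
print(lam, np.exp(-lam * 24))   # estimated rate and 24 h survival probability
```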
NASA Astrophysics Data System (ADS)
Kupchikova, N. V.; Kurbatskiy, E. N.
2017-11-01
This paper presents a methodology for the analytical solution of the behaviour of pile foundations with surface broadening and inclined side faces in the soil mass, based on the properties of the Fourier transform of finite functions. A comparative analysis of the calculation results obtained using the suggested method is given for prismatic piles, piles with surface broadening, and prismatic piles with precast wedges at the end walls on the surface.
Validating a faster method for reconstitution of Crotalidae Polyvalent Immune Fab (ovine).
Gerring, David; King, Thomas R; Branton, Richard
2013-07-01
Reconstitution of CroFab® (Crotalidae Polyvalent Immune Fab [ovine]) lyophilized drug product was previously performed using 10 mL sterile water for injection (WFI) followed by up to 36 min of gentle swirling of the vial. CroFab has been clinically demonstrated to be most effective when administered within 6 h of snake envenomation, and improved clinical outcomes are correlated with quicker administration. An alternate reconstitution method was devised, using 18 mL 0.9% saline with manual inversion, with the goal of shortening reconstitution time while maintaining a high-quality, efficacious product. An analytical study was designed to compare the physicochemical properties of 3 separate batches of CroFab when reconstituted using the standard procedure (10 mL WFI with gentle swirling) and a modified rapid procedure using 18 mL 0.9% saline and manual inversion. The physical and chemical characteristics of the same 3 batches were assessed using various analytical methodologies associated with routine quality control release testing. In addition, further analytical methodologies were applied in order to elucidate possible structural changes that might be induced by the changed reconstitution procedure. Batches A, B, and C required mean reconstitution times of 25 min 51 s using the labeled method and 3 min 07 s (an 88.0% mean decrease) using the modified method. Physicochemical characteristics (color and clarity, pH, purity, protein content, potency) were found to be highly comparable. Characterization assays (dynamic light scattering, analytical ultracentrifugation, LC-MS, SDS-PAGE and circular dichroism spectroscopy) were also all found to be comparable between methods. When comparing CroFab batches reconstituted using the labeled and modified methods, the physicochemical and biological (potency) characteristics of CroFab were not significantly changed when challenged by the various standard analytical methodologies applied in routine quality control analysis. Additionally, no changes in the CroFab molecule regarding degradation, aggregation, purity, structure, or mass were observed. The analyses performed validated the use of the more rapid reconstitution method using 18 mL 0.9% saline, allowing a significantly reduced time to administration of CroFab to patients in need. Copyright © 2013 Elsevier Ltd. All rights reserved.
A conflict of analysis: analytical chemistry and milk adulteration in Victorian Britain.
Steere-Williams, Jacob
2014-08-01
This article centres on a particularly intense debate within British analytical chemistry in the late nineteenth century, between local public analysts and the government chemists of the Inland Revenue Service. The two groups differed in both practical methodologies and in the interpretation of analytical findings. The most striking debates in this period were related to milk analysis, highlighted especially in Victorian courtrooms. It was in protracted court cases, such as the well known Manchester Milk Case in 1883, that analytical chemistry was performed between local public analysts and the government chemists, who were often both used as expert witnesses. Victorian courtrooms were thus important sites in the context of the uneven professionalisation of chemistry. I use this tension to highlight what Christopher Hamlin has called the defining feature of Victorian public health, namely conflicts of professional jurisdiction, which adds nuance to histories of the struggle of professionalisation and public credibility in analytical chemistry.
Gonçalves, C; Alpendurada, M F
2005-03-15
In order to reduce the amount of sample to be collected and the time consumed in the analytical process, a broad range of analytes should preferably be considered in the same analytical procedure. A suitable methodology for pesticide residue analysis in soil samples was developed based on ultrasonic extraction (USE) and gas chromatography-mass spectrometry (GC-MS). For this study, different classes of pesticides were selected, both recent and old persistent molecules, parent compounds and degradation products: namely organochlorine, organophosphorous and pyrethroid insecticides, triazine and acetanilide herbicides, and other miscellaneous pesticides. Pesticide residues could be detected in the low- to sub-ppb range (0.05-7.0 μg kg⁻¹) with good precision (7.5-20.5%, average 13.7% RSD) and extraction efficiency (69-118%, average 88%) for the great majority of analytes. This methodology has been applied in a monitoring program of soil samples from an intensive horticulture area in Póvoa de Varzim, in the north of Portugal. The pesticides detected in four sampling programs (2001/2002) were the following: lindane, dieldrin, endosulfan, endosulfan sulfate, 4,4'-DDE, 4,4'-DDD, atrazine, desethylatrazine, alachlor, dimethoate, chlorpyrifos, pendimethalin, procymidone and chlorfenvinphos. Pesticide contamination was investigated at three depths and in different soil and crop types to assess the influence of soil characteristics and trends over time.
Improved Design of Tunnel Supports : Executive Summary
DOT National Transportation Integrated Search
1979-12-01
This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...
This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely accountability. Significant reductions i...
75 FR 103 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... 20, 1998) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues...: DOC Case No. ITC Case No. Country Product Department contact A-351-838 731-TA-1063 Brazil Frozen...
76 FR 31588 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-01
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the...), we are initiating the Sunset Review of the following antidumping duty orders: DOC case no. ITC case...
77 FR 32527 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-01
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the... initiating the Sunset Review of the following antidumping duty order: DOC Case No. ITC Case No. Country...
77 FR 53867 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the... initiating Sunset Reviews of the following antidumping duty orders: DOC case No. ITC case No. Country Product...
78 FR 13862 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the... initiating a Sunset Review of the following antidumping duty order: Department DOC Case No. ITC Case No...
75 FR 53664 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-01
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the...), we are initiating the Sunset Review of the following antidumping duty orders: DOC Case No. ITC Case...
76 FR 24459 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-02
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the...), we are initiating the Sunset Review of the following antidumping duty orders: DOC case No. ITC case...
77 FR 19643 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the... initiating the Sunset Review of the following antidumping duty orders: DOC Case No. ITC Case No. Country...
76 FR 67412 - Initiation of Five-Year (“Sunset”) Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...) and 70 FR 62061 (October 28, 2005). Guidance on methodological or analytical issues relevant to the...), we are initiating the Sunset Review of the following antidumping duty orders: DOC case No. ITC case...