2013-01-01
Background: Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence.
Methods: We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures:
• reviewing existing systematic review methods and our own prior experience of applying these
• clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing
• holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing
• attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying
Results: We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence.
Conclusions: The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181
Clinical and diagnostic utility of saliva as a non-invasive diagnostic fluid: a systematic review
Nunes, Lazaro Alessandro Soares; Mussavira, Sayeeda
2015-01-01
This systematic review presents the latest trends in salivary research and its applications in health and disease. Among the large number of analytes present in saliva, many are affected by diverse physiological and pathological conditions. Further, the non-invasive, easy and cost-effective collection methods prompt an interest in evaluating its diagnostic or prognostic utility. Data accumulated over the past two decades point towards the possible utility of saliva for monitoring overall health, diagnosing and treating various oral or systemic disorders, and supporting drug monitoring. Advances in saliva-based systems biology have also contributed to the identification of several biomarkers and the development of diverse salivary diagnostic kits and other sensitive analytical techniques. However, its utilization should be carefully evaluated in relation to standardization of pre-analytical and analytical variables, such as collection and storage methods, analyte circadian variation, sample recovery, prevention of sample contamination and analytical procedures. In spite of all these challenges, knowledge about this biological matrix and its uses continues to evolve rapidly. PMID:26110030
Child-Parent Interventions for Childhood Anxiety Disorders: A Systematic Review and Meta-Analysis
ERIC Educational Resources Information Center
Brendel, Kristen Esposito; Maynard, Brandy R.
2014-01-01
Objective: This study compared the effects of direct child-parent interventions to the effects of child-focused interventions on anxiety outcomes for children with anxiety disorders. Method: Systematic review methods and meta-analytic techniques were employed. Eight randomized controlled trials examining effects of family cognitive behavior…
Reader's Block: A Systematic Review of Barriers to Adoption, Access and Use in E-Book User Studies
ERIC Educational Resources Information Center
Girard, Adam
2014-01-01
Introduction: This review of barriers to e-book use systematically identifies obstacles to engaging reading experiences. Through the use of an analytical framework, the users being studied, study setting, and methods used in previous work are described in order to identify promising areas for future research. Method: The method used is a…
ERIC Educational Resources Information Center
Filges, Trine; Andersen, Ditte; Jørgensen, Anne-Marie Klint
2018-01-01
Purpose: This review evaluates the evidence of the effects of multidimensional family therapy (MDFT) on drug use reduction in young people for the treatment of nonopioid drug use. Method: We followed Campbell Collaboration guidelines to conduct a systematic review of randomized and nonrandomized trials. Meta-analytic methods were used to…
Galli, C
2001-07-01
It is well established that the use of polychromatic radiation in spectrophotometric assays leads to excursions from the Beer-Lambert limit. This Note models the resulting systematic error as a function of assay spectral width, slope of molecular extinction coefficient, and analyte concentration. The theoretical calculations are compared with recent experimental results; a parameter is introduced which can be used to estimate the magnitude of the systematic error in both chromatographic and nonchromatographic spectrophotometric assays. It is important to realize that the polychromatic radiation employed in common laboratory equipment can yield assay errors up to approximately 4%, even at absorbance levels generally considered 'safe' (i.e. absorbance < 1). Thus careful consideration of instrumental spectral width, analyte concentration, and slope of molecular extinction coefficient is required to ensure robust analytical methods.
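To make the size of this effect concrete, here is a minimal numerical sketch (not the Note's model; the Gaussian source profile, the linear wavelength dependence of the extinction coefficient, and all parameter values are illustrative assumptions) of how a finite bandpass bends the Beer-Lambert line:

```python
import numpy as np

def measured_absorbance(c, path=1.0, eps0=1.0e4, eps_slope=50.0,
                        center=500.0, fwhm=10.0):
    """Apparent absorbance for a detector integrating a finite bandpass.

    Beer-Lambert holds exactly only for monochromatic light; transmitted
    intensities add across the bandpass, so the apparent absorbance falls
    below eps0*c*path as concentration grows. Units: eps in L/(mol cm),
    c in mol/L, path in cm, wavelengths in nm (all values illustrative).
    """
    lam = np.linspace(center - 3 * fwhm, center + 3 * fwhm, 2001)
    sigma = fwhm / 2.3548                       # FWHM -> Gaussian sigma
    intensity = np.exp(-0.5 * ((lam - center) / sigma) ** 2)
    eps = eps0 + eps_slope * (lam - center)     # linear slope of epsilon
    transmitted = intensity * 10.0 ** (-eps * c * path)
    return -np.log10(transmitted.sum() / intensity.sum())

for c in (1e-5, 5e-5, 1e-4):
    ideal, apparent = 1.0e4 * c, measured_absorbance(c)
    print(f"A_ideal={ideal:.3f}  A_apparent={apparent:.4f}  "
          f"error={100 * (apparent - ideal) / ideal:+.2f}%")
```

The error grows with both concentration and the slope of the extinction coefficient across the bandpass, which is the qualitative behavior the Note quantifies.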
Meta-Analysis: A Systematic Method for Synthesizing Counseling Research
ERIC Educational Resources Information Center
Whiston, Susan C.; Li, Peiwei
2011-01-01
The authors provide a template for counseling researchers who are interested in quantitatively aggregating research findings. Meta-analytic studies can provide relevant information to the counseling field by systematically synthesizing studies performed by researchers from diverse fields. Methodologically sound meta-analyses require careful…
Validation of a common data model for active safety surveillance research
Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E
2011-01-01
Objective: Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example: To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion: There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
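As a toy illustration of the idea (the two tables below are simplified stand-ins and not the official OMOP CDM schema, and the concept IDs are invented), the same drug-outcome count can be executed unchanged against any source database once its data are translated into the common model:

```python
import sqlite3

# Two simplified tables, loosely inspired by (NOT identical to) OMOP CDM
# tables; concept IDs are toy values.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE drug_exposure (
    person_id INTEGER, drug_concept_id INTEGER, exposure_date TEXT);
CREATE TABLE condition_occurrence (
    person_id INTEGER, condition_concept_id INTEGER, condition_date TEXT);
""")
conn.executemany("INSERT INTO drug_exposure VALUES (?,?,?)",
                 [(1, 101, "2010-01-05"), (2, 101, "2010-02-01")])
conn.executemany("INSERT INTO condition_occurrence VALUES (?,?,?)",
                 [(1, 202, "2010-01-20"), (2, 202, "2010-03-10")])

# Count persons with an outcome within 30 days of a drug exposure -- the
# sort of drug/outcome computation surveillance repeats across thousands
# of pairs, and which becomes database-independent once data sit in a CDM.
rows = conn.execute("""
    SELECT d.drug_concept_id, c.condition_concept_id,
           COUNT(DISTINCT d.person_id)
    FROM drug_exposure d
    JOIN condition_occurrence c
      ON c.person_id = d.person_id
     AND julianday(c.condition_date) - julianday(d.exposure_date)
         BETWEEN 0 AND 30
    GROUP BY 1, 2
""").fetchall()
print(rows)   # -> [(101, 202, 1)]: only person 1 qualifies
```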
ERIC Educational Resources Information Center
Hodge, David R.; Jackson, Kelly F.; Vaughn, Michael G.
2012-01-01
This study assessed the effectiveness of culturally sensitive interventions (CSIs) ("N" = 10) designed to address substance use among minority youths. Study methods consisted of systematic search procedures, quality of study ratings, and meta-analytic techniques to gauge effects and evaluate publication bias. The results, across all measures and…
Fast analytical spectral filtering methods for magnetic resonance perfusion quantification.
Reddy, Kasireddy V; Mitra, Abhishek; Yalavarthy, Phaneendra K
2016-08-01
The deconvolution in perfusion weighted imaging (PWI) plays an important role in quantifying the MR perfusion parameters. The application of PWI to stroke and brain tumor studies has become standard clinical practice. The standard approaches for this deconvolution are oscillatory-limited singular value decomposition (oSVD) and frequency domain deconvolution (FDD). The FDD is widely recognized as the fastest approach currently available for deconvolution of MR perfusion data. In this work, two fast deconvolution methods (namely analytical Fourier filtering and analytical Showalter spectral filtering) are proposed. Through systematic evaluation, the proposed methods are shown to be computationally efficient and quantitatively accurate compared to FDD and oSVD.
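A minimal sketch of the frequency-domain deconvolution idea underlying FDD is shown below: the tissue curve is the arterial input function (AIF) convolved with a residue function, and the Fourier-space division is stabilized by a crude threshold filter. The filter and all signal shapes are illustrative assumptions, not the analytical spectral filters proposed in the paper:

```python
import numpy as np

def fdd_residue(aif, tissue, dt=1.0, thresh=0.15):
    """Frequency-domain deconvolution of tissue(t) = F * (AIF conv R)(t).

    The Fourier-space division is stabilized by zeroing frequencies where
    the AIF spectrum is weak -- a crude stand-in for a spectral filter.
    """
    n = 2 * len(aif)                    # zero-pad against circular wrap-around
    A, C = np.fft.rfft(aif, n), np.fft.rfft(tissue, n)
    H = np.zeros_like(A)
    keep = np.abs(A) > thresh * np.abs(A).max()
    H[keep] = 1.0 / A[keep]
    return np.fft.irfft(C * H, n)[: len(aif)] / dt   # estimate of F * R(t)

t = np.arange(60.0)                                  # dt = 1 s
aif = np.exp(-((t - 15.0) / 4.0) ** 2)               # toy Gaussian bolus
r_true = np.exp(-t / 8.0)                            # exponential residue, F = 1
tissue = np.convolve(aif, r_true)[:60]
print("recovered flow ~", fdd_residue(aif, tissue).max())   # ~ 1.0
```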
The Importance of Method Selection in Determining Product Integrity for Nutrition Research
Mudge, Elizabeth M; Brown, Paula N
2016-01-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823
The Importance of Method Selection in Determining Product Integrity for Nutrition Research.
Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N
2016-03-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.
ERIC Educational Resources Information Center
Maynard, Brandy R.; Wilson, Alyssa N.; Labuzienski, Elizabeth; Whiting, Seth W.
2018-01-01
Background and Aims: To examine the effects of mindfulness-based interventions on gambling behavior and symptoms, urges, and financial outcomes. Method: Systematic review and meta-analytic procedures were employed to search, select, code, and analyze studies conducted between 1980 and 2014, assessing the effects of mindfulness-based interventions…
How effects on health equity are assessed in systematic reviews of interventions.
Welch, Vivian; Tugwell, Peter; Petticrew, Mark; de Montigny, Joanne; Ueffing, Erin; Kristjansson, Betsy; McGowan, Jessie; Benkhalti Jandu, Maria; Wells, George A; Brand, Kevin; Smylie, Janet
2010-12-08
Enhancing health equity has now achieved international political importance with endorsement from the World Health Assembly in 2009. The failure of systematic reviews to consider effects on health equity is cited by decision-makers as a limitation to their ability to inform policy and program decisions. Our objective was to systematically review methods to assess effects on health equity in systematic reviews of effectiveness. We searched the following databases up to July 2, 2010: MEDLINE, PsycINFO, the Cochrane Methodology Register, CINAHL, Education Resources Information Center, Education Abstracts, Criminal Justice Abstracts, Index to Legal Periodicals, PAIS International, Social Services Abstracts, Sociological Abstracts, Digital Dissertations and the Health Technology Assessment Database. We searched SCOPUS on October 7, 2010 to identify articles that cited any of the included studies. We included empirical studies of cohorts of systematic reviews that assessed methods for measuring effects on health inequalities. Data were extracted using a pre-tested form by two independent reviewers. Risk of bias was appraised for included studies according to the potential for bias in selection and detection of systematic reviews. Thirty-four methodological studies were included. The methods used by these included studies were: 1) targeted approaches (n=22); 2) gap approaches (n=12); and 3) the gradient approach (n=1). Gender or sex was assessed in eight of 34 studies, socioeconomic status in ten studies, race/ethnicity in seven studies, age in seven studies, and low- and middle-income countries in 14 studies; two studies assessed multiple factors across which health inequity may exist. Only three studies provided a definition of health equity. Four methodological approaches to assessing effects on health equity were identified: 1) descriptive assessment of reporting and analysis in systematic reviews (all 34 studies used a type of descriptive method); 2) descriptive assessment of reporting and analysis in original trials (12/34 studies); 3) analytic approaches (10/34 studies); and 4) applicability assessment (11/34 studies). Neither the analytic nor the applicability approaches were reported transparently or in sufficient detail to judge their credibility. There is a need for improvement in conceptual clarity about the definition of health equity, description of sufficient detail about analytic approaches (including subgroup analyses), and transparent reporting of the judgments required for applicability assessments, in order to assess and report effects on health equity in systematic reviews.
ERIC Educational Resources Information Center
Avella, John T.; Kebritchi, Mansureh; Nunn, Sandra G.; Kanai, Therese
2016-01-01
Higher education for the 21st century continues to promote discoveries in the field through learning analytics (LA). The problem is that the rapid embrace of LA diverts educators' attention from clearly identifying requirements and implications of using LA in higher education. LA is a promising emerging field, yet higher education stakeholders…
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis). Overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874
Guise, Jeanne-Marie; Chang, Christine; Viswanathan, Meera; Glick, Susan; Treadwell, Jonathan; Umscheid, Craig A; Whitlock, Evelyn; Fu, Rongwei; Berliner, Elise; Paynter, Robin; Anderson, Johanna; Motu'apuaka, Pua; Trikalinos, Tom
2014-11-01
The purpose of this Agency for Healthcare Research and Quality Evidence-based Practice Center methods white paper was to outline approaches to conducting systematic reviews of complex multicomponent health care interventions. We performed a literature scan and conducted semistructured interviews with international experts who conduct research or systematic reviews of complex multicomponent interventions (CMCIs) or organizational leaders who implement CMCIs in health care. Challenges identified include lack of consistent terminology for such interventions (eg, complex, multicomponent, multidimensional, multifactorial); a wide range of approaches used to frame the review, from grouping interventions by common features to using more theoretical approaches; decisions regarding whether and how to quantitatively analyze the interventions, from holistic to individual component analytic approaches; and incomplete and inconsistent reporting of elements critical to understanding the success and impact of multicomponent interventions, such as methods used for implementation and the context in which interventions are implemented. We provide a framework for the spectrum of conceptual and analytic approaches to synthesizing studies of multicomponent interventions and an initial list of critical reporting elements for such studies. This information is intended to help systematic reviewers understand the options and tradeoffs available for such reviews. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kartashov, E. M.
1986-10-01
Analytical methods for solving boundary value problems for the heat conduction equation with heterogeneous boundary conditions on lines, on a plane, and in space are briefly reviewed. In particular, the method of dual integral equations and summator series is examined with reference to stationary processes. A table of principal solutions to dual integral equations and pair summator series is proposed which presents the known results in a systematic manner. Newly obtained results are presented in addition to the known ones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haab, Brian B.; Geierstanger, Bernhard H.; Michailidis, George
2005-08-01
Four different immunoassay and antibody microarray methods performed at four different sites were used to measure the levels of a broad range of proteins (N = 323 assays; 39, 88, 168, and 28 assays at the respective sites; 237 unique analytes) in the human serum and plasma reference specimens distributed by the Plasma Proteome Project (PPP) of the HUPO. The methods provided a means to (1) assess the level of systematic variation in protein abundances associated with blood preparation methods (serum, citrate-anticoagulated-plasma, EDTA-anticoagulated-plasma, or heparin-anticoagulated-plasma) and (2) evaluate the dependence on concentration of MS-based protein identifications from data sets using the HUPO specimens. Some proteins, particularly cytokines, had highly variable concentrations between the different sample preparations, suggesting specific effects of certain anticoagulants on the stability or availability of these proteins. The linkage of antibody-based measurements from 66 different analytes with the combined MS/MS data from 18 different laboratories showed that protein detection and the quality of MS data increased with analyte concentration. The conclusions from these initial analyses are that the optimal blood preparation method is variable between analytes and that the discovery of blood proteins by MS can be extended to concentrations below the ng/mL range under certain circumstances. Continued developments in antibody-based methods will further advance the scientific goals of the PPP.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities, and offers different views of how approaches to correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
Bao, Yijun; Gaylord, Thomas K
2016-11-01
Multifilter phase imaging with partially coherent light (MFPI-PC) is a promising new quantitative phase imaging method. However, the existing MFPI-PC method is based on the paraxial approximation. In the present work, an analytical nonparaxial partially coherent phase optical transfer function is derived. This enables the MFPI-PC to be extended to the realistic nonparaxial case. Simulations over a wide range of test phase objects as well as experimental measurements on a microlens array verify higher levels of imaging accuracy compared to the paraxial method. Unlike the paraxial version, the nonparaxial MFPI-PC with obliquity factor correction exhibits no systematic error. In addition, due to its analytical expression, the increase in computation time compared to the paraxial version is negligible.
The Water-Energy-Food Nexus: A systematic review of methods for nexus assessment
NASA Astrophysics Data System (ADS)
Albrecht, Tamee R.; Crootof, Arica; Scott, Christopher A.
2018-04-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex resource and development challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, while the WEF nexus offers a promising conceptual approach, the use of WEF nexus methods to systematically evaluate water, energy, and food interlinkages or support development of socially and politically-relevant resource policies has been limited. This paper reviews WEF nexus methods to provide a knowledge base of existing approaches and promote further development of analytical methods that align with nexus thinking. The systematic review of 245 journal articles and book chapters reveals that (a) use of specific and reproducible methods for nexus assessment is uncommon (less than one-third); (b) nexus methods frequently fall short of capturing interactions among water, energy, and food—the very linkages they conceptually purport to address; (c) assessments strongly favor quantitative approaches (nearly three-quarters); (d) use of social science methods is limited (approximately one-quarter); and (e) many nexus methods are confined to disciplinary silos—only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. To help overcome these limitations, we derive four key features of nexus analytical tools and methods—innovation, context, collaboration, and implementation—from the literature that reflect WEF nexus thinking. By evaluating existing nexus analytical approaches based on these features, we highlight 18 studies that demonstrate promising advances to guide future research. This paper finds that to address complex resource and development challenges, mixed-methods and transdisciplinary approaches are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and decision-makers.
ERIC Educational Resources Information Center
Choi, Samuel P. M.; Lam, S. S.; Li, Kam Cheong; Wong, Billy T. M.
2018-01-01
While learning analytics (LA) practices have been shown to be practical and effective, most of them require a huge amount of data and effort. This paper reports a case study which demonstrates the feasibility of practising LA at a low cost for instructors to identify at-risk students in an undergraduate business quantitative methods course.…
Wunderli, S; Fortunato, G; Reichmuth, A; Richard, Ph
2003-06-01
A new method to correct for the largest systematic influence in mass determination, air buoyancy, is outlined. A full description of the most relevant influence parameters is given and the combined measurement uncertainty is evaluated according to the ISO-GUM approach [1]. A new correction method for air buoyancy using an artefact is presented. This method has the advantage that only a mass artefact is used to correct for air buoyancy. The classical approach demands the determination of the air density and therefore suitable equipment to measure at least the air temperature, the air pressure and the relative air humidity within the demanded uncertainties (i.e. three independent measurement tasks have to be performed simultaneously). The calculated uncertainty is lower for the classical method; however, a field laboratory may not always be in possession of fully traceable measurement systems for these room-climate parameters. A comparison of three approaches applied to the calculation of the combined uncertainty of mass values is presented: namely, the classical determination of air buoyancy, the artefact method, and neglect of this systematic effect as proposed in the new EURACHEM/CITAC guide [2]. The artefact method is suitable for high-precision measurement in analytical chemistry and especially for the production of certified reference materials, reference values and analytical chemical reference materials. The method could also be used either for volume determination of solids or for air density measurement by an independent method.
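For contrast, a minimal sketch of the classical correction that the artefact method is designed to replace, assuming the balance is calibrated with steel standards of conventional density 8000 kg/m^3 and that the air density has already been determined; all numbers are illustrative:

```python
def buoyancy_corrected_mass(reading_g, rho_sample, rho_air=1.2, rho_cal=8000.0):
    """Classical air-buoyancy correction for a balance reading.

    Assumes calibration with standards of conventional density 8000 kg/m^3;
    densities in kg/m^3, reading and result in grams.
    """
    return reading_g * (1 - rho_air / rho_cal) / (1 - rho_air / rho_sample)

# Weighing 100 g of water (rho ~ 998 kg/m^3): the correction is about +0.1%,
# orders of magnitude above the readability of an analytical balance.
print(buoyancy_corrected_mass(100.0, rho_sample=998.0))   # ~100.105 g
```

The classical route needs rho_air from measured temperature, pressure and humidity, which is exactly the triple measurement burden the artefact method avoids.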
Klous, Miriam; Klous, Sander
2010-07-01
The aim of skin-marker-based motion analysis is to reconstruct the motion of a kinematical model from noisy measured motion of skin markers. Existing kinematic models for reconstruction of chains of segments can be divided into two categories: analytical methods that do not take joint constraints into account and numerical global optimization methods that do take joint constraints into account but require numerical optimization of a large number of degrees of freedom, especially when the number of segments increases. In this study, a new and largely analytical method for a chain of rigid bodies is presented, interconnected by spherical joints (chain-method). In this method, the number of generalized coordinates to be determined through numerical optimization is three, irrespective of the number of segments. This new method is compared with the analytical method of Veldpaus et al. [1988, "A Least-Squares Algorithm for the Equiform Transformation From Spatial Marker Co-Ordinates," J. Biomech., 21, pp. 45-54] (Veldpaus-method, a method of the first category) and the numerical global optimization method of Lu and O'Connor [1999, "Bone Position Estimation From Skin-Marker Co-Ordinates Using Global Optimization With Joint Constraints," J. Biomech., 32, pp. 129-134] (Lu-method, a method of the second category) regarding the effects of continuous noise simulating skin movement artifacts and regarding systematic errors in joint constraints. The study is based on simulated data to allow a comparison of the results of the different algorithms with true (noise- and error-free) marker locations. Results indicate a clear trend that the accuracy of the chain-method is higher than that of the Veldpaus-method and similar to that of the Lu-method. Because large parts of the equations in the chain-method can be solved analytically, the speed of convergence in this method is substantially higher than in the Lu-method. With only three segments, the average number of required iterations with the chain-method is 3.0+/-0.2 times lower than with the Lu-method when skin movement artifacts are simulated by applying a continuous noise model. When simulating systematic errors in joint constraints, the number of iterations for the chain-method was almost a factor 5 lower than the number of iterations for the Lu-method. However, the Lu-method performs slightly better than the chain-method. The RMSD value between the reconstructed and actual marker positions is approximately 57% of the systematic error on the joint center positions for the Lu-method, compared with 59% for the chain-method.
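The single-segment least-squares problem that the Veldpaus-method solves has a compact SVD-based solution (the Kabsch algorithm); the sketch below shows that textbook analogue for one segment, not the chain-method itself:

```python
import numpy as np

def rigid_fit(x, y):
    """Least-squares R, t with y ~ R @ x + t for 3 x N marker arrays.

    SVD (Kabsch) solution of the same problem the Veldpaus algorithm
    addresses: best rigid transformation between model and measured markers.
    """
    xc, yc = x.mean(axis=1, keepdims=True), y.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((y - yc) @ (x - xc).T)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # guard against reflection
    R = U @ D @ Vt
    return R, (yc - R @ xc).ravel()

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 6))                  # marker cluster in segment frame
th = 0.3
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0, 0.0, 1.0]])
y = R_true @ x + np.array([[1.0], [2.0], [0.5]]) + 0.005 * rng.normal(size=(3, 6))
R, t = rigid_fit(x, y)
print(np.round(R - R_true, 3), np.round(t, 3))   # rotation ~exact, t ~ [1, 2, 0.5]
```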
Errors in causal inference: an organizational schema for systematic error and random error.
Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji
2016-11-01
Our objective was to provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events, and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic error result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Galley, Chad R.; Rothstein, Ira Z.
2017-05-01
We utilize the dynamical renormalization group formalism to calculate the real space trajectory of a compact binary inspiral for long times via a systematic resummation of secularly growing terms. This method generates closed form solutions without orbit averaging, and the accuracy can be systematically improved. The expansion parameter is v^5 ν Ω (t - t0), where t0 is the initial time, t is the time elapsed, and Ω and v are the angular orbital frequency and initial speed, respectively; ν is the binary's symmetric mass ratio. We demonstrate how to apply the renormalization group method to resum solutions beyond leading order in two ways. First, we calculate the second-order corrections of the leading radiation reaction force, which involves highly nontrivial checks of the formalism (i.e., its renormalizability). Second, we show how to systematically include post-Newtonian corrections to the radiation reaction force. By avoiding orbit averaging, we gain predictive power and eliminate ambiguities in the initial conditions. Finally, we discuss how this methodology can be used to find analytic solutions to the spin equations of motion that are valid over long times.
PRELIMINARY ANALYSIS OF ALTERNATIVES FOR THE LONG TERM MANAGEMENT OF EXCESS MERCURY
This report describes the use of a systematic method for comparing options for the long term management and retirement of surplus mercury in the U.S. The method chosen is the Analytic Hierarchy Process (AHP) as embodied in the Expert Choice 2000 software. The goal, criteria, ...
Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd
2015-01-01
The ability to quantify levels of target analytes in biological samples accurately and precisely in biomonitoring involves the use of highly sensitive and selective instrumentation, such as tandem mass spectrometers, and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization, as well as the chromatographic response, of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
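One widely used way to quantify such matrix effects is the post-extraction addition scheme of Matuszewski and colleagues; the sketch below assumes that scheme, and the peak-area inputs and example numbers are hypothetical:

```python
def matrix_effect_figures(neat_area, post_spike_area, pre_spike_area):
    """Post-extraction addition figures of merit.

    neat_area:       analyte in pure solvent standard (set A)
    post_spike_area: analyte spiked into blank extract after extraction (set B)
    pre_spike_area:  analyte spiked into blank matrix before extraction (set C)
    """
    me = 100.0 * post_spike_area / neat_area        # <100% => ion suppression
    re = 100.0 * pre_spike_area / post_spike_area   # extraction recovery
    pe = 100.0 * pre_spike_area / neat_area         # overall process efficiency
    return me, re, pe

me, re, pe = matrix_effect_figures(1.00e6, 7.2e5, 6.1e5)
print(f"ME={me:.0f}%  RE={re:.0f}%  PE={pe:.0f}%")  # 28% ion suppression here
```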
Cognitive-Behavioral Therapies for Young People in Outpatient Treatment for Nonopioid Drug Use
ERIC Educational Resources Information Center
Filges, Trine; Jorgensen, Anne-Marie Klint
2018-01-01
Objectives: This review evaluates the evidence on the effects of cognitive-behavioral therapy (CBT) on drug use reduction for young people in treatment for nonopioid drug use. Method: We followed Campbell Collaboration guidelines to conduct a systematic review of randomized and nonrandomized trials. Meta-analytic methods were used to…
Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder
2018-05-01
The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to delineate the analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate, potentially affecting the chosen critical analytical attributes. Systematic optimization of the chosen critical method parameters was carried out with response surface methodology, employing a two-factor, three-level, 13-run face-centered cubic design. A method operable design region providing optimum method performance was earmarked using numerical and graphical optimization. The optimum method employed a mobile phase composition consisting of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min, with UV detection at 265 nm using a C18 column. Validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectrometry studies showed that SFN degrades under strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug per se was found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation, with products at retention times of 3.35, 3.65, 4.20 and 5.67 min. The absence of any significant change in the retention time of SFN and degradation products formed under different stress conditions confirmed the selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.
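The design geometry named above can be written out directly: the sketch below constructs a 13-run face-centered design for two coded factors and fits the usual quadratic response-surface model. The response values are invented for illustration and are not the paper's data:

```python
import numpy as np

# 13-run face-centered central composite design for two coded factors
# (x1 ~ mobile phase ratio, x2 ~ flow rate, both scaled to [-1, 1]):
# 4 factorial corners + 4 face-centered axial points + 5 center replicates.
factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # alpha = 1, hence three levels
center = [(0, 0)] * 5
X = np.array(factorial + axial + center, dtype=float)

# Fit the quadratic response-surface model
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
# to an invented response (say, peak tailing).
y = np.array([1.80, 1.40, 1.60, 1.10, 1.70, 1.20, 1.50, 1.30,
              1.25, 1.22, 1.28, 1.24, 1.26])
M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.round(coef, 3))                       # b0, b1, b2, b12, b11, b22
```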
Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai
2013-01-01
This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
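For flavor, a minimal FOSM sketch is given below; the RUL model and the input moments are hypothetical stand-ins, not the lithium-ion battery state-space model used in the paper:

```python
import numpy as np

def fosm(g, mu, sigma, h=1e-6):
    """First-order second-moment mean and std of g(X) for independent
    inputs X_i with means mu_i and standard deviations sigma_i."""
    mu = np.asarray(mu, dtype=float)
    grad = np.array([(g(mu + h * e) - g(mu - h * e)) / (2 * h)
                     for e in np.eye(len(mu))])      # central differences
    return g(mu), float(np.sqrt(np.sum((grad * np.asarray(sigma)) ** 2)))

# Hypothetical RUL model: remaining charge q (A h) over mean load i (A).
rul = lambda x: x[0] / x[1]
mean, std = fosm(rul, mu=[2.0, 0.5], sigma=[0.1, 0.05])
print(f"RUL ~ {mean:.2f} h +/- {std:.2f} h (first-order)")   # 4.00 +/- 0.45
```

FORM and inverse FORM refine this linearization around the most probable point rather than the mean, which is what buys the probability bounds the abstract mentions.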
Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech
2015-01-01
Nowadays, studies related to the distribution of metallic elements in biological samples are one of the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of metallic elements in various kinds of biological samples. However, in this literature there is a lack of articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature, (2) definitions, and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from that hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Irregular analytical errors in diagnostic testing - a novel concept.
Vogeser, Michael; Seger, Christoph
2018-02-23
In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample, whether quantitative or qualitative in nature, can compromise the result for any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition and terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to incorrect pipetting volume due to air bubbles in a sample), which can both lead to inaccurate results and risks for patients.
Cembrowski, G S; Hackney, J R; Carey, N
1993-04-01
The Clinical Laboratory Improvement Act of 1988 (CLIA 88) has dramatically changed proficiency testing (PT) practices, having mandated (1) satisfactory PT for certain analytes as a condition of laboratory operation, (2) fixed PT limits for many of these "regulated" analytes, and (3) an increased number of PT specimens (n = 5) for each testing cycle. For many of these analytes, the fixed limits are much broader than the previously employed Standard Deviation Index (SDI) criteria. Paradoxically, there may be less incentive to identify and evaluate analytically significant outliers to improve the analytical process. Previously described "control rules" to evaluate these PT results are unworkable, as they consider only two or three results. We used Monte Carlo simulations of Kodak Ektachem analyzers participating in PT to determine optimal control rules for the identification of PT results that are inconsistent with those from other laboratories using the same methods. The analysis of three representative analytes, potassium, creatine kinase, and iron, was simulated with varying intrainstrument and interinstrument standard deviations (si and sg, respectively) obtained from the College of American Pathologists (Northfield, Ill) Quality Assurance Services data and Proficiency Test data, respectively. Analytical errors were simulated in each of the analytes and evaluated in terms of multiples of the interlaboratory SDI. Simple control rules for detecting systematic and random error were evaluated with power function graphs (probability of error detection vs magnitude of error). Based on the simulation results, we recommend screening all analytes for the occurrence of two or more observations exceeding the same +/- 1 SDI limit. For any analyte satisfying this condition, the mean of the observations should be calculated. For analytes with sg/si ratios between 1.0 and 1.5, a significant systematic error is signaled by the mean exceeding 1.0 SDI. Significant random error is signaled by one observation exceeding the +/- 3-SDI limit or the range of the observations exceeding 4 SDIs. For analytes with higher sg/si, significant systematic or random error is signaled by violation of the screening rule (having at least two observations exceeding the same +/- 1 SDI limit). Random error can also be signaled by one observation exceeding the +/- 1.5-SDI limit or the range of the observations exceeding 3 SDIs. We present a practical approach to the workup of apparent PT errors.
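The recommended rules translate almost directly into code; the sketch below is a plain transcription of the screening and flagging rules as stated above, applied to an invented set of five SDI values:

```python
def evaluate_pt(sdis, sg_over_si):
    """Screen five PT results (in interlaboratory SDI units) using the
    control rules recommended in the abstract."""
    same_side = max(sum(s > 1.0 for s in sdis), sum(s < -1.0 for s in sdis))
    if same_side < 2:
        return ["no screening signal"]
    mean = sum(sdis) / len(sdis)
    spread = max(sdis) - min(sdis)
    flags = []
    if 1.0 <= sg_over_si <= 1.5:
        if abs(mean) > 1.0:
            flags.append("systematic error (|mean| > 1 SDI)")
        if any(abs(s) > 3.0 for s in sdis) or spread > 4.0:
            flags.append("random error (|obs| > 3 SDI or range > 4 SDI)")
    else:  # higher interlaboratory/intralaboratory ratio
        flags.append("systematic or random error (screening rule violated)")
        if any(abs(s) > 1.5 for s in sdis) or spread > 3.0:
            flags.append("random error (|obs| > 1.5 SDI or range > 3 SDI)")
    return flags or ["screened positive; no confirmation rule violated"]

print(evaluate_pt([1.2, 1.4, 0.3, -0.2, 0.8], sg_over_si=1.2))
```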
Species delimitation: A case study in a problematic ant taxon
USDA-ARS?s Scientific Manuscript database
Species delimitation has been invigorated as a discipline in systematics by an influx of new character sets, analytical methods, and conceptual advances. We use genetic data from 68 markers, combined with distributional, bioclimatic, and coloration information, to distinguish evolutionarily indepe...
Decision Support Model for Introduction of Gamification Solution Using AHP
2014-01-01
Gamification means the use of various elements of game design in nongame contexts, including workplace collaboration, marketing, education, military, and medical services. Gamification is effective both for improving workplace productivity and for motivating employees. However, introduction of gamification is not easy, because the planning and implementation processes of gamification are very complicated and require interdisciplinary knowledge of areas such as information systems, organizational behavior, and human psychology. The purpose of this paper is to provide a systematic decision-making method for the gamification process. This paper suggests decision criteria for the selection of a gamification platform to support a systematic decision-making process for management. The criteria are derived from previous works on gamification, the introduction of information systems, and the analytic hierarchy process. The weights of the decision criteria are calculated through a survey of professionals in games, information systems, and business administration. The analytic hierarchy process is used to derive the weights. The decision criteria and weights provided in this paper can support management in making a systematic decision on the selection of a gamification platform. PMID:24892075
Decision support model for introduction of gamification solution using AHP.
Kim, Sangkyun
2014-01-01
Gamification means the use of various elements of game design in nongame contexts, including workplace collaboration, marketing, education, military, and medical services. Gamification is effective both for improving workplace productivity and for motivating employees. However, introduction of gamification is not easy, because the planning and implementation processes of gamification are very complicated and require interdisciplinary knowledge of areas such as information systems, organizational behavior, and human psychology. The purpose of this paper is to provide a systematic decision-making method for the gamification process. This paper suggests decision criteria for the selection of a gamification platform to support a systematic decision-making process for management. The criteria are derived from previous works on gamification, the introduction of information systems, and the analytic hierarchy process. The weights of the decision criteria are calculated through a survey of professionals in games, information systems, and business administration. The analytic hierarchy process is used to derive the weights. The decision criteria and weights provided in this paper can support management in making a systematic decision on the selection of a gamification platform.
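A minimal sketch of the AHP machinery referred to above: priority weights from the principal eigenvector of a pairwise-comparison matrix, plus Saaty's consistency ratio. The criteria and judgments are hypothetical, not the survey results from the paper:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights via the principal eigenvector of a pairwise
    comparison matrix, plus Saaty's consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = len(A)
    ci = (eigvals[k].real - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index, small n
    return w, ci / ri

# Hypothetical criteria for a gamification platform: cost, ease of
# integration, engagement features (judgments on Saaty's 1-9 scale).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), f"CR={cr:.3f}")          # CR < 0.1 => consistent enough
```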
Quality Assurance of Chemical Measurements.
ERIC Educational Resources Information Center
Taylor, John K.
1981-01-01
Reviews aspects of quality control (methods to control errors) and quality assessment (verification that systems are operating within acceptable limits) including an analytical measurement system, quality control by inspection, control charts, systematic errors, and use of SRMs, materials for which properties are certified by the National Bureau…
Analytic Methods for Adjusting Subjective Rating Schemes.
ERIC Educational Resources Information Center
Cooper, Richard V. L.; Nelson, Gary R.
Statistical and econometric techniques of correcting for supervisor bias in models of individual performance appraisal were developed, using a variant of the classical linear regression model. Location bias occurs when individual performance is systematically overestimated or underestimated, while scale bias results when raters either exaggerate…
ANALYSIS OF ALTERNATIVES FOR THE LONG TERM MANAGEMENT OF EXCESS MERCURY
This paper describes a systematic method for comparing options for the long-term management of surplus elemental mercury in the U.S., using the Analytic Hierarchy Process (AHP) as embodied in commercially available Expert Choice software. A limited scope multi-criteria decisionan...
NASA Astrophysics Data System (ADS)
Barsan, Victor
2018-05-01
Several classes of transcendental equations, mainly eigenvalue equations associated with non-relativistic quantum mechanical problems, are analyzed. Siewert's systematic approach to such equations is discussed from the perspective of the new results recently obtained in the theory of generalized Lambert functions and of algebraic approximations of various special or elementary functions. Combining exact and approximate analytical methods, quite precise analytical outputs are obtained for apparently intractable problems. The results can be applied in quantum and classical mechanics, magnetism, elasticity, solar energy conversion, etc.
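As a concrete instance of the kind of eigenvalue equation at issue, the even-parity bound states of a finite square well solve xi*tan(xi) = sqrt(r^2 - xi^2); the numerical root-finding below (with an assumed well-strength parameter r = 4) supplies reference values against which algebraic approximations can be checked:

```python
import numpy as np
from scipy.optimize import brentq

r = 4.0                                   # assumed well-strength parameter
f = lambda xi: xi * np.tan(xi) - np.sqrt(r**2 - xi**2)

# One root per branch of tan(xi); brackets avoid the poles and stay
# inside the physical domain 0 < xi < r.
roots = []
for lo, hi in [(0.0, np.pi / 2), (np.pi, 3 * np.pi / 2)]:
    hi = min(hi, r)
    roots.append(brentq(f, lo + 1e-9, hi - 1e-9))
print(np.round(roots, 4))                 # dimensionless level wavenumbers
```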
Dynamic Looping of a Free-Draining Polymer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Felix X. -F.; Stinis, Panos; Qian, Hong
Here, we revisit the celebrated Wilemski-Fixman (WF) treatment for the looping time of a free-draining polymer. The WF theory introduces a sink term into the Fokker-Planck equation for the 3(N+1)-dimensional Ornstein-Uhlenbeck process of polymer dynamics, which accounts for the appropriate boundary condition due to the formation of a loop. The assumption for WF theory is considerably relaxed. A perturbation method approach is developed that justifies and generalizes the previous results using either a delta sink or a Heaviside sink. For both types of sinks, we show that under the condition of a small dimensionless ε, the ratio of capture radius to the Kuhn length, we are able to systematically produce all known analytical and asymptotic results obtained by other methods. This includes most notably the transition regime between the N² scaling of Doi, and the N√N/ε scaling of Szabo, Schulten, and Schulten. The mathematical issue at play is the nonuniform convergence of ε → 0 and N → ∞, the latter being an inherent part of the theory of a Gaussian polymer. Our analysis yields a novel term in the analytical expression for the looping time with small ε, which was previously unknown. Monte Carlo numerical simulations corroborate the analytical findings. The systematic method developed here can be applied to other systems modeled by multidimensional Smoluchowski equations.
ERIC Educational Resources Information Center
Farahmand, Farahnaz K.; Grant, Kathryn E.; Polo, Antonio J.; Duffy, Sophia N.; Dubois, David L.
2011-01-01
A systematic and meta-analytic review was conducted of the effectiveness of school-based mental health and behavioral programs for low-income, urban youth. Applying criteria from an earlier systematic review (Rones & Hoagwood, 2000) of such programs for all populations indicated substantially fewer effective programs for low-income, urban…
NASA Astrophysics Data System (ADS)
Bascetin, A.
2007-04-01
The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region of Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. It is also found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
A Structure for Pedagogical Art Criticism.
ERIC Educational Resources Information Center
Anderson, Tom
1988-01-01
Develops method for incorporating the intuitive and affective with intellectual and analytic components for understanding works of art. States that the premises for such a systematization include both Arnheim's claim that two basic interdependent procedures of intelligent cognition are intuition and intellect (1986); and Harry Broudy's (1972)…
Validating Performance Level Descriptors (PLDs) for the AP® Environmental Science Exam
ERIC Educational Resources Information Center
Reshetar, Rosemary; Kaliski, Pamela; Chajewski, Michael; Lionberger, Karen
2012-01-01
This presentation summarizes a pilot study conducted after the May 2011 administration of the AP Environmental Science Exam. The study used analytical methods based on scaled anchoring as input to a Performance Level Descriptor validation process that solicited systematic input from subject matter experts.
Analytical N beam position monitor method
NASA Astrophysics Data System (ADS)
Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.
2017-11-01
Measurement and correction of focusing errors is of great importance for the performance and machine protection of circular accelerators. Furthermore, the LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β-function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
[Developments in preparation and experimental method of solid phase microextraction fibers].
Yi, Xu; Fu, Yujie
2004-09-01
Solid phase microextraction (SPME) is a simple and effective adsorption and desorption technique that concentrates volatile or nonvolatile compounds from liquid samples or from the headspace of samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods, and it provides many advantages, such as a wide linear range, low solvent and sample consumption, short analysis times, low detection limits, and simple apparatus. The theory of SPME is introduced, covering both equilibrium and non-equilibrium treatments. Novel developments in fiber preparation methods and related experimental techniques are discussed. In addition to commercial fiber preparation, newly developed fabrication techniques, such as sol-gel coating, electrodeposition, carbon-based adsorption, and high-temperature epoxy immobilization, are presented. The effects of extraction mode, selection of fiber coating, optimization of operating conditions, method sensitivity and precision, and systematic automation are taken into consideration in the analytical process of SPME. Finally, a brief perspective on SPME is offered.
Martin, Jeffrey D.; Norman, Julia E.; Sandstrom, Mark W.; Rose, Claire E.
2017-09-06
U.S. Geological Survey monitoring programs extensively used two analytical methods, gas chromatography/mass spectrometry and liquid chromatography/mass spectrometry, to measure pesticides in filtered water samples during 1992–2012. In October 2012, the monitoring programs began using direct aqueous-injection liquid chromatography tandem mass spectrometry as a new analytical method for pesticides. The change in analytical methods, however, has the potential to inadvertently introduce bias in analysis of datasets that span the change.

A field study was designed to document performance of the new method in a variety of stream-water matrices and to quantify any potential changes in measurement bias or variability that could be attributed to changes in analytical methods. The goals of the field study were to (1) summarize performance (bias and variability of pesticide recovery) of the new method in a variety of stream-water matrices; (2) compare performance of the new method in laboratory blank water (laboratory reagent spikes) to that in a variety of stream-water matrices; (3) compare performance (analytical recovery) of the new method to that of the old methods in a variety of stream-water matrices; (4) compare pesticide detections and concentrations measured by the new method to those of the old methods in a variety of stream-water matrices; (5) compare contamination measured by field blank water samples in old and new methods; (6) summarize the variability of pesticide detections and concentrations measured by the new method in field duplicate water samples; and (7) identify matrix characteristics of environmental water samples that adversely influence the performance of the new method. Stream-water samples and a variety of field quality-control samples were collected at 48 sites in the U.S. Geological Survey monitoring networks during June–September 2012. Stream sites were located across the United States and included sites in agricultural and urban land-use settings, as well as sites on major rivers.

The results of the field study identified several challenges for the analysis and interpretation of data analyzed by both old and new methods, particularly when data span the change in methods and are combined for analysis of temporal trends in water quality. The main challenges identified are large (greater than 30 percent), statistically significant differences in analytical recovery, detection capability, and (or) measured concentrations for selected pesticides. These challenges are documented and discussed, but specific guidance or statistical methods to resolve these differences in methods are beyond the scope of the report. The results of the field study indicate that the implications of the change in analytical methods must be assessed individually for each pesticide and method.

Understanding the possible causes of the systematic differences in concentrations between methods that remain after recovery adjustment might be necessary to determine how to account for the differences in data analysis. Because recoveries for each method are independently determined from separate reference standards and spiking solutions, the differences might be due to an error in one of the reference standards or solutions or some other basic aspect of standard procedure in the analytical process. Further investigation of the possible causes is needed, which will lead to specific decisions on how to compensate for these differences in concentrations in data analysis.
In the event that further investigations do not provide insight into the causes of systematic differences in concentrations between methods, the authors recommend continuing to collect and analyze paired environmental water samples by both old and new methods. This effort should be targeted to seasons, sites, and expected concentrations to supplement those concentrations already assessed and to compare the ongoing analytical recovery of old and new methods to those observed in the summer and fall of 2012.
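As a rough illustration of the kind of paired-method comparison described above, the sketch below computes a median relative percent difference and a Wilcoxon signed-rank test for paired concentrations. This is not the USGS analysis, and all values are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical paired concentrations (micrograms/L) for one pesticide measured
# in the same stream samples by the old (GC/MS) and new (LC-MS/MS) methods.
old = np.array([0.12, 0.45, 0.08, 0.91, 0.33, 0.27, 0.64, 0.19])
new = np.array([0.15, 0.52, 0.09, 1.10, 0.38, 0.30, 0.77, 0.22])

# Median relative percent difference between methods.
rpd = 100 * (new - old) / ((new + old) / 2)
print("median RPD: %.1f%%" % np.median(rpd))

# Wilcoxon signed-rank test for a systematic difference in paired samples.
stat, p = stats.wilcoxon(new, old)
print("Wilcoxon p-value: %.4f" % p)
```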
Does Parent-Child Interaction Therapy Reduce Future Physical Abuse? A Meta-Analysis
ERIC Educational Resources Information Center
Kennedy, Stephanie C.; Kim, Johnny S.; Tripodi, Stephen J.; Brown, Samantha M.; Gowdy, Grace
2016-01-01
Objective: To use meta-analytic techniques to evaluate the effectiveness of parent-child interaction therapy (PCIT) at reducing future physical abuse among physically abusive families. Methods: A systematic search identified six eligible studies. Outcomes of interest were physical abuse recurrence, child abuse potential, and parenting stress.…
Conceptualizing the Critical Path Linked by Teacher Commitment
ERIC Educational Resources Information Center
Sun, Jingping
2015-01-01
Purpose: The purpose of this paper is to propose a critical path through which school leadership travels to students by highlighting the importance of teacher commitment. Design/methodology/approach: Using both meta-analytic and narrative review methods, this paper systematically reviews the evidence in the past 20 years about the…
A Meta-Analysis of Interventions to Reduce Adolescent Cannabis Use
ERIC Educational Resources Information Center
Bender, Kimberly; Tripodi, Stephen J.; Sarteschi, Christy; Vaughn, Michael G.
2011-01-01
Objective: This meta-analytic review assesses the effectiveness of substance abuse interventions to reduce adolescent cannabis use. Method: A systematic search identified 15 randomized controlled evaluations of interventions to reduce adolescent cannabis use published between 1960 and 2008. The primary outcome variables, frequency of cannabis use,…
The US Environmental Protection Agency (EPA) has conducted an HIA at the German Gerena Community School in Springfield, MA. HIA is a six-step systematic process that uses an array of data sources, analytic methods and stakeholder input to determine the potential health effects of...
Javed, Bushra; Padfield, Philip; Sperrin, Matthew; Simpson, Angela; Mills, E N Clare
2017-08-01
Food regulations require that tree nuts and derived ingredients are included on food labels in order to help individuals with IgE-mediated allergies to avoid them. However, there is no consensus regarding which tree nut species should be included in this definition and specified on food labels. Allergen detection methods used for monitoring foods target allergen molecules, but it is not clear which molecules are the most relevant to choose. A modified population-exposure-comparators-outcome (PECO) approach has been developed to systematically review the evidence regarding (1) which allergenic tree nuts should be included in food allergen labelling lists and (2) which clinically relevant allergen molecules should be used as analytical targets. A search strategy and criteria against which the evidence will be evaluated have been developed. The resulting evidence will be used to rank tree nuts with regard to their ability to cause IgE-mediated allergies, and allergen molecules with regard to their capacity to elicit an allergic reaction. The results of the systematic review will enable risk assessors and managers to identify tree nut species that should be included in food allergen labelling lists and to ensure that analytical methods for the determination of allergens in foods target appropriate molecules. Copyright © 2017. Published by Elsevier Ltd.
Uncertainties in the deprojection of the observed bar properties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Yanfei; Shen, Juntai; Li, Zhao-Yu, E-mail: jshen@shao.ac.cn
2014-08-10
In observations, it is important to deproject the two fundamental quantities characterizing a bar, i.e., its length (a) and ellipticity (e), to face-on values before any careful analyses. However, a systematic estimate of the uncertainties of the commonly used deprojection methods is still lacking. Simulated galaxies are well suited to this study. We project two simulated barred galaxies onto a two-dimensional (2D) plane with different bar orientations and disk inclination angles (i). Bar properties are measured and deprojected with the popular deprojection methods in the literature. Generally speaking, deprojection uncertainties increase with increasing i. All of the deprojection methods behave badly when i is larger than 60°, due to the vertical thickness of the bar. Thus, future statistical studies of barred galaxies should exclude galaxies more inclined than 60°. At moderate inclination angles (i ≤ 60°), 2D deprojection methods (analytical and image stretching) and Fourier-based methods (Fourier decomposition and bar-interbar contrast) perform reasonably well with uncertainties ∼10% in both the bar length and ellipticity, whereas the uncertainties of the one-dimensional (1D) analytical deprojection can be as high as 100% in certain extreme cases. We find that different bar measurement methods show systematic differences in the deprojection uncertainties. We further discuss the deprojection uncertainty factors with emphasis on the most important one, i.e., the three-dimensional structure of the bar itself. We construct two triaxial toy bar models that qualitatively reproduce the results of the 1D and 2D analytical deprojections; they confirm that the vertical thickness of the bar is the main source of uncertainties.
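For concreteness, here is a minimal sketch of the standard 1D analytical deprojection evaluated in the abstract: the bar is treated as an infinitely thin line in the disk plane, and the component perpendicular to the line of nodes is stretched by 1/cos(i). The function name and numbers are illustrative; the neglect of vertical thickness in this formula is precisely why such methods degrade at high inclination.

```python
import numpy as np

def deproject_bar_1d(a_obs, phi_obs_deg, incl_deg):
    """1D analytical deprojection of an observed bar length.

    a_obs       : observed bar semi-major axis
    phi_obs_deg : observed angle between the bar and the disk line of nodes
    incl_deg    : disk inclination (0 = face-on)
    """
    phi, i = np.radians(phi_obs_deg), np.radians(incl_deg)
    x = a_obs * np.cos(phi)              # component along the line of nodes
    y = a_obs * np.sin(phi) / np.cos(i)  # stretched perpendicular component
    return np.hypot(x, y)

# At i = 60 deg the correction reaches a factor of 2 for a bar
# perpendicular to the line of nodes:
print(deproject_bar_1d(1.0, 90.0, 60.0))  # -> 2.0
```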
Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.
Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C
2016-09-01
Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, the six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic compound (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While the static sampling techniques exhibited extraction yields (approx. 10-20%) sufficient for reliable use down to approx. 100 ng/L, the enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram-per-liter range. RSDs for all techniques were below 27%. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs; individual methods within each class showed smaller deviations, and the smallest influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
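Method detection limits of the kind compared above are commonly estimated from replicate low-level spikes. The sketch below uses the Student-t-based definition (as used, e.g., by the US EPA); the replicate values are hypothetical, and the paper's exact MDL protocol may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate measurements (ng/L) of a low-level VOC spike,
# one value per replicate extraction with a given headspace technique.
replicates = np.array([104.0, 98.5, 110.2, 95.8, 102.3, 99.1, 107.4])

n = len(replicates)
s = replicates.std(ddof=1)

# One common MDL definition: the 99th-percentile Student-t quantile
# times the standard deviation of the replicates.
mdl = stats.t.ppf(0.99, df=n - 1) * s
rsd = 100 * s / replicates.mean()

print("MDL: %.1f ng/L, RSD: %.1f%%" % (mdl, rsd))
```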
Khurana, Rajneet Kaur; Rao, Satish; Beg, Sarwar; Katare, O.P.; Singh, Bhupinder
2016-01-01
The present work aims at the systematic development of a simple, rapid and highly sensitive densitometry-based thin-layer chromatographic method for the quantification of mangiferin in bioanalytical samples. Initially, the quality target method profile was defined and critical analytical attributes (CAAs) earmarked, namely, retardation factor (Rf), peak height, capacity factor, theoretical plates and separation number. Face-centered cubic design was selected for optimization of volume loaded and plate dimensions as the critical method parameters selected from screening studies employing D-optimal and Plackett–Burman design studies, followed by evaluating their effect on the CAAs. The mobile phase containing a mixture of ethyl acetate : acetic acid : formic acid : water in a 7 : 1 : 1 : 1 (v/v/v/v) ratio was finally selected as the optimized solvent for apt chromatographic separation of mangiferin at 262 nm with Rf 0.68 ± 0.02 and all other parameters within the acceptance limits. Method validation studies revealed high linearity in the concentration range of 50–800 ng/band for mangiferin. The developed method showed high accuracy, precision, ruggedness, robustness, specificity, sensitivity, selectivity and recovery. In a nutshell, the bioanalytical method for analysis of mangiferin in plasma revealed the presence of well-resolved peaks and high recovery of mangiferin. PMID:26912808
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas
2014-11-01
In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines, including selectivity, recovery, matrix effects, accuracy and precision, stability, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM), with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW; however, CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except MAM. Systematic investigation of accuracy at QC MED, comparing calibration with the multi-analyte mixture to samples containing only single analytes, revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples with a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
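The accuracy comparison described above reduces to computing bias and CV per QC level under the two calibration schemes. A minimal sketch with invented QC results, using the acceptance limits typical of such validation guidelines:

```python
import numpy as np

# Hypothetical QC MED results (ng/mL) for one analyte, quantified against a
# multi-analyte calibration versus single-analyte standards.
nominal = 50.0
qc_multi = np.array([48.9, 51.2, 49.5, 52.0, 47.8, 50.6])
qc_single = np.array([49.4, 50.8, 49.9, 51.5, 48.3, 50.2])

for name, qc in (("multi-analyte", qc_multi), ("single-analyte", qc_single)):
    bias = 100 * (qc.mean() - nominal) / nominal
    cv = 100 * qc.std(ddof=1) / qc.mean()
    # Typical acceptance: bias within +/-15%, CV below 15%.
    print("%s: bias %+.1f%%, CV %.1f%%" % (name, bias, cv))
```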
Chemical analysis of acoustically levitated drops by Raman spectroscopy.
Tuckermann, Rudolf; Puskar, Ljiljana; Zavabeti, Mahta; Sekine, Ryo; McNaughton, Don
2009-07-01
An experimental apparatus combining Raman spectroscopy with acoustic levitation, Raman acoustic levitation spectroscopy (RALS), is investigated in the field of physical and chemical analytics. Whereas acoustic levitation enables the contactless handling of microsized samples, Raman spectroscopy offers the advantage of a noninvasive method without complex sample preparation. After carrying out systematic tests to probe the sensitivity of the technique to drop size, shape, and position, RALS was successfully applied to monitoring sample dilution and preconcentration, evaporation, crystallization, an acid-base reaction, and analytes in a surface-enhanced Raman spectroscopy colloidal suspension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kassemi, S.A.
1988-04-01
High Rayleigh number convection in a rectangular cavity with insulated horizontal surfaces and differentially heated vertical walls was analyzed for an arbitrary aspect ratio smaller than or equal to unity. Unlike previous analytical studies, a systematic method of solution based on linearization technique and analytical iteration procedure was developed to obtain approximate closed-form solutions for a wide range of aspect ratios. The predicted velocity and temperature fields are shown to be in excellent agreement with available experimental and numerical data.
Ho, Robin ST; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel YS; Chung, Vincent CH
2015-01-01
Background: Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. Aims: To assess the methodological quality of MAs on COPD treatments. Methods: A cross-sectional study on MAs of COPD trials. MAs published during 2000–2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Results: Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. Conclusions: The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods. PMID:25569783
OPTHYLIC: An Optimised Tool for Hybrid Limits Computation
NASA Astrophysics Data System (ADS)
Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée
2018-05-01
A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
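The frequentist core of the CLs prescription for a single counting channel is easy to state. The sketch below omits the Bayesian treatment of systematic uncertainties that gives OPTHYLIC its "hybrid" character, and the yields are hypothetical.

```python
from scipy import stats

def cls_counting(n_obs, s, b):
    """CLs for a single-bin counting experiment (no systematics).

    p_sb = P(n <= n_obs | s + b), p_b = P(n <= n_obs | b);
    the signal hypothesis is excluded at 95% CL when CLs < 0.05.
    """
    p_sb = stats.poisson.cdf(n_obs, s + b)
    p_b = stats.poisson.cdf(n_obs, b)
    return p_sb / p_b

# Hypothetical channel: 3 events observed over an expected background of 2.8.
for s in (1.0, 5.0, 9.0):
    print("s = %.0f -> CLs = %.3f" % (s, cls_counting(3, s, 2.8)))
```

Scanning the signal rate until CLs crosses 0.05 yields the observed upper limit; in the full tool, nuisance parameters for the systematic uncertainties would be marginalized with priors before this scan.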
Cylindrical optical resonators: fundamental properties and bio-sensing characteristics
NASA Astrophysics Data System (ADS)
Khozeymeh, Foroogh; Razaghi, Mohammad
2018-04-01
In this paper, a detailed theoretical analysis of cylindrical resonators is presented. As illustrated, these kinds of resonators can be used as optical bio-sensing devices. The proposed structure is analyzed using an analytical method based on Lam's approximation. This method is systematic and simplifies the tedious process of whispering-gallery mode (WGM) wavelength analysis in optical cylindrical biosensors, making it possible to analyze higher radial orders of high-angular-momentum WGMs. Using closed-form analytical equations, resonance wavelengths of higher radial and angular order WGMs of TE and TM polarization waves are calculated. It is shown that high-angular-momentum WGMs are more appropriate for bio-sensing applications. Some of the calculations are performed using a numerical nonlinear Newton method, with a match of 99.84% between the analytical and numerical methods. To verify the validity of the calculations, Meep simulations based on the finite difference time domain (FDTD) method are performed, giving a match of 96.70% between the analytical and FDTD results. The analytical predictions are also in good agreement with other experimental work (99.99% match). These results validate the proposed analytical modelling for the fast design of optical cylindrical biosensors. It is shown that by extending the proposed two-layer resonator analysis scheme, a three-layer cylindrical resonator structure can be studied as well. Moreover, the method enables fast sensitivity optimization in cylindrical resonator-based biosensors. The sensitivity of the WGM resonances is analyzed as a function of the structural parameters of the cylindrical resonators. Based on the results, fourth radial order WGMs, with a resonator radius of 50 μm, display the highest bulk refractive index sensitivity of 41.50 nm/RIU.
Mihura, Joni L; Meyer, Gregory J; Dumitrascu, Nicolae; Bombel, George
2016-01-01
We respond to Tibon Czopp and Zeligman's (2016) critique of our systematic reviews and meta-analyses of 65 Rorschach Comprehensive System (CS) variables published in Psychological Bulletin (2013). The authors endorsed our supportive findings but critiqued the same methodology when used for the 13 unsupported variables. Unfortunately, their commentary was based on significant misunderstandings of our meta-analytic method and results, such as thinking we used introspectively assessed criteria in classifying levels of support and reporting only a subset of our externally assessed criteria. We systematically address their arguments that our construct label and criterion variable choices were inaccurate and, therefore, meta-analytic validity for these 13 CS variables was artificially low. For example, the authors created new construct labels for these variables that they called "the customary CS interpretation," but did not describe their methodology nor provide evidence that their labels would result in better validity than ours. They cite studies they believe we should have included; we explain how these studies did not fit our inclusion criteria and that including them would have actually reduced the relevant CS variables' meta-analytic validity. Ultimately, criticisms alone cannot change meta-analytic support from negative to positive; Tibon Czopp and Zeligman would need to conduct their own construct validity meta-analyses.
Sensitivity and systematics of calorimetric neutrino mass experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nucciotti, A.; Cremonesi, O.; Ferri, E.
2009-12-16
A large calorimetric neutrino mass experiment using thermal detectors is expected to play a crucial role in the challenge of directly assessing the neutrino mass. We discuss and compare here two approaches for estimating the experimental sensitivity of such an experiment. The first method uses an analytic formulation and allows one to readily obtain a close estimate over a wide range of experimental configurations. The second method is based on a Monte Carlo technique and is more precise and reliable. The Monte Carlo approach is then exploited to study some sources of systematic uncertainty peculiar to calorimetric experiments. Finally, the tools are applied to investigate the optimal experimental configuration of the MARE project.
Roemmelt, Andreas T; Steuer, Andrea E; Poetzsch, Michael; Kraemer, Thomas
2014-12-02
Forensic and clinical toxicological screening procedures employ liquid chromatography-tandem mass spectrometry (LC-MS/MS) techniques with information-dependent acquisition (IDA) approaches more and more often. It is known that the complexity of a sample and the IDA settings might prevent important compounds from being triggered. Therefore, data-independent acquisition (DIA) methods should be more suitable for systematic toxicological analysis (STA). The DIA method sequential window acquisition of all theoretical fragment-ion spectra (SWATH), which uses Q1 windows of 20-35 Da for data-independent fragmentation, was systematically investigated for its suitability for STA. The quality of SWATH-generated mass spectra was evaluated with regard to mass error, relative abundance of the fragments, and library hits. With the Q1 window set to 20-25 Da, several precursors pass Q1 at the same time and are fragmented, impairing the library search algorithms to different extents: the forward fit was less affected than the reverse fit and purity fit. Mass error was not affected. The relative abundance of the fragments was concentration dependent for some analytes and was influenced by cofragmentation, especially of deuterated analogues. The detection rate of IDA compared to SWATH was also investigated in a forced coelution experiment (up to 20 analytes coeluting): even using several different IDA settings, IDA failed to trigger relevant compounds. Screening results of 382 authentic forensic cases revealed that SWATH's detection rate was superior to that of IDA, which failed to trigger ∼10% of the analytes.
A General Methodology for the Translation of Behavioral Terms into Vernacular Languages.
Virues-Ortega, Javier; Martin, Neil; Schnerch, Gabriel; García, Jesús Ángel Miguel; Mellichamp, Fae
2015-05-01
As the field of behavior analysis expands internationally, the need for comprehensive and systematic glossaries of behavioral terms in the vernacular languages of professionals and clients becomes crucial. We created a Spanish-language glossary of behavior-analytic terms by developing and employing a systematic set of decision-making rules for the inclusion of terms. We then submitted the preliminary translation to a multi-national advisory committee to evaluate the transnational acceptability of the glossary. This method led to a translated corpus of over 1200 behavioral terms. The end products of this work included the following: (a) a Spanish-language glossary of behavior analytic terms that are publicly available over the Internet through the Behavior Analyst Certification Board and (b) a set of translation guidelines summarized here that may be useful for the development of glossaries of behavioral terms into other vernacular languages.
Wang, Hai-Feng; Lu, Hai; Li, Jia; Sun, Guo-Hua; Wang, Jun; Dai, Xin-Hua
2014-02-01
This paper reports differential scanning calorimetry-thermogravimetry curves and temperature-programmed infrared (IR) absorption spectra of coal acquired with a combined simultaneous thermal analysis-IR spectrometer, with the gaseous products identified by IR spectrometry. The emphasis is on the high-temperature combustion-IR absorption method, a convenient and accurate technique that measures the sulfur content of coal indirectly by determining the sulfur dioxide content of the evolved gas mixture by IR absorption. It was demonstrated that, when the instrument was calibrated with various pure sulfur-containing compounds and with certified reference materials (CRMs) for coal, a large deviation appeared in the measured sulfur contents, indicating that the difference in chemical speciation of sulfur between the CRM and the analyte produces a systematic error. The time-IR absorption curve was therefore used to analyze the sulfur composition at low and high temperatures, and the sulfur content of the coal sample was then determined using a coal CRM with a closely matching sulfur composition, eliminating the systematic error caused by speciation differences between CRM and analyte. In addition, the masses of CRM and analyte were adjusted so that their sulfur masses were equal, and the CRM and the analyte were measured alternately. This single-point calibration reduced the effect of IR detector drift and improved the repeatability of results compared with the conventional multi-point calibration using curves of signal intensity versus sulfur mass. The sulfur contents (standard deviations) of an anthracite coal and a bituminous coal with low sulfur content determined by the modified method were 0.345% (0.004%) and 0.372% (0.008%), respectively, with expanded uncertainties (U, k = 2) of 0.019% and 0.021%. The two main modifications, calibration using a coal CRM with a similar composition of low-temperature and high-temperature sulfur, and single-point calibration alternating CRM and analyte, give the high-temperature combustion-IR absorption method an accuracy clearly better than that of the ASTM method, so the modified method holds good potential for the analysis of sulfur content.
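The single-point calibration described above amounts to scaling the certified CRM content by the ratio of integrated SO2-IR signals and the ratio of burned masses. A minimal sketch with hypothetical numbers (the paper does not give this arithmetic explicitly):

```python
# Single-point calibration: CRM and analyte are burned alternately with
# masses chosen so that the absolute sulfur mass is roughly equal, and the
# sample content is scaled from the ratio of integrated SO2-IR signals.
crm_sulfur_frac = 0.00372   # certified sulfur mass fraction of the CRM
crm_mass_mg = 500.0         # mass of CRM burned
sample_mass_mg = 520.0      # mass of analyte burned
signal_crm = 1.000          # integrated IR absorbance, CRM run (arb. units)
signal_sample = 0.962       # integrated IR absorbance, sample run

sample_sulfur_frac = (signal_sample / signal_crm) * \
    crm_sulfur_frac * crm_mass_mg / sample_mass_mg

print("sulfur content: %.3f%%" % (100 * sample_sulfur_frac))
```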
Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.
Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo
2018-02-23
The study aim was to evaluate the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-TnI, on the DxI platform, and to compare it with the Access AccuTnI+3 method and with the high-sensitivity (hs) cTnI method for the ARCHITECT platform. The limits of blank (LoB), detection (LoD) and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and comparison of cTnI results, both heparinized plasma samples, collected from healthy subjects and patients with cardiac diseases, and quality control samples distributed in external quality assessment programs were used. The LoB, LoD, and LoQ at 20% and 10% CV of the Access hs-cTnI method were 0.6, 1.3, 2.1 and 5.3 ng/L, respectively. The Access hs-cTnI method showed analytical performance significantly better than that of the Access AccuTnI+3 method and similar to that of the hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with the Access hs-cTnI method showed close linear regressions with both the Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between these methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between values measured in EDTA plasma and heparin plasma samples. In conclusion, the Access hs-cTnI method has analytical sensitivity significantly improved over that of the Access AccuTnI+3 method and similar to that of the high-sensitivity method on the ARCHITECT platform.
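LoB/LoD estimation of this kind typically follows a CLSI EP17-style scheme. Below is a minimal parametric sketch with hypothetical replicate data; the study's exact protocol, replicate counts, and nonparametric options may differ.

```python
import numpy as np

# Hypothetical cTnI measurements (ng/L): replicates of a blank sample and
# of a low-concentration sample.
blank = np.array([0.2, 0.4, 0.1, 0.3, 0.5, 0.2, 0.3, 0.4])
low = np.array([1.4, 1.9, 1.1, 1.6, 1.3, 1.8, 1.5, 1.2])

lob = blank.mean() + 1.645 * blank.std(ddof=1)  # limit of blank
lod = lob + 1.645 * low.std(ddof=1)             # limit of detection

print("LoB = %.2f ng/L, LoD = %.2f ng/L" % (lob, lod))
```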
NASA Astrophysics Data System (ADS)
Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb
2017-10-01
In addition to numerous planning and executive challenges, underground excavation in urban areas is always accompanied by destructive effects, especially at the ground surface; ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for its estimation. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values were 1.86, 2.02 and 1.52 cm, respectively. Comparison of these predictions with the actual instrumentation data was used to quantify the uncertainty of each model. The numerical model, with a relative error of 3.8%, best matched reality, whereas the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
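The empirical method named above, Peck's, models the transverse settlement trough as a Gaussian. A minimal sketch, reusing the maximum settlement the abstract reports for the Peck estimate (1.86 cm) and an assumed trough-width parameter (not given in the abstract):

```python
import numpy as np

def peck_settlement(x, s_max, i):
    """Peck's empirical Gaussian settlement trough.

    x     : horizontal distance from the tunnel centerline (m)
    s_max : maximum surface settlement above the tunnel axis (m)
    i     : trough width parameter, distance to the inflection point (m)
    """
    return s_max * np.exp(-x**2 / (2 * i**2))

# Hypothetical profile: s_max = 1.86 cm, assumed trough width i = 12 m.
x = np.linspace(-40, 40, 9)
print(peck_settlement(x, 0.0186, 12.0).round(4))
```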
NASA Astrophysics Data System (ADS)
Krishnan, Karthik; Reddy, Kasireddy V.; Ajani, Bhavya; Yalavarthy, Phaneendra K.
2017-02-01
CT and MR perfusion weighted imaging (PWI) enable quantification of perfusion parameters in stroke studies. These parameters are calculated from the residual impulse response function (IRF) based on a physiological model for tissue perfusion. The standard approach for estimating the IRF is deconvolution using oscillatory-limited singular value decomposition (oSVD) or Frequency Domain Deconvolution (FDD), with FDD widely recognized as the fastest deconvolution approach currently available for CT Perfusion/MR PWI. In this work, three faster methods are proposed. The first is a direct (model-based) crude approximation to the final perfusion quantities (blood flow, blood volume, mean transit time and delay) using the Welch-Satterthwaite approximation for gamma-fitted concentration-time curves (CTC). The second is a fast, accurate deconvolution method we call Analytical Fourier Filtering (AFF). The third is another fast, accurate deconvolution technique based on Showalter's method, which we call Analytical Showalter's Spectral Filtering (ASSF). Through systematic evaluation on phantom and clinical data, the proposed methods are shown to be computationally more than twice as fast as FDD. The two deconvolution-based methods, AFF and ASSF, are also shown to be quantitatively accurate compared to FDD and oSVD.
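The FDD baseline these methods build on divides the tissue curve by the arterial input function (AIF) in Fourier space and regularizes the noise-amplified high frequencies. The sketch below uses a crude hard cutoff in place of the paper's analytical filters; curves and parameters are synthetic.

```python
import numpy as np

def fourier_deconvolve(c_tissue, aif, dt, cutoff=0.15):
    """Frequency-domain deconvolution of the tissue residue function.

    Divides the tissue curve by the AIF in Fourier space and suppresses
    frequencies where the AIF spectrum is weak (a crude stand-in for the
    analytical filters referenced above).
    """
    n = len(c_tissue)
    C, A = np.fft.rfft(c_tissue), np.fft.rfft(aif)
    H = C / A
    H[np.abs(A) < cutoff * np.abs(A).max()] = 0.0  # crude regularization
    irf = np.fft.irfft(H, n) / dt
    return irf, irf.max()  # flow estimate = peak of the residue function

# Synthetic gamma-variate AIF convolved with an exponential residue function:
dt, t = 1.0, np.arange(60.0)
aif = (t**3) * np.exp(-t / 1.5)
irf_true = np.exp(-t / 4.0)
c_tissue = np.convolve(aif, irf_true)[:60] * dt

irf, cbf = fourier_deconvolve(c_tissue, aif, dt)
print("recovered peak (true value 1.0): %.2f" % cbf)
```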
Morales, Daniel R.; Pacurariu, Alexandra; Kurz, Xavier
2017-01-01
Aims Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. Methods We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. Results From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Conclusion Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. PMID:29105853
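The most common analytical approach reported, interrupted time series regression, can be sketched as a segmented linear model with level- and slope-change terms at the intervention date. The data below are simulated, not drawn from any reviewed study.

```python
import numpy as np

# Simulated monthly prescribing rates with an intervention at month 24.
rng = np.random.default_rng(1)
t = np.arange(48)
d = (t >= 24).astype(float)  # post-intervention indicator
y = 100 + 0.5 * t - 8 * d - 0.6 * d * (t - 24) + rng.normal(0, 2, 48)

# Design matrix: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones_like(t, dtype=float), t, d, d * (t - 24)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("level change: %.1f, slope change: %.2f" % (beta[2], beta[3]))
```

A full analysis would additionally model autocorrelation and seasonality, which the plain least-squares fit above ignores.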
Systematic effects on dark energy from 3D weak shear
NASA Astrophysics Data System (ADS)
Kitching, T. D.; Taylor, A. N.; Heavens, A. F.
2008-09-01
We present an investigation into the potential effect of systematics inherent in multiband wide-field surveys on the dark energy equation-of-state determination for two 3D weak lensing methods. The weak lensing methods are a geometric shear-ratio method and 3D cosmic shear. The analysis here uses an extension of the Fisher matrix framework to include jointly photometric redshift systematics, shear distortion systematics and intrinsic alignments. Using analytic parametrizations of these three primary systematic effects allows an isolation of systematic parameters of particular importance. We show that assuming systematic parameters are fixed, but possibly biased, results in potentially large biases in dark energy parameters. We quantify any potential bias by defining a Bias Figure of Merit. By marginalizing over extra systematic parameters, such biases are negated at the expense of an increase in the cosmological parameter errors. We show the effect on the dark energy Figure of Merit of marginalizing over each systematic parameter individually. We also show the overall reduction in the Figure of Merit due to all three types of systematic effects. Based on some assumption of the likely level of systematic errors, we find that the largest effect on the Figure of Merit comes from uncertainty in the photometric redshift systematic parameters. These can reduce the Figure of Merit by up to a factor of 2 to 4 in both 3D weak lensing methods, if no informative prior on the systematic parameters is applied. Shear distortion systematics have a smaller overall effect. Intrinsic alignment effects can reduce the Figure of Merit by up to a further factor of 2. This, however, is a worst-case scenario, within the assumptions of the parametrizations used. By including prior information on systematic parameters, the Figure of Merit can be recovered to a large extent, and combined constraints from 3D cosmic shear and shear ratio are robust to systematics. We conclude that, as a rule of thumb, given a realistic current understanding of intrinsic alignments and photometric redshifts, then including all three primary systematic effects reduces the Figure of Merit by at most a factor of 2.
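The Fisher-matrix bias formalism used in studies of this kind has a compact generic form: for a fixed (unmarginalized) systematic, the parameter bias is delta_theta = F^-1 B, where B projects the systematic residual onto the parameter derivatives. The sketch below is a toy illustration of that algebra and of a DETF-style Figure of Merit; the numbers are invented and do not reproduce the paper's results.

```python
import numpy as np

# Toy Fisher matrix for the dark energy parameters (w0, wa) and a vector B
# encoding a fixed systematic projected onto the parameter derivatives.
F = np.array([[40.0, 12.0],
              [12.0, 6.0]])
B = np.array([1.5, 0.4])

bias = np.linalg.solve(F, B)      # parameter biases, delta_theta = F^-1 B
cov = np.linalg.inv(F)
errors = np.sqrt(np.diag(cov))    # marginalized 1-sigma errors

# DETF-style Figure of Merit ~ 1/sqrt(det cov); biases comparable to the
# marginalized errors signal a significant systematic effect.
fom = 1.0 / np.sqrt(np.linalg.det(cov))
print("bias:", bias.round(3), "errors:", errors.round(3), "FoM: %.1f" % fom)
```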
Yadav, Nand K; Raghuvanshi, Ashish; Sharma, Gajanand; Beg, Sarwar; Katare, Om P; Nanda, Sanju
2016-03-01
The current studies entail systematic quality by design (QbD)-based development of a simple, precise, cost-effective and stability-indicating high-performance liquid chromatography method for the estimation of ketoprofen. The analytical target profile was defined and critical analytical attributes (CAAs) were selected. Chromatographic separation was accomplished with isocratic, reversed-phase chromatography using a C-18 column, pH 6.8 phosphate buffer-methanol (50 : 50 v/v) as the mobile phase at a flow rate of 1.0 mL/min, and UV detection at 258 nm. Systematic optimization of the chromatographic method was performed using a central composite design, evaluating theoretical plates and peak tailing as the CAAs. The method was validated as per International Conference on Harmonization guidelines, demonstrating high sensitivity and specificity, linearity between 0.05 and 250 µg/mL, a detection limit of 0.025 µg/mL and a quantification limit of 0.05 µg/mL. Precision was demonstrated by a relative standard deviation of 1.21%. Stress degradation studies performed using acid, base, peroxide, thermal and photolytic methods helped in identifying the degradation products in the proniosome delivery systems. The results successfully demonstrate the utility of QbD for optimizing the chromatographic conditions and developing a highly sensitive liquid chromatographic method for ketoprofen. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
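Detection and quantification limits of the kind reported above are conventionally derived from calibration data via the ICH Q2(R1) formulas LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with sigma the residual standard deviation and S the slope. The sketch below applies them to invented calibration points, so the resulting limits will not match the paper's values.

```python
import numpy as np

# Hypothetical calibration data: concentration (micrograms/mL) vs peak area.
conc = np.array([0.05, 0.5, 5.0, 50.0, 150.0, 250.0])
area = np.array([1.1, 10.6, 104.0, 1036.0, 3110.0, 5180.0])

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)  # two fitted parameters

print("LOD = %.3f ug/mL, LOQ = %.3f ug/mL" % (3.3 * sigma / slope,
                                              10 * sigma / slope))
```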
Spencer-Bonilla, Gabriela; Singh Ospina, Naykky; Rodriguez-Gutierrez, Rene; Brito, Juan P; Iñiguez-Ariza, Nicole; Tamhane, Shrikant; Erwin, Patricia J; Murad, M Hassan; Montori, Victor M
2017-07-01
Systematic reviews provide clinicians and policymakers estimates of diagnostic test accuracy and their usefulness in clinical practice. We identified all available systematic reviews of diagnosis in endocrinology, summarized the diagnostic accuracy of the tests included, and assessed the credibility and clinical usefulness of the methods and reporting. We searched Ovid MEDLINE, EMBASE, and Cochrane CENTRAL from inception to December 2015 for systematic reviews and meta-analyses reporting accuracy measures of diagnostic tests in endocrinology. Experienced reviewers independently screened for eligible studies and collected data. We summarized the results, methods, and reporting of the reviews. We performed subgroup analyses to categorize diagnostic tests as most useful based on their accuracy. We identified 84 systematic reviews; half of the tests included were classified as helpful when positive, one-fourth as helpful when negative. Most authors adequately reported how studies were identified and selected and how their trustworthiness (risk of bias) was judged. Only one in three reviews, however, reported an overall judgment about trustworthiness, and one in five reported using adequate meta-analytic methods. One in four reported contacting authors for further information, and about half included only patients with diagnostic uncertainty. Up to half of the diagnostic endocrine tests for which a likelihood ratio was calculated or provided are likely to be helpful in practice when positive, as are one-quarter when negative. Most diagnostic systematic reviews in endocrinology lack methodological rigor and protection against bias, and offer limited credibility. Substantial efforts therefore seem necessary to improve the quality of diagnostic systematic reviews in endocrinology.
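Classifications such as "helpful when positive/negative" are usually based on likelihood ratios. A minimal sketch using the common rule-of-thumb thresholds (LR+ >= 10, LR- <= 0.1), with hypothetical sensitivity and specificity; the review's exact cutoffs may differ.

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Hypothetical endocrine test.
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.92, specificity=0.95)
print("LR+ = %.1f, LR- = %.2f" % (lr_pos, lr_neg))  # -> 18.4, 0.08
```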
Behavioral Inhibition and Risk for Developing Social Anxiety Disorder: A Meta-Analytic Study
ERIC Educational Resources Information Center
Clauss, Jacqueline A.; Blackford, Jennifer Urbano
2012-01-01
Objective: Behavioral inhibition (BI) has been associated with increased risk for developing social anxiety disorder (SAD); however, the degree of risk associated with BI has yet to be systematically examined and quantified. The goal of the present study was to quantify the association between childhood BI and risk for developing SAD. Method: A…
Goldberg, Tony L; Gillespie, Thomas R; Singer, Randall S
2006-09-01
Repetitive-element PCR (rep-PCR) is a method for genotyping bacteria based on the selective amplification of repetitive genetic elements dispersed throughout bacterial chromosomes. The method has great potential for large-scale epidemiological studies because of its speed and simplicity; however, objective guidelines for inferring relationships among bacterial isolates from rep-PCR data are lacking. We used multilocus sequence typing (MLST) as a "gold standard" to optimize the analytical parameters for inferring relationships among Escherichia coli isolates from rep-PCR data. We chose 12 isolates from a large database to represent a wide range of pairwise genetic distances, based on the initial evaluation of their rep-PCR fingerprints. We conducted MLST with these same isolates and systematically varied the analytical parameters to maximize the correspondence between the relationships inferred from rep-PCR and those inferred from MLST. Methods that compared the shapes of densitometric profiles ("curve-based" methods) yielded consistently higher correspondence values between data types than did methods that calculated indices of similarity based on shared and different bands (maximum correspondences of 84.5% and 80.3%, respectively). Curve-based methods were also markedly more robust in accommodating variations in user-specified analytical parameter values than were "band-sharing coefficient" methods, and they enhanced the reproducibility of rep-PCR. Phylogenetic analyses of rep-PCR data yielded trees with high topological correspondence to trees based on MLST and high statistical support for major clades. These results indicate that rep-PCR yields accurate information for inferring relationships among E. coli isolates and that accuracy can be enhanced with the use of analytical methods that consider the shapes of densitometric profiles.
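The two families of similarity measures compared above can be sketched directly: band-sharing coefficients (e.g., Dice) operate on called band positions, while curve-based measures correlate whole densitometric profiles and thus avoid band-calling decisions. The profiles and band calls below are synthetic, not from the study.

```python
import numpy as np

def dice_coefficient(bands_a, bands_b):
    """Band-sharing similarity from sets of called band positions (bp)."""
    shared = len(bands_a & bands_b)
    return 2 * shared / (len(bands_a) + len(bands_b))

def curve_similarity(profile_a, profile_b):
    """Curve-based similarity: Pearson correlation of whole densitometric
    profiles, requiring no band-calling decisions."""
    return np.corrcoef(profile_a, profile_b)[0, 1]

# Synthetic band calls for two lanes; one band shifted, one mis-sized.
bands_1 = {120, 240, 310, 560, 800}
bands_2 = {120, 250, 310, 560, 790}
print("Dice: %.2f" % dice_coefficient(bands_1, bands_2))

# Synthetic densitometric profiles: two Gaussian peaks per lane.
x = np.linspace(0, 1, 200)
lane_1 = np.exp(-((x - 0.30) ** 2) / 0.002) + np.exp(-((x - 0.7) ** 2) / 0.002)
lane_2 = np.exp(-((x - 0.31) ** 2) / 0.002) + np.exp(-((x - 0.7) ** 2) / 0.002)
print("Pearson: %.2f" % curve_similarity(lane_1, lane_2))
```

The small band shift penalizes the Dice score heavily while barely affecting the profile correlation, which mirrors the robustness of curve-based methods reported in the abstract.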
Ryan, Patrick B.; Schuemie, Martijn
2013-01-01
Background: Clinical studies that use observational databases, such as administrative claims and electronic health records, to evaluate the effects of medical products have become commonplace. These studies begin by selecting a particular study design, such as a case control, cohort, or self-controlled design, and different authors can and do choose different designs for the same clinical question. Furthermore, published papers invariably report the study design but do not discuss the rationale for the specific choice. Studies of the same clinical question with different designs, however, can generate different results, sometimes with strikingly different implications. Even within a specific study design, authors make many different analytic choices and these too can profoundly impact results. In this paper, we systematically study heterogeneity due to the type of study design and due to analytic choices within study design. Methods and findings: We conducted our analysis in 10 observational healthcare databases but mostly present our results in the context of the GE Centricity EMR database, an electronic health record database containing data for 11.2 million lives. We considered the impact of three different study design choices on estimates of associations between bisphosphonates and four particular health outcomes for which there is no evidence of an association. We show that applying alternative study designs can yield discrepant results, in terms of direction and significance of association. We also highlight that while traditional univariate sensitivity analysis may not show substantial variation, systematic assessment of all analytical choices within a study design can yield inconsistent results ranging from statistically significant decreased risk to statistically significant increased risk. Our findings show that clinical studies using observational databases can be sensitive both to study design choices and to specific analytic choices within study design. Conclusion: More attention is needed to consider how design choices may be impacting results and, when possible, investigators should examine a wide array of possible choices to confirm that significant findings are consistently identified. PMID:25083251
Researching Mental Health Disorders in the Era of Social Media: Systematic Review
Vadillo, Miguel A; Curcin, Vasa
2017-01-01
Background Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. Objective The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. Methods We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. Results The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Conclusions Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorder is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques. PMID:28663166
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Wong, William CW; Cheung, Catherine SK; Hart, Graham J
2008-01-01
Background Systematic reviews based on the critical appraisal of observational and analytic studies on HIV prevalence and risk factors for HIV transmission among men who have sex with men are very useful for health care decisions and planning. Such appraisal is particularly difficult, however, as the quality assessment tools available for use with observational and analytic studies are poorly established. Methods We reviewed the existing quality assessment tools for systematic reviews of observational studies and developed a concise quality assessment checklist to help standardise decisions regarding the quality of studies, with careful consideration of issues such as external and internal validity. Results A pilot version of the checklist was developed based on epidemiological principles, reviews of study designs, and existing checklists for the assessment of observational studies. The Quality Assessment Tool for Systematic Reviews of Observational Studies (QATSO) Score consists of five items: external validity (1 item), reporting (2 items), bias (1 item) and confounding factors (1 item). Expert opinions were sought and it was tested on manuscripts that fulfilled the inclusion criteria of a systematic review. Like all assessment scales, QATSO may oversimplify and generalise information, yet it is inclusive, simple and practical to use, and allows comparability between papers. Conclusion A specific tool that allows researchers to appraise and guide study quality of observational studies is developed and can be modified for similar studies in the future. PMID:19014686
Giardiasis as a neglected disease in Brazil: Systematic review of 20 years of publications
Durigan, Maurício; Leal, Diego Averaldo Guiguet; Schneider, Adriano de Bernardi; Franco, Regina Maura Bueno; Singer, Steven M.
2017-01-01
Introduction Giardiasis is an intestinal infection that affects more than two hundred million people annually worldwide; it is caused by the flagellated protozoan Giardia duodenalis. In tropical countries and in low- or middle-income settings, like Brazil, its prevalence can be high. There is currently no systematic review on the presence of G. duodenalis in patients, animals or water sources in Brazil. Methods This systematic review was performed according to recommendations established by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). As databases for our searches, we used PubMed, Embase, Scopus and the Brazilian database SciELO, with the keywords Giardia* and Brazil. Results This systematic review identified research studies related to G. duodenalis in water, giardiasis in animals, prevalence of giardiasis across Brazilian regions, genotyping of strains isolated in humans, and giardiasis in indigenous populations. We also propose a network of G. duodenalis transmission in Brazil based on genotype analyses. Conclusion This is the first review published within the last twenty years on the occurrence of G. duodenalis in Brazil, addressing relevant issues such as prevalence, molecular epidemiology and analytical methods for parasite detection. PMID:29065126
Schmidt, Katharina; Aumann, Ines; Hollander, Ines; Damm, Kathrin; von der Schulenburg, J-Matthias Graf
2015-12-24
The Analytic Hierarchy Process (AHP), developed by Saaty in the late 1970s, is one of the methods for multi-criteria decision making. The AHP disaggregates a complex decision problem into different hierarchical levels. The weights for criteria and alternatives are judged in pairwise comparisons, and priorities are calculated by the eigenvector method. The slowly increasing application of the AHP was the motivation for this study to explore the current state of its methodology in the healthcare context. A systematic literature review was conducted by searching the Pubmed and Web of Science databases for articles with the following keywords in their titles or abstracts: "Analytic Hierarchy Process," "Analytical Hierarchy Process," "multi-criteria decision analysis," "multiple criteria decision," "stated preference," and "pairwise comparison." In addition, we developed reporting criteria to indicate whether the authors reported important aspects and evaluated the resulting studies' reporting. The systematic review resulted in 121 articles. The number of studies applying the AHP has increased since 2005. Most studies were from Asia (almost 30%), followed by the US (25.6%). On average, the studies used 19.64 criteria throughout their hierarchical levels. Furthermore, we restricted a detailed analysis to those articles published within the last 5 years (n = 69). The mean number of participants in these studies was 109, and we identified major differences in how the surveys were conducted. The evaluation of reporting showed that the mean number of reported elements was about 6.75 out of 10. Moreover, 12 out of 69 studies reported less than half of the criteria. The AHP has been applied inconsistently in healthcare research. Only a minority of studies described all the relevant aspects. Thus, the statements in this review may be biased, as they are restricted to the information available in the papers. Hence, further research is required to discover who should be interviewed and how, how inconsistent answers should be dealt with, and how the outcome and stability of the results should be presented. In addition, we need new insights to determine which target group can best handle the challenges of the AHP.
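As an illustration of the pairwise-comparison and eigenvector machinery the abstract describes, the following is a minimal Python sketch; the 3x3 comparison matrix and the excerpt of Saaty's random-index table are invented for demonstration and are not taken from the reviewed studies.

```python
# Hedged sketch: AHP priority weights from a pairwise comparison matrix via
# the principal eigenvector, plus Saaty's consistency ratio. Matrix invented.
import numpy as np

def ahp_priorities(A):
    """Return priority weights and consistency ratio for pairwise matrix A."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalised priorities
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index (excerpt)
    return w, ci / ri                           # weights, consistency ratio

A = np.array([[1, 3, 5],
              [1/3, 1, 2],
              [1/5, 1/2, 1]])                   # hypothetical judgements
w, cr = ahp_priorities(A)
print(w, cr)   # CR < 0.1 is conventionally taken as acceptably consistent
```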
A Systematic Review on Using Literature for the Young Learners in an EFL Classroom
ERIC Educational Resources Information Center
Al-hajji, Badria A.; Shuqair, Khaled M.
2014-01-01
This study has objectives that are exploratory and analytical in nature. It focuses on the use of relevant information with regard to the use of literature in EFL classrooms that is available for an analysis in order to draw conclusions and make useful recommendations. The study is, therefore, conducted as library research using the method of…
Analytical solutions of the two-dimensional Dirac equation for a topological channel intersection
NASA Astrophysics Data System (ADS)
Anglin, J. R.; Schulz, A.
2017-01-01
Numerical simulations in a tight-binding model have shown that an intersection of topologically protected one-dimensional chiral channels can function as a beam splitter for noninteracting fermions on a two-dimensional lattice [Qiao, Jung, and MacDonald, Nano Lett. 11, 3453 (2011), 10.1021/nl201941f; Qiao et al., Phys. Rev. Lett. 112, 206601 (2014), 10.1103/PhysRevLett.112.206601]. Here we confirm this result analytically in the corresponding continuum k·p model, by solving the associated two-dimensional Dirac equation in the presence of a "checkerboard" potential that provides a right-angled intersection between two zero-line modes. The method by which we obtain our analytical solutions is systematic and potentially generalizable to similar problems involving intersections of one-dimensional systems.
Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H
2018-01-30
Metabolomics is an emerging science based on diverse high throughput methods that are rapidly evolving to improve metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact of such a strategy on metabolic coverage. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and evaluate the impact of combining methods for more exhaustive metabolite coverage. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound, i.e., analytical (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we reported the common and distinct metabolites identified based on the number and identity of the compounds detected with low analytical variability (variation coefficient < 30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvents for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. This combination also showed that the contribution of GC-MS was low when used in combination with other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods. However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
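The low-variability filter mentioned above (variation coefficient < 30%) amounts to a simple computation; a minimal sketch with invented replicate values:

```python
# Minimal sketch of the variability filter described above: keep only the
# metabolites whose coefficient of variation across replicate measurements
# is below 30%. Peak areas and metabolite names are invented.
import numpy as np

replicates = {                       # hypothetical peak areas per metabolite
    "alanine":   [1020, 980, 1065, 1010],
    "succinate": [310, 540, 180, 420],
}

def cv_percent(values):
    values = np.asarray(values, dtype=float)
    return 100 * values.std(ddof=1) / values.mean()

retained = {m: v for m, v in replicates.items() if cv_percent(v) < 30}
print(sorted(retained))              # ['alanine']; succinate exceeds the cut-off
```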
Feigenbaum, A; Scholler, D; Bouquant, J; Brigot, G; Ferrier, D; Franzl, R; Lillemarktt, L; Riquet, A M; Petersen, J H; van Lierop, B; Yagoubi, N
2002-02-01
The results of a research project (EU AIR Research Programme CT94-1025) aimed at introducing control of migration into good manufacturing practice and into enforcement work are reported. Representative polymer classes were defined on the basis of chemical structure, technological function, migration behaviour and market share. These classes were characterized by analytical methods. Analytical techniques were investigated for the identification of potential migrants. High-temperature gas chromatography was shown to be a powerful method, and 1H-magnetic resonance provided a convenient fingerprint of plastic materials. Volatile compounds were characterized by headspace techniques, where it was shown to be essential to differentiate volatile compounds desorbed from those generated during the thermal desorption itself. For metal trace analysis, microwave mineralization followed by atomic absorption was employed. These different techniques were introduced into a systematic testing scheme that is envisaged as being suitable both for industrial control and for enforcement laboratories. Guidelines will be proposed in the second part of this paper.
Choosing and Using Introns in Molecular Phylogenetics
Creer, Simon
2007-01-01
Introns are now commonly used in molecular phylogenetics in an attempt to recover gene trees that are concordant with species trees, but there are a range of genomic, logistical and analytical considerations that are infrequently discussed in empirical studies that utilize intron data. This review outlines expedient approaches for locus selection, overcoming paralogy problems, recombination detection methods and the identification and incorporation of LVHs in molecular systematics. A range of parsimony and Bayesian analytical approaches are also described in order to highlight the methods that can currently be employed to align sequences and treat indels in subsequent analyses. By covering the main points associated with the generation and analysis of intron data, this review aims to provide a comprehensive introduction to using introns (or any non-coding nuclear data partition) in contemporary phylogenetics. PMID:19461984
Recursive linearization of multibody dynamics equations of motion
NASA Technical Reports Server (NTRS)
Lin, Tsung-Chieh; Yae, K. Harold
1989-01-01
The equations of motion of a multibody system are nonlinear in nature, and thus pose a difficult problem in linear control design. One approach is to obtain a first-order approximation through numerical perturbations at a given configuration, and to design a control law based on the linearized model. Here, a linearized model is generated analytically by following the recursive derivation of the equations of motion. The equations of motion are first written in a Newton-Euler form, which is systematic and easy to construct; then, they are transformed into a relative coordinate representation, which is more efficient in computation. A new computational method for linearization is obtained by applying a series of first-order analytical approximations to the recursive kinematic relationships. The method has proved to be computationally more efficient because of its recursive nature. It has also turned out to be more accurate because analytical perturbation circumvents numerical differentiation and other associated numerical operations that may accumulate computational error, requiring only analytical operations on matrices and vectors. The power of the proposed linearization algorithm is demonstrated, in comparison to a numerical perturbation method, with a two-link manipulator and a seven-degree-of-freedom robotic manipulator. Its application to control design is also demonstrated.
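To make the contrast between analytical and numerical-perturbation linearization concrete, here is a hedged sketch on a far simpler system (a single pendulum) than the manipulators above; it is not the authors' recursive multibody algorithm:

```python
# Hedged illustration (a single pendulum, not the authors' recursive
# multibody algorithm): linearize dx/dt = f(x) analytically with sympy and
# compare against a numerical finite-difference perturbation.
import numpy as np
import sympy as sp

theta, omega = sp.symbols("theta omega")
g, L = 9.81, 1.0
f = sp.Matrix([omega, -(g / L) * sp.sin(theta)])   # pendulum dynamics
x = sp.Matrix([theta, omega])

A_analytic = f.jacobian(x).subs({theta: 0.5, omega: 0.0})  # exact Jacobian

def f_num(v):
    return np.array([v[1], -(g / L) * np.sin(v[0])])

def jacobian_fd(func, v, h=1e-6):                  # numerical perturbation
    n = len(v)
    J = np.zeros((n, n))
    for j in range(n):
        dv = np.zeros(n); dv[j] = h
        J[:, j] = (func(v + dv) - func(v - dv)) / (2 * h)
    return J

print(np.array(A_analytic, dtype=float))
print(jacobian_fd(f_num, np.array([0.5, 0.0])))    # agrees to ~1e-9
```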
Ho, Robin S T; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel Y S; Chung, Vincent C H
2015-01-08
Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. To assess the methodological quality of MAs on COPD treatments. A cross-sectional study on MAs of COPD trials. MAs published during 2000-2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods.
Pinto, Eduardo Costa; Dolzan, Maressa Danielli; Cabral, Lucio Mendes; Armstrong, Daniel W; de Sousa, Valéria Pereira
2016-02-01
An important step during the development of high-performance liquid chromatography (HPLC) methods for quantitative analysis of drugs is choosing the appropriate detector. High sensitivity, reproducibility, stability, wide linear range, compatibility with gradient elution, non-destructive detection of the analyte and response unaffected by changes in the temperature/flow are some of the ideal characteristics of a universal HPLC detector. Topiramate is an anticonvulsant drug mainly used for the treatment of different types of seizures and prophylactic treatment of migraine. Different analytical approaches to quantify topiramate by HPLC have been described because of the lack of chromophoric moieties on its structure, such as derivatization with fluorescent moieties and UV-absorbing moieties, conductivity detection, evaporative light scattering detection, refractive index detection, chemiluminescent nitrogen detection and MS detection. Some methods for the determination of topiramate by capillary electrophoresis and gas chromatography have also been published. This systematic review provides a description of the main analytical methods presented in the literature to analyze topiramate in the drug substance and in pharmaceutical formulations. Each of these methods is briefly discussed, especially considering the detector used with HPLC. In addition, this article presents a review of the data available regarding topiramate stability, degradation products and impurities. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Tracking Matrix Effects in the Analysis of DNA Adducts of Polycyclic Aromatic Hydrocarbons
Klaene, Joshua J.; Flarakos, Caroline; Glick, James; Barret, Jennifer T.; Zarbl, Helmut; Vouros, Paul
2015-01-01
LC-MS using electrospray ionization is currently the method of choice in bio-organic analysis, covering a wide range of applications in a broad spectrum of biological media. The technique is noted for its high sensitivity, but one major limitation which hinders achievement of its optimal sensitivity is signal suppression due to matrix interferences introduced by the presence of co-extracted compounds during the sample preparation procedure. The analysis of DNA adducts of common environmental carcinogens is particularly sensitive to such matrix effects, as sample preparation is a multistep process which involves “contamination” of the sample due to the addition of enzymes and other reagents for digestion of the DNA in order to isolate the analyte(s). This problem is further exacerbated by the need to reach low levels of quantitation (LOQ at the ppb level) while also working with limited (2-5 μg) quantities of sample. We report here on the systematic investigation of ion signal suppression contributed by each individual step involved in the sample preparation associated with the analysis of DNA adducts of polycyclic aromatic hydrocarbons (PAH), using as model analyte dG-BaP, the deoxyguanosine adduct of benzo[a]pyrene (BaP). The individual matrix contribution of each one of these sources to analyte signal was systematically addressed, as were any interactive effects. The information was used to develop a validated analytical protocol for the target biomarker at levels typically encountered in vivo using as little as 2 μg of DNA, and applied to a dose response study using a metabolically competent cell line. PMID:26607319
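The per-step matrix-effect bookkeeping described above reduces to comparing the analyte signal in a post-step extract against neat solvent; a schematic sketch with invented peak areas:

```python
# Schematic matrix-effect calculation of the kind used in such studies:
# percent of neat-solvent signal remaining after a preparation step.
# All numbers are invented.
def matrix_effect_percent(signal_in_matrix, signal_in_solvent):
    """>100% = enhancement, <100% = suppression."""
    return 100 * signal_in_matrix / signal_in_solvent

steps = {"DNA digestion": 84_500, "SPE clean-up": 97_200}  # hypothetical areas
neat = 112_000
for step, area in steps.items():
    print(f"{step}: {matrix_effect_percent(area, neat):.1f}% of neat signal")
```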
INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING
Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong
2017-01-01
Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected are beyond simulation analysis alone; simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
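As one hedged illustration of the "extract the parameters that most affect performance" step, a random-forest importance ranking on synthetic data is shown below; the paper does not prescribe this particular estimator:

```python
# Hedged sketch: rank candidate process parameters by their influence on a
# performance metric using random-forest feature importances. The data and
# parameter names are synthetic, not from the paper's case study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))                 # e.g. speed, feed, temp, load
y = 3 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["speed", "feed", "temperature", "load"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")                # 'speed' should dominate
```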
NASA Astrophysics Data System (ADS)
Luis, Josep M.; Duran, Miquel; Andrés, José L.
1997-08-01
An analytic method to evaluate nuclear contributions to the electrical properties of polyatomic molecules is presented. Such contributions control the changes induced by an electric field on the equilibrium geometry (nuclear relaxation contribution) and vibrational motion (vibrational contribution) of a molecular system. Expressions to compute the nuclear contributions have been derived from a power series expansion of the potential energy. These contributions to the electrical properties are given in terms of energy derivatives with respect to normal coordinates, electric field intensity, or both. Only one calculation of such derivatives at the field-free equilibrium geometry is required. To demonstrate the efficiency of the analytical evaluation of electrical properties (the so-called AEEP method), results for calculations on water and pyridine at the SCF/TZ2P and MP2/TZ2P levels of theory are reported. The results obtained are compared with previous theoretical calculations and with experimental values.
NASA Astrophysics Data System (ADS)
Díaz, M.; Hirsch, M.; Porod, W.; Romão, J.; Valle, J.
2003-07-01
We give an analytical calculation of solar neutrino masses and mixing at one-loop order within bilinear R-parity breaking supersymmetry, and compare our results to the exact numerical calculation. Our method is based on a systematic perturbative expansion of R-parity violating vertices to leading order. We find in general quite good agreement between the approximate and full numerical calculations, but the approximate expressions are much simpler to implement. Our formalism works especially well for the case of the large mixing angle Mikheyev-Smirnov-Wolfenstein solution, now strongly favored by the recent KamLAND reactor neutrino data.
A direct method for nonlinear ill-posed problems
NASA Astrophysics Data System (ADS)
Lakhal, A.
2018-02-01
We propose a direct method for solving nonlinear ill-posed problems in Banach spaces. The method is based on a stable inversion formula that we compute explicitly by applying techniques for analytic functions. Furthermore, we investigate the convergence and stability of the method and prove that the derived noniterative algorithm is a regularization. The inversion formula provides a systematic sensitivity analysis. The approach is applicable to a wide range of nonlinear ill-posed problems. We test the algorithm on a nonlinear problem of travel-time inversion in seismic tomography. Numerical results illustrate the robustness and efficiency of the algorithm.
Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A
2011-07-01
Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
Bolann, B J; Asberg, A
2004-01-01
The deviation of test results from patients' homeostatic set points in steady-state conditions may complicate interpretation of the results and the comparison of results with clinical decision limits. In this study the total deviation from the homeostatic set point is defined as the maximum absolute deviation for 95% of measurements, and we present analytical quality requirements that prevent analytical error from increasing this deviation to more than about 12% above the value caused by biology alone. These quality requirements are: 1) the stable systematic error should be approximately 0, and 2) a systematic error that will be detected by the control program with 90% probability should not be larger than half the value of the combined analytical and intra-individual standard deviation. As a result, when the most common control rules are used, the analytical standard deviation may be up to 0.15 times the intra-individual standard deviation. Analytical improvements beyond these requirements have little impact on the interpretability of measurement results.
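Illustrative arithmetic for the two requirements, under the simplifying assumption that analytical and intra-individual variation add in quadrature; the numbers are invented:

```python
# Hedged worked example of the quality requirements stated above, assuming
# analytical and intra-individual SDs add in quadrature. Values invented.
import math

sigma_i = 4.0                      # intra-individual SD (hypothetical units)
sigma_a = 0.15 * sigma_i           # maximum analytical SD per the abstract
sigma_combined = math.hypot(sigma_a, sigma_i)

max_undetected_bias = 0.5 * sigma_combined   # requirement 2
print(f"combined SD: {sigma_combined:.3f}")  # only ~1.1% above sigma_i
print(f"bias the QC rules must catch with 90% power: {max_undetected_bias:.3f}")
```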
Alladio, Eugenio; Biosa, Giulia; Seganti, Fabrizio; Di Corcia, Daniele; Salomone, Alberto; Vincenti, Marco; Baumgartner, Markus R
2018-05-11
The quantitative determination of ethyl glucuronide (EtG) in hair samples is consistently used throughout the world to assess chronic excessive alcohol consumption. For administrative and legal purposes, the analytical results are compared with cut-off values recognized by regulatory authorities and scientific societies. However, it has been recently recognized that the analytical results depend on the hair sample pretreatment procedures, including the crumbling and extraction conditions. A systematic evaluation of the EtG extraction conditions from pulverized scalp hair was conducted by design of experiments (DoE), considering the extraction time, temperature, pH, and solvent composition as potential influencing factors. It was concluded that an overnight extraction at 60°C with pure water at neutral pH represents the most effective conditions to achieve high extraction yields. The absence of differential degradation of the internal standard (isotopically-labeled EtG) under such conditions was confirmed and the overall analytical method was validated according to SWGTOX and ISO 17025 criteria. Twenty real hair samples with different EtG content were analyzed with three commonly accepted procedures: (a) hair manually cut into snippets and extracted at room temperature; (b) pulverized hair extracted at room temperature; (c) hair treated with the optimized method. Average increments of EtG concentration of around 69% (from a to c) and 29% (from b to c) were recorded. In light of these results, the authors urge the scientific community to undertake an inter-laboratory study with the aim of defining the optimal hair EtG detection method in more detail and verifying the corresponding cut-off level for legal enforcement. This article is protected by copyright. All rights reserved.
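The DoE enumeration described above can be sketched as a full factorial design; the factor names follow the abstract, but the levels are invented:

```python
# Minimal full-factorial design sketch in the spirit of the DoE described
# above; factor names match the abstract, levels are invented.
from itertools import product

factors = {
    "time_h": [2, 16],            # short vs overnight extraction
    "temp_C": [25, 60],
    "pH": [5, 7, 9],
    "solvent": ["water", "water/methanol"],
}
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))                  # 2*2*3*2 = 24 experimental runs
print(runs[0])
```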
Faller, Maximilian; Wilhelmsson, Peter; Kjelland, Vivian; Andreassen, Åshild; Dargis, Rimtas; Quarsten, Hanne; Dessau, Ram; Fingerle, Volker; Margos, Gabriele; Noraas, Sølvi; Ornstein, Katharina; Petersson, Ann-Cathrine; Matussek, Andreas; Lindgren, Per-Eric; Henningsson, Anna J.
2017-01-01
Introduction Lyme borreliosis (LB) is the most common tick-transmitted disease in Europe. The diagnosis of LB today is based on the patient's medical history, clinical presentation and laboratory findings. The laboratory diagnostics are mainly based on antibody detection, but in certain conditions molecular detection by polymerase chain reaction (PCR) may serve as a complement. Aim The purpose of this study was to evaluate the analytical sensitivity, analytical specificity and concordance of eight different real-time PCR methods at five laboratories in Sweden, Norway and Denmark. Method Each participating laboratory was asked to analyse three different sets of samples (reference panels; all blinded): i) cDNA extracted and transcribed from water spiked with cultured Borrelia strains, ii) cerebrospinal fluid spiked with cultured Borrelia strains, and iii) DNA dilution series extracted from cultured Borrelia and relapsing fever strains. The results and the method descriptions of each laboratory were systematically evaluated. Results and conclusions The analytical sensitivities and the concordance between the eight protocols were in general high. The concordance was especially high between the protocols using 16S rRNA as the target gene; however, this concordance was mainly related to cDNA as the type of template. When comparing cDNA and DNA as the type of template, the analytical sensitivity was in general higher for the protocols using DNA as template, regardless of the target gene used. The analytical specificity of all eight protocols was high. However, some protocols were not able to detect Borrelia spielmanii, Borrelia lusitaniae or Borrelia japonica. PMID:28937997
Yang, Tzuhsiung; Berry, John F
2018-06-04
The computation of nuclear second derivatives of energy, or the nuclear Hessian, is an essential routine in quantum chemical investigations of ground and transition states, thermodynamic calculations, and molecular vibrations. Analytic nuclear Hessian computations require the solution of costly coupled-perturbed self-consistent field (CP-SCF) equations, while numerical differentiation of analytic first derivatives has an unfavorable 6N (N = number of atoms) prefactor. Herein, we present a new method in which grid computing is used to accelerate and/or enable the evaluation of the nuclear Hessian via numerical differentiation: NUMFREQ@Grid. Nuclear Hessians were successfully evaluated by NUMFREQ@Grid at the DFT level as well as using RIJCOSX-ZORA-MP2 or RIJCOSX-ZORA-B2PLYP for a set of linear polyacenes with systematically increasing size. For the larger members of this group, NUMFREQ@Grid was found to outperform analytic Hessian evaluation in wall clock time; at the MP2 or B2PLYP levels, these Hessians cannot even be evaluated analytically. We also evaluated a 156-atom catalytically relevant open-shell transition metal complex and found that NUMFREQ@Grid is faster (7.7 times shorter wall clock time) and less demanding (4.4 times less memory requirement) than an analytic Hessian. Capitalizing on the capabilities of parallel grid computing, NUMFREQ@Grid can outperform analytic methods in terms of wall time, memory requirements, and treatable system size. The NUMFREQ@Grid method presented herein demonstrates how grid computing can be used to facilitate embarrassingly parallel computational procedures and is a pioneer for future implementations.
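A generic sketch of the numerical-differentiation core (central differences of gradients, 6N gradient calls, each independently farmable to a grid node); this is not the NUMFREQ@Grid implementation itself:

```python
# Schematic central-difference Hessian built from 6N gradient evaluations,
# the embarrassingly parallel core of the approach described above. A generic
# sketch, not the NUMFREQ@Grid code.
import numpy as np

def numerical_hessian(grad, coords, h=1e-3):
    """grad: callable returning the 3N gradient; coords: flat 3N array."""
    n = coords.size
    H = np.zeros((n, n))
    for i in range(n):                    # 2 gradient calls per coordinate
        dx = np.zeros(n); dx[i] = h       # -> 6N calls for N atoms, all
        H[i] = (grad(coords + dx) - grad(coords - dx)) / (2 * h)  # independent
    return 0.5 * (H + H.T)                # symmetrise numerical noise

# Toy check on a harmonic potential k*|x|^2, whose exact Hessian is 2k*I:
k = 0.5
H = numerical_hessian(lambda x: 2 * k * x, np.zeros(3))
print(np.round(H, 6))                     # ~= identity for k = 0.5
```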
ERIC Educational Resources Information Center
Critchfield, Thomas S.; Becirevic, Amel; Reed, Derek D.
2017-01-01
It has often been suggested that nonexperts find the communication of behavior analysts to be viscerally off-putting. We argue that this concern should be the focus of systematic research rather than mere discussion, and describe five studies that illustrate how publicly available lists of word-emotion ratings can be used to estimate the responses…
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by the chemometrics community but also by other data science communities to extract relevant information from big datasets and provide its value to different applications. Besides the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented, together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets is shortly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Cantrill, Richard C
2008-01-01
Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocs. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate the purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review; in fact, ISO 21572, the "protein standard", has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards, an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.
Systematic reviews, systematic error and the acquisition of clinical knowledge
2010-01-01
Background Since its inception, evidence-based medicine and its application through systematic reviews, has been widely accepted. However, it has also been strongly criticised and resisted by some academic groups and clinicians. One of the main criticisms of evidence-based medicine is that it appears to claim to have unique access to absolute scientific truth and thus devalues and replaces other types of knowledge sources. Discussion The various types of clinical knowledge sources are categorised on the basis of Kant's categories of knowledge acquisition, as being either 'analytic' or 'synthetic'. It is shown that these categories do not act in opposition but rather, depend upon each other. The unity of analysis and synthesis in knowledge acquisition is demonstrated during the process of systematic reviewing of clinical trials. Systematic reviews constitute comprehensive synthesis of clinical knowledge but depend upon plausible, analytical hypothesis development for the trials reviewed. The dangers of systematic error regarding the internal validity of acquired knowledge are highlighted on the basis of empirical evidence. It has been shown that the systematic review process reduces systematic error, thus ensuring high internal validity. It is argued that this process does not exclude other types of knowledge sources. Instead, amongst these other types it functions as an integrated element during the acquisition of clinical knowledge. Conclusions The acquisition of clinical knowledge is based on interaction between analysis and synthesis. Systematic reviews provide the highest form of synthetic knowledge acquisition in terms of achieving internal validity of results. In that capacity it informs the analytic knowledge of the clinician but does not replace it. PMID:20537172
Importance of implementing an analytical quality control system in a core laboratory.
Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T
2015-01-01
The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to detect whether the objectives set are being met and, in case of errors, allows corrective actions to be taken, ensuring the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, and its systematic, random and total error, at regular intervals in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
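A sketch of the periodic quality-indicator calculation described above, using hypothetical control-material results and one common total-error formula:

```python
# Sketch of the quality indicators described above: systematic error (bias
# vs the assigned value), random error (SD/CV) and a common total-error
# estimate (|bias| + 1.65*CV). QC values are invented.
import numpy as np

results = np.array([5.1, 4.9, 5.3, 5.0, 5.2, 4.8])   # hypothetical QC values
target = 5.0                                          # assigned value

mean = results.mean()
sd = results.std(ddof=1)
cv = 100 * sd / mean
bias = 100 * (mean - target) / target                 # systematic error, %
total_error = abs(bias) + 1.65 * cv                   # common TE formula, %

print(f"mean={mean:.3f}  CV={cv:.2f}%  bias={bias:.2f}%  TE={total_error:.2f}%")
```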
[Hydrolyzable tannins; biochemistry, nutritional & analytical aspects and health effects].
Olivas-Aguirre, Francisco Javier; Wall-Medrano, Abraham; González-Aguilar, Gustavo A; López-Díaz, Jose Alberto; Álvarez-Parrilla, Emilio; de la Rosa, Laura A; Ramos-Jimenez, Arnulfo
2014-11-01
Hydrolysable tannins (HT) have been of scientific interest because of their nutraceutical potential. Both gallotannins (GT) and ellagitannins (ET) show different biochemical properties that result in various health benefits (e.g. anti-diabetic, anti-mutagenic, anti-microbial) for consumers, all associated with their antioxidant capacity (AOXc). To analyze the most relevant aspects (biochemical, nutritional/analytical and health effects) of HT reported in the scientific literature. A systematic search was conducted in several databases (PubMed, Cochrane, ScienceDirect) and free-access repositories (Google Scholar) on HT, GT and ET. This information was further sub-classified into biochemical, nutritional and analytical aspects (narrative review) and health effects (systematic review). The high molecular complexity and number of hydroxyl groups (-OH) in both ET and GT are responsible not only for the plethora of methods for their extraction and purification but also for their several pro- and anti-physiological effects, such as enzyme inhibition, stimulation of protein excretion, AOXc and anti-proliferative effects. The association of ET/GT with several macromolecules present in foodstuffs and the digestive tract counteracts the AOXc of these compounds but conversely allows the differential distribution of GT and ET to different target organs, in such a way that their health effects seem to be different. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
A methodological systematic review of what's wrong with meta-ethnography reporting.
France, Emma F; Ring, Nicola; Thomas, Rebecca; Noyes, Jane; Maxwell, Margaret; Jepson, Ruth
2014-11-19
Syntheses of qualitative studies can inform health policy, services and our understanding of patient experience. Meta-ethnography is a systematic seven-phase interpretive qualitative synthesis approach well-suited to producing new theories and conceptual models. However, there are concerns about the quality of meta-ethnography reporting, particularly the analysis and synthesis processes. Our aim was to investigate the application and reporting of methods in recent meta-ethnography journal papers, focusing on the analysis and synthesis process and output. Methodological systematic review of health-related meta-ethnography journal papers published from 2012-2013. We searched six electronic databases, Google Scholar and Zetoc for papers using key terms including 'meta-ethnography.' Two authors independently screened papers by title and abstract with 100% agreement. We identified 32 relevant papers. Three authors independently extracted data and all authors analysed the application and reporting of methods using content analysis. Meta-ethnography was applied in diverse ways, sometimes inappropriately. In 13% of papers the approach did not suit the research aim. In 66% of papers reviewers did not follow the principles of meta-ethnography. The analytical and synthesis processes were poorly reported overall. In only 31% of papers reviewers clearly described how they analysed conceptual data from primary studies (phase 5, 'translation' of studies) and in only one paper (3%) reviewers explicitly described how they conducted the analytic synthesis process (phase 6). In 38% of papers we could not ascertain if reviewers had achieved any new interpretation of primary studies. In over 30% of papers seminal methodological texts which could have informed methods were not cited. We believe this is the first in-depth methodological systematic review of meta-ethnography conduct and reporting. Meta-ethnography is an evolving approach. Current reporting of methods, analysis and synthesis lacks clarity and comprehensiveness. This is a major barrier to use of meta-ethnography findings that could contribute significantly to the evidence base because it makes judging their rigour and credibility difficult. To realise the high potential value of meta-ethnography for enhancing health care and understanding patient experience requires reporting that clearly conveys the methodology, analysis and findings. Tailored meta-ethnography reporting guidelines, developed through expert consensus, could improve reporting.
ERIC Educational Resources Information Center
Kangas, Maria; Bovbjerg, Dana H.; Montgomery, Guy H.
2009-01-01
Reports an error in "Cancer-related fatigue: A systematic and meta-analytic review of non-pharmacological therapies for cancer patients" by Maria Kangas, Dana H. Bovbjerg and Guy H. Montgomery (Psychological Bulletin, 2008[Sep], Vol 134[5], 700-741). The URL to the Supplemental Materials for the article is listed incorrectly in two places in the…
Gentles, Stephen J; Charles, Cathy; Nicholas, David B; Ploeg, Jenny; McKibbon, K Ann
2016-10-11
Overviews of methods are potentially useful means to increase clarity and enhance collective understanding of specific methods topics that may be characterized by ambiguity, inconsistency, or a lack of comprehensiveness. This type of review represents a distinct literature synthesis method, although to date, its methodology remains relatively undeveloped despite several aspects that demand unique review procedures. The purpose of this paper is to initiate discussion about what a rigorous systematic approach to reviews of methods, referred to here as systematic methods overviews, might look like by providing tentative suggestions for approaching specific challenges likely to be encountered. The guidance offered here was derived from experience conducting a systematic methods overview on the topic of sampling in qualitative research. The guidance is organized into several principles that highlight specific objectives for this type of review given the common challenges that must be overcome to achieve them. Optional strategies for achieving each principle are also proposed, along with discussion of how they were successfully implemented in the overview on sampling. We describe seven paired principles and strategies that address the following aspects: delimiting the initial set of publications to consider, searching beyond standard bibliographic databases, searching without the availability of relevant metadata, selecting publications on purposeful conceptual grounds, defining concepts and other information to abstract iteratively, accounting for inconsistent terminology used to describe specific methods topics, and generating rigorous verifiable analytic interpretations. Since a broad aim in systematic methods overviews is to describe and interpret the relevant literature in qualitative terms, we suggest that iterative decision making at various stages of the review process, and a rigorous qualitative approach to analysis are necessary features of this review type. We believe that the principles and strategies provided here will be useful to anyone choosing to undertake a systematic methods overview. This paper represents an initial effort to promote high quality critical evaluations of the literature regarding problematic methods topics, which have the potential to promote clearer, shared understandings, and accelerate advances in research methods. Further work is warranted to develop more definitive guidance.
2014-01-01
Background To improve quality of care and patient outcomes, health system decision-makers need to identify and implement effective interventions. An increasing number of systematic reviews document the effects of quality improvement programs to assist decision-makers in developing new initiatives. However, limitations in the reporting of primary studies and current meta-analysis methods (including approaches for exploring heterogeneity) reduce the utility of existing syntheses for health system decision-makers. This study will explore the role of innovative meta-analysis approaches and the added value of enriched and updated data for increasing the utility of systematic reviews of complex interventions. Methods/Design We will use the dataset from our recent systematic review of 142 randomized trials of diabetes quality improvement programs to evaluate novel approaches for exploring heterogeneity. These will include exploratory methods, such as multivariate meta-regression analyses and all-subsets combinatorial meta-analysis. We will then update our systematic review to include new trials and enrich the dataset by surveying authors of all included trials. In doing so, we will explore the impact of variables not reported in previous publications, such as details of study context, on the effectiveness of the intervention. We will use innovative analytical methods on the enriched and updated dataset to identify key success factors in the implementation of quality improvement interventions for diabetes. Decision-makers will be involved throughout to help identify and prioritize variables to be explored and to aid in the interpretation and dissemination of results. Discussion This study will inform future systematic reviews of complex interventions and describe the value of enriching and updating data for exploring heterogeneity in meta-analysis. It will also result in an updated comprehensive systematic review of diabetes quality improvement interventions that will be useful to health system decision-makers in developing interventions to improve outcomes for people with diabetes. Systematic review registration PROSPERO registration no. CRD42013005165 PMID:25115289
Evaluating the performance of free-formed surface parts using an analytic network process
NASA Astrophysics Data System (ADS)
Qian, Xueming; Ma, Yanqiao; Liang, Dezhi
2018-03-01
To successfully design parts with a free-formed surface, the critical issue arises of how to evaluate and select a favourable evaluation strategy before design. The evaluation of free-formed surface parts is a multiple criteria decision-making (MCDM) problem that requires the consideration of a large number of interdependent factors. The analytic network process (ANP) is a relatively new MCDM method that can systematically deal with all kinds of dependences. In this paper, factors drawn from the product life-cycle that influence the design of free-formed surface parts are proposed. After analysing the interdependence among these factors, a Hybrid ANP (HANP) structure for evaluating the part's curved surface is constructed. Then, a HANP evaluation of an impeller is presented to illustrate the application of the proposed method.
Hutton, Brian; Wolfe, Dianna; Moher, David; Shamseer, Larissa
2017-05-01
Research waste has received considerable attention from the biomedical community. One noteworthy contributor is incomplete reporting in research publications. When detailing statistical methods and results, ensuring that analytic methods and findings are completely documented improves transparency. For publications describing randomised trials and systematic reviews, guidelines have been developed to facilitate complete reporting. This overview summarises aspects of statistical reporting in trials and systematic reviews of health interventions. A narrative approach was taken to summarise features regarding statistical methods and results from reporting guidelines for trials and reviews. We aim to enhance familiarity with the statistical details that should be reported in biomedical research among statisticians and their collaborators. We summarise statistical reporting considerations for trials and systematic reviews from guidance documents including the Consolidated Standards of Reporting Trials (CONSORT) Statement for reporting of trials, the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Statement for trial protocols, the Statistical Analyses and Methods in the Published Literature (SAMPL) Guidelines for statistical reporting principles, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement for systematic reviews, and PRISMA for Protocols (PRISMA-P). Considerations regarding the sharing of study data and statistical code are also addressed. Reporting guidelines provide researchers with minimum criteria for reporting. If followed, they can enhance research transparency and contribute to improving the quality of biomedical publications. Authors should employ these tools for the planning and reporting of their research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Unbiased methods for removing systematics from galaxy clustering measurements
NASA Astrophysics Data System (ADS)
Elsner, Franz; Leistedt, Boris; Peiris, Hiranya V.
2016-02-01
Measuring the angular clustering of galaxies as a function of redshift is a powerful method for extracting information from the three-dimensional galaxy distribution. The precision of such measurements will dramatically increase with ongoing and future wide-field galaxy surveys. However, these are also increasingly sensitive to observational and astrophysical contaminants. Here, we study the statistical properties of three methods proposed for controlling such systematics - template subtraction, basic mode projection, and extended mode projection - all of which make use of externally supplied template maps, designed to characterize and capture the spatial variations of potential systematic effects. Based on a detailed mathematical analysis, and in agreement with simulations, we find that the template subtraction method in its original formulation returns biased estimates of the galaxy angular clustering. We derive closed-form expressions that should be used to correct results for this shortcoming. Turning to the basic mode projection algorithm, we prove it to be free of any bias, whereas we conclude that results computed with extended mode projection are biased. Within a simplified setup, we derive analytical expressions for the bias and discuss the options for correcting it in more realistic configurations. Common to all three methods is an increased estimator variance induced by the cleaning process, albeit at different levels. These results enable unbiased high-precision clustering measurements in the presence of spatially varying systematics, an essential step towards realizing the full potential of current and planned galaxy surveys.
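A toy sketch of basic mode projection, the simplest of the three methods compared above: the data vector is cleaned by projecting out the subspace spanned by the systematics templates (this mirrors the idea, not the authors' full estimator):

```python
# Toy sketch of basic mode projection: remove from a data vector the part
# lying along externally supplied systematics templates. Synthetic data;
# this mirrors the idea discussed above, not the paper's full estimator.
import numpy as np

rng = np.random.default_rng(1)
npix = 1000
templates = rng.normal(size=(npix, 2))      # two hypothetical template maps
signal = rng.normal(size=npix)
data = signal + templates @ np.array([0.8, -0.5])   # contaminated map

# Project out the template subspace: d_clean = (I - T (T^T T)^-1 T^T) d
P = templates @ np.linalg.solve(templates.T @ templates, templates.T)
cleaned = data - P @ data

print(np.corrcoef(cleaned, signal)[0, 1])   # close to 1: contamination removed
```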
Calibrant-Free Analyte Quantitation via a Variable Velocity Flow Cell.
Beck, Jason G; Skuratovsky, Aleksander; Granger, Michael C; Porter, Marc D
2017-01-17
In this paper, we describe a novel method for analyte quantitation that does not rely on calibrants, internal standards, or calibration curves but, rather, leverages the relationship between disparate and predictable surface-directed analyte flux to an array of sensing addresses and a measured resultant signal. To reduce this concept to practice, we fabricated two flow cells such that the mean linear fluid velocity, U, was varied systematically over an array of electrodes positioned along the flow axis. This resulted in a predictable variation of the address-directed flux of a redox analyte, ferrocenedimethanol (FDM). The resultant limiting currents measured at a series of these electrodes, and accurately described by a convective-diffusive transport model, provided a means to calculate an "unknown" concentration without the use of calibrants, internal standards, or a calibration curve. Furthermore, the experiment and concentration calculation only takes minutes to perform. Deviation in calculated FDM concentrations from true values was minimized to less than 0.5% when empirically derived values of U were employed.
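A hedged numerical sketch of the idea: if a convective-diffusive model predicts each electrode's sensitivity from geometry and flow (a Levich-like U^(1/3) scaling is assumed here purely for illustration), the concentration follows from a least-squares fit with no calibration curve; all numbers are invented:

```python
# Hedged sketch: with model-predicted sensitivities beta_j per electrode,
# i_j = beta_j * c, so the concentration follows by linear least squares
# without calibrants. The U^(1/3) scaling and all values are assumptions.
import numpy as np

U = np.array([0.5, 1.0, 2.0, 4.0])           # mean linear velocities, cm/s
k_geom = 2.0e-4                              # lumped geometric/transport factor
beta = k_geom * U ** (1 / 3)                 # model sensitivity per electrode

c_true = 0.25                                # mM, used only to fake "data"
i_meas = beta * c_true * (1 + np.random.default_rng(2).normal(0, 0.01, 4))

c_hat = np.dot(beta, i_meas) / np.dot(beta, beta)   # least-squares estimate
print(f"estimated concentration: {c_hat:.4f} mM")
```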
Van Ham, Rita; Van Vaeck, Luc; Adams, Freddy C; Adriaens, Annemie
2004-05-01
The analytical use of mass spectra from static secondary ion mass spectrometry for the molecular identification of inorganic analytes in real-life surface layers and microobjects requires empirical insight into the signals to be expected from a given compound. A comprehensive database comprising over 50 salts has been assembled to complement prior data on oxides. The present study allows the systematic trends in the relationship between the detected signals and the molecular composition of the analyte to be delineated. The mass spectra provide diagnostic information by means of atomic ions, structural fragments, molecular ions, and adduct ions of the analyte neutrals. The prediction of mass spectra for a given analyte must account for the charge state of the ions in the salt, the formation of oxide-type neutrals from oxy salts, and the occurrence of oxidation-reduction processes.
Explanation-based generalization of partially ordered plans
NASA Technical Reports Server (NTRS)
Kambhampati, Subbarao; Kedar, Smadar
1991-01-01
Most previous work in analytic generalization of plans dealt with totally ordered plans. These methods cannot be directly applied to generalizing partially ordered plans, since they do not capture all interactions among plan operators for all total orders of such plans. We introduce a new method for generalizing partially ordered plans. This method is based on providing explanation-based generalization (EBG) with explanations which systematically capture the interactions among plan operators for all the total orders of a partially-ordered plan. The explanations are based on the Modal Truth Criterion which states the necessary and sufficient conditions for ensuring the truth of a proposition at any point in a plan, for a class of partially ordered plans. The generalizations obtained by this method guarantee successful and interaction-free execution of any total order of the generalized plan. In addition, the systematic derivation of the generalization algorithms from the Modal Truth Criterion obviates the need for carrying out a separate formal proof of correctness of the EBG algorithms.
Advances in aptamer screening and small molecule aptasensors.
Kim, Yeon Seok; Gu, Man Bock
2014-01-01
It has been 20 years since aptamer and SELEX (systematic evolution of ligands by exponential enrichment) were described independently by Andrew Ellington and Larry Gold. Based on the great advantages of aptamers, there have been numerous isolated aptamers for various targets that have actively been applied as therapeutic and analytical tools. Over 2,000 papers related to aptamers or SELEX have been published, attesting to their wide usefulness and the applicability of aptamers. SELEX methods have been modified or re-created over the years to enable aptamer isolation with higher affinity and selectivity in more labor- and time-efficient manners, including automation. Initially, most of the studies about aptamers have focused on the protein targets, which have physiological functions in the body, and their applications as therapeutic agents or receptors for diagnostics. However, aptamers for small molecules such as organic or inorganic compounds, drugs, antibiotics, or metabolites have not been studied sufficiently, despite the ever-increasing need for rapid and simple analytical methods for various chemical targets in the fields of medical diagnostics, environmental monitoring, food safety, and national defense against targets including chemical warfare. This review focuses on not only recent advances in aptamer screening methods but also its analytical application for small molecules.
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
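As an illustration of the chemometric side of PAT, here is a minimal sketch, using hypothetical spectral data, of partial least squares (PLS) regression with cross-validated error estimation, one common way of tying multidimensional PAT spectra to a critical quality attribute:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical data: rows = in-process samples, columns = spectral channels
# (e.g., NIR absorbances); y = reference value of a critical quality attribute.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 400))
y = 0.8 * X[:, 50] + 0.3 * X[:, 200] + rng.normal(scale=0.05, size=60)

pls = PLSRegression(n_components=5)              # latent variables chosen by validation
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))       # cross-validated prediction error
print(f"RMSECV = {rmsecv:.3f}")
```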
NASA Astrophysics Data System (ADS)
Tauscher, Keith; Rapetti, David; Burns, Jack O.; Switzer, Eric
2018-02-01
The sky-averaged (global) highly redshifted 21 cm spectrum from neutral hydrogen is expected to appear in the VHF range of ∼20–200 MHz and its spectral shape and strength are determined by the heating properties of the first stars and black holes, by the nature and duration of reionization, and by the presence or absence of exotic physics. Measurements of the global signal would therefore provide us with a wealth of astrophysical and cosmological knowledge. However, the signal has not yet been detected because it must be seen through strong foregrounds weighted by a large beam, instrumental calibration errors, and ionospheric, ground, and radio-frequency-interference effects, which we collectively refer to as “systematics.” Here, we present a signal extraction method for global signal experiments which uses Singular Value Decomposition of “training sets” to produce systematics basis functions specifically suited to each observation. Instead of requiring precise absolute knowledge of the systematics, our method effectively requires precise knowledge of how the systematics can vary. After calculating eigenmodes for the signal and systematics, we perform a weighted least square fit of the corresponding coefficients and select the number of modes to include by minimizing an information criterion. We compare the performance of the signal extraction when minimizing various information criteria and find that minimizing the Deviance Information Criterion most consistently yields unbiased fits. The methods used here are built into our widely applicable, publicly available Python package, pylinex, which analytically calculates constraints on signals and systematics from given data, errors, and training sets.
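The core numerical idea (eigenmodes from training-set SVD, a weighted least squares fit of their coefficients, and an information criterion to select the number of modes) can be sketched in a few lines. The following is a schematic stand-in rather than the pylinex API; it scores with a BIC-like penalty where the paper favors the Deviance Information Criterion, and all names are hypothetical:

```python
import numpy as np

def svd_basis(training_set, n_modes):
    """Basis functions = leading right singular vectors of a training set
    whose rows are simulated curves (signal or systematics realizations)."""
    return np.linalg.svd(training_set, full_matrices=False)[2][:n_modes]

def fit_with_bases(data, noise_std, bases):
    """Weighted least squares fit of concatenated eigenmode bases."""
    a = np.vstack(bases).T                       # design matrix (n_channels, n_modes)
    w = 1.0 / noise_std**2                       # per-channel inverse variance
    cov = np.linalg.inv(a.T @ (w[:, None] * a))
    coeffs = cov @ (a.T @ (w * data))
    chi2 = np.sum(((data - a @ coeffs) / noise_std) ** 2)
    return coeffs, chi2

def choose_n_modes(data, noise_std, signal_train, sys_train, max_modes=15):
    """Pick mode counts by minimizing a BIC-like criterion (the paper
    minimizes the Deviance Information Criterion instead)."""
    best = None
    for ns in range(1, max_modes):
        for nf in range(1, max_modes):
            bases = [svd_basis(signal_train, ns), svd_basis(sys_train, nf)]
            _, chi2 = fit_with_bases(data, noise_std, bases)
            ic = chi2 + (ns + nf) * np.log(data.size)
            if best is None or ic < best[0]:
                best = (ic, ns, nf)
    return best   # (criterion value, signal modes, systematics modes)
```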
Calculation method of spin accumulations and spin signals in nanostructures using spin resistors
NASA Astrophysics Data System (ADS)
Torres, Williams Savero; Marty, Alain; Laczkowski, Piotr; Jamet, Matthieu; Vila, Laurent; Attané, Jean-Philippe
2018-02-01
Determination of spin accumulations and spin currents is essential for a deep understanding of spin transport in nanostructures and further optimization of spintronic devices. So far, they have been readily obtained using different approaches in nanostructures composed of a few elements; however, their calculation becomes complicated as the number of elements increases. Here, we propose a 1-D spin resistor approach to analytically calculate spin accumulations, spin currents and magnetoresistances in heterostructures. Our method, particularly suited to multi-terminal metallic nanostructures, provides a fast and systematic means to determine such spin properties in structures where conventional methods remain complex.
Systematic Review of the Human Milk Microbiota.
Fitzstevens, John L; Smith, Kelsey C; Hagadorn, James I; Caimano, Melissa J; Matson, Adam P; Brownell, Elizabeth A
2017-06-01
Human milk-associated microbes are among the first to colonize the infant gut and may help to shape both short- and long-term infant health outcomes. We performed a systematic review to characterize the microbiota of human milk. Relevant primary studies were identified through a comprehensive search of PubMed (January 1, 1964, to June 30, 2015). Included studies were conducted among healthy mothers, were written in English, identified bacteria in human milk, used culture-independent methods, and reported primary results at the genus level. Twelve studies satisfied inclusion criteria. All varied in geographic location and human milk collection/storage/analytic methods. Streptococcus was identified in human milk samples in 11 studies (91.6%) and Staphylococcus in 10 (83.3%); both were predominant genera in 6 (50%). Eight of the 12 studies used conventional ribosomal RNA (rRNA) polymerase chain reaction (PCR), of which 7 (87.5%) identified Streptococcus and 6 (75%) identified Staphylococcus as present. Of these 8 studies, 2 (25%) identified Streptococcus and Staphylococcus as predominant genera. Four of the 12 studies used next-generation sequencing (NGS), all of which identified Streptococcus and Staphylococcus as present and predominant genera. Relative to conventional rRNA PCR, NGS is a more sensitive method to identify/quantify bacterial genera in human milk, suggesting the predominance of Streptococcus and Staphylococcus may be underestimated in studies using older methods. These genera, Streptococcus and Staphylococcus, may be universally predominant in human milk, regardless of differences in geographic location or analytic methods. Primary studies designed to evaluate the effect of these 2 genera on short- and long-term infant outcomes are warranted.
Music education and its effect on intellectual abilities in children: a systematic review.
Jaschke, Artur C; Eggermont, Laura H P; Honing, Henkjan; Scherder, Erik J A
2013-01-01
Far transfer between music education and other cognitive skills, such as academic achievement, has been widely examined. However, the results of studies within similar cognitive domains are found to be inconclusive or contradictory. These differences can be traced back to the analytical methods used, differences in the forms of music education studied and differences in neural activation during the processing of these tasks. In order to gain a better picture of the relationships involved, a literature survey was performed in leading databases, such as PubMed/MedLine, PsycINFO, ScienceDirect, Embase, ERIC, ASSIA and Jstor from January 2001 to January 2013. All included studies concerned far transfer from music education to other cognitive skills in children aged 4-13 years, as compared with controls. These studies were independently selected and their quality was assessed by two authors. This systematic review shows the need to address methodological and analytical questions in greater detail. There is a general need to unify methods used in music education research. Furthermore, the hypothesis that intellectual skills, such as mathematics, reading, writing and intelligence can be divided into sub-functions, needs to be examined as one approach to the problems considered here. When this has been done, detailed analysis of cognitive transfer from music education to other disciplines should become possible.
Analytical Energy Gradients for Excited-State Coupled-Cluster Methods
NASA Astrophysics Data System (ADS)
Wladyslawski, Mark; Nooijen, Marcel
The equation-of-motion coupled-cluster (EOM-CC) and similarity transformed equation-of-motion coupled-cluster (STEOM-CC) methods have been firmly established as accurate and routinely applicable extensions of single-reference coupled-cluster theory to describe electronically excited states. An overview of these methods is provided, with emphasis on the many-body similarity transform concept that is the key to a rationalization of their accuracy. The main topic of the paper is the derivation of analytical energy gradients for such non-variational electronic structure approaches, with an ultimate focus on obtaining their detailed algebraic working equations. A general theoretical framework using Lagrange's method of undetermined multipliers is presented, and the method is applied to formulate the EOM-CC and STEOM-CC gradients in abstract operator terms, following the previous work in [P.G. Szalay, Int. J. Quantum Chem. 55 (1995) 151] and [S.R. Gwaltney, R.J. Bartlett, M. Nooijen, J. Chem. Phys. 111 (1999) 58]. Moreover, the systematics of the Lagrange multiplier approach is suitable for automation by computer, enabling the derivation of the detailed derivative equations through a standardized and direct procedure. To this end, we have developed the SMART (Symbolic Manipulation and Regrouping of Tensors) package of automated symbolic algebra routines, written in the Mathematica programming language. The SMART toolkit provides the means to expand, differentiate, and simplify equations by manipulation of the detailed algebraic tensor expressions directly. The Lagrangian multiplier formulation establishes a uniform strategy to perform the automated derivation in a standardized manner: A Lagrange multiplier functional is constructed from the explicit algebraic equations that define the energy in the electronic method; the energy functional is then made fully variational with respect to all of its parameters, and the symbolic differentiations directly yield the explicit equations for the wavefunction amplitudes, the Lagrange multipliers, and the analytical gradient via the perturbation-independent generalized Hellmann-Feynman effective density matrix. This systematic automated derivation procedure is applied to obtain the detailed gradient equations for the excitation energy (EE-), double ionization potential (DIP-), and double electron affinity (DEA-) similarity transformed equation-of-motion coupled-cluster singles-and-doubles (STEOM-CCSD) methods. In addition, the derivatives of the closed-shell-reference excitation energy (EE-), ionization potential (IP-), and electron affinity (EA-) equation-of-motion coupled-cluster singles-and-doubles (EOM-CCSD) methods are derived. Furthermore, the perturbative EOM-PT and STEOM-PT gradients are obtained. The algebraic derivative expressions for these dozen methods are all derived here uniformly through the automated Lagrange multiplier process and are expressed compactly in a chain-rule/intermediate-density formulation, which facilitates a unified modular implementation of analytic energy gradients for CCSD/PT-based electronic methods. The working equations for these analytical gradients are presented in full detail, and their factorization and implementation into an efficient computer code are discussed.
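Schematically, the Lagrange multiplier strategy described above follows the textbook pattern for gradients of non-variational energies; in generic notation (not the paper's detailed working equations), with amplitudes t, perturbation χ, and derivative integrals h_pq:

```latex
\begin{align*}
\mathcal{L}(\mathbf{t},\boldsymbol{\lambda};\chi)
   &= E(\mathbf{t};\chi) + \boldsymbol{\lambda}\cdot\mathbf{g}(\mathbf{t};\chi),
   \qquad \mathbf{g}(\mathbf{t};\chi)=0 \;\;\text{(amplitude equations)},\\
0 &= \frac{\partial\mathcal{L}}{\partial\mathbf{t}}
   \;\;\Longrightarrow\;\; \text{linear equations for the multipliers } \boldsymbol{\lambda},\\
\frac{dE}{d\chi} &= \frac{\partial\mathcal{L}}{\partial\chi}
   = \frac{\partial E}{\partial\chi}
     + \boldsymbol{\lambda}\cdot\frac{\partial\mathbf{g}}{\partial\chi}
   \;=\; \sum_{pq} D^{\text{eff}}_{pq}\,\frac{\partial h_{pq}}{\partial\chi}.
\end{align*}
```

Because the Lagrangian is stationary with respect to all wavefunction parameters, the amplitude responses to each perturbation never need to be solved for; a single set of multiplier equations per state yields the perturbation-independent effective density D_eff.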
Methods for the thematic synthesis of qualitative research in systematic reviews
Thomas, James; Harden, Angela
2008-01-01
Background There is a growing recognition of the value of synthesising qualitative research in the evidence base in order to facilitate effective and appropriate health care. In response to this, methods for undertaking these syntheses are currently being developed. Thematic analysis is a method that is often used to analyse data in primary qualitative research. This paper reports on the use of this type of analysis in systematic reviews to bring together and integrate the findings of multiple qualitative studies. Methods We describe thematic synthesis, outline several steps for its conduct and illustrate the process and outcome of this approach using a completed review of health promotion research. Thematic synthesis has three stages: the coding of text 'line-by-line'; the development of 'descriptive themes'; and the generation of 'analytical themes'. While the development of descriptive themes remains 'close' to the primary studies, the analytical themes represent a stage of interpretation whereby the reviewers 'go beyond' the primary studies and generate new interpretive constructs, explanations or hypotheses. The use of computer software can facilitate this method of synthesis; detailed guidance is given on how this can be achieved. Results We used thematic synthesis to combine the studies of children's views and identified key themes to explore in the intervention studies. Most interventions were based in school and often combined learning about health benefits with 'hands-on' experience. The studies of children's views suggested that fruit and vegetables should be treated in different ways, and that messages should not focus on health warnings. Interventions that were in line with these suggestions tended to be more effective. Thematic synthesis enabled us to stay 'close' to the results of the primary studies, synthesising them in a transparent way, and facilitating the explicit production of new concepts and hypotheses. Conclusion We compare thematic synthesis to other methods for the synthesis of qualitative research, discussing issues of context and rigour. Thematic synthesis is presented as a tried and tested method that preserves an explicit and transparent link between conclusions and the text of primary studies; as such it preserves principles that have traditionally been important to systematic reviewing. PMID:18616818
Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David
2015-01-01
Introduction Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix, evaluating model components and methodology. Conclusion When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes, test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g. HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
Avoiding pitfalls in the determination of halocarboxylic acids: the photochemistry of methylation.
Rubio, F J; Urbansky, E T; Magnuson, M L
2000-06-01
Haloethanoic (haloacetic) acids are formed during chlorination of drinking water and are regulated by the Environmental Protection Agency (EPA). These compounds are normally quantified by gas chromatography with electron capture detection (GC-ECD) as the methyl esters. EPA Method 552 uses diazomethane (CH2N2) for this purpose, but has only been validated by EPA for HAA6: chloro-, dichloro-, bromo-, dibromo-, bromochloro- and trichloroacetic acids. EPA Method 552.2 was developed and validated for all nine analytes (HAA9 = HAA6 + dibromochloro-, bromodichloro- and tribromoethanoic acids). Since the promulgation of Method 552.2, which uses acidic methanol, a debate has ensued over discrepancies observed by various laboratories when using diazomethane instead. In an effort to identify and eliminate potential sources for these discrepancies, a comparative study was undertaken for HAA9. Better accuracy and precision were observed for all HAA9 species by Method 552.2; recoveries were satisfactory in de-ionized and tap water. Method 552 remains satisfactory for HAA6. Systematic differences in instrumental response are observed for the two methods, but these are precise and may be accounted for using similarly treated standards and analyte-fortified (spiked) samples. That notwithstanding, Method 552 (CH2N2) was shown to be unsuitable for dibromochloro-, bromodichloro- and tribromoethanoic acids (HAA9-6). The primary problem appears to be a photoactivated reaction between diazomethane and the HAA9-6 analytes; however, side reactions were found to occur even in the dark. Analyte loss is most pronounced under typical laboratory lighting (white F40 fluorescent lamps + sunlight), but it is also observed under Philips gold F40 lamps (λ ≥ 520 nm), and in the dark.
An analytic approach to resolving problems in medical ethics.
Candee, D; Puka, B
1984-01-01
Education in ethics among practising professionals should provide a systematic procedure for resolving moral problems. A method for such decision-making is outlined using the two classical orientations in moral philosophy, teleology and deontology. Teleological views such as utilitarianism resolve moral dilemmas by calculating the excess of good over harm expected to be produced by each feasible alternative for action. The deontological view focuses on rights, duties, and principles of justice. Both methods are used to resolve the 1971 Johns Hopkins case of a baby born with Down's syndrome and duodenal atresia. PMID:6234395
U-Th-Pb, Sm-Nd, Rb-Sr, and Lu-Hf systematics of returned Mars samples
NASA Technical Reports Server (NTRS)
Tatsumoto, M.; Premo, W. R.
1988-01-01
The advantage of studying returned planetary samples cannot be overstated. A wider range of analytical techniques with higher sensitivities and accuracies can be applied to returned samples. Measurement of U-Th-Pb, Sm-Nd, Rb-Sr, and Lu-Hf isotopic systematics for chronology and isotopic tracer studies of planetary specimens cannot be done in situ with desirable precision. Returned Mars samples will be examined using all the physical, chemical, and geologic methods necessary to gain information on the origin and evolution of Mars. A returned Martian sample would provide ample information regarding the accretionary and evolutionary history of the Martian planetary body and possibly other planets of our solar system.
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description of the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scopes of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.
[Marketing research in health service].
Ameri, Cinzia; Fiorini, Fulvio
2015-01-01
Marketing research is the systematic and objective search for, and analysis of, information relevant to the identification and solution of any problem in the field of marketing. The key words in this definition are: systematic, objective and analysis. Marketing research seeks to set about its task in a systematic and objective fashion. This means that a detailed and carefully designed research plan is developed in which each stage of the research is specified. Such a research plan is only considered adequate if it specifies: the research problem in concise and precise terms, the information necessary to address the problem, the methods to be employed in gathering the information and the analytical techniques to be used to interpret it. Maintaining objectivity in marketing research is essential if marketing management is to have sufficient confidence in its results to be prepared to take risky decisions based upon those results. To this end, as far as possible, marketing researchers employ the scientific method. The characteristics of the scientific method are that it translates personal prejudices, notions and opinions into explicit propositions (or hypotheses). These are tested empirically. At the same time alternative explanations of the event or phenomena of interest are given equal consideration.
Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.
Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J
2017-12-01
Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
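For concreteness, here is a minimal sketch of the two standard pooling approaches under discussion, inverse-variance fixed-effect and DerSimonian-Laird random-effects, with illustrative numbers; it is not the improved fixed-effect-framework estimator the authors advocate:

```python
import numpy as np

def pool(effects, variances):
    """Inverse-variance fixed-effect and DerSimonian-Laird random-effects pooling."""
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * e) / np.sum(w)
    q = np.sum(w * (e - fixed) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(e) - 1)) / c)          # between-study variance estimate
    w_re = 1.0 / (v + tau2)
    random = np.sum(w_re * e) / np.sum(w_re)
    return {"fixed": (fixed, 1 / np.sum(w)),         # (estimate, variance)
            "random": (random, 1 / np.sum(w_re)),
            "tau2": tau2}

print(pool([0.10, 0.30, 0.25], [0.01, 0.02, 0.015]))
```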
Ates, Ebru; Mittendorf, Klaus; Senyuva, Hamide
2013-01-01
An automated sample preparation technique involving cleanup and analytical separation in a single operation using an online coupled TurboFlow (RP-LC system) is reported. This method eliminates time-consuming sample preparation steps that can be potential sources for cross-contamination in the analysis of plasticizers. Using TurboFlow chromatography, liquid samples were injected directly into the automated system without previous extraction or cleanup. Special cleanup columns enabled specific binding of target compounds; higher MW compounds, i.e., fats and proteins, and other matrix interferences with different chemical properties were removed to waste, prior to LC/MS/MS. Systematic stepwise method development using this new technology in the food safety area is described. Selection of optimum columns and mobile phases for loading onto the cleanup column followed by transfer onto the analytical column and MS detection are critical method parameters. The method was optimized for the assay of 10 phthalates (dimethyl, diethyl, dipropyl, butyl benzyl, diisobutyl, dicyclohexyl, dihexyl, diethylhexyl, diisononyl, and diisododecyl) and one adipate (diethylhexyl) in beverages and milk.
Measurement of the $B^-$ lifetime using a simulation free approach for trigger bias correction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaltonen, T.; /Helsinki Inst. of Phys.; Adelman, J.
2010-04-01
The collection of a large number of B hadron decays to hadronic final states at the CDF II detector is possible due to the presence of a trigger that selects events based on track impact parameters. However, the nature of the selection requirements of the trigger introduces a large bias in the observed proper decay time distribution. A lifetime measurement must correct for this bias, and the conventional approach has been to use a Monte Carlo simulation. The leading sources of systematic uncertainty in the conventional approach are due to differences between the data and the Monte Carlo simulation. In this paper the authors present an analytic method for bias correction without using simulation, thereby removing any uncertainty between data and simulation. This method is presented in the form of a measurement of the lifetime of the B⁻ using the mode B⁻ → D⁰π⁻. The B⁻ lifetime is measured as τ(B⁻) = 1.663 ± 0.023 ± 0.015 ps, where the first uncertainty is statistical and the second systematic. This new method results in a smaller systematic uncertainty in comparison to methods that use simulation to correct for the trigger bias.
NASA Astrophysics Data System (ADS)
Gu, Huajie; Duan, Nuo; Wu, Shijia; Hao, Liling; Xia, Yu; Ma, Xiaoyuan; Wang, Zhouping
2016-02-01
Okadaic acid (OA) is a low-molecular-weight marine toxin from shellfish that causes abdominal pain, vomiting and diarrhea, i.e., diarrheic shellfish poisoning. In this study, a ssDNA aptamer that specifically binds to OA with high affinity was obtained via Systematic Evolution of Ligands by Exponential Enrichment (SELEX) assisted by graphene oxide (GO). This aptamer was then applied to fabricate a novel direct competitive enzyme-linked aptamer assay (ELAA). At the optimized conditions, this ELAA method showed a low detection limit (LOD of 0.01 ng/mL), wide linear range (from 0.025 to 10 ng/mL), good recovery rate (92.86-103.34% in OA-spiked clam samples) and repeatability (RSD of 2.28-4.53%). The proposed method can be used to detect OA in seafood products with high sensitivity and can potentially be adapted for the determination of other small molecular analytes.
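A competitive assay of this kind is typically read against a four-parameter logistic calibration curve. Here is a minimal sketch with hypothetical standards and absorbances; the constants and inversion formula are generic, not values from this study:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    # a: response at zero analyte; d: response at saturation;
    # c: midpoint (IC50); b: slope. Competitive format: signal falls with [OA].
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([0.025, 0.1, 0.5, 1, 2.5, 5, 10])              # ng/mL standards
resp = np.array([1.95, 1.80, 1.40, 1.10, 0.75, 0.55, 0.40])    # hypothetical A450
popt, _ = curve_fit(four_pl, conc, resp, p0=[2.0, 1.0, 1.0, 0.3], maxfev=10000)

def invert(signal, a, b, c, d):
    """Back-calculate concentration from a measured signal."""
    return c * ((a - d) / (signal - d) - 1.0) ** (1.0 / b)

print(f"unknown ≈ {invert(1.0, *popt):.2f} ng/mL")
```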
NASA Astrophysics Data System (ADS)
Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua
2016-12-01
Simple, accurate and high-throughput pretreatment methods would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism have limited the development of this promising method. Here, this work establishes theory-based experimental design for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interaction, hydrogen-bonding interaction, electrostatic interaction) for the first time. Application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment of trace analytes from environmental, biological and clinical samples under theory-based experimental design.
Subtracting infrared renormalons from Wilson coefficients: Uniqueness and power dependences on ΛQCD
NASA Astrophysics Data System (ADS)
Mishima, Go; Sumino, Yukinari; Takaura, Hiromasa
2017-06-01
In the context of operator product expansion (OPE) and using the large-β0 approximation, we propose a method to define Wilson coefficients free from uncertainties due to IR renormalons. We first introduce a general observable X(Q²) with an explicit IR cutoff, and then we extract a genuine UV contribution X_UV as a cutoff-independent part. X_UV includes power corrections ~(Λ_QCD²/Q²)ⁿ which are independent of renormalons. Using the integration-by-regions method, we observe that X_UV coincides with the leading Wilson coefficient in OPE and also clarify that the power corrections originate from the UV region. We examine scheme dependence of X_UV and single out a specific scheme favorable in terms of analytical properties. Our method would be optimal with respect to systematicity, analyticity and stability. We test our formulation with the examples of the Adler function, the QCD force between Q and Q̄, and the R-ratio in e⁺e⁻ collisions.
Measuring allostatic load in the workforce: a systematic review
MAUSS, Daniel; LI, Jian; SCHMIDT, Burkhard; ANGERER, Peter; JARCZOK, Marc N.
2014-01-01
The Allostatic Load Index (ALI) has been used to establish associations between stress and health-related outcomes. This review summarizes the measurement and methodological challenges of allostatic load in occupational settings. Databases of Medline, PubPsych, and Cochrane were searched to systematically explore studies measuring ALI in working adults following the PRISMA statement. Study characteristics, biomarkers and methods were tabulated. Methodological quality was evaluated using a standardized checklist. Sixteen articles (2003–2013) met the inclusion criteria, with a total of 39 (range 6–17) different variables used to calculate ALI. Substantial heterogeneity was observed in the number and type of biomarkers used, the analytic techniques applied and study quality. Particularly, primary mediators were not regularly included in ALI calculation. Consensus on methods to measure ALI in working populations is limited. Research should include longitudinal studies using multi-systemic variables to measure employees at risk for biological wear and tear. PMID:25224337
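The most common ALI algorithm simply counts how many biomarkers fall in the high-risk quartile of the sample distribution. A minimal sketch, with hypothetical biomarkers and cutoff directions:

```python
import pandas as pd

def allostatic_load(df, higher_is_risk):
    """Count biomarkers falling in the sample's high-risk quartile.
    higher_is_risk: dict mapping column -> True if high values are risky."""
    score = pd.Series(0, index=df.index)
    for col, high_risk in higher_is_risk.items():
        cut = df[col].quantile(0.75 if high_risk else 0.25)
        flag = df[col] >= cut if high_risk else df[col] <= cut
        score += flag.astype(int)
    return score

biomarkers = pd.DataFrame({
    "systolic_bp": [118, 142, 131, 155],
    "cortisol":    [10.2, 18.5, 9.8, 22.1],
    "hdl":         [62, 38, 55, 41],      # low HDL is the risky tail
})
print(allostatic_load(biomarkers,
                      {"systolic_bp": True, "cortisol": True, "hdl": False}))
```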
Researching Mental Health Disorders in the Era of Social Media: Systematic Review.
Wongkoblap, Akkapon; Vadillo, Miguel A; Curcin, Vasa
2017-06-29
Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorders is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques.
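The text-analysis approach that dominates this literature can be sketched as a standard supervised pipeline covering feature extraction, model construction, and cross-validated verification. A minimal example with hypothetical posts and labels (a real study would need far larger, ethically sourced data):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

posts = ["feeling hopeless again tonight", "great run this morning",
         "can't sleep, everything is too much", "excited for the weekend"]
labels = [1, 0, 1, 0]   # 1 = self-reported depression cohort (hypothetical)

pipeline = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),   # feature extraction
    LogisticRegression(max_iter=1000),               # model construction
)
scores = cross_val_score(pipeline, posts, labels, cv=2)  # model verification
print(scores.mean())
```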
Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M
2014-12-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures (personality, cognitive style, motivated cognition) predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.
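The analytic sequence described (reduce a psychometric battery to a few factors, then regress forecasting accuracy on factor scores alongside aptitude) can be sketched with simulated stand-in data; variable names and dimensions are hypothetical:

```python
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
battery = rng.normal(size=(200, 12))      # 12 psychometric scale scores
accuracy = rng.normal(size=200)           # e.g., Brier-score-based accuracy
sat = rng.normal(size=200)                # aptitude proxy

fa = FactorAnalysis(n_components=4, rotation="varimax")
factors = fa.fit_transform(battery)       # four dimensions, as in the study

X = sm.add_constant(np.column_stack([factors, sat]))
model = sm.OLS(accuracy, X).fit()
print(model.summary().tables[1])          # compare factor vs aptitude coefficients
```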
Schmidt, Kathrin S; Mankertz, Joachim
2018-06-01
A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios as laid down in Commission Decision 2002/657/EC and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library by making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies of the analytes in solution and in matrix according to an isochronous approach proved the stability of the analytes in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered to be relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied on two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.
Yebra, M. Carmen
2012-01-01
A simple and rapid analytical method was developed for the determination of iron, manganese, and zinc in soluble solid samples. The method is based on continuous ultrasonic water dissolution of the sample (5–30 mg) at room temperature followed by flow injection flame atomic absorption spectrometric determination. A good precision of the whole procedure (1.2–4.6%) and a sample throughput of ca. 25 samples h–1 were obtained. The proposed green analytical method has been successfully applied for the determination of iron, manganese, and zinc in soluble solid food samples (soluble cocoa and soluble coffee) and pharmaceutical preparations (multivitamin tablets). The ranges of concentrations found were 21.4–25.61 μg g−1 for iron, 5.74–18.30 μg g−1 for manganese, and 33.27–57.90 μg g−1 for zinc in soluble solid food samples and 3.75–9.90 μg g−1 for iron, 0.47–5.05 μg g−1 for manganese, and 1.55–15.12 μg g−1 for zinc in multivitamin tablets. The accuracy of the proposed method was established by a comparison with the conventional wet acid digestion method using a paired t-test, indicating the absence of systematic errors. PMID:22567553
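The accuracy check described, a paired t-test of the proposed method against wet acid digestion on the same samples, looks like this in outline; the values below are hypothetical, not the study's data:

```python
from scipy import stats

# Hypothetical iron results (ug/g) on the same samples by both methods
proposed  = [21.4, 23.0, 25.6, 22.8, 24.1]
digestion = [21.9, 22.6, 25.2, 23.1, 24.5]

t, p = stats.ttest_rel(proposed, digestion)
print(f"t = {t:.2f}, p = {p:.3f}")  # p > 0.05 -> no evidence of systematic error
```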
Papaventsis, D; Casali, N; Kontsevaya, I; Drobniewski, F; Cirillo, D M; Nikolayevskyy, V
2017-02-01
We conducted a systematic review to determine the diagnostic accuracy of whole genome sequencing (WGS) of Mycobacterium tuberculosis for the detection of resistance to first- and second-line anti-tuberculosis (TB) drugs. The study was conducted according to the criteria of the Preferred Reporting Items for Systematic Reviews group. A total of 20 publications were included. The sensitivity, specificity, positive-predictive value and negative-predictive value of WGS using phenotypic drug susceptibility testing methods as a reference standard were determined. Anti-TB agents tested included all first-line drugs, a variety of reserve drugs, as well as new drugs. Polymorphisms in a total of 53 genes were tested for associations with drug resistance. Pooled sensitivity and specificity values for detection of resistance to selected first-line drugs were 0.98 (95% CI 0.93-0.98) and 0.98 (95% CI 0.98-1.00) for rifampicin and 0.97 (95% CI 0.94-0.99) and 0.93 (95% CI 0.91-0.96) for isoniazid, respectively. Due to high heterogeneity in study designs, lack of data, knowledge of resistance mechanisms and clarity on exclusion of phylogenetic markers, there was a significant variation in analytical performance of WGS for the remaining first-line, reserved drugs and new drugs. Whole genome sequencing could be considered a promising alternative to existing phenotypic and molecular drug susceptibility testing methods for rifampicin and isoniazid pending standardization of analytical pipelines. To ensure clinical relevance of WGS for detection of M. tuberculosis complex drug resistance, future studies should include information on clinical outcomes.
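Per-study accuracy metrics of this kind derive from a 2x2 table of WGS calls against phenotypic drug susceptibility testing. A minimal sketch with hypothetical counts and naive pooling; a review of this type would normally fit bivariate or HSROC models rather than simply summing tables:

```python
import numpy as np

def accuracy(tp, fp, fn, tn):
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}

# Hypothetical rifampicin counts (WGS vs phenotypic DST) from three studies
studies = np.array([[40, 1, 2, 120],
                    [25, 0, 1, 80],
                    [60, 2, 1, 150]])   # columns: TP, FP, FN, TN
tp, fp, fn, tn = studies.sum(axis=0)
print(accuracy(tp, fp, fn, tn))         # naive pooled estimates
```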
Development of an improved method of consolidating fatigue life data
NASA Technical Reports Server (NTRS)
Leis, B. N.; Sampath, S. G.
1978-01-01
A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.
Klassen, Tara L.; von Rüden, Eva-Lotta; Drabek, Janice; Noebels, Jeffrey L.; Goldman, Alica M.
2013-01-01
Genetic testing and research have increased the demand for high-quality DNA that has traditionally been obtained by venipuncture. However, venous blood collection may prove difficult in special populations and when large-scale specimen collection or exchange is prerequisite for international collaborative investigations. Guthrie/FTA card–based blood spots, buccal scrapes, and finger nail clippings are DNA-containing specimens that are uniquely accessible and thus attractive as alternative tissue sources (ATS). The literature details a variety of protocols for extraction of nucleic acids from a singular ATS type, but their utility has not been systematically analyzed in comparison with conventional sources such as venous blood. Additionally, the efficacy of each protocol is often equated with the overall nucleic acid yield but not with the analytical performance of the DNA during mutation detection. Together with a critical in-depth literature review of published extraction methods, we developed and evaluated an all-inclusive approach for serial, systematic, and direct comparison of DNA utility from multiple biological samples. Our results point to the often underappreciated value of these alternative tissue sources and highlight ways to maximize the ATS-derived DNA for optimal quantity, quality, and utility as a function of extraction method. Our comparative analysis clarifies the value of ATS in genomic analysis projects for population-based screening, diagnostics, molecular autopsy, medico-legal investigations, or multi-organ surveys of suspected mosaicisms. PMID:22796560
Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda
2013-11-01
Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, and this will inform their choice of review methods. These may range from standard meta-analysis through to more complex mixed-method synthesis and synthesis approaches that incorporate theory and/or user's perspectives.
Application of advanced control techniques to aircraft propulsion systems
NASA Technical Reports Server (NTRS)
Lehtinen, B.
1984-01-01
Two programs are described which involve the application of advanced control techniques to the design of engine control algorithms. Multivariable control theory is used in the F100 MVCS (multivariable control synthesis) program to design controls which coordinate the control inputs for improved engine performance. A systematic method for handling a complex control design task is given. Methods of analytical redundancy are aimed at increasing the control system reliability. The F100 DIA (detection, isolation, and accommodation) program, which investigates the use of software to replace or augment hardware redundancy for certain critical engine sensors, is described.
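Analytical redundancy of the kind investigated in the DIA program can be illustrated with an observer-based residual: a model-predicted sensor output is compared with the measurement, and a persistent residual flags a fault. A minimal sketch with a hypothetical two-state discrete-time model, not the F100 engine model:

```python
import numpy as np

# Hypothetical discrete-time plant: x' = A x + B u, y = C x
A = np.array([[0.95, 0.10], [0.00, 0.90]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5], [0.2]])      # observer gain (assumed stabilizing)

x = np.zeros((2, 1))              # true state
xh = np.zeros((2, 1))             # observer (analytically redundant) state
for k in range(50):
    u = np.array([[1.0]])
    y = C @ x + (0.8 if k >= 30 else 0.0)   # sensor bias fault injected at k=30
    r = y - C @ xh                           # residual for detection/isolation
    xh = A @ xh + B @ u + L @ r              # Luenberger observer update
    x = A @ x + B @ u
    if abs(r.item()) > 0.3:
        print(f"k={k}: residual {r.item():.2f} exceeds threshold -> sensor fault")
```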
A methodology for designing aircraft to low sonic boom constraints
NASA Technical Reports Server (NTRS)
Mack, Robert J.; Needleman, Kathy E.
1991-01-01
A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.
Concurrence of big data analytics and healthcare: A systematic review.
Mehta, Nishita; Pandit, Anil
2018-06-01
The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify the strategies to overcome the challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in English-language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds its application in clinical decision support, optimization of clinical operations and reduction of the cost of care; (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of Big Data analytics in healthcare. This is because the usability studies have taken only a qualitative approach, which describes potential benefits but does not take into account quantitative study. Also, the majority of the studies were from developed countries, which brings out the need to promote research on healthcare Big Data analytics in developing countries.
Dzakpasu, Susie; Powell-Jackson, Timothy; Campbell, Oona M R
2014-03-01
To assess the evidence of the impact of user fees on maternal health service utilization and related health outcomes in low- and middle-income countries, as well as their impact on inequalities in these outcomes. Studies were identified by modifying a search strategy from a related systematic review. Primary studies of any design were included if they reported the effect of fee changes on maternal health service utilization, related health outcomes and inequalities in these outcomes. For each study, data were systematically extracted and a quality assessment conducted. Due to the heterogeneity of study methods, results were examined narratively. Twenty studies were included. Designs and analytic approaches comprised: two interrupted time series, eight repeated cross-sectional, nine before-and-after without comparison groups and one before-and-after in three groups. Overall, the quality of studies was poor. Few studies addressed potential sources of bias, such as secular trends over time, and even basic tests of statistical significance were often not reported. Consistency in the direction of effects provided some evidence of an increase in facility delivery in particular after fees were removed, as well as possible increases in the number of managed delivery complications. There was little evidence of the effect on health outcomes or inequality in accessing care and, where available, the direction of effect varied. Despite the global momentum to abolish user fees for maternal and child health services, robust evidence quantifying impact remains scant. Improved methods for evaluating and reporting on these interventions are recommended, including better descriptions of the interventions and context, looking at a range of outcome measures, and adopting robust analytical methods that allow for adjustment of underlying and seasonal trends, reporting immediate as well as longer-term (e.g. at 6 months and 1 year) effects and using comparison groups where possible.
Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D
2012-12-01
The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.
Development and application of accurate analytical models for single active electron potentials
NASA Astrophysics Data System (ADS)
Miller, Michelle; Jaron-Becker, Agnieszka; Becker, Andreas
2015-05-01
The single active electron (SAE) approximation is a theoretical model frequently employed to study scenarios in which inner-shell electrons may productively be treated as frozen spectators to a physical process of interest, and accurate analytical approximations for these potentials are sought as a useful simulation tool. Density functional theory is often used to construct an SAE potential, requiring that a further approximation for the exchange-correlation functional be made. In this study, we employ the Krieger, Li, and Iafrate (KLI) modification to the optimized-effective-potential (OEP) method to reduce the complexity of the problem to the straightforward solution of a system of linear equations, through simple arguments regarding the behavior of the exchange-correlation potential in regions where a single orbital dominates. We employ this method for the solution of atomic and molecular potentials, and use the resultant curve to devise a systematic construction for highly accurate and useful analytical approximations for several systems. Supported by the U.S. Department of Energy (Grant No. DE-FG02-09ER16103), and the U.S. National Science Foundation (Graduate Research Fellowship, Grants No. PHY-1125844 and No. PHY-1068706).
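The KLI construction reduces the OEP integral equation to a small linear system for orbital-averaged constants. A minimal grid-based sketch, assuming the orbital densities and orbital-dependent potentials u_i are supplied, and fixing the highest occupied orbital's constant to zero as usual:

```python
import numpy as np

def kli_potential(grid_w, orb_dens, u):
    """KLI exchange potential on a grid.
    grid_w: quadrature weights; orb_dens[i]: |psi_i|^2 (normalized);
    u[i]: orbital-dependent potential u_i(r), assumed given."""
    rho = orb_dens.sum(axis=0)
    n = orb_dens.shape[0]
    slater = (orb_dens * u).sum(axis=0) / rho        # Slater-like average of u_i
    ubar = (orb_dens * u * grid_w).sum(axis=1)       # <psi_i| u_i |psi_i>
    t = (orb_dens * slater * grid_w).sum(axis=1)     # <psi_i| V_slater |psi_i>
    m = np.einsum('jr,ir,r->ji', orb_dens, orb_dens, grid_w / rho)
    # Solve (I - M) x = t - ubar for the first n-1 constants; x[homo] = 0
    x = np.zeros(n)
    x[:-1] = np.linalg.solve(np.eye(n - 1) - m[:-1, :-1], (t - ubar)[:-1])
    return slater + (orb_dens * x[:, None]).sum(axis=0) / rho
```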
Montoya-Castillo, Andrés; Reichman, David R
2017-01-14
We derive a semi-analytical form for the Wigner transform for the canonical density operator of a discrete system coupled to a harmonic bath based on the path integral expansion of the Boltzmann factor. The introduction of this simple and controllable approach allows for the exact rendering of the canonical distribution and permits systematic convergence of static properties with respect to the number of path integral steps. In addition, the expressions derived here provide an exact and facile interface with quasi- and semi-classical dynamical methods, which enables the direct calculation of equilibrium time correlation functions within a wide array of approaches. We demonstrate that the present method represents a practical path for the calculation of thermodynamic data for the spin-boson and related systems. We illustrate the power of the present approach by detailing the improvement of the quality of Ehrenfest theory for the correlation function C_zz(t) = Re⟨σ_z(0)σ_z(t)⟩ for the spin-boson model with systematic convergence to the exact sampling function. Importantly, the numerically exact nature of the scheme presented here and its compatibility with semiclassical methods allows for the systematic testing of commonly used approximations for the Wigner-transformed canonical density.
Giardiasis as a neglected disease in Brazil: Systematic review of 20 years of publications.
Coelho, Camila Henriques; Durigan, Maurício; Leal, Diego Averaldo Guiguet; Schneider, Adriano de Bernardi; Franco, Regina Maura Bueno; Singer, Steven M
2017-10-01
Giardiasis is an intestinal infection that affects more than two hundred million people annually worldwide; it is caused by the flagellated protozoan Giardia duodenalis. In tropical countries and in low- or middle-income settings, like Brazil, its prevalence can be high. There is currently no systematic review on the presence of G. duodenalis in patients, animals or water sources in Brazil. This systematic review was performed according to recommendations established by the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA). As databases for our searches, we used PubMed, Embase, Scopus and the Brazilian database SciELO, with the keywords "Giardia*" and "Brazil". This systematic review identified research studies related to G. duodenalis in water, giardiasis in animals, prevalence of giardiasis across Brazilian regions, genotyping of strains isolated in humans, and giardiasis in indigenous populations. We also propose a network of G. duodenalis transmission in Brazil based on genotype analyses. This is the first review published in the last twenty years on the occurrence of G. duodenalis in Brazil, addressing relevant issues such as prevalence, molecular epidemiology and analytical methods for parasite detection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sahoo, Satiprasad; Dhar, Anirban, E-mail: anirban.dhar@gmail.com; Kar, Amlanjyoti
Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, regional environmental vulnerability assessment in the Hirakud command area of Odisha, India is envisaged based on the Grey Analytic Hierarchy Process method (Grey-AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey-AHP combines the advantages of the classical analytic hierarchy process (AHP) and the grey clustering method for accurate estimation of weight coefficients. It is a new method for environmental vulnerability assessment. The environmental vulnerability index (EVI) uses natural, environmental and human-impact-related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. The EVI map has been classified into four environmental vulnerability zones (EVZs), namely 'low', 'moderate', 'high', and 'extreme', encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. The EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view, and shows close correlation with elevation. The effectiveness of the zone classification is evaluated by using the grey clustering method; general effectiveness lies between the "better" and "common" classes. This analysis demonstrates the potential applicability of the methodology. Highlights: • Environmental vulnerability zone identification based on the Grey Analytic Hierarchy Process (AHP) • Effectiveness evaluation by means of a grey clustering method with support from AHP • Use of the grey approach eliminates excessive dependency on the experience of experts.
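The weighted-overlay step at the heart of an AHP-based vulnerability index is straightforward to reproduce. The sketch below shows a minimal version: normalize each factor raster, combine them with AHP-derived weight coefficients, and classify the resulting index into four zones. All rasters, weights, and class breaks here are hypothetical placeholders, not values from the study above.

```python
# Minimal sketch of an AHP-style weighted overlay for an environmental
# vulnerability index (EVI). Factor rasters, weights and class breaks are
# hypothetical placeholders, not the values used in the study.
import numpy as np

def minmax(layer):
    """Rescale a factor raster to [0, 1]."""
    return (layer - layer.min()) / (layer.max() - layer.min())

rng = np.random.default_rng(0)
# Toy 100x100 "rasters" standing in for slope, rainfall, drainage density, ...
factors = {name: rng.random((100, 100)) for name in
           ["slope", "rainfall", "drainage_density", "population_density"]}
# AHP-derived weight coefficients (hypothetical; must sum to 1).
weights = {"slope": 0.40, "rainfall": 0.25,
           "drainage_density": 0.20, "population_density": 0.15}

evi = sum(w * minmax(factors[k]) for k, w in weights.items())

# Classify into four vulnerability zones by quartiles of the index.
breaks = np.quantile(evi, [0.25, 0.5, 0.75])
zones = np.digitize(evi, breaks)  # 0=low, 1=moderate, 2=high, 3=extreme
for label, z in zip(["low", "moderate", "high", "extreme"], range(4)):
    print(f"{label}: {np.mean(zones == z):.1%} of study area")
```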
Analytical model and error analysis of arbitrary phasing technique for bunch length measurement
NASA Astrophysics Data System (ADS)
Chen, Qushan; Qin, Bin; Chen, Wei; Fan, Kuanjun; Pei, Yuanji
2018-05-01
An analytical model of an RF phasing method using arbitrary phase scanning for bunch length measurement is reported. We set up a statistical model instead of a linear chirp approximation to analyze the energy modulation process. It is found that, assuming a short bunch (σ_φ/2π → 0) and small relative energy spread (σ_γ/γ_r → 0), the energy spread (Y = σ_γ²) at the exit of the traveling wave linac has a parabolic relationship with the cosine of the injection phase (X = cos φ_r|_{z=0}), i.e., Y = AX² + BX + C. Analogous to quadrupole strength scanning for emittance measurement, this phase scanning method can be used to obtain the bunch length by measuring the energy spread at different injection phases. The injection phases can be randomly chosen, which is significantly different from the commonly used zero-phasing method. Further, the systematic error of the reported method, such as the influence of the space charge effect, is analyzed. This technique will be especially useful at low energies when the beam quality is dramatically degraded and is hard to measure using the zero-phasing method.
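The parabolic relationship above lends itself to a simple least-squares treatment. The sketch below simulates a phase scan and recovers the coefficients of Y = AX² + BX + C with a quadratic fit; translating (A, B, C) into a bunch length requires the statistical model of the paper, which is not reproduced here, and all numbers are synthetic.

```python
# Minimal sketch of the phase-scan fit described above: measure the squared
# energy spread Y at several injection phases, fit Y = A*X**2 + B*X + C with
# X = cos(phi_r), and recover the fit coefficients. The data are synthetic.
import numpy as np

phi_r = np.deg2rad(np.linspace(-60, 60, 13))   # arbitrarily chosen scan phases
X = np.cos(phi_r)
A_true, B_true, C_true = 4.0e-6, -1.0e-6, 2.0e-7
Y = A_true * X**2 + B_true * X + C_true
Y += np.random.default_rng(1).normal(0, 2e-8, Y.size)  # measurement noise

A, B, C = np.polyfit(X, Y, 2)   # least-squares parabola, highest power first
print(f"A={A:.3e}, B={B:.3e}, C={C:.3e}")
```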
Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M
2008-01-01
Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163
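One plausible reading of the reliability measure described above is the fraction of parameter-varied SaTScan runs in which a given county falls inside a statistically significant cluster. The sketch below computes such a score from a synthetic membership matrix; in practice the matrix would be parsed from the fifty SaTScan result files, and the 80% stability threshold is an arbitrary illustration.

```python
# A minimal sketch of one plausible "reliability" measure for spatial
# clusters: the fraction of SaTScan runs (each with different scaling
# parameters) in which a county falls inside a statistically significant
# cluster. The membership matrix here is synthetic.
import numpy as np

n_counties, n_runs = 3109, 50
rng = np.random.default_rng(2)
# in_cluster[i, j] is True if county i is in a significant cluster in run j.
in_cluster = rng.random((n_counties, n_runs)) < 0.1

reliability = in_cluster.mean(axis=1)          # per-county score in [0, 1]
stable = np.flatnonzero(reliability >= 0.8)    # flagged in >=80% of runs
print(f"{stable.size} counties are stably clustered across parameter choices")
```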
NASA Astrophysics Data System (ADS)
Benninghoff, L.; von Czarnowski, D.; Denkhaus, E.; Lemke, K.
1997-07-01
For the determination of trace element distributions of more than 20 elements in malignant and normal tissues of the human colon, tissue samples (approx. 400 mg wet weight) were digested with 3 ml of nitric acid (sub-boiled quality) by use of an autoclave system. The accuracy of measurements has been investigated by using certified materials. The analytical results were evaluated by using a spreadsheet program to give an overview of the element distribution in cancerous samples and in normal colon tissues. A further application, cluster analysis of the analytical results, was introduced to demonstrate the possibility of classification for cancer diagnosis. To confirm the results of cluster analysis, multivariate three-way principal component analysis was performed. Additionally, microtome frozen sections (10 μm) were prepared from the same tissue samples to compare the analytical results, i.e. the mass fractions of elements, according to the preparation method and to exclude systematic errors depending on the inhomogeneity of the tissues.
Card, Noel A; Stucky, Brian D; Sawalani, Gita M; Little, Todd D
2008-01-01
This meta-analytic review of 148 studies on child and adolescent direct and indirect aggression examined the magnitude of gender differences, intercorrelations between forms, and associations with maladjustment. Results confirmed prior findings of gender differences (favoring boys) in direct aggression and trivial gender differences in indirect aggression. Results also indicated a substantial intercorrelation (r = .76) between these forms. Despite this high intercorrelation, the 2 forms showed unique associations with maladjustment: direct aggression is more strongly related to externalizing problems, poor peer relations, and low prosocial behavior, and indirect aggression is related to internalizing problems and higher prosocial behavior. Moderation of these effect sizes by method of assessment, age, gender, and several additional variables was systematically investigated.
NASA Astrophysics Data System (ADS)
Pandey, Manoj Kumar; Ramachandran, Ramesh
2010-03-01
The application of solid-state NMR methodology for bio-molecular structure determination requires the measurement of constraints in the form of ¹³C-¹³C and ¹³C-¹⁵N distances, torsion angles and, in some cases, correlation of the anisotropic interactions. Since the availability of structurally important constraints in the solid state is limited due to lack of sufficient spectral resolution, the accuracy of the measured constraints becomes vital in studies relating the three-dimensional structure of proteins to their biological functions. Consequently, the theoretical methods employed to quantify the experimental data become important. To accentuate this aspect, we re-examine analytical two-spin models currently employed in the estimation of ¹³C-¹³C distances based on the rotational resonance (R²) phenomenon. Although the error bars for the estimated distances tend to be in the range 0.5-1.0 Å, R² experiments are routinely employed in a variety of systems ranging from simple peptides to more complex amyloidogenic proteins. In this article we address this aspect by highlighting the systematic errors introduced by analytical models employing phenomenological damping terms to describe multi-spin effects. Specifically, the spin dynamics in R² experiments is described using Floquet theory employing two different operator formalisms. The systematic errors introduced by the phenomenological damping terms and their limitations are elucidated in two analytical models and analysed by comparing the results with rigorous numerical simulations.
Goedecke, Thomas; Morales, Daniel R; Pacurariu, Alexandra; Kurz, Xavier
2018-03-01
Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. © 2017 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.
Kazerani, Maryam; Davoudian, Atefeh; Zayeri, Farid; Soori, Hamid
2017-01-01
Background: Systematic reviews and meta-analysis have significant advantages over conventional reviews in that all available data should be presented. This study aimed to evaluate Iranian systematic reviews and meta-analysis abstracts indexed in WOS and Scopus during 2003-2012 based on PRISMA checklist. Methods: This is an analytical study. We evaluated 46 article abstracts indexed in WOS, 89 article abstracts indexed in Scopus and 158 article abstracts indexed in WOS and Scopus both (overlapped group). The quality of the abstracts was evaluated according to the PRISMA checklist for abstracts. Some indicators including distribution per year, total citation, average citations per year, average citations per documents and average citations per year in each article were determined through searching the WOS and Scopus Databases’ analytical section. Then, the correlations between the abstract's PRISMA scores, average citations per year, and publication year were calculated. Results: The abstract’s quality is not desirable as far as the PRISMA criteria are concerned. In other words, none of the articles’ abstracts is in line with the PRISMA items. The average of scores of the current study was 5.9 while the maximum score was 12. The PRISMA criteria showed the highest compliance with "Objectives" (98.6%), the second highest with "Synthesis of result" (85%) and "Title" (80.2%) and the lowest compliance with "Registration" (2%). There was a positive correlation between the compliance of PRISMA score and the average citations per year while there was a negative correlation between PRISMA score and the publication year. Conclusion: It seems that the suggested criteria for reporting Iranian systematic reviews and meta-analysis are not considered adequately by the writers and even scientific journal editors. PMID:28955668
Joseph, George; Devi, Ranjani; Marley, Elaine C; Leeman, David
2018-05-01
Single- and multilaboratory testing data have provided systematic scientific evidence that a simple, selective, accurate, and precise method can be used as a potential candidate reference method for dispute resolution in determining total biotin in all forms of infant, adult, and/or pediatric formula. Using LC coupled with immunoaffinity column cleanup extraction, the method fully meets the intended purpose and applicability statement in AOAC Standard Method Performance Requirement 2014.005. The method was applied to a cross-section of infant formula and adult nutritional matrixes, and acceptable precision and accuracy were established. The analytical platform is inexpensive, and the method can be used in almost any laboratory worldwide with basic facilities. The immunoaffinity column cleanup extraction is the key step to successful analysis.
Perich, C; Ricós, C; Alvarez, V; Biosca, C; Boned, B; Cava, F; Doménech, M V; Fernández-Calle, P; Fernández-Fernández, P; García-Lario, J V; Minchinela, J; Simón, M; Jansen, R
2014-05-15
Current external quality assurance schemes have been classified into six categories, according to their ability to verify the degree of standardization of the participating measurement procedures. SKML (Netherlands) is a Category 1 EQA scheme (commutable EQA materials with values assigned by reference methods), whereas SEQC (Spain) is a Category 5 scheme (replicate analyses of non-commutable materials with no values assigned by reference methods). The results obtained by a group of Spanish laboratories participating in a pilot study organized by SKML are examined, with the aim of pointing out the improvements over our current scheme that a Category 1 program could provide. Imprecision and bias are calculated for each analyte and laboratory, and compared with quality specifications derived from biological variation. Of the 26 analytes studied, 9 had results comparable with those from reference methods, and 10 analytes did not have comparable results. The remaining 7 analytes measured did not have available reference method values, and in these cases, comparison with the peer group showed comparable results. The reasons for disagreement in the second group can be summarized as: use of non-standard methods (IFCC without exogenous pyridoxal phosphate for AST and ALT, Jaffé kinetic at low-normal creatinine concentrations and with eGFR); non-commutability of the reference material used to assign values to the routine calibrator (calcium, magnesium and sodium); use of reference materials without established commutability instead of reference methods for AST and GGT, and lack of a systematic effort by manufacturers to harmonize results. Results obtained in this work demonstrate the important role of external quality assurance programs using commutable materials with values assigned by reference methods to correctly monitor the standardization of laboratory tests with consequent minimization of risk to patients. Copyright © 2013 Elsevier B.V. All rights reserved.
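The comparison of imprecision and bias against biological-variation-derived specifications can be made concrete with the widely used desirable-performance formulas (allowable CV_A ≤ 0.5·CV_I; allowable bias ≤ 0.25·√(CV_I² + CV_G²)). The sketch below applies them to illustrative figures; the creatinine numbers are examples, not the study's data.

```python
# Sketch of the usual (Fraser) desirable quality specifications derived from
# biological variation, against which a laboratory's imprecision and bias can
# be checked: CV_A <= 0.5*CV_I and |bias| <= 0.25*sqrt(CV_I**2 + CV_G**2).
import math

def meets_specs(cv_a, bias, cv_i, cv_g):
    """All inputs in percent; returns (imprecision_ok, bias_ok)."""
    return cv_a <= 0.5 * cv_i, abs(bias) <= 0.25 * math.hypot(cv_i, cv_g)

# Hypothetical creatinine performance: CV_A = 2.1%, bias = 3.0%,
# within-subject CV_I = 4.3%, between-subject CV_G = 12.9%.
ok_imp, ok_bias = meets_specs(cv_a=2.1, bias=3.0, cv_i=4.3, cv_g=12.9)
print(f"imprecision ok: {ok_imp}, bias ok: {ok_bias}")
```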
Moyer, Cheryl A.; Johnson, Cassidy; Kaselitz, Elizabeth; Aborigo, Raymond
2017-01-01
Background: Social, cultural, and behavioral factors are often potent upstream contributors to maternal, neonatal, and child mortality, especially in low- and middle-income countries (LMICs). Social autopsy is one method of identifying the impact of such factors, yet it is unclear how social autopsy methods are being used in LMICs. Objective: This study aimed to identify the most common social autopsy instruments, describe overarching findings across populations and geography, and identify gaps in the existing social autopsy literature. Methods: A systematic search of the peer-reviewed literature from 2005 to 2016 was conducted. Studies were included if they were conducted in an LMIC, focused on maternal/neonatal/infant/child health, reported on the results of original research, and explicitly mentioned the use of a social autopsy tool. Results: Sixteen articles out of 1950 citations were included, representing research conducted in 11 countries. Five different tools were described, with two primary conceptual frameworks used to guide analysis: the Pathway to Survival and Three Delays models. Studies varied in methods for identifying deaths, and recall periods for respondents ranged from 6 weeks to 5+ years. Across studies, recognition of danger signs appeared to be high, while subsequent care-seeking was inconsistent. Cost, distance to facility, and transportation issues were frequently cited barriers to care-seeking; however, additional barriers were reported that varied by location. Gaps in the social autopsy literature include the lack of: harmonized tools and analytical methods that allow for cross-study comparisons, discussion of the complexity of decision making for care seeking, qualitative narratives that address inconsistencies in responses, and the explicit inclusion of perspectives from husbands and fathers. Conclusion: Despite the nascence of the field, research across 11 countries has included social autopsy methods, using a variety of tools, sampling methods, and analytical frameworks to determine how social factors impact maternal, neonatal, and child health outcomes. PMID:29261449
Parrinello, Christina M.; Grams, Morgan E.; Couper, David; Ballantyne, Christie M.; Hoogeveen, Ron C.; Eckfeldt, John H.; Selvin, Elizabeth; Coresh, Josef
2016-01-01
Background Equivalence of laboratory tests over time is important for longitudinal studies. Even a small systematic difference (bias) can result in substantial misclassification. Methods We selected 200 Atherosclerosis Risk in Communities Study participants attending all 5 study visits over 25 years. Eight analytes were re-measured in 2011–13 from stored blood samples from multiple visits: creatinine, uric acid, glucose, total cholesterol, HDL-cholesterol, LDL-cholesterol, triglycerides, and high-sensitivity C-reactive protein. Original values were recalibrated to re-measured values using Deming regression. Differences >10% were considered to reflect substantial bias, and correction equations were applied to affected analytes in the total study population. We examined trends in chronic kidney disease (CKD) pre- and post-recalibration. Results Repeat measures were highly correlated with original values (Pearson’s r>0.85 after removing outliers [median 4.5% of paired measurements]), but 2 of 8 analytes (creatinine and uric acid) had differences >10%. Original values of creatinine and uric acid were recalibrated to current values using correction equations. CKD prevalence differed substantially after recalibration of creatinine (visits 1, 2, 4 and 5 pre-recalibration: 21.7%, 36.1%, 3.5%, 29.4%; post-recalibration: 1.3%, 2.2%, 6.4%, 29.4%). For HDL-cholesterol, the current direct enzymatic method differed substantially from magnesium dextran precipitation used during visits 1–4. Conclusions Analytes re-measured in samples stored for ~25 years were highly correlated with original values, but two of the 8 analytes showed substantial bias at multiple visits. Laboratory recalibration improved reproducibility of test results across visits and resulted in substantial differences in CKD prevalence. We demonstrate the importance of consistent recalibration of laboratory assays in a cohort study. PMID:25952043
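Deming regression, unlike ordinary least squares, allows for measurement error in both the original and re-measured values, which is why it suits recalibration of paired laboratory results. The sketch below implements the closed-form Deming fit (assuming an error-variance ratio of 1) on synthetic paired data and applies the resulting correction equation; it illustrates the general technique, not the study's actual calibration.

```python
# Minimal sketch of recalibrating original laboratory values to re-measured
# values with Deming regression (error-variance ratio lambda assumed 1).
# Data arrays are synthetic stand-ins for the paired measurements.
import numpy as np

def deming(x, y, lam=1.0):
    """Return (intercept, slope) of the Deming fit of y on x."""
    xbar, ybar = x.mean(), y.mean()
    sxx = np.mean((x - xbar) ** 2)
    syy = np.mean((y - ybar) ** 2)
    sxy = np.mean((x - xbar) * (y - ybar))
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
             + 4 * lam * sxy ** 2)) / (2 * sxy)
    return ybar - slope * xbar, slope

rng = np.random.default_rng(3)
true = rng.uniform(0.6, 2.0, 200)                          # "true" creatinine
original = 0.9 * true + 0.15 + rng.normal(0, 0.05, 200)    # biased old assay
remeasured = true + rng.normal(0, 0.05, 200)               # current assay

a, b = deming(original, remeasured)
recalibrated = a + b * original   # correction equation applied to old values
print(f"correction: y = {a:.3f} + {b:.3f} * x")
```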
NASA Technical Reports Server (NTRS)
Koshak, William; Solakiewicz, Richard
2013-01-01
An analytic perturbation method is introduced for estimating the lightning ground flash fraction in a set of N lightning flashes observed by a satellite lightning mapper. The value of N is large, typically in the thousands, and the observations consist of the maximum optical group area produced by each flash. The method is tested using simulated observations that are based on Optical Transient Detector (OTD) and Lightning Imaging Sensor (LIS) data. National Lightning Detection Network™ (NLDN) data is used to determine the flash-type (ground or cloud) of the satellite-observed flashes, and provides the ground flash fraction truth for the simulation runs. It is found that the mean ground flash fraction retrieval errors are below 0.04 across the full range 0-1 under certain simulation conditions. In general, it is demonstrated that the retrieval errors depend on many factors (i.e., the number, N, of satellite observations, the magnitude of random and systematic measurement errors, and the number of samples used to form certain climate distributions employed in the model).
Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua
2016-01-01
Simple, accurate and high-throughput pretreatment methods would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an incompletely understood extraction mechanism limit the development of this promising method. Herein, this work seeks to establish theoretically based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interactions, hydrogen-bonding interaction, electrostatic interaction) for the first time. Application guidelines for supporting materials, surfactants and sample matrix were also summarized. The extraction mechanism and platform established in this study make predictable and efficient pretreatment, under theoretically based experimental design, a promising prospect for trace analytes from environmental, biological and clinical samples. PMID:27924944
Hou, Xiaohong; Zheng, Xin; Zhang, Conglu; Ma, Xiaowei; Ling, Qiyuan; Zhao, Longshan
2014-10-15
A novel ultrasound-assisted dispersive liquid-liquid microextraction based on solidification of floating organic droplet method (UA-DLLME-SFO) combined with gas chromatography (GC) was developed for the determination of eight pyrethroid pesticides in tea for the first time. 1-Dodecanol and ethanol were used as the extraction and dispersive solvents, respectively, with the extract recovered after ultrasound treatment and centrifugation. A series of parameters influencing the microextraction efficiency, including extraction solvent and volume, dispersive solvent and volume, extraction time, pH, and ultrasonic time, were systematically investigated. Under the optimal conditions, the enrichment factors (EFs) were from 292 to 883 for the eight analytes. The linear ranges for the analytes were from 5 to 100 μg/kg. The method recoveries ranged from 92.1% to 99.6%, with the corresponding RSDs less than 6.0%. The developed method was considered to be simple, fast, and precise, satisfying the requirements of residual analysis of pyrethroid pesticides. Copyright © 2014 Elsevier B.V. All rights reserved.
AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku
2014-05-27
The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
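A compressed version of that model-development loop can be expressed as a scikit-learn pipeline. The sketch below strings together mean-based imputation, feature selection, and a soft-voting ensemble on synthetic data; note that the study's voting feature intervals classifier has no scikit-learn equivalent, so a random forest is substituted here purely for illustration.

```python
# A loose sklearn sketch of the modeling steps described above: mean-based
# imputation, feature selection, and a soft-voting ensemble. Data are
# synthetic; a random forest stands in for voting feature intervals.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(2787, 42))          # 2,787 hospitalizations, 42 factors
X[rng.random(X.shape) < 0.05] = np.nan   # simulate incomplete records
y = rng.integers(0, 2, 2787)             # readmitted (0/1)

model = Pipeline([
    ("impute", SimpleImputer(strategy="mean")),
    ("select", SelectKBest(mutual_info_classif, k=20)),
    ("vote", VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("rf", RandomForestClassifier(n_estimators=200))],
        voting="soft")),
])
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated c-statistic: {auc:.3f}")
```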
Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Émile
2010-01-01
Context: This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Methods: Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. Findings: This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). Conclusions: The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. PMID:21166865
Robust electroencephalogram phase estimation with applications in brain-computer interface systems.
Seraj, Esmaeil; Sameni, Reza
2017-03-01
In this study, a robust method is developed for frequency-specific electroencephalogram (EEG) phase extraction using the analytic representation of the EEG. Based on recent theoretical findings in this area, it is shown that some of the phase variations, previously attributed to the brain response, are systematic side effects of the methods used for EEG phase calculation, especially during low analytical amplitude segments of the EEG. With this insight, the proposed method generates randomized ensembles of the EEG phase using minor perturbations in the zero-pole loci of narrow-band filters, followed by phase estimation using the signal's analytical form and ensemble averaging over the randomized ensembles to obtain a robust EEG phase and frequency. This Monte Carlo estimation method is shown to be very robust to noise and minor changes of the filter parameters, and reduces the effect of spurious EEG phase jumps, which do not have a cerebral origin. As proof of concept, the proposed method is used for extracting EEG phase features for a brain-computer interface (BCI) application. The results show significant improvement in classification rates using rather simple phase-related features with standard K-nearest neighbors and random forest classifiers, over a standard BCI dataset. The average performance improved by 4-7% (in the absence of additive noise) and 8-12% (in the presence of additive noise). The significance of these improvements was statistically confirmed by a paired-sample t-test, with p-values of 0.01 and 0.03, respectively. The proposed method for EEG phase calculation is very generic and may be applied to other EEG phase-based studies.
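The ensemble idea is simple to prototype: filter the EEG in a narrow band many times under slightly perturbed filter parameters, take the instantaneous phase of each analytic signal, and average the resulting unit phasors. The sketch below uses jittered band edges as a stand-in for the paper's zero-pole perturbations, on a synthetic signal.

```python
# Minimal sketch of the ensemble idea described above. The perturbation
# scheme (jittered band edges) is a simplification of the zero-pole
# perturbation in the paper; the signal is synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                                  # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # noisy alpha

n_ensemble, band = 100, (8.0, 12.0)
phasors = np.zeros(t.size, dtype=complex)
for _ in range(n_ensemble):
    lo, hi = band + rng.normal(0, 0.1, 2)   # minor perturbation of band edges
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, eeg)))
    phasors += np.exp(1j * phase)           # accumulate unit phasors

robust_phase = np.angle(phasors / n_ensemble)   # circular ensemble mean
```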
Sharon, Maheshwar; Apte, P R; Purandare, S C; Zacharia, Renju
2005-02-01
Seven variable parameters of the chemical vapor deposition system have been optimized with the help of the Taguchi analytical method to obtain a desired product, e.g., carbon nanotubes or carbon nanobeads. It is observed that almost all selected parameters influence the growth of carbon nanotubes. Among them, however, the nature of the precursor (racemic, R or technical-grade camphor) and the carrier gas (hydrogen, argon, or an argon/hydrogen mixture) appear to be the parameters most strongly affecting the growth of carbon nanotubes. For the growth of nanobeads, by contrast, only two of the seven parameters, the catalyst (powders of iron, cobalt, and nickel) and the temperature (1023 K, 1123 K, and 1273 K), are the most influential. Systematic defects or islands on the substrate surface enhance nucleation of novel carbon materials. Quantitative contributions of the process parameters as well as optimum factor levels are obtained by performing analysis of variance (ANOVA) and analysis of mean (ANOM), respectively.
Carpinteiro, J; Rodríguez, I; Cela, R
2004-11-01
The performance of solid-phase microextraction (SPME) applied to the determination of butyltin compounds in sediment samples is systematically evaluated. Matrix effects and the influence of blank signals on the detection limits of the method are studied in detail. The interval of linear response is also evaluated in order to assess the applicability of the method to sediments polluted with butyltin compounds over a large range of concentrations. Advantages and drawbacks of including an SPME step, instead of the classic liquid-liquid extraction of the derivatized analytes, in the determination of butyltin compounds in sediment samples are considered in terms of achieved detection limits and experimental effort. Analytes were extracted from the samples by sonication using glacial acetic acid. An aliquot of the centrifuged extract was placed in a vial where the compounds were ethylated and concentrated on a PDMS fiber using the headspace mode. Determinations were carried out using GC-MIP-AED.
Johnston, Lisa G; Hakim, Avi J; Dittrich, Samantha; Burnett, Janet; Kim, Evelyn; White, Richard G
2016-08-01
Reporting key details of respondent-driven sampling (RDS) survey implementation and analysis is essential for assessing the quality of RDS surveys. RDS is both a recruitment and analytic method and, as such, it is important to adequately describe both aspects in publications. We extracted data from peer-reviewed literature published through September, 2013 that reported collected biological specimens using RDS. We identified 151 eligible peer-reviewed articles describing 222 surveys conducted in seven regions throughout the world. Most published surveys reported basic implementation information such as survey city, country, year, population sampled, interview method, and final sample size. However, many surveys did not report essential methodological and analytical information for assessing RDS survey quality, including number of recruitment sites, seeds at start and end, maximum number of waves, and whether data were adjusted for network size. Understanding the quality of data collection and analysis in RDS is useful for effectively planning public health service delivery and funding priorities.
NASA Astrophysics Data System (ADS)
Wu, F.; Wu, T.-H.; Li, X.-Y.
2018-03-01
This article aims to present a systematic indentation theory on a half-space of multi-ferroic composite medium with transverse isotropy. The effect of sliding friction between the indenter and substrate is taken into account. The cylindrical flat-ended indenter is assumed to be electrically/magnetically conducting or insulating, which leads to four sets of mixed boundary-value problems. The indentation forces in the normal and tangential directions are related to the Coulomb friction law. For each case, the integral equations governing the contact behavior are developed by means of the generalized method of potential theory, and the corresponding coupling field is obtained in terms of elementary functions. The effect of sliding on the contact behavior is investigated. Finite element method (FEM) in the context of magneto-electro-elasticity is developed to discuss the validity of the analytical solutions. The obtained analytical solutions may serve as benchmarks to various simplified analyses and numerical codes and as a guide for future experimental studies.
A Variational Approach to the Analysis of Dissipative Electromechanical Systems
Allison, Andrew; Pearce, Charles E. M.; Abbott, Derek
2014-01-01
We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics, and we can apply these powerful analytical methods to electrical and to electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package. PMID:24586221
Tomaselli Muensterman, Elena; Tisdale, James E
2018-06-08
Prolongation of the heart rate-corrected QT (QTc) interval increases the risk for torsades de pointes (TdP), a potentially fatal arrhythmia. The likelihood of TdP is higher in patients with risk factors, which include female sex, older age, heart failure with reduced ejection fraction, hypokalemia, hypomagnesemia, concomitant administration of ≥ 2 QTc interval-prolonging medications, among others. Assessment and quantification of risk factors may facilitate prediction of patients at highest risk for developing QTc interval prolongation and TdP. Investigators have utilized the field of predictive analytics, which generates predictions using techniques including data mining, modeling, machine learning, and others, to develop methods of risk quantification and prediction of QTc interval prolongation. Predictive analytics have also been incorporated into clinical decision support (CDS) tools to alert clinicians regarding patients at increased risk of developing QTc interval prolongation. The objectives of this paper are to assess the effectiveness of predictive analytics for identification of patients at risk of drug-induced QTc interval prolongation, and to discuss the efficacy of incorporation of predictive analytics into CDS tools in clinical practice. A systematic review of English language articles (human subjects only) was performed, yielding 57 articles, with an additional 4 articles identified from other sources; a total of 10 articles were included in this review. Risk scores for QTc interval prolongation have been developed in various patient populations including those in cardiac intensive care units (ICUs) and in broader populations of hospitalized or health system patients. One group developed a risk score that includes information regarding genetic polymorphisms; this score significantly predicted TdP. Development of QTc interval prolongation risk prediction models and incorporation of these models into CDS tools reduces the risk of QTc interval prolongation in cardiac ICUs and identifies health-system patients at increased risk for mortality. The impact of these QTc interval prolongation predictive analytics on overall patient safety outcomes, such as TdP and sudden cardiac death relative to the cost of development and implementation, requires further study. This article is protected by copyright. All rights reserved.
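To make the CDS idea concrete, the sketch below wires a purely hypothetical additive risk score into an alert rule. The factors, point values, and threshold are invented placeholders that merely mirror the risk factors listed above; they are not any published or validated instrument.

```python
# Purely illustrative sketch of embedding an additive QTc-prolongation risk
# score in a CDS rule. Points and threshold are hypothetical placeholders,
# NOT a published or validated instrument.
RISK_POINTS = {
    "female_sex": 1,
    "age_ge_68": 1,
    "hypokalemia": 2,
    "hypomagnesemia": 2,
    "hf_reduced_ef": 3,
    "two_or_more_qtc_drugs": 3,
}

def qtc_risk_score(patient: dict) -> int:
    """Sum points for each risk factor flagged True in the patient record."""
    return sum(pts for factor, pts in RISK_POINTS.items() if patient.get(factor))

patient = {"female_sex": True, "age_ge_68": True, "two_or_more_qtc_drugs": True}
score = qtc_risk_score(patient)
if score >= 5:   # hypothetical alert threshold
    print(f"CDS alert: elevated QTc-prolongation risk (score {score})")
```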
González-Fuenzalida, R. A.; Moliner-Martínez, Y.; Prima-Garcia, Helena; Ribera, Antonio; Campins-Falcó, P.; Zaragozá, Ramon J.
2014-01-01
The use of magnetic nanomaterials for analytical applications has increased in recent years. In particular, magnetic nanomaterials have shown great potential as adsorbent phases in several extraction procedures due to significant advantages over conventional methods. In the present work, the influence of magnetic forces on the extraction efficiency of triazines using superparamagnetic silica nanoparticles (NPs) in magnetic in-tube solid-phase microextraction (Magnetic-IT-SPME) coupled to CapLC has been evaluated. Atrazine, terbutylazine and simazine were selected as target analytes. The superparamagnetic silica nanomaterial (SiO2-Fe3O4) deposited onto the surface of a capillary column gave rise to a magnetic extraction phase for IT-SPME that provided an enhancement of the extraction efficiency for triazines. This improvement is based on two phenomena: the superparamagnetic behavior of Fe3O4 NPs and the diamagnetic repulsions that take place in a microfluidic device such as a capillary column. A systematic study of analyte adsorption and desorption was conducted as a function of the magnetic field and the relationship with the magnetic susceptibility of the triazines. The positive influence of magnetism on the extraction procedure was demonstrated. The analytical characteristics of the optimized procedure were established and the method was applied to the determination of the target analytes in water samples with satisfactory results. When coupling Magnetic-IT-SPME with CapLC, improved adsorption efficiencies (60%-63%) were achieved compared with conventional adsorption materials (0.8%-3%). PMID:28344221
Kröner, Frieder; Hubbuch, Jürgen
2013-04-12
pH gradient protein separations are widely used techniques in the field of protein analytics, of which isoelectric focusing is the best-known application. The chromatographic variant, based on the formation of pH gradients in ion exchange columns, is only rarely applied due to the difficulty of forming controllable, linear pH gradients over a broad pH range. This work describes a method for the systematic generation of buffer compositions with linear titration curves, resulting in well controllable pH gradients. An in silico method was successfully developed to generate buffer compositions with linear titration curves. With this tool, buffer compositions for pH gradient ion exchange chromatography with pH ranges spanning up to 7.5 pH units were established and successfully validated. Subsequently, the buffer systems were used to characterize the elution behavior of 22 different model proteins in cation and anion exchange pH gradient chromatography. The results of both chromatographic modes as well as isoelectric focusing were compared to describe differences between the methods. Copyright © 2013 Elsevier B.V. All rights reserved.
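The in silico screening step can be illustrated with basic acid-base arithmetic: for a candidate mixture of monoprotic buffer species, compute the titration curve from Henderson-Hasselbalch terms plus water autoprotolysis, then score how linear it is. The pKa values and concentrations below are hypothetical, activity effects are ignored, and the linearity metric (R² of a straight-line fit) is just one reasonable choice.

```python
# Rough sketch of screening a buffer composition for a linear titration
# curve. pKa values and concentrations are hypothetical; activity effects
# are ignored.
import numpy as np

pka = np.array([4.8, 6.1, 7.5, 9.0])        # candidate buffer species
conc = np.array([5.0, 5.0, 5.0, 5.0])       # mmol/L of each species

pH = np.linspace(4.0, 10.0, 200)
h = 10.0 ** -pH
# Base equivalents bound at each pH (mmol/L), plus water autoprotolysis terms.
alpha = 1.0 / (1.0 + 10.0 ** (pka[:, None] - pH[None, :]))
base = (conc[:, None] * alpha).sum(axis=0) + 1e3 * (1e-14 / h - h)

# Linearity of pH vs added base: R^2 of a straight-line fit.
r2 = np.corrcoef(base, pH)[0, 1] ** 2
print(f"titration-curve linearity R^2 = {r2:.4f}")
```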
Basic analytical methods for identification of erythropoiesis-stimulating agents in doping control
NASA Astrophysics Data System (ADS)
Postnikov, P. V.; Krotov, G. I.; Efimova, Yu A.; Rodchenkov, G. M.
2016-02-01
The design of new erythropoiesis-stimulating agents for clinical use necessitates constant development of methods for detecting the abuse of these substances, which are prohibited under the World Anti-Doping Code and are included in the World Anti-Doping Agency (WADA) prohibited list. This review integrates and describes systematically the published data on the key methods currently used by WADA-accredited anti-doping laboratories around the world to detect the abuse of erythropoiesis-stimulating agents, including direct methods (various polyacrylamide gel electrophoresis techniques, enzyme-linked immunosorbent assay, membrane enzyme immunoassay and mass spectrometry) and indirect methods (athlete biological passport). Particular attention is given to promising approaches and investigations that can be used to control prohibited erythropoietins in the near future. The bibliography includes 122 references.
["Long-branch Attraction" artifact in phylogenetic reconstruction].
Li, Yi-Wei; Yu, Li; Zhang, Ya-Ping
2007-06-01
Phylogenetic reconstruction among various organisms not only helps us understand their evolutionary history but also sheds light on several fundamental evolutionary questions. Understanding of the evolutionary relationships among organisms establishes the foundation for investigations in other biological disciplines. However, almost all the widely used phylogenetic methods have limitations and fail to eliminate systematic errors effectively, preventing the reconstruction of true organismal relationships. The "Long-branch Attraction" (LBA) artifact is one of the most disturbing factors in phylogenetic reconstruction. In this review, the concept of LBA, analytical methods for detecting it, and strategies for avoiding it are summarized. In addition, several typical examples are provided, and approaches to avoid and resolve the LBA artifact are discussed.
NASA Technical Reports Server (NTRS)
Drummond, J. D.; Weidenschilling, S. J.; Chapman, C. R.; Davis, D. R.
1991-01-01
The Drummond et al. (1988) analysis of main-belt asteroids is presently extended, using three independent methods to derive poles, periods, phase functions, and triaxial ellipsoid shapes from lightcurve maxima and minima. This group of 26 asteroids is also reinvestigated with a view to the distributions of triaxial shapes and obliquities. Poles weakly tend to avoid asteroid orbital planes; a rough-smooth dichotomization appears to be justified by the persistence of two solar phase angle-amplitude relations. Seven of the objects may be Jacobi ellipsoids if axial ratios are slightly exaggerated by a systematic effect of the analytical method employed.
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.
2010-01-01
The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite element based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help in characterizing the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a represented braided composite are conducted. Overall, the developed method shows promise, but improvements that are needed in test and analysis methods for better predictive capability are examined.
Huang, Yang; Zhang, Tingting; Zhao, Yumei; Zhou, Haibo; Tang, Guangyun; Fillet, Marianne; Crommen, Jacques; Jiang, Zhengjin
2017-09-10
Nucleobases, nucleosides and ginsenosides, which have a significant impact on the physiological activity of organisms, are reported to be active components of ginseng, although they are present at only low levels in ginseng extracts. Few analytical methods have been developed so far to simultaneously analyze these three classes of compounds, which differ in polarity, in ginseng extracts. In the present study, a simple and efficient analytical method was successfully developed for the simultaneous separation of 17 nucleobases, nucleosides and ginsenosides in ginseng extracts using supercritical fluid chromatography coupled with single quadrupole mass spectrometry (SFC-MS). The effects of various experimental factors on the separation performance, such as the column type, temperature and backpressure, the type of modifier and additive, and the concentration of make-up solvent, were systematically investigated. Under the selected conditions, the developed method was successfully applied to the quality evaluation of 14 batches of ginseng extracts from different origins. The results obtained for the different batches indicate that this method could be employed for the quality assessment of ginseng extracts. Copyright © 2017 Elsevier B.V. All rights reserved.
A systematic scanning electron microscope analytical technique has been developed to examine granular activated carbon used as a medium for biomass attachment in liquid waste treatment. The procedure allows for the objective monitoring, comparing, and troubleshooting of combined ...
Abdul-Karim, Nadia; Blackman, Christopher S; Gill, Philip P; Karu, Kersti
2016-10-05
The continued usage of explosive devices, as well as the ever-growing threat of 'dirty' bombs, necessitates a comprehensive understanding of particle dispersal during detonation events in order to develop effectual methods for targeting explosive and/or additive remediation efforts. Herein, the distribution of explosive analytes from controlled detonations of aluminised ammonium nitrate and an RDX-based explosive composition was established by systematically sampling sites positioned around each firing. This is the first experimental study to produce evidence that the post-blast residue mass can distribute according to an approximate inverse-square law model, while also demonstrating for the first time that distribution trends can vary depending on individual analytes. Furthermore, by incorporating blast-wave overpressure measurements, high-speed imaging for fireball volume recordings, and monitoring of environmental conditions, it was determined that the principal factor affecting all analyte dispersals was the wind direction, with other factors affecting specific analytes to varying degrees. The dispersal mechanism for explosive residue is primarily the smoke cloud, a finding which in itself has wider impacts on the environment and fundamental detonation theory. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
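For reference, the approximate inverse-square trend reported above can be stated compactly. This is a hedged formalization of the reported model, not the authors' fitted equation; m_0 and r_0 are assumed normalization constants:

    m(r) \approx m_0 \, (r_0 / r)^2

where m(r) is the post-blast residue mass recovered at a sampling site at distance r from the detonation point, so that doubling the distance reduces the expected residue mass by roughly a factor of four.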
The Analytic Hierarchy Process and Participatory Decisionmaking
Daniel L. Schmoldt; Daniel L. Peterson; Robert L. Smith
1995-01-01
Managing natural resource lands requires social, as well as biophysical, considerations. Unfortunately, it is extremely difficult to accurately assess and quantify changing social preferences, and to aggregate conflicting opinions held by diverse social groups. The Analytic Hierarchy Process (AHP) provides a systematic, explicit, rigorous, and robust mechanism for...
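The core AHP computation referred to here derives priority weights from a reciprocal pairwise-comparison matrix, classically via its principal eigenvector. A minimal sketch of that standard calculation; the 3x3 judgment matrix below is hypothetical, not taken from the study:

    import numpy as np

    # Hypothetical reciprocal pairwise-comparison matrix for three criteria.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                       # priority weights, summing to 1

    # Saaty's consistency index: CI = (lambda_max - n) / (n - 1)
    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)

A CI close to zero indicates nearly consistent judgments; in practice CI is compared against a random index to decide whether the pairwise comparisons need revision.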
Using Analytic Hierarchy Process in Textbook Evaluation
ERIC Educational Resources Information Center
Kato, Shigeo
2014-01-01
This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…
Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I
2016-10-18
There is an increasing interest in using single particle-inductively coupled plasma mass spectrometry (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved one. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods
NASA Astrophysics Data System (ADS)
Crootof, A.; Albrecht, T.; Scott, C. A.
2017-12-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence - or gaps and inconsistent use - of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed (a) use of specific and reproducible methods for nexus assessment is uncommon - less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food - the very concept they purport to address; (c) assessments strongly favor quantitative approaches - 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos - only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper finds that, to make the WEF nexus effective as a policy-relevant analytical tool, methods are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and policy-makers.
A new concept of pencil beam dose calculation for 40-200 keV photons using analytical dose kernels.
Bartzsch, Stefan; Oelfke, Uwe
2013-11-01
The advent of widespread kV cone-beam computed tomography in image-guided radiation therapy and special therapeutic applications of keV photons, e.g., in microbeam radiation therapy (MRT), require accurate and fast dose calculations for photon beams with energies between 40 and 200 keV. Multiple photon scattering originating from Compton scattering and the strong dependence of the photoelectric cross section on the atomic number of the interacting tissue render these dose calculations far more challenging than those established for corresponding MeV beams. Analytical models of kV photon dose calculation developed so far therefore fail to provide the required accuracy, and one has to rely on time-consuming Monte Carlo simulation techniques. In this paper, the authors introduce a novel analytical approach for kV photon dose calculations with an accuracy that is almost comparable to that of Monte Carlo simulations. First, analytical point dose and pencil beam kernels are derived for homogeneous media and compared to Monte Carlo simulations performed with the Geant4 toolkit. The dose contributions are systematically separated into contributions from the relevant orders of multiple photon scattering. Moreover, approximate scaling laws for the extension of the algorithm to inhomogeneous media are derived. The analytically derived dose kernels in water showed excellent agreement with the Monte Carlo method. Calculated values deviate less than 5% from Monte Carlo derived dose values, for doses above 1% of the maximum dose. The analytical structure of the kernels allows adaptation to arbitrary materials and photon spectra in the given energy range of 40-200 keV. The presented analytical methods can be employed in a fast treatment planning system for MRT. In convolution-based algorithms, dose calculation times can be reduced to a few minutes.
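To illustrate how pencil-beam kernels are typically applied in convolution-based dose engines (a generic sketch, not the authors' algorithm; the Gaussian kernel and field geometry below are stand-ins for a real, scattering-order-resolved kV kernel):

    import numpy as np
    from scipy.signal import fftconvolve

    # Generic pencil-beam convolution: dose = fluence convolved with kernel.
    x = np.linspace(-20, 20, 201)                    # lateral grid, mm
    X, Y = np.meshgrid(x, x)
    kernel = np.exp(-(X**2 + Y**2) / (2 * 3.0**2))   # assumed 3 mm lateral spread
    kernel /= kernel.sum()                           # normalize kernel

    fluence = np.zeros_like(kernel)
    fluence[80:120, 80:120] = 1.0                    # idealized 8 mm x 8 mm field

    dose = fftconvolve(fluence, kernel, mode="same") # relative dose map

In a realistic implementation the kernel would be computed analytically (or from Monte Carlo) per scattering order and per depth, and the scaling laws mentioned above would adapt it to inhomogeneous media.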
NASA Astrophysics Data System (ADS)
Jayamani, E.; Perera, D. S.; Soon, K. H.; Bakri, M. K. B.
2017-04-01
A systematic method of material analysis aiming at fuel efficiency improvement through the use of natural fiber reinforced polymer matrix composites in the automobile industry is proposed. A multi-factor decision procedure based on the Analytical Hierarchy Process (AHP) was implemented in MATLAB to achieve improved fuel efficiency through the weight reduction of vehicular components, by effective comparison between two engine hood designs. The reduction was simulated by utilizing natural fiber polymer composites with thermoplastic polypropylene (PP) as the matrix polymer and benchmarked against a synthetic-fiber-based composite component. Results showed that PP with 35% flax fiber loading achieved a 0.4% improvement in fuel efficiency, the highest among the 27 candidate fibers.
Evaluating supplier quality performance using fuzzy analytical hierarchy process
NASA Astrophysics Data System (ADS)
Ahmad, Nazihah; Kasim, Maznah Mat; Rajoo, Shanmugam Sundram Kalimuthu
2014-12-01
Evaluating supplier quality performance is vital in ensuring continuous supply chain improvement, reducing operational costs and risks, and meeting customer expectations. This paper aims to illustrate an application of the Fuzzy Analytical Hierarchy Process to prioritize the evaluation criteria in the context of automotive manufacturing in Malaysia. Five main criteria were identified: quality, cost, delivery, customer service and technology support. These criteria were arranged into a hierarchical structure and evaluated by an expert. The relative importance of each criterion was determined using linguistic variables, which were represented as triangular fuzzy numbers. The Center of Gravity defuzzification method was used to convert the fuzzy evaluations into their corresponding crisp values. Such fuzzy evaluation can be used as a systematic tool to overcome the uncertainty in evaluating suppliers' performance, which is usually associated with subjective human judgments.
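To make the defuzzification step concrete: for a triangular fuzzy number (l, m, u), the centre-of-gravity method reduces to the simple average (l + m + u) / 3. A minimal sketch under that assumption; the fuzzy weights below are illustrative, not the paper's expert data:

    def cog_defuzzify(tfn):
        """Centroid of a triangular fuzzy number (l, m, u) is (l + m + u) / 3."""
        l, m, u = tfn
        return (l + m + u) / 3.0

    # Illustrative fuzzy weights for three of the criteria (hypothetical values).
    fuzzy_weights = {
        "quality":  (0.30, 0.40, 0.50),
        "cost":     (0.15, 0.25, 0.35),
        "delivery": (0.10, 0.20, 0.30),
    }
    crisp = {c: cog_defuzzify(w) for c, w in fuzzy_weights.items()}
    total = sum(crisp.values())
    weights = {c: v / total for c, v in crisp.items()}  # normalized crisp weights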
Weisner, Thomas S; Fiese, Barbara H
2011-12-01
Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.
Algorithms and software for U-Pb geochronology by LA-ICPMS
NASA Astrophysics Data System (ADS)
McLean, Noah M.; Bowring, James F.; Gehrels, George
2016-07-01
The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software, that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with spatial resolution on the scale of tens of micrometers. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
2014-01-01
Background Opioids are psychoactive analgesic drugs prescribed for pain relief and palliative care. Due to their addictive potential, effort and vigilance in controlling prescriptions is needed to avoid misuse and dependence. Despite the effort, the prevalence of opioid use disorder continues to rise. Opioid substitution therapies are commonly used to treat opioid dependence; however, there is minimal consensus as to which therapy is most effective. Available treatments include methadone, heroin, buprenorphine, as well as naltrexone. This systematic review aims to assess and compare the effect of all available opioid substitution therapies on the treatment of opioid dependence. Methods/Design The authors will search Medline, EMBASE, PubMed, PsycINFO, Web of Science, Cochrane Library, Cochrane Clinical Trials Registry, World Health Organization International Clinical Trials Registry Platform Search Portal, and the National Institutes for Health Clinical Trials Registry. The title, abstract, and full-text screening will be completed in duplicate. When appropriate, multiple treatment comparison Bayesian meta-analytic methods will be performed to deduce summary statistics estimating the effectiveness of all opioid substitution therapies in terms of retention and response to treatment (as measured through continued opioid abuse). Discussion Using evidence gained from this systematic review, we anticipate disseminating an objective review of the current available literature on the effectiveness of all opioid substitution therapies for the treatment of opioid use disorder. The results of this systematic review are imperative to the further enhancement of clinical practice in addiction medicine. Systematic review registration PROSPERO CRD42013006507. PMID:25239213
Potential, velocity, and density fields from sparse and noisy redshift-distance samples - Method
NASA Technical Reports Server (NTRS)
Dekel, Avishai; Bertschinger, Edmund; Faber, Sandra M.
1990-01-01
A method for recovering the three-dimensional potential, velocity, and density fields from large-scale redshift-distance samples is described. Galaxies are taken as tracers of the velocity field, not of the mass. The density field and the initial conditions are calculated using an iterative procedure that applies the no-vorticity assumption at an initial time and uses the Zel'dovich approximation to relate initial and final positions of particles on a grid. The method is tested using a cosmological N-body simulation 'observed' at the positions of real galaxies in a redshift-distance sample, taking into account their distance measurement errors. Malmquist bias and other systematic and statistical errors are extensively explored using both analytical techniques and Monte Carlo simulations.
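The Zel'dovich approximation invoked above relates initial (Lagrangian) and final (Eulerian) particle positions through the linear growth factor. A hedged restatement of the textbook relation, not necessarily the paper's exact formulation:

    \mathbf{x}(\mathbf{q},t) = \mathbf{q} + D(t)\,\boldsymbol{\psi}(\mathbf{q}), \qquad \mathbf{v} = a\,\dot{D}(t)\,\boldsymbol{\psi}(\mathbf{q})

where \mathbf{q} is the initial comoving position, D(t) the linear growth factor, and \boldsymbol{\psi} the displacement field, which is irrotational under the paper's no-vorticity assumption.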
Sitnikov, Dmitri G.; Monnin, Cian S.; Vuckovic, Dajana
2016-01-01
The comparison of extraction methods for global metabolomics is usually executed in biofluids only and focuses on metabolite coverage and method repeatability. This limits our detailed understanding of extraction parameters such as recovery and matrix effects and prevents side-by-side comparison of different sample preparation strategies. To address this gap in knowledge, seven solvent-based and solid-phase extraction methods were systematically evaluated using standard analytes spiked into both buffer and human plasma. We compared recovery, coverage, repeatability, matrix effects, selectivity and orthogonality of all methods tested for the non-lipid metabolome in combination with reversed-phase and mixed-mode liquid chromatography-mass spectrometry (LC-MS) analysis. Our results confirmed wide selectivity and excellent precision of solvent precipitations, but revealed their high susceptibility to matrix effects. The use of all seven methods showed high overlap and redundancy, which resulted in metabolite coverage increases of 34–80% depending on the LC-MS method employed as compared to the best single extraction protocol (methanol/ethanol precipitation), despite a 7x increase in MS analysis time and sample consumption. The most orthogonal methods to methanol-based precipitation were ion-exchange solid-phase extraction and liquid-liquid extraction using methyl tert-butyl ether. Our results help facilitate rational design and selection of sample preparation methods and internal standards for global metabolomics. PMID:28000704
An overview of methods for comparative effectiveness research.
Meyer, Anne-Marie; Wheeler, Stephanie B; Weinberger, Morris; Chen, Ronald C; Carpenter, William R
2014-01-01
Comparative effectiveness research (CER) is a broad category of outcomes research encompassing many different methods employed by researchers and clinicians from numerous disciplines. The goal of cancer-focused CER is to generate new knowledge to assist cancer stakeholders in making informed decisions that will improve health care and outcomes of both individuals and populations. There are numerous CER methods that may be used to examine specific questions, including randomized controlled trials, observational studies, systematic literature reviews, and decision sciences modeling. Each has its strengths and weaknesses. To both inform and serve as a reference for readers of this issue of Seminars in Radiation Oncology as well as the broader oncology community, we describe CER and several of the more commonly used approaches and analytical methods. © 2013 Published by Elsevier Inc.
Costing evidence for health care decision-making in Austria: A systematic review
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata
2017-01-01
Background With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods both at international and national levels are imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It describes systematically the current economic evaluation and costing studies landscape focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impacts on evidence-based decision-making and potential suggestions for areas of development. Methods A systematic literature review of English and German language peer-reviewed as well as grey literature (2004–2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. Results A total of 93 studies were included. 87% were journal articles, 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on relevant standards the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% of the studies did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. Conclusion There are substantial inconsistencies in the costing methods and reporting standards in economic analyses in Austria, which may contribute to a low acceptance and lack of interest in economic evaluation-informed decision making. To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods. PMID:28806728
Silvestre, Dolores; Fraga, Miriam; Gormaz, María; Torres, Ester; Vento, Máximo
2014-07-01
The variability of human milk (HM) composition renders analysis of its components essential for optimal nutrition of preterm infants fed either donor milk or their own mother's milk. To fulfil this requirement, various analytical instruments have been subjected to scientific and clinical evaluation. The objective of this study was to evaluate the suitability of a rapid method for the analysis of macronutrients in HM as compared with the analytical methods applied by the cow's milk industry. Mature milk from 39 donors was analysed using an infrared human milk analyser (HMA) and compared with biochemical reference laboratory methods. The statistical analysis was based on the use of paired data tests. The use of an infrared HMA for the analysis of lipids, proteins and lactose in HM proved satisfactory as regards rapidity, simplicity and the required sample volume. The instrument afforded good linearity and precision in application to all three nutrients. However, accuracy was not acceptable when compared with the reference methods, with overestimation of the lipid content and underestimation of the protein and lactose contents. The use of mid-infrared HMA might become the standard for rapid analysis of HM once standardisation and rigorous, systematic calibration are provided. © 2012 John Wiley & Sons Ltd.
Colloidal Mechanisms of Gold Nanoparticle Loss in Asymmetric Flow Field-Flow Fractionation.
Jochem, Aljosha-Rakim; Ankah, Genesis Ngwa; Meyer, Lars-Arne; Elsenberg, Stephan; Johann, Christoph; Kraus, Tobias
2016-10-07
Flow field-flow fractionation is a powerful method for the analysis of nanoparticle size distributions, but its widespread use has been hampered by large analyte losses, especially of metal nanoparticles. Here, we report on the colloidal mechanisms underlying the losses. We studied gold nanoparticles (AuNPs) during asymmetrical flow field-flow fractionation (AF4) by systematic variation of the particle properties and the eluent composition. Recoveries of AuNPs (core diameter 12 nm) stabilized by citrate or polyethylene glycol (PEG) at different ionic strengths were determined. We used online UV-vis detection and off-line elemental analysis to follow particle losses during full analysis runs, runs without cross-flow, and runs with parts of the instrument bypassed. The combination allowed us to calculate relative and absolute analyte losses at different stages of the analytical protocol. We found different loss mechanisms depending on the ligand. Citrate-stabilized particles degraded during analysis and suffered large losses (up to 74%). PEG-stabilized particles had smaller relative losses at moderate ionic strengths (1-20%) that depended on PEG length. Long PEGs at higher ionic strengths (≥5 mM) caused particle loss due to bridging adsorption at the membrane. Bulk agglomeration was not a relevant loss mechanism at low ionic strengths ≤5 mM for any of the studied particles. An unexpectedly large fraction of particles was lost at tubing and other internal surfaces. We propose that the colloidal mechanisms observed here are relevant loss mechanisms in many particle analysis protocols and discuss strategies to avoid them.
NASA Astrophysics Data System (ADS)
Nielsen, Roger L.; Ustunisik, Gokce; Weinsteiger, Allison B.; Tepley, Frank J.; Johnston, A. Dana; Kent, Adam J. R.
2017-09-01
Quantitative models of petrologic processes require accurate partition coefficients. Our ability to obtain accurate partition coefficients is constrained by their dependence on pressure, temperature and composition, and on the experimental and analytical techniques we apply. The source and magnitude of error in experimental studies of trace element partitioning may go unrecognized if one examines only the processed published data. The most important sources of error are relict crystals and analyses of more than one phase in the analytical volume. Because data have typically been published as averages, identification of compromised data is difficult if not impossible. We addressed this problem by examining unprocessed data from plagioclase/melt partitioning experiments, by comparing models based on those data with existing partitioning models, and by evaluating the degree to which the partitioning models are dependent on the calibration data. We found that partitioning models are dependent on the calibration data in ways that result in erroneous model values, and that the error will be systematic and dependent on the value of the partition coefficient. In effect, use of different calibration datasets will result in partitioning models whose results are systematically biased, and one can arrive at different and conflicting conclusions depending on how a model is calibrated, defeating the purpose of applying the models. Ultimately this is an experimental data problem, which can be solved if we publish individual analyses (not averages) or use a projection method wherein an independent compositional constraint is used to identify and estimate the uncontaminated composition of each phase.
Bhui, Kamaldeep S.; Dinos, Sokratis; Stansfeld, Stephen A.; White, Peter D.
2012-01-01
Background. Psychosocial stressors in the workplace are a cause of anxiety and depressive illnesses, suicide and family disruption. Methods. The present review synthesizes the evidence from existing systematic reviews published between 1990 and July 2011. We assessed the effectiveness of individual, organisational and mixed interventions on two outcomes: mental health and absenteeism. Results. In total, 23 systematic reviews included 499 primary studies; there were 11 meta-analyses and 12 narrative reviews. Meta-analytic studies found a greater effect size of individual interventions on individual outcomes. Organisational interventions showed mixed evidence of benefit. Organisational programmes for physical activity showed a reduction in absenteeism. The findings from the meta-analytic reviews were consistent with the findings from the narrative reviews. Specifically, cognitive-behavioural programmes produced larger effects at the individual level compared with other interventions. Some interventions appeared to lead to deterioration in mental health and absenteeism outcomes. Gaps in the literature include studies of organisational outcomes like absenteeism, the influence of specific occupations and size of organisations, and studies of the comparative effectiveness of primary, secondary and tertiary prevention. Conclusions. Individual interventions (like CBT) improve individuals' mental health. Physical activity as an organisational intervention reduces absenteeism. Research needs to target gaps in the evidence. PMID:22496705
Peters, Frank T; Remane, Daniela
2012-06-01
In the last decade, liquid chromatography coupled to (tandem) mass spectrometry (LC-MS(-MS)) has become a versatile technique with many routine applications in clinical and forensic toxicology. However, it is well-known that ionization in LC-MS(-MS) is prone to so-called matrix effects, i.e., alteration in response due to the presence of co-eluting compounds that may increase (ion enhancement) or reduce (ion suppression) ionization of the analyte. Since the first reports on such matrix effects, numerous papers have been published on this matter and the subject has been reviewed several times. However, none of the existing reviews has specifically addressed aspects of matrix effects of particular interest and relevance to clinical and forensic toxicology, for example matrix effects in methods for multi-analyte or systematic toxicological analysis or matrix effects in (alternative) matrices almost exclusively analyzed in clinical and forensic toxicology, for example meconium, hair, oral fluid, or decomposed samples in postmortem toxicology. This review article will therefore focus on these issues, critically discussing experiments and results of matrix effects in LC-MS(-MS) applications in clinical and forensic toxicology. Moreover, it provides guidance on performance of studies on matrix effects in LC-MS(-MS) procedures in systematic toxicological analysis and postmortem toxicology.
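One widely used way to quantify such matrix effects, the post-extraction addition design commonly attributed to Matuszewski et al., is stated here for orientation (the review itself surveys several designs). It compares peak areas of neat standards (A), blank extracts spiked after extraction (B), and samples spiked before extraction (C):

    ME(%) = 100 \times B / A
    RE(%) = 100 \times C / B
    PE(%) = 100 \times C / A

where ME is the matrix effect (values below 100% indicate ion suppression, above 100% ion enhancement), RE the recovery of the extraction step, and PE the overall process efficiency.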
A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales
Ayton, Gary S.; Voth, Gregory A.
2009-01-01
A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohi, J.
Supporting analysis and assessments can provide a sound analytic foundation and focus for program planning, evaluation, and coordination, particularly if issues of hydrogen production, distribution, storage, safety, and infrastructure can be analyzed in a comprehensive and systematic manner. The overall purpose of this activity is to coordinate all key analytic tasks-such as technology and market status, opportunities, and trends; environmental costs and benefits; and regulatory constraints and opportunities-within a long-term and systematic analytic foundation for program planning and evaluation. Within this context, the purpose of the project is to help develop and evaluate programmatic pathway options that incorporate near- and mid-term strategies to achieve the long-term goals of the Hydrogen Program. In FY 95, NREL will develop a comprehensive effort with industry, state and local agencies, and other federal agencies to identify and evaluate programmatic pathway options to achieve the long-term goals of the Program. Activity to date is reported.
Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J
2016-02-01
Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post from which to highlight analytical problems that would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. The study draws the attention of laboratories to the main causes of analytical errors and, for educational purposes, suggests practical solutions to avoid them. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.
NASA Astrophysics Data System (ADS)
Cucu, Daniela; Woods, Mike
2008-08-01
The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs, and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner. They shall include both quality control and quality assurance measures. When designing the quality assurance programme, a laboratory should consider pre-analytical activities (like personnel training, selection and validation of test methods, and qualifying of equipment), analytical activities ranging from sampling and sample preparation to instrumental analysis, and post-analytical activities (like decoding, calculation, use of statistical tests or packages, and management of results). Designed on different levels (analyst, quality manager and technical manager) and including a variety of measures, the programme shall ensure the validity and accuracy of test results and the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and, last but not least, show the comparability of test results. Laboratory management should establish performance targets, review QC/QA results against them periodically, and implement appropriate measures in case of non-compliance.
A new numerical method for calculating extrema of received power for polarimetric SAR
Zhang, Y.; Zhang, Jiahua; Lu, Z.; Gong, W.
2009-01-01
A numerical method called cross-step iteration is proposed to calculate the maximal/minimal received power for polarized imagery based on a target's Kennaugh matrix. This method is much more efficient than the systematic method, which searches for the extrema of received power by varying the polarization ellipse angles of receiving and transmitting polarizations. It is also more advantageous than the Schuler method, which has been adopted by the PolSARPro package, because the cross-step iteration method requires less computation time and can derive both the maximal and minimal received powers, whereas the Schuler method is designed to work out only the maximal received power. The analytical model of received-power optimization indicates that the first eigenvalue of the Kennaugh matrix is the supremum of the maximal received power. The difference between these two parameters reflects the depolarization effect of the target's backscattering, which might be useful for target discrimination. © 2009 IEEE.
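The alternating-optimization idea behind such a scheme can be sketched as follows. This is an illustrative reconstruction, not the authors' published algorithm: the received-power expression and the Stokes-vector parameterization of fully polarized states are standard, but the update rule below is only a plausible reading of "cross-step iteration", and the minimum would be found analogously by anti-aligning instead of aligning.

    import numpy as np

    def received_power(K, g_r, g_t):
        """Received power P = 0.5 * g_r^T K g_t for Stokes vectors g_r, g_t."""
        return 0.5 * g_r @ K @ g_t

    def best_polarized(s):
        """Fully polarized Stokes vector [1, n] whose spatial part n aligns with s[1:]."""
        n = s[1:] / np.linalg.norm(s[1:])
        return np.concatenate(([1.0], n))

    def cross_step(K, tol=1e-10, iters=100):
        g_t = np.array([1.0, 1.0, 0.0, 0.0])      # initial transmit state (horizontal)
        p_old = -np.inf
        for _ in range(iters):
            g_r = best_polarized(K @ g_t)          # optimal receive for fixed transmit
            g_t = best_polarized(K.T @ g_r)        # optimal transmit for fixed receive
            p = received_power(K, g_r, g_t)
            if abs(p - p_old) < tol:               # converged to a power extremum
                break
            p_old = p
        return p, g_r, g_t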
A systematic investigation of sample diluents in modern supercritical fluid chromatography.
Desfontaine, Vincent; Tarafder, Abhijit; Hill, Jason; Fairchild, Jacob; Grand-Guillaume Perrenoud, Alexandre; Veuthey, Jean-Luc; Guillarme, Davy
2017-08-18
This paper focuses on the possibility of injecting large volumes (up to 10 μL) in ultra-high performance supercritical fluid chromatography (UHPSFC) under generic gradient conditions. Several injection and method parameters were individually evaluated (i.e., analyte concentration, injection volume, initial percentage of co-solvent in the gradient, nature of the weak needle wash solvent, nature of the sample diluent, nature of the column and of the analyte). The most critical parameters were further investigated using a multivariate approach. The overall results suggested that several aprotic solvents, including methyl tert-butyl ether (MTBE), dichloromethane, acetonitrile and cyclopentyl methyl ether (CPME), were well adapted for the injection of large volumes in UHPSFC, while MeOH was generally the worst alternative. However, the nature of the stationary phase also had a strong impact, and some of these diluents did not perform equally on each column. This was due to a competition between the analyte and the diluent for adsorption on the stationary phase. This observation introduced the idea that the sample diluent should not only be chosen according to the analyte but also to the column chemistry, to limit the interactions between the diluent and the ligands. Other important characteristics of the "ideal" SFC sample diluent were finally highlighted. Aprotic solvents with low viscosity are preferable to avoid strong solvent effects and viscous fingering, respectively. In the end, the authors suggest that the choice of the sample diluent should be part of method development, as a function of the analyte and the selected stationary phase. Copyright © 2017 Elsevier B.V. All rights reserved.
Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M
2015-08-01
To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
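The segmented regression model most often applied in the reviewed studies takes a standard form, restated here for orientation (notation follows the general interrupted time series literature, not any single reviewed study):

    Y_t = \beta_0 + \beta_1 t + \beta_2 X_t + \beta_3 (t - t_0) X_t + \varepsilon_t

where Y_t is the outcome at time t, X_t is an indicator equal to 1 from the intervention time t_0 onwards, \beta_2 estimates the immediate level change, \beta_3 the change in slope, and \varepsilon_t an error term that may require the autocorrelation, seasonality, or nonstationarity adjustments noted above.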
Blumenberg, Cauane; Barros, Aluísio J D
2018-07-01
To systematically review the literature and compare response rates (RRs) of web surveys to alternative data collection methods in the context of epidemiologic and public health studies. We reviewed the literature using the PubMed, LILACS, SciELO, WebSM, and Google Scholar databases. We selected epidemiologic and public health studies that considered the general population and used two parallel data collection methods, one being web-based. RR differences were analyzed using the two-sample test of proportions and pooled using random effects. We investigated agreement using Bland–Altman analysis and correlation using Pearson's coefficient. We selected 19 studies (nine randomized trials). The RR of web-based data collection was 12.9 percentage points (p.p.) lower (95% CI = -19.0, -6.8) than that of the alternative methods, and 15.7 p.p. lower (95% CI = -24.2, -7.3) considering only randomized trials. Monetary incentives did not reduce the RR differences. A strong positive correlation (r = 0.83) between the RRs was observed. Web-based data collection presents lower RRs compared to alternative methods. However, it is not recommended to interpret this as meta-analytical evidence due to the high heterogeneity of the studies.
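For a single study in such a comparison, the two-sample test of proportions reduces to a z-test on the two response rates. A minimal sketch with hypothetical counts (statsmodels provides the test directly):

    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical single study: 320/1000 responded via web, 450/1000 via mail.
    counts = [320, 450]
    nobs = [1000, 1000]
    z, p = proportions_ztest(counts, nobs)          # z statistic and p-value
    diff_pp = 100 * (counts[0] / nobs[0] - counts[1] / nobs[1])  # -13.0 p.p.

Study-level differences like diff_pp, with their variances, are what a random-effects model would then pool across studies.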
The HiWATE (Health Impacts of long-term exposure to disinfection by-products in drinking WATEr) project is the first systematic analysis that combines the epidemiology on adverse pregnancy outcomes with analytical chemistry and analytical biology in the European Union. This study...
Three-dimensional analytical solutions of the atmospheric diffusion equation with multiple sources and height-dependent wind speed and eddy diffusivities are derived in a systematic fashion. For homogeneous Neumann (total reflection), Dirichlet (total adsorpti...
NASA Astrophysics Data System (ADS)
Rosen, Elias P.; Bokhart, Mark T.; Ghashghaei, H. Troy; Muddiman, David C.
2015-06-01
Analyte signal in a laser desorption/postionization scheme such as infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) is strongly coupled to the degree of overlap between the desorbed plume of neutral material from a sample and an orthogonal electrospray. In this work, we systematically examine the effect of desorption conditions on IR-MALDESI response to pharmaceutical drugs and endogenous lipids in biological tissue using a design of experiments approach. Optimized desorption conditions have then been used to conduct an untargeted lipidomic analysis of whole body sagittal sections of neonate mouse. IR-MALDESI response to a wide range of lipid classes has been demonstrated, with enhanced lipid coverage received by varying the laser wavelength used for mass spectrometry imaging (MSI). Targeted MS2 imaging (MS2I) of an analyte, cocaine, deposited beneath whole body sections allowed determination of tissue-specific ion response factors, and CID fragments of cocaine were monitored to comment on wavelength-dependent internal energy deposition based on the "survival yield" method.
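The "survival yield" referred to above is conventionally defined from precursor and fragment ion intensities (a standard definition, stated here for reference rather than quoted from the study):

    SY = I_{precursor} / (I_{precursor} + \sum_i I_{fragment,i})

so that lower survival yields indicate greater internal energy deposition in the ions.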
Kinetic theory analysis of rarefied gas flow through finite length slots
NASA Technical Reports Server (NTRS)
Raghuraman, P.
1972-01-01
An analytic study is made of the flow of a rarefied monatomic gas through a two-dimensional slot. The parameters of the problem are the ratio of downstream to upstream pressure, the Knudsen number at the high pressure end (based on slot half width), and the length to slot half width ratio. A moment method of solution is used by assuming a discontinuous distribution function consisting of four Maxwellians split equally in angular space. Numerical solutions are obtained for the resulting equations. The characteristics of the transition regime are portrayed. The solutions in the free molecule limit are systematically lower than the results obtained in that limit by more accurate numerical methods.
Quantum networks in divergence-free circuit QED
NASA Astrophysics Data System (ADS)
Parra-Rodriguez, A.; Rico, E.; Solano, E.; Egusquiza, I. L.
2018-04-01
Superconducting circuits are one of the leading quantum platforms for quantum technologies. With growing system complexity, it is of crucial importance to develop scalable circuit models that contain the minimum information required to predict the behaviour of the physical system. Based on microwave engineering methods, divergent and non-divergent Hamiltonian models in circuit quantum electrodynamics have been proposed to explain the dynamics of superconducting quantum networks coupled to infinite-dimensional systems, such as transmission lines and general impedance environments. Here, we study systematically common linear coupling configurations between networks and infinite-dimensional systems. The main result is that the simple Lagrangian models for these configurations present an intrinsic natural length that provides a natural ultraviolet cutoff. This length is due to the unavoidable dressing of the environment modes by the network. In this manner, the coupling parameters between their components correctly manifest their natural decoupling at high frequencies. Furthermore, we show the requirements to correctly separate infinite-dimensional coupled systems in local bases. We also compare our analytical results with other analytical and approximate methods available in the literature. Finally, we propose several applications of these general methods to analogue quantum simulation of multi-spin-boson models in non-perturbative coupling regimes.
Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya
2015-01-07
To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) has been used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, impacting the accuracy and precision of spectral data are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in the quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
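Quantification against an internal standard in qNMR conventionally uses the ratio of normalized signal integrals. The general relation is standard; the symbols here are generic, not the paper's notation:

    P_a = (I_a / I_{IS}) \times (N_{IS} / N_a) \times (M_a / M_{IS}) \times (m_{IS} / m_a) \times P_{IS}

where I is the integral of the quantified resonance, N the number of nuclei it represents, M the molar mass, m the weighed mass, and P the purity, for analyte a and internal standard IS.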
Sun, Yuhan; Qi, Peipei; Cang, Tao; Wang, Zhiwei; Wang, Xiangyun; Yang, Xuewei; Wang, Lidong; Xu, Xiahong; Wang, Qiang; Wang, Xinquan; Zhao, Changshan
2018-06-01
As a key representative organism, earthworms can directly illustrate the influence of pesticides on environmental organisms in soil ecosystems. The present work aimed to develop a high-throughput multipesticide residue analytical method for earthworms using solid-liquid extraction with acetonitrile as the solvent and magnetic material-based dispersive solid-phase extraction for purification. Magnetic Fe3O4 nanoparticles were modified with a thin silica layer to form Fe3O4-SiO2 nanoparticles, which were fully characterized by field-emission scanning electron microscopy, transmission electron microscopy, Fourier-transform infrared spectroscopy, X-ray diffractometry, and vibrating sample magnetometry. The Fe3O4-SiO2 nanoparticles were used as the separation media in dispersive solid-phase extraction with primary secondary amine and ZrO2 as the cleanup adsorbents to eliminate matrix interferences. The amounts of nanoparticles and adsorbents were optimized for the simultaneous determination of 44 pesticides and six metabolites in earthworms by liquid chromatography with tandem mass spectrometry. The method performance was systematically validated with satisfactory results. The limits of quantification were 20 μg/kg for all analytes studied, while the recoveries of the target analytes ranged from 65.1 to 127% with relative standard deviation values lower than 15.0%. The developed method was subsequently utilized to explore the bioaccumulation of bitertanol in earthworms exposed to contaminated soil, verifying its feasibility for real sample analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Firing rate dynamics in recurrent spiking neural networks with intrinsic and network heterogeneity.
Ly, Cheng
2015-12-01
Heterogeneity of neural attributes has recently gained a lot of attention and is increasingly recognized as a crucial feature of neural processing. Despite its importance, this physiological feature has traditionally been neglected in theoretical studies of cortical neural networks. Thus, much remains unknown about the consequences of cellular and circuit heterogeneity in spiking neural networks. In particular, combining network or synaptic heterogeneity with intrinsic heterogeneity has yet to be considered systematically, despite the fact that both are known to exist and likely have significant roles in neural network dynamics. In a canonical recurrent spiking neural network model, we study how these two forms of heterogeneity lead to different distributions of excitatory firing rates. To analytically characterize how these types of heterogeneity affect the network, we employ a dimension reduction method that relies on a combination of Monte Carlo simulations and probability density function equations. We find that the relationship between intrinsic and network heterogeneity has a strong effect on the overall level of heterogeneity of the firing rates. Specifically, this relationship can lead to amplification or attenuation of firing rate heterogeneity, and these effects depend on whether the recurrent network is firing asynchronously or rhythmically. These observations are captured with the aforementioned reduction method, and simpler analytic descriptions based on this dimension reduction are then developed. The final analytic descriptions provide compact formulas for how the relationship between intrinsic and network heterogeneity determines firing rate heterogeneity dynamics in various settings.
ERIC Educational Resources Information Center
Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher
2013-01-01
A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…
NASA Astrophysics Data System (ADS)
Adrich, Przemysław
2016-05-01
In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor-intensive task, as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of the system performance as a function of the foil parameters. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real-life design problem, as described in Part II of this work.
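The systematic scan described above amounts to a grid search over foil parameters with a Monte Carlo figure of merit. A schematic sketch only: the objective function below is a stand-in for a real Geant4-style transport simulation, and the parameter names and ranges are illustrative, not taken from the paper.

    import itertools
    import numpy as np

    def simulate_uniformity(t1, t2, d):
        """Placeholder for a Monte Carlo run returning a beam-uniformity
        figure of merit for foil thicknesses t1, t2 and separation d."""
        rng = np.random.default_rng(0)
        return -(t1 - 0.5)**2 - (t2 - 1.0)**2 - (d - 30)**2 + rng.normal(0, 1e-3)

    t1_grid = np.linspace(0.1, 1.0, 10)   # first foil thickness, mm (assumed)
    t2_grid = np.linspace(0.5, 2.0, 10)   # second foil thickness, mm (assumed)
    d_grid = np.linspace(10, 50, 9)       # foil separation, cm (assumed)

    # Exhaustive scan of the parameter grid; the best configuration maximizes
    # the (noisy) Monte Carlo figure of merit.
    best = max(itertools.product(t1_grid, t2_grid, d_grid),
               key=lambda p: simulate_uniformity(*p))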
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
Costing evidence for health care decision-making in Austria: A systematic review.
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata; Simon, Judit
2017-01-01
With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods at both international and national levels is imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It systematically describes the current landscape of economic evaluations and costing studies, focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impact on evidence-based decision-making and potential areas for development. A systematic literature review of English- and German-language peer-reviewed and grey literature (2004-2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. A total of 93 studies were included: 87% were journal articles and 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Judged against relevant standards, the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. There are substantial inconsistencies in the costing methods and reporting standards of economic analyses in Austria, which may contribute to low acceptance of, and lack of interest in, economic evaluation-informed decision-making. To improve the comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance, and a national reference cost library should be set up to allow harmonisation of valuation methods.
Data analytics and parallel-coordinate materials property charts
NASA Astrophysics Data System (ADS)
Rickman, Jeffrey M.
2018-01-01
It is often advantageous to display materials property relationships in the form of charts that highlight important correlations, thereby enhancing our understanding of materials behavior and facilitating materials selection. Unfortunately, in many cases these correlations are highly multidimensional in nature, and one typically employs low-dimensional cross-sections of the property space to convey some aspects of these relationships. To overcome some of these difficulties, in this work we employ methods of data analytics in conjunction with a visualization strategy known as parallel coordinates to better represent multidimensional materials data and to extract useful relationships among properties. We illustrate the utility of this approach by the construction and systematic analysis of multidimensional materials property charts for metallic and ceramic systems. These charts simplify the description of high-dimensional geometry, enable dimensional reduction and the identification of significant property correlations, and underline distinctions among different materials classes.
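A minimal sketch of a parallel-coordinate materials chart of the kind described above, using pandas' built-in plotting helper; the property values are hypothetical placeholders, not data from the paper.

```python
# Sketch: a parallel-coordinate chart of multidimensional materials
# properties. Each vertical axis is one property; each polyline is one
# material. Values are toy numbers for illustration.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

df = pd.DataFrame({
    "class":            ["metal", "metal", "ceramic", "ceramic"],
    "density (g/cm3)":  [7.8, 2.7, 3.2, 3.9],
    "E (GPa)":          [200, 70, 310, 380],
    "K_IC (MPa m^0.5)": [50, 25, 4, 3.5],
    "k (W/m K)":        [45, 237, 30, 30],
})

# Normalising columns first keeps axes with different units comparable.
norm = df.copy()
for col in df.columns[1:]:
    norm[col] = (df[col] - df[col].min()) / (df[col].max() - df[col].min())

parallel_coordinates(norm, class_column="class", colormap="tab10")
plt.title("Parallel-coordinate materials property chart (toy data)")
plt.show()
```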
Progress Towards an Open Data Ecosystem for Australian Geochemistry and Geochronology Data
NASA Astrophysics Data System (ADS)
McInnes, B.; Rawling, T.; Brown, W.; Liffers, M.; Wyborn, L. A.; Brown, A.; Cox, S. J. D.
2016-12-01
Technological improvements in laboratory automation and microanalytical methods are producing an unprecedented volume of high-value geochemical data for use by geoscientists in understanding geological and planetary processes. In contrast, the research infrastructure necessary to systematically manage, deliver and archive analytical data has not progressed much beyond the minimum effort necessary to produce a peer-reviewed publication. Anecdotal evidence indicates that the majority of publicly funded data is underreported, and what is published is relatively undiscoverable to experienced researchers, let alone the general public. Government-funded "open data" initiatives have a role to play in the development of networks of data management and delivery ecosystems and practices that allow access to publicly funded data. This paper reports on progress in Australia towards the creation of an open data ecosystem involving multiple academic and government research institutions cooperating to create an open data architecture linking researchers, physical samples, sample metadata, laboratory metadata, analytical data and consumers.
NASA Technical Reports Server (NTRS)
Book, W. J.
1973-01-01
An investigation is reported involving a mathematical procedure using 4 x 4 transformation matrices for analyzing the vibrations of flexible manipulators. Previous studies with the procedure are summarized, and the method is extended to include flexible joints as well as links, and to account for the effects of various power transmission schemes. A systematic study of the allocation of structural material and the placement of components such as motors and gearboxes was undertaken using the analytical tools developed. As one step in this direction, the variables which relate the vibration parameters of the arm to the task and environment of the arm were isolated and nondimensionalized. The 4 x 4 transformation matrices were also used to develop analytical expressions for the terms of the complete 6 x 6 compliance matrix for the case of two flexible links joined by a rotating joint, flexible about its axis of rotation.
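For readers unfamiliar with the 4 x 4 formalism, here is a minimal sketch of chaining homogeneous transformation matrices along a serial arm; the link lengths and joint angle are illustrative values, not parameters from the study.

```python
# Sketch: chaining 4x4 homogeneous transformation matrices of the kind used
# to propagate kinematics along a serial manipulator.
import numpy as np

def rot_z(theta):
    """Rotation about the z axis as a 4x4 homogeneous transform."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def trans(x, y, z):
    """Pure translation as a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Two links joined by a rotary joint about z: base -> link 1 -> joint -> link 2.
T = trans(0.5, 0, 0) @ rot_z(np.deg2rad(30)) @ trans(0.3, 0, 0)
tip = T @ np.array([0, 0, 0, 1])   # end-point position in base coordinates
print(tip[:3])
```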
NASA Astrophysics Data System (ADS)
van Westen, Thijs; Gross, Joachim
2017-07-01
The Helmholtz energy of a fluid interacting by a Lennard-Jones pair potential is expanded in a perturbation series. Both the methods of Barker-Henderson (BH) and of Weeks-Chandler-Andersen (WCA) are evaluated for the division of the intermolecular potential into reference and perturbation parts. The first four perturbation terms are evaluated for various densities and temperatures (in the ranges ρ* = 0–1.5 and T* = 0.5–12) using Monte Carlo simulations in the canonical ensemble. The simulation results are used to test several approximate theoretical methods for describing perturbation terms or for developing an approximate infinite order perturbation series. Additionally, the simulations serve as a basis for developing fully analytical third order BH and WCA perturbation theories. The development of analytical theories allows (1) a careful comparison between the BH and WCA formalisms, and (2) a systematic examination of the effect of higher-order perturbation terms on calculated thermodynamic properties of fluids. Properties included in the comparison are supercritical thermodynamic properties (pressure, internal energy, and chemical potential), vapor-liquid phase equilibria, second virial coefficients, and heat capacities. For all properties studied, we find a systematically improved description upon using a higher-order perturbation theory. A result of particular relevance is that a third order perturbation theory is capable of providing a quantitative description of second virial coefficients to temperatures as low as the triple-point of the Lennard-Jones fluid. We find no reason to prefer the WCA formalism over the BH formalism.
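The two potential divisions compared above have standard textbook forms; a minimal sketch (in reduced units, epsilon = sigma = 1) follows.

```python
# Sketch: the Barker-Henderson (BH) and Weeks-Chandler-Andersen (WCA)
# divisions of the Lennard-Jones pair potential into reference and
# perturbation parts (standard forms, reduced units).
import numpy as np

def u_lj(r):
    return 4.0 * (r**-12 - r**-6)

R_MIN = 2.0 ** (1.0 / 6.0)   # location of the LJ minimum

def bh_split(r):
    """BH: split at r = sigma; reference is the purely repulsive core."""
    u = u_lj(r)
    u_ref = np.where(r < 1.0, u, 0.0)
    return u_ref, u - u_ref

def wca_split(r):
    """WCA: split at the minimum; reference shifted up by epsilon."""
    u = u_lj(r)
    u_ref = np.where(r < R_MIN, u + 1.0, 0.0)
    u_pert = np.where(r < R_MIN, -1.0, u)
    return u_ref, u_pert

r = np.linspace(0.9, 2.5, 5)
print(bh_split(r)[1], wca_split(r)[1])
```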
Morini, Luca; Vignali, Claudia; Tricomi, Paolo; Groppi, Angelo
2015-09-01
The body of a 30-year-old woman was found in Lake Como, at a depth of about 120 meters, in her own car, after 3 years of immersion. The aim of this study was to evaluate psychoactive drugs as well as alcohol biomarkers in biological matrices. The following analyses were initially performed: GC-MS systematic toxicological analysis on biological fluids and tissues; GC-MS analysis of drugs of abuse on pubic hair; direct ethanol metabolite determination in pubic hair by LC-MS/MS. After 7 years, the samples, which had been stored at -20°C, were re-analyzed and submitted to an LC-MS/MS targeted screening method using multiple reaction monitoring mode. These analyses detected citalopram (150-3000 ng/mL), desmethylcitalopram (50-2300 ng/mL), clotiapine (20-65 ng/mL), and ethyl glucuronide (97 pg/mg). The methods showed acceptable reproducibility, and the concentrations of citalopram and desmethylcitalopram calculated through the two analytical techniques did not significantly differ in biological fluids. © 2015 American Academy of Forensic Sciences.
Data informatics for the Detection, Characterization, and Attribution of Climate Extremes
NASA Astrophysics Data System (ADS)
Collins, W.; Wehner, M. F.; O'Brien, T. A.; Paciorek, C. J.; Krishnan, H.; Johnson, J. N.; Prabhat, M.
2015-12-01
The potential for increasing frequency and intensity of extreme phenomena including downpours, heat waves, and tropical cyclones constitutes one of the primary risks of climate change for society and the environment. The challenge of characterizing these risks is that extremes represent the "tails" of distributions of atmospheric phenomena and are, by definition, highly localized and typically relatively transient. Therefore very large volumes of observational data and projections of future climate are required to quantify their properties in a robust manner. Massive data analytics are required in order to detect individual extremes, accumulate statistics on their properties, quantify how these statistics are changing with time, and attribute the effects of anthropogenic global warming on these statistics. We describe examples of the suite of techniques the climate community is developing to address these analytical challenges. The techniques include massively parallel methods for detecting and tracking atmospheric rivers and cyclones; data-intensive extensions to generalized extreme value theory to summarize the properties of extremes; and multi-model ensembles of hindcasts to quantify the attributable risk of anthropogenic influence on individual extremes. We conclude by highlighting examples of these methods developed by our CASCADE (Calibrated and Systematic Characterization, Attribution, and Detection of Extremes) project.
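The basic block-maxima step behind the generalized extreme value (GEV) work mentioned above can be sketched in a few lines; the data here are synthetic, generated for illustration only.

```python
# Sketch: fit a GEV distribution to a series of annual maxima and estimate
# a 100-year return level. Synthetic data; not from the CASCADE project.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
annual_max = stats.genextreme.rvs(c=-0.1, loc=30.0, scale=5.0,
                                  size=60, random_state=rng)

shape, loc, scale = stats.genextreme.fit(annual_max)
# Return level with annual exceedance probability 1/100:
rl_100 = stats.genextreme.ppf(1 - 1.0 / 100.0, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, 100-yr return level={rl_100:.1f}")
```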
ERIC Educational Resources Information Center
Papamitsiou, Zacharoula; Economides, Anastasios A.
2014-01-01
This paper aims to provide the reader with a comprehensive background for understanding current knowledge on Learning Analytics (LA) and Educational Data Mining (EDM) and their impact on adaptive learning. It constitutes an overview of empirical evidence behind key objectives of the potential adoption of LA/EDM in generic educational strategic…
ERIC Educational Resources Information Center
Cheung, Mike W. L.; Chan, Wai
2009-01-01
Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…
Improving Causal Inferences in Meta-analyses of Longitudinal Studies: Spanking as an Illustration.
Larzelere, Robert E; Gunnoe, Marjorie Lindner; Ferguson, Christopher J
2018-05-24
To evaluate and improve the validity of causal inferences from meta-analyses of longitudinal studies, two adjustments for Time-1 outcome scores and a temporally backwards test are demonstrated. Causal inferences would be supported by robust results across both adjustment methods, distinct from results run backwards. A systematic strategy for evaluating potential confounds is also introduced. The methods are illustrated by assessing the impact of spanking on subsequent externalizing problems (child age: 18 months to 11 years). Significant results indicated a small risk or a small benefit of spanking, depending on the adjustment method. These meta-analytic methods are applicable for research on alternatives to spanking and other developmental science topics. The underlying principles can also improve causal inferences in individual studies. © 2018 Society for Research in Child Development.
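The two common ways of adjusting a longitudinal effect estimate for Time-1 outcome scores can be sketched generically as below; the data are simulated and this is an illustration of the general technique, not the paper's exact estimators.

```python
# Sketch: ANCOVA-style covariate adjustment vs. a change-score model, the
# two standard Time-1 adjustments. Simulated data; illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
y1 = rng.normal(size=n)                                    # outcome at Time 1
x = (rng.random(n) < 0.3 + 0.1 * (y1 > 0)).astype(float)   # exposure, confounded by y1
y2 = 0.8 * y1 + 0.2 * x + rng.normal(size=n)               # outcome at Time 2

# (1) ANCOVA: regress the Time-2 outcome on exposure, adjusting for Time-1 score.
ancova = sm.OLS(y2, sm.add_constant(np.column_stack([x, y1]))).fit()
# (2) Change score: regress the Time-2 minus Time-1 difference on exposure.
change = sm.OLS(y2 - y1, sm.add_constant(x)).fit()

print("ANCOVA effect:", round(ancova.params[1], 3))
print("Change-score effect:", round(change.params[1], 3))
# A causal interpretation is better supported when both adjustments agree.
```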
Lagrangian methods in the analysis of nonlinear wave interactions in plasma
NASA Technical Reports Server (NTRS)
Galloway, J. J.
1972-01-01
An averaged-Lagrangian method is developed for obtaining the equations which describe the nonlinear interactions of the wave (oscillatory) and background (nonoscillatory) components which comprise a continuous medium. The method applies to monochromatic waves in any continuous medium that can be described by a Lagrangian density, but is demonstrated in the context of plasma physics. The theory is presented in a more general and unified form by way of a new averaged-Lagrangian formalism which simplifies the perturbation ordering procedure. Earlier theory is extended to deal with a medium distributed in velocity space and to account for the interaction of the background with the waves. The analytic steps are systematized, so as to maximize calculational efficiency. An assessment of the applicability and limitations of the method shows that it has some definite advantages over other approaches in efficiency and versatility.
NASA Astrophysics Data System (ADS)
Yao, Yanbo; Duan, Xiaoshuang; Luo, Jiangjiang; Liu, Tao
2017-11-01
The use of the van der Pauw (VDP) method for characterizing and evaluating the piezoresistive behavior of carbon-nanomaterial-enabled piezoresistive sensors has not been systematically studied. Using single-wall carbon nanotube (SWCNT) thin films as a model system, herein we report a coupled electrical-mechanical experimental study, in conjunction with a multiphysics finite element simulation and an analytical analysis, to compare the two-probe and VDP testing configurations for evaluating the piezoresistive behavior of carbon-nanomaterial-enabled piezoresistive sensors. The key features of the sample-aspect-ratio-dependent piezoresistive sensitivity, or gauge factor, were identified for the VDP testing configuration. It was found that the VDP test configuration offers consistently higher piezoresistive sensitivity than the two-probe testing method.
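For context, the standard VDP relation exp(-πR_A/R_s) + exp(-πR_B/R_s) = 1 can be solved numerically for the sheet resistance R_s; the sketch below does so and derives a gauge factor from two (illustrative, made-up) strain states.

```python
# Sketch: solve the van der Pauw equation for sheet resistance from two
# four-probe resistance readings, then form a gauge factor. Readings are
# illustrative numbers, not the paper's data.
import numpy as np
from scipy.optimize import brentq

def sheet_resistance(r_a, r_b):
    """Solve exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1 for R_s."""
    f = lambda rs: np.exp(-np.pi * r_a / rs) + np.exp(-np.pi * r_b / rs) - 1.0
    hi = 10.0 * np.pi * max(r_a, r_b)   # bracket: f < 0 near 0, f > 0 at hi
    return brentq(f, 1e-6, hi)

r0 = sheet_resistance(120.0, 135.0)      # unstrained film (ohm/sq)
r1 = sheet_resistance(126.0, 141.5)      # strained film
strain = 0.01
gauge_factor = (r1 - r0) / (r0 * strain)  # relative resistance change / strain
print(f"R_s = {r0:.1f} ohm/sq, gauge factor = {gauge_factor:.1f}")
```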
The current role of on-line extraction approaches in clinical and forensic toxicology.
Mueller, Daniel M
2014-08-01
In today's clinical and forensic toxicological laboratories, automation is of interest because of its ability to optimize processes, reduce manual workload and handling errors, and minimize exposure to potentially infectious samples. Extraction is usually the most time-consuming step; therefore, automation of this step is reasonable. To date, methods from the field of clinical and forensic toxicology using the following on-line extraction techniques have been published: on-line solid-phase extraction, turbulent flow chromatography, solid-phase microextraction, microextraction by packed sorbent, single-drop microextraction and on-line desorption of dried blood spots. Most of these published methods are either single-analyte or multicomponent procedures; methods intended for systematic toxicological analysis are relatively scarce. However, the use of on-line extraction will certainly increase in the near future.
Königs, Marsh; Beurskens, Eva A; Snoep, Lian; Scherder, Erik J; Oosterlaan, Jaap
2018-06-01
To systematically review evidence on the effects of timing and intensity of neurorehabilitation on the functional recovery of patients with moderate to severe traumatic brain injury (TBI) and aggregate the available evidence using meta-analytic methods. PubMed, Embase, PsycINFO, and Cochrane Database. Electronic databases were searched for prospective controlled clinical trials assessing the effect of timing or intensity of multidisciplinary neurorehabilitation programs on functional outcome of patients with moderate or severe TBI. A total of 5961 unique records were screened for relevance, of which 58 full-text articles were assessed for eligibility by 2 independent authors. Eleven articles were included for systematic review and meta-analysis. Two independent authors performed data extraction and risk of bias analysis using the Cochrane Collaboration tool. Discrepancies between authors were resolved by consensus. Systematic review of a total of 6 randomized controlled trials, 1 quasi-randomized trial, and 4 controlled trials revealed consistent evidence for a beneficial effect of early onset neurorehabilitation in the trauma center and intensive neurorehabilitation in the rehabilitation facility on functional outcome compared with usual care. Meta-analytic quantification revealed a large-sized positive effect for early onset rehabilitation programs (d=1.02; P<.001; 95% confidence interval [CI], 0.56-1.47) and a medium-sized positive effect for intensive neurorehabilitation programs (d=.67; P<.001; 95% CI, .38-.97) compared with usual care. These effects were replicated based solely on studies with a low overall risk of bias. The available evidence indicates that early onset neurorehabilitation in the trauma center and more intensive neurorehabilitation in the rehabilitation facility promote functional recovery of patients with moderate to severe TBI compared with usual care. These findings support the integration of early onset and more intensive neurorehabilitation in the chain of care for patients with TBI. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
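The pooled effect sizes quoted above come from standard inverse-variance pooling of standardized mean differences; a minimal sketch follows, with illustrative d values and standard errors rather than the review's extracted data.

```python
# Sketch: fixed-effect inverse-variance pooling of Cohen's d values, the
# basic operation behind meta-analytic quantification. Numbers are
# illustrative only.
import numpy as np

d  = np.array([1.20, 0.85, 1.05])   # per-study effect sizes
se = np.array([0.30, 0.25, 0.40])   # per-study standard errors

w = 1.0 / se**2                      # inverse-variance weights
d_pooled = np.sum(w * d) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
lo, hi = d_pooled - 1.96 * se_pooled, d_pooled + 1.96 * se_pooled
print(f"pooled d = {d_pooled:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```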
Otsuki, Michio; Matsukawa, Hiroshi
2013-01-01
In many sliding systems consisting of a solid object on a solid substrate under dry conditions, the friction force does not depend on the apparent contact area and is proportional to the loading force. This behaviour is called Amontons' law and indicates that the friction coefficient, or the ratio of the friction force to the loading force, is constant. Here, however, using numerical and analytical methods, we show that Amontons' law breaks down systematically under certain conditions for an elastic object experiencing a friction force that locally obeys Amontons' law. The macroscopic static friction coefficient, which corresponds to the onset of bulk sliding of the object, decreases as pressure or system length increases. This decrease results from precursor slips before the onset of bulk sliding, and is consistent with the results of certain previous experiments. The mechanisms for these behaviours are clarified. These results will provide new insight into controlling friction. PMID:23545778
Crown, William; Buyukkaramikli, Nasuh; Thokala, Praveen; Morton, Alec; Sir, Mustafa Y; Marshall, Deborah A; Tosh, Jon; Padula, William V; Ijzerman, Maarten J; Wong, Peter K; Pasupathy, Kalyan S
2017-03-01
Providing health services with the greatest possible value to patients and society given the constraints imposed by patient characteristics, health care system characteristics, budgets, and so forth relies heavily on the design of structures and processes. Such problems are complex and require a rigorous and systematic approach to identify the best solution. Constrained optimization is a set of methods designed to identify efficiently and systematically the best solution (the optimal solution) to a problem characterized by a number of potential solutions in the presence of identified constraints. This report identifies 1) key concepts and the main steps in building an optimization model; 2) the types of problems for which optimal solutions can be determined in real-world health applications; and 3) the appropriate optimization methods for these problems. We first present a simple graphical model based on the treatment of "regular" and "severe" patients, which maximizes the overall health benefit subject to time and budget constraints. We then relate it back to how optimization is relevant in health services research for addressing present day challenges. We also explain how these mathematical optimization methods relate to simulation methods, to standard health economic analysis techniques, and to the emergent fields of analytics and machine learning. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
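The "regular vs. severe patient" model described above maps naturally onto a small linear program; the sketch below uses hypothetical coefficients throughout (benefits, times, costs, and resource limits are invented for illustration).

```python
# Sketch: choose how many regular/severe patients to treat to maximize total
# health benefit under time and budget constraints. All numbers hypothetical.
from scipy.optimize import linprog

benefit = [2.0, 5.0]        # health gain per regular / severe patient
time_hr = [1.0, 4.0]        # clinician hours per patient
cost    = [100.0, 600.0]    # cost per patient

res = linprog(
    c=[-b for b in benefit],          # linprog minimizes, so negate benefit
    A_ub=[time_hr, cost],
    b_ub=[400.0, 40000.0],            # available hours and budget
    bounds=[(0, None), (0, None)],    # treat a non-negative number of each
)
n_regular, n_severe = res.x
print(f"treat {n_regular:.0f} regular and {n_severe:.0f} severe patients; "
      f"total benefit = {-res.fun:.0f}")
```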
ERIC Educational Resources Information Center
Giacumo, Lisa A.; Breman, Jeroen
2016-01-01
This article provides a systematic literature review about nonprofit and for-profit organizations using "big data" to inform performance improvement initiatives. The review of literature resulted in 4 peer-reviewed articles and an additional 33 studies covering the topic for these contexts. The review found that big data and analytics…
Perceiving Learning at a Glance: A Systematic Literature Review of Learning Dashboard Research
ERIC Educational Resources Information Center
Schwendimann, Beat A.; Rodriguez-Triana, Maria Jesus; Vozniuk, Andrii; Prieto, Luis P.; Boroujeni, Mina Shirvani; Holzer, Adrian; Gillet, Denis; Dillenbourg, Pierre
2017-01-01
This paper presents a systematic literature review of the state-of-the-art of research on learning dashboards in the fields of Learning Analytics and Educational Data Mining. Research on learning dashboards aims to identify what data is meaningful to different stakeholders and how data can be presented to support sense-making processes. Learning…
ERIC Educational Resources Information Center
Andrade, Alejandro; Delandshere, Ginette; Danish, Joshua A.
2016-01-01
One of the challenges many learning scientists face is the laborious task of coding large amounts of video data and consistently identifying social actions, which is time consuming and difficult to accomplish in a systematic and consistent manner. It is easier to catalog observable behaviours (e.g., body motions or gaze) without explicitly…
A Framework for Integrating Environmental Justice in Regulatory Analysis
Nweke, Onyemaechi C.
2011-01-01
With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235
ERIC Educational Resources Information Center
Moissa, Barbara; Gasparini, Isabela; Kemczinski, Avanilde
2015-01-01
Learning Analytics (LA) is a field that aims to optimize learning through the study of dynamical processes occurring in the students' context. It covers the measurement, collection, analysis and reporting of data about students and their contexts. This study aims at surveying existing research on LA to identify approaches, topics, and needs for…
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET), by simulated service testing of pre-cracked panels.
Ho, Sirikit; Lukacs, Zoltan; Hoffmann, Georg F; Lindner, Martin; Wetter, Thomas
2007-07-01
In newborn screening with tandem mass spectrometry, multiple intermediary metabolites are quantified in a single analytical run for the diagnosis of fatty-acid oxidation disorders, organic acidurias, and aminoacidurias. Published diagnostic criteria for these disorders normally incorporate a primary metabolic marker combined with secondary markers, often analyte ratios, for which the markers have been chosen to reflect metabolic pathway deviations. We applied a procedure to extract new markers and diagnostic criteria for newborn screening to the data of newborns with confirmed medium-chain acyl-CoA dehydrogenase deficiency (MCADD) and a control group from the newborn screening program, Heidelberg, Germany. We validated the results with external data of the screening center in Hamburg, Germany. We extracted new markers by performing a systematic search for analyte combinations (features) with high discriminatory performance for MCADD. To select feature thresholds, we applied automated procedures to separate controls and cases on the basis of the feature values. Finally, we built classifiers from these new markers to serve as diagnostic criteria in screening for MCADD. On the basis of χ2 scores, we identified approximately 800 of >628,000 new analyte combinations with superior discriminatory performance compared with the best published combinations. Classifiers built with the new features achieved diagnostic sensitivities and specificities approaching 100%. Feature construction methods provide ways to disclose information hidden in the set of measured analytes. Other diagnostic tasks based on high-dimensional metabolic data might also profit from this approach.
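The core feature-construction idea, scoring candidate analyte ratios by a chi-square statistic, can be sketched as below. The analyte values are simulated and the feature set is a toy subset; C8 elevation is the classic MCADD marker, but the distributions here are invented.

```python
# Sketch: score candidate analyte-ratio features by chi-square statistic to
# find combinations that separate cases from controls. Simulated data.
import numpy as np
from sklearn.feature_selection import chi2

rng = np.random.default_rng(2)
n_ctrl, n_case = 200, 20
c8  = np.concatenate([rng.gamma(2.0, 0.05, n_ctrl), rng.gamma(2.0, 0.5, n_case)])
c10 = np.concatenate([rng.gamma(2.0, 0.05, n_ctrl), rng.gamma(2.0, 0.1, n_case)])
c2  = rng.gamma(2.0, 5.0, n_ctrl + n_case)
y   = np.array([0] * n_ctrl + [1] * n_case)

# Candidate features: raw analytes plus constructed ratios (all non-negative,
# as the chi-square scorer requires).
X = np.column_stack([c8, c10, c2, c8 / c2, c8 / (c10 + 1e-9)])
names = ["C8", "C10", "C2", "C8/C2", "C8/C10"]

scores, _ = chi2(X, y)
for name, s in sorted(zip(names, scores), key=lambda t: -t[1]):
    print(f"{name}: chi2 = {s:.1f}")
```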
LaKind, Judy S; Anthony, Laura G; Goodman, Michael
2017-01-01
Environmental epidemiology data are becoming increasingly important in public health decision making, which commonly incorporates a systematic review of multiple studies. This review addresses two fundamental questions: What is the quality of available reviews on associations between exposure to synthetic organic chemicals and neurodevelopmental outcomes? What is the value (e.g., quality and consistency) of the underlying literature? Published reviews on associations between synthetic organic environmental chemical exposures and neurodevelopmental outcomes in children were systematically evaluated. Seventy-four relevant reviews were identified, and these were evaluated with respect to four methodological characteristics: (1) systematic inclusion/exclusion criteria and reproducible methods for search and retrieval of studies; (2) structured evaluation of underlying data quality; (3) systematic assessment of consistency across specific exposure-outcome associations; and (4) evaluation of reporting/publication bias. None of the 74 reviews fully met the criteria for all four methodological characteristics. Only four reviews met two criteria, and six reviews fulfilled only one criterion. Perhaps more importantly, the higher quality reviews were not able to meet all of the criteria owing to the shortcomings of underlying studies, which lacked comparability in terms of specific research question of interest, overall design, exposure assessment, outcome ascertainment, and analytic methods. Thus, even the most thoughtful and rigorous review may be of limited value if the underlying literature includes investigations that address different hypotheses and are beset by methodological inconsistencies and limitations. Issues identified in this review of reviews illustrate considerable challenges that are facing assessments of epidemiological evidence.
A Lean Six Sigma approach to the improvement of the selenium analysis method.
Cloete, Bronwyn C; Bester, André
2012-11-02
Reliable results represent the pinnacle of quality assessment for an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive, deductive and systematic, relying on data and facts. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed and Certified Reference Material (CRM) was included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline and a standardised approach to problem solving and process optimisation.
Wang, Yuanze; van Oosterwijk, Niels; Ali, Ameena M; Adawy, Alaa; Anindya, Atsarina L; Dömling, Alexander S S; Groves, Matthew R
2017-08-24
Refolding of proteins derived from inclusion bodies is very promising as it can provide a reliable source of target proteins of high purity. However, inclusion body-based protein production is often limited by the lack of techniques for the detection of correctly refolded protein. Thus, the selection of the refolding conditions is mostly achieved using trial and error approaches and is thus a time-consuming process. In this study, we use the latest developments in the differential scanning fluorimetry guided refolding approach as an analytical method to detect correctly refolded protein. We describe a systematic buffer screen that contains a 96-well primary pH-refolding screen in conjunction with a secondary additive screen. Our research demonstrates that this approach could be applied for determining refolding conditions for several proteins. In addition, it revealed which "helper" molecules, such as arginine and additives are essential. Four different proteins: HA-RBD, MDM2, IL-17A and PD-L1 were used to validate our refolding approach. Our systematic protocol evaluates the impact of the "helper" molecules, the pH, buffer system and time on the protein refolding process in a high-throughput fashion. Finally, we demonstrate that refolding time and a secondary thermal shift assay buffer screen are critical factors for improving refolding efficiency.
Wang, Huazi; Hu, Lu; Liu, Xinya; Yin, Shujun; Lu, Runhua; Zhang, Sanbing; Zhou, Wenfeng; Gao, Haixiang
2017-09-22
In the present study, a simple and rapid sample preparation method designated ultrasound-assisted dispersive liquid-liquid microextraction based on a deep eutectic solvent (DES) followed by high-performance liquid chromatography with ultraviolet (UV) detection (HPLC-UVD) was developed for the extraction and determination of UV filters from water samples. The model analytes were 2,4-dihydroxybenzophenone (BP-1), benzophenone (BP) and 2-hydroxy-4-methoxybenzophenone (BP-3). The hydrophobic DES was prepared by mixing trioctylmethylammonium chloride (TAC) and decanoic acid (DecA). Various influencing factors (selection of the extractant, amount of DES, ultrasound duration, salt addition, sample volume, sample pH, centrifuge rate and duration) on UV filter recovery were systematically investigated. Under optimal conditions, the proposed method provided good recoveries in the range of 90.2-103.5% and relative standard deviations (inter-day and intra-day precision, n=5) below 5.9%. The enrichment factors for the analytes ranged from 67 to 76. The limits of detection varied from 0.15 to 0.30 ng/mL, depending on the analytes. The linearities were between 0.5 and 500 ng/mL for BP-1 and BP and between 1 and 500 ng/mL for BP-3, with coefficients of determination greater than 0.99. Finally, the proposed method was applied to the determination of UV filters in swimming pool and river water samples, and acceptable relative recoveries ranging from 82.1 to 106.5% were obtained. Copyright © 2017. Published by Elsevier B.V.
Nanoscaled aptasensors for multi-analyte sensing
Saberian-Borujeni, Mehdi; Johari-Ahar, Mohammad; Hamzeiy, Hossein; Barar, Jaleh; Omidi, Yadollah
2014-01-01
Introduction: Nanoscaled aptamers (Aps), as short single-stranded DNA or RNA oligonucleotides, are able to bind to their specific targets with high affinity, and are therefore considered powerful diagnostic and analytical sensing tools (the so-called "aptasensors"). Aptamers are selected from a random pool of oligonucleotides through a procedure known as "systematic evolution of ligands by exponential enrichment". Methods: In this work, the most recent studies in the field of aptasensors are reviewed and discussed, with a main focus on the potential of aptasensors for multi-analyte detection. Results: Owing to the specific folding capability of aptamers in the presence of an analyte, aptasensors have been exploited with substantial success for the detection of a wide range of small and large molecules (e.g., drugs and their metabolites, toxins, and associated biomarkers in various diseases) at very low concentrations in biological fluids/samples, even in the presence of interfering species. Conclusion: Real biological samples are generally complex media. Hence, the development of aptasensors with the capability to determine various targets simultaneously within a biological matrix remains the main challenge. To this end, integration of key scientific domains such as bioengineering and systems biology with biomedical research is inevitable. PMID:25671177
López-Guerra, Enrique A
2017-01-01
We explore the contact problem of a flat-end indenter penetrating intermittently a generalized viscoelastic surface, containing multiple characteristic times. This problem is especially relevant for nanoprobing of viscoelastic surfaces with the highly popular tapping-mode AFM imaging technique. By focusing on the material perspective and employing a rigorous rheological approach, we deliver analytical closed-form solutions that provide physical insight into the viscoelastic sources of repulsive forces, tip–sample dissipation and virial of the interaction. We also offer a systematic comparison to the well-established standard harmonic excitation, which is the case relevant for dynamic mechanical analysis (DMA) and for AFM techniques where tip–sample sinusoidal interaction is permanent. This comparison highlights the substantial complexity added by the intermittent-contact nature of the interaction, which precludes the derivation of straightforward equations as is the case for the well-known harmonic excitations. The derivations offered have been thoroughly validated through numerical simulations. Despite the complexities inherent to the intermittent-contact nature of the technique, the analytical findings highlight the potential feasibility of extracting meaningful viscoelastic properties with this imaging method. PMID:29114450
NASA Astrophysics Data System (ADS)
Tao, Zhang; Wei, Wang; Wang, Gary Z.; Luo, Hongyan; Liang, Cunzhu; Liu, Jiahui; An, Huijun; Pei, Hao; Zhong, Huidong; Chen, Xiaojun
2006-08-01
AHP is a highly effective systematic analytical method that has been widely applied to energy utilization, resource analysis, talent forecasting, economic management, urban industrial planning, communications and transportation, water resource use, and so on; applying it to ecological problems is also highly practical and valid. Taking 15 endangered plant species in East Alashan-West Erdos as the research objects, this paper uses 3S techniques and field investigations to determine the geographical distributions, densities within the distribution areas, plant community structure, and plant community environment of the plants. Experts were then invited to assign scores based on these data, and the AHP method was used to process the results, thereby obtaining the priority order for protection of the endangered plants in East Alashan-West Erdos.
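The core AHP calculation, deriving a priority vector from a pairwise comparison matrix via the principal eigenvector and checking consistency, can be sketched as follows; the 3 x 3 comparison values are illustrative, not the study's data.

```python
# Sketch: AHP priority weights from a pairwise comparison matrix (Saaty
# scale), plus the consistency ratio. Toy 3x3 example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)               # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)      # consistency index
ri = 0.58                                 # Saaty's random index for n = 3
print("weights:", np.round(w, 3), " CR =", round(ci / ri, 3))
```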
Laborde-Castérot, Hervé; Agrinier, Nelly; Thilly, Nathalie
2015-10-01
Propensity score (PS) and instrumental variable (IV) are analytical techniques used to adjust for confounding in observational research. Increasingly, they are used simultaneously in studies evaluating health interventions. The present review aimed to analyze the agreement between PS and IV results in medical research published to date. Review of all published observational studies that evaluated a clinical intervention using both PS and IV analyses, as identified in MEDLINE and Web of Science. Thirty-seven studies, most of them published during the previous 5 years, reported 55 comparisons between results from PS and IV analyses. There was slight/fair agreement between the methods [Cohen's kappa coefficient = 0.21 (95% confidence interval: 0.00, 0.41)]. In 23 cases (42%), results were nonsignificant for one method and significant for the other, and the IV analysis results were nonsignificant in most of these situations (87%). Discrepancies between PS and IV analyses are frequent and can be interpreted in various ways. This suggests that researchers should carefully consider their analytical choices, and readers should be cautious when interpreting results, until further studies clarify the respective roles of the two methods in observational comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.
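The kappa statistic reported above measures chance-corrected agreement between paired categorical conclusions; a minimal sketch follows, with made-up paired labels (significant benefit / nonsignificant / significant harm) for illustration.

```python
# Sketch: Cohen's kappa for agreement between PS-based and IV-based
# conclusions. The paired labels below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

ps_results = ["benefit", "benefit", "ns", "harm", "ns", "benefit", "ns", "ns"]
iv_results = ["benefit", "ns",      "ns", "ns",   "ns", "ns",      "ns", "harm"]

kappa = cohen_kappa_score(ps_results, iv_results)
print(f"Cohen's kappa = {kappa:.2f}")   # ~0.2 would indicate slight/fair agreement
```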
Photoplasma of optically excited metal vapors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bezuglov, N.N.; Llyucharev, A.N.; Stacewicz, T.
1994-09-01
A wide range of questions associated with various aspects of photoplasma physics is considered. A comprehensive analysis of processes of optical excitation and de-excitation depending on the optical characteristics of an absorbing gas medium is given. Analytical methods used for determining the excitation degree of photoresonance plasma in conditions of resonance radiation transfer are described. The accuracy of the Biberman approximation for effective lifetimes in the population kinetics of resonance plasma states is analyzed for many experimental conditions. A detailed discussion of primary ionization mechanisms in photoplasma is given; the kinetics of ionization processes is discussed; and a systematization of the various types of photoresonance plasma is presented. Basic aspects of the LIBORS model, which is widely used for studying the ionization kinetics of laser photoresonance plasma, and its limitations are considered. An ingenious method used to analytically solve a class of decay-type nonlinear problems, which arise for the capture equation in the case of noticeable saturation of a resonance transition by a short laser pulse, is described. A reliable quantitative description of fluorescence decay curve peculiarities associated with the bleaching of gases at resonance line frequencies can be obtained by this method. Some possible applications of photoplasma in problems of optics and spectroscopy are considered. 75 refs., 24 figs., 1 tab.
High temperature polymer degradation: Rapid IR flow-through method for volatile quantification
Giron, Nicholas H.; Celina, Mathew C.
2017-05-19
Accelerated aging of polymers at elevated temperatures often involves the generation of volatiles. These can be formed as the products of oxidative degradation reactions or of intrinsic pyrolytic decomposition as part of polymer scission reactions. A simple analytical method for the quantification of water, CO2, and CO as fundamental signatures of degradation kinetics is required. Here, we describe an analytical framework and develop a rapid mid-IR-based gas analysis methodology to quantify volatiles that are contained in small ampoules after aging exposures. The approach requires identification of unique spectral signatures, systematic calibration with known concentrations of volatiles, and a rapid-acquisition FTIR spectrometer for time-resolved successive spectra. The volatiles are flushed out from the ampoule with dry N2 carrier gas and are then quantified through spectral and time integration. This method is sufficiently sensitive to determine absolute yields of ~50 μg water or CO2, which corresponds to probing mass losses of less than 0.01% for a 1 g sample, i.e. the early stages of the degradation process. Such quantitative gas analysis is not easily achieved with other approaches. Our approach opens up the possibility of quantitative monitoring of volatile evolution as an avenue to explore polymer degradation kinetics and its dependence on time and temperature.
Lai, Samuel Kin-Man; Cheng, Yu-Hong; Tang, Ho-Wai; Ng, Kwan-Ming
2017-08-09
Systematically controlling heat transfer in the surface-assisted laser desorption/ionization (SALDI) process, and thus enhancing the analytical performance of SALDI-MS, remains a challenging task. In the current study, by tuning the metal contents of Ag-Au alloy nanoparticle substrates (AgNPs, Ag55Au45NPs, Ag15Au85NPs and AuNPs, ∅: ∼2.0 nm), it was found that both SALDI ion-desorption efficiency and heat transfer can be controlled over a wide range of laser fluence (21.3 to 125.9 mJ cm−2). Ion detection sensitivity could be enhanced at any laser fluence by tuning up the Ag content of the alloy nanoparticles, whereas the extent of ion fragmentation could be reduced by tuning up the Au content. The enhancement effect of Ag content on ion desorption was attributable to the increase in laser absorption efficiency (at 355 nm) with Ag content. Tuning the laser absorption efficiency by changing the metal composition was also effective in controlling the heat transfer from the NPs to the analytes. The laser-induced heating of Ag-rich alloy NPs could be balanced or even overridden by increasing the Au content of the NPs, resulting in reduced fragmentation of analytes. By correlating experimental measurements with molecular dynamics simulations, the effect of metal composition on the dynamics of the ion-desorption process was also elucidated. It was further found that increasing the Ag content reduced the phase transition temperatures of the NPs, such as the melting, vaporization and phase explosion temperatures. This further enhanced the desorption of analyte ions via phase-transition-driven desorption processes. The significant cooling effect on the analyte ions observed at high laser fluence was also determined to originate from the phase explosion of the NPs. This study revealed that the development of alloy nanoparticles as SALDI substrates can constitute an effective means for the systematic control of ion-desorption efficiency and the extent of heat transfer, which could potentially enhance the analytical performance of SALDI-MS.
Adkar, Prafulla P.; Bhaskar, V. H.
2014-01-01
Pandanus odoratissimus Linn. (family: Pandanaceae) is traditionally recommended in Indian Ayurvedic medicine for the treatment of headache, rheumatism, spasm, cold/flu, epilepsy, wounds, boils, scabies, leucoderma, ulcers, colic, hepatitis, smallpox, leprosy, syphilis, and cancer, and as a cardiotonic, antioxidant, dysuric, and aphrodisiac. It contains phytochemicals, namely lignans and isoflavones, coumestrol, alkaloids, steroids, carbohydrates, phenolic compounds, glycosides, proteins, and amino acids, as well as vitamins and nutrients, and it is of immense nutritional importance. A 100 g portion of edible Pandanus pericarp comprises mainly water and carbohydrates (80 and 17 g, respectively), along with protein (1.3 mg), fat (0.7 mg), and fiber (3.5 g). Pandanus fruit paste provides 321 kilocalories, protein (2.2 g), calcium (134 mg), phosphorus (108 mg), iron (5.7 mg), thiamin (0.04 mg), vitamin C (5 mg), and beta-carotene (19 to 19,000 μg) (a carotenoid that is a precursor to vitamin A). Pandanus fruit is an important source of vitamins C, B1, B2, and B3, and is usually prepared as a floured Pandanus drink. Traditional claims have been scientifically evaluated by various authors, and the phytochemical profile of the plant parts is well established. Methods for analytical estimation have been developed. However, there is a paucity of systematically compiled, scientifically important information about this plant. In the present review we systematically review and compile information on the pharmacognosy, ethnopharmacology, phytochemistry, pharmacology, nutritional aspects, and analytical methods for this plant. This review will enrich knowledge, leading the way to the discovery of new therapeutic agents with improved and intriguing pharmacological properties. PMID:25949238
Schwartz, Matthias; Meyer, Björn; Wirnitzer, Bernhard; Hopf, Carsten
2015-03-01
Conventional mass spectrometry image preprocessing methods used for denoising, such as the Savitzky-Golay smoothing or discrete wavelet transformation, typically do not only remove noise but also weak signals. Recently, memory-efficient principal component analysis (PCA) in conjunction with random projections (RP) has been proposed for reversible compression and analysis of large mass spectrometry imaging datasets. It considers single-pixel spectra in their local context and consequently offers the prospect of using information from the spectra of adjacent pixels for denoising or signal enhancement. However, little systematic analysis of key RP-PCA parameters has been reported so far, and the utility and validity of this method for context-dependent enhancement of known medically or pharmacologically relevant weak analyte signals in linear-mode matrix-assisted laser desorption/ionization (MALDI) mass spectra has not been explored yet. Here, we investigate MALDI imaging datasets from mouse models of Alzheimer's disease and gastric cancer to systematically assess the importance of selecting the right number of random projections k and of principal components (PCs) L for reconstructing reproducibly denoised images after compression. We provide detailed quantitative data for comparison of RP-PCA-denoising with the Savitzky-Golay and wavelet-based denoising in these mouse models as a resource for the mass spectrometry imaging community. Most importantly, we demonstrate that RP-PCA preprocessing can enhance signals of low-intensity amyloid-β peptide isoforms such as Aβ1-26 even in sparsely distributed Alzheimer's β-amyloid plaques and that it enables enhanced imaging of multiply acetylated histone H4 isoforms in response to pharmacological histone deacetylase inhibition in vivo. We conclude that RP-PCA denoising may be a useful preprocessing step in biomarker discovery workflows.
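The RP-PCA idea discussed above, random-projection compression followed by PCA denoising of single-pixel spectra, can be sketched as below. Matrix sizes and the choices of k and L are arbitrary illustrative values, and the synthetic data stand in for real MALDI imaging spectra.

```python
# Sketch: random-projection (RP) compression followed by PCA denoising of a
# matrix of spectra (rows = pixels, columns = m/z bins). Synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.random_projection import GaussianRandomProjection

rng = np.random.default_rng(3)
X = rng.poisson(5.0, size=(500, 5000)).astype(float)   # synthetic spectra

k, L = 200, 15
rp = GaussianRandomProjection(n_components=k, random_state=0)
Z = rp.fit_transform(X)                  # memory-efficient compression

pca = PCA(n_components=L)
Z_denoised = pca.inverse_transform(pca.fit_transform(Z))  # keep L components

# Map back to m/z space through the pseudo-inverse of the projection.
X_denoised = Z_denoised @ np.linalg.pinv(rp.components_.T)
print(X_denoised.shape)
```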
Helfer, Bartosz; Prosser, Aaron; Samara, Myrto T; Geddes, John R; Cipriani, Andrea; Davis, John M; Mavridis, Dimitris; Salanti, Georgia; Leucht, Stefan
2015-04-14
As the number of systematic reviews is growing rapidly, we systematically investigate whether meta-analyses published in leading medical journals present an outline of available evidence by referring to previous meta-analyses and systematic reviews. We searched PubMed for recent meta-analyses of pharmacological treatments published in high impact factor journals. Previous systematic reviews and meta-analyses were identified with electronic searches of keywords and by searching reference sections. We analyzed the number of meta-analyses and systematic reviews that were cited, described and discussed in each recent meta-analysis. Moreover, we investigated publication characteristics that potentially influence the referencing practices. We identified 52 recent meta-analyses and 242 previous meta-analyses on the same topics. Of these, 66% of identified previous meta-analyses were cited, 36% described, and only 20% discussed by recent meta-analyses. The probability of citing a previous meta-analysis was positively associated with its publication in a journal with a higher impact factor (odds ratio, 1.49; 95% confidence interval, 1.06 to 2.10) and more recent publication year (odds ratio, 1.19; 95% confidence interval 1.03 to 1.37). Additionally, the probability of a previous study being described by the recent meta-analysis was inversely associated with the concordance of results (odds ratio, 0.38; 95% confidence interval, 0.17 to 0.88), and the probability of being discussed was increased for previous studies that employed meta-analytic methods (odds ratio, 32.36; 95% confidence interval, 2.00 to 522.85). Meta-analyses on pharmacological treatments do not consistently refer to and discuss findings of previous meta-analyses on the same topic. Such neglect can lead to research waste and be confusing for readers. Journals should make the discussion of related meta-analyses mandatory.
Electrocardiographic interpretation skills of cardiology residents: are they competent?
Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C
2014-12-01
Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 Cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348) including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among Cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
Wright, Stuart J; Vass, Caroline M; Sim, Gene; Burton, Michael; Fiebig, Denzil G; Payne, Katherine
2018-02-28
Scale heterogeneity, or differences in the error variance of choices, may account for a significant amount of the observed variation in the results of discrete choice experiments (DCEs) when comparing preferences between different groups of respondents. The aim of this study was to identify if, and how, scale heterogeneity has been addressed in healthcare DCEs that compare the preferences of different groups. A systematic review identified all healthcare DCEs published between 1990 and February 2016. The full-text of each DCE was then screened to identify studies that compared preferences using data generated from multiple groups. Data were extracted and tabulated on year of publication, samples compared, tests for scale heterogeneity, and analytical methods to account for scale heterogeneity. Narrative analysis was used to describe if, and how, scale heterogeneity was accounted for when preferences were compared. A total of 626 healthcare DCEs were identified. Of these 199 (32%) aimed to compare the preferences of different groups specified at the design stage, while 79 (13%) compared the preferences of groups identified at the analysis stage. Of the 278 included papers, 49 (18%) discussed potential scale issues, 18 (7%) used a formal method of analysis to account for scale between groups, and 2 (1%) accounted for scale differences between preference groups at the analysis stage. Scale heterogeneity was present in 65% (n = 13) of studies that tested for it. Analytical methods to test for scale heterogeneity included coefficient plots (n = 5, 2%), heteroscedastic conditional logit models (n = 6, 2%), Swait and Louviere tests (n = 4, 1%), generalised multinomial logit models (n = 5, 2%), and scale-adjusted latent class analysis (n = 2, 1%). Scale heterogeneity is a prevalent issue in healthcare DCEs. Despite this, few published DCEs have discussed such issues, and fewer still have used formal methods to identify and account for the impact of scale heterogeneity. The use of formal methods to test for scale heterogeneity should be used, otherwise the results of DCEs potentially risk producing biased and potentially misleading conclusions regarding preferences for aspects of healthcare.
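A small simulation makes the underlying problem concrete: two groups with identical preference weights but different error variances yield group-wise logit coefficients that differ by roughly the scale factor. This is a generic illustration of scale heterogeneity, not any of the reviewed analyses.

```python
# Sketch: identical preferences, different error variance -> naive group-wise
# logit fits recover scaled coefficients. Purely illustrative simulation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
beta = np.array([1.0, -0.5])                 # true attribute weights, both groups

def simulate_choices(scale, n=5000):
    dx = rng.normal(size=(n, 2))             # attribute differences, option A - B
    # Logit choice: the scale factor multiplies utility, i.e. shrinks the error.
    p = 1.0 / (1.0 + np.exp(-scale * dx @ beta))
    return dx, (rng.random(n) < p).astype(int)

for name, scale in [("group A", 1.0), ("group B", 0.5)]:
    dx, y = simulate_choices(scale)
    fit = LogisticRegression(fit_intercept=False).fit(dx, y)
    print(name, np.round(fit.coef_[0], 2))
# Identical preferences look different unless scale is modelled explicitly.
```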
Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.
2003-01-01
A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training. PMID:14506506
ERIC Educational Resources Information Center
Shakeel, M. Danish; Anderson, Kaitlin P.; Wolf, Patrick J.
2016-01-01
The objective of this meta-analysis is to rigorously assess the participant effects of private school vouchers, or in other words, to estimate the average academic impacts that the offer (or use) of a voucher has on a student. This review adds to the literature by being the first to systematically review all Randomized Control Trials (RCTs) in an…
Imai, Chisato; Hashizume, Masahiro
2015-01-01
Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches to assess impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), with conventionally generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases despite the substantial differences from non-infectious diseases that may result in analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of susceptible population and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit of outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantifications of exposure factors. Future studies should pay careful attention to details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149
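As a concrete illustration of the GLM time-series approach discussed in this review, the hedged Python sketch below regresses simulated daily case counts on lagged temperature with harmonic seasonal adjustment. The data, the 14-day lag, and the Poisson family are illustrative assumptions; real infectious-disease series need the extra care (autocorrelation, susceptible pool) the review highlights.

```python
# Toy GLM time-series regression: simulated counts vs. lagged temperature.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
t = np.arange(365 * 3)
temp = 20 + 8 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, t.size)
lag = 14                                  # assumed exposure lag (days)
temp_lag = np.roll(temp, lag)
mu = np.exp(1.5 + 0.03 * temp_lag + 0.3 * np.sin(2 * np.pi * t / 365))
y = rng.poisson(mu)

X = sm.add_constant(np.column_stack([
    temp_lag,
    np.sin(2 * np.pi * t / 365), np.cos(2 * np.pi * t / 365),  # seasonality
]))
# drop the wrapped-around first `lag` days introduced by np.roll
fit = sm.GLM(y[lag:], X[lag:], family=sm.families.Poisson()).fit()
print(fit.params)   # coefficient on lagged temperature
```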
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique.
Riccò, R; Nizzero, S; Penna, E; Meneghello, A; Cretaio, E; Enrichi, F
2018-01-01
In modern biosensing and imaging, fluorescence-based methods constitute the most diffused approach to achieve optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents to date for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially enhance the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating reaction parameters, controllable size tuning of particle diameters as low as 10 nm was achieved. Particle morphology and optical response were evaluated, showing possible single-molecule behaviour, without employing microemulsion methods to achieve similar results. Graphical abstract: We report a simple, cheap, reliable protocol for the synthesis and systematic tuning of ultra-small (< 10 nm) dye-doped luminescent silica nanoparticles.
Wagner, Kay Cimpl; Byrd, Gary D.
2004-01-01
Objective: This study was undertaken to determine if a systematic review of the evidence from thirty years of literature evaluating clinical medical librarian (CML) programs could help clarify the effectiveness of this outreach service model. Methods: A descriptive review of the CML literature describes the general characteristics of these services as they have been implemented, primarily in teaching-hospital settings. Comprehensive searches for CML studies using quantitative or qualitative evaluation methods were conducted in the medical, allied health, librarianship, and social sciences literature. Findings: Thirty-five studies published between 1974 and 2001 met the review criteria. Most (30) evaluated single, active programs and used descriptive research methods (e.g., use statistics or surveys/questionnaires). A weighted average of 89% of users in twelve studies found CML services useful and of high quality, and 65% of users in another overlapping, but not identical, twelve studies said these services contributed to improved patient care. Conclusions: The total amount of research evidence for CML program effectiveness is not great and most of it is descriptive rather than comparative or analytically qualitative. Standards are needed to consistently evaluate CML or informationist programs in the future. A carefully structured multiprogram study including three to five of the best current programs is needed to define the true value of these services. PMID:14762460
Guise, Andy; Horyniak, Danielle; Melo, Jason; McNeil, Ryan; Werb, Dan
2017-12-01
Understanding the experience of initiating injection drug use and its social contexts is crucial to inform efforts to prevent transitions into this mode of drug consumption and support harm reduction. We reviewed and synthesized existing qualitative scientific literature systematically to identify the socio-structural contexts for, and experiences of, the initiation of injection drug use. We searched six databases (Medline, Embase, PsychINFO, CINAHL, IBSS and SSCI) systematically, along with a manual search, including key journals and subject experts. Peer-reviewed studies were included if they qualitatively explored experiences of or socio-structural contexts for injection drug use initiation. A thematic synthesis approach was used to identify descriptive and analytical themes throughout studies. From 1731 initial results, 41 studies reporting data from 1996 participants were included. We developed eight descriptive themes and two analytical (higher-order) themes. The first analytical theme focused on injecting initiation resulting from a social process enabled and constrained by socio-structural factors: social networks and individual interactions, socialization into drug-using identities and choices enabled and constrained by social context all combine to produce processes of injection initiation. The second analytical theme addressed pathways that explore varying meanings attached to injection initiation and how they link to social context: seeking pleasure, responses to increasing tolerance to drugs, securing belonging and identity and coping with pain and trauma. Qualitative research shows that injection drug use initiation has varying and distinct meanings for individuals involved and is a dynamic process shaped by social and structural factors. Interventions should therefore respond to the socio-structural influences on injecting drug use initiation by seeking to modify the contexts for initiation, rather than solely prioritizing the reduction of individual harms through behavior change. © 2017 Society for the Study of Addiction.
Meta-analysis of diagnostic accuracy studies in mental health
Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J
2015-01-01
Objectives To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies, such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model, jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between-study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
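For reference, the bivariate model named above is commonly written as follows. This is a standard textbook form with generic notation, not equations copied from the review: study-level sensitivities and specificities follow binomial likelihoods, and their logits share a bivariate normal distribution.

```latex
% Standard bivariate random-effects model for DTA meta-analysis (generic form).
\begin{align*}
  y^{se}_i &\sim \mathrm{Binomial}(n^{se}_i,\ \mathrm{Se}_i), &
  y^{sp}_i &\sim \mathrm{Binomial}(n^{sp}_i,\ \mathrm{Sp}_i),\\
  \begin{pmatrix} \operatorname{logit}\mathrm{Se}_i \\ \operatorname{logit}\mathrm{Sp}_i \end{pmatrix}
  &\sim \mathcal{N}\!\left(
    \begin{pmatrix} \mu_{se} \\ \mu_{sp} \end{pmatrix},
    \begin{pmatrix} \sigma^2_{se} & \rho\,\sigma_{se}\sigma_{sp} \\
                    \rho\,\sigma_{se}\sigma_{sp} & \sigma^2_{sp} \end{pmatrix}\right).
\end{align*}
```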
Prioritizing pesticide compounds for analytical methods development
Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.
2012-01-01
The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included in research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1 compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies.
More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1 by adding as many additional Tier 1 compounds as are analytically compatible. About 35 percent of the Tier 1 compounds for sediment are high priority on the basis of measured occurrence. A total of 74 compounds, or 42 percent, are high priority on the basis of predicted likelihood of occurrence according to physical-chemical properties, and either have potential toxicity to aquatic life, high pesticide usage, or both. The remaining 22 percent of Tier 1 pesticide compounds were either degradates of Tier 1 parent compounds or included for other reasons. As with water, the Tier 1 pesticide compounds for sediment are distributed across the major pesticide-use groups; insecticides and their degradates are the largest fraction, making up 45 percent of Tier 1. In contrast to water, organochlorines, at 17 percent, are the largest chemical class for Tier 1 in sediment, which is to be expected because there is continued widespread detection in sediments of persistent organochlorine pesticides and their degradates at concentrations high enough for potential effects on aquatic life. Compared to water, there are fewer available benchmarks with which to compare contaminant concentrations in sediment, but a total of 19 Tier 1 compounds have at least one sediment benchmark or screening value for aquatic organisms. Of the 175 compounds in Tier 1, 77 percent have high aquatic-life toxicity, as defined for this process. This evaluation of pesticides and degradates resulted in two lists of compounds that are priorities for USGS analytical methods development, one for water and one for sediment. These lists will be used as the basis for redesigning and enhancing USGS analytical capabilities for pesticides in order to capture as many high-priority pesticide compounds as possible using an economically feasible approach.
Noh, Jaesung; Lee, Kun Mo
2003-05-01
A relative significance factor (f(i)) of an impact category is the external weight of the impact category. The objective of this study is to propose a systematic and easy-to-use method for the determination of f(i). Multiattribute decision-making (MADM) methods including the analytical hierarchy process (AHP), the rank-order centroid method, and the fuzzy method were evaluated for this purpose. The results and practical aspects of using the three methods are compared. Each method shows the same trend, with minor differences in the value of f(i). Thus, all three methods can be applied to the determination of f(i). The rank-order centroid method reduces the number of pairwise comparisons by placing the alternatives in order, although it is weaker than the fuzzy method in expressing the degree of vagueness associated with assigning weights to criteria and alternatives. The rank-order centroid method is considered a practical method for the determination of f(i) because it is easier and simpler to use than the AHP and the fuzzy method.
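The rank-order centroid weights mentioned above have a simple closed form: for n criteria ranked from most (i = 1) to least important, w_i = (1/n) * sum over k = i..n of 1/k. A minimal Python sketch (generic formula, not the paper's data):

```python
# Rank-order centroid (ROC) weights for n ranked criteria:
# w_i = (1/n) * sum_{k=i}^{n} 1/k, with i = 1 the most important criterion.
def roc_weights(n: int) -> list[float]:
    return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

print(roc_weights(4))  # [0.5208..., 0.2708..., 0.1458..., 0.0625]; sums to 1
```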
Symmetric airfoil geometry effects on leading edge noise.
Gill, James; Zhang, X; Joseph, P
2013-10-01
Computational aeroacoustic methods are applied to the modeling of noise due to interactions between gusts and the leading edge of real symmetric airfoils. Single frequency harmonic gusts are interacted with various airfoil geometries at zero angle of attack. The effects of airfoil thickness and leading edge radius on noise are investigated systematically and independently for the first time, at higher frequencies than previously used in computational methods. Increases in both leading edge radius and thickness are found to reduce the predicted noise. This noise reduction effect becomes greater with increasing frequency and Mach number. The dominant noise reduction mechanism for airfoils with real geometry is found to be related to the leading edge stagnation region. It is shown that accurate leading edge noise predictions can be made when assuming an inviscid meanflow, but that it is not valid to assume a uniform meanflow. Analytic flat plate predictions are found to over-predict the noise due to a NACA 0002 airfoil by up to 3 dB at high frequencies. The accuracy of analytic flat plate solutions can be expected to decrease with increasing airfoil thickness, leading edge radius, gust frequency, and Mach number.
Meta-Study as Diagnostic: Toward Content Over Form in Qualitative Synthesis.
Frost, Julia; Garside, Ruth; Cooper, Chris; Britten, Nicky
2016-02-01
Having previously conducted qualitative syntheses of the diabetes literature, we wanted to explore the changes in theoretical approaches, methodological practices, and the construction of substantive knowledge which have recently been presented in the qualitative diabetes literature. The aim of this research was to explore the feasibility of synthesizing existing qualitative syntheses of patient perspectives of diabetes using meta-study methodology. A systematic review of qualitative literature, published between 2000 and 2013, was conducted. Six articles were identified as qualitative syntheses. The meta-study methodology was used to compare the theoretical, methodological, analytic, and synthetic processes across the six studies, exploring the potential for an overarching synthesis. We identified that while research questions have increasingly concentrated on specific aspects of diabetes, the focus on systematic review processes has led to the neglect of qualitative theory and methods. This can inhibit the production of compelling results with meaningful clinical applications. Although unable to produce a synthesis of syntheses, we recommend that researchers who conduct qualitative syntheses pay equal attention to qualitative traditions and systematic review processes, to produce research products that are both credible and applicable. © The Author(s) 2015.
Methodological quality is underrated in systematic reviews and meta-analyses in health psychology.
Oliveras, Isabel; Losilla, Josep-Maria; Vives, Jaume
2017-06-01
In this paper, we compile and describe the main approaches proposed in the literature to include methodological quality (MQ) or risk of bias (RoB) into research synthesis. We also meta-review how the RoB of observational primary studies is being assessed and to what extent the results are incorporated in the conclusions of research synthesis. Electronic databases were searched for systematic reviews or meta-analyses related to health and clinical psychology. A random sample of 90 reviews published between January 2010 and May 2016 was examined. A total of 46 reviews (51%) performed a formal assessment of the RoB of primary studies. Only 17 reviews (19%) linked the outcomes of quality assessment with the results of the review. According to the previous literature, our results corroborate the lack of guidance to incorporate the RoB assessment in the results of systematic reviews and meta-analyses. Our recommendation is to appraise MQ according to domains of RoB to rate the degree of credibility of the results of a research synthesis, as well as subgroup analysis or meta-regression as analytical methods to incorporate the quality assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
Assessment of active methods for removal of LEO debris
NASA Astrophysics Data System (ADS)
Hakima, Houman; Emami, M. Reza
2018-03-01
This paper investigates the applicability of five active methods for removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission level criteria are then utilized to assess the performance of each redirection method in light of the results obtained from a Monte Carlo simulation. The simulation provides an insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on the models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: Analytical Hierarchy Process and utility-based approach. A third assessment technique, namely the potential-loss analysis, is utilized to highlight the effect of risks in each removal method.
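The Analytical Hierarchy Process used above derives criterion weights from a reciprocal pairwise comparison matrix via its principal eigenvector. Here is a hedged Python sketch with an illustrative 3-criterion matrix; the judgement values are made up, not taken from the paper.

```python
# AHP priority weights from a pairwise comparison matrix (illustrative values).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])          # reciprocal pairwise judgements
eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalised priority weights
ci = (eigvals.real.max() - len(A)) / (len(A) - 1)   # consistency index
print("weights:", w, "CI:", ci)
```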
Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won
2011-01-01
The first step in beginning a zero accident campaign for industry is to systematically estimate the industrial accident rate and the zero accident time. This paper considers the social and technical change of the business environment after beginning the zero accident campaign through quantitative time series analysis methods. These methods include the sum of squared errors (SSE), the regression analysis method (RAM), the exponential smoothing method (ESM), the double exponential smoothing method (DESM), the auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). A program was developed to estimate the accident rate, zero accident time and achievement probability for an efficient industrial environment. In this paper, MFC (Microsoft Foundation Class) software of Visual Studio 2008 was used to develop the zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.
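Of the smoothing methods listed, single and double exponential smoothing are simple enough to sketch directly. Below is a minimal Python illustration using Holt's linear-trend form for the double method; the accident-rate series and smoothing constants are made up for illustration.

```python
# Minimal single (SES) and double (Holt) exponential smoothing.
def ses(y, alpha=0.3):
    s = [y[0]]
    for x in y[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

def des(y, alpha=0.3, beta=0.1):
    level, trend = y[0], y[1] - y[0]     # initialise level and trend
    out = [level]
    for x in y[1:]:
        new_level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
        out.append(level)
    return out

rates = [4.1, 3.8, 3.9, 3.4, 3.2, 3.0, 2.7]   # hypothetical accident rates
print(ses(rates))
print(des(rates))
```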
Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu
2018-06-01
Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to multiple criteria involving technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess the remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.
The Importance of Magnesium in the Human Body: A Systematic Literature Review.
Glasdam, Sidsel-Marie; Glasdam, Stinne; Peters, Günther H
2016-01-01
Magnesium, the second and fourth most abundant cation in the intracellular compartment and whole body, respectively, is of great physiologic importance. Magnesium exists as bound and free ionized forms depending on temperature, pH, ionic strength, and competing ions. Free magnesium participates in many biochemical processes and is most commonly measured by ion-selective electrode. This analytical approach is problematic because complete selectivity is not possible due to competition with other ions, i.e., calcium, and pH interference. Unfortunately, many studies have focused on measurement of total magnesium rather than its free bioactive form making it difficult to correlate to disease states. This systematic literature review presents current analytical challenges in obtaining accurate and reproducible test results for magnesium. © 2016 Elsevier Inc. All rights reserved.
Chambers, Duncan; Paton, Fiona; Wilson, Paul; Eastwood, Alison; Craig, Dawn; Fox, Dave; Jayne, David; McGinnes, Erika
2014-01-01
Objectives To identify and critically assess the extent to which systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery differ in their methodology and reported estimates of effect. Design Review of published systematic reviews. We searched the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA) Database from 1990 to March 2013. Systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery were eligible for inclusion. Primary and secondary outcome measures The primary outcome was length of hospital stay. We assessed changes in pooled estimates of treatment effect over time and how these might have been influenced by decisions taken by researchers as well as by the availability of new trials. The quality of systematic reviews was assessed using the Centre for Reviews and Dissemination (CRD) DARE critical appraisal process. Results 10 systematic reviews were included. Systematic reviews of randomised controlled trials have consistently shown a reduction in length of hospital stay with enhanced recovery compared with traditional care. The estimated effect tended to increase from 2006 to 2010 as more trials were published but has not altered significantly in the most recent review, despite the inclusion of several unique trials. The best estimate appears to be an average reduction of around 2.5 days in primary postoperative length of stay. Differences between reviews reflected differences in interpretation of inclusion criteria, searching and analytical methods or software. Conclusions Systematic reviews of enhanced recovery programmes show a high level of research waste, with multiple reviews covering identical or very similar groups of trials. Where multiple reviews exist on a topic, interpretation may require careful attention to apparently minor differences between reviews. Researchers can help readers by acknowledging existing reviews and through clear reporting of key decisions, especially on inclusion/exclusion and on statistical pooling. PMID:24879828
NASA Astrophysics Data System (ADS)
Lebedeva, R. V.; Tumanova, A. N.; Mashin, N. I.
2007-07-01
We carried out a systematic study of the influence of the main component on the change of analytical signal during atomic-emission analysis of boron compounds. Changes in the intensity of spectral lines of microimpurities as functions of their concentrations in the analytical system based on graphite powder with a variable content of boric acid and boron oxide are presented.
ERIC Educational Resources Information Center
Garofalo, James; Hindelang, Michael J.
The purpose of the document is to identify ways in which National Crime Survey (NCS) data can be used by criminal justice researchers and programs. The report provides an overview of the Application of Victimization Survey Results Project, describes the analytic reports compiled by the project staff, and cites the kinds of systematic information…
NASA Astrophysics Data System (ADS)
Grotti, Marco; Abelmoschi, Maria Luisa; Soggia, Francesco; Tiberiade, Christian; Frache, Roberto
2000-12-01
The multivariate effects of Na, K, Mg and Ca as nitrates on the electrothermal atomisation of manganese, cadmium and iron were studied by multiple linear regression modelling. Since the models proved to efficiently predict the effects of the considered matrix elements in a wide range of concentrations, they were applied to correct the interferences occurring in the determination of trace elements in seawater after pre-concentration of the analytes. In order to obtain a statistically significant number of samples, a large volume of the certified seawater reference materials CASS-3 and NASS-3 was treated with Chelex-100 resin; then, the chelating resin was separated from the solution and divided into several sub-samples, each of which was eluted with nitric acid and analysed by electrothermal atomic absorption spectrometry (for trace element determinations) and inductively coupled plasma optical emission spectrometry (for matrix element determinations). To minimise any other systematic error besides that due to matrix effects, the accuracy of the pre-concentration step and contamination levels of the procedure were checked by inductively coupled plasma mass spectrometric measurements. Analytical results obtained by applying the multiple linear regression models were compared with those obtained with other calibration methods, such as external calibration using acid-based standards, external calibration using matrix-matched standards and the analyte addition technique. Empirical models proved to efficiently reduce interferences occurring in the analysis of real samples, yielding better accuracy than the other calibration methods.
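To make the regression-correction idea concrete, here is a hedged Python sketch: a linear model predicts a signal suppression factor from matrix-element concentrations, and measured signals are divided by that factor. All concentrations and signal ratios below are synthetic placeholders, not the paper's data.

```python
# Interference correction via multiple linear regression (synthetic data).
import numpy as np

# columns: Na, K, Mg, Ca concentrations (mg/L) in 8 calibration solutions
M = np.array([[100, 10, 12,  40],
              [200, 10, 25,  40],
              [100, 40, 12, 160],
              [400, 20, 50,  80],
              [200, 40, 25, 160],
              [400, 40, 12,  40],
              [100, 20, 50,  80],
              [400, 10, 25, 160]], dtype=float)
ratio = np.array([0.97, 0.95, 0.92, 0.90, 0.88, 0.89, 0.93, 0.86])
# ratio = observed / expected analyte signal (suppression factor)

X = np.column_stack([np.ones(len(M)), M])        # intercept + matrix terms
coef, *_ = np.linalg.lstsq(X, ratio, rcond=None)

def corrected(signal, matrix_conc):
    """Divide the measured signal by the predicted suppression factor."""
    return signal / (coef[0] + matrix_conc @ coef[1:])

print(corrected(1.0e4, np.array([300, 30, 40, 120])))
```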
NMR and MS Methods for Metabolomics.
Amberg, Alexander; Riefke, Björn; Schlotterbeck, Götz; Ross, Alfred; Senn, Hans; Dieterle, Frank; Keck, Matthias
2017-01-01
Metabolomics, also often referred as "metabolic profiling," is the systematic profiling of metabolites in biofluids or tissues of organisms and their temporal changes. In the last decade, metabolomics has become more and more popular in drug development, molecular medicine, and other biotechnology fields, since it profiles directly the phenotype and changes thereof in contrast to other "-omics" technologies. The increasing popularity of metabolomics has been possible only due to the enormous development in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabolomics, i.e., NMR, UPLC-MS, and GC-MS, have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabolomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, starting from sample preparation to determining the measurement details of all analytical platforms, and finally to discussing the corresponding specific steps of data analysis.
NMR and MS methods for metabonomics.
Dieterle, Frank; Riefke, Björn; Schlotterbeck, Götz; Ross, Alfred; Senn, Hans; Amberg, Alexander
2011-01-01
Metabonomics, also often referred to as "metabolomics" or "metabolic profiling," is the systematic profiling of metabolites in bio-fluids or tissues of organisms and their temporal changes. In the last decade, metabonomics has become increasingly popular in drug development, molecular medicine, and other biotechnology fields, since it profiles directly the phenotype and changes thereof in contrast to other "-omics" technologies. The increasing popularity of metabonomics has been possible only due to the enormous development in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabonomics, i.e., NMR, LC-MS, UPLC-MS, and GC-MS have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabonomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, starting from sample preparation, to determining the measurement details of all analytical platforms, and finally, to discussing the corresponding specific steps of data analysis.
Optimism and Physical Health: A Meta-analytic Review
Rasmussen, Heather N.; Greenhouse, Joel B.
2010-01-01
Background Prior research links optimism to physical health, but the strength of the association has not been systematically evaluated. Purpose The purpose of this study is to conduct a meta-analytic review to determine the strength of the association between optimism and physical health. Methods The findings from 83 studies, with 108 effect sizes (ESs), were included in the analyses, using random-effects models. Results Overall, the mean ES characterizing the relationship between optimism and physical health outcomes was 0.17, p<.001. ESs were larger for studies using subjective (versus objective) measures of physical health. Subsidiary analyses were also conducted grouping studies into those that focused solely on mortality, survival, cardiovascular outcomes, physiological markers (including immune function), immune function only, cancer outcomes, outcomes related to pregnancy, physical symptoms, or pain. In each case, optimism was a significant predictor of health outcomes or markers, all p<.001. Conclusions Optimism is a significant predictor of positive physical health outcomes. PMID:19711142
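For readers unfamiliar with the random-effects models cited above, generic DerSimonian-Laird pooling has the following standard form (textbook notation; the paper's exact computations are not reproduced here):

```latex
% Generic DerSimonian-Laird random-effects pooling of k effect sizes
% \hat\theta_i with within-study variances v_i.
\begin{align*}
  \hat\theta_{RE} &= \frac{\sum_i w_i^{*}\, \hat\theta_i}{\sum_i w_i^{*}},
  \qquad w_i^{*} = \frac{1}{v_i + \hat\tau^2},\\
  \hat\tau^2 &= \max\!\left(0,\ \frac{Q - (k-1)}{\sum_i w_i - \sum_i w_i^2 / \sum_i w_i}\right),
  \qquad Q = \sum_i w_i \bigl(\hat\theta_i - \hat\theta_{FE}\bigr)^2,\ \ w_i = \frac{1}{v_i}.
\end{align*}
```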
Hawkeye and AMOS: visualizing and assessing the quality of genome assemblies
Schatz, Michael C.; Phillippy, Adam M.; Sommer, Daniel D.; Delcher, Arthur L.; Puiu, Daniela; Narzisi, Giuseppe; Salzberg, Steven L.; Pop, Mihai
2013-01-01
Since its launch in 2004, the open-source AMOS project has released several innovative DNA sequence analysis applications including: Hawkeye, a visual analytics tool for inspecting the structure of genome assemblies; the Assembly Forensics and FRCurve pipelines for systematically evaluating the quality of a genome assembly; and AMOScmp, the first comparative genome assembler. These applications have been used to assemble and analyze dozens of genomes ranging in complexity from simple microbial species through mammalian genomes. Recent efforts have been focused on enhancing support for new data characteristics brought on by second- and now third-generation sequencing. This review describes the major components of AMOS in light of these challenges, with an emphasis on methods for assessing assembly quality and the visual analytics capabilities of Hawkeye. These interactive graphical aspects are essential for navigating and understanding the complexities of a genome assembly, from the overall genome structure down to individual bases. Hawkeye and AMOS are available open source at http://amos.sourceforge.net. PMID:22199379
Bulk diffusion in a kinetically constrained lattice gas
NASA Astrophysics Data System (ADS)
Arita, Chikashi; Krapivsky, P. L.; Mallick, Kirone
2018-03-01
In the hydrodynamic regime, the evolution of a stochastic lattice gas with symmetric hopping rules is described by a diffusion equation with density-dependent diffusion coefficient encapsulating all microscopic details of the dynamics. This diffusion coefficient is, in principle, determined by a Green-Kubo formula. In practice, even when the equilibrium properties of a lattice gas are analytically known, the diffusion coefficient cannot be computed except when a lattice gas additionally satisfies the gradient condition. We develop a procedure to systematically obtain analytical approximations for the diffusion coefficient for non-gradient lattice gases with known equilibrium. The method relies on a variational formula found by Varadhan and Spohn which is a version of the Green-Kubo formula particularly suitable for diffusive lattice gases. Restricting the variational formula to finite-dimensional sub-spaces allows one to perform the minimization and gives upper bounds for the diffusion coefficient. We apply this approach to a kinetically constrained non-gradient lattice gas in two dimensions, viz. to the Kob-Andersen model on the square lattice.
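The hydrodynamic description referred to above is the nonlinear diffusion equation with a density-dependent coefficient:

```latex
% Hydrodynamic evolution of the coarse-grained density \rho(x,t), with the
% density-dependent diffusion coefficient D(\rho) named in the abstract.
\[
  \partial_t \rho(x,t) \;=\; \nabla \cdot \bigl( D(\rho)\, \nabla \rho(x,t) \bigr)
\]
```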
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2010-05-23
The increasingly asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.
Defining Clonal Color in Fluorescent Multi-Clonal Tracking
Wu, Juwell W.; Turcotte, Raphaël; Alt, Clemens; Runnels, Judith M.; Tsao, Hensin; Lin, Charles P.
2016-01-01
Clonal heterogeneity and selection underpin many biological processes including development and tumor progression. Combinatorial fluorescent protein expression in germline cells has proven its utility for tracking the formation and regeneration of different organ systems. Such cell populations encoded by combinatorial fluorescent proteins are also attractive tools for understanding clonal expansion and clonal competition in cancer. However, the assignment of clonal identity requires an analytical framework in which clonal markings can be parameterized and validated. Here we present a systematic and quantitative method for RGB analysis of fluorescent melanoma cancer clones. We then demonstrate refined clonal trackability of melanoma cells using this scheme. PMID:27073117
Beg, Sarwar; Chaudhary, Vandna; Sharma, Gajanand; Garg, Babita; Panda, Sagar Suman; Singh, Bhupinder
2016-06-01
The present studies describe the systematic quality by design (QbD)-oriented development and validation of a simple, rapid, sensitive and cost-effective reversed-phase HPLC bioanalytical method for nevirapine in rat plasma. Chromatographic separation was carried out on a C18 column using isocratic 68:9:23% v/v elution of methanol, acetonitrile and water (pH 3, adjusted by orthophosphoric acid) at a flow rate of 1.0 mL/min using UV detection at 230 nm. A Box-Behnken design was applied for chromatographic method optimization taking mobile phase ratio, pH and flow rate as the critical method parameters (CMPs) from screening studies. Peak area, retention time, theoretical plates and peak tailing were measured as the critical analytical attributes (CAAs). Further, the bioanalytical liquid-liquid extraction process was optimized using an optimal design by selecting extraction time, centrifugation speed and temperature as the CMPs for percentage recovery of nevirapine as the CAA. The search for an optimum chromatographic solution was conducted through numerical desirability function. Validation studies performed as per the US Food and Drug Administration requirements revealed results within the acceptance limit. In a nutshell, the studies successfully demonstrate the utility of analytical QbD approach for the rational development of a bioanalytical method with enhanced chromatographic separation and recovery of nevirapine in rat plasma. Copyright © 2015 John Wiley & Sons, Ltd.
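A Box-Behnken design for three factors, as used above for mobile phase ratio, pH and flow rate, places ±1 levels on each pair of factors while holding the third at its centre level, plus centre points. A minimal construction in Python; coded levels only, not the paper's actual run table.

```python
# Construct a Box-Behnken design in coded units (-1, 0, +1).
from itertools import combinations, product
import numpy as np

def box_behnken(n_factors: int, n_center: int = 3) -> np.ndarray:
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):   # +/-1 on a pair, 0 elsewhere
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * n_factors] * n_center          # centre points
    return np.array(runs)

design = box_behnken(3)
print(design.shape)   # (15, 3): 12 edge runs + 3 centre points
```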
Fraulo, Pasquale; Morena, Carmelo; Costa, Antonella
2014-10-01
Anisakidae larvae belonging to the genera Anisakis and Pseudoterranova are the parasites most often responsible for zoonoses transmitted by fish products (anisakidosis). Acquired by the consumption of raw or undercooked marine fish or squid, the anisakid larvae may cause pathogenic diseases such as gastric or intestinal anisakiasis and gastro-allergic disorders. In accordance with current EU legislation, fresh fish products must be inspected visually in order to detect the possible presence of visible parasites. It is recognized that the visual method is not accurate enough to detect the larvae of parasites in food preparations containing raw or practically raw seafood, so the official control system needs a more efficient analytical technique. In this work, the authors have drawn up and validated an analytical method based on EU Regulation n. 2075/2005, which involves artificial digestion and the use of a heated magnetic stirrer. The isolated larvae are then subjected to morphological identification at genus level using an optical microscope. The method proved to be suitable for the detection of live and dead anisakid larvae in ready-to-eat foodstuffs containing raw fish or cephalopods, and it is fast and accurate. The method showed high levels of sensitivity and specificity, and the suitability of its use in official food control was confirmed. Its use should be incorporated systematically into specific monitoring programs for the control of foodstuffs containing raw fish products.
Analytical collisionless damping rate of geodesic acoustic mode
NASA Astrophysics Data System (ADS)
Ren, H.; Xu, X. Q.
2016-10-01
Collisionless damping of the geodesic acoustic mode (GAM) is analytically investigated by considering the finite-orbit-width (FOW) resonance effect to 3rd order in the gyro-kinetic equations. A concise and transparent expression for the damping rate is presented for the first time. Good agreement is found between the analytical damping rate and the previous TEMPEST simulation result (Xu et al 2008 Phys. Rev. Lett. 100 215001) for systematic q scans. Our results also show that the FOW effect must be retained to 3rd order to achieve sufficient accuracy.
A systematic review of dynamics in climate risk and vulnerability assessments
NASA Astrophysics Data System (ADS)
Jurgilevich, Alexandra; Räsänen, Aleksi; Groundstroem, Fanny; Juhola, Sirkku
2017-01-01
Understanding climate risk is crucial for effective adaptation action, and a number of assessment methodologies have emerged. We argue that the dynamics of the individual components in climate risk and vulnerability assessments has received little attention. In order to highlight this, we systematically reviewed 42 sub-national climate risk and vulnerability assessments. We analysed the assessments using an analytical framework with which we evaluated (1) the conceptual approaches to vulnerability and exposure used, (2) if current or future risks were assessed, and (3) if and how changes over time (i.e. dynamics) were considered. Of the reviewed assessments, over half addressed future risks or vulnerability; and of these future-oriented studies, less than 1/3 considered both vulnerability and exposure dynamics. While the number of studies that include dynamics is growing, and while all studies included socio-economic aspects, often only biophysical dynamics was taken into account. We discuss the challenges of assessing socio-economic and spatial dynamics, particularly the poor availability of data and methods. We suggest that future-oriented studies assessing risk dynamics would benefit from larger stakeholder involvement, discussion of the assessment purpose, the use of multiple methods, inclusion of uncertainty/sensitivity analyses and pathway approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
A new field of research, visual analytics, has recently been introduced. This has been defined as “the science of analytical reasoning facilitated by visual interfaces”. Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.
A framework for the damage evaluation of acoustic emission signals through Hilbert-Huang transform
NASA Astrophysics Data System (ADS)
Siracusano, Giulio; Lamonaca, Francesco; Tomasello, Riccardo; Garescì, Francesca; Corte, Aurelio La; Carnì, Domenico Luca; Carpentieri, Mario; Grimaldi, Domenico; Finocchio, Giovanni
2016-06-01
The acoustic emission (AE) is a powerful and potential nondestructive testing method for structural monitoring in civil engineering. Here, we show how systematic investigation of crack phenomena based on AE data can be significantly improved by the use of advanced signal processing techniques. Such data are a fundamental source of information that can be used as the basis for evaluating the status of the material, thereby paving the way for a new frontier of innovation made by data-enabled analytics. In this article, we propose a framework based on the Hilbert-Huang Transform for the evaluation of material damages that (i) facilitates the systematic employment of both established and promising analysis criteria, and (ii) provides unsupervised tools to achieve an accurate classification of the fracture type and the discrimination between longitudinal (P-) and transversal (S-) waves related to an AE event. The experimental validation shows promising results for a reliable assessment of the health status through the monitoring of civil infrastructures.
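As a sketch of the Hilbert-Huang pipeline (empirical mode decomposition followed by Hilbert spectral analysis), the Python fragment below applies EMD and instantaneous-frequency estimation to a toy AE burst. It assumes the third-party PyEMD package (pip name EMD-signal) and scipy; it is an illustration of the transform, not the authors' framework.

```python
# Toy Hilbert-Huang analysis of a synthetic acoustic-emission burst.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumed dependency: pip install EMD-signal

fs = 1_000_000                     # 1 MHz sampling (illustrative for AE)
t = np.arange(0, 0.002, 1 / fs)
burst = np.exp(-t * 4000) * np.sin(2 * np.pi * 150_000 * t)   # toy AE event

imfs = EMD().emd(burst)            # intrinsic mode functions
for k, imf in enumerate(imfs):
    analytic = hilbert(imf)        # analytic signal per IMF
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    print(f"IMF {k}: median inst. freq = {np.median(inst_freq):.0f} Hz")
```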
NASA Astrophysics Data System (ADS)
Kalinowska, Monika; Świsłocka, Renata; Lewandowski, Włodzimierz
2007-05-01
The effect of alkali metals (Li → Na → K → Rb → Cs) on the electronic structure of cinnamic acid (phenylacrylic acid) was studied. In this research many miscellaneous analytical methods, which complement one another, were used: infrared (FT-IR), Raman (FT-Raman), nuclear magnetic resonance (1H, 13C NMR) and quantum mechanical calculations. The spectroscopic studies lead to conclusions concerning the distribution of the electronic charge in the molecule, the delocalization energy of π-electrons and the reactivity of metal complexes. The change of metal along the series Li → Na → K → Rb → Cs caused: (1) a change of electronic charge distribution in the cinnamate anion, which is seen via systematic shifts of several bands in the experimental and theoretical IR and Raman spectra of cinnamates, and (2) systematic chemical shifts of the 1H and 13C nuclei.
Targeted Quantitation of Proteins by Mass Spectrometry
Liebler, Daniel C; Zimmerman, Lisa J
2013-06-04
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332
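One of the standardization options mentioned, stable isotope dilution, reduces to a simple ratio calculation: the analyte peak area is referenced to a spiked heavy-labeled internal standard of known amount. A hypothetical Python sketch; the function name and numbers are illustrative only.

```python
# Stable-isotope-dilution quantitation from an MRM light/heavy area ratio.
def mrm_quant(area_light: float, area_heavy: float, amount_heavy_fmol: float) -> float:
    """Analyte amount (fmol) from the light/heavy MRM peak-area ratio."""
    return (area_light / area_heavy) * amount_heavy_fmol

print(mrm_quant(area_light=3.2e5, area_heavy=1.6e5, amount_heavy_fmol=50.0))  # 100.0
```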
Zhao, Yongxi; Kong, Yu; Wang, Bo; Wu, Yayan; Wu, Hong
2007-03-30
A simple and rapid micellar electrokinetic chromatography (MEKC) method with UV detection was developed for the simultaneous separation and determination of all-trans- and 13-cis-retinoic acids in rabbit serum using an on-line sweeping concentration technique. The serum sample was simply deproteinized and centrifuged. Various parameters affecting sample enrichment and separation were systematically investigated. Under optimal conditions, the analytes could be well separated within 17 min, and the relative standard deviations (RSD) of migration times and peak areas were less than 3.4%. Compared with the conventional MEKC injection method, 18- and 19-fold improvements in sensitivity were achieved for the two analytes, respectively. The proposed method has been successfully applied to the determination of all-trans- and 13-cis-retinoic acids in serum samples from rabbits and could be feasible for further pharmacokinetic studies of all-trans-retinoic acid.
Do Our Means of Inquiry Match our Intentions?
Petscher, Yaacov
2016-01-01
A key stage of the scientific method is the analysis of data, yet despite the variety of methods that are available to researchers, they are most frequently distilled to a model that focuses on the average relation between variables. Although research questions are frequently conceived with broad inquiry in mind, most regression methods are limited in comprehensively evaluating how observed behaviors are related to each other. Quantile regression is a largely unknown yet well-suited analytic technique; it is similar to traditional regression analysis but allows for a more systematic approach to understanding complex associations among observed phenomena in the psychological sciences. Data from the National Education Longitudinal Study of 1988/2000 are used to illustrate how quantile regression overcomes the limitations of average associations in linear regression by showing that psychological well-being and sex each differentially relate to reading achievement depending on one’s level of reading achievement. PMID:27486410
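The technique advocated above is straightforward to run in practice. A hedged Python sketch with statsmodels follows, in which synthetic heteroscedastic data stand in for the NELS:88 variables: the estimated slope differs across quantiles, which is exactly the pattern mean regression would miss.

```python
# Quantile regression at several quantiles on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
wellbeing = rng.normal(size=n)
# heteroscedastic outcome: the spread widens for higher well-being
reading = 0.2 * wellbeing + (1 + 0.5 * (wellbeing > 0)) * rng.normal(size=n)
df = pd.DataFrame({"reading": reading, "wellbeing": wellbeing})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("reading ~ wellbeing", df).fit(q=q)
    print(f"q={q}: slope={fit.params['wellbeing']:.3f}")
```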
Peckner, Ryan; Myers, Samuel A; Jacome, Alvaro Sebastian Vaca; Egertson, Jarrett D; Abelin, Jennifer G; MacCoss, Michael J; Carr, Steven A; Jaffe, Jacob D
2018-05-01
Mass spectrometry with data-independent acquisition (DIA) is a promising method to improve the comprehensiveness and reproducibility of targeted and discovery proteomics, in theory by systematically measuring all peptide precursors in a biological sample. However, the analytical challenges involved in discriminating between peptides with similar sequences in convoluted spectra have limited its applicability in important cases, such as the detection of single-nucleotide polymorphisms (SNPs) and alternative site localizations in phosphoproteomics data. We report Specter (https://github.com/rpeckner-broad/Specter), an open-source software tool that uses linear algebra to deconvolute DIA mixture spectra directly through comparison to a spectral library, thus circumventing the problems associated with typical fragment-correlation-based approaches. We validate the sensitivity of Specter and its performance relative to that of other methods, and show that Specter is able to successfully analyze cases involving highly similar peptides that are typically challenging for DIA analysis methods.
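To illustrate the linear-algebra idea in miniature (this is a toy, not Specter's actual algorithm), a mixture spectrum can be modeled as a nonnegative combination of library spectra and solved by nonnegative least squares; all values below are synthetic.

```python
# NNLS deconvolution of a mixture spectrum against a small spectral library.
import numpy as np
from scipy.optimize import nnls

# columns = library fragment spectra of three peptides over shared m/z bins
library = np.array([[10, 0, 2],
                    [ 5, 8, 0],
                    [ 0, 6, 4],
                    [ 1, 0, 9]], dtype=float)
mixture = library @ np.array([0.5, 1.2, 0.0]) + 0.05   # noisy co-isolation

coefs, resid = nnls(library, mixture)
print("estimated peptide contributions:", coefs)
```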
Physical-geometric optics method for large size faceted particles.
Sun, Bingqiang; Yang, Ping; Kattawar, George W; Zhang, Xiaodong
2017-10-02
A new physical-geometric optics method is developed to compute the single-scattering properties of faceted particles. It incorporates a general absorption vector to accurately account for inhomogeneous wave effects, and subsequently yields analytical formulas that are effective and computationally efficient for absorptive scattering particles. A bundle of rays incident on a certain facet can be traced as a single beam. For a beam incident on multiple facets, a systematic beam-splitting technique based on computer graphics is used to split the original beam into several sub-beams so that each sub-beam is incident only on an individual facet. The new beam-splitting technique significantly reduces the computational burden. The present physical-geometric optics method can be generalized to arbitrary faceted particles with either convex or concave shapes and with a homogeneous or an inhomogeneous (e.g., a particle with a core) composition. The single-scattering properties of irregular convex homogeneous and inhomogeneous hexahedra are simulated and compared to their counterparts from two other methods, including a numerically rigorous method.
Editorial: Mathematical Methods and Modeling in Machine Fault Diagnosis
Yan, Ruqiang; Chen, Xuefeng; Li, Weihua; ...
2014-12-18
Modern mathematics has commonly been utilized as an effective tool to model mechanical equipment so that their dynamic characteristics can be studied analytically. This will help identify potential failures of mechanical equipment by observing change in the equipment’s dynamic parameters. On the other hand, dynamic signals are also important and provide reliable information about the equipment’s working status. Modern mathematics has also provided us with a systematic way to design and implement various signal processing methods, which are used to analyze these dynamic signals, and to enhance intrinsic signal components that are directly related to machine failures. This special issue is aimed at stimulating not only new insights on mathematical methods for modeling but also recently developed signal processing methods, such as sparse decomposition with potential applications in machine fault diagnosis. Finally, the papers included in this special issue provide a glimpse into some of the research and applications in the field of machine fault diagnosis through applications of the modern mathematical methods.
Qin, Zong; Ji, Chuangang; Wang, Kai; Liu, Sheng
2012-10-08
In this paper, the conditions for uniform lighting generated by a light-emitting diode (LED) array were systematically studied. To take the human vision effect into consideration, the contrast sensitivity function (CSF) was newly adopted as the critical criterion for uniform lighting instead of the conventionally used Sparrow's Criterion (SC). Through the CSF method, design parameters including system thickness, LED pitch, the LED's spatial radiation distribution and the viewing condition can be analytically combined. In a specific LED array lighting system (LALS) with a foursquare LED arrangement, different types of LEDs (Lambertian and Batwing type) and a given viewing condition, optimum system thicknesses and LED pitches were calculated and compared with those obtained through the SC method. Results show that the CSF method can achieve more appropriate optimum parameters than the SC method. Additionally, an abnormal phenomenon, in which uniformity varies non-monotonically with structural parameters in an LALS with non-Lambertian LEDs, was found and analyzed. Based on the analysis, a design method for LALS that can bring about better practicability, lower cost and a more attractive appearance was summarized.
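For a feel of the underlying geometry, the hedged Python sketch below computes the irradiance pattern of a square LED array on a parallel target plane with the generalized Lambertian model, I(θ) ∝ cos^m θ, so each LED contributes E ∝ z^(m+1)/d^(m+3). The uniformity metric here is a simple min/max ratio, not the paper's CSF criterion, and all dimensions are illustrative.

```python
# Irradiance of a 5x5 generalized-Lambertian LED array on a target plane.
import numpy as np

m = 1.0                 # Lambertian order (m = 1: ideal Lambertian LED)
pitch, z = 20.0, 25.0   # LED pitch and system thickness (mm), illustrative
leds = [(i * pitch, j * pitch) for i in range(-2, 3) for j in range(-2, 3)]

xs = np.linspace(-30, 30, 121)
X, Y = np.meshgrid(xs, xs)
E = np.zeros_like(X)
for (x0, y0) in leds:
    d2 = (X - x0) ** 2 + (Y - y0) ** 2 + z ** 2
    # E = I0 * cos^m(theta) * cos(theta) / d^2 = I0 * z^(m+1) / d^(m+3)
    E += z ** (m + 1) / d2 ** ((m + 3) / 2)

print("uniformity (min/max) over central region:", E.min() / E.max())
```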
Modeling the frequency response of microwave radiometers with QUCS
NASA Astrophysics Data System (ADS)
Zonca, A.; Roucaries, B.; Williams, B.; Rubin, I.; D'Arcangelo, O.; Meinhold, P.; Lubin, P.; Franceschet, C.; Jahn, S.; Mennella, A.; Bersanelli, M.
2010-12-01
Characterization of the frequency response of coherent radiometric receivers is a key element in estimating the flux of astrophysical emissions, since the measured signal depends on the convolution of the source spectral emission with the instrument band shape. Laboratory Radio Frequency (RF) measurements of the instrument bandpass often require complex test setups and are subject to a number of systematic effects driven by thermal issues and impedance matching, particularly if cryogenic operation is involved. In this paper we present an approach to modeling radiometer bandpasses by integrating simulations and RF measurements of individual components. This method is based on QUCS (Quite Universal Circuit Simulator), an open-source circuit simulator, which gives the flexibility of choosing among the available devices, implementing new analytical software models or using measured S-parameters. Therefore an independent estimate of the instrument bandpass is achieved using standard individual component measurements and validated analytical simulations. In order to automate the process of preparing input data, running simulations and exporting results we developed the Python package python-qucs and released it under the GNU General Public License. We discuss, as working cases, bandpass response modeling of the COFE and Planck Low Frequency Instrument (LFI) radiometers and compare results obtained with QUCS and with a commercial circuit simulator software. The main purpose of bandpass modeling in COFE is to optimize component matching, while for LFI the modeled bandpasses represent the best estimate of the frequency response, since end-to-end measurements were strongly affected by systematic effects.
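The abstract does not detail the python-qucs API, so the following is only a generic sketch of this style of automation: driving a batch circuit-simulator binary from Python and combining per-stage responses under an idealized matched-cascade assumption. The qucsator command name and its -i/-o flags, the file names, and the cascade helper are all assumptions for illustration, not the package's actual interface.

```python
import subprocess
from pathlib import Path

def run_simulator(netlist: Path, output: Path) -> None:
    """Run a batch circuit simulator on a netlist (CLI flags assumed)."""
    subprocess.run(["qucsator", "-i", str(netlist), "-o", str(output)], check=True)

def cascade_s21_db(stages_db):
    """Combine per-stage |S21| values (dB) into an overall bandpass estimate,
    assuming matched, unilateral stages so gains simply add in dB."""
    return sum(stages_db)

# Example: total response at one frequency from three simulated/measured stages
print(cascade_s21_db([-0.4, 12.0, -3.1]), "dB")
```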
Imai, Chisato; Hashizume, Masahiro
2015-03-01
Time series analysis is suitable for investigating relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology, it has been one of the standard approaches for assessing the impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally using generalized linear or additive models (GLM and GAM). However, the same analysis practices are often applied to infectious diseases despite substantial differences from non-infectious diseases that may pose analytical challenges. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Our review raised issues regarding the estimation of the susceptible population and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. The consequence of not adequately addressing these issues is distortion of the risk quantification for exposure factors. Future studies should pay careful attention to these details and examine alternative models or methods that improve time series regression analysis for environmental determinants of infectious diseases.
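For concreteness, here is a minimal sketch of the kind of model under discussion: a quasi-Poisson GLM of weekly case counts on a lagged weather exposure with a simple harmonic seasonal adjustment, fitted with statsmodels. The simulated data, the four-week lag and the single annual harmonic are illustrative assumptions, not recommendations from the review.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical weekly counts and a weather exposure over five years
rng = np.random.default_rng(0)
n = 260
df = pd.DataFrame({
    "cases": rng.poisson(20, n),
    "temp": 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 52) + rng.normal(0, 2, n),
    "week": np.arange(n),
})
df["temp_lag4"] = df["temp"].shift(4)               # assumed exposure lag
df["sin52"] = np.sin(2 * np.pi * df["week"] / 52)   # crude seasonal adjustment
df["cos52"] = np.cos(2 * np.pi * df["week"] / 52)

# scale="X2" estimates overdispersion from Pearson chi-square (quasi-Poisson)
model = smf.glm("cases ~ temp_lag4 + sin52 + cos52", data=df.dropna(),
                family=sm.families.Poisson()).fit(scale="X2")
print(model.summary())
```

The review's points map directly onto this sketch: the lag length, the form of seasonal adjustment, residual autocorrelation and the weekly (rather than daily) time unit all need explicit justification for infectious outcomes.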
2014-01-01
Background Structured comparison of pharmacoeconomic analyses of ACEIs and ARBs in patients with type 2 diabetic nephropathy is still lacking. This review aims to systematically review the cost-effectiveness of both ACEIs and ARBs in type 2 diabetic patients with nephropathy. Methods A systematic literature search was performed in MEDLINE and EMBASE for the period from November 1, 1999 to October 31, 2011. Two reviewers independently assessed the quality of the included articles and extracted data. All cost-effectiveness results were converted to 2011 Euros. Results Up to October 2011, 434 articles were identified. After full-text checking and quality assessment, 30 articles involving 39 study settings were finally included in this review. All 6 ACEI studies were literature-based evaluations which synthesized data from different sources. The other 33 studies were directed at ARBs and were designed based on specific trials. The Markov model was the most common decision-analytic method used in the evaluations. From the cost-effectiveness results, 37 out of 39 studies indicated that either ACEIs or ARBs were cost-saving compared with placebo/conventional treatment, such as amlodipine. The evidence was insufficient for a valid direct comparison of cost-effectiveness between ACEIs and ARBs. Conclusion There is a lack of direct comparisons of ACEIs and ARBs in existing economic evaluations. Considering the current evidence, both ACEIs and ARBs are likely cost-saving compared with conventional therapy excluding such RAAS inhibitors. PMID:24428868
Management of Ready-to-Use Parenteral Nutrition in Newborns: Systematic Review.
Mena, Karen Daniela Romero; Espitia, Olga Lucia Pinzón; Vergara, José Alejandro Daza
2018-04-27
Parenteral support has increased the possibility of neonatal recovery. However, complications associated with its use have been documented. One commercial approach developed to decrease the complications of this type of support is ready-to-use parenteral nutrition (PN), a 3-chamber bag that provides a complete nutrient mix. This systematic review seeks, through the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, to establish its benefits in newborns. Seven databases and gray literature were used. The search was limited to publications from 2007-2017 and to articles written in English, Spanish, and Portuguese. Articles that did not meet the inclusion criteria, and studies rated as low quality with the Scottish Intercollegiate Guidelines Network guidelines (e.g., lacking information about the study or analytical methods), were excluded. A total of 24,193 articles were obtained, which were initially evaluated by title and abstract according to the inclusion criteria. A total of 24,167 articles were discarded, and 27 were deemed eligible for further evaluation. After a detailed evaluation of the full text, 13 articles were selected. It was found that ready-to-use PN has the potential to reduce the risk of infection, provide an adequate supply of nutrients, generate growth within the expected range, offer ease of use, decrease prescription errors, and potentially reduce costs. The short- and long-term impact of its use still needs to be evaluated. © 2018 American Society for Parenteral and Enteral Nutrition.
Mc Gillicuddy, Aoife; Kelly, Maria; Crean, Abina M; Sahm, Laura J
The objective of this systematic review was to synthesize the available qualitative evidence on the knowledge, attitudes and beliefs of adult patients, healthcare professionals and carers about oral dosage form modification. A systematic review and synthesis of qualitative studies was undertaken, utilising the thematic synthesis approach. The following databases were searched from inception to September 2015: PubMed, Medline (EBSCO), EMBASE, CINAHL, PsycINFO, Web of Science, ProQuest Databases, Scopus, Turning Research Into Practice (TRIP), Cochrane Central Register of Controlled Trials (CENTRAL) and the Cochrane Database of Systematic Reviews (CDSR). Citation tracking and searching the reference lists of included studies were also undertaken. Grey literature was searched using the OpenGrey database, internet searching and personal knowledge. An updated search was undertaken in June 2016. Studies meeting the following criteria were eligible for inclusion: (i) used qualitative data collection and analysis methods; (ii) full text available in English; (iii) included adult patients who require oral dosage forms to be modified to meet their needs; or (iv) included carers or healthcare professionals of patients who require oral dosage forms to be modified. Two reviewers independently appraised the quality of the included studies using the Critical Appraisal Skills Programme Checklist. A thematic synthesis was conducted and analytical themes were generated. Of 5455 records screened, seven studies were eligible for inclusion; three involved healthcare professionals and the remaining four involved patients. Four analytical themes emerged from the thematic synthesis: (i) patient-centred individuality and variability; (ii) communication; (iii) knowledge and uncertainty; and (iv) complexity. The variability of individual patients' requirements, poor communication practices and lack of knowledge about oral dosage form modification, combined with the complex and multi-faceted healthcare environment, complicate decision-making about oral dosage form modification and administration. This systematic review has highlighted the key factors influencing the knowledge, attitudes and beliefs of patients and healthcare professionals about oral dosage form modifications. The findings suggest that, to optimise oral medicine modification practices, the needs of individual patients should be routinely and systematically assessed, and decision-making should be supported by evidence-based recommendations with multidisciplinary input. Further research is needed to optimise oral dosage form modification practices, and the factors identified in this review should be considered in the development of future interventions. Copyright © 2016 Elsevier Inc. All rights reserved.
Wei, Wen-Long; Zeng, Rui; Gu, Cai-Mei; Qu, Yan; Huang, Lin-Fang
2016-08-22
Angelica sinensis (Oliv.) Diels, known as Dang Gui (in Chinese), is a traditional medicinal and edible plant that has long been used for tonifying, replenishing, and invigorating blood as well as relieving pain, lubricating the intestines, and treating irregular menstruation and amenorrhea in women. A. sinensis has also been used as a health product and has become increasingly popular in China, Japan, and Korea. This paper aims to provide a systematic review of traditional uses of A. sinensis and its recent advances in the fields of phytochemistry, analytical methods and toxicology. In addition, possible trends, therapeutic potentials, and perspectives for future research of this plant are also briefly discussed. An extensive review of the literature was conducted, and electronic databases including China National Knowledge Infrastructure, PubMed, Google Scholar, Science Direct, and Reaxys were used to assemble the data. Ethnopharmacological literature and digitalised sources of academic libraries were also systematically searched. In addition, information was obtained from local books and The Plant List (TPL, www.theplantlist.org). This study reviews the progress in chemical analysis of A. sinensis and its preparations. Previously and newly established methods, including spectroscopy, thin-layer chromatography (TLC), gas chromatography (GC), high-performance liquid chromatography (HPLC), ultra-performance liquid chromatography (UPLC), and nuclear magnetic resonance (NMR) analysis, are summarized. Moreover, identified bioactive components such as polysaccharides, ligustilide and ferulic acid were reviewed, along with analytical methods for quantitative and qualitative determination of target analytes, fingerprinting authentication, quality evaluation of A. sinensis, and toxicology and pharmacodynamic studies. Scientific reports on crude extracts, pure compounds and formulations revealed a wide range of pharmacological activities, including anti-inflammatory activity, antifibrotic action, antispasmodic activity, antioxidant activities, and neuroprotective action, as well as cardio- and cerebrovascular effects. Within the published scientific literature are numerous reports regarding analytical methods that use various chromatographic and spectrophotometric technologies to monitor various types of components with different physicochemical properties simultaneously. This review discusses the reasonable selection of marker compounds based on high concentrations, analytical methods, and commercial availabilities with the goal of developing quick, accurate, and applicable analytical approaches for quality evaluation and establishing harmonised criteria for the analysis of A. sinensis and its finished products. Compounds isolated from A. sinensis are abundant sources of chemical diversity, from which we can discover active molecules. Thus, more studies on the pharmacological mechanisms of the predominant active compounds of A. sinensis are needed. In addition, given that A. sinensis is one of the most popular traditional herbal medicines, its main therapeutic aspects, toxicity, and adverse effects warrant further investigation in the future. Copyright © 2016. Published by Elsevier Ireland Ltd.
Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review.
Lambe, Kathryn Ann; O'Reilly, Gary; Kelly, Brendan D; Curristan, Sarah
2016-10-01
Diagnostic error incurs enormous human and economic costs. The dual-process model of reasoning provides a framework for understanding the diagnostic process and attributes certain errors to faulty cognitive shortcuts (heuristics). The literature contains many suggestions for counteracting these heuristics and enhancing analytical and non-analytical modes of reasoning. To identify, describe and appraise studies that have empirically investigated interventions to enhance analytical and non-analytical reasoning among medical trainees and doctors, and to assess their effectiveness. Systematic searches of five databases were carried out (Medline, PsycInfo, Embase, Education Resource Information Centre (ERIC) and Cochrane Database of Controlled Trials), supplemented with searches of bibliographies and relevant journals. Included studies evaluated an intervention to enhance analytical and/or non-analytical reasoning among medical trainees or doctors. Twenty-eight studies were included under five categories: educational interventions, checklists, cognitive forcing strategies, guided reflection, instructions at test and other interventions. While many of the studies found some effect of interventions, guided reflection interventions emerged as the most consistently successful across five studies, and cognitive forcing strategies improved accuracy and confidence judgements. Significant heterogeneity of measurement approaches was observed, and existing studies are largely limited to early-career doctors. Results to date are promising, and this relatively young field is approaching the point where such cognitive interventions can be recommended to educators. Further research with refined methodology and more diverse samples is required before firm recommendations can be made for medical education and policy; however, these results suggest that such interventions hold promise, with much current enthusiasm for new research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
van de Water, A T M; Benjamin, D R
2016-02-01
Systematic literature review. Diastasis of the rectus abdominis muscle (DRAM) has been linked with low back pain and abdominal and pelvic dysfunction. Measurement is used either to screen for DRAM presence or to monitor DRAM width; determining which methods are suitable for each purpose is of clinical value. To identify the best methods to screen for DRAM presence and monitor DRAM width, the AMED, Embase, Medline, PubMed and CINAHL databases were searched for measurement property studies of DRAM measurement methods. Population characteristics, measurement methods/procedures and measurement information were extracted from the included studies. The quality of all studies was evaluated using 'quality rating criteria'. When possible, reliability generalisation was conducted to provide combined reliability estimations. Thirteen studies evaluated measurement properties of the 'finger width' method, tape measure, calipers, ultrasound, CT and MRI. Ultrasound was the most evaluated. The methodological quality of these studies varied widely. Pearson's correlations of r = 0.66-0.79 were found between caliper and ultrasound measurements. Calipers and ultrasound had intraclass correlation coefficients (ICC) of 0.78-0.97 for test-retest, inter- and intra-rater reliability. The 'finger width' method had weighted kappas of 0.73-0.77 for test-retest reliability, but moderate agreement (63%; weighted kappa = 0.53) between raters. Comparing calipers and ultrasound, low measurement error was found (above the umbilicus), and the methods had good agreement (83%; weighted kappa = 0.66) for discriminative purposes. The available information supports ultrasound and calipers as adequate methods to assess DRAM. For the other methods, limited measurement information of low to moderate quality is available, and further evaluation of their measurement properties is required. Copyright © 2015 Elsevier Ltd. All rights reserved.
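To make the reliability statistics concrete, here is a minimal sketch computing a two-way random-effects, absolute-agreement ICC(2,1) (Shrout-Fleiss) from a subjects-by-raters matrix. The caliper widths below are invented for illustration, and published ICC variants differ, so the formula should be matched to the actual study design before use.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single rater,
    for an n_subjects x n_raters matrix of measurements."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between-subject MS
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between-rater MS
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical DRAM widths (mm): 6 participants measured by 2 raters
widths = np.array([[22, 24], [35, 33], [18, 19], [40, 42], [27, 27], [31, 30]], float)
print(f"ICC(2,1) = {icc_2_1(widths):.2f}")
```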
Somayajula, Srikanth Ayyala; Devred, Emmanuel; Bélanger, Simon; Antoine, David; Vellucci, V; Babin, Marcel
2018-04-20
In this study, we report on the performance of satellite-based photosynthetically available radiation (PAR) algorithms used in published oceanic primary production models. The performance of these algorithms was evaluated using buoy observations under clear and cloudy skies, and for the particular case of low sun angles typically encountered at high latitudes or at moderate latitudes in winter. The PAR models consisted of (i) the standard one from the NASA-Ocean Biology Processing Group (OBPG), (ii) the Gregg and Carder (GC) semi-analytical clear-sky model, and (iii) look-up-tables based on the Santa Barbara DISORT atmospheric radiative transfer (SBDART) model. Various combinations of atmospheric inputs, empirical cloud corrections, and semi-analytical irradiance models yielded a total of 13 (11 + 2 developed in this study) different PAR products, which were compared with in situ measurements collected at high frequency (15 min) at a buoy site in the Mediterranean Sea (the "BOUée pour l'acquiSition d'une Série Optique à Long termE," or, "BOUSSOLE" site). An objective ranking method applied to the algorithm results indicated that seven PAR products out of 13 were well in agreement with the in situ measurements. Specifically, the OBPG method showed the best overall performance with a root mean square difference (RMSD) (bias) of 19.7% (6.6%) and 10% (6.3%) followed by the look-up-table method with a RMSD (bias) of 25.5% (6.8%) and 9.6% (2.6%) at daily and monthly scales, respectively. Among the four methods based on clear-sky PAR empirically corrected for cloud cover, the Dobson and Smith method consistently underestimated daily PAR while the Budyko formulation overestimated daily PAR. Empirically cloud-corrected methods using cloud fraction (CF) performed better under quasi-clear skies (CF<0.3) with an RMSD (bias) of 9.7%-14.8% (3.6%-11.3%) than under partially clear to cloudy skies (0.3
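As a pointer to how such skill scores are typically computed, the sketch below evaluates percent RMSD and bias of satellite PAR against buoy measurements, normalising by the mean in situ value. The normalisation convention and the sample numbers are assumptions for illustration; the paper's exact definitions are not reproduced here.

```python
import numpy as np

def skill(sat, buoy):
    """Percent RMSD and bias of satellite PAR against a buoy reference,
    normalised by the mean in situ value (one common convention)."""
    diff = np.asarray(sat, float) - np.asarray(buoy, float)
    ref = np.mean(buoy)
    rmsd = 100 * np.sqrt(np.mean(diff ** 2)) / ref
    bias = 100 * np.mean(diff) / ref
    return rmsd, bias

sat = [38.1, 41.0, 12.5, 29.9]    # hypothetical daily PAR, mol photons m-2 d-1
buoy = [35.0, 43.2, 11.8, 31.5]
r, b = skill(sat, buoy)
print(f"RMSD = {r:.1f}%, bias = {b:.1f}%")
```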
Challenges and Opportunities of Big Data in Health Care: A Systematic Review
Goswamy, Rishi; Raval, Yesha; Marawi, Sarah
2016-01-01
Background Big data analytics offers promise in many business sectors, and health care is looking to big data to provide answers to many age-related issues, particularly dementia and chronic disease management. Objective The purpose of this review was to summarize the challenges faced by big data analytics and the opportunities that big data opens up in health care. Methods A total of 3 searches were performed for publications between January 1, 2010 and January 1, 2016 (PubMed/MEDLINE, CINAHL, and Google Scholar), and content was assessed for relevance to big data in health care. From the results of the searches in research databases and Google Scholar (N=28), the authors summarized content and identified 9 and 14 themes under the categories Challenges and Opportunities, respectively. We rank-ordered and analyzed the themes based on the frequency of occurrence. Results The top challenges were issues of data structure, security, data standardization, storage and transfers, and managerial skills such as data governance. The top opportunities were quality improvement, population management and health, early detection of disease, data quality, structure, and accessibility, improved decision making, and cost reduction. Conclusions Big data analytics has the potential for positive impact and global implications; however, it must overcome some legitimate obstacles. PMID:27872036
Yildizoglu, Tugce; Weislogel, Jan-Marek; Mohammad, Farhan; Chan, Edwin S-Y; Assam, Pryseley N; Claridge-Chang, Adam
2015-12-01
Genetic studies in Drosophila reveal that olfactory memory relies on a brain structure called the mushroom body. The mainstream view is that each of the three lobes of the mushroom body plays a specialized role in short-term aversive olfactory memory, but a number of studies have drawn divergent conclusions from their varying experimental findings. Like many fields, neurogenetics uses null hypothesis significance testing for data analysis. Critics of significance testing claim that this method promotes discrepancies by using arbitrary thresholds (α) to apply reject/accept dichotomies to continuous data, which does not reflect the biological reality of quantitative phenotypes. We explored using estimation statistics, an alternative data analysis framework, to examine published fly short-term memory data. Systematic review was used to identify behavioral experiments examining the physiological basis of olfactory memory, and meta-analytic approaches were applied to assess the role of lobular specialization. Multivariate meta-regression models revealed that short-term memory lobular specialization is not supported by the data; they identified the cellular extent of a transgenic driver as the major predictor of its effect on short-term memory. These findings demonstrate that effect sizes, meta-analysis, meta-regression, hierarchical models and estimation methods in general can be successfully harnessed to identify knowledge gaps, synthesize divergent results, accommodate heterogeneous experimental designs and quantify genetic mechanisms.
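As a small illustration of the estimation approach the authors advocate, the sketch below pools per-study effect sizes with a DerSimonian-Laird random-effects model. The effect sizes and variances are invented, and the paper's multivariate meta-regression is considerably richer than this univariate pooling.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-study effect sizes (DerSimonian-Laird)."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1 / v                                   # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - ybar) ** 2)             # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)     # between-study variance
    w_re = 1 / (v + tau2)
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical standardized memory effects (Hedges' g) from five experiments
g = [0.8, 0.3, 1.1, 0.5, 0.2]
var = [0.04, 0.09, 0.06, 0.05, 0.08]
pooled, se, tau2 = dersimonian_laird(g, var)
print(f"pooled g = {pooled:.2f} +/- {1.96 * se:.2f} (tau2 = {tau2:.3f})")
```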
NASA Astrophysics Data System (ADS)
Takeda, M.; Nakajima, H.; Zhang, M.; Hiratsuka, T.
2008-04-01
To obtain reliable diffusion parameters for diffusion testing, multiple experiments should not only be cross-checked but the internal consistency of each experiment should also be verified. In the through- and in-diffusion tests with solution reservoirs, test interpretation of different phases often makes use of simplified analytical solutions. This study explores the feasibility of steady, quasi-steady, equilibrium and transient-state analyses using simplified analytical solutions with respect to (i) valid conditions for each analytical solution, (ii) potential error, and (iii) experimental time. For increased generality, a series of numerical analyses are performed using unified dimensionless parameters and the results are all related to dimensionless reservoir volume (DRV) which includes only the sorptive parameter as an unknown. This means the above factors can be investigated on the basis of the sorption properties of the testing material and/or tracer. The main findings are that steady, quasi-steady and equilibrium-state analyses are applicable when the tracer is not highly sorptive. However, quasi-steady and equilibrium-state analyses become inefficient or impractical compared to steady state analysis when the tracer is non-sorbing and material porosity is significantly low. Systematic and comprehensive reformulation of analytical models enables the comparison of experimental times between different test methods. The applicability and potential error of each test interpretation can also be studied. These can be applied in designing, performing, and interpreting diffusion experiments by deducing DRV from the available information for the target material and tracer, combined with the results of this study.
Maximum likelihood solution for inclination-only data in paleomagnetism
NASA Astrophysics Data System (ADS)
Arason, P.; Levi, S.
2010-08-01
We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function towards systematically shallower inclination. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with desired accuracy, and locate the maximum likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and the mean inclination estimates are the least biased towards shallow values.
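A compact numerical sketch of the same exponential-cancellation idea (not the authors' Arason-Levi code): the marginal Fisher density of inclination I for mean inclination Ī and precision κ is f(I) ∝ cos I · exp(κ sin I sin Ī) · I₀(κ cos I cos Ī), and the exponentially growing I₀ and sinh κ terms can be tamed with scipy's exponentially scaled Bessel function, since log I₀(a) = a + log i0e(a) and log(2 sinh κ) = κ + log(1 − e^(−2κ)). The starting values and data below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import i0e

def neg_loglik(params, inc):
    """Negative log-likelihood of the marginal Fisher distribution of
    inclination, with exponential terms cancelled for numerical stability."""
    inc_mean, kappa = params
    if kappa <= 0 or not (-np.pi / 2 < inc_mean < np.pi / 2):
        return np.inf
    a = kappa * np.cos(inc) * np.cos(inc_mean)          # Bessel argument (>= 0)
    ll = (np.log(kappa) - kappa - np.log1p(-np.exp(-2 * kappa))
          + np.log(np.cos(inc)) + kappa * np.sin(inc) * np.sin(inc_mean)
          + a + np.log(i0e(a)))                          # log I0(a) = a + log i0e(a)
    return -np.sum(ll)

inc = np.radians([62, 71, 58, 75, 66, 69])   # hypothetical inclination-only data
res = minimize(neg_loglik, x0=[np.radians(65), 20.0], args=(inc,),
               method="Nelder-Mead")
print("mean inclination (deg):", np.degrees(res.x[0]), " kappa:", res.x[1])
```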
Electromagnetic fields of an ultra-short tightly-focused radially-polarized laser pulse
NASA Astrophysics Data System (ADS)
Salamin, Yousef I.; Li, Jian-Xing
2017-12-01
Fully analytic expressions for the electric and magnetic fields of an ultrashort, tightly focused, radially polarized laser pulse are presented to the lowest order of approximation. The fields are derived from scalar and vector potentials, along the lines of our earlier work on a similar pulse of the linearly polarized variety. A systematic program is also described by which the fields may be obtained to any desired accuracy, analytically or numerically.
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality-by-design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and relates it to product quality by design and process analytical technology (PAT).
Park, Jung In; Pruinelli, Lisiane; Westra, Bonnie L; Delaney, Connie W
2014-01-01
With the pervasive implementation of electronic health records (EHR), new opportunities arise for nursing research through the use of EHR data. Increasingly, comparative effectiveness research within and across health systems is conducted to identify the impact of nursing on improving health and health care and lowering costs of care. Use of EHR data for this type of research requires nationally and internationally recognized nursing terminologies to normalize the data. Research methods are evolving as large data sets become available through EHRs. Little is known about the types of research and analytic methods applied to nursing research using EHR data normalized with nursing terminologies. The purpose of this paper is to report on a subset of a systematic review of peer-reviewed studies of applied nursing informatics research involving EHR data normalized with standardized nursing terminologies.
Biomarker-specific conjugated nanopolyplexes for the active coloring of stem-like cancer cells
NASA Astrophysics Data System (ADS)
Hong, Yoochan; Lee, Eugene; Choi, Jihye; Haam, Seungjoo; Suh, Jin-Suck; Yang, Jaemoon
2016-06-01
Stem-like cancer cells possess distinctive intrinsic features, and their CD44 receptors regulate the redox balance that lets these cells survive under stress conditions. We therefore fabricated biomarker-specific conjugated polyplexes from CD44-targetable hyaluronic acid and redox-sensitive polyaniline using a nanoemulsion method. For the most sensitive recognition of cellular redox state at the single-nanoparticle scale, a nano-scattering spectrum imaging analyzer system was introduced. The conjugated polyplexes showed specific targeting of CD44-expressing cancer cells as well as a dramatic, redox-potential-dependent change in color in the light-scattered images. These polyaniline-based conjugated polyplexes, together with analytical processes that include light-scattering imaging and measurement of scattering spectra, thus establish a systematic method for the detection and monitoring of cancer microenvironments.
Algebraic geometry and Bethe ansatz. Part I. The quotient ring for BAE
NASA Astrophysics Data System (ADS)
Jiang, Yunfeng; Zhang, Yang
2018-03-01
In this paper and upcoming ones, we initiate a systematic study of Bethe ansatz equations for integrable models using modern computational algebraic geometry. We show that algebraic geometry provides a natural mathematical language and powerful tools for understanding the structure of the solution space of Bethe ansatz equations. In particular, we find novel efficient methods to count the number of solutions of Bethe ansatz equations based on Gröbner bases and quotient rings. We also develop an analytical approach based on the companion matrix to perform the sum of on-shell quantities over all physical solutions without solving the Bethe ansatz equations explicitly. To demonstrate the power of our method, we revisit the completeness problem of the Bethe ansatz for the Heisenberg spin chain, and calculate the sum rules of OPE coefficients in planar N=4 super-Yang-Mills theory.
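To illustrate the counting idea on a toy system (not the actual Bethe ansatz equations), the sketch below computes a Gröbner basis with sympy and counts the standard monomials of the quotient ring, whose number, for a zero-dimensional ideal, equals the number of solutions counted with multiplicity. The two-variable system is an invented stand-in; large Bethe systems are usually handled with dedicated computer-algebra systems.

```python
from itertools import product
from sympy import groebner, symbols

x, y = symbols("x y")
# Toy zero-dimensional system: solutions are x = y = +/-1, each of multiplicity 2
G = groebner([x**2 + y**2 - 2, x * y - 1], x, y, order="grevlex")

# Leading monomials of the Groebner basis, as exponent tuples
lead = [g.terms(order="grevlex")[0][0] for g in G.polys]

def divisible(m, l):
    return all(mi >= li for mi, li in zip(m, l))

# Standard monomials = monomials not divisible by any leading monomial;
# they form a vector-space basis of the quotient ring.
bound = [max(l[i] for l in lead) for i in range(2)]
standard = [m for m in product(range(bound[0] + 1), range(bound[1] + 1))
            if not any(divisible(m, l) for l in lead)]
print(len(standard), "solutions (with multiplicity)")   # expected: 4
```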
Metabolomics and Diabetes: Analytical and Computational Approaches
Sas, Kelli M.; Karnovsky, Alla; Michailidis, George
2015-01-01
Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200
Systematic review of the use of online questionnaires of older adults.
Remillard, Meegan L; Mazor, Kathleen M; Cutrona, Sarah L; Gurwitz, Jerry H; Tjia, Jennifer
2014-04-01
To describe methodological approaches to population targeting and sampling and to summarize limitations of Internet-based questionnaires in older adults. Systematic literature review. Studies using online questionnaires in older adult populations. English-language articles using search terms for geriatric, age 65 and over, Internet survey, online survey, Internet questionnaire, and online questionnaire in PubMed and EBSCO host between 1984 and July 2012. Inclusion criteria were study population mean age 65 and older and use of an online questionnaire for research. Review of 336 abstracts yielded 14 articles for full review by two investigators; 11 articles met inclusion criteria. Articles were extracted for study design and setting, participant characteristics, recruitment strategy, country, and study limitations. Eleven articles were published after 2001. Studies had populations with a mean age of 65 to 78, included descriptive and analytical designs, and were conducted in the United States, Australia, and Japan. Recruiting methods varied widely from paper fliers and personal e-mails to use of consumer marketing panels. Investigator-reported study limitations included the use of small convenience samples and limited generalizability. Online questionnaires are a feasible method of surveying older adults in some geographic regions and for some subsets of older adults, but limited Internet access constrains recruiting methods and often limits study generalizability. © 2014, Copyright the Authors Journal compilation © 2014, The American Geriatrics Society.
Reichow, Brian; Kogan, Cary; Barbui, Corrado; Smith, Isaac; Yasamy, M Taghi; Servili, Chiara
2014-08-27
Developmental disorders, including intellectual disability and autism spectrum disorders, may limit an individual's capacity to conduct daily activities. The emotional and economic burden on families caring for an individual with a developmental disorder is substantial, and quality of life may be limited by a lack of services. Therefore, finding effective treatments to help this population should be a priority. Recent work has shown that parent skills training interventions improve developmental, behavioural and family outcomes. The purpose of this review protocol is to extend previous findings by systematically analysing randomised controlled trials of parent skills training programmes for parents of children with developmental disorders, including intellectual disabilities and autism spectrum disorders, and by using meta-analytic techniques to identify programme components reliably associated with successful outcomes of parent skills training programmes. We will include all studies using randomised controlled trial designs that compare a group of parents receiving a parent skills training programme to a group of parents in a no-treatment control, waitlist control or treatment-as-usual comparison group. To locate studies, we will conduct an extensive electronic database search and then use snowball methods, with no limits on publication year or language. We will present a narrative synthesis including visual displays of study effects on child and parental outcomes and conduct a quantitative synthesis of the effects of parent skills training programmes using meta-analytic techniques. No ethical issues are foreseen and ethical approval is not required given this is a protocol for a systematic review. The findings of this study will be disseminated through peer-reviewed publications and international conference presentations. Updates of the review will be conducted, as necessary, to inform and guide practice. PROSPERO (CRD42014006993). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
McConville, Fran; Portela, Anayda
2016-01-01
Background Quality of care is essential for further progress in reducing maternal and newborn deaths. The integration of educated, trained, regulated and licensed midwives into the health system is associated with improved quality of care and sustained decreases in maternal and newborn mortality. To date, research on barriers to quality of care for women and newborns has not given due attention to the care provider's perspective. This paper addresses this gap by presenting the findings of a systematic mapping of the literature on the social, economic and professional barriers preventing midwifery personnel in low and middle income countries (LMICs) from providing quality care. Methods and Findings A systematic search of five electronic databases was conducted for literature published between January 1990 and August 2013. Eligible items included published and unpublished items in all languages. Items were screened against inclusion and exclusion criteria, yielding 82 items from 34 countries. 44% discussed countries or regions in Africa, 38% in Asia, and 5% in the Americas. Nearly half the articles were published since 2011. Data was extracted and presented in a narrative synthesis and tables. Items were organized into three categories (social, economic and professional barriers) based on an analytical framework. Barriers connected to the socially and culturally constructed context of childbirth, although least reported, appear instrumental in preventing quality midwifery care. Conclusions Significant social and cultural, economic and professional barriers can prevent the provision of quality midwifery care in LMICs. An analytical framework is proposed to show how the overlaps between the barriers reinforce each other, and how they arise from gender inequality. Links are made between burnout and moral distress, caused by the barriers, and poor quality care. Ongoing mechanisms to improve quality care will need to address the barriers from the midwifery provider perspective, as well as the underlying gender inequality. PMID:27135248
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report was prepared at the request of the Lawrence Livermore Laboratory (LLL) to provide background information for analyzing soil-structure interaction by the frequency-independent impedance function approach. LLL is conducting such analyses as part of its seismic review of selected operating plants under the Systematic Evaluation Program for the US Nuclear Regulatory Commission. The analytical background and basic assumptions of the impedance function theory are briefly reviewed, and the role of radiation damping in soil-structure interaction analysis is discussed. The validity of modeling soil-structure interaction by using frequency-independent functions is evaluated based on data from several field tests. Finally, the recommended procedures for performing soil-structure interaction analyses are discussed with emphasis on the modal superposition method.
The detection and correction of outlying determinations that may occur during geochemical analysis
Harvey, P.K.
1974-01-01
'Wild', 'rogue' or outlying determinations occur periodically during geochemical analysis. Existing tests in the literature for the detection of such determinations within a set of replicate measurements are often misleading. This account describes the chances of detecting outliers and the extent to which correction may be made for their presence in sample sizes of three to seven replicate measurements. A systematic procedure for monitoring data for outliers is outlined. The problem of outliers becomes more important as instrumental methods of analysis become faster and more highly automated; a state in which it becomes increasingly difficult for the analyst to examine every determination. The recommended procedure is easily adapted to such analytical systems. © 1974.
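One classical screening test in this small-sample spirit (not necessarily the author's own procedure) is Dixon's Q test, sketched below for three to seven replicates. The 95% critical values are the commonly tabulated r10 values and should be checked against a statistics reference before use.

```python
# Dixon's Q test (r10 statistic) for a single suspect value in 3-7 replicates.
Q_CRIT_95 = {3: 0.970, 4: 0.829, 5: 0.710, 6: 0.625, 7: 0.568}

def dixon_q(values):
    """Return (suspect, Q, reject) for the most extreme replicate."""
    xs = sorted(values)
    n = len(xs)
    if n not in Q_CRIT_95:
        raise ValueError("Dixon's r10 is tabulated here for n = 3..7 only")
    spread = xs[-1] - xs[0]
    q_low = (xs[1] - xs[0]) / spread       # gap below the lowest value
    q_high = (xs[-1] - xs[-2]) / spread    # gap above the highest value
    if q_low >= q_high:
        return xs[0], q_low, q_low > Q_CRIT_95[n]
    return xs[-1], q_high, q_high > Q_CRIT_95[n]

print(dixon_q([10.1, 10.2, 10.15, 10.9]))  # -> (10.9, 0.875, True): reject at 95%
```

Because the test is just a ratio comparison, it is trivially embedded in an automated acquisition loop, which is exactly the setting the paper anticipates.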
Modal characteristics of a simplified brake rotor model using semi-analytical Rayleigh Ritz method
NASA Astrophysics Data System (ADS)
Zhang, F.; Cheng, L.; Yam, L. H.; Zhou, L. M.
2006-10-01
This paper focuses on the modal characteristics of a brake rotor used in an automotive disc brake system. The brake rotor is modeled as a combined structure comprising an annular plate connected to a segment of cylindrical shell by distributed artificial springs. Modal analysis shows the existence of three types of modes of the combined structure, depending on the involvement of each substructure. A decomposition technique is proposed, allowing each mode of the combined structure to be decomposed into a linear combination of the individual substructure modes. It is shown that the decomposition coefficients provide a direct and systematic means of modal classification and quantification.
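A plausible numerical realisation of such a decomposition (the paper's exact formulation may differ): project a combined-structure mode onto a basis of substructure modes by least squares and read the participation of each substructure mode from the coefficients. The matrices below are randomly generated stand-ins for actual plate/shell mode shapes.

```python
import numpy as np

# basis: substructure mode shapes sampled at shared DOFs (columns = modes);
# phi_comb: one mode of the combined structure sampled at the same DOFs.
rng = np.random.default_rng(1)
basis = rng.normal(size=(200, 6))                       # hypothetical modes
phi_comb = basis @ np.array([0.9, 0.1, 0.0, 0.3, 0.0, 0.0]) \
           + 0.01 * rng.normal(size=200)                # mostly modes 1 and 4

coeffs, *_ = np.linalg.lstsq(basis, phi_comb, rcond=None)
share = coeffs ** 2 / np.sum(coeffs ** 2)               # relative participation
print("dominant substructure modes:", np.argsort(share)[::-1][:2])
```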
Bridgman growth of semiconductors
NASA Technical Reports Server (NTRS)
Carlson, F. M.
1985-01-01
The purpose of this study was to improve the understanding of the transport phenomena which occur in the directional solidification of alloy semiconductors. In particular, emphasis was placed on the strong role of convection in the melt. Analytical solutions were not deemed possible for such an involved problem. Accordingly, a numerical model of the process was developed to simulate the transport. This translates into solving the partial differential equations of energy, mass, species, and momentum transfer subject to various boundary and initial conditions. A finite element method with simple elements was initially chosen. This simulation tool will enable the crystal grower to systematically identify and modify the important design factors within her control to produce better crystals.
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies use peptide-centric analytical methods and thus rely strongly on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assessing protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies, we employed both standard qualitative and data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
A Systematic Method for Reviewing and Analyzing Health Information on Consumer-Oriented Websites.
Rew, Lynn; Saenz, Ashley; Walker, Lorraine O
2018-05-29
A discussion of a proposed method for analyzing the quality of consumer-oriented websites that provide health-related information. The quality of health information available to consumers online varies widely. In an effort to improve the quality of online information, experts have undertaken systematic reviews of selected health topics; however, no standardized comprehensive methodology currently exists for such reviews. An eight-step method is recommended, comprising the following steps: (1) select topic; (2) determine the purpose of the analysis; (3) select search terms and engines; (4) develop and apply website inclusion and exclusion criteria; (5) develop processes and tools to manage search results; (6) specify measures of quality; (7) compute readability; (8) evaluate websites. Each of these steps is illustrated in relation to the health topic of gynecomastia, a physical and mental health challenge for many adolescent males and young men. Although most extant analyses of consumer-oriented websites have focused on disease conditions and their treatment, website-analysis methodology would encourage analyses that fall within the nursing care domain. The method outlined in this paper is intended to provide nurses and others who work with specific patient populations with the tools needed for website-analytic studies. Such studies provide a foundation for making recommendations about quality websites, as well as identifying gaps in online information for health consumers. This article is protected by copyright. All rights reserved.
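Step 7 (compute readability) is the most mechanical step and can be automated; below is a minimal sketch of the Flesch Reading Ease score using a crude vowel-group syllable heuristic. Validated tools are preferable for a formal website analysis, and the sample sentence is illustrative only.

```python
import re

def syllables(word):
    """Crude vowel-group syllable count (a common heuristic, not exact)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:   # drop a typical silent 'e'
        n -= 1
    return max(1, n)

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syl = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * len(words) / sentences - 84.6 * syl / len(words)

sample = ("Gynecomastia is benign enlargement of male breast tissue. "
          "It is common in adolescence.")
print(f"Flesch Reading Ease: {flesch_reading_ease(sample):.1f}")
```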
[Work-related Stress and the Allostatic Load Index - A Systematic Review].
Mauss, D; Li, J; Schmidt, B; Angerer, P; Jarczok, M N
2017-12-01
Work-related stress is a growing social challenge and has been associated with reduced employee health, well-being, and productivity. One tool to measure the stress-related wear and tear on the body is the Allostatic Load Index (ALI). This review summarizes recent evidence on the association between work-related stress and ALI in working adults. A systematic literature search following the PRISMA statement was conducted in 21 databases including Medline, PubPsych, MedPilot, and the Cochrane Register. Publications addressing work-related stress and medical parameters using ALI were considered. Data on study population, analytic techniques, and results were tabulated. Methodological quality was evaluated using a standardized checklist. Nine articles were identified, covering a total of 3,532 employees from 5 countries and reporting cross-sectional data from the years 2003-2013. Overall, 7 studies reported a positive and significant association between work-related stress and ALI, while 2 studies showed no or a non-significant association. Substantial heterogeneity was observed in the methods applied and in study quality. This systematic review provides evidence that work-related stress is associated with ALI in cross-sectional studies. This association needs to be confirmed by future studies using longitudinal data on working populations. © Georg Thieme Verlag KG Stuttgart · New York.
Reference values of elements in human hair: a systematic review.
Mikulewicz, Marcin; Chojnacka, Katarzyna; Gedrange, Thomas; Górecki, Henryk
2013-11-01
There has been no systematic review of reference values of elements in human hair that takes the methodological approach into consideration. The absence of universally accepted and implemented reference ranges means that hair mineral analysis has not yet become a reliable and useful method for assessing the nutritional status and exposure of individuals. Systematic review of reference values of elements in human hair. PubMed, ISI Web of Knowledge, Scopus. Humans, hair mineral analysis, elements or minerals, reference values, original studies. The number of studies screened and assessed for eligibility was 52. Eventually, 5 papers were included in the review. The studies report reference ranges for the content of elements in hair: macroelements, microelements, toxic elements and other elements. Reference ranges were elaborated for different populations in the years 2000-2012. The analytical methodology differed, in particular sample preparation, digestion and analysis (ICP-AES, ICP-MS). Consequently, the levels of hair minerals reported as reference values varied. It is necessary to elaborate standard procedures, further validate hair mineral analysis, and report detailed methodology. Only then would it be possible to provide meaningful reference ranges and take advantage of the potential of hair mineral analysis as a medical diagnostic technique. Copyright © 2013 Elsevier B.V. All rights reserved.
Amaya-Amaya, Jenny; Caro-Moreno, Julián; Molano-González, Nicolás; Mantilla, Rubén D.; Rojas-Villarraga, Adriana; Anaya, Juan-Manuel
2013-01-01
Objective. This study was performed to determine the prevalence of and associated risk factors for cardiovascular disease (CVD) in Latin American (LA) patients with systemic lupus erythematosus (SLE). Methods. First, a cross-sectional analytical study was conducted in 310 Colombian patients with SLE in whom CVD was assessed. Associated factors were examined by multivariate regression analyses. Second, a systematic review of the literature on CVD in SLE in LA was performed. Results. There were 133 (36.5%) Colombian SLE patients with CVD. Dyslipidemia, smoking, coffee consumption, and pleural effusion were positively associated with CVD. An independent effect of coffee consumption and cigarette on CVD was found regardless of gender and duration of disease. In the systematic review, 60 articles fulfilling the eligibility criteria were included. A wide range of CVD prevalence was found (4%–79.5%). Several studies reported ancestry, genetic factors, and polyautoimmunity as novel risk factors for such a condition. Conclusions. A high rate of CVD is observed in LA patients with SLE. Awareness of the observed risk factors should encourage preventive population strategies for CVD in patients with SLE aimed at facilitating the suppression of cigarette smoking and coffee consumption as well as at the tight control of dyslipidemia and other modifiable risk factors. PMID:24294522
Accounting for Multiple Births in Neonatal and Perinatal Trials: Systematic Review and Case Study
Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A
2010-01-01
Objectives To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births. To explore the sensitivity of an actual trial to several analytic approaches to multiples. Methods A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The NO CLD trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using non-clustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. Results In the systematic review, most studies did not describe the randomization of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (p<0.01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. Conclusions The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. PMID:19969305
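To show the analytic distinction concretely, here is a sketch comparing a naive logistic regression with a generalized estimating equations (GEE) fit that treats siblings as clustered, using statsmodels. The data frame, its column names and the 150 twin pairs are simulated placeholders, not NO CLD data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical trial: infants clustered within mothers (multiples share an id)
rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({
    "mother_id": np.repeat(np.arange(150), 2),   # 150 twin pairs for illustration
    "treated": rng.integers(0, 2, n),
    "outcome": rng.integers(0, 2, n),
})

# Naive logistic regression assumes all infants are independent
naive = sm.Logit(df["outcome"], sm.add_constant(df["treated"])).fit(disp=0)

# GEE with an exchangeable working correlation accounts for sibling clustering
gee = sm.GEE.from_formula("outcome ~ treated", groups="mother_id", data=df,
                          family=sm.families.Binomial(),
                          cov_struct=sm.cov_struct.Exchangeable()).fit()
print("naive SE:", naive.bse["treated"], " GEE SE:", gee.bse["treated"])
```

With real twin data, the clustered standard errors typically widen (or occasionally narrow) the confidence intervals, which is exactly the sensitivity the case study documents.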
von Kobyletzki, Laura Beate; Beckman, Linda; Smeeth, Liam; McKee, Martin; Abuabara, Katrina; Langan, Sinead
2017-01-01
Introduction Childhood allergic diseases may prevent affected children from achieving their academic potential. Potential mechanisms include absence from school due to illness and medical appointments. Experience of symptoms in classes or leisure time, and stigma associated with visible signs and symptoms, including skin disease, requirements for medication during school time or the need for specific diets, may also contribute to reduced educational attainment. Studies have investigated the association between specific allergic diseases and educational attainment. The aim of this study is to systematically review the literature on allergic diseases, educational attainment and occupational status, and, if possible, to calculate meta-analytic summary estimates for the associations. Methods Systematic electronic searches in Medline, EMBASE, Cochrane, Cumulative Index to Nursing & Allied Health Literature (CINAHL), PsycINFO and Education Resources Information Center (ERIC); hand search in reference lists of included papers and conference reports; search for unpublished studies in clinical trial registers and the New York Academy of Medicine Grey Literature Report; data extraction; and study quality assessment (Newcastle-Ottawa Scale) will be performed. Analysis Data will be summarised descriptively, and meta-analysis including meta-regression to explore sources of heterogeneity will be performed if possible. Ethics and dissemination Dissemination in a peer-reviewed, open-access, international scientific journal is planned. PROSPERO registration number CRD42017058036. PMID:29025838
Characteristics of steady vibration in a rotating hub-beam system
NASA Astrophysics Data System (ADS)
Zhao, Zhen; Liu, Caishan; Ma, Wei
2016-02-01
A rotating beam exhibits a puzzling characteristic: its frequencies and modal shapes may vary with the hub's inertia and rotating speed. To highlight the essential nature behind these vibration phenomena, we analyze the steady vibration of a rotating Euler-Bernoulli beam with a quasi-steady-state stretch. Newton's law is used to derive the equations governing the beam's elastic motion and the hub's rotation. A combination of these equations results in a nonlinear partial differential equation (PDE) that fully reflects the mutual interaction between the two kinds of motion. Via Fourier series expansion within a finite interval of time, we reduce the PDE to an infinite system of nonlinear ordinary differential equations (ODEs) in the spatial domain. We further nondimensionalize the ODEs and discretize them via a difference method. The frequencies and modal shapes of a general rotating beam are then determined numerically. For a low-speed beam, where geometric stiffening can be neglected, the beam's vibration characteristics are solved analytically. We validate our numerical method and the analytical solutions by comparison with past experiments and numerical findings reported in the existing literature. Finally, systematic simulations are performed to demonstrate how the beam's eigenfrequencies vary with the hub's inertia and rotating speed.
Modeling and evaluating user behavior in exploratory visual analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.
Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
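The Markov-chain modeling step can be sketched compactly, assuming the analyst states have already been coded from transcripts, recordings, and log files; the state labels and the example session below are invented for illustration, not taken from the paper's studies.

```python
# Minimal sketch: estimate transition probabilities between coded analyst states
# from one observed session. The session sequence here is hypothetical.
import numpy as np

states = ["mental", "interaction", "computational"]
idx = {s: i for i, s in enumerate(states)}

# A coded session, e.g. derived from transcripts and log files.
session = ["mental", "interaction", "computational", "mental",
           "interaction", "interaction", "computational", "mental"]

counts = np.zeros((3, 3))
for a, b in zip(session, session[1:]):
    counts[idx[a], idx[b]] += 1

# Row-normalize to get the transition matrix P[i, j] = Pr(next = j | current = i).
P = counts / counts.sum(axis=1, keepdims=True)
print(P)
```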
Analysis, Occurrence and Toxicity of Haloacetaldehydes in ...
Chlorinated and brominated haloacetaldehydes (HALs) are considered the 3rd largest class of disinfection by-products (DBPs) by weight. The iodinated HAL, iodoacetaldehyde, has recently been reported as an emerging DBP in finished drinking waters. Overall, iodinated DBPs, e.g., iodoacetic acids, iodoacetamides, and iodonitriles, are among the most genotoxic of all DBPs identified. In this context, this chapter reviews the analytical methods available to date to determine HALs in water, and the concentrations at which they are present in finished drinking waters. Since systematic toxicological effects have been investigated only for selected chloro- and bromo-HALs, a comparative study of the genotoxicity and cytotoxicity of this DBP class to mammalian cells is also presented. This research is part of the Safe and Sustainable Water Research (SSWR) Program, specifically SSWR 2.2.D, which focuses on water contaminants. Haloacetaldehydes are an important class of emerging (non-regulated) disinfection byproducts, and were the third largest disinfection byproduct class by weight in a U.S. Nationwide DBP Occurrence Study. Why was this study done? This study was done because a) improved analytical methods are needed for the haloacetaldehyde disinfection byproducts; b) occurrence data in drinking water are needed; and c) in vitro toxicology data on the class (iodo-, bromo-, chloro-) of the haloacetaldehydes are lacking. What is the impact to the scientific field in general?
Kim, Dong-Kwan; Hwang, Yoon Jo; Yoon, Cheolho; Yoon, Hye-On; Chang, Ki Soo; Lee, Gaehang; Lee, Seungwoo; Yi, Gi-Ra
2015-08-28
The theoretical extinction coefficients of gold nanoparticles (AuNPs) have mainly been verified by analytically solving the Maxwell equations for an ideal sphere, as first done by Mie (generally referred to as Mie theory). In practice, however, direct experimental verification has not been feasible, especially for relatively large AuNPs (i.e., >40 nm), because conventional synthetic methods inevitably yield polygonal, non-ideal Au nanospheres. Here, mono-crystalline, ultra-smooth, and highly spherical AuNPs of 40-100 nm were prepared by the procedure reported in our recent work (ACS Nano, 2013, 7, 11064). The extinction coefficients of these ideally spherical AuNPs were empirically extracted using the Beer-Lambert law and then compared with the theoretical limits obtained by analytical and numerical methods. The extinction coefficients obtained for the ideally spherical AuNPs agree much more closely with the theoretical limits than those of faceted or polygonal AuNPs. In addition, to further elucidate the importance of being spherical, we systematically compared our ideally spherical AuNPs with their polygonal counterparts, effectively addressing the role of surface morphology in the spectral response, both theoretically and experimentally.
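The Beer-Lambert extraction of extinction coefficients reduces to simple arithmetic once the particle number density of the suspension is known. The sketch below uses placeholder numbers, not the paper's measurements.

```python
# Back-of-envelope sketch of the Beer-Lambert extraction: epsilon = A / (c * l).
# All numerical values are hypothetical placeholders.
A = 0.52          # measured absorbance at the plasmon peak (dimensionless)
l = 1.0           # optical path length, cm
N = 3.0e10        # particle number density, particles per mL (hypothetical)
N_A = 6.022e23    # Avogadro's number

c = N * 1000.0 / N_A          # molar particle concentration, mol/L
epsilon = A / (c * l)         # extinction coefficient, L mol^-1 cm^-1
print(f"epsilon ~ {epsilon:.2e} M^-1 cm^-1")
```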
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
SAM Radiochemical Methods Query
Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.
Systematic Assessment of the Hemolysis Index: Pros and Cons.
Lippi, Giuseppe
2015-01-01
Preanalytical quality is as important as the analytical and postanalytical quality in laboratory diagnostics. After decades of visual inspection to establish whether or not a diagnostic sample may be suitable for testing, automated assessment of hemolysis index (HI) has now become available in a large number of laboratory analyzers. Although most national and international guidelines support systematic assessment of sample quality via HI, there is widespread perception that this indication has not been thoughtfully acknowledged. Potential explanations include concern of increased specimen rejection rate, poor harmonization of analytical techniques, lack of standardized units of measure, differences in instrument-specific cutoff, negative impact on throughput, organization and laboratory economics, and lack of a reliable quality control system. Many of these concerns have been addressed. Evidence now supports automated HI in improving quality and patient safety. These will be discussed. © 2015 Elsevier Inc. All rights reserved.
A flexible influence of affective feelings on creative and analytic performance.
Huntsinger, Jeffrey R; Ray, Cara
2016-09-01
Considerable research shows that positive affect improves performance on creative tasks and negative affect improves performance on analytic tasks. The present research entertained the idea that affective feelings have flexible, rather than fixed, effects on cognitive performance. Consistent with the idea that positive and negative affect signal the value of accessible processing inclinations, the influence of affective feelings on performance on analytic or creative tasks was found to be flexibly responsive to the relative accessibility of different styles of processing (i.e., heuristic vs. systematic, global vs. local). When a global processing orientation was accessible happy participants generated more creative uses for a brick (Experiment 1), successfully solved more remote associates and insight problems (Experiment 2) and displayed broader categorization (Experiment 3) than those in sad moods. When a local processing orientation was accessible this pattern reversed. When a heuristic processing style was accessible happy participants were more likely to commit the conjunction fallacy (Experiment 3) and showed less pronounced anchoring effects (Experiment 4) than sad participants. When a systematic processing style was accessible this pattern reversed. Implications of these results for relevant affect-cognition models are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Miyahara, M; Lagisz, M; Nakagawa, S; Henderson, S E
2017-09-01
Systematic reviews and meta-analyses are considered to be the 'gold standards' for synthesizing research evidence in particular areas of enquiry. However, such reviews are only useful if they themselves are conducted to a sufficiently high standard. The aim of this study was to conduct a narrative meta-review of existing analyses of the effectiveness of interventions designed for children with developmental co-ordination disorder (DCD). A narrative meta-review of systematic and meta-analytic reviews aimed at evaluating the effectiveness of intervention for children with DCD was conducted on studies published between 1950 and 2014. We identified suitable reviews, using a modification of the Population, Intervention, Comparison, Outcome (PICO) system and evaluated their methodological quality using the Assessment of Multiple Systematic Reviews (AMSTAR). In addition, the consistency of the quality of evidence and classification of intervention approaches was assessed independently by two assessors. The literature search yielded a total of four appropriate reviews published in the selected time span. The Assessment of Multiple Systematic Reviews percentage quality scores assigned to each review ranged from 0% (low quality) to 55% (medium quality). Evaluation of the quality of evidence and classification of intervention approaches yielded a discrepancy rate of 25%. All reviews concluded that some kind of intervention was better than none at all. Although the quality of the reviews progressively improved over the years, the shortcomings identified need to be addressed before concrete evidence regarding the best approach to intervention for children with DCD can be specified. © 2016 John Wiley & Sons Ltd.
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
Onasanya, Oluwadamilola; Iyer, Geetha; Lucas, Eleanor; Lin, Dora; Singh, Sonal; Alexander, G Caleb
2016-11-01
Given the conflicting evidence regarding the association between exogenous testosterone and cardiovascular events, we systematically assessed published systematic reviews for evidence of the association between exogenous testosterone and cardiovascular events. We searched PubMed, MEDLINE, Embase, Cochrane Collaboration Clinical Trials, ClinicalTrials.gov, and the US Food and Drug Administration website for systematic reviews of randomised controlled trials published up to July 19, 2016. Two independent reviewers screened 954 full texts from 29 335 abstracts to identify systematic reviews of randomised controlled trials in which the cardiovascular effects of exogenous testosterone on men aged 18 years or older were examined. We extracted data for study characteristics, analytic methods, and key findings, and applied the AMSTAR (A Measurement Tool to Assess Systematic Reviews) checklist to assess methodological quality of each review. Our primary outcome measure was the direction and magnitude of association between exogenous testosterone and cardiovascular events. We identified seven reviews and meta-analyses, which had substantial clinical heterogeneity, differing statistical methods, and variable methodological quality and quality of data abstraction. AMSTAR scores ranged from 3 to 9 out of 11. Six systematic reviews that each included a meta-analysis showed no significant association between exogenous testosterone and cardiovascular events, with summary estimates ranging from 1·07 to 1·82 and imprecise confidence intervals. Two of these six meta-analyses showed increased risk in subgroup analyses of oral testosterone and men aged 65 years or older during their first treatment year. One meta-analysis showed a significant association between exogenous testosterone and cardiovascular events, in men aged 18 years or older generally, with a summary estimate of 1·54 (95% CI 1·09-2·18). Our optimal information size analysis showed that any randomised controlled trial aiming to detect a true difference in cardiovascular risk between treatment groups receiving exogenous testosterone and their controls (with a two-sided p value of 0·05 and a power of 80%) would require at least 17 664 participants in each trial group. Therefore, given the challenge of adequately powering clinical trials for rare outcomes, rigorous observational studies are needed to clarify the association between testosterone-replacement therapy and major adverse cardiovascular outcomes. Copyright © 2016 Elsevier Ltd. All rights reserved.
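The optimal information size calculation referenced above follows the standard two-proportion sample-size formula. The sketch below uses hypothetical event rates (the review's own inputs are not restated here), so it will not reproduce the 17 664 figure.

```python
# Two-proportion sample size at two-sided alpha = 0.05 and 80% power.
# The event rates p1 and p2 are assumed placeholders.
from scipy.stats import norm

alpha, power = 0.05, 0.80
p1, p2 = 0.040, 0.048            # assumed control and treated event rates
z_a = norm.ppf(1 - alpha / 2)    # ~1.96
z_b = norm.ppf(power)            # ~0.84

n_per_group = ((z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))
               / (p1 - p2) ** 2)
print(round(n_per_group))        # required participants per trial group
```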
Discrete choice experiments of pharmacy services: a systematic review.
Vass, Caroline; Gray, Ewan; Payne, Katherine
2016-06-01
Background Two previous systematic reviews have summarised the application of discrete choice experiments to value preferences for pharmacy services. These reviews identified a total of twelve studies and described how discrete choice experiments have been used to value pharmacy services but did not describe or discuss the application of methods used in the design or analysis. Aims (1) To update the most recent systematic review and critically appraise current discrete choice experiments of pharmacy services in line with published reporting criteria and; (2) To provide an overview of key methodological developments in the design and analysis of discrete choice experiments. Methods The review used a comprehensive strategy to identify eligible studies (published between 1990 and 2015) by searching electronic databases for key terms related to discrete choice and best-worst scaling (BWS) experiments. All healthcare choice experiments were then hand-searched for key terms relating to pharmacy. Data were extracted using a published checklist. Results A total of 17 discrete choice experiments eliciting preferences for pharmacy services were identified for inclusion in the review. No BWS studies were identified. The studies elicited preferences from a variety of populations (pharmacists, patients, students) for a range of pharmacy services. Most studies were from a United Kingdom setting, although examples from Europe, Australia and North America were also identified. Discrete choice experiments for pharmacy services tended to include more attributes than non-pharmacy choice experiments. Few studies reported the use of qualitative research methods in the design and interpretation of the experiments (n = 9) or use of new methods of analysis to identify and quantify preference and scale heterogeneity (n = 4). No studies reported the use of Bayesian methods in their experimental design. Conclusion Incorporating more sophisticated methods in the design of pharmacy-related discrete choice experiments could help researchers produce more efficient experiments which are better suited to valuing complex pharmacy services. Pharmacy-related discrete choice experiments could also benefit from more sophisticated analytical techniques such as investigations into scale and preference heterogeneity. Employing these sophisticated methods for both design and analysis could extend the usefulness of discrete choice experiments to inform health and pharmacy policy.
40 CFR 161.180 - Enforcement analytical method.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Enforcement analytical method. 161.180... DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements § 161.180 Enforcement analytical method. An analytical method suitable for enforcement purposes must be...
Wong, Kin-Yiu; Gao, Jiali
2008-09-09
In this paper, we describe an automated integration-free path-integral (AIF-PI) method, based on Kleinert's variational perturbation (KP) theory, to treat internuclear quantum-statistical effects in molecular systems. We have developed an analytical method to obtain the centroid potential as a function of the variational parameter in the KP theory, which avoids numerical difficulties in path-integral Monte Carlo or molecular dynamics simulations, especially in the zero-temperature limit. Consequently, variational calculations using the KP theory can be carried out efficiently beyond the first order, i.e., beyond the Giachetti-Tognetti-Feynman-Kleinert variational approach, for realistic chemical applications. By making use of the approximation of independent instantaneous normal modes (INM), the AIF-PI method can readily be applied to many-body systems. Previously, we have shown that in the INM approximation, the AIF-PI method is accurate for computing the quantum partition function of a water molecule (3 degrees of freedom) and the quantum correction factor for the collinear H(3) reaction rate (2 degrees of freedom). In this work, the accuracy and properties of the KP theory are further investigated using the first three orders of perturbation on an asymmetric double-well potential, the bond vibrations of H(2), HF, and HCl represented by the Morse potential, and a proton-transfer barrier modeled by the Eckart potential. The zero-point energy, quantum partition function, and tunneling factor for these systems have been determined and are found to be in excellent agreement with the exact quantum results. Using our new analytical results at the zero-temperature limit, we show that the minimum value of the computed centroid potential in the KP theory is in excellent agreement with the ground-state energy (zero-point energy), and that the position of the centroid potential minimum is the expectation value of the particle position in wave mechanics. The fast-convergence property of the KP theory is further examined in comparison with results from the traditional Rayleigh-Ritz variational approach and Rayleigh-Schrödinger perturbation theory in wave mechanics. The present method can be used for thermodynamic and quantum dynamic calculations, including systematically determining the exact value of the zero-point energy and studying kinetic isotope effects for chemical reactions in solution and in enzymes.
Reich, Christian G; Ryan, Patrick B; Schuemie, Martijn J
2013-10-01
A systematic risk identification system has the potential to test marketed drugs for important Health Outcomes of Interest, or HOI. For each HOI, multiple definitions are used in the literature, and some have been validated for certain databases. However, little is known about the effect of different definitions on the ability of methods to estimate their association with medical products. Alternative definitions of HOI were therefore studied for their effect on the performance of analytical methods in observational outcome studies. A set of alternative definitions for three HOI was defined based on literature review and clinical diagnosis guidelines: acute kidney injury, acute liver injury and acute myocardial infarction. The definitions varied by the choice of diagnostic codes and the inclusion of procedure codes and lab values. They were then used to empirically study an array of analytical methods with various analytical choices in four observational healthcare databases. The methods were executed against predefined drug-HOI pairs to generate an effect estimate and standard error for each pair. These test cases included positive controls (active ingredients with evidence to suspect a positive association with the outcome) and negative controls (active ingredients with no evidence to expect an effect on the outcome). Three different performance metrics were used: (i) Area Under the Receiver Operating Characteristic (ROC) curve (AUC) as a measure of a method's ability to distinguish between positive and negative test cases; (ii) a measure of bias, estimated from the distribution of observed effect estimates for the negative test pairs, where the true effect can be assumed to be one (no increased risk); and (iii) Minimal Detectable Relative Risk (MDRR) as a measure of whether there is sufficient power to generate effect estimates. For the three outcomes studied, the different definitions showed comparable ability to differentiate true from false control cases (AUC) and similar estimated bias. However, broader definitions generating larger outcome cohorts allowed more drugs to be studied with sufficient statistical power. Broader definitions are therefore preferred, since they allow studying drugs of lower prevalence than the more precise or narrow definitions while showing comparable performance in differentiating signal from no signal and in effect size estimation.
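Of the three performance metrics, the first two can be computed directly from the effect estimates for the control pairs. The sketch below uses invented log relative-risk estimates and the rank-based identity relating AUC to the Mann-Whitney U statistic.

```python
# Sketch of metrics (i) and (ii) from hypothetical log effect estimates for
# positive and negative control drug-HOI pairs.
import numpy as np
from scipy.stats import mannwhitneyu

log_rr_pos = np.log([1.8, 2.3, 1.4, 3.0, 1.2])   # positive controls (true effects)
log_rr_neg = np.log([1.1, 0.9, 1.3, 0.8, 1.0])   # negative controls (no true effect)

# (i) AUC: probability a positive control scores above a negative control.
u, _ = mannwhitneyu(log_rr_pos, log_rr_neg, alternative="greater")
auc = u / (len(log_rr_pos) * len(log_rr_neg))

# (ii) Bias: mean log estimate on negatives, which should be 0 (RR = 1) if unbiased.
bias = log_rr_neg.mean()
print(auc, bias)
```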
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagler, Peter C.; Tucker, Gregory S.; Fixsen, Dale J.
The detection of the primordial B-mode polarization signal of the cosmic microwave background (CMB) would provide evidence for inflation. Yet as has become increasingly clear, the detection of such a faint signal requires an instrument with both wide frequency coverage to reject foregrounds and excellent control over instrumental systematic effects. Using a polarizing Fourier transform spectrometer (FTS) for CMB observations meets both of these requirements. In this work, we present an analysis of instrumental systematic effects in polarizing FTSs, using the Primordial Inflation Explorer (PIXIE) as a worked example. We analytically solve for the most important systematic effects inherent to the FTS—emissive optical components, misaligned optical components, sampling and phase errors, and spin synchronous effects—and demonstrate that residual systematic error terms after corrections will all be at the sub-nK level, well below the predicted 100 nK B-mode signal.
Influence of ECG measurement accuracy on ECG diagnostic statements.
Zywietz, C; Celikag, D; Joseph, G
1996-01-01
Computer analysis of electrocardiograms (ECGs) provides a large amount of ECG measurement data, which may be used for diagnostic classification and storage in ECG databases. Until now, error limits for ECG measurements have not been specified, nor has their influence on diagnostic statements been systematically investigated. An analytical method is presented to estimate the influence of measurement errors on the accuracy of diagnostic ECG statements. Systematic (offset) errors will usually increase false positive or false negative statements, since they shift the working point on the receiver operating characteristic curve. Measurement error dispersion broadens the distribution function of discriminative measurement parameters and therefore usually increases the overlap between them. This results in a flattening of the receiver operating characteristic curve and an increase of false positive and false negative classifications. The method developed was applied to ECG conduction defect diagnoses using the proposed International Electrotechnical Commission interval measurement tolerance limits. These limits appear too large, because more than 30% of false positive atrial conduction defect statements and 10-18% of false intraventricular conduction defect statements could be expected due to tolerated measurement errors. To assure long-term usability of ECG measurement databases, it is recommended that systems provide their error tolerance limits obtained on a defined test set.
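Under simple Gaussian assumptions, the two error effects described above can be illustrated directly: an offset error shifts the working point, and dispersion broadens the overlap. The parameter values below are placeholders, not the IEC tolerance limits discussed in the paper.

```python
# Effect of offset and dispersion errors on the false-positive rate, assuming a
# Gaussian healthy population and a fixed decision threshold. Values are illustrative.
from scipy.stats import norm

mu_healthy, sd_healthy = 95.0, 10.0   # hypothetical QRS duration in ms, healthy
cutoff = 120.0                        # decision threshold for a conduction defect

def false_positive_rate(offset=0.0, extra_sd=0.0):
    # Measured value = true value + offset + noise with extra dispersion.
    sd = (sd_healthy ** 2 + extra_sd ** 2) ** 0.5
    return norm.sf(cutoff, loc=mu_healthy + offset, scale=sd)

print(false_positive_rate())                 # baseline
print(false_positive_rate(offset=10.0))      # systematic offset -> more false positives
print(false_positive_rate(extra_sd=15.0))    # dispersion -> broader overlap, more FPs
```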
SPIRE: Systematic protein investigative research environment.
Kolker, Eugene; Higdon, Roger; Morgan, Phil; Sedensky, Margaret; Welch, Dean; Bauman, Andrew; Stewart, Elizabeth; Haynes, Winston; Broomall, William; Kolker, Natali
2011-12-10
The SPIRE (Systematic Protein Investigative Research Environment) provides web-based experiment-specific mass spectrometry (MS) proteomics analysis (https://www.proteinspire.org). Its emphasis is on usability and integration of the best analytic tools. SPIRE provides an easy-to-use web interface and generates results in both interactive and simple data formats. In contrast to run-based approaches, SPIRE conducts the analysis based on the experimental design. It employs novel methods to generate false discovery rates and local false discovery rates (FDR, LFDR) and integrates the best and complementary open-source search and data analysis methods. The SPIRE approach of integrating X!Tandem, OMSSA and SpectraST can produce an increase in protein IDs (52-88%) over current combinations of scoring and single search engines while also providing accurate multi-faceted error estimation. One of SPIRE's primary assets is combining the results with data on protein function, pathways and protein expression from model organisms. We demonstrate some of SPIRE's capabilities by analyzing mitochondrial proteins from the wild type and 3 mutants of C. elegans. SPIRE also connects results to publicly available proteomics data through its Model Organism Protein Expression Database (MOPED). SPIRE can also provide analysis and annotation for user-supplied protein ID and expression data. Copyright © 2011. Published by Elsevier B.V.
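SPIRE's FDR/LFDR machinery is more elaborate than can be shown here, but the flavor of search-result error estimation can be conveyed with the generic Benjamini-Hochberg procedure; this is an illustrative stand-in, not SPIRE's actual algorithm.

```python
# Generic Benjamini-Hochberg q-values: q[i] is the smallest FDR at which
# hypothesis i would be rejected. Example p-values are invented.
import numpy as np

def bh_qvalues(pvals):
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    order = np.argsort(p)
    q = np.empty(n)
    running_min = 1.0
    for rank_from_end, i in enumerate(order[::-1]):
        rank = n - rank_from_end          # 1-based rank of p[i] among sorted p-values
        running_min = min(running_min, p[i] * n / rank)
        q[i] = running_min
    return q

print(bh_qvalues([0.001, 0.008, 0.039, 0.041, 0.27]))
```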
40 CFR 158.355 - Enforcement analytical method.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Enforcement analytical method. 158.355... DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An analytical method suitable for enforcement purposes must be provided for each active ingredient in the...
Psychosocial factors at work, long work hours, and obesity: a systematic review.
Solovieva, Svetlana; Lallukka, Tea; Virtanen, Marianna; Viikari-Juntura, Eira
2013-05-01
Associations between the psychosocial work environment and excess weight have not been systematically addressed. The aim of this systematic review was to summarize the published evidence for the associations of psychosocial factors at work and long work hours with weight-related outcomes. Methods We conducted a search of Medline and Embase for all original articles published up to September 2012 using predefined keywords. After excluding studies with a definite selection bias, we included 39 articles. About 60% of the studies reported at least one positive association between psychosocial factors at work and a weight-related outcome. However, 76% of the tested associations were found to be non-significant. Furthermore, the associations were rather weak. Studies of higher quality tended to observe associations more often than those of lower quality. Positive associations were found more frequently (i) among women versus men, (ii) in cross-sectional versus longitudinal studies, and (iii) for overweight or obesity versus other outcomes. About 70% of the studies reported positive associations between long work hours and weight-related outcomes. All four studies that evaluated the association between working overtime and weight gain (three longitudinal and one cross-sectional) showed a positive association among men, and two of them also observed associations among women. We found evidence for weak associations between psychosocial factors at work and excess weight. Associations were observed between long work hours, working overtime, and weight gain, especially among men. More cohort studies among participants who are non-obese at baseline, using appropriate analytical methods based on an elaborated hypothetical model, are needed.
Analysis and Purification of Bioactive Natural Products: The AnaPurNa Study
2012-01-01
Based on a meta-analysis of data mined from almost 2000 publications on bioactive natural products (NPs) from >80 000 pages of 13 different journals published in 1998–1999, 2004–2005, and 2009–2010, the aim of this systematic review is to provide both a survey of the status quo and a perspective for analytical methodology used for isolation and purity assessment of bioactive NPs. The study provides numerical measures of the common means of sourcing NPs, the chromatographic methodology employed for NP purification, and the role of spectroscopy and purity assessment in NP characterization. A link is proposed between the observed use of various analytical methodologies, the challenges posed by the complexity of metabolomes, and the inescapable residual complexity of purified NPs and their biological assessment. The data provide inspiration for the development of innovative methods for NP analysis as a means of advancing the role of naturally occurring compounds as a viable source of biologically active agents with relevance for human health and global benefit. PMID:22620854
Quantitative analysis of time-resolved microwave conductivity data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reid, Obadiah G.; Moore, David T.; Li, Zhen
2017-11-10
Flash-photolysis time-resolved microwave conductivity (fp-TRMC) is a versatile, highly sensitive technique for studying the complex photoconductivity of solution, solid, and gas-phase samples. The purpose of this paper is to provide a standard reference work for experimentalists interested in using microwave conductivity methods to study functional electronic materials, describing how to conduct and calibrate these experiments in order to obtain quantitative results. The main focus of the paper is on calculating the calibration factor, K, which is used to connect the measured change in microwave power absorption to the conductance of the sample. We describe the standard analytical formulae that have been used in the past, and compare them to numerical simulations. This comparison shows that the most widely used analytical treatment of fp-TRMC data systematically underestimates the transient conductivity by ~60%. We suggest a more accurate semi-empirical way of calibrating these experiments. However, we emphasize that the full numerical calculation is necessary to quantify both transient and steady-state conductance for arbitrary sample properties and geometry.
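The role of the calibration factor K can be sketched under the commonly used small-perturbation relation ΔP/P = -K·ΔG; the conversion to a yield-mobility product and all numerical values below are assumptions of this sketch, not the paper's calibration.

```python
# Sketch of how K enters the analysis: fractional microwave power change maps to
# a transient photoconductance, then to a yield-mobility product. All parameter
# values are placeholders.
e = 1.602e-19        # elementary charge, C
K = 2.0e4            # calibration factor (cavity- and sample-specific), S^-1
beta = 2.2           # waveguide geometry factor (assumed)
I0 = 1.0e14          # incident photon flux per pulse, photons cm^-2
F_A = 0.35           # fraction of photons absorbed by the sample

dP_over_P = -1.5e-4  # measured peak fractional change in microwave power

dG = -dP_over_P / K                      # transient photoconductance, S
phi_sum_mu = dG / (beta * e * I0 * F_A)  # yield x sum of carrier mobilities, cm^2/Vs
print(dG, phi_sum_mu)
```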
NASA Astrophysics Data System (ADS)
Fernández-Ruiz, Ramón; Friedrich K., E. Josue; Redrejo, M. J.
2018-02-01
The main goal of this work was to investigate, in a systematic way, the influence of controlled modulation of the particle size distribution of a representative solid sample on the most relevant analytical parameters of the Direct Solid Analysis (DSA) by Total-reflection X-Ray Fluorescence (TXRF) quantitative method. In particular, accuracy, uncertainty, linearity and detection limits were correlated with the main parameters of the size distributions for the following elements: Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Rb, Sr, Ba and Pb. Strong correlations were found in all cases. The main conclusion of this work can be summarized as follows: modulating the particles toward smaller average sizes, together with minimizing the width of the particle size distribution, strongly improves accuracy and minimizes uncertainties and detection limits for the DSA-TXRF methodology. These achievements support the future use of the DSA-TXRF analytical methodology for the development of ISO norms and standardized protocols for the direct analysis of solids by means of TXRF.
PACS—Realization of an adaptive concept using pressure actuated cellular structures
NASA Astrophysics Data System (ADS)
Gramüller, B.; Boblenz, J.; Hühne, C.
2014-10-01
A biologically inspired concept is investigated which can be utilized to develop energy-efficient, lightweight and flexibly applicable adaptive structures. Building a real-life morphing unit is an ambitious task, as the numerous works in this field show. Summarizing fundamental demands and barriers regarding shape-changing structures, the basic challenges of designing morphing structures are listed. The concept of Pressure Actuated Cellular Structures (PACS) is placed within recent morphing activities, and it is shown that it complies with the underlying demands. Systematically divided into energy-related and structural subcomponents, the working principle is illuminated and relationships between basic design parameters are expressed. The analytical background describing the physical mechanisms of PACS is presented in concentrated form. This work focuses on the procedure of dimensioning, realizing and experimentally testing a single cell and a single-row cantilever made of PACS. The experimental outcomes as well as the results from the FEM computations are used to evaluate the analytical methods. The functionality of the basic principle is thus validated and open issues are identified, pointing the way ahead.
NASA Astrophysics Data System (ADS)
de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.
2010-10-01
In the [eV;MeV] energy range, modelling of neutron-induced reactions is based on parameterized nuclear reaction models. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation, and nuclear reactor physicists have pressed for major advances so that proper uncertainties can be assigned for use in applications. In this paper, mathematical methods developed in the CONRAD code [2] are presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and their propagation to nuclear reaction model parameters or cross sections. The marginalization procedure is presented using analytical or Monte Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account early enough in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to take this kind of information into account properly.
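A toy version of the Monte Carlo marginalization route, with a linear model standing in for a nuclear reaction model and a 1% normalization error as the systematic uncertainty; both are assumptions of this sketch, not of CONRAD.

```python
# Monte Carlo marginalization sketch: refit the model parameters while drawing the
# systematic (nuisance) parameter from its prior, then take the spread of refitted
# values as the marginalized parameter covariance. Toy linear model only.
import numpy as np

rng = np.random.default_rng(0)
E = np.linspace(1.0, 10.0, 40)                             # "energy" grid
true_a, true_b = 2.0, 0.5
data = true_a + true_b * E + rng.normal(0, 0.05, E.size)   # statistical noise

samples = []
for _ in range(2000):
    # Draw a systematic normalization error (e.g. 1% calibration uncertainty).
    norm_shift = rng.normal(0.0, 0.01)
    shifted = data * (1.0 + norm_shift)
    # Refit the model parameters by least squares for this systematic draw.
    A = np.vstack([np.ones_like(E), E]).T
    samples.append(np.linalg.lstsq(A, shifted, rcond=None)[0])

cov = np.cov(np.array(samples).T)   # marginalized covariance of (a, b)
print(cov)
```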
Melt inclusions come of age: Volatiles, volcanoes, and sorby's legacy
Lowenstern, J. B.
2003-01-01
Despite nearly forty years of modern research on silicate melt inclusions (MI), only within the past 10-15 years have volcanologists and petrologists come to regularly accept their utility for characterizing magmatic systems. Their relatively slow acceptance was likely due to a number of factors, including: 1) lack of reliable analytical techniques; 2) concern that MI represent anomalous boundary-layer melts or are altered by leakage or post-entrapment crystallization; 3) data sets indicative of heterogeneous melts; and 4) homogenization temperatures greater than those calculated by other techniques. With improvements in analytical methods and careful studies of MI systematics, workers are increasingly convinced of the utility of these features to unravel the complexities of volcanic systems: melt inclusions have "come of age." Recent studies provide compelling evidence for the compositions of dissolved and exsolved volatiles in magma reservoirs. Evidence for immiscibility of gases, hydrosaline brines and pegmatitic fluids demonstrates that magmatic phase relations are often more complicated than can be inferred by inspection of crystalline phases alone. © 2003 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szoka de Valladares, M.R.; Mack, S.
The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives, and it will assist in the ranking of individual projects based on the extent to which each contributes to those objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria; the H Scan; and the Analytic Hierarchy Process (AHP).
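The AHP step of turning pairwise comparisons into priority weights can be sketched compactly; the three-criterion comparison matrix below is hypothetical and is not drawn from the DOE evaluation.

```python
# Minimal AHP sketch: derive priority weights from a reciprocal pairwise-comparison
# matrix via its principal eigenvector, with the standard consistency index.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j (Saaty scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # priority weights, summing to 1

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)   # consistency index; ~0 means consistent
print(w, CI)
```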
Zechmeister-Koss, Ingrid; Schnell-Inderst, Petra; Zauner, Günther
2014-04-01
An increasing number of evidence sources are relevant for populating decision analytic models. What is needed is detailed methodological advice on which type of data is to be used for what type of model parameter. We aim to identify standards in health technology assessment manuals and economic (modeling) guidelines on appropriate evidence sources and on the role different types of data play within a model. Documents were identified via a call among members of the International Network of Agencies for Health Technology Assessment and by hand search. We included documents from Europe, the United States, Canada, Australia, and New Zealand as well as transnational guidelines written in English or German. We systematically summarized in a narrative manner information on appropriate evidence sources for model parameters, their advantages and limitations, data identification methods, and data quality issues. A large variety of evidence sources for populating models are mentioned in the 28 documents included. They comprise research- and non-research-based sources. Valid and less appropriate sources are identified for informing different types of model parameters, such as clinical effect size, natural history of disease, resource use, unit costs, and health state utility values. Guidelines do not provide structured and detailed advice on this issue. The article does not include information from guidelines in languages other than English or German, and the information is not tailored to specific modeling techniques. The usability of guidelines and manuals for modeling could be improved by addressing the issue of evidence sources in a more structured and comprehensive format.
Robertshaw, Luke; Dhesi, Surindar
2017-01-01
Objectives To thematically synthesise primary qualitative studies that explore challenges and facilitators for health professionals providing primary healthcare for refugees and asylum seekers in high-income countries. Design Systematic review and qualitative thematic synthesis. Methods Searches of MEDLINE, EMBASE, PsycINFO, CINAHL and Web of Science. Search terms were combined for qualitative research, primary healthcare professionals, refugees and asylum seekers, and were supplemented by searches of reference lists and citations. Study selection was conducted by two researchers using prespecified selection criteria. Data extraction and quality assessment using the Critical Appraisal Skills Programme tool was conducted by the first author. A thematic synthesis was undertaken to develop descriptive themes and analytical constructs. Results Twenty-six articles reporting on 21 studies and involving 357 participants were included. Eleven descriptive themes were interpreted, embedded within three analytical constructs: healthcare encounter (trusting relationship, communication, cultural understanding, health and social conditions, time); healthcare system (training and guidance, professional support, connecting with other services, organisation, resources and capacity); asylum and resettlement. Challenges and facilitators were described within these themes. Conclusions A range of challenges and facilitators have been identified for health professionals providing primary healthcare for refugees and asylum seekers that are experienced in the dimensions of the healthcare encounter, the healthcare system and wider asylum and resettlement situation. Comprehensive understanding of these challenges and facilitators is important to shape policy, improve the quality of services and provide more equitable health services for this vulnerable group. PMID:28780549
2011-01-01
Background In the past years, cumulative evidence has convincingly demonstrated that the work environment is a critical determinant of workers' mental health. Nevertheless, much less attention has been dedicated towards understanding the pathways through which other pivotal life environments might also concomitantly intervene, along with the work environment, to bring about mental health outcomes in the workforce. The aim of this study consisted in conducting a systematic review examining the relative contribution of non-work determinants to the prediction of workers' mental health in order to bridge that gap in knowledge. Methods We searched electronic databases and bibliographies up to 2008 for observational longitudinal studies jointly investigating work and non-work determinants of workers' mental health. A narrative synthesis (MOOSE) was performed to synthesize data and provide an assessment of study conceptual and methodological quality. Results Thirteen studies were selected for evaluation. Seven of these were of relatively high methodological quality. Assessment of study conceptual quality yielded modest analytical breadth and depth in the ways studies conceptualized the non-work domain as defined by family, network and community/society-level indicators. We found evidence of moderate strength supporting a causal association between social support from the networks and workers' mental health, but insufficient evidence of specific indicator involvement for other analytical levels considered (i.e., family, community/society). Conclusions Largely underinvestigated, non-work determinants are important to the prediction of workers' mental health. More longitudinal studies concomitantly investigating work and non-work determinants of workers' mental health are warranted to better inform healthy workplace research, intervention, and policy. PMID:21645393
Directly comparing gravitational wave data to numerical relativity simulations: systematics
NASA Astrophysics Data System (ADS)
Lange, Jacob; O'Shaughnessy, Richard; Healy, James; Lousto, Carlos; Zlochower, Yosef; Shoemaker, Deirdre; Lovelace, Geoffrey; Pankow, Christopher; Brady, Patrick; Scheel, Mark; Pfeiffer, Harald; Ossokine, Serguei
2017-01-01
We compare synthetic data directly to complete numerical relativity simulations of binary black holes. In doing so, we circumvent ad hoc approximations introduced in semi-analytical models previously used in gravitational wave parameter estimation and compare the data against the most accurate waveforms, including higher modes. In this talk, we focus on the synthetic studies that test potential sources of systematic errors. We also run "end-to-end" studies of intrinsically different synthetic sources to show we can recover parameters for different systems.
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...
Visual Analytics of integrated Data Systems for Space Weather Purposes
NASA Astrophysics Data System (ADS)
Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo
Analysis of information from multiple data sources obtained through high-resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and measurement extension, is key for studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time series generalization introducing the concept of the generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this representation, each generalized numerical lattice carries post-analytical data information. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size and a post-analytical measure (e.g., autocorrelation, Hurst exponent, etc.) [1]. From this generalization, any multi-source database can be reduced to a closed set of classified time series in generalized spatiotemporal dimensions. As a case study, we show a preliminary application to space science data, highlighting the possibility of a real-time analysis expert system. In this application, we have selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. DFA is performed in the framework of an automatic radio burst monitoring system. Our results may characterize the evolution of the variability pattern by computing the DFA scaling exponent over a short window scanning the time series before the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented. The prototype for visual analytics is implemented in the Compute Unified Device Architecture (CUDA), using Nvidia K20 graphics processing units (GPUs) to reduce the integrated analysis runtime. [1] Veronese et al., doi:10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al., doi:10.1016/j.jastp.2010.09.030, 2011.
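For reference, the DFA computation at the core of such a monitoring system can be sketched in a few lines (order-1 detrending); the window sizes and white-noise test signal are arbitrary choices of this sketch.

```python
# Compact DFA sketch: the scaling exponent is the slope of log F(n) vs log n.
import numpy as np

def dfa_exponent(x, window_sizes):
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in window_sizes:
        n_windows = len(y) // n
        sq = []
        for w in range(n_windows):
            seg = y[w * n:(w + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            sq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq)))       # RMS fluctuation at scale n
    slope, _ = np.polyfit(np.log(window_sizes), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
print(dfa_exponent(rng.normal(size=4096), [16, 32, 64, 128, 256]))  # ~0.5 for white noise
```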
NASA Technical Reports Server (NTRS)
Dubinskiy, Mark A.; Kamal, Mohammed M.; Misra, Prabhaker
1995-01-01
The availability of manned laboratory facilities in space offers wonderful opportunities and challenges in microgravity combustion science and technology. In turn, the fundamentals of microgravity combustion science can be studied via spectroscopic characterization of free radicals generated in flames. The laser-induced fluorescence (LIF) technique is a noninvasive method of considerable utility in combustion physics and chemistry, suitable not only for monitoring specific species and their kinetics but also for imaging of flames. This makes LIF one of the most important tools for microgravity combustion science. Flame characterization under microgravity conditions using LIF is expected to be more informative than other methods aimed at searching for effects like the pumping phenomenon, which can be modeled via ground-level experiments. A primary goal of our work was to devise an innovative LIF-based analytical unit suitable for in-space flame characterization. We followed two approaches in tandem: (1) use the existing laboratory (non-portable) equipment to determine the optimal set of flame parameters that can serve as analytical criteria for flame characterization under microgravity conditions; and (2) use state-of-the-art developments in laser technology to devise a layout for the portable analytical equipment. This paper presents an up-to-date summary of the results of our experiments aimed at creating a portable device for combustion studies in a microgravity environment, based on a portable UV tunable solid-state laser for excitation of the free radicals normally present in flames in detectable amounts. A systematic approach has allowed us to make a convenient choice of the species under investigation as well as the proper tunable laser system, and has enabled us to carry out LIF experiments on free radicals using a solid-state laser tunable in the UV.
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
A simple method for identifying parameter correlations in partially observed linear dynamic models.
Li, Pu; Vu, Quoc Dong
2015-12-14
Parameter estimation represents one of the most significant challenges in systems biology. This is because biological models commonly contain a large number of parameters among which there may be functional interrelationships, leading to the problem of non-identifiability. Although identifiability analysis has been extensively studied by analytical as well as numerical approaches, systematic methods for remedying practically non-identifiable models have rarely been investigated. We propose a simple method for identifying pairwise correlations and higher-order interrelationships of parameters in partially observed linear dynamic models. This is done by deriving the output sensitivity matrix and analyzing the linear dependencies of its columns. Consequently, analytical relations between the identifiability of the model parameters and the initial conditions as well as the input functions can be achieved. In the case of structural non-identifiability, identifiable combinations can be obtained by solving the resulting homogeneous linear equations. In the case of practical non-identifiability, experimental conditions (i.e., initial conditions and constant control signals) can be provided which are necessary for remedying the non-identifiability and unique parameter estimation. It is noted that the approach does not consider noisy data. In this way, the practical non-identifiability issue, which is common in linear biological models, can be remedied. Several linear compartment models, including an insulin receptor dynamics model, are used to illustrate the application of the proposed approach. Both structural and practical identifiability of partially observed linear dynamic models can be clarified by the proposed method. The result of this method provides important information for experimental design to remedy the practical non-identifiability where applicable. The derivation of the method is straightforward, and thus the algorithm can be easily implemented into a software package.
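The core idea, detecting linear dependencies among the columns of the output sensitivity matrix, can be sketched numerically as below. This sketch assumes a user-supplied simulator `sim(p)` and uses finite differences rather than the paper's analytical derivation; names and tolerances are illustrative.

```python
import numpy as np

def output_sensitivity(sim, p0, eps=1e-6):
    """Finite-difference sensitivity of observed outputs w.r.t. parameters.

    sim(p) must return the model outputs at the observation times as an array.
    Returns S with one column per parameter (rows stacked over time/outputs).
    """
    y0 = np.asarray(sim(p0)).ravel()
    cols = []
    for j in range(len(p0)):
        p = np.array(p0, dtype=float)
        h = eps * max(1.0, abs(p[j]))
        p[j] += h
        cols.append((np.asarray(sim(p)).ravel() - y0) / h)
    return np.column_stack(cols)

def dependent_directions(S, tol=1e-8):
    """Right singular vectors with (near-)zero singular values span the
    parameter combinations that the observed outputs cannot distinguish."""
    _, sing, vt = np.linalg.svd(S, full_matrices=False)
    return [vt[k] for k, s in enumerate(sing) if s < tol * sing[0]]
```

An empty result suggests local identifiability under the chosen experiment; each returned vector describes a correlated parameter combination that a changed initial condition or input may break.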
Amukele, Timothy K; Sokoll, Lori J; Pepper, Daniel; Howard, Dana P; Street, Jeff
2015-01-01
Unmanned Aerial Systems (UAS, or drones) could potentially be used for the routine transport of small goods such as diagnostic clinical laboratory specimens. To the best of our knowledge, there is no published study of the impact of UAS transportation on laboratory tests. Three paired samples were obtained from each of 56 adult volunteers in a single phlebotomy event (336 samples total): two tubes each for chemistry, hematology, and coagulation testing respectively. 168 samples were driven to the flight field and held stationary. The other 168 samples were flown in the UAS for a range of times, from 6 to 38 minutes. After the flight, 33 of the most common chemistry, hematology, and coagulation tests were performed. Statistical methods as well as performance criteria from four distinct clinical, academic, and regulatory bodies were used to evaluate the results. Results from flown and stationary sample pairs were similar for all 33 analytes. Bias and intercepts were <10% and <13%, respectively, for all analytes. Bland-Altman comparisons showed a mean difference of 3.2% for glucose and <1% for the other analytes. Only bicarbonate did not meet the strictest (Royal College of Pathologists of Australasia Quality Assurance Program) performance criteria; this was due to poor precision rather than bias. There were no systematic differences between laboratory-derived (analytic) CVs and the CVs of our flown versus terrestrial sample pairs; however, CVs from the sample pairs tended to be slightly higher than analytic CVs. The overall concordance, based on clinical stratification (normal versus abnormal), was 97%. Length of flight had no impact on the results. Transportation of laboratory specimens via small UASs does not affect the accuracy of routine chemistry, hematology, and coagulation test results from the same samples. However, it results in slightly poorer precision for some analytes.
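The two paired-sample statistics reported above, Bland-Altman percent differences and normal-versus-abnormal concordance, are easy to reproduce generically. This is a sketch of those calculations, not the authors' analysis scripts; the reference-interval bounds are inputs the user must supply.

```python
import numpy as np

def bland_altman_pct(flown, stationary):
    """Percent differences of paired results and 95% limits of agreement."""
    flown, stationary = np.asarray(flown, float), np.asarray(stationary, float)
    diff_pct = 100.0 * (flown - stationary) / ((flown + stationary) / 2.0)
    bias = diff_pct.mean()
    sd = diff_pct.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def concordance(flown, stationary, low, high):
    """Fraction of pairs agreeing on normal-versus-abnormal classification,
    given a reference interval [low, high] for the analyte."""
    f_ab = (np.asarray(flown) < low) | (np.asarray(flown) > high)
    s_ab = (np.asarray(stationary) < low) | (np.asarray(stationary) > high)
    return float(np.mean(f_ab == s_ab))
```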
Evidence-based practice: extending the search to find material for the systematic review
Helmer, Diane; Savoie, Isabelle; Green, Carolyn; Kazanjian, Arminée
2001-01-01
Background: Cochrane-style systematic reviews increasingly require the participation of librarians. Guidelines on the appropriate search strategy to use for systematic reviews have been proposed. However, research evidence supporting these recommendations is limited. Objective: This study investigates the effectiveness of various systematic search methods used to uncover randomized controlled trials (RCTs) for systematic reviews. Effectiveness is defined as the proportion of relevant material uncovered for the systematic review using extended systematic review search methods. The following extended systematic search methods are evaluated: searching subject-specific or specialized databases (including trial registries), hand searching, scanning reference lists, and communicating personally. Methods: Two systematic review projects were prospectively monitored regarding the method used to identify items as well as the type of items retrieved. The proportion of RCTs identified by each systematic search method was calculated. Results: The extended systematic search methods uncovered 29.2% of all items retrieved for the systematic reviews. The search of specialized databases was the most effective method, followed by scanning of reference lists, communicating personally, and hand searching. Although the number of items identified through hand searching was small, these unique items would otherwise have been missed. Conclusions: Extended systematic search methods are effective tools for uncovering material for the systematic review. The quality of the items uncovered has yet to be assessed and will be key in evaluating the value of the systematic search methods. PMID:11837256
7 CFR 94.4 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.4 Section 94.4 Agriculture... POULTRY AND EGG PRODUCTS Mandatory Analyses of Egg Products § 94.4 Analytical methods. The majority of analytical methods used by the USDA laboratories to perform mandatory analyses for egg products are listed as...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra
2018-02-01
The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. To date, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), and matrix effects and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
Systematic strategies for the third industrial accident prevention plan in Korea.
Kang, Young-sig; Yang, Sung-hwan; Kim, Tae-gu; Kim, Day-sung
2012-01-01
To minimize industrial accidents, it is critical to evaluate a firm's priorities for prevention factors and strategies, since such evaluation provides decisive information for preventing industrial accidents and maintaining safety management. Therefore, this paper proposes the evaluation of priorities through statistical testing of prevention factors with a cause analysis in a cause-and-effect model. A priority matrix criterion is proposed to apply the ranking and to ensure the objectivity of the questionnaire results. This paper used the regression analysis (RA) method, the exponential smoothing method (ESM), the double exponential smoothing method (DESM), the autoregressive integrated moving average (ARIMA) model and a proposed analytical function method (PAFM) to analyze trends in accident data and enable accurate prediction. This paper standardized the questionnaire results of workers and managers in manufacturing and construction companies with fewer than 300 employees, located in the central Korean metropolitan areas where fatal accidents have occurred. Finally, a strategy was provided to construct safety management for the third industrial accident prevention plan, together with a forecasting method for occupational accident rates and fatality rates per 10,000 workers.
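Of the trend methods listed, double exponential smoothing is the simplest to illustrate. Below is a minimal Holt-style sketch with invented accident-rate numbers and arbitrary smoothing constants; it is not the paper's PAFM, nor its fitted parameters.

```python
import numpy as np

def holt_forecast(y, alpha=0.5, beta=0.3, horizon=3):
    """Double (Holt) exponential smoothing forecast of a rate series."""
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + k * trend for k in range(1, horizon + 1)]

# Illustrative annual accident rates per 10,000 workers (invented numbers)
rates = np.array([68.2, 65.9, 64.1, 62.8, 60.5])
print(holt_forecast(rates))   # three-year-ahead extrapolation
```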
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
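The AHP step can be sketched compactly: priorities are the normalized principal eigenvector of a reciprocal pairwise comparison matrix, with a consistency index as a sanity check. The matrix values below are hypothetical, not the criteria weights from the study.

```python
import numpy as np

def ahp_priorities(A):
    """Priority weights = normalized principal eigenvector of a reciprocal
    pairwise comparison matrix; also returns Saaty's consistency index."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)     # CI = (lambda_max - n) / (n - 1)
    return w, ci

# Hypothetical comparison of three framework criteria (not the study's data)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, ci = ahp_priorities(A)
print(weights, ci)
```

A consistency index near zero indicates the pairwise judgments are coherent enough for the derived weights to be meaningful.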
Tools for Systematic Identification of Cross-Disciplinary Research Relevant to Exploration Missions
NASA Astrophysics Data System (ADS)
Shelhamer, M.; Mindock, J. A.
2018-02-01
A Contributing Factor Map and text analytics on articles and reports can identify connections between major factors contributing to health and performance on Deep Space Gateway missions. Connections suggest experiment complements to maximize use of these flights.
Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol
Klonoff, David C.; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A.; Arreaza-Rubin, Guillermo; Burk, Robert D.; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B.; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W.
2015-01-01
Background: Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. Methods: The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. Results: A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled “Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program,” is attached as supplementary material. Conclusion: This program is needed because currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket surveillance program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. PMID:26481642
Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn
2015-09-01
With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development and guidance for the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meeting during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. A critical appraisal tool with 10 questions to help assess risk of bias in systematic reviews and meta-analyses was also developed and tested. Relevant details to extract from included reviews and how to best present the findings of both quantitative and qualitative systematic reviews in a reader friendly format are provided. Umbrella reviews provide a ready means for decision makers in healthcare to gain a clear understanding of a broad topic area. The umbrella review methodology described here is the first to consider reviews that report other than quantitative evidence derived from randomized controlled trials. The methodology includes an easy to use and informative summary of evidence table to readily provide decision makers with the available, highest level of evidence relevant to the question posed.
A Method for Analyzing Commonalities in Clinical Trial Target Populations
He, Zhe; Carini, Simona; Hao, Tianyong; Sim, Ida; Weng, Chunhua
2014-01-01
ClinicalTrials.gov presents great opportunities for analyzing commonalities in clinical trial target populations to facilitate knowledge reuse when designing eligibility criteria of future trials or to reveal potential systematic biases in selecting population subgroups for clinical research. Towards this goal, this paper presents a novel data resource for enabling such analyses. Our method includes two parts: (1) parsing and indexing eligibility criteria text; and (2) mining common eligibility features and attributes of common numeric features (e.g., A1c). We designed and built a database called “Commonalities in Target Populations of Clinical Trials” (COMPACT), which stores structured eligibility criteria and trial metadata in a readily computable format. We illustrate its use in an example analytic module called CONECT using COMPACT as the backend. Type 2 diabetes is used as an example to analyze commonalities in the target populations of 4,493 clinical trials on this disease. PMID:25954450
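Part (1) of the method, parsing numeric eligibility features such as A1c from criteria text, can be illustrated with a small regular-expression sketch. The pattern below is a deliberate simplification for illustration, not the COMPACT parser.

```python
import re

# Hypothetical simplification: capture "A1c"/"HbA1c" constraints such as
# "HbA1c >= 7.0%" or "A1c between 7 and 10.5%".
A1C = re.compile(
    r"\b(?:Hb)?A1c\s*(?:(<=|>=|<|>|=)\s*(\d+(?:\.\d+)?)"
    r"|between\s+(\d+(?:\.\d+)?)\s+and\s+(\d+(?:\.\d+)?))\s*%?",
    re.IGNORECASE)

def extract_a1c(criteria_text):
    """Return (operator, value) or ('between', (lo, hi)) tuples found in text."""
    out = []
    for m in A1C.finditer(criteria_text):
        if m.group(1):
            out.append((m.group(1), float(m.group(2))))
        else:
            out.append(("between", (float(m.group(3)), float(m.group(4)))))
    return out

print(extract_a1c("Inclusion: HbA1c between 7.0 and 10.5%. Exclusion: A1c > 11%"))
# [('between', (7.0, 10.5)), ('>', 11.0)]
```

Aggregating such tuples across many trials yields the distributions of common numeric feature attributes that the paper mines.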
Seeking maximum linearity of transfer functions
NASA Astrophysics Data System (ADS)
Silva, Filipi N.; Comin, Cesar H.; Costa, Luciano da F.
2016-12-01
Linearity is an important and frequently sought property in electronics and instrumentation. Here, we report a method capable of identifying, for a given transfer function (theoretical or derived from some real system), the most linear region of operation with a fixed width. This methodology, which is based on least squares regression and systematic consideration of all possible regions, has been illustrated with respect to both an analytical case (a sigmoid transfer function) and a simple situation involving experimental data from a low-power, one-stage class A transistor current amplifier. Such an approach, which has also been applied to transfer functions derived from experimentally obtained characteristic surfaces, yielded further contributions such as the estimation of local constants of the device, as opposed to the typically considered average values. The reported method and results pave the way to several further applications in other types of devices and systems, intelligent control operation, and other areas such as identifying regions of power-law behavior.
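The method as described, least squares fits over all fixed-width windows with the best-fitting window declared most linear, translates directly into code. A minimal sketch follows, exercised on a sigmoid as in the paper's analytical example; variable names and the residual criterion are my own choices.

```python
import numpy as np

def most_linear_region(x, y, width):
    """Fit a least squares line in every fixed-width window and return the
    window with the smallest residual sum of squares (the most linear one)."""
    best = None
    for i in range(len(x) - width + 1):
        xs, ys = x[i:i + width], y[i:i + width]
        coef = np.polyfit(xs, ys, 1)
        rss = np.sum((ys - np.polyval(coef, xs)) ** 2)
        if best is None or rss < best[0]:
            best = (rss, i, coef)
    rss, i, (slope, intercept) = best
    return x[i], x[i + width - 1], slope, intercept

# Sigmoid transfer function, echoing the paper's analytical illustration
v_in = np.linspace(-6.0, 6.0, 601)
v_out = 1.0 / (1.0 + np.exp(-v_in))
print(most_linear_region(v_in, v_out, width=101))  # centered near v_in = 0
```

The local slope returned for the winning window is an example of the "local constants" mentioned above, as opposed to an average over the full operating range.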
Simulation of radiation damping in rings, using stepwise ray-tracing methods
Meot, F.
2015-06-26
The ray-tracing code Zgoubi computes particle trajectories in arbitrary magnetic and/or electric field maps or analytical field models. It includes a built-in fitting procedure, spin tracking, and many Monte Carlo processes. The accuracy of the integration method makes it an efficient tool for multi-turn tracking in periodic machines. Energy loss by synchrotron radiation, based on Monte Carlo techniques, had been introduced in Zgoubi in the early 2000s for studies regarding the linear collider beam delivery system. However, only recently has this Monte Carlo tool been used for systematic beam dynamics and spin diffusion studies in rings, including the eRHIC electron-ion collider project at Brookhaven National Laboratory. Some beam dynamics aspects of this recent use of Zgoubi capabilities, including considerations of accuracy as well as further benchmarking in the presence of synchrotron radiation in rings, are reported here.
Gain weighted eigenspace assignment
NASA Technical Reports Server (NTRS)
Davidson, John B.; Andrisani, Dominick, II
1994-01-01
This report presents the development of the gain weighted eigenspace assignment methodology. This provides a designer with a systematic methodology for trading off eigenvector placement versus gain magnitudes, while still maintaining desired closed-loop eigenvalue locations. This is accomplished by forming a cost function composed of a scalar measure of error between desired and achievable eigenvectors and a scalar measure of gain magnitude, determining analytical expressions for the gradients, and solving for the optimal solution by numerical iteration. For this development the scalar measure of gain magnitude is chosen to be a weighted sum of the squares of all the individual elements of the feedback gain matrix. An example is presented to demonstrate the method. In this example, solutions yielding achievable eigenvectors close to the desired eigenvectors are obtained with significant reductions in gain magnitude compared to a solution obtained using a previously developed eigenspace (eigenstructure) assignment method.
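A minimal sketch of the scalar trade-off being minimized (eigenvector placement error plus a weighted sum of squared gain elements) is given below. It only evaluates the cost for a candidate gain matrix rather than performing the report's gradient-based iteration, and the mode-matching rule and error metric are my own assumptions.

```python
import numpy as np

def gw_cost(K, A, B, lam_desired, v_desired, rho=1e-3):
    """Trade-off cost: eigenvector placement error plus a weighted sum of
    squared feedback gain elements, for closed-loop matrix A - B @ K."""
    vals, vecs = np.linalg.eig(A - B @ K)
    err = 0.0
    for lam_d, v_d in zip(lam_desired, v_desired):
        k = np.argmin(np.abs(vals - lam_d))       # nearest achieved mode
        v = vecs[:, k] / np.linalg.norm(vecs[:, k])
        v_d = v_d / np.linalg.norm(v_d)
        # residual of projecting the desired vector onto the achieved one
        err += np.linalg.norm(v_d - v * np.vdot(v, v_d)) ** 2
    return err + rho * np.sum(np.abs(K) ** 2)     # rho trades gain vs. shaping
```

Sweeping `rho` exposes the trade-off the report describes: larger values shrink the gains at the cost of eigenvector fidelity.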
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high-concentration analytes has also been developed.
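The scheme reduces to two computable pieces: a calibration of peptide-to-matrix ion abundance ratio versus concentration, and a suppression check against the 70% threshold. The sketch below assumes suppression is measured as the fractional drop of the matrix ion signal relative to a blank, which is one plausible operationalization rather than the paper's exact definition.

```python
import numpy as np

def suppression(matrix_ion_with_analyte, matrix_ion_blank):
    """Fractional drop of the matrix ion signal caused by the analyte."""
    return 1.0 - matrix_ion_with_analyte / matrix_ion_blank

def ratio_calibration(conc, peptide_ion, matrix_ion):
    """Linear fit of peptide-to-matrix ion abundance ratio vs. concentration."""
    ratio = np.asarray(peptide_ion, float) / np.asarray(matrix_ion, float)
    slope, intercept = np.polyfit(conc, ratio, 1)
    return slope, intercept

def quantifiable(matrix_ion_with_analyte, matrix_ion_blank, limit=0.70):
    """The 70% rule: quantify only below the suppression threshold."""
    return suppression(matrix_ion_with_analyte, matrix_ion_blank) <= limit
```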
Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth
2017-11-28
The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive), (2) examined the utility of a supplementary search method (analytical), or (3) identified or explored factors that impact the utility of a supplementary method when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods. Further research is required, however, so that a rational choice can be made about which supplementary search strategies should be used, and when.
NASA Astrophysics Data System (ADS)
Brookman, Tom; Whittaker, Thomas
2012-09-01
Stable isotope dendroclimatology using α-cellulose has unique potential to deliver multimillennial-scale, sub-annually resolved, terrestrial climate records. However, lengthy processing and analytical methods often preclude such reconstructions. Variants of the Brendel extraction method have reduced these limitations, providing fast, easy methods of isolating α-cellulose in some species. Here, we investigate application of Standard Brendel (SBrendel) variants to resinous softwoods by treating samples of kauri (Agathis australis), ponderosa pine (Pinus ponderosa) and huon pine (Lagarostrobos franklinii), varying reaction vessel, temperature, boiling time and reagent volume. Numerous samples were visibly 'under-processed', and Fourier transform infrared spectroscopic (FTIR) investigation showed absorption peaks at 1520 cm⁻¹ and ~1600 cm⁻¹ in those fibers, suggesting residual lignin and retained resin, respectively. Replicate analyses of all samples processed at high temperature yielded consistent δ¹³C and δ¹⁸O despite color and spectral variations. Spectra and isotopic data revealed that α-cellulose δ¹³C can be altered during processing, most likely due to chemical contamination from insufficient acetone removal, but is not systematically affected by methodological variation. Reagent amount, temperature and extraction time all influence δ¹⁸O, however, and our results demonstrate that different species may require different processing methods. FTIR prior to isotopic analysis is a fast and cost-effective way to determine α-cellulose extract purity. Furthermore, a systematic isotopic test such as we present here can also determine the sensitivity of isotopic values to methodological variables. Without these tests, isotopic variability introduced by the method could obscure or 'create' climatic signals within a data set.
Jackson, Dan; Kirkbride, James; Croudace, Tim; Morgan, Craig; Boydell, Jane; Errazuriz, Antonia; Murray, Robin M; Jones, Peter B
2013-03-01
A recent systematic review and meta-analysis of the incidence and prevalence of schizophrenia and other psychoses in England investigated the variation in the rates of psychotic disorders. However, some of the questions of interest, and the data collected to answer these, could not be adequately addressed using established meta-analysis techniques. We developed a novel statistical method, which makes combined use of fractional polynomials and meta-regression. This was used to quantify the evidence of gender differences and a secondary peak onset in women, where the outcome of interest is the incidence of schizophrenia. Statistically significant and epidemiologically important effects were obtained using our methods. Our analysis is based on data from four studies that provide 50 incidence rates, stratified by age and gender. We describe several variations of our method, in particular those that might be used where more data is available, and provide guidance for assessing the model fit. Copyright © 2013 John Wiley & Sons, Ltd.
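The fractional-polynomial component of the method can be sketched as a model search over the conventional power set. The code below fits second-degree fractional polynomials to log incidence rates by ordinary least squares and selects the power pair with the lowest residual sum of squares; the authors' actual approach embeds such curves in a meta-regression with study-level terms and an appropriate likelihood, which is not reproduced here.

```python
import numpy as np
from itertools import combinations_with_replacement

POWERS = (-2, -1, -0.5, 0, 0.5, 1, 2, 3)   # conventional FP power set

def fp_design(age, powers):
    """Design matrix for a fractional polynomial in age (age must be > 0).
    A repeated power p contributes age**p * log(age), per FP convention."""
    cols, prev = [np.ones_like(age)], None
    for p in powers:
        x = np.log(age) if p == 0 else age ** float(p)
        cols.append(x * np.log(age) if p == prev else x)
        prev = p
    return np.column_stack(cols)

def best_fp2(age, log_rate):
    """Choose the FP2 power pair minimizing the residual sum of squares."""
    best = None
    for pair in combinations_with_replacement(POWERS, 2):
        X = fp_design(age, pair)
        beta = np.linalg.lstsq(X, log_rate, rcond=None)[0]
        rss = np.sum((log_rate - X @ beta) ** 2)
        if best is None or rss < best[0]:
            best = (rss, pair, beta)
    return best   # (rss, powers, coefficients)
```

Fitting separate curves by gender and comparing them is one way such a search can expose a secondary onset peak in women, as investigated in the paper.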
Evaluations of Structural Interventions for HIV Prevention: A Review of Approaches and Methods.
Iskarpatyoti, Brittany S; Lebov, Jill; Hart, Lauren; Thomas, Jim; Mandal, Mahua
2018-04-01
Structural interventions alter the social, economic, legal, political, and built environments that underlie processes affecting population health. We conducted a systematic review of evaluations of structural interventions for HIV prevention in low- and middle-income countries (LMICs) to better understand methodological and other challenges and identify effective evaluation strategies. We included 27 peer-reviewed articles on interventions related to economic empowerment, education, and substance abuse in LMICs. Twenty-one evaluations included clearly articulated theories of change (TOCs); 14 of these assessed the TOC by measuring intermediary variables in the causal pathway between the intervention and HIV outcomes. Although structural interventions address complex interactions, no evaluation included methods designed to evaluate complex systems. To strengthen evaluations of structural interventions, we recommend clearly articulating a TOC and measuring intermediate variables between the predictor and outcome. We additionally recommend adapting study designs and analytic methods outside traditional epidemiology to better capture complex results, influences external to the intervention, and unintended consequences.
Azzouz, Abdelmonaim; Jurado-Sánchez, Beatriz; Souhail, Badredine; Ballesteros, Evaristo
2011-05-11
This paper reports a systematic approach to the development of a method that combines continuous solid-phase extraction and gas chromatography-mass spectrometry for the simultaneous determination of 20 pharmacologically active substances, including antibacterials (chloramphenicol, florfenicol, pyrimethamine, thiamphenicol), nonsteroidal anti-inflammatories (diclofenac, flunixin, ibuprofen, ketoprofen, naproxen, mefenamic acid, niflumic acid, phenylbutazone), an antiseptic (triclosan), an antiepileptic (carbamazepine), a lipid regulator (clofibric acid), β-blockers (metoprolol, propranolol), and hormones (17α-ethinylestradiol, estrone, 17β-estradiol) in milk samples. The sample preparation procedure involves deproteination of the milk, followed by sample enrichment and cleanup by continuous solid-phase extraction. The proposed method provides a linear response over the range of 0.6-5000 ng/kg and features limits of detection from 0.2 to 1.2 ng/kg depending on the particular analyte. The method was successfully applied to the determination of pharmacologically active substance residues in food samples including whole, raw, half-skim, skim, and powdered milk from different sources (cow, goat, and human breast).
NASA Astrophysics Data System (ADS)
Diveyev, Bohdan; Konyk, Solomija; Crocker, Malcolm J.
2018-01-01
The main aim of this study is to predict the elastic and damping properties of composite laminated plates. This problem has an exact elasticity solution for simple uniform bending and transverse loading conditions. This paper presents a new stress analysis method for the accurate determination of the detailed stress distributions in laminated plates subjected to cylindrical bending. Some approximate methods for stress state prediction in laminated plates are also presented. The present method is adaptive and does not rely on strong assumptions about the model of the plate. The theoretical model described here incorporates deformations of each sheet of the lamina, which account for the effects of transverse shear deformation, transverse normal stress-strain and nonlinear variation of displacements with respect to the thickness coordinate. Predictions of the dynamic and damping values of laminated plates for various geometrical, mechanical and fastening properties are presented. Comparisons with Timoshenko beam theory are made systematically for the analytical and approximate variants.
Rapid measurement of field-saturated hydraulic conductivity for areal characterization
Nimmo, J.R.; Schmidt, K.M.; Perkins, K.S.; Stock, J.D.
2009-01-01
To provide an improved methodology for characterizing the field-saturated hydraulic conductivity (Kfs) over broad areas with extreme spatial variability and ordinary limitations of time and resources, we developed and tested a simplified apparatus and procedure, correcting mathematically for the major deficiencies of the simplified implementation. The methodology includes use of a portable, falling-head, small-diameter (~20 cm) single-ring infiltrometer and an analytical formula for Kfs that compensates both for nonconstant falling head and for the subsurface radial spreading that unavoidably occurs with small ring size. We applied this method to alluvial fan deposits varying in degree of pedogenic maturity in the arid Mojave National Preserve, California. The measurements are consistent with a more rigorous and time-consuming Kfs measurement method, produce the expected systematic trends in Kfs when compared among soils of contrasting degrees of pedogenic development, and relate in expected ways to results of widely accepted methods. © Soil Science Society of America. All rights reserved.
NASA Astrophysics Data System (ADS)
Kleine-Boymann, Matthias; Rohnke, Marcus; Henss, Anja; Peppler, Klaus; Sann, Joachim; Janek, Juergen
2014-08-01
The spatially resolved phase identification of biologically relevant calcium phosphate phases (CPPs) in bone tissue is essential for the elucidation of bone remodeling mechanisms and for the diagnosis of bone diseases. Analytical methods with high spatial resolution that can discriminate between chemically very similar phases are rare. Therefore, the applicability of state-of-the-art ToF-SIMS, XPS and EDX as chemically specific techniques was investigated. The eight CPPs hydroxyapatite (HAP), β-tricalcium phosphate (β-TCP), α-tricalcium phosphate (α-TCP), octacalcium phosphate (OCP), dicalcium phosphate dihydrate (DCPD), dicalcium phosphate (DCP), monocalcium phosphate (MCP) and amorphous calcium phosphate (ACP) were either commercial materials of high purity or synthesized in-house. Phase purity was proven by XRD analysis. All eight CPPs show different mass spectra, and the phases can be discriminated by applying principal component analysis to the mass spectrometric data. The Ca/P ratios of all phosphates were determined by XPS and EDX. With both methods some CPPs can be distinguished, but the obtained Ca/P ratios deviate systematically from their theoretical values. It is therefore necessary in any case to determine a calibration curve (or the ZAF correction factors, respectively) from appropriate standards. In XPS, the O(1s) satellite signals also correlate with the CPP composition. Angle-resolved and long-term XPS measurements of HAP clearly prove that there is no phosphate excess at the surface. Decomposition due to X-ray irradiation has not been observed.
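Phase discrimination via principal component analysis of the mass spectra can be sketched as an SVD of a normalized, mean-centered intensity matrix. The normalization choice below (total ion count) is an assumption; any row scaling applied consistently across samples would serve.

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """PCA of a (samples x m/z-bins) intensity matrix via SVD."""
    X = np.asarray(spectra, float)
    X = X / X.sum(axis=1, keepdims=True)   # normalize each spectrum (TIC)
    X = X - X.mean(axis=0)                 # mean-center every m/z bin
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    loadings = Vt[:n_components]
    explained = (S ** 2 / np.sum(S ** 2))[:n_components]
    return scores, loadings, explained
```

Well-separated clusters of reference-phase spectra in the score plot are what make assignment of unknown spots to a CPP possible.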
Pitsiladis, Yannis P; Durussel, Jérôme; Rabin, Olivier
2014-05-01
Administration of recombinant human erythropoietin (rHumanEPO) improves sporting performance and hence is frequently subject to abuse by athletes, although rHumanEPO is prohibited by WADA. Approaches to detect rHumanEPO doping have improved significantly in recent years but remain imperfect. A new transcriptomic-based longitudinal screening approach is being developed that has the potential to improve the analytical performance of current detection methods. In particular, studies are being funded by WADA to identify a 'molecular signature' of rHumanEPO doping, and preliminary results are promising. In the first systematic study to be conducted, the expression of hundreds of genes was found to be altered by rHumanEPO, with numerous gene transcripts being differentially expressed after the first injection and further transcripts profoundly upregulated during, and subsequently downregulated up to 4 weeks after, administration of the drug, with the same transcriptomic pattern observed in all participants. The identification of a blood 'molecular signature' of rHumanEPO administration is the strongest evidence to date that gene biomarkers have the potential to substantially improve the analytical performance of current antidoping methods such as the Athlete Biological Passport for rHumanEPO detection. Given the early promise of transcriptomics, research using an 'omics'-based approach involving genomics, transcriptomics, proteomics and metabolomics should be intensified in order to achieve improved detection of rHumanEPO and other doping substances and methods that are difficult to detect, such as recombinant human growth hormone and blood transfusions.
Jindal, Kriti; Narayanam, Mallikarjun; Singh, Saranjit
2015-04-10
In the present study, a novel analytical strategy was employed to study the occurrence of 40 drug residues belonging to different medicinal classes, e.g., antibiotics, β blockers, NSAIDs, antidiabetics, proton pump inhibitors, H2 receptor antagonists, antihypertensives, antihyperlipidemics, etc. in ground water samples collected from villages adjoining to S.A.S. Nagar, Punjab, India. The drug residues were extracted from the samples using solid-phase extraction, and LC-ESI-HRMS and LC-ESI-MS/MS were used for identification and quantitation of the analytes. Initially, qualifier and quantifier MRM transitions were classified for 40 targeted drugs, followed by development of LC-MS methods for the separation of all the drugs, which were divided into three categories to curtail overlapping of peaks. Overall identification was done through matching of retention times and MRM transitions; matching of intensity ratio of qualifier to quantifier transitions; comparison of base peak MS/MS profiles; and evaluation of isotopic abundances (wherever applicable). Final confirmation was carried out through comparison of accurate masses obtained from HRMS studies for both standard and targeted analytes in the samples. The application of the strategy allowed removal of false positives and helped in identification and quantitation of diclofenac in the ground water samples of four villages, and pitavastatin in a sample of one village. Copyright © 2015 Elsevier B.V. All rights reserved.
Liu, Yan-Ming; Shi, Yan-Mei; Liu, Zhuan-Li; Peng, Long-Fei
2010-05-01
A sensitive approach for the simultaneous determination of tilmicosin, erythromycin ethylsuccinate and clindamycin was developed by CE coupled with electrochemiluminescence detection with ionic liquid. The parameters for CE and electrochemiluminescence detection and the effect of the ionic liquid were investigated systematically. The three analytes were well separated and detected within 8 min. The limits of detection (S/N = 3) of tilmicosin, erythromycin ethylsuccinate and clindamycin are 3.4×10⁻⁹, 2.3×10⁻⁸ and 1.3×10⁻⁸ mol/L, respectively. The precisions (RSD%) of the peak area and the migration time are 0.8-1.5% and 0.2-0.5% within a day, and 1.8-2.7% and 0.6-0.8% over 3 days, respectively. The limits of quantitation (S/N = 10) of tilmicosin, erythromycin ethylsuccinate and clindamycin are 3.2×10⁻⁸, 2.9×10⁻⁷ and 9.1×10⁻⁸ mol/L in human urine and 5.5×10⁻⁸, 3.2×10⁻⁷ and 2.1×10⁻⁷ mol/L in milk samples, respectively. The recoveries of the three analytes at different concentration levels in urine, milk and drugs are between 90.0 and 104.7%. The proposed method was successfully applied to the determination of the three analytes in human urine, milk and drugs.
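The S/N-based limits quoted above follow the common convention LOD = 3σ/s and LOQ = 10σ/s for a calibration slope s and baseline noise σ. A generic sketch of that arithmetic, not the authors' exact procedure:

```python
import numpy as np

def lod_loq(conc, signal, noise_sd):
    """LOD (S/N = 3) and LOQ (S/N = 10) from a linear calibration.

    noise_sd is the baseline noise standard deviation in signal units.
    """
    slope = np.polyfit(conc, signal, 1)[0]
    return 3.0 * noise_sd / slope, 10.0 * noise_sd / slope
```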
A review of blood sample handling and pre-processing for metabolomics studies.
Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta
2017-09-01
Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exist fundamental needs for considering pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J
2006-01-01
The purpose of this work was to study the factors that may cause systematic errors in the manometric temperature measurement (MTM) procedure used to determine product dry-layer resistance to vapor flow. Product temperature and dry-layer resistance were obtained using MTM software installed on a laboratory freeze-dryer. The MTM resistance values were compared with the resistance values obtained using the "vial method." The product dry-layer resistances obtained by MTM, assuming a fixed temperature difference (ΔT; 2 °C), were lower than the actual values, especially when the product temperatures and sublimation rates were low, but with ΔT determined from the pressure rise data, more accurate results were obtained. MTM resistance values were generally lower than the values obtained with the vial method, particularly whenever freeze-drying was conducted under conditions that produced large variations in product temperature (i.e., low shelf temperature, low chamber pressure, and no thermal shields). In an experiment designed to magnify temperature heterogeneity, MTM resistance values were much lower than the simple average of the product resistances. However, in experiments where product temperatures were homogeneous, good agreement between MTM and vial-method resistances was obtained. The reason for the low MTM resistance problem is the fast vapor pressure rise from a few "warm" edge vials or vials with low resistance. With proper use of thermal shields, and with ΔT evaluated from the data, MTM resistance data are accurate. Thus, the MTM method for determining dry-layer resistance is a useful tool for freeze-drying process analytical technology.
Optimized approaches for quantification of drug transporters in tissues and cells by MRM proteomics.
Prasad, Bhagwat; Unadkat, Jashvant D
2014-07-01
Drug transporter expression in tissues (in vivo) usually differs from that in cell lines used to measure transporter activity (in vitro). Therefore, quantification of transporter expression in tissues and cell lines is important to develop scaling factors for in vitro-to-in vivo extrapolation (IVIVE) of transporter-mediated drug disposition. Since traditional immunoquantification methods are semiquantitative, targeted proteomics is now emerging as a superior method to quantify proteins, including membrane transporters. This superiority derives from the selectivity, precision, accuracy, and speed of analysis by liquid chromatography tandem mass spectrometry (LC-MS/MS) in multiple reaction monitoring (MRM) mode. Moreover, LC-MS/MS proteomics has broader applicability because it does not require selective antibodies for individual proteins. There are a number of recent research and review papers that discuss the use of LC-MS/MS for transporter quantification. Here, we have compiled from the literature various elements of MRM proteomics to provide a comprehensive systematic strategy to quantify drug transporters. This review emphasizes practical aspects and challenges in surrogate peptide selection, peptide qualification, peptide synthesis and characterization, membrane protein isolation, protein digestion, sample preparation, LC-MS/MS parameter optimization, method validation, and sample analysis. In particular, bioinformatic tools used in method development and sample analysis are discussed in detail. Various pre-analytical and analytical sources of variability that should be considered during transporter quantification are highlighted. All these steps are illustrated using P-glycoprotein (P-gp) as a case example. Greater use of quantitative transporter proteomics will lead to a better understanding of the role of drug transporters in drug disposition.
Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective
Mench-Bressan, Nadja; McGregor, Carolyn; Pugh, James Edward
2015-01-01
The effective use of data within intensive care units (ICUs) has great potential to create new cloud-based health analytics solutions for disease prevention or earlier detection of condition onset. The Artemis project aims to achieve these goals in the area of neonatal ICUs (NICUs). In this paper, we propose an analytical model for the Artemis cloud project, which will be deployed at McMaster Children’s Hospital in Hamilton. We collect not only physiological data but also data from the infusion pumps attached to NICU beds. Using the proposed analytical model, we predict the amount of storage, memory, and computational power required for the system. Capacity planning and tradeoff analysis become more accurate and systematic when the proposed analytical model is applied. Numerical results are obtained using real inputs acquired from McMaster Children’s Hospital and a pilot deployment of the system at The Hospital for Sick Children (SickKids) in Toronto. PMID:27170907
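The kind of capacity estimate such a model produces can be illustrated with back-of-envelope arithmetic: multiply beds by streams, sampling rate and record size to get ingest bandwidth, then scale to storage horizons. Every figure in the sketch below is an invented placeholder, not an Artemis or McMaster parameter.

```python
# Back-of-envelope capacity planning for a bedside data pipeline.
# Every figure below is an invented placeholder, not an Artemis parameter.
BEDS = 40
STREAMS_PER_BED = 6        # e.g., ECG, SpO2, respiration, infusion pumps
SAMPLES_PER_SEC = 1        # 1 Hz summary stream per signal
BYTES_PER_SAMPLE = 16      # timestamp + value + identifiers

ingest_bps = BEDS * STREAMS_PER_BED * SAMPLES_PER_SEC * BYTES_PER_SAMPLE
per_day = ingest_bps * 86_400
per_year = per_day * 365

print(f"ingest: {ingest_bps} B/s, {per_day / 1e6:.1f} MB/day, "
      f"{per_year / 1e9:.1f} GB/year")
```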
Big data and visual analytics in anaesthesia and health care.
Simpao, A F; Ahumada, L M; Rehman, M A
2015-09-01
Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Jin, Hui; Gui, Rijun; Yu, Jianbo; Lv, Wei; Wang, Zonghua
2017-05-15
Previously developed electrochemical biosensors with a single electric signal output are prone to being affected by intrinsic and extrinsic factors. In contrast, ratiometric electrochemical biosensors (RECBSs) with dual electric signal outputs have an intrinsic built-in correction for system or background electric signals, and therefore exhibit significant potential to improve the accuracy and sensitivity of electrochemical sensing applications. In this review, we systematically summarize the fabrication strategies, sensing modes and analytical applications of RECBSs. First, the different fabrication strategies of RECBSs are introduced, covering analyte-induced single- and dual-dependent electrochemical signal strategies. Second, the different sensing modes of RECBSs are illustrated, such as differential pulse voltammetry, square wave voltammetry, cyclic voltammetry, alternating current voltammetry, electrochemiluminescence, and so forth. Third, the analytical applications of RECBSs are discussed based on the types of target analytes. Finally, forthcoming developments and future prospects in the research field of RECBSs are also highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.
Flegal, Katherine M; Ioannidis, John P A
2017-08-01
Meta-analyses of individual participant data (MIPDs) offer many advantages and are considered the highest level of evidence. However, MIPDs can be seriously compromised when they are not solidly founded upon a systematic review. These data-intensive collaborative projects may be led by experts who already have deep knowledge of the literature in the field and of the results of published studies and how these results vary based on different analytical approaches. If investigators tailor the searches, eligibility criteria, and analysis plan of the MIPD, they run the risk of reaching foregone conclusions. We exemplify this potential bias in a MIPD on the association of body mass index with mortality conducted by a collaboration of outstanding and extremely knowledgeable investigators. Contrary to a previous meta-analysis of group data that used a systematic review approach, the MIPD did not seem to use a formal search: it considered 239 studies, of which the senior author was previously aware of at least 238, and it violated its own listed eligibility criteria to include those studies and exclude other studies. It also preferred an analysis plan that was also known to give a specific direction of effects in already published results of most of the included evidence. MIPDs where results of constituent studies are already largely known need safeguards to their validity. These may include careful systematic searches, adherence to the Preferred Reporting Items for Systematic Review and Meta-Analyses of individual participant data guidelines, and exploration of the robustness of results with different analyses. They should also avoid selective emphasis on foregone conclusions based on previously known results with specific analytical choices. Copyright © 2017 Elsevier Inc. All rights reserved.
The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.
Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C
2017-12-13
Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ~1000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to LFA analytical performance, through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and expand the analytical detection range from 3 to 6 orders of magnitude for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.
Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane
2016-03-01
Several systematic reviews/meta-analyses published within the past 10 y have examined the associations of Mediterranean-style diets (MedSDs) with cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for satisfying contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and the impact-per-publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTAR-MedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that had met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTAR-MedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, there are more research questions to answer to enhance our understanding of how MedSD affects CVD risk or how these effects may be modified by participant or MedSD characteristics. To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic quality standards but also include more statistical modeling results when data allow. © 2016 American Society for Nutrition.
Chikayama, Eisuke; Suto, Michitaka; Nishihara, Takashi; Shinozaki, Kazuo; Hirayama, Takashi; Kikuchi, Jun
2008-01-01
Background Metabolic phenotyping has become an important ‘bird's-eye-view’ technology which can be applied to higher organisms, such as model plant and animal systems in the post-genomics and proteomics era. Although genotyping technology has expanded greatly over the past decade, metabolic phenotyping has languished due to the difficulty of ‘top-down’ chemical analyses. Here, we describe a systematic NMR methodology for stable isotope-labeling and analysis of metabolite mixtures in plant and animal systems. Methodology/Principal Findings The analysis method includes a stable isotope labeling technique for use in living organisms; a systematic method for simultaneously identifying a large number of metabolites by using a newly developed HSQC-based metabolite chemical shift database combined with heteronuclear multidimensional NMR spectroscopy; Principal Components Analysis; and a visualization method using a coarse-grained overview of the metabolic system. The database contains more than 1000 1H and 13C chemical shifts corresponding to 142 metabolites measured under identical physicochemical conditions. Using the stable isotope labeling technique, we systematically detected >450 HSQC peaks in each 13C-HSQC spectrum derived from the model plant system (Arabidopsis T87 cultured cells) and the invertebrate animal model (Bombyx mori). Furthermore, for the first time, efficient 13C labeling has allowed reliable signal assignment using analytical separation techniques such as 3D HCCH-COSY spectra in higher organism extracts. Conclusions/Significance Overall physiological changes could be detected and categorized in relation to a critical developmental phase change in B. mori by coarse-grained representations in which the organization of metabolic pathways related to a specific developmental phase was visualized on the basis of constituent changes of 56 identified metabolites. Based on the observed development-dependent changes in the 13C intensities of the 56 identified 13C-HSQC signals, we have determined the changes in metabolic networks that are associated with energy and nitrogen metabolism. PMID:19030231
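The Principal Components Analysis step in this pipeline is conventional and easy to sketch. A minimal example, assuming a hypothetical peak table of integrated 13C-HSQC intensities (rows = samples, columns = metabolites); this is not the authors' code:

```python
import numpy as np

# Hypothetical peak table: 12 samples (e.g., developmental time points) by
# 56 metabolite peak intensities, simulated here for illustration.
rng = np.random.default_rng(0)
X = rng.lognormal(sigma=0.4, size=(12, 56))

Xc = X - X.mean(axis=0)                   # mean-center each metabolite
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                            # sample coordinates on the PCs
loadings = Vt                             # metabolite contributions per PC
explained = s**2 / (s**2).sum()           # variance fraction per component
print(explained[:3])
```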
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2011 CFR
2011-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2014 CFR
2014-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2012 CFR
2012-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2013 CFR
2013-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta-Schubert, N.; Reyes, M. A.; Tamez, V. A.
2009-04-20
Alpha decay is one of the two main decay modes of the heaviest nuclei (SHE), and constitutes one of the dominant decay modes of highly neutron deficient medium mass nuclei ('exotics'). Thus identifying and characterizing the alpha decay chains form a crucial part of the identification of SHE. We report the extension of the previously developed method for the detailed and systematic investigation of the reliability of the three main extant analytical formulae for alpha decay half-lives: the generalized liquid drop model based formula of Royer et al. (FR), the Sobiczewski modified semi-empirical Viola-Seaborg formula (VSS) and the recent phenomenological formula of Sobiczewski and Parkhomenko (SP).
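For orientation, the Viola-Seaborg formula mentioned above has the simple analytic form log10 T1/2 = (aZ + b)/sqrt(Q_alpha) + cZ + d (plus a hindrance term for odd nucleons). A minimal sketch using the constants commonly attributed to the Sobiczewski refit; treat the values as illustrative and verify them against the primary literature before quantitative use:

```python
import math

# Constants commonly quoted for the Viola-Seaborg-Sobiczewski (VSS) formula;
# illustrative values, to be checked against the original papers.
A_VS, B_VS, C_VS, D_VS = 1.66175, -8.5166, -0.20228, -33.9069

def log10_half_life_vss(Z, q_alpha_mev, hindrance=0.0):
    """VSS estimate of log10(T1/2 / s); Z is the parent proton number."""
    return (A_VS * Z + B_VS) / math.sqrt(q_alpha_mev) + C_VS * Z + D_VS + hindrance

# Rough check: 212Po (Z = 84, Q_alpha ~ 8.95 MeV, measured T1/2 ~ 3e-7 s).
print(10 ** log10_half_life_vss(84, 8.95))  # ~1e-7 s, the right order of magnitude
```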
Teunissen, S F; Rosing, H; Seoane, M Dominguez; Brunsveld, L; Schellens, J H M; Schinkel, A H; Beijnen, J H
2011-06-01
A comprehensive overview is presented of currently known phase I metabolites of tamoxifen consisting of their systematic name and molecular structure. Reference standards are utilized to elucidate the MS(n) fragmentation patterns of these metabolites using a linear ion trap mass spectrometer. UV-absorption spectra are recorded and absorption maxima are defined. Serum extracts from ten breast cancer patients receiving 40 mg tamoxifen once daily were qualitatively analyzed for tamoxifen phase I metabolites using a liquid chromatography-tandem mass spectrometry set-up. In total, 19 metabolites have been identified in these serum samples. Additionally, a synthetic method for the preparation of the putative metabolite 3',4'-dihydroxytamoxifen is described. Copyright © 2011 Elsevier B.V. All rights reserved.
The growth and in situ characterization of chemical vapor deposited SiO2
NASA Technical Reports Server (NTRS)
Iyer, R.; Chang, R. R.; Lile, D. L.
1987-01-01
This paper reports the results of studies of the kinetics of remote (indirect) plasma enhanced low pressure CVD growth of SiO2 on Si and InP and of the in situ characterization of the electrical surface properties of InP during CVD processing. In the latter case photoluminescence was employed as a convenient and sensitive noninvasive method for characterizing surface trap densities. It was determined that, provided certain precautions are taken, the growth of SiO2 occurs in a reproducible and systematic fashion that can be expressed in an analytic form useful for growth rate prediction. Moreover, the in situ photoluminescence studies have yielded information on sample degradation resulting from heating and chemical exposure during the CVD growth.
NASA Astrophysics Data System (ADS)
Long, Kailin; Du, Deyang; Luo, Xiaoguang; Zhao, Weiwei; Wu, Zhangting; Si, Lifang; Qiu, Teng
2014-08-01
This work reports a facile method to fabricate gold-coated copper(II) hydroxide pine-needle-like micro/nanostructures for surface-enhanced Raman scattering (SERS) applications. The effects of reaction parameters on the shape, size and surface morphology of the products are systematically investigated. The as-prepared 3D hierarchical structures have the advantage of a large surface area available for the formation of hot spots and the adsorption of target analytes, thus dramatically improving the Raman signals. Finite difference time domain calculations indicate that the pine-needle-like model pattern may exhibit high-quality SERS performance owing to the high density and abundance of hot spots in the closely spaced needle-like arms.
Functional Detachment of Totalitarian Nazi Architecture
NASA Astrophysics Data System (ADS)
Antoszczyszyn, Marek
2017-10-01
The paper describes the systematization of architectural styles in use during the Nazi period in Germany between 1933-45. The research reveals a consistent relationship between function and styling. Using comparative, case-study and analytical methods, characteristic features of the architectural appearance of more than 500 buildings were identified, which helped to specify their styling and group them into architectural trends. Ultimately the paper argues that the identified styling trends can be organized according to a functional detachment key. This observation explains why Nazi German architecture remains easy to recognize even today. Given today's pluralism in architecture, the findings could serve as a helpful key for organizing the process of spatial architectural identification.
Mid-infrared spectroscopy for characterization of Baltic amber (succinite)
NASA Astrophysics Data System (ADS)
Wagner-Wysiecka, Ewa
2018-05-01
Natural Baltic amber (succinite) is the most highly prized fossil resin, with rich cultural traditions dating back to prehistoric times. Its unequivocal identification is extremely important in many branches of science and trade, including archeology, paleontology, chemistry and, finally, the mineralogical and gemological communities. Current methods of modifying natural succinite are increasingly sophisticated, often making the identification of natural Baltic amber challenging. This article presents a systematic analytical approach, using mid-infrared spectroscopy (transmission, DRIFTS and ATR techniques), for the identification of natural succinite and of succinite modified under different conditions. The correlation between spectral characteristics and the properties of succinite is discussed, showing that understanding the nature of these changes is the key to identifying this precious material.
Cultivating Discontinuity: Pentecostal Pedagogies of Yielding and Control
ERIC Educational Resources Information Center
Brahinsky, Josh
2013-01-01
Exploring missionary study at an Assemblies of God Bible college through ethnography and training manuals demonstrates systematic pedagogies that cultivate sensory capabilities encouraging yielding, opening to rupture, and constraint. Ritual theory and the Anthropology of Christianity shift analytic scales to include "cultivation," a…
Comments on an Analytical Thermal Agglomeration for Problems with Surface Growth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, N. E.
2017-03-22
Until December 2016, the thermal agglomeration procedure was largely heuristic and, as such, difficult to define. The resulting lack of predictability became problematic, and the current notes represent the first real attempt to systematize the specification of the agglomerated process parameters.
Trace analysis of surfactants in Corexit oil dispersant formulations and seawater
NASA Astrophysics Data System (ADS)
Place, Benjamin J.; Perkins, Matt J.; Sinclair, Ewan; Barsamian, Adam L.; Blakemore, Paul R.; Field, Jennifer A.
2016-07-01
After the April 2010 explosion on the Deepwater Horizon oil rig, and subsequent release of millions of barrels of oil, two Corexit oil dispersant formulations were used in unprecedented quantities both on the surface and sub-surface of the Gulf of Mexico. Although the dispersant formulations contain four classes of surfactants, studies to date have focused on the anionic surfactant, bis-(2-ethylhexyl) sulfosuccinate (DOSS). Factors affecting the integrity of environmental and laboratory samples for Corexit analysis have not been systematically investigated. For this reason, a quantitative analytical method was developed for the detection of all four classes of surfactants, as well as the hydrolysis products of DOSS, the enantiomeric mixture of α- and β-ethylhexyl sulfosuccinate (α-/β-EHSS). The analytical method was then used to evaluate which practices for sample collection, storage, and analysis resulted in high quality data. Large volume, direct injection of seawater followed by liquid chromatography tandem mass spectrometry (LC-MS/MS) minimized analytical artifacts, analysis time, and both chemical and solid waste. Concentrations of DOSS in the seawater samples ranged from 71 to 13,000 ng/L, while the nonionic surfactants, including Span 80, Tween 80, and Tween 85, were detected infrequently (26% of samples) at concentrations from 840 to 9100 ng/L. The enantiomers α-/β-EHSS were detected in seawater, at concentrations from 200 to 1900 ng/L, and in both Corexit dispersant formulations, indicating that α-/β-EHSS were applied to the oil spill and may not be an unambiguous indicator of DOSS degradation. Best practices are provided to ensure sample integrity and data quality for environmental monitoring and laboratory studies that require the detection and quantification of Corexit-based surfactants in seawater.
The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review
Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza
2014-01-01
Background: Although natural disasters have caused considerable damage around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research about the gendered analytical tools and methods used in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in the pre-disaster and post-disaster phases of the disaster management cycle. Objectives: To identify the analytical gender tools, their strengths and limitations, and the gender analysis studies that have emphasized the importance of using gender analysis in disasters. Methods: The literature search was conducted in June 2013 using PubMed, Web of Science, ProQuest Research Library, the World Health Organization Library, and the Gender and Disaster Network (GDN) archive. All articles, guidelines, fact sheets and other materials that provided an analytical framework for a gender analysis approach in disasters were included; non-English documents and gender studies outside the disaster area were excluded. Analysis of the included studies was done separately by descriptive and thematic analyses. Results: A total of 207 documents were retrieved, of which only nine references were included. Of these, 45% were in the form of checklists, 33% were case study reports, and the remaining 22% were articles. All selected papers were published within the period 1994-2012. Conclusions: A focus on women’s vulnerability in the related research and the lack of valid and reliable gender analysis tools were considerable issues identified by the literature review. Although non-English literature with English abstracts was screened, the possible exclusion of other relevant non-English documents is a limitation of this study. PMID:24678441
Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.
Haddaway, Neal R; Rytwinski, Trina
2018-05-01
Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products is not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.
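One concrete example of the effect-size and variance decisions discussed above is computing a standardized mean difference with a small-sample correction (Hedges' g). A minimal sketch using the standard textbook formulas; the input numbers are hypothetical:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g and its approximate variance for two independent groups."""
    # Pooled standard deviation
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                      # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)         # small-sample correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

# Hypothetical treatment/control summary data from one primary study
g, v = hedges_g(5.2, 1.1, 24, 4.6, 1.3, 26)
print(f"g = {g:.3f}, SE = {v**0.5:.3f}")
```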
Quality of Big Data in health care.
Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K
2015-01-01
The current trend in Big Data analytics and in particular health information technology is toward building sophisticated models, methods and tools for business, operational and clinical intelligence. However, the critical issue of data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work that was done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, and electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health privacy protection guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impacts the analytic processing and the knowledge expected to be derived. Automation in the form of data handling, storage, entry and processing technologies is to be viewed as a double-edged sword. At one level, automation can be a good solution, while at another level it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality. Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piece-meal fashion. The authors recommend a data lifecycle approach and provide a road map that is better suited to the dimensions of Big Data and fits the different stages of the analytical workflow.
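To make the "custom data quality rule engine" idea concrete, here is a minimal sketch; the field names and rules are hypothetical, not drawn from the data sets described above:

```python
# Hypothetical record-level checks over claim records; each rule is a
# (name, predicate) pair flagging a suspect record.
RULES = [
    ("missing_npi", lambda r: r.get("provider_npi") in (None, "")),
    # ISO date strings compare correctly as plain strings
    ("future_service_date", lambda r: r.get("service_date", "") > "2015-12-31"),
    ("negative_charge", lambda r: float(r.get("charge", 0)) < 0),
]

def audit(records):
    """Count rule violations across a batch of records."""
    counts = {name: 0 for name, _ in RULES}
    for r in records:
        for name, pred in RULES:
            if pred(r):
                counts[name] += 1
    return counts

claims = [
    {"provider_npi": "", "service_date": "2016-02-01", "charge": "125.00"},
    {"provider_npi": "1234567890", "service_date": "2014-07-15", "charge": "-20"},
]
print(audit(claims))  # {'missing_npi': 1, 'future_service_date': 1, 'negative_charge': 1}
```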
Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L
2018-04-01
Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients and performing a quality assessment of their methodological characteristics is expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regards to their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely on the consideration and discussion of any competing theories regarding model structure and disease progression, identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.
NASA Technical Reports Server (NTRS)
Estefan, J. A.; Thurman, S. W.
1992-01-01
An approximate six-parameter analytic model for Earth-based differenced range measurements is presented and is used to derive a representative analytic approximation for differenced Doppler measurements. The analytical models are tasked to investigate the ability of these data types to estimate spacecraft geocentric angular motion, Deep Space Network station oscillator (clock/frequency) offsets, and signal-path calibration errors over a period of a few days, in the presence of systematic station location and transmission media calibration errors. Quantitative results indicate that a few differenced Doppler plus ranging passes yield angular position estimates with a precision on the order of 0.1 to 0.4 micro-rad, and angular rate precision on the order of 10 to 25 × 10^-12 rad/sec, assuming no a priori information on the coordinate parameters. Sensitivity analyses suggest that troposphere zenith delay calibration error is the dominant systematic error source in most of the tracking scenarios investigated; as expected, the differenced Doppler data were found to be much more sensitive to troposphere calibration errors than differenced range. By comparison, results computed using wideband and narrowband (delta) VLBI under similar circumstances yielded angular precisions of 0.07 to 0.4 micro-rad, and angular rate precisions of 0.5 to 1.0 × 10^-12 rad/sec.
Nowak, Peter
2011-03-01
There is a broad range of qualitative linguistic research (sequential analysis) on doctor-patient interaction that has had only a marginal impact on clinical research and practice. At least in part this is due to the lack of qualitative research synthesis in the field. Available research summaries are not systematic in their methodology. This paper proposes a synthesis methodology for qualitative, sequential analytic research on doctor-patient interaction. The presented methodology is not new but specifies standard methodology of qualitative research synthesis for sequential analytic research. This pilot review synthesizes twelve studies on German-speaking doctor-patient interactions, identifies 45 verbal actions of doctors and structures them into a systematics of eight interaction components. Three interaction components ("Listening", "Asking for information", and "Giving information") seem to be central and cover two thirds of the identified action types. This pilot review demonstrates that sequential analytic research can be synthesized in a consistent and meaningful way, thus providing a more comprehensive and unbiased integration of research. Future synthesis of qualitative research in the area of health communication research is very much needed. Qualitative research synthesis can support the development of quantitative research and of educational materials in medical training and patient training. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Che, Yanyun; Wang, Zhibin; Zhu, Zhiyun; Ma, Yangyang; Zhang, Yaqiong; Gu, Wen; Zhang, Jiayu; Rao, Gaoxiong
2016-12-16
Kuding tea, the leaves of Ilex Kudingcha C.J. Tseng, has been applied for treating obesity, hypertension, cardiovascular disease, hyperlipidemia, and so on. The chlorogenic acids (CGAs) in Kuding tea have shown excellent antioxidative, antiobesity, anti-atherosclerotic and anticancer activities. Nevertheless, the chemical profiles of CGAs in Kuding tea have not been comprehensively studied yet, which hinders further quality control. In the present study, a sensitive ultra-high-performance liquid chromatography-diode array detection coupled with a linear ion trap-Orbitrap (UHPLC-DAD-LTQ-Orbitrap) method was established to screen and identify CGAs in Kuding tea. Six CGA standards were first analyzed in negative ion mode with a CID-MS/MS experiment and then the diagnostic product ions (DPIs) were summarized. According to the retention behavior in the RP-ODS column, accurate mass measurement, DPIs and relevant bibliography data, a total of 68 CGA candidates attributed to 12 categories were unambiguously or preliminarily screened and characterized within 18 min of chromatographic time. This was the first systematic report on the distribution of CGAs in Kuding tea. Meanwhile, the contents of 6 major CGAs in Kuding tea were also determined by the UHPLC-DAD method. All the results indicated that the established analytical method could be employed as an effective technique for the comprehensive and systematic characterization of CGAs and quality control of the botanic extracts or Chinese medicinal formulas that contain various CGAs.
Systematic Review of the Use of Online Questionnaires among the Geriatric Population
Remillard, Meegan L.; Mazor, Kathleen M.; Cutrona, Sarah L.; Gurwitz, Jerry H.; Tjia, Jennifer
2014-01-01
Background/Objectives The use of internet-based questionnaires to collect information from older adults is not well established. This systematic literature review of studies using online questionnaires in older adult populations aims to 1. describe methodologic approaches to population targeting and sampling and 2. summarize limitations of Internet-based questionnaires in geriatric populations. Design, Setting, Participants We identified English language articles using search terms for geriatric, age 65 and over, Internet survey, online survey, Internet questionnaire, and online questionnaire in PubMed and EBSCO host between 1984 and July 2012. Inclusion criteria were: study population mean age ≥65 years old and use of an online questionnaire for research. Review of 336 abstracts yielded 14 articles for full review by 2 investigators; 11 articles met inclusion criteria. Measurements Articles were extracted for study design and setting, patient characteristics, recruitment strategy, country, and study limitations. Results Eleven (11) articles were published after 2001. Studies had populations with a mean age of 65 to 78 years, included descriptive and analytical designs, and were conducted in the United States, Australia, and Japan. Recruiting methods varied widely from paper fliers and personal emails to use of consumer marketing panels. Investigator-reported study limitations included the use of small convenience samples and limited generalizability. Conclusion Online questionnaires are a feasible method of surveying older adults in some geographic regions and for some subsets of older adults, but limited Internet access constrains recruiting methods and often limits study generalizability. PMID:24635138
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation
NASA Astrophysics Data System (ADS)
Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.
2016-07-01
The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used over extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a binary Cu-Au alloy of different compositions, was used during the validation protocol following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
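The uncertainty model described above combines random and systematic components; in the usual GUM-style treatment, uncorrelated standard uncertainties add in quadrature and the result is multiplied by a coverage factor. A minimal sketch with illustrative component values (not those of the paper):

```python
import math

def combined_uncertainty(components, k=2):
    """Root-sum-square of uncorrelated standard uncertainties, expanded by k."""
    u_c = math.sqrt(sum(u**2 for u in components))
    return u_c, k * u_c

# Illustrative mass-fraction components: repeatability (random), reference
# material certificate and calibration drift (systematic).
u_c, U = combined_uncertainty([0.004, 0.002, 0.003])
print(f"u_c = {u_c:.4f}, U(k=2) = {U:.4f}")
```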
Bjerring, Morten; Jain, Sheetal; Paaske, Berit; Vinther, Joachim M; Nielsen, Niels Chr
2013-09-17
Rapid developments in solid-state NMR methodology have boosted this technique into a highly versatile tool for structural biology. The invention of increasingly advanced rf pulse sequences that take advantage of better hardware and sample preparation have played an important part in these advances. In the development of these new pulse sequences, researchers have taken advantage of analytical tools, such as average Hamiltonian theory or lately numerical methods based on optimal control theory. In this Account, we focus on the interplay between these strategies in the systematic development of simple pulse sequences that combines continuous wave (CW) irradiation with short pulses to obtain improved rf pulse, recoupling, sampling, and decoupling performance. Our initial work on this problem focused on the challenges associated with the increasing use of fully or partly deuterated proteins to obtain high-resolution, liquid-state-like solid-state NMR spectra. Here we exploit the overwhelming presence of (2)H in such samples as a source of polarization and to gain structural information. The (2)H nuclei possess dominant quadrupolar couplings which complicate even the simplest operations, such as rf pulses and polarization transfer to surrounding nuclei. Using optimal control and easy analytical adaptations, we demonstrate that a series of rotor synchronized short pulses may form the basis for essentially ideal rf pulse performance. Using similar approaches, we design (2)H to (13)C polarization transfer experiments that increase the efficiency by one order of magnitude over standard cross polarization experiments. We demonstrate how we can translate advanced optimal control waveforms into simple interleaved CW and rf pulse methods that form a new cross polarization experiment. This experiment significantly improves (1)H-(15)N and (15)N-(13)C transfers, which are key elements in the vast majority of biological solid-state NMR experiments. In addition, we demonstrate how interleaved sampling of spectra exploiting polarization from (1)H and (2)H nuclei can substantially enhance the sensitivity of such experiments. Finally, we present systematic development of (1)H decoupling methods where CW irradiation of moderate amplitude is interleaved with strong rotor-synchronized refocusing pulses. We show that these sequences remove residual cross terms between dipolar coupling and chemical shielding anisotropy more effectively and improve the spectral resolution over that observed in current state-of-the-art methods.
Mixed Methods in CAM Research: A Systematic Review of Studies Published in 2012
Bishop, Felicity L.; Holmes, Michelle M.
2013-01-01
Background. Mixed methods research uses qualitative and quantitative methods together in a single study or a series of related studies. Objectives. To review the prevalence and quality of mixed methods studies in complementary medicine. Methods. All studies published in the top 10 integrative and complementary medicine journals in 2012 were screened. The quality of mixed methods studies was appraised using a published tool designed for mixed methods studies. Results. 4% of papers (95 out of 2349) reported mixed methods studies, 80 of which met criteria for applying the quality appraisal tool. The most popular formal mixed methods design was triangulation (used by 74% of studies), followed by embedded (14%), sequential explanatory (8%), and finally sequential exploratory (5%). Quantitative components were generally of higher quality than qualitative components; when quantitative components involved RCTs they were of particularly high quality. Common methodological limitations were identified. Most strikingly, none of the 80 mixed methods studies addressed the philosophical tensions inherent in mixing qualitative and quantitative methods. Conclusions and Implications. The quality of mixed methods research in CAM can be enhanced by addressing philosophical tensions and improving reporting of (a) analytic methods and reflexivity (in qualitative components) and (b) sampling and recruitment-related procedures (in all components). PMID:24454489
Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...
Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
Systematic information processing style and perseverative worry.
Dash, Suzanne R; Meeten, Frances; Davey, Graham C L
2013-12-01
This review examines the theoretical rationale for conceiving of systematic information processing as a proximal mechanism for perseverative worry. Systematic processing is characterised by detailed, analytical thought about issue-relevant information, and in this way, is similar to the persistent, detailed processing of information that typifies perseverative worry. We review the key features and determinants of systematic processing, and examine the application of systematic processing to perseverative worry. We argue that systematic processing is a mechanism involved in perseverative worry because (1) systematic processing is more likely to be deployed when individuals feel that they have not reached a satisfactory level of confidence in their judgement and this is similar to the worrier's striving to feel adequately prepared, to have considered every possible negative outcome/detect all potential danger, and to be sure that they will successfully cope with perceived future problems; (2) systematic processing and worry are influenced by similar psychological cognitive states and appraisals; and (3) the functional neuroanatomy underlying systematic processing is located in the same brain regions that are activated during worrying. This proposed mechanism is derived from core psychological processes and offers a number of clinical implications, including the identification of psychological states and appraisals that may benefit from therapeutic interventions for worry-based problems. © 2013.
Laboratory services series: a programmed maintenance system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuxbury, D.C.; Srite, B.E.
1980-01-01
The diverse facilities, operations and equipment at a major national research and development laboratory require a systematic, analytical approach to operating equipment maintenance. A computer-scheduled preventive maintenance program is described including program development, equipment identification, maintenance and inspection instructions, scheduling, personnel, and equipment history.
21 CFR 862.1660 - Quality control material (assayed and unassayed).
Code of Federal Regulations, 2013 CFR
2013-04-01
... SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Chemistry... control material (assayed and unassayed) for clinical chemistry is a device intended for medical purposes for use in a test system to estimate test precision and to detect systematic analytical deviations...
21 CFR 862.1660 - Quality control material (assayed and unassayed).
Code of Federal Regulations, 2014 CFR
2014-04-01
... SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Chemistry... control material (assayed and unassayed) for clinical chemistry is a device intended for medical purposes for use in a test system to estimate test precision and to detect systematic analytical deviations...
21 CFR 862.1660 - Quality control material (assayed and unassayed).
Code of Federal Regulations, 2012 CFR
2012-04-01
... SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Chemistry... control material (assayed and unassayed) for clinical chemistry is a device intended for medical purposes for use in a test system to estimate test precision and to detect systematic analytical deviations...
Assurance of Learning in the MIS Program
ERIC Educational Resources Information Center
Harper, Jeffrey S.; Harder, Joseph T.
2009-01-01
This article describes the development of a systematic and practical methodology for assessing program effectiveness and monitoring student development in undergraduate decision sciences programs. The model we present is based on a student's progression through learning stages associated with four key competencies: technical, analytical,…
NASA Astrophysics Data System (ADS)
Asadpour-Zeynali, Karim; Bastami, Mohammad
2010-02-01
In this work a new modification of the standard addition method called "net analyte signal standard addition method (NASSAM)" is presented for the simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of standard addition method with those of net analyte signal concept. The method can be applied for the determination of analyte in the presence of known interferents. The accuracy of the predictions against H-point standard addition method is not dependent on the shape of the analyte and interferent spectra. The method was successfully applied to simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.
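The net analyte signal (NAS) underlying NASSAM is the part of a measured spectrum orthogonal to the subspace spanned by the known interferents; regressing its magnitude against the added standard concentrations and extrapolating to zero recovers the analyte concentration. A minimal sketch with simulated spectra (not the authors' data or code):

```python
import numpy as np

def net_analyte_signal(spectrum, S_interf):
    """Part of the spectrum orthogonal to the space spanned by interferents."""
    P = np.eye(S_interf.shape[0]) - S_interf @ np.linalg.pinv(S_interf)
    return P @ spectrum

# Hypothetical 100-channel spectra: analyte (PY-like) plus a known interferent.
rng = np.random.default_rng(1)
s_interf = rng.random(100)                 # pure interferent spectrum
s_analyte = rng.random(100)                # pure analyte spectrum
added = np.array([0.0, 1.0, 2.0, 3.0])     # standard additions (conc. units)
c0 = 1.5                                   # "unknown" analyte concentration
mixtures = np.outer(s_analyte, c0 + added).T + 0.8 * s_interf  # rows = spectra

nas = np.array([np.linalg.norm(net_analyte_signal(m, s_interf[:, None]))
                for m in mixtures])
slope, intercept = np.polyfit(added, nas, 1)
print(intercept / slope)                   # extrapolated estimate of c0 (-> 1.5)
```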
Propellant injection systems and processes
NASA Technical Reports Server (NTRS)
Ito, Jackson I.
1995-01-01
The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and ultimately analytical modeling based upon basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is to be able to rank candidate design concepts for relative probability of success or technical risk in all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed to predict satisfying all requirements simultaneously, a series of risk mitigation key enabling technologies can be identified for early resolution. Lower cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design discriminating test plans can be developed based on the physical insight provided by these analyses.
A continued fraction resummation form of bath relaxation effect in the spin-boson model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Zhihao; Tang, Zhoufei; Wu, Jianlan, E-mail: jianlanwu@zju.edu.cn
2015-02-28
In the spin-boson model, a continued fraction form is proposed to systematically resum high-order quantum kinetic expansion (QKE) rate kernels, accounting for the bath relaxation effect beyond the second-order perturbation. In particular, the analytical expression of the sixth-order QKE rate kernel is derived for resummation. With higher-order correction terms systematically extracted from higher-order rate kernels, the resummed quantum kinetic expansion approach in the continued fraction form extends the Pade approximation and can fully recover the exact quantum dynamics as the expansion order increases.
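For intuition about resummation, here is a sketch of the Padé construction that the continued-fraction form extends; it is the textbook linear-algebra recipe applied to a generic series, not the paper's QKE-specific kernel resummation:

```python
import numpy as np
from math import factorial

def pade(c, L, M):
    """[L/M] Pade approximant coefficients (a, b) from Taylor coefficients c."""
    C = np.zeros((M, M))
    rhs = np.zeros(M)
    for m in range(1, M + 1):
        for k in range(1, M + 1):
            idx = L + m - k
            C[m - 1, k - 1] = c[idx] if idx >= 0 else 0.0
        rhs[m - 1] = -c[L + m]
    b = np.concatenate(([1.0], np.linalg.solve(C, rhs)))   # denominator, b0 = 1
    a = np.array([sum(b[k] * c[l - k] for k in range(min(l, M) + 1))
                  for l in range(L + 1)])                  # numerator
    return a, b

# Resumming the exponential series: the [2/2] approximant at x = 1 gives
# 2.7143..., already close to e = 2.7183 from only five Taylor terms.
c = [1 / factorial(n) for n in range(5)]
a, b = pade(c, 2, 2)
x = 1.0
print(np.polyval(a[::-1], x) / np.polyval(b[::-1], x))
```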
Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach
NASA Astrophysics Data System (ADS)
Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh
2017-03-01
Retreat mining is always accompanied by a great number of accidents, most of them due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess the roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of this method is that it ignores subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this shortcoming and improve the method, in this paper a novel methodology is presented to assess the RFS using a fuzzy approach. The application of the fuzzy approach provides an effective tool for handling the subjective uncertainties. Furthermore, fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during the development of this method. This methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.
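At the core of (fuzzy) AHP is the derivation of priority weights from a pairwise comparison matrix; the fuzzy variant replaces the crisp judgments with fuzzy numbers before defuzzifying. A crisp-AHP sketch using the row geometric mean method; the factor set and judgments below are hypothetical, not those of the TCM study:

```python
import numpy as np

# Illustrative pairwise comparisons for three roof-fall risk factors
# (e.g., roof geology, extraction sequence, support quality); hypothetical.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

gm = A.prod(axis=1) ** (1.0 / A.shape[0])   # row geometric means
weights = gm / gm.sum()                      # crisp AHP priority vector

# Consistency check (random index RI = 0.58 for a 3x3 matrix)
lam = (A @ weights / weights).mean()
CI = (lam - A.shape[0]) / (A.shape[0] - 1)
print(weights, CI / 0.58)                    # CR < 0.1 indicates acceptable consistency
```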
Zhou, Chan; Luo, Jian-Guang; Kong, Ling-Yi
2012-01-01
Desmodium styracifolium, with C-flavone glycosides as its main pharmacologically effective compounds, is a popular Chinese medicinal herb and has been used to treat urination disturbance, urolithiasis, edema and jaundice. However, few systematic methods have been reported for the quality control of this natural herb. The aim of this study was to develop a method to control the quality of D. styracifolium by combining chromatographic fingerprints and major constituent quantification. Separations were performed on an Ultimate XB-C-18 column by gradient elution using acetonitrile and 0.1% aqueous formic acid. Analytes were identified by HPLC coupled with electrospray ionisation mass spectrometry experiments. Twenty common peaks in the chromatographic fingerprints were first identified among 15 batches of D. styracifolium from various regions. On this basis, an HPLC-PAD method was established to simultaneously quantify five major constituents, and was validated for limit of quantification, linearity, and interday variation of precision and accuracy. The assay developed could be considered a suitable quality control method for D. styracifolium. Copyright © 2011 John Wiley & Sons, Ltd.
Bukhari, Mahwish; Awan, M. Ali; Qazi, Ishtiaq A.; Baig, M. Anwar
2012-01-01
This paper illustrates the systematic development of a convenient analytical method for the determination of chromium and cadmium in tannery wastewater using laser-induced breakdown spectroscopy (LIBS). A new approach was developed by which the liquid sample was converted into a solid-phase sample surface using absorption paper for subsequent LIBS analysis. The optimized values of the LIBS parameters were 146.7 mJ for chromium and 89.5 mJ for cadmium (laser pulse energy), 4.5 μs (delay time), 70 mm (lens to sample surface distance), and 7 mm (light collection system to sample surface distance). The optimized LIBS parameters produced strong spectral lines for each metal while keeping the background noise at a minimum level. The new method of preparing metal standards on absorption papers exhibited calibration curves with good linearity, with correlation coefficients (R²) in the range of 0.992 to 0.998. The developed method was tested on real tannery wastewater samples for the determination of chromium and cadmium. PMID:22567570
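The calibration-and-inversion step is standard: fit a line through the standards, check R², and invert the fit to quantify unknowns. A minimal sketch with hypothetical chromium calibration data (the concentrations and intensities are illustrative, not the paper's):

```python
import numpy as np

# Hypothetical Cr calibration: standard concentrations (mg/L) vs. line intensity.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
intensity = np.array([120.0, 980.0, 1890.0, 3750.0, 7610.0])

slope, intercept = np.polyfit(conc, intensity, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((intensity - pred)**2) / np.sum((intensity - intensity.mean())**2)

# Invert the fit to quantify an unknown sample from its measured intensity
unknown_conc = (2500.0 - intercept) / slope
print(f"R^2 = {r2:.4f}, unknown = {unknown_conc:.2f} mg/L")
```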
Extracting Effective Higgs Couplings in the Golden Channel
Chen, Yi; Vega-Morales, Roberto
2014-04-08
Kinematic distributions in Higgs decays to four charged leptons, the so-called 'golden channel', are a powerful probe of the tensor structure of its couplings to neutral electroweak gauge bosons. In this study we construct the first part of a comprehensive analysis framework designed to maximize the information contained in this channel in order to perform direct extraction of the various possible Higgs couplings. We first complete an earlier analytic calculation of the leading order fully differential cross sections for the golden channel signal and background to include the 4e and 4μ final states with interference between identical final states. We also examine the relative fractions of the different possible combinations of scalar-tensor couplings by integrating the fully differential cross section over all kinematic variables, as well as show various doubly differential spectra for both the signal and background. From these analytic expressions we then construct a 'generator level' analysis framework based on the maximum likelihood method. Then, we demonstrate the ability of our framework to perform multi-parameter extractions of all the possible effective couplings of a spin-0 scalar to pairs of neutral electroweak gauge bosons including any correlations. Furthermore, this framework provides a powerful method for study of these couplings and can be readily adapted to include the relevant detector and systematic effects, which we demonstrate in an accompanying study to follow.
Knudson, Marcus D.; Desjarlais, Michael P.; Pribram-Jones, Aurora
2015-06-15
Aluminum has been used prolifically as an impedance matching standard in the multimegabar regime (1 Mbar = 100 GPa), particularly in nuclear driven, early laser driven, and early magnetically driven flyer plate experiments. The accuracy of these impedance matching measurements depends upon the knowledge of both the Hugoniot and release or reshock response of aluminum. Here, we present the results of several adiabatic release measurements of aluminum from ~400–1200 GPa states along the principal Hugoniot using full density polymethylpentene (commonly known as TPX), and both ~190 and ~110 mg/cc silica aerogel standards. Additionally, these data were analyzed within the framework of a simple, analytical model that was motivated by a first-principles molecular dynamics investigation into the release response of aluminum, as well as by a survey of the release response determined from several tabular equations of state for aluminum. Combined, this theoretical and experimental study provides a method to perform impedance matching calculations without the need to appeal to any tabular equation of state for aluminum. Furthermore, as an analytical model, this method allows for propagation of all uncertainty, including the random measurement uncertainties and the systematic uncertainties of the Hugoniot and release response of aluminum. This work establishes aluminum for use as a high-precision standard for impedance matching in the multimegabar regime.
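To illustrate the impedance-matching construction itself (not the paper's analytical release model), here is a sketch using a linear Us-up Hugoniot and the common mirror-reflection approximation for the release path; all material parameters are nominal, literature-style values chosen for illustration:

```python
from scipy.optimize import brentq

# Illustrative linear Us-up Hugoniots (rho0 in g/cc, velocities in km/s,
# pressures in GPa); parameters are nominal values, not this paper's fits.
RHO0_AL, C0_AL, S_AL = 2.703, 5.35, 1.34   # aluminum standard
RHO0_W, C0_W, S_W = 0.833, 2.15, 1.32      # TPX-like window (assumed values)

def hugoniot_p(up, rho0, c0, s):
    """Rankine-Hugoniot pressure for a linear Us = c0 + s*up relation."""
    return rho0 * (c0 + s * up) * up

up1 = 12.0                                  # measured up in the Al standard
p1 = hugoniot_p(up1, RHO0_AL, C0_AL, S_AL)  # shock state, ~695 GPa

# Mirror-reflection approximation: release follows the Al Hugoniot reflected
# about up1; the matched state is where it crosses the window Hugoniot.
release = lambda up: hugoniot_p(2 * up1 - up, RHO0_AL, C0_AL, S_AL)
up_match = brentq(lambda up: release(up) - hugoniot_p(up, RHO0_W, C0_W, S_W),
                  up1, 2 * up1 - 1e-9)
print(up_match, hugoniot_p(up_match, RHO0_W, C0_W, S_W))
```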
NASA Technical Reports Server (NTRS)
Koch, Steven E.; Mcqueen, Jeffery T.
1987-01-01
A survey of various one- and two-way interactive nested grid techniques used in hydrostatic numerical weather prediction models is presented and the advantages and disadvantages of each method are discussed. The techniques for specifying the lateral boundary conditions for each nested grid scheme are described in detail. Averaging and interpolation techniques used when applying the coarse mesh grid (CMG) and fine mesh grid (FMG) interface conditions during two-way nesting are discussed separately. The survey shows that errors are commonly generated at the boundary between the CMG and FMG due to boundary formulation or specification discrepancies. Methods used to control this noise include application of smoothers, enhanced diffusion, or damping-type time integration schemes to model variables. The results from this survey provide the information needed to decide which one-way and two-way nested grid schemes merit future testing with the Mesoscale Atmospheric Simulation System (MASS) model. An analytically specified baroclinic wave will be used to conduct systematic tests of the chosen schemes since this will allow for objective determination of the interfacial noise in the kind of meteorological setting for which MASS is designed. Sample diagnostic plots from initial tests using the analytic wave are presented to illustrate how the model-generated noise is ascertained. These plots will be used to compare the accuracy of the various nesting schemes when incorporated into the MASS model.
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included of which 50 % (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
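The simplest VOI quantity, the expected value of perfect information (EVPI), is straightforward to estimate by Monte Carlo: it is the gap between deciding after uncertainty resolves and deciding now on expected values. A minimal two-strategy sketch with hypothetical numbers:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical two-strategy model: the incremental net benefit (INB) of the
# new strategy depends on an uncertain effect theta; all numbers illustrative.
theta = rng.normal(0.05, 0.03, N)           # uncertain QALY gain
inb = 20_000 * theta - 600                   # INB at 20k/QALY minus extra cost

ev_current = max(inb.mean(), 0.0)            # decide now on expected values
ev_perfect = np.maximum(inb, 0.0).mean()     # decide per-realization of theta
evpi = ev_perfect - ev_current
print(f"EVPI per patient = {evpi:.1f}")
```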
Communication about environmental health risks: a systematic review.
Fitzpatrick-Lewis, Donna; Yost, Jennifer; Ciliska, Donna; Krishnaratne, Shari
2010-11-01
Using the most effective methods and techniques for communicating risk to the public is critical. Understanding the impact that different types of risk communication have played in real and perceived public health risks can provide information about how messages, policies and programs can and should be communicated in order to be most effective. The purpose of this systematic review is to identify the effectiveness of communication strategies and factors that impact communication uptake related to environmental health risks. A systematic review of English articles was conducted using multiple databases with appropriate search terms. Data sources also included grey literature. Key organization websites and key journals were hand searched for relevant articles. Consultation with experts took place to locate any additional references. Articles had to meet relevance criteria for study design (randomized controlled trials, clinical controlled trials, cohort analytic, cohort, any pre-post, interrupted time series, mixed methods or any qualitative studies), participants (those in community-living, non-clinical populations), interventions (including, but not limited to, any community-based methods or tools such as Internet, telephone, media-based interventions or any combination thereof), and outcomes (reported measurable outcomes such as awareness, knowledge or attitudinal or behavioural change). Articles were assessed for quality and data were extracted using standardized tools by two independent reviewers. Articles were given an overall assessment of strong, moderate or weak quality. There were no strong or moderate studies. Meta-analysis was not appropriate to the data. Data for 24 articles were analyzed and reported in a narrative format. The findings suggest that a multi-media approach is more effective than any single media approach. Similarly, printed material that offers a combination of information types (i.e., text and diagrams) is more effective than just a single type, such as all text. Findings also suggest that factors influencing response to risk communications are impacted by personal risk perception, previous personal experience with risk, sources of information and trust in those sources. No single method of message delivery is best. Risk communication strategies that incorporate the needs of the target audience(s) with a multi-faceted delivery method are most effective at reaching the audience.
Analysis of black carbon molecular markers by two chromatographic methods (GC-FID and HPLC-DAD)
NASA Astrophysics Data System (ADS)
Schneider, Maximilian P. W.; Smittenberg, Rienk H.; Dittmar, Thorsten; Schmidt, Michael W. I.
2010-05-01
The analysis of benzenepolycarboxylic acids (BPCA) as a quantitative measure for black carbon (BC) in soil and sediment samples is a well-established method [1, 2]. Briefly, oxidation of polycondensed BC molecules yields seven molecular markers that can be assigned to BC and subsequently quantified by GC-FID (gas chromatography with flame ionization detection). Recently, this method was refined for BC quantification in seawater samples by measuring BPCA with HPLC-DAD (high-performance liquid chromatography with diode array detection) [3]. A systematic comparison of BC as determined by the two analytical techniques would be essential for calculating global BC budgets, but such a comparison is lacking. Here we present data from a systematic comparison of the two BPCA methods, in terms of both quantity and quality. We prepared chars under well-defined laboratory conditions: chestnut hardwood chips and rice straw were pyrolysed at temperatures between 200 and 1000 °C under a constant N2 stream. The BC contents of the chars were analysed using the BPCA extraction method followed by either GC-FID or HPLC-DAD quantification [4]. It appears that the GC-FID method yields systematically lower BPCA concentrations in the chars than the HPLC-DAD method. Possible reasons for the observed difference are (i) higher losses of sample material during preparation for GC-FID; (ii) differences in the quality of the linear regression used for quantification; and (iii) incomplete derivatisation of B5CA and B6CA, which is required for GC-FID analysis. In a next step, we will test different derivatisation procedures (methylation with dimethyl sulfate or diazomethane, and silylation) for their influence on the GC-FID results. The aim of this study is to test whether black carbon can be quantified in soil, sediment and water samples using one single method - a crucial step when attempting a global BC budget. References: [1] Brodowski, S., Rodionov, A., Haumeier, L., Glaser, B., Amelung, W. (2005) Org. Geochem. 36, 1299-1310. [2] Glaser, B., Haumeier, L., Guggenberger, G., Zech, W. (1998) Org. Geochem. 29, 811-819. [3] Dittmar, T. (2008) Org. Geochem. 39, 396-407. [4] Schneider, M.P.W., Hilf, M., Vogt, U.F., Schmidt, M.W.I., Org. Geochem. (submitted).
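As an illustration of the linear-regression quantification step named in reason (ii) above, the sketch below fits external calibration lines for the two detection methods and back-calculates an unknown concentration. All concentrations and peak areas are invented for illustration; they are not the authors' data, and the marker and response values are purely hypothetical.

```python
import numpy as np

# Hypothetical external calibration for one BPCA marker (e.g. B6CA):
# standard concentrations (ug/mL) vs detector response (peak area).
conc = np.array([1.0, 2.5, 5.0, 10.0, 20.0])
area_gc = np.array([118., 290., 585., 1160., 2310.])   # GC-FID, made up
area_lc = np.array([131., 322., 648., 1295., 2590.])   # HPLC-DAD, made up

def calibrate(conc, area):
    """Least-squares slope, intercept and R^2 of a linear calibration."""
    slope, intercept = np.polyfit(conc, area, 1)
    fitted = slope * conc + intercept
    ss_res = np.sum((area - fitted) ** 2)
    ss_tot = np.sum((area - area.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

unknown_area = 800.0  # measured response of an unknown sample
for name, area in [("GC-FID", area_gc), ("HPLC-DAD", area_lc)]:
    slope, intercept, r2 = calibrate(conc, area)
    c = (unknown_area - intercept) / slope
    print(f"{name}: c = {c:.2f} ug/mL (R^2 = {r2:.4f})")
```

A poorer fit (lower R^2) or a biased slope on one instrument would translate directly into the kind of systematic concentration offset the abstract reports.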
ERIC Educational Resources Information Center
Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette
2017-01-01
Analytical method transfer (AMT) and dissolution testing are important topics required in industry and should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…
Kothari, Bhaveshkumar H; Fahmy, Raafat; Claycamp, H Gregg; Moore, Christine M V; Chatterjee, Sharmista; Hoag, Stephen W
2017-05-01
The goal of this study was to use risk assessment techniques and statistical design of experiments (DoE) to gain process understanding and to identify critical process parameters for the manufacture of controlled-release multiparticulate beads using a novel disk-jet fluid bed technology. Material attributes and process parameters were systematically assessed using an Ishikawa fishbone diagram and failure mode and effects analysis (FMEA). The high-risk attributes identified by the FMEA were first explored in a resolution V fractional factorial study to build an understanding of the processing parameters. Using knowledge gained from the resolution V study, a resolution IV fractional factorial study was then conducted to identify the critical process parameters (CPPs) that impact the critical quality attributes and to understand the influence of these parameters on film formation. In both studies, the microclimate, atomization pressure, inlet air volume, product temperature (during spraying and curing), curing time, and percent solids in the coating solutions were studied. The responses evaluated were percent agglomeration, percent fines, percent yield, bead aspect ratio, median particle size diameter (d50), assay, and drug release rate. Pyrobuttons® were used to record real-time temperature and humidity changes in the fluid bed. The risk assessment methods and process analytical tools helped in understanding the novel disk-jet technology and in systematically developing models of coating process parameters such as process efficiency and the extent of curing during the coating process.
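For readers unfamiliar with fractional factorial designs, the sketch below generates a generic two-level, resolution IV design in seven factors (a 2^(7-2) design, 32 runs) from standard generators. The factor names echo the parameters listed in the abstract but are placeholders; this is an illustration of the technique, not the authors' actual experimental plan, whose generators and run counts are not reported here.

```python
import itertools
import numpy as np

# Base factors A..E: full 2^5 factorial in coded units (-1 / +1).
base = np.array(list(itertools.product([-1, 1], repeat=5)))
A, B, C, D, E = base.T

# Generators F = ABCD and G = ABDE give defining words ABCDF, ABDEG
# and their product CEFG; the shortest word has length 4, so the
# design is resolution IV (main effects clear of two-factor aliases).
F = A * B * C * D
G = A * B * D * E

design = np.column_stack([A, B, C, D, E, F, G])
factors = ["atomization_pressure", "inlet_air_volume", "spray_temp",
           "cure_temp", "cure_time", "pct_solids", "microclimate"]

print(f"{design.shape[0]} runs x {design.shape[1]} factors")
for row in design[:4]:  # show the first few runs
    print(dict(zip(factors, row.astype(int))))
```

Running all seven factors in 32 rather than 128 runs is exactly the economy such screening studies exploit, at the cost of aliasing some two-factor interactions with each other.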